The chiefs of two Greater Toronto Area police forces now slamming the brakes on the use of a contentious facial recognition tool admit the app should have undergone greater scrutiny before it was tested by their investigators.
Clearview AI, a U.S.-based app providing an artificial intelligence-powered tool to law enforcement agencies, “clearly” deserves “a lot more thought and scrutiny,” Peel Regional Police chief Nishan Duraiappah told the Star in a sit-down interview Friday, the same day federal and provincial regulators launched an investigation into the potentially illegal tool.
Both Duraiappah and Halton Regional Police chief Stephen Tanner say the tool was first introduced sometime last fall to investigators from their respective Internet Child Exploitation (ICE) units. The officers were attending a professional development event, both chiefs say — a gathering that typically sees officers from different forces swapping tools and techniques and includes law enforcement product vendors.
Billed as a database of more than 3 billion images scraped from Facebook, YouTube and millions of other websites, the app’s ability to match an image of an unidentified person against that database caught the attention of ICE investigators, each one of whom “just lives and breathes identifying children that have been exploited,” Duraiappah said.
“We can’t be the gatekeeper of every conference that we go to,” Duraiappah said. “I can see how an officer got a test licence and said, ‘Hey, this sounds interesting. Could it be useful to our work?’”
Nonetheless, when he learned in January that officers were testing the technology, Duraiappah — like police chiefs in Toronto and Durham Region — immediately halted its use in light of mounting privacy concerns.
Speaking to the Star this week, Tanner agreed that in hindsight, greater evaluation was needed before his Halton officers tested the tool.
“I would say we — I or we — could have done more background checking into it before we implemented any pilot projects,” he said.
Unlike other chiefs, Tanner did not initially stop his officers from using the tool when he learned they were testing it, because he believed it was fair game to use an app built on images scraped from the open web, he said.
“Clearview is just a name — I don’t know what Clearview is, really. I’ve read a little bit about it in the last few days,” he said.
But while both chiefs acknowledge the need for a thorough review of Clearview AI, they stress that when used appropriately, facial recognition technology boasts dramatic potential to speed up investigations, identify victims of child pornography, link crimes across geographic regions, and more.
“There’s gotta be a space for us to use AI (artificial intelligence),” said Duraiappah. “I’m trying to be sensitive, but also not lose the narrative about how this could be good for us.”
Clearview AI’s database of images may violate laws protecting Canadians against businesses collecting personal information, including photos posted online, without their consent. The investigation launched Friday by privacy commissioners of Canada, Quebec, Alberta and British Columbia is seeking to clarify the issue.
In an email Friday, Tor Ekeland, lawyer for Clearview AI, said the tool “only accesses publicly available data from the public internet,” adding that the app is “strictly an after-the-fact investigative tool for law enforcement, and is used to solve crimes including murder, rape and child exploitation.”
Alongside Peel and Halton, Toronto and Durham police have confirmed they recently tested Clearview AI. A spokesperson for Durham Region said the chief was not available for an interview, while Toronto police chief Mark Saunders will speak to the media once that service’s ongoing review has been completed, a Toronto police spokesperson said.
On Friday, Hamilton police confirmed to the Star that officers from its force’s tech crime unit were given login credentials for Clearview AI as part of a trial period, but haven’t used the tool for any investigative purposes. A spokesperson said the force has no plans to adopt the use of Clearview AI.
The RCMP and Ontario Provincial Police will not say if they have used the facial recognition tool. It’s not clear if any arrests have been made as a result of the technology; both Tanner and Duraiappah said the tool wasn’t used on active cases.
Duraiappah says he’s unperturbed about “parking” Clearview AI — it wasn’t “a dominant software that we were committed to.”
But he wants there to be a broader conversation about other ways police can make use of facial recognition. Recently, he said, a civilian officer watched 1,000 hours of surveillance video to identify a face in an effort to solve a homicide — something he said could be done much more easily by leveraging AI technology.
Both Duraiappah and Tanner said police should be able to search, via facial recognition, their internal databases of photos taken of suspects under the Identification of Criminals Act, which allows police, under certain circumstances, to fingerprint and photograph people charged with a crime.
Toronto police revealed last year that they were using facial recognition technology to compare images of potential suspects captured on public or private cameras to its internal database of approximately 1.5 million mugshots.
“I’m really interested in innovative ways to leverage technology to help us towards a public safety goal, without really jumping the queue on privacy issues,” said Duraiappah.
Brian Beamish, Ontario’s privacy commissioner, expressed concerns about expanding the use of facial recognition technology for police before all of the privacy considerations are carefully worked out.
“The potential privacy dangers of facial recognition have really come to a head around the Clearview AI application,” Beamish told the Star.
“I think we all understand that there are some legitimate law enforcement uses for biometrics and even facial recognition. But I think we have to be really careful about going down that road.”