Colorado law enforcement welcomes speedy AI facial recognition technology along with rules, but some advocates worry about privacy and misuse

Dave Snelling/Arvada Police Department
Arvada Police Officer Brian Malone stands where he used artificial intelligence facial recognition software and eventually helped a woman who was threatening to jump 60 feet from a bridge at an RTD park-and-ride garage.

Editor’s note: This story includes a description of an attempted suicide. If you or someone you know is considering suicide or other acts of self-harm, please contact Colorado Crisis Services by calling 1-844-493-8255 or texting “TALK” to 38255 for free, confidential, and immediate support.


The woman in Arvada sat with her legs dangling off a bridge, a sheer 60-foot drop to the concrete below, and told police she was thinking about jumping.

Arvada Police Officer Brian Malone, who had been on solo patrol for a year, was called out to help and tried to build a rapport with her. He asked her name. He chatted with her about her day and what she was worried about, but she warned him that if he got any closer, she would jump.

Malone noticed that the name she initially gave him only agitated her when he repeated it. She kept using both arms to lift her body and swing herself over the drop-off, risking a deadly fall to the concrete far below.

Malone eventually stepped aside and snapped a still photo of her with his body-worn camera. Using artificial intelligence facial recognition software, he ran the photo through the department’s database.

Within a few seconds, the software returned a 92 percent match to a mugshot under a different name.

“We were strictly there to lend any support we could,” Malone said. “We weren’t trying to lock up anybody. We weren’t there to arrest anybody.”

Malone then decided to use her actual name, the one the AI had found by scanning thousands of existing booking photos. That bought a few moments, and something clicked. He eventually got close enough to pull her from the ledge and into an ambulance.

Law enforcement agencies haven’t broadly touted it, but they have used facial recognition technology since the 2000s, comparing photos of suspects or victims to existing databases of mug shots to get accurate names or contact information. Until about 2017, though, law enforcement officers said the technology was limited, slow, and, quite frankly, flawed.

But in recent years, as artificial intelligence has evolved to make the software quicker and more accurate, officers said, agencies in Colorado started using it more broadly.

“So often people go to big-box stores and shoplift, and there are photographs in there or they use fraudulent credit cards. It has been very difficult as we’ve worked through the challenges with the state legislature to allow us to run the faces through these databases,” said David Shipley, a former Adams County detective and commander who now runs the Colorado Information Sharing Consortium, which coordinates the statewide database of faces, mostly from mug shots. “It’s worth the effort to run a photograph of an unknown subject to give us leads on who this might be.”

Rules set around AI

For agencies approved to use the program, this kind of police work is legal within the guardrails set by state lawmakers in two state laws governing law enforcement use of facial recognition technology.

Officers are required to subject any AI facial recognition result to a “human review,” particularly if the result produces a match that could lead to an arrest. Law enforcement agencies must also fully disclose their use of facial recognition technology to the community and to the governing body that oversees them, be it a city council or county commissioners.

These hurdles are high enough that, at least so far, only six agencies in the state are fully authorized to use facial recognition technology for investigations. The vast majority of them are small departments, including the Mead, Brighton and Avon police departments.

Several others are in the process of training and having public meetings.

The state’s largest agencies, including the Denver Police Department, the El Paso County Sheriff’s Office, the Colorado Springs Police Department and the Jefferson County Sheriff’s Office, aren’t using it at all, at least not yet.

Law enforcement agencies that use AI say it is just another tool in the box to help them identify suspects, solve crimes, find runaways and other missing people, and, as in the Arvada case, figure out when someone may be lying about their name or may need mental health help.

“We couldn’t get an arrest warrant for somebody simply based on this one factor,” said Det. Dave Snelling, a spokesman for the Arvada Police Department, which is one of the larger agencies with access to the database. “And I don’t know any jurisdictions that would, based on this.”

Hart Van Denburg/CPR News
A Colorado Springs police cruiser.

The vast majority of the LexisNexis database is composed of booking photos, or mug shots, taken at jails and law enforcement agencies during prior arrests. But it also contains photos of runaways and other images provided by families when reporting concerning activity or crimes, said Devoney Cooke, another Arvada police officer.

But she insists it’s not random.

“It’s not a photo from a traffic camera or something off their social media or something off the street, it’s not just a random photo,” Cooke said. “People think that everything is taking a photo of them and every photo that gets taken of them can be utilized by us to compare to facial recognition, and that’s not the case.”

Snelling compared it to a fingerprint match.

“There are so many points to a fingerprint and we have to have so many before it’s a match, basically it’s the same thing with facial recognition,” he said. “Now with the technology and rising crime, we have to do something … it’s the same as cell phone tower searches.”

Imperfections and potential flaws

But privacy and civil rights critics say law enforcement’s impulse to use the faster AI technology to solve crimes is worrisome. 

Plus, facial recognition software often exhibits racial bias.

In Detroit, a Black man was wrongfully arrested in 2020 and spent 30 hours in jail for a crime he didn’t commit after police detained him based on a faulty AI face match.

“The Detroit story is not a one-off,” said Anaya Robinson, a senior policy strategist at the ACLU of Colorado. “People of color in general, women, children, trans individuals. Facial recognition is not flawless … It still has a lot of significant problems. If law enforcement is going to use it in the state, we would like to see better safeguards.”

Broadly, Robinson is worried about how long agencies keep data in the facial recognition database, which photos they use as comparisons, and how they obtain the initial still photo in the first place.

Currently in Colorado, Shipley said, law enforcement agencies are only authorized to compare faces to existing mug shots in a LexisNexis database, not, say, to faces from social media or the driver’s license database operated by the state Department of Revenue.

“We have to look at it only as a lead generator because the computer or the technology can make suggestions of who you are, but real police work has to happen to make sure your civil rights are protected,” said Shipley, who has been a peace officer since 1977. “You have to look at any technology to be a suggestion and not the absolute.”

Hart Van Denburg/CPR News
Anaya Robinson at the ACLU offices in Denver, July 24, 2024.

But even if law enforcement agencies are only using booking photos, Robinson pointed out that policing and criminal records are disproportionately populated by Black and brown faces, and that facial recognition technology has been shown to be biased against people of color as well.

“When we’re trying to use technology that we know misidentifies people of color within databases that are heavily populated by people of color, the rates of misidentification are going to increase more than they would otherwise,” Robinson said, noting that the vast majority of people who create facial recognition software are white and that they use mostly white faces to train the technology. “We have concerns that there is potential that it is not being followed as carefully as it needs to be.”

Lawmakers grapple with AI oversight

Even though law enforcement agencies acknowledge they have used some kind of facial recognition technology for more than two decades, the legislature got involved in governing it only in 2022.

In addition to protocols put in place for how officers can use the software, the law created a task force that monitors how local governments use facial recognition software. A more recent law, though, has morphed that task force into one that looks at AI broadly with less emphasis on law enforcement.

Democratic state Rep. Brianna Titone, who represents parts of Arvada, said there are benefits to facial recognition technology, but there needs to be a privacy balance.

Hart Van Denburg/CPR News
Democratic state Rep. Brianna Titone in the House, Feb. 23, 2024.

“I don’t think law enforcement has the intention to use or misuse the technology, but sometimes it’s easy to cross that line,” said Titone, who helped author another bill, signed by Gov. Jared Polis a couple of months ago, that added consumer protections around the technology to state law. “When it comes to any kind of AI in general, we have to have a constant conversation about it. As this technology gets better and the uses become more common, we have to have a conversation about how to place guardrails on these things without stifling the innovation that is good.”

Robinson points out that even if law enforcement agencies have rules about when and how they use AI technology, that doesn’t mean individual officers will always do the right thing.

“Historically, law enforcement officers have used technology in their departments in inappropriate ways,” Robinson said. “Single officers don’t always follow those policies. And having access not only to a database full of individual personal data, which includes not only folks who have been charged and arrested, but also witnesses and victims, and being able to use that with facial recognition allows for a level of surveillance of more of the general population than feels comfortable from a civil liberties perspective.”

Wheat Ridge Police Chief Chris Murtha is undergoing training and organizing the requisite community meetings so his department can be approved to use facial recognition technology in the suburb west of Denver.

Murtha remembers the old days, when detectives would pore over piles of old suspect pictures and booking mugs to see whether they could identify a suspect in a crime.

The process could take weeks.

“It replaces the human element: several detectives hand-searching photographs to find one that matches,” he said. “We used to be forced to go through it, spend resources. It doesn’t give us a new result; we likely would have found the same photographs using AI. It’s just a matter of expediency and using hand resources in a better way.”

