AI facial recognition tech has a dark side, as Rite Aid case shows


Artificial intelligence facial recognition technology is used at airports, grocery stores, stadiums and entertainment venues. People even use it to unlock their smartphones several times a day. However, computers don’t always get things right.

The biometric face-scanning technology is growing in popularity among private businesses. But what happens when proper safeguards are not put in place?

On Tuesday, Dec. 19, the Federal Trade Commission (FTC) cracked down on pharmacy chain Rite Aid, banning the retailer from using AI facial recognition for the next five years.

In a federal complaint, the FTC accused Rite Aid of using the technology without warning customers. The agency also said the system sparked false positive alerts, flagging “customers as matching someone who had previously been identified as a shoplifter.”

The false alerts prompted Rite Aid employees to follow customers through the store, search them, ask them to leave or call the police, even though the customers had not engaged in any criminal activity.

In February, Porcha Woodruff, eight months pregnant, was getting her two children ready for school in Detroit when police knocked on her door and told her she was under arrest for a carjacking. She had not committed the crime.

The arrest stemmed from an automated facial recognition search that mistakenly matched video evidence with Woodruff’s mugshot, taken eight years earlier for driving with an expired license.

Woodruff spoke with WXYZ-TV about the ordeal and recalled being afraid.

“My kids were there crying, and I’m just stuck because I’m like, ‘What am I going to do? These people are telling me they’re trying to arrest me and take me to jail for something I have no clue or no idea about,’” Woodruff said.

She later filed suit against the Detroit Police Department for false imprisonment.

A study by the National Institute of Standards and Technology found that Black and Asian people are 10 to 100 times more likely than white people to be falsely matched by facial recognition systems.

According to The Innocence Project, six people have already reported being accused of a crime following a false positive facial match. All six people were Black.

Earlier in 2023, customers filed a class-action lawsuit against Amazon over its Amazon Go stores in New York City for failing to notify customers that the biometric technology was tracking their information. The lawsuit was filed one year after a law went into effect requiring businesses to post signs if they are using the technology.

But despite the dark side, stadiums and music venues across the country are implementing the AI technology for “faster, touchless entry” and even shorter lines to snag food or beer at halftime.

Cleveland’s FirstEnergy Stadium, Citi Field in New York City and the Rose Bowl in California have all implemented some form of facial recognition technology.

Clear, a private security screening company, plans to roll out the technology at its checkpoints in dozens of airports in 2024. The Transportation Security Administration has also announced it will implement AI facial recognition.

Privacy advocates and critics of AI facial recognition have warned that regulation of the technology is limited and that personal information could end up in the wrong hands.

Meanwhile, proponents of the technology argue that facial recognition helps companies operate safely and securely.
