Facial Recognition For Surveillance: When Your Identity Relies on a Software Algorithm?
The revelation that IBM built detailed facial recognition software for law enforcement, and that it was deployed in secrecy, isn't good news for personal liberties. Previous deployments of the technology haven't exactly been success stories either.

At a time when there is great debate about the privacy and security of billions of internet users, and about how social media and the internet are being used as tools to shape the narrative, there is a new issue to contend with: surveillance. It is something we have all feared, and perhaps even suspected exists in many different forms, but have mostly been unable to prove decisively. That is no longer a limitation for the citizens of New York City. For many years now, IBM has collaborated with the New York City Police Department (NYPD) to develop a sophisticated system that allowed law enforcement to search for and identify people by their skin color, hair color, gender, age, and various facial features.

As it turns out, IBM was given access to some of the footage recorded by cameras meant for law enforcement in order to develop object identification technology. The Intercept and the Investigative Fund report that the NYPD has admitted to sharing footage with IBM. “Video, from time to time, was provided to IBM to ensure that the product they were developing would work in the crowded urban NYC environment and help us protect the City. There is nothing in the NYPD’s agreement with IBM that prohibits sharing data with IBM for system development purposes. Further, all vendors who enter into contractual agreements with the NYPD have the absolute requirement to keep all data furnished by the NYPD confidential during the term of the agreement, after the completion of the agreement, and in the event that the agreement is terminated,” says the New York City Police Department.

In the years after the 9/11 attacks in New York, law enforcement agencies countered the terrorism threat by carpeting the city with surveillance cameras and centralizing video surveillance operations in a single command center, the Lower Manhattan Security Coordination Center. Sophisticated analytics software, embedded into the surveillance cameras, followed.

New York and other US cities aren't the only ones using advanced recognition techniques for law enforcement. In May this year, the British non-profit organization Big Brother Watch released its ‘Face Off: The Lawless Growth of Facial Recognition in UK Policing’ report, which detailed how the technology was being misused. “We are deeply concerned that the securitisation of public spaces using biometrically identifying facial recognition unacceptably subjects law abiding citizens to hidden identity checks, eroding our fundamental rights to privacy and free expression,” the report said. That same month, the South Wales Police revealed that its face-recognition software had erroneously flagged thousands of attendees of a soccer game as matches for criminals; 92 percent of the matches were wrong.

Earlier this year in the US, Amazon's facial recognition software, Rekognition, drew criticism after the American Civil Liberties Union tested it by matching the faces of all 535 members of Congress against a database of 25,000 criminal suspect mugshots; the software generated 28 false matches. “An identification — whether accurate or not — could cost people their freedom or even their lives,” the ACLU said in a statement.

Apart from supporting regular law enforcement, these analytics captured still images of individuals, then detected and labelled those images to identify visible physical tags. The idea was to give law enforcement agencies a more structured way to search through hours of video footage when identifying a threat or narrowing in on a specific suspect. It was only in 2011 that Inspector Salvatore DiPace, then commanding the Lower Manhattan Security Initiative, confirmed in an interview with Scientific American that the police department was indeed testing whether the analytics software could box out images of people's faces and subsequently mark certain facial features.

It was in March this year that documents from a lawsuit seeking information on the NYPD's "Forensic Imaging System" brought the details to light. "NYPD's face-recognition system appears to include data for every NYPD arrestee, meaning that each arrestee is subjected to face-recognition searches," said papers filed by Georgetown University's Center on Privacy and Technology. The facial recognition technology scans an individual's face, matching over 16,000 points, and factors in features such as facial hair, hairstyle, skin tone, and more.
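At a high level, systems like this typically reduce each face to a numeric feature vector (a "template") and compare it against a gallery of enrolled templates. The Python sketch below illustrates that general idea only; the function names, the embedding format, and the match threshold are assumptions made for illustration, not details of the NYPD's actual system.

```python
# Illustrative sketch only: a toy nearest-neighbour face match against a
# gallery of enrolled templates. The embedding size and threshold are
# invented; they do not describe the NYPD's Forensic Imaging System.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-feature vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_gallery(probe: np.ndarray,
                   gallery: dict[str, np.ndarray],
                   threshold: float = 0.8) -> list[tuple[str, float]]:
    """Return gallery IDs whose stored templates exceed the threshold,
    best match first. Lowering the threshold surfaces more candidates
    but also more false matches."""
    hits = [(person_id, cosine_similarity(probe, template))
            for person_id, template in gallery.items()]
    return sorted((h for h in hits if h[1] >= threshold),
                  key=lambda h: h[1], reverse=True)
```

The threshold choice drives the false-match rate: set it too low and innocent people surface as candidates, which is precisely the failure mode behind the South Wales and ACLU numbers above.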

IBM's documents suggest that the 'Near Field People Search Profile' factors in baldness, eyeglasses, sunglasses, head color, skin tone, and texture, and supports a tri-color combination search on the torso area using a 13-color palette.
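To make that concrete, here is a hypothetical sketch of what such an attribute-based people search might look like. The field names mirror the attributes reported in IBM's documents, but the schema and matching logic are invented for illustration and are not IBM's actual API.

```python
# Hypothetical illustration of an attribute-based "people search" filter.
# Attribute names follow the reported IBM documents; everything else here
# is assumed for the sake of the example.
from dataclasses import dataclass, field

@dataclass
class Detection:
    camera_id: str
    timestamp: str
    bald: bool = False
    eyeglasses: bool = False
    sunglasses: bool = False
    skin_tone: str = ""                 # e.g. one bucket from a fixed palette
    torso_colors: list[str] = field(default_factory=list)  # up to 3 of 13

def matches(detection: Detection, query: dict) -> bool:
    """True if a detection satisfies every attribute in the query."""
    for key, wanted in query.items():
        have = getattr(detection, key)
        if isinstance(have, list):
            # tri-color combo search: all queried colors must be present
            if not set(wanted).issubset(have):
                return False
        elif have != wanted:
            return False
    return True

# Usage: find everyone tagged with sunglasses and a red/black torso combo.
# results = [d for d in detections
#            if matches(d, {"sunglasses": True,
#                           "torso_colors": ["red", "black"]})]
```

The privacy concern follows directly from this design: once every passer-by is reduced to a searchable row of physical attributes, queries by skin tone or other traits become trivial.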

Stephanie Glaberson, an attorney at the Center on Privacy and Technology, wrote in a statement that there is "substantial evidence that face recognition is widely used by the department, and likely is used in every arrest."

“IBM is committed to responsibly advancing and using new technologies and is recognized for policies that promote diversity and inclusion. These values are why we have numerous programs underway to better identify and address bias in technology, including making publicly available to other companies a dataset of annotations for more than a million images to help solve one of the biggest issues in facial analysis — the lack of diverse data to train AI systems. IBM would not bid on client work that would enable ethnic bias,” says the company in a statement released to Fast Company.

In the US, there is a federal law known as the Freedom of Information Act (FOIA), which allows for full or partial disclosure of previously unreleased information and documents controlled by the United States government. Its scope includes, among other things, records or information compiled for law enforcement purposes, but it exempts such records to the extent that their production could reasonably be expected to constitute an unwarranted invasion of personal privacy.

While law enforcement and private companies often work together on new solutions, the fear in this case is that a system tuned on footage of millions of unsuspecting New Yorkers could be used for racial profiling. The secrecy behind the entire project is also unsettling: millions of people were photographed, and perhaps matched against a database, without any knowledge of it happening.

