BAN THE SCAN: Civil rights groups call on the state of New York to outlaw facial recognition
By Olivia Cook // Nov 03, 2023

Thirty-two civil rights groups, under the leadership of the Surveillance Technology Oversight Project (STOP), have called on the state of New York to outlaw government use of facial recognition and other biometric technologies in residential buildings and public accommodations or facilities.

These groups have declared that facial recognition technology (FRT) is an "immediate threat" to New Yorkers' safety and civil rights.

In a memo of support for two pending state bills, 1014-2023 and 1024-2023, "Ban the Scan" advocates pointed out that biometric technology, including facial recognition, can be "biased, error-prone and harmful."

As facial recognition technology advances, here are some of the main privacy and security concerns that cannot simply be ignored.

Barriers to cybercrime are low

Greater complexity and interdependence among security systems give cybercriminals more opportunity for widespread global damage, according to cybersecurity industry experts.

No security system is airtight, and this alone makes biometric databases, including facial recognition records, an extremely attractive target for tech-savvy hackers looking to exploit such invaluable information.

"Now the barriers to cybercrime entry are low and cybercrime is becoming a service. Moreover, unlike in the past, more nation-states are entering the cybercrime arena. And that to me is concerning in itself," said Kevin Mandia, CEO of intelligence-led security company FireEye.

Outright violations of data privacy laws – through the improper collection, storage and mishandling of facial recognition and other biometric data – lead to the decline or complete loss of public trust and confidence in both the government agencies and private companies that use these technologies.


Collected facial recognition data could be misused

Facial recognition is not immune to conscious or unconscious bias, which can lead to discrimination against certain groups and even wrongful convictions.

While today's technology makes faces easier to capture even from a distance and cheaper to collect and store, faces cannot be "encrypted" the way many other forms of data can, according to information systems/information technology (IS/IT) professionals at the Information Systems Audit and Control Association (ISACA).

Facial recognition data breaches increase the potential for harassment, identity theft, stalking, surveillance and monitoring.

Data collection could infringe on individual privacy

The collection of data through the use of facial recognition technology can be done without a person's consent or knowledge – a clear infringement of a person's freedom and privacy. (Related: FBI has been testing facial recognition software on Americans for YEARS without their knowledge or consent.)

The accuracy and bias of the underlying data and algorithms are another privacy risk, as facial recognition and other biometric systems are not error-free: they can produce false negatives, false positives and outright misidentifications.

Because of this fallibility, a person can be wrongly accused of a crime, offense or violation; denied access to essential services; or discriminated against on the basis of age, gender or race – as evidenced by a study published in the journal Cognitive Science that exposed error rates across demographic groups, with the poorest accuracy consistently found in subjects who were Black, female and between 18 and 30 years old.
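As a rough illustration of what "false positive" and "false negative" mean for a face-matching system, the minimal Python sketch below compares hypothetical similarity scores against a decision threshold. The scores, labels and 0.6 threshold are invented for illustration and are not drawn from the article or from any real system.

# Minimal sketch: counting false matches and false non-matches for a
# face matcher. All numbers below are hypothetical.

def error_rates(scores, same_person, threshold=0.6):
    """Return (false positive rate, false negative rate) at a threshold.

    scores      -- similarity scores produced by a face matcher (0 to 1)
    same_person -- True when the compared images really show the same person
    threshold   -- score at or above which the system declares a "match"
    """
    false_pos = false_neg = impostor_pairs = genuine_pairs = 0
    for score, same in zip(scores, same_person):
        predicted_match = score >= threshold
        if same:
            genuine_pairs += 1
            if not predicted_match:
                false_neg += 1   # a real match the system missed
        else:
            impostor_pairs += 1
            if predicted_match:
                false_pos += 1   # two different people wrongly "matched"
    fpr = false_pos / impostor_pairs if impostor_pairs else 0.0
    fnr = false_neg / genuine_pairs if genuine_pairs else 0.0
    return fpr, fnr

# Hypothetical example: four image-pair comparisons.
scores = [0.92, 0.55, 0.71, 0.40]
same_person = [True, True, False, False]
fpr, fnr = error_rates(scores, same_person)
print(f"false positive rate: {fpr:.2f}, false negative rate: {fnr:.2f}")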

A research article published in the Harvard Journal of Law & Technology explained why racial bias is prevalent in facial recognition technology. It listed three distinct factors that drive racially disparate results: the lack of diversity and representation in the training data and algorithms; human selection of facial features; and image quality issues.

This was confirmed by a National Institute of Standards and Technology (NIST) study that tested 189 facial recognition algorithms, submitted by 99 major surveillance tech developers, against 18.27 million images.

Researchers found that many of these algorithms were 10 to 100 times more likely to misidentify a Black, East Asian or Native American face than a white one.
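To show how a disparity figure like "10 to 100 times more likely" can be computed, the short sketch below compares hypothetical per-group false match rates against a reference group. The counts are made up for illustration only and are not NIST's data.

# Minimal sketch with made-up counts (not NIST's data): deriving a
# per-group disparity ratio from false match rates.

group_results = {
    # group: (false matches, impostor comparisons) -- hypothetical counts
    "white": (10, 1_000_000),
    "Black": (600, 1_000_000),
    "East Asian": (250, 1_000_000),
    "Native American": (900, 1_000_000),
}

reference_rate = group_results["white"][0] / group_results["white"][1]

for group, (false_matches, comparisons) in group_results.items():
    rate = false_matches / comparisons
    print(f"{group:16s} false match rate = {rate:.6f} "
          f"({rate / reference_rate:.0f}x the reference group)")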

Facial recognition could infringe on freedom of speech and association

Self-censorship, the suppression of dissent and a broader chilling effect are just three of the ways uncontrolled use of facial recognition technology can take away or limit a person's rights to freedom of speech and association.

"The fear and uncertainty generated by surveillance inhibit activity more than any action by the police, and if you feel you're being watched, you self-police, and this pushes people out of the public space," said Joshua Franco, a senior research advisor and the deputy director of Amnesty Tech at Amnesty International.

Visit PrivacyWatch.news for more stories like this.

Watch this video about state surveillance being exposed.

This video is from the Planet Zedta channel on Brighteon.com.

More related stories:

TSA’s use of FACIAL RECOGNITION tech in US airports rouses privacy concerns.

Fairway grocers in NYC now using facial recognition to profile customers.

Amazon now requiring marketplace sellers to submit video for a facial recognition database.

Fashion company creating clothing line that shields people from AI facial recognition technology.

Sources include:

ChildrensHealthDefense.org

Amnesty.org

ADA.gov

YWCAWorks.org

Static1.SquareSpace.com

PrivacyEnd.com

CSOOnline.com

ISACA.org

LinkedIn.com

JOLT.Law.Harvard.edu

NIST.gov

VICE.com

Brighteon.com


