- Senate Bill S5609, led by Sen. Julia Salazar, seeks to ban police from using biometric surveillance technologies like facial recognition and gait analysis, with limited exceptions for fingerprint scanners and DNA databases.
- S5609 includes the creation of a Biometric Surveillance Regulation Task Force to evaluate the effectiveness and legality of such tools and would grant individuals the right to sue if their rights are violated due to misuse.
- The legislation follows troubling reports of police using celebrity photos in facial recognition searches and mining sealed juvenile records, highlighting civil rights concerns and algorithmic bias.
- NY Gov. Kathy Hochul recently signed Senate Bill S7543B (the LOADinG Act), which mandates human oversight and biennial bias audits for algorithmic systems used by state agencies, especially those affecting public rights and benefits.
- Despite supporting both S5609 and S7543B, Hochul has expanded subway surveillance, raising questions about the consistency of New York’s approach to privacy and public safety.
New York lawmakers are advancing a bill that would sharply restrict police use of biometric surveillance technologies, reshaping the balance between law enforcement and civil liberties in the state.
Senate Bill S5609, introduced by Senator Julia Salazar and co-sponsored by six fellow legislators, passed the Senate Internet and Technology Committee this May in a 5-2 vote, setting the stage for further debate in the Senate Codes Committee. (Related: Half of America already in law enforcement’s facial recognition network.)
The legislation would prohibit police departments and individual officers from acquiring, possessing or deploying a broad class of biometric systems, ranging from facial recognition to gait analysis.
S5609 explicitly bans systems, whether automated or semi-automated, that identify individuals based on biometric features such as facial structure, iris patterns and even movement signatures. It includes tightly defined exceptions for mobile fingerprint scanners and continued contributions to the state's existing DNA data bank, signaling an attempt to preserve some forensic capabilities while limiting surveillance tools that are more prone to misuse and error.
In recent years, high-profile incidents have amplified concerns about the unregulated use of biometric tech by police. In some cases, officers reportedly fed celebrity photos into facial recognition databases when suspect images were unavailable. In others, sealed juvenile records were mined to generate match leads – practices that experts say violate due process and constitutional protections.
Such examples underscore the risks of entrusting sensitive, often error-prone systems to institutions with little transparency. Studies have shown that facial recognition algorithms can exhibit significant racial and gender biases, leading to mistaken arrests and civil rights violations across the country.
As part of its oversight architecture, S5609 would establish a Biometric Surveillance Regulation Task Force, a 12-member panel drawn from law enforcement, privacy advocacy groups, civil rights organizations and data protection experts. The task force would be charged with evaluating the current use, effectiveness and legal implications of biometric surveillance and issuing recommendations for any future regulations.
If passed, S5609 would provide a private right of action for individuals whose rights are violated due to biometric surveillance abuse.
Hochul has also signed S7543B – an AI oversight law for state agencies
On Dec. 21, 2024, New York Democratic Gov. Kathy Hochul also signed Senate Bill S7543B, dubbed the Legislative Oversight of Automated Decision-Making in Government Act or LOADinG Act, into law.
S7543B, sponsored by Sen. Kristen Gonzalez along with 15 other state senators, mandates that all automated decision-making systems (ADMS) used by state agencies undergo rigorous human oversight and periodic impact assessments to ensure they do not violate civil rights, produce discriminatory outcomes, or otherwise harm the public. These assessments must be revisited every two years and include evaluations for bias, discrimination or harm.
The law, effective immediately with full implementation within a year, applies to any algorithmic system that influences decisions related to public assistance benefits; individual rights, civil liberties or safety; and statutory or constitutional protections.
But despite supporting both S5609 and S7543B, Hochul has expanded subway surveillance, raising questions about the consistency of New York’s approach to privacy and public safety.