While facial recognition technology is unregulated at the federal level, 23 states have now passed or expanded laws to restrict the mass scraping of biometric data, according to the National Conference of State Legislatures.

Last month, Colorado enacted new biometric privacy rules, requiring consent before facial or voice recognition technology is used, while also banning the sale of the data. Texas passed an artificial intelligence law in June that similarly outlaws the collection of biometric data without permission. Last year, Oregon approved data privacy rules requiring consumer opt-in before companies hoover up face, eye and voice data.

“What we need are laws that change the behavior of technology companies,” said Adam Schwartz, the privacy litigation director at the Electronic Frontier Foundation. “Otherwise these companies will continue to profit on what should be our private information.”

Not all state laws give people the right to sue tech companies

The states that have passed the safeguards view them as a defense against the prevalence of digital tracking in everyday life, and in a number of cases, the laws have been used to extract large payouts from tech companies.

Google and Meta have each paid Texas $1.4 billion over allegations that the companies mined users’ facial recognition data without permission. Clearview AI, a facial recognition company popular with law enforcement, ponied up $51 million to settle a case, approved in March, over the firm scraping billions of facial images online without consent. And in July, Google resolved a smaller case for $9 million in Illinois after a lawsuit alleged the company did not obtain written consent from students who used a Google educational tool that collected their voice and facial data.

Illinois’s requirement that companies receive written permission before gathering biometric data goes further than most states, which require only digital consent, such as checking a box accepting a company’s terms and conditions, something experts say is a largely symbolic gesture in practice.

“I’m not saying it’s better than nothing, but if you’re hanging these legal frameworks on a model of informed consent, it’s clearly ineffective,” said Michael Karanicolas, a legal scholar at Dalhousie University in Canada who studies digital privacy. “Nobody is reading these terms of service. Absolutely nobody can effectively engage with the permission we’re giving these companies in our surveillance economy.”

Karanicolas said Illinois’ biometric privacy law, passed in 2008, has real teeth because it allows individuals to sue companies, a provision privacy advocates say the tech industry has lobbied hard against. California and Washington state allow residents to sue in some types of cases.

But most of the laws, including those in Texas, Oregon, Virginia and Connecticut, rely on state attorneys general to enforce them. Advocates say allowing citizens to sue, what’s known as a “private right of action,” helps people fight back against data-guzzling companies.

“And that can lead to these big class-action settlements, and there are legitimate critiques of them, with class members often getting very little money, and lawyers getting rich, but they can be genuinely effective at shaping companies’ attitudes about personal information and generate corporate change,” Karanicolas said.

  • LibertyLizard@slrpnk.net · 5 days ago

    Do any of these laws ban the police or other government agencies from doing this? While I don’t love private companies having my face, that’s what really concerns me. Though obviously it is an interconnected issue.

    • paraphrand@lemmy.world · 5 days ago

      Banning private companies should keep police from having a source to buy from.

      But that’s not enough.

      • LibertyLizard@slrpnk.net · 5 days ago

        Yeah, that’s what I meant, but if the real issue is the police, they will just find a way to do it themselves with enough time. We need to get ahead of this. They are building a totalitarian state in front of our eyes.

  • Basic Glitch@sh.itjust.works (OP) · 5 days ago

    Advocates say allowing citizens to sue, what’s known as “a private right of action,” helps people fight back against data-guzzling companies.

    “And that can lead to these big class-action settlements, and there are legitimate critiques of them, with class members often getting very little money, and lawyers getting rich, but they can be genuinely effective at shaping companies’ attitudes about personal information and generate corporate change,” Karanicolas said.

    Me, being exploited every time I leave my house and barely able to pay my bills while a $10B Meta data factory the size of Manhattan is receiving corporate welfare and being built in my unregulated state:

    Even if there’s no big payoff for individuals, privacy is privacy and money is money. My preference would be no exploitation, but fighting the state and private corporations as an individual is a losing battle (especially right now). I’m very surprised there’s not already somebody trying to get the city or state to actually adopt a Facial Recognition tech ban.

  • xodoh74984@lemmy.world · 4 days ago

    It’s a step in the right direction, but reserving the right to sue companies that collect and share our most sensitive personal information and whereabouts is not enough. To them, a lawsuit is simply a cost of doing business, weighed against the potential for profit. This line of thinking is now taught in business schools.

    Nothing will change materially until the executives are faced with the potential of jail time.