By: Attorney General Charity Clark
The Legislature is taking up a comprehensive data privacy law – one that gives you the power to sue if your data has been mishandled in violation of that law – right in the nick of time.
Facial recognition can let strangers identify us on the street. Geolocation tracking apps and spyware can map our physical location. Artificial intelligence (AI) tools create “deepfake” videos of political leaders and celebrities that are indistinguishable from reality – and have already been used to create nonconsensual pornography. Social media algorithms collect data that companies sell and use to addict children to their screens. Data brokers sell our information, including predictions about our interests, our personalities, and our vulnerabilities, to anyone willing to pay for it. And even companies that do prioritize data privacy are vulnerable to data security breaches and the criminal enterprises that steal what they cannot buy.
I am pleased to see strong legislative leadership prioritizing passage of a comprehensive data privacy law. For years, I have advocated for a law that includes data minimization provisions, requires data brokers to offer consumers an “opt out” from having their data bought and sold, and establishes a Biometric Data Privacy Act (BIPA) requiring notice and consent for the collection and use of biometric data, such as the facial scans used in facial recognition.
Of these provisions, the industry collecting our data has been universally critical of one key component: your right to sue when your own biometric information has been mishandled in violation of a BIPA. Curiously thin on reasons for their opposition, industry has tried to turn this so-called “private right of action” into a boogeyman, spooking legislators with warnings that a bill will struggle to pass if it is included. So far, the only reason I have heard for their dislike is simple: it will cost industry more money. But without a private right of action, only the Attorney General’s Office will have the authority to sue for violations of a BIPA. In other words, industry wishes to transfer its costs of violating a BIPA to you, the taxpayer.
Beyond data minimization, a BIPA, and a data broker opt-out, there are additional protections the Legislature could consider now or in the future. Artificial intelligence, for example, is a frontier with tremendous opportunity – and danger. Without appropriate regulation, use of AI by criminal actors – for example, scammers – could have a destabilizing impact on the online economy. Just as scam robocalls destabilized the buying and selling of goods and services over the phone, a lack of consumer trust could have a chilling effect on online purchases. And while AI presents a potential harm to commerce, one particular abuse of AI – the use of deepfakes to make nonconsensual pornography – is already far more prevalent. I urge the Legislature to prioritize addressing this problem as well.
As Attorney General, I have made data privacy a priority, especially for children. We must align our data privacy laws with Vermont’s values of privacy and personal freedom. Legislators, stakeholders in the business community, and data privacy advocates must work cooperatively to craft law and policy that is effective while leaving businesses free to develop and grow.