That selfie you posted on Instagram? Companies are using it in unethical ways | Opinion
Right now, we do not own our biometric information. We should.
Biometrics are the physical features that differentiate us — like our fingerprints, eyes and facial structure. And companies that build facial recognition tools can now gather photos of us — like that selfie you just posted on Instagram — and sell that information to a company that could then use it to track, profile or manipulate you.
This sounds like science fiction, but it’s quite real. One high-profile recent example is Madison Square Garden, which used the technology to identify people employed by law firms it didn’t like and then banned them from events — even though they had tickets. A second key example is voice cloning technology, which allowed scammers to imitate the voice of a grandson to defraud his grandparents. In neither case was the technology used illegally. This needs to change through new federal regulation.
While concert venues use security cameras to keep us safe, we do not expect those cameras to use facial recognition technology to control us, advertise to us, manipulate us, or blackmail us.
Think of the damage a person could do with this technology simply by recording people outside an abortion clinic, mosque or synagogue. Patients at cancer clinics could become targets for advertising, blackmail, bullying or scams. Even now, our real-time whereabouts within the United States could be sold to hostile foreign governments or terrorist groups.
In other countries, facial recognition has been used to bully citizens into conformity or to discriminate, and to spy and retaliate against protesters, like women in Iran. Widespread facial recognition could decimate the witness protection program since witnesses could be identified by biometrics.
To address all of this, American citizens need a new reasonable right to biometric privacy.
It is reasonable to not want our every step in a public space tracked and recorded in a database next to our names. It is reasonable to not want our likenesses stolen to scam our family members. It is reasonable to not expect that our pictures on social media can be used to train private facial recognition algorithms without our permission. It is reasonable to not be coerced into selling access to our biometrics for use in future applications that we may not comprehend now.
These activities should all be made illegal, through the granting of a reasonable right to biometric privacy.
Companies that use biometric information should have to obtain a license or similar certification. There should be guidelines for each domain to address where training data comes from, and permission to use it. There should also be guidelines for how that data will be stored and protected, what data should be allowed for what purposes, who is permitted to grant the use of their own biometrics or sell them, and how the models should be tested before use and monitored during use, considering ethics and fairness. And there should be a whole set of rules protecting the biometrics of children and those unable to meaningfully grant consent.
This is all daunting, but could be handled by a new federal agency governing artificial intelligence. It would be analogous to the Department of Transportation in that it would improve safety for the use of AI-based technologies, just as the DOT massively improved road safety after it was founded.
This new AI regulatory entity would work to reduce the algorithmic circulation of misinformation, deal with intellectual property issues stemming from copyrighted material being used to train generative AI, and reduce the massive-scale experimentation being conducted on us by Big Tech. It would also protect our biometrics by helping to enact a reasonable right to biometric privacy.
We are already seeing the dawn of an era where we can be controlled and manipulated like never before, starting with access control to concert venues like Madison Square Garden. Without a reasonable right to biometric privacy, these unethical uses of biometrics are likely to spread like wildfire. It is time to stop that now.
Cynthia Rudin is a computer science professor and artificial intelligence scholar at Duke University. Lance Browne is a student at East Chapel Hill High School and an intern.