Kashmir Hill on the End of Privacy
In this intriguing discussion, Kashmir Hill, a New York Times reporter and author of 'Your Face Belongs to Us,' reveals the startling implications of Clearview AI’s facial recognition technology. She shares insights from her groundbreaking investigation into this secretive startup, which can identify anyone from a simple photo. Topics include the ethical dilemmas of surveillance, the chilling potential for wrongful arrests, and the personal vendettas that tech like this can enable. Hill emphasizes the urgent need for stronger privacy regulations as society navigates this new digital landscape.
Highlights
Episode AI notes
- Clearview AI took risks that big companies like Google and Facebook were not willing to take, giving them a competitive advantage in developing facial recognition technology.
- Facial recognition technology poses significant risks to privacy and can disproportionately affect marginalized communities.
- Clearview AI pushes responsibility onto police departments to use their technology, but ethical concerns regarding the use of a massive database of individuals need to be addressed.
- Big tech companies build surveillance networks for targeted advertising, but there is debate about their effectiveness, and Facebook is not listening to users through their iPhones.
- The sources of growth for Clearview AI as a private company are predicated on crawling websites like Flickr to identify individuals, and filing a DMCA request is the fastest effective way to remove content from the internet. Time 0:00:00
Clearview AI: The Radical Startup That Reorganized the Internet
Summary:
Clearview AI is a company that stands out because it is willing to do what big companies like Google and Facebook have not: use facial recognition technology openly.
Unlike Google and Facebook, Clearview AI started as a startup with a bold vision and nothing to lose, making them a regulatory entrepreneur. They have created a database organized by faces, aiming to reorganize the internet.
Their goal is to make their database as extensive as possible before competitors catch up.
Transcript:
Speaker 2
Talk about Clearview AI itself, because the big companies have kind of had this capability for a while and, to their credit, they haven't really done much with it. You know, Google Photos will do some face matching inside of Google Photos, but that's not public as far as we know. Facebook can obviously do it, but they keep that inside of Facebook. Clearview is just like, we're doing it. We take a bunch of data and we're doing it, and now the cops can look at your face. Why is this company different? How did it start?
Speaker 1
I think this is really surprising to people, and it's something that's in the book: Google and Facebook both developed this ability internally and decided not to release it. And these are not companies that are traditionally that conservative when it comes to private information. I mean, Google's the company that sent cars all over the world to put pictures of our homes on the internet. What was different about Clearview AI is that they were a startup with nothing to lose and everything to gain by doing something radical, doing something that other companies weren't willing to do. And I put them in the same category of regulatory entrepreneur as an Uber or an Airbnb. This was their differentiator. They said, we're going to make this database and we're going to reorganize the internet by face. And that's our competitive advantage. And we want to make our database as big as we can before anyone else can catch up to us. Time 0:08:31 -
The Risks of Facial Recognition Technology and its Impact on Privacy and Marginalized Communities
Summary:
Kashmir Hill discusses the risks of facial recognition technology in relation to privacy and law enforcement.
She points out the trade-offs involved, where law enforcement uses rhetoric to justify its use, framing questions about constitutional rights and privacy as morally wrong to ask.
The technology has a known error rate, and law enforcement's use of it cannot be independently audited; based on historical and statistical evidence, it is likely to have a disproportionate impact on marginalized communities.
Transcript:
Speaker 2
Welcome back. We're talking with Kashmir Hill about the big risks facial recognition poses to privacy. So this runs right into the trade-offs of all technology that is used by law enforcement. It seems like there's a battering ram of rhetoric when it comes to why law enforcement is using it. Like, you say we're catching pedophiles, and thus no more questions should be asked. Whenever I hear that, the red flags go off for me. You're trying to prevent me from asking questions about the Fourth and Fifth Amendments. You're trying to prevent me from asking questions about privacy by making them seem morally wrong to ask. But there's a part of me that says, look, the technology definitely has an error rate. I don't know what the cops are doing. I can't audit their use of it. When they do rely on technology like this, history and statistics suggest that they will have a disproportionate impact on marginalized communities. Time 0:27:12 -
Clearview AI and the Ethical Implications of Facial Recognition Technology
Summary:
Clearview AI pushes responsibility to police departments, stating they only provide the technology.
Arrests should not be made based on a Clearview match alone; more investigation is needed. Society needs to evaluate the implications.
While Clearview has helped solve crimes, the ethical concerns regarding a massive database of individuals need to be addressed.
Should everyone be subject to facial recognition technology for any crime, and what are the rules surrounding its use?
Transcript:
Speaker 1
Clearview definitely kind of pushes that onus to police departments, saying, we're just providing the technology for them to use. They should never arrest somebody based on a Clearview match alone, and they need to do more investigating. I think for us as a society there's just a lot to evaluate here. I've talked to a lot of officers who, yeah, they've solved crimes with Clearview AI as a starting point. Horrific things, you know, abuse of children. But I think we need to ask ourselves: are we comfortable with this database of probably hundreds of millions of people? Probably you and me. Should we all be in the lineup every time the police are trying to solve a crime, whether it's shoplifting or murder? If they are going to use facial recognition technology, what are the rules? Time 0:28:16 -
Big Tech's Ubiquitous Surveillance Networks and Targeted Advertising
Summary:
Big tech companies build surveillance networks for targeted advertising, but there is debate on its effectiveness.
People believe Facebook is listening to them through their phones, but that would be illegal and is impractical; instead, a digital fingerprint assembled from their activity is used to target ads.
Nihilism comes into play as people accept invasion for the convenience of their phones.
Transcript:
Speaker 2
Well, put that in practice for me. I've read a lot of your reporting. A lot of your reporting is about how the big tech companies build these ubiquitous surveillance networks, mostly to put advertising in front of us. At the end of it all, they're just trying to sell us some paper towels, right? Faster than ever before, and there are billions of dollars in between me and the paper towels, but that's what it's for, right? It's very targeted advertising, and there's some debate about whether it's even effective, which I think is very funny, but that's what it's largely for. And I go out, you know, I see my family, I listen to our readers, and they're like, Facebook is listening to us on our iPhones. And they won't believe me that that's probably not happening, that there's this other very complicated multi-billion-dollar mechanism that just makes it seem like Facebook is listening.
Speaker 3
I mean it would be very illegal.
Speaker 1
It would be very illegal if they were.
Speaker 3
It would be illegal, and also it feels like it would be much harder to light up your microphone all the time and listen to you than to just assemble the digital fingerprint
Speaker 2
That they've managed to assemble, and show you the ads for the vacation I was talking about. And you know, you can explain it, but then people just fall back on, well, Facebook is just listening to me on my phone, and I still have a phone, and it's fine. And that's the nihilism, right? That's where the nihilism comes into play: even when people assume that one of the most invasive things that can happen is happening, they're like, well, my phone is so useful, I definitely need to keep letting Facebook listen to me. Time 0:32:59 -
Sources of Growth for a Private Company
Summary:
Because Clearview AI is a private company rather than a government actor, its data sources cannot be confirmed through public records requests.
They target sites likely to contain faces, crawling websites like Flickr and finding photos that individuals may be unaware of. The fastest effective way to remove content from the internet is to file a DMCA request.
Transcript:
Speaker 2
Do we know what the sources are of that growth? Is it still the public internet or are they signing deals? How's that working?
Speaker 1
You know, unfortunately they're not a government actor, they're a private company, so I can't send them a public records request and find out what all their sources are. So I mostly see it when I see an example of a search, whether they run it on me or I see it show up in a police investigation. But yeah, it seems pretty wide out there. I mean, news sites and the like, they seem to be pretty good at targeting places that are likely to have faces. In one of my last meetings with Hoan Ton-That before I was done with the book, they had just crawled Flickr, and he himself was finding all these photos of himself when he was a kid, like a baby coder in Australia. He said, it's a time machine, we invented it. And he did a search on me, and it showed photos I didn't know were on Flickr, that one of my sister's friends took. It was me at a point in my life when I was depressed, I was heavier, I weighed more. I don't put photos from that time on the internet, but there they were; Clearview had them.
Speaker 2
You know, we have a joke on The Verge staff that the only functional regulation of the internet is copyright law: if you want something to come down off the internet, your fastest way to doing it is to file a DMCA request. Time 0:37:08