It’s a pattern we’ve seen too many times: Real-world violence migrates online, and after protracted denial, the platforms that profit from abuse promise to save us. Their “solutions” are convenient, and conveniently profitable, but they only make matters worse. The latest example in the news is particularly wrenching: Tinder’s flailing effort to address the all-too-real threat of intimate partner violence with unproven, error-prone background checks. Not only will the company’s new surveillance software fail to keep users safe, it will put even more of us at risk.

It’s hard to capture every way that Tinder’s plan is poised to fail. First, background checks don’t work, not at telling you who is likely to abuse. Background checks are great at telling you about drug use and financial difficulties when users are Black, Brown, or from other over-policed communities. Want to know if someone has a history of marijuana use? Tinder is here to help. But if you want to know whether someone is likely to commit an act of intimate partner violence, suddenly the data doesn’t look so good.

That’s because the vast majority of abusers are never charged for their violence, and when they are, it fits the same pattern of discrimination that defines every dimension of American policing. Not only are white abusers more likely to go free, but BIPOC survivors are often arrested alongside their abusers. So if you are a wealthy white abuser, statistically, Tinder will give you a green light nearly every time.

Even worse, using these faulty checks re-victimizes survivors of intimate partner violence, placing the burden of preventing attacks on them. Soon, users who don’t run the free trial check, or who exhaust the trial and are unable to pay, could be blamed for failing to predict their own attack. This will become the latest justification for police, university officials, and others in positions of power to ignore survivors, silence their complaints, and deny them support. AI may be ineffective at preventing crime, but it is very effective at preserving the status quo.

Background checks ignore the lived experience of survivors, buying into the outdated narrative that the cycle of abuse could somehow have been stopped if only people had known from the start that their partner had a criminal history. Just as damning, the system relies on the broken logic of broken windows policing, and on the belief that someone convicted of a crime should be completely exiled from society, their whole life discarded. Tinder’s digital chivalry is just as antiquated as the analog patriarchs of the past.

These performative protections ignore the reality that Tinder is built on a business model that puts survivors at risk. Tinder and other dating apps are fueled by our most intimate data and most intimate moments. That information is a constant stream of revenue, sold to advertisers and anyone else willing to pay. And that data is increasingly accessible, not just to those who want to track us around the internet with ads, but to abusers who wish to track us to our businesses and homes. Tinder can’t claim credit for keeping users safe when it continues to sell the very data that puts them at risk.

For just a few hundred dollars, anyone could purchase datasets that include Tinder users’ location history. They could reconstruct users’ movements and romantic lives . . . and worse. A niche market for the tech-savvy stalker. But the threat to Tinder users doesn’t come only from those who break the law; it also comes from those who claim to enforce it.
Increasingly, law enforcement agencies are purchasing the very same location data as advertisers. It’s scary enough when your local police department can buy a record of every place you’ve been, but it’s outright terrifying when ICE can do the same.

If Tinder truly wanted to protect its users, it wouldn’t invest in this new, misguided form of user surveillance; it would stop enabling surveillance of its users. And if it wanted to help those targeted by abusers both on and off its platform, it would invest in the countless community-based groups that provide survivors what they need most: low-tech resources like a safe place to sleep when escaping an abuser.

Yes, technology has made this problem worse, putting people in harm’s way, but the solution isn’t more unproven, discriminatory technology. Instead, the solution is to listen to survivors. Real protection prioritizes their expressed needs, like supporting financial independence, data security, and the groups that fight for and with survivors of intimate partner violence every day.

Albert Fox Cahn is the founder and executive director of the Surveillance Technology Oversight Project (S.T.O.P.), a New York-based civil rights and privacy group, and a visiting fellow at Yale Law School’s Information Society Project. Sarah Roth is an advocacy and communications intern at S.T.O.P., a recent graduate of Vassar College, and a prospective JD candidate.