Roe v. Wade is gone, but this is not 1973. In some ways, it’s worse.
When the Supreme Court ruled last week that banning abortion isn’t unconstitutional, abortion immediately became illegal in several states with “trigger laws” primed to take effect with just such a ruling. It’s about to become illegal in several more states in which previously passed laws restricting abortion had been blocked by federal courts.
A lot of people are about to lose access to safe, legal abortions, and those who provide abortion access or support will face consequences ranging from civil suits to arrest in some states. These are grim times for abortion access.
And the forecast is even grimmer because we now live in an era of unprecedented digital surveillance. I’ve spent most of my career helping to protect activists and journalists in authoritarian countries, where it is often wise to think several steps ahead about your digital privacy and security practices. Now we must bring this mindset back within our own borders for people providing abortion support and people seeking abortions.
The first step is operational security. Abortion providers, abortion support networks’ staff and volunteers, and abortion seekers must take steps immediately to thoroughly compartmentalize their work and health from the rest of their digital lives. That means using aliases, using separate phones and emails, downloading a privacy-protecting browser, and being very cautious about installing applications on personal phones.
For people who are pregnant, it is important to start with an understanding of the existing threats. People who have already been prosecuted for their pregnancy outcomes were surveilled and turned in by trusted people, including doctors. The corroborating evidence included Google search histories, texts, and emails. It is time to consider using Tor Browser for searches relating to pregnancy or abortion, using end-to-end encrypted messaging services with disappearing messages turned on for communications, and being very selective about who is trusted with information about their pregnancy.
It is also important to look to the future and reconsider the treasure troves of data we create about ourselves every day—which now may be weaponized against us. People who may become pregnant should rethink their use of period-tracking apps, which can collect data that may be subpoenaed if they are suspected of aborting a pregnancy. They may use an encrypted period-tracking app such as Euki, which stores all of the user information locally on the device, but beware that if the phone is seized by law enforcement, the information stored on it may still be readable. People who may become pregnant also should carefully review privacy settings on services they continue to use, and turn off location services on apps that don't absolutely need them.
But the biggest responsibility now lies with the tech industry. Governments and private actors know that intermediaries and apps often collect heaps of data about their users. If you build it, they will come—so don’t build it, don’t keep it, dismantle what you can, and keep it secure.
Companies should think about ways in which to allow anonymous access to their services. They should stop behavioral tracking, or at least make sure users affirmatively opt in first. They should strengthen data deletion policies so that data is deleted regularly; avoid logging IP addresses, or if they must log them for anti-abuse or statistics, do so in separate files that they can aggregate and delete frequently. They should reject user-hostile measures like browser fingerprinting. Data should be encrypted in transit, and end-to-end message encryption should be enabled by default. They should be prepared to stand up for their users when someone comes demanding the data, and at the very least, ensure that users get notice when their data is being sought.
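For engineers wondering what that looks like in practice, here is a minimal sketch of one common approach: truncate IP addresses before they are ever written to disk, and keep what remains in a separate log that can be rotated and deleted on a short schedule. The file names, logger names, and mask sizes below are illustrative assumptions, not any particular company's policy.

```python
# A minimal sketch of logging less: truncate IP addresses before they are
# persisted, and keep the result in a separate, short-lived log file.
# Names and mask sizes are illustrative assumptions.
import ipaddress
import logging

# Separate logger for anti-abuse/statistics data, so it can be rotated
# and deleted frequently, apart from application logs.
abuse_log = logging.getLogger("abuse-stats")
abuse_log.addHandler(logging.FileHandler("abuse-stats.log"))
abuse_log.setLevel(logging.INFO)

def truncate_ip(raw_ip: str) -> str:
    """Zero out the host portion of an address (IPv4 /24, IPv6 /48)."""
    addr = ipaddress.ip_address(raw_ip)
    prefix = 24 if addr.version == 4 else 48
    network = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(network.network_address)

def record_request(raw_ip: str, endpoint: str) -> None:
    # Only the truncated address is ever written to disk.
    abuse_log.info("%s %s", truncate_ip(raw_ip), endpoint)

# Documentation-range addresses used purely as examples:
record_request("203.0.113.77", "/search")      # stored as 203.0.113.0 /search
record_request("2001:db8::abcd:1", "/search")  # stored as 2001:db8:: /search
```

Truncated addresses are still useful for spotting abusive traffic patterns and rough usage statistics, but they no longer point to a single person if they are later demanded in a subpoena.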
There’s no time to lose. If there is one thing I have learned from a decade and a half of working with vulnerable populations in authoritarian countries, it is that when things start to go wrong, they get worse very quickly. If tech companies don’t want to have their data turned into a dragnet against people seeking abortions and people providing abortion support, they need to take these concrete steps right now.
Leaving frightened people on their own to figure out their digital security in a world where it is hard to understand what data they're creating and who has access to it is not an option. Tech companies are in a unique position to understand those data flows and to change the defaults to protect the privacy rights of this newly vulnerable class of users.
The Supreme Court rolled back rights by half a century on Friday, but now is not the time to shrug and say it's too late and nothing can be done. Now is the time to ask hard questions at work. You hold the world's data in your hands and you are about to be asked to use it to be Repression's Little Helper. Don't do it.
While others work to restore rights that were so callously stripped away, good data practices can help tech companies to avoid being on the wrong side of history.
Eva Galperin is the director of cybersecurity at the Electronic Frontier Foundation.