Swipe right on data privacy: What tech companies can learn about consent from dating apps

I’ve been in the data privacy field for over 15 years, but nothing could have prepared me for the complexities that AI has created for companies since it hit the mainstream in 2022.

Each week a new company is in the spotlight: from the backlash against Meta for using Facebook and Instagram users’ data for AI training, to Microsoft’s Recall feature (which the company was later forced to walk back), to Adobe users canceling their subscriptions over fears their artwork was being used to train generative models.

The conversation almost always comes down to whether users can opt out of handing their data over to AI systems. And if they do have the option, what does that experience look like? Is the notice of data scraping buried in pages of legalese in the Terms and Conditions? Do you have to spend hours going back and forth with a customer service rep? Is the opt-out mechanism riddled with dark patterns designed to trick or guilt users into opting in, or to dissuade them from even trying?

Data shows that nearly half of Americans blindly click “accept all cookies” when browsing the web. This isn’t something to celebrate. It demonstrates a failure by tech companies to explain what information they collect, why they collect it, and how it’s used.

The root of the problem is that, by default, consumers are almost always automatically opted in to new experiences and features, and to the data collection and use that comes with them. Even when given the option to opt out, the process is often too confusing or full of friction for anyone to bother. So consumers continue handing over their data, day after day, until the cup finally spills over into public backlash and damning headlines. This cycle isn’t just bad for consumer privacy rights; it’s bad for businesses and for the future of AI models.

In the data privacy world, we champion the concept of data minimization. It’s considered best practice to collect the least amount of data possible, and to inform consumers upfront about what we collect and how we use it. So why do tech companies continue to treat default opt-in as the norm?

The other day, I was reading an article in WIRED about Anthropic’s privacy policies. I learned that the company uses user prompts and outputs to train its model, Claude. But, an Anthropic spokesperson claims, they only use this data for training when “the user gives us express permission to do so, such as clicking a thumbs up or down signal on a specific Claude output to provide us feedback.” 

Reading this made me think back to my past role as chief privacy officer of the dating app Grindr, and leader of privacy programs at Match Group (parent company of Tinder and Hinge). It also inspired me to think about a new model for a default-free future. 

Mimicking dating-app gestures, like a swipe left or right or a simple thumbs up or thumbs down, at each point of data collection is a brilliant way to gather consent, especially when collecting data for model training. One consumer might be comfortable with their data being used to train one kind of model, like a model for disease detection, but not others. Another consumer might hand over all their health data for training, but withhold their creative writing. Yet another might “swipe left” on it all.

The reality is that cutting off access to all data is not a viable solution for any business, and AI models need data the most. Researchers have estimated that we could run out of fresh data for model training within the decade.

There are too many transformative use cases for AI in our world—from healthcare and finance to consumer technology and education—to let this happen. 

U.S. consumers are already aware of the risks of handing over their data. Eight in 10 adults state that the potential risks of companies collecting their data outweigh the benefits of using their products and services. But consumers can be open-minded when enterprises inform them how their data will be used.

For example, nearly half of adults said they were comfortable with data collection for purposes of national security, education, or genetic research, but not with personal data being used to monitor their mental state or aid criminal investigations. Simply being transparent goes a long way in getting users to consent to opt-in, or to even use digital products in the first place. 

Implementing point-in-time, easy-to-use controls like swiping gives users real agency to decide when to grant access to their data, based on what it is and how it will be used. Not only are these controls intuitive and scalable, but they align with consent best practices that reduce business risk. And they just might result in more, and better, data for training the AI models of the future.
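The mechanics are straightforward to sketch. Below is a minimal, purely illustrative Python model of a “default-free” consent ledger; every name here is hypothetical and does not reflect any company’s actual system. The key property is that each data-use purpose is gated on a recorded affirmative gesture, and the absence of any gesture means no use:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class Decision(Enum):
    """A dating-app-style gesture mapped to a consent decision."""
    SWIPE_RIGHT = "granted"   # affirmative, express opt-in
    SWIPE_LEFT = "denied"     # explicit opt-out


@dataclass
class ConsentLedger:
    """Records point-in-time, per-purpose consent decisions for one user."""
    decisions: dict = field(default_factory=dict)

    def record(self, purpose: str, decision: Decision) -> None:
        # Store the latest gesture with a timestamp for auditability.
        self.decisions[purpose] = (decision, datetime.now(timezone.utc))

    def may_use(self, purpose: str) -> bool:
        # Default-free: with no recorded gesture, the data may NOT be used.
        entry = self.decisions.get(purpose)
        return entry is not None and entry[0] is Decision.SWIPE_RIGHT


ledger = ConsentLedger()
ledger.record("disease_detection_training", Decision.SWIPE_RIGHT)
ledger.record("creative_writing_training", Decision.SWIPE_LEFT)

print(ledger.may_use("disease_detection_training"))  # True
print(ledger.may_use("creative_writing_training"))   # False
print(ledger.may_use("ad_targeting"))                # False: no gesture, no use
```

Note the inversion of the status quo: instead of one buried opt-out covering everything, consent is granular per purpose and defaults to denial, which is exactly the agency the swipe model is meant to deliver.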

Some businesses might argue that a swipe-to-consent model would result in less data collection. I disagree: an informed and enfranchised consumer is a loyal and engaged consumer.

For the future of AI, this is a model I’d swipe right on. 

https://www.fastcompany.com/91235779/swipe-right-on-data-privacy-what-tech-companies-can-learn-about-consent-from-dating-apps?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created 27 Nov 2024, 11:50:02

