Katie Couric, Rashad Robinson, and Chris Krebs say it’s time to pull immunity for social media platforms

Misinformation hit a crescendo during the pandemic, sowing distrust in COVID-19 vaccines and helping incite the riot at the Capitol. Now a coalition of experts on misinformation and disinformation is making a specific set of recommendations to lawmakers on how to fix the problem–and Big Tech might not be so happy. Most notably, the proposal calls for changes to Section 230, the controversial part of the 1996 Communications Decency Act that protects online platforms from being sued over user-generated content.

The Aspen Institute, a research center and think tank, brought together a who’s who commission of experts on disinformation to illuminate the problem and offer strategic steps to address it. The commission’s chairs are journalist Katie Couric; Rashad Robinson, civil rights leader and president of Color of Change; and Chris Krebs, the former director of the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency.

Spread via the internet, disinformation and its close cousin misinformation have contributed to a series of public harms over the past decade, including interference in the 2016 U.S. elections, disruption of pandemic-related public health efforts, fomentation of genocide in Myanmar, and the January 6th siege of the Capitol. Disinformation–intentionally misleading information–is engineered to go viral, exploiting social media algorithms that favor outrageous perspectives. Misinformation–false information with no clear intent to deceive–similarly keeps slipping past social media companies’ efforts to curtail it.

Last month, Facebook whistleblower Frances Haugen helped explain why those mitigation efforts fail. The former Facebook product manager and member of the company’s civic integrity group called out the social network for misleading the public about how much it actually does to protect users from harmful content.
“The thing I saw at Facebook, over and over again, was there were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen told 60 Minutes. “And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

For years, Facebook has ducked responsibility for content on its platform, assuring regulators and the public that it is doing its best to balance free speech with the need to rein in misinformation and speech that incites violence and hate. Haugen’s account suggests that is not the full picture. Social media companies have not yet been held to account, shielded by Section 230. Legislators have threatened to change the law (rhetoric that reached a fever pitch after the Capitol riot), but so far haven’t touched it.

The Aspen Institute’s Commission on Information Disorder Final Report suggests removing this immunity for content that advertisers have paid to promote, as well as for any content that has gone viral because of a platform’s recommendation algorithms. The commissioners also note that while free speech is a constitutional right, private platforms are not the public square, and companies have the right to restrict speech.

The commission’s recommendations are thorough, going much further than simply suggesting amendments to Section 230. The report faults the federal government for failing to understand the issues and create meaningful rules that protect the public. (“Congress…remains woefully under-informed about the titanic changes transforming modern life,” the authors write.) The commission also notes that despite Big Tech’s pleas to be regulated, industry leaders have “outsized influence in shaping legislative priorities favorable to its interests.” To guide future legislative efforts, the commission suggests the government force social media platforms to be more transparent through data audits.
One of the biggest hurdles to understanding both the effects of disinformation and the magnitude of the problem has been a lack of cooperation from the platforms themselves. Researchers often struggle to get the depth of information they need. (Facebook has been known to outright ban researchers who attempt to gather this information without the company’s explicit participation.)

The report says social platforms should be required to disclose “categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest.” It also says there should be federal protections for researchers and journalists who investigate social platforms in the public’s interest, even if they violate a platform’s terms and conditions in the process. It suggests that Congress require social media companies to publish transparency reports that include content, source accounts, reach, and impression data for posts that reach large audiences, and to offer regular disclosures of key data points about digital ads and paid posts that run on their platforms. And it calls for clear content moderation practices, as well as an archive of moderated content that researchers can access.

In addition to these transparency measures, the commission asks the federal government to establish a strategic approach to countering misinformation and disinformation, and to create an independent organization devoted to developing well-informed countermeasures. This could include efforts to educate the public about misinformation and how to distinguish fact from propaganda online.

Finally, the report calls for investment in local newsrooms and diversity measures, both in newsrooms and at social media companies. To support newsrooms, the report points to the creation of a digital advertising tax, much like the one Maryland passed.
The report says some of those proceeds should go toward struggling local newsrooms to bolster reputable reporting; it also suggests incentivizing donations to local news operations through tax credits. And it recommends that platforms hire diverse workforces to ensure a broad spectrum of experiences is considered when companies design rules and content mitigation strategies.

Rashad Robinson, president of Color of Change and one of the report’s commissioners, says the government could play a role here. “Diversity should be part of how the government evaluates these companies,” especially their efforts to protect users, he says. Robinson has worked for years on civil rights issues related to the web and has spent a fair amount of time talking to regulators. “These are recommendations that I fundamentally believe are actionable,” he says.

https://www.fastcompany.com/90696655/katie-couric-rashad-robinson-and-chris-krebs-say-its-time-to-pull-immunity-for-social-media-platforms?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created 3y | Nov 15, 2021, 11:21:32

