Katie Couric, Rashad Robinson, and Chris Krebs say it’s time to pull immunity for social media platforms

Misinformation hit a crescendo during the pandemic, sowing distrust in COVID-19 vaccines and inciting riots at the Capitol. Now a coalition of experts on misinformation and disinformation is making a specific set of recommendations to lawmakers on how to fix the problem, and Big Tech might not be so happy. Most notably, the proposal calls for changes to Section 230, the controversial part of the 1996 Communications Decency Act that protects online platforms from being sued over user-generated content.

The Aspen Institute, a research center and think tank, brought together a who's-who commission of experts on disinformation to illuminate the problem and offer strategic steps to address it. The commission's chairs include journalist Katie Couric; civil rights leader and Color of Change president Rashad Robinson; and Chris Krebs, the former director of the Department of Homeland Security's Cybersecurity and Infrastructure Security Agency.

Spread via the internet, disinformation and its close cousin misinformation have contributed to a series of public harms over the past decade, including interference in the 2016 U.S. elections, disruption of pandemic-related public health efforts, fomentation of genocide in Myanmar, and the January 6 siege of the Capitol. Disinformation, intentionally misleading information, is engineered to go viral, exploiting social media algorithms that favor outrageous perspectives. Misinformation, false information with no clear intent to deceive, similarly keeps slipping past social media companies' efforts to curtail it.

Last month, Facebook whistleblower Frances Haugen helped explain why those mitigation strategies fail. The former Facebook product manager and member of the company's civic integrity group called out the social network for misleading the public about how much it actually does to protect users from harmful content. "The thing I saw at Facebook, over and over again, was there were conflicts of interest between what was good for the public and what was good for Facebook," Haugen told 60 Minutes. "And Facebook, over and over again, chose to optimize for its own interests, like making more money."

For years, Facebook has ducked responsibility for content on its platform, assuring regulators and the public that it is doing its best to balance free speech with reining in misinformation and speech that incites violence and hate. Haugen's account suggests that is not the full picture. Yet social media companies have not been held to account, shielded by Section 230. Legislators have threatened to change the law (rhetoric that reached a fever pitch after the Capitol riots), but so far haven't touched it.

The final report of the Aspen Institute's Commission on Information Disorder suggests removing this immunity for content that advertisers have paid to promote, as well as for any content that has gone viral because of a platform's recommendation algorithms. The commissioners also note that while free speech is a constitutional right, private platforms are not the public square, and companies have the right to restrict speech.

The commission's recommendations are thorough, going much further than amendments to Section 230. The report faults the federal government for failing to understand the issues and create meaningful rules that protect the public. ("Congress…remains woefully under-informed about the titanic changes transforming modern life," the authors write.)
The commission also notes that despite Big Tech's pleas to be regulated, industry leaders have "outsized influence in shaping legislative priorities favorable to its interests." To guide future legislative efforts, the commission suggests the government force social media platforms to be more transparent through data audits.

One of the biggest hurdles to understanding both the effects of disinformation and the magnitude of the problem has been a lack of cooperation from the platforms themselves. Researchers often struggle to get the depth of information they need. (Facebook has been known to outright ban researchers who attempt to gather this information without the company's explicit participation.)

The report says social platforms should be required to disclose "categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest." It also calls for federal protections for researchers and journalists who investigate social platforms in the public's interest, even if they violate a platform's terms and conditions in the process.

The report further suggests that Congress require social media companies to publish transparency reports that include content, source accounts, reach, and impression data for posts that reach large audiences, and to offer regular disclosures of key data points about digital ads and paid posts that run on their platforms. And it calls for clear content moderation practices, as well as an archive of moderated content that researchers can access.

In addition to these transparency measures, the commission asks the federal government to establish a strategic approach to countering misinformation and disinformation, and to create an independent organization devoted to developing well-informed countermeasures. This could include efforts to educate the public on misinformation and how to discern fact from propaganda online.

Finally, the report calls for investment in local newsrooms and in diversity measures, both in newsrooms and at social media companies. To support newsrooms, it points to the creation of a digital advertising tax, much like the one Maryland passed, and says some of the proceeds should go toward struggling local newsrooms to bolster reputable reporting. It also suggests incentivizing donations to local news operations through tax credits.

The report also recommends that platforms hire diverse workforces to ensure that a broad spectrum of experiences is considered when companies design rules and content mitigation strategies. Robinson, who co-chaired the commission, says the government could play a role here: "Diversity should be part of how the government evaluates these companies," especially their efforts to protect users, he says. Robinson has worked for years on civil rights issues related to the web and has spent a fair amount of time talking to regulators. "These are recommendations that I fundamentally believe are actionable," he says.

https://www.fastcompany.com/90696655/katie-couric-rashad-robinson-and-chris-krebs-say-its-time-to-pull-immunity-for-social-media-platforms?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created 3y | Nov 15, 2021, 11:21:32

