A week ago, Democratic Senator Elizabeth Warren signed on as a cosponsor of the Kids Online Safety Act (KOSA), which has a good chance of becoming law. The bill aims to address the youth mental health crisis that many attribute to the use of social media.
Among other proposed measures, KOSA’s duty of care would require social media platforms to design their products and services to prevent and mitigate certain harms to minors, such as mental health disorders, physical violence, online bullying, and harassment of the minor.
Several opponents have highlighted how KOSA would force platforms to censor perfectly legal content. But less attention has been paid to another of KOSA's major flaws: its illusory faith in technology's power to act upon and change society.
The issue with this underlying belief is that it gives little consideration to the complex, sociotechnical character of the problems this legislative intervention is intended to solve. Over the years, social-psychology research has shown that many of the social harms associated with the internet can be traced back to offline social dynamics. As Sonia Livingstone notes in her research article "The Rise and Fall of Screen Time," digital technologies might be salient, but they are only one of a number of variables affecting children's welfare and life chances. Circumstances such as loneliness, abusive parents, poverty, and social inequality have also been shown to be significant. And while parents may have some control over certain of these factors, others have to do with biology, socioeconomic status, and structural disadvantages.
Additionally, policymakers' illusory faith in technology overlooks the fact that the effects of technological fixes are not guaranteed. Research has long suggested that while designers and engineers can influence future users by embedding certain constraints into technological products, users "decode" the technology in their hands, determining its specific uses and social meanings. Legislators can therefore ask social media companies to encode certain values in design, but they cannot mandate the outcomes. Design cannot ensure that children are ultimately included by their classmates or that information about the Israel-Gaza conflict doesn't cause distress.
Third, as extensive research on youth online risk has shown, digital technologies affect different individuals differently, and their impact often depends on context. A social media post from a fitness influencer may inspire one adolescent to adopt healthy habits while triggering anxiety in another. The same goes for content published by news services about climate change. As Wilbur Schramm and his colleagues made clear in 1961 in the case of television, "[T]he relationship is always between a kind of television and a kind of child in a kind of situation."
To be clear, social media platforms are not free of responsibility here, and technological design does have the power to influence certain behaviors. It is by now well established that social media platforms use manipulative design to (among other things) extract consent and boost engagement. Legislators should require tech companies to refrain from using collected personal data to exploit users' decisional vulnerabilities and from designing interfaces that promote social media addiction. But design interventions have limits when the social problems they target are multifactored.
Congress must pass legislation that will be truly effective in addressing the youth mental health crisis, which has more social and biological than technological roots. This means, for example, supporting ongoing initiatives to expand the pipeline of mental health professionals and remedy the current shortage of counselors, psychologists, social workers, and therapists; promoting community and support-group initiatives to reduce social isolation and loneliness among young people; and putting in place youth opportunity programs to help children and adolescents growing up in poverty. The Biden-Harris administration's proposed comprehensive national strategy and its recent announcement that more than $200 million will be invested in youth mental health are steps in the right direction.
Anxiety, depression, online bullying, and the other harms that KOSA's duty of care expects social media platforms to prevent through design are not neatly defined problems with computable solutions. They are complex, contextual, and in many cases deeply social. That is precisely why parents, educators, and healthcare professionals have long struggled to remedy them. What makes legislators believe that technological fixes will succeed where they have not?