Shadowbanning on Facebook: As a Black transgender woman, why am I invisible?

As a Black transgender woman, it is hard to imagine my life without social media. It was essential for me to live my truth. It provided connection with others like me, near and far. It remains a space where people who are physically isolated can find meaning by linking with a wider community. Right now, queer youth, especially trans youth, need these vital bonds, which are often easier to form virtually. But in their current, unchecked incarnations, these platforms—especially Facebook and, increasingly, Instagram and YouTube—are becoming dangerous places for trans people. That’s because there is a mounting right-wing takeover of these vehicles and what appears to be a corresponding crackdown, whereby algorithms circumscribe the reach of trans-friendly content. The deluge of posts and messages attacking the humanity of transgender people puts us at risk, especially the youngest members of our community. According to the Trevor Project, negative online environments can contribute to higher suicide rates among LGBTQ+ youth.

This forces us to weigh whether the good derived from social media companies is being undone by the harm. The answer has implications not only for the people who run these companies, their shareholders, and policymakers in Washington, D.C., and elsewhere, but also for society at large. Our increasing reliance on the digital tools pioneered by social media companies—for everything from securing mortgages to ordering groceries—is eroding the fundamental humanity of everyone, regardless of gender identity. Algorithms are amping up our worst instincts and threatening to roll back 50 years of human rights progress in the United States.

Before detailing why, I want to be clear that this isn’t a social media “hateration fest.” It’s more complicated and nuanced than that. The reason: I would not be who I am if not for the trans people and organizations I encountered on social media while coming to terms with my gender identity. I grew up in a society hostile to people like me, and I was afraid of myself. That fear kept me from unfolding, kept me hidden. But my therapist finally coaxed me into beginning my gender journey right around the time social media exploded. Following trans people like Janet Mock, as well as organizations like the Transgender Law Center and Callen-Lorde Community Health Center, removed barriers within me and helped me become me. In fact, seeing Janet on television a decade ago added to my sense of possibility. The most basic point is that they showed me I was not alone and connected me to ways I could realize myself.

I founded an organization, TransLash Media, to center the humanity of trans people through journalism and nonfiction narratives, and it would not be possible for us to build our audience without social media. TransLash reaches transgender people, and those who wish to learn more about us, through these platforms. Our videos, documentaries, podcasts, and zines would never have made it through the stringent filters and closed doors of traditional media organizations. Like so many startups and new voices, we depend on these outlets to reach the people who most need to hear from us.

The same is true for countless other trans people and organizations in the U.S. and worldwide: Connecting through social media networks is our lifeline. That’s why the slow, steady erosion of trans people’s access to those platforms is so alarming.
As much as social media is essential for trans communities to connect, it is, sadly, even more useful to those spreading anti-trans hate. In fact, according to the think tank Media Matters, two out of every three Facebook posts about trans issues in 2021 originated with right-wing sources. Not only were the majority of posts clearly anti-trans, but they were far more likely to receive engagement than trans-friendly or even trans-neutral sources: up to 16 times more likely, to be exact. (A Meta spokesperson declined my request for comment.) We’re not talking about run-of-the-mill conservative points of view, but voices like Breitbart, which cheered last year’s insurrection, or Ben Shapiro, who calls being trans “a psychological disorder.” This is extremism.

There is little about the volume of these posts, or the levels of engagement they receive, that is truly organic. The underlying algorithms are actively fueling “extremism and hate,” as the LGBTQ+ media-monitoring group GLAAD points out. Anger and ridicule attract more eyeballs, which algorithms then amplify and spread with devastating efficiency. And the more these platforms bolster anti-trans hate, the more trans voices are outshouted and weakened.

Which brings us back to allegations of shadowbanning. Activist and columnist Ashlee Marie Preston, who is Black and trans, recently called out Instagram for a noticeable drop in her engagement after Meta instituted a “sensitive content” filter. The Algorithmic Justice League, an advocacy organization formed by tech-industry scholars and former employees, called it an example of “algorithmic bias.” At TransLash, we have watched our engagement on Facebook plummet in the last year, even as we reach record engagement on nearly every other social media platform. To date, no one at the company has been able to give us a clear answer as to why this keeps happening.

This tyranny of algorithms is being unleashed at the worst possible time for the trans community. As I detail in my investigative podcast series, The Anti-Trans Hate Machine: A Plot Against Equality, at least 127 anti-trans bills were introduced across nearly 40 states last year. Even more disturbing, 2021 was the deadliest year on record for trans people.

With AI and machine learning fueling human bigotry, what should we do? The good news is that promising options are emerging. As Safiya Umoja Noble lays out in her groundbreaking book, Algorithms of Oppression, we must acknowledge first and foremost that social media platforms rely on “machine learning where bias is both embedded and then amplified.” Timnit Gebru, a former co-lead of Google’s Ethical AI team, is launching the Distributed Artificial Intelligence Research Institute to chart new pathways in removing harmful algorithms. Her group, Black in AI, has already been hard at work.

With all of this intelligent brainwork on how to keep social media platforms free and open, the question is: Why haven’t these ever-more-hostile spaces already been transformed? The glib and easy answer is that there is too much profit in the way things are; but I suspect they haven’t changed for darker, more disturbing reasons. The bigotry and anger being spewed are perhaps not marginal but fundamental to our current society. On some collective level, we are comfortable with transphobia, racism, homophobia, and misogyny not as things of the past but as guideposts for our future—especially as algorithms grow in importance.
If this is the case, then understanding how we arrived here matters far more than swapping out ignorant coders, engineers, and CEOs for theoretically better people. What we need to do instead is more wide-ranging and existential: nothing less than a close examination of the human culture that nurtures extremist elements at the expense of vulnerable people who are simply searching for a better life.

Imara Jones is an entrepreneur, activist, and journalist whose work has won Emmy and Peabody Awards. She is the creator of TransLash Media, a cross-platform journalism, personal storytelling, and narrative project that produces content to shift the current culture of hostility toward transgender people in the U.S.

https://www.fastcompany.com/90721395/shadowbanning-on-facebook-as-a-black-transgender-woman-why-am-i-invisible?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Published February 23, 2022, at 11:21:22

