They described one another as grifters, prima donnas, and clowns. They lied, reflexively and clumsily, in pursuit of money and relevance. Through the power of social media, they would change the course of American political history.
In the aftermath of January 6, reporters and investigators would focus on intelligence failures, White House intrigue, and well-organized columns of white nationalists. Those things were all real. But a fourth factor came into play: influencers.
Donald Trump was, in a way, the ultimate right-wing influencer, skilled at gaming his platform of choice, Twitter, to bend the news cycle to his will. Behind him came a caravan of crazies, hoping to influence their own way to stardom. For many of them, the platform of choice was Facebook.
The Stop the Steal Facebook group was born out of a mid-Election Day chat between a mother-daughter duo named Amy and Kylie Kremer, conservative activists with a history of feuding with rival Tea Party groups over fundraising. Kylie formed the group with a few clicks, choosing a name meant to squat on the #StopTheSteal hashtag already trending on Twitter and Facebook.
In later interviews with the special January 6 Committee set up by the House to investigate the events of that fateful day, neither woman could explain why their group took off compared to other similarly named entities. The fact that the Kremers created it via an already-sizable Facebook page with verified status couldn’t have hurt. Within a few hours of its creation, Stop the Steal’s membership had expanded into the hundreds of thousands, with both everyday users and vocal online activists flocking to it like moths to a porch light.
“I think it was growing by 1,000 people every 10 seconds, which kind of broke a lot of algorithms,” Kylie Kremer told investigators.
Facebook took it down, but that didn’t mean the movement, and its attendant influencers, stopped gaining ground. Things got messy almost immediately, as personalities vying for online attention through outrageous words and behavior clashed. Ali Alexander, an ex-felon and far-right conspiracy theorist, set up a fundraising website and an LLC with the “Stop the Steal” name, soliciting donations that the Kremers and other activists would later accuse him of pocketing. Brandon Straka, a New York City hairstylist and founder of #WalkAway, a movement that ostensibly encouraged Democrats to leave their party and had grown to more than a million followers on Facebook, also joined in.
The Kremers’ first attempt at convening the influencers on a conference call devolved into shouting. Straka sparred with Kylie Kremer, later telling investigators, “I found her to be emotionally unstable, and a—and incompetent.” Kylie Kremer in turn clashed with the conspiracy theorist media personality Alex Jones, who was also trying to get in on the action. She filed a report with Washington, D.C., police accusing him of “threats to bodily harm” after he allegedly threatened to push her off the stage while reportedly yelling, “I’m gonna do it. I’m gonna do it. I’m going to take over.”
Things ramped up once Trump tweeted, on December 19, that there would be a “big protest in D.C. on January 6th. Be there, will be wild!” The tweet blindsided the Kremers and White House staff alike, and it didn’t unite anyone so much as up the ante of the squabbling.
The stakes were getting bigger, and anyone who was anyone in the world of right-wing influencers wanted a piece of the action. The 73-year-old heiress to the Publix supermarket fortune donated $3 million in total to the effort, some of it going to Jones, some to Trump adviser Roger Stone, but a huge chunk—$1.25 million—going to Charlie Kirk, the founder of the conservative student group Turning Point USA, to “deploy social media influencers to Washington” and “educate millions.” When investigators later confronted Kirk with documents showing he had billed the heiress for $600,000 of buses that were never chartered, he responded by invoking his Fifth Amendment right against self-incrimination.
This band of self-declared patriots came together at a unique moment in American history, with social media coming fully into its power, and they had a ready audience in the social media-obsessed president in the White House. Trump wanted the likes of Alexander and Jones to speak at his January 6 event, according to texts sent by one of his aides, Katrina Pierson. Or as she put it in a text to Kylie Kremer: “He likes the crazies.” There was a nominal division between the “crazies” on one side and the Kremers on the other, but all were coming together to make sure January 6 would be unforgettable.
“I mean, there were so many things that were being said or pushed out via social media that were just concerning,” Kylie Kremer told investigators, while defending the decision to maintain a loose alliance with what she variously called mercenary, larcenous, and quite possibly mentally ill social media activists who were posting about civil war, 1776, and their willingness to die for liberty. “It took all of us getting the messaging out to get all the people that came to D.C.,” she said.
The influencers’ belligerence was the source of their power. “The more aggressive people, like the Alis and all those guys, they began to get a little bit more prominence because of the language that they were using,” Pierson told investigators.
Trump may have promoted the Kremers’ official January 6 protest on his Twitter account but, in the end, one activist noted, they collected only 20,000 RSVPs on Facebook. Ali’s bootleg site, pumped with louder language and even wilder conspiracy theories, pulled in 500,000.
Pierson’s “crazies” were, in fact, the luminaries of Zuckerberg’s Fifth Estate. “These people had limited abilities to influence real-life outcomes—if Ali Alexander had put out a call for people to march on the Capitol, a few dozen people would have shown up,” says Jared Holt, who researched the run-up to January 6 for the Facebook-funded Atlantic Council’s Digital Forensic Research Lab. “But it’s the network effects where they took hold, where people who are more respectable and popular than Ali reshare his content.”
To keep the influencers hyping the January 6 rally—but nowhere near the president himself—Pierson helped broker a deal for what she called “the psycho list” to speak at a different event on January 5. Amid a frigid winter drizzle in D.C.’s Freedom Plaza, Ali and Straka ranted alongside Jones, disgraced former New York police commissioner Bernard Kerik, and the guy behind the “DC Draino” meme account, which had 2.3 million followers on Instagram alone.
The next day, at the real rally, the Kremers instructed security to be ready if Ali, Jones, or Straka attempted to rush the stage and seize the microphone by force.
Straka told investigators that he would have liked to speak on January 6 himself but, barring that, he made the best of things. “I’ve got my camera, I’ve got my microphone,” he recalled thinking. “I am going to turn it into an opportunity to create content for my audience.”
The story of how Facebook’s defenses were so soundly defeated by such a hapless crew begins right after the Stop the Steal group’s takedown, and it starts right at the top.
Although Facebook had vaguely alleged that it had taken down the group because of prohibited content, the truth was that the group hadn’t violated Facebook’s rules against incitement to violence, and the platform had no policy forbidding false claims of election fraud. Based on the group’s obvious malignancy, however, Facebook’s Content Policy team had declared a “spirit of the policy” violation, a rare but not unheard-of designation that boiled down to “because we say so.”
Zuckerberg had accepted the deletion under emergency circumstances, but he didn’t want the Stop the Steal group’s removal to become a precedent for a backdoor ban on false election claims. During the run-up to Election Day, Facebook had removed only lies about the actual voting process—stuff like “Democrats vote on Wednesday” and “People with outstanding parking tickets can’t go to the polls.” Noting the thin distinction between the claim that votes wouldn’t be counted and that they wouldn’t be counted accurately, Samidh Chakrabarti, the head of Facebook’s civic-integrity team, had pushed to take at least some action against baseless election fraud claims.
Civic hadn’t won that fight, but with the Stop the Steal group spawning dozens of similarly named copycats—some of which also accrued six-figure memberships—the threat of further organized election delegitimization efforts was obvious.
Barred from shutting down the new entities, Civic assigned staff to at least study them. Staff also began tracking top delegitimization posts, which were earning tens of millions of views, for what one document described as “situational awareness.” A later analysis found that as much as 70 percent of Stop the Steal content was coming from known “low news ecosystem quality” pages, the commercially driven publishers that Facebook’s News Feed integrity staffers had been trying to fight for years.
Civic had prominent allies in this push for intelligence gathering about these groups, if not for their outright removal. Facebook had officially banned QAnon conspiracy networks and militia groups earlier in the year, and Brian Fishman, Facebook’s counterterrorism chief, pointed to data showing that Stop the Steal was being heavily driven by the same users enthralled by fantasies of violent insurrection.
“They stood up next to folks that we knew had a track record of violence,” Fishman later explained of Stop the Steal.
But Zuckerberg overruled both Facebook’s Civic team and its head of counterterrorism. Shortly after the Associated Press called the presidential election for Joe Biden on November 7—the traditional marker for the race being definitively over—Facebook staff lawyer Molly Cutler assembled roughly 15 executives that had been responsible for the company’s election preparation. Citing orders from Zuckerberg, she said the election delegitimization monitoring was to immediately stop.
Though Zuckerberg wasn’t there to share his reasoning, Rosen hadn’t shied away from telling Chakrabarti that he agreed with Zuckerberg’s decision—an explanation that Chakrabarti found notable enough to make a record of. He quoted Rosen in a note to the company’s HR department as having told him that monitoring efforts to stop the presidential transition would “ ‘just create momentum and expectation for action’ that he did not support.”
The sense that the company could put the election behind it wasn’t confined to management. Ryan Beiermeister, whose work leading the 2020 Groups Task Force was widely admired within both Civic and the upper ranks of Facebook’s Integrity division, wrote a note memorializing the strategies her team had used to clean up what she called a “powderkeg risk.”
Beiermeister, a recent arrival to Facebook from the data analysis giant Palantir, congratulated her team for the “heroic” efforts they made to get Facebook’s senior leadership to sign off on the takedowns of toxic groups. “I truly believe the Group Task Force made the election safer and prevented possible instances of real world violence,” she concluded, congratulating the team’s 30 members for the “transformative impact they had on the Groups ecosystem for this election and beyond.”
Now, with the election crisis seemingly over, Facebook was returning its focus to engagement. The growth-limiting Break the Glass measures were going to have to go.
On November 30, Facebook lifted all demotions of content that delegitimized the election results. On December 1, the platform restored misinformation-rich news sources to its “Pages You Might Like” recommendations and lifted a virality circuit breaker. It relaxed its suppression of content that promoted violence the day after that, and resumed “Feed boosts for non-recommendable Groups content” on December 7. By December 16, Facebook had removed the caps on the bulk group invitations that had driven Stop the Steal’s growth.
Only later would the company discover that more than 400 groups posting pro-insurrectionist content and false claims of a stolen election were already operating on Facebook when the company lifted its restrictions on bulk invitations. “Almost all of the fastest growing FB Groups were Stop the Steal during the period of their peak growth,” the document noted.
A later examination of the social media habits of people arrested for their actions on January 6 found that many “consumed fringe Facebook content extensively,” much of it coming via their membership in what were sometimes hundreds of political Facebook groups. On average, those groups were posting 23 times a day about civil war or revolution.
Facebook had lowered its defenses in both the metaphorical and technical sense. But not all the degradation of the company’s integrity protections was intentional. On December 17, a data scientist flagged that a system responsible for either deleting or restricting high-profile posts that violated Facebook’s rules had stopped doing so. Colleagues ignored it, assuming that the problem was just a “logging issue”—meaning the system still worked, it just wasn’t recording its actions. On the list of Facebook’s engineering priorities, fixing that didn’t rate.
In fact, the system had truly failed in early November. Between then and when engineers realized their error in mid-January, the system had given a pass to 3,100 highly viral posts that should have been deleted or labeled “disturbing.”
Glitches like that happened all the time at Facebook. Unfortunately, this one produced an additional 8 billion “regrettable” views globally, instances in which Facebook had shown users content that it knew was trouble. A later review of Facebook’s post-election work tartly described the flub as a “lowlight” of the platform’s 2020 election performance, though the company disputes that the mistake had a meaningful impact on subsequent events. At least 7 billion of the bad content views were international, the company says, and only a portion of the American material dealt with politics. Overall, a spokeswoman said, the company remains proud of its pre- and post-election safety work.
Facebook had never gotten out of the red zone on Civic’s chart of election threats. Now, six weeks after the election, the team’s staffers were scattered, Chakrabarti was out, and protections against viral growth risks had been rolled back.
In the days leading up to January 6, the familiar gauges of trouble—hate speech, inflammatory content, and fact-checked misinformation—were again ticking up. Why wasn’t hard to guess. Control of the Senate depended on a Georgia runoff election scheduled for January 5, and Trump supporters were beginning to gather in Washington, D.C., for the protest that Trump had promised would “be wild!”
The Counterterrorism team reporting to Brian Fishman was tracking pro-insurrection activity that he considered “really concerning.” By January 5, Facebook was preparing a new crisis coordination team, just in case, but nobody at the company—or anywhere in the country, really—was quite ready for what happened next.
On January 6, speaking to a crowd of rowdy supporters, Trump again repeated his claim that he had won the election. And then he directed them toward the Capitol, declaring, “If you don’t fight like hell, you’re not going to have a country anymore.” Floods of people streamed toward the Capitol and, by 1:00 p.m., rioters had broken through the outer barriers around the building.
Fishman, out taking a walk at the time, sprinted home, according to a later interview with the January 6 Committee. It was time to start flipping those switches again. But restoring the safeguards that Facebook had eliminated just a month earlier came too late to keep the peace at Facebook, or anywhere else. Integrity dashboards reflected the country’s social fabric rending in real time, with reports of false news quadrupling and calls for violence up tenfold since the morning. On Instagram, views of content from what Facebook called “zero trust” countries were up sharply, suggesting hostile entities overseas were jumping into the fray in an effort to stir up additional strife.
Temperatures were rising on Workplace, too. For those on the front lines of the company’s response, the initial silence from Facebook’s leadership was deafening.
“Hang in there everyone,” wrote Mike Schroepfer, the chief technology officer, saying company leaders were working out how to “allow for peaceful discussion and organizing but not calls for violence.”
“All due respect, but haven’t we had enough time to figure out how to manage discourse without enabling violence?” an employee snapped back, one of many unhappy responses that together drew hundreds of likes from irate colleagues. “We’ve been fueling this fire for a long time and we shouldn’t be surprised that it’s now out of control.”
Shortly after 2:00 p.m., rioters entered the Capitol. By 2:20 p.m., the building was in lockdown.
Several hours passed before Facebook’s leadership took their first public steps, removing two of Trump’s posts. Privately, the company revisited its determination that Washington, D.C., was at “temporarily heightened risk of political violence.” Now the geographic area at risk was the entire United States.
As rioters entered the Senate chamber and offices around the building, while members of Congress donned gas masks and hid where they could, Facebook kept tweaking the platform in ways that might calm things down, going well past the set of Break the Glass interventions that it had rolled out in November. Along with additional measures to slow virality, the company ceased auto-deleting the slur “white trash,” which was being used quite a bit as photos of colorfully dressed insurrectionists roaming the Capitol went viral. Facebook had bigger fish to fry than defending rioters from reverse racism.
Enforcement operations teams were given a freer hand, too, but it wasn’t enough.