Inside the harrowing world of online student surveillance

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Megan Waskiewicz used to sit at the top of the bleachers, rest her back against the wall, and hide her face behind the glow of a laptop monitor. While watching one of her five children play basketball on the court below, she knew she had to be careful. The mother from Pittsburgh didn’t want other parents in the crowd to know she was also looking at child porn.

Waskiewicz worked as a content moderator for Gaggle, a surveillance company that monitors the online behaviors of some 5 million students across the U.S. on their school-issued Google and Microsoft accounts. Through an algorithm designed to flag references to sex, drugs, and violence, and a team of content moderators like Waskiewicz, the company sifts through billions of students’ emails, chat messages, and homework assignments each year. Their work is supposed to ferret out evidence of potential self-harm, threats, or bullying, incidents that would prompt Gaggle to notify school leaders and, in some cases, the police.

As a result, kids’ deepest secrets—like nude selfies and suicide notes—regularly flashed onto Waskiewicz’s screen. Though she felt “a little bit like a voyeur,” she believed Gaggle helped protect kids. But mostly, the low pay, the fight for decent hours, inconsistent instructions, and stiff performance quotas left her feeling burned out. Gaggle’s moderators face pressure to review 300 incidents per hour, and Waskiewicz knew she could be fired on a moment’s notice if she failed to distinguish mundane chatter from potential safety threats in a matter of seconds. She lasted about a year.

“In all honesty, I was sort of half-assing it,” Waskiewicz admitted in an interview with The 74. “It wasn’t enough money, and you’re really stuck there, staring at the computer, reading and just click, click, click, click.”

Content moderators like Waskiewicz, hundreds of whom are paid just $10 an hour on month-to-month contracts, are on the front lines of a company that claims it saved the lives of 1,400 students last school year and argues that the growing mental health crisis makes its presence in students’ private affairs essential. Gaggle founder and CEO Jeff Patterson has warned about “a tsunami of youth suicide headed our way” and said that schools have “a moral obligation to protect the kids on their digital playground.”

Eight former content moderators at Gaggle shared their experiences for this story. While several believed their efforts in some cases did shield kids from serious harm, they also surfaced significant questions about the company’s efficacy, its employment practices, and its effect on students’ civil rights.

Among the moderators who worked on a contractual basis, none had prior experience in school safety, security, or mental health. Their employment histories instead included retail work and customer service; they were drawn to Gaggle while searching for remote jobs that promised flexible hours. They described an impersonal and cursory hiring process that appeared automated. Former moderators reported submitting applications online and never having interviews with Gaggle managers—either in person, on the phone, or over Zoom—before landing jobs.
Once hired, moderators reported insufficient safeguards to protect students’ sensitive data, a work culture that prioritized speed over quality, scheduling issues that sent them scrambling to get hours, and frequent exposure to explicit content that left some traumatized. Contractors lacked benefits, including mental health care, and one former moderator said he quit after repeated exposure to explicit material that so disturbed him he couldn’t sleep, and without “any money to show for what I was putting up with.”

Gaggle’s content moderation team includes as many as 600 contractors at any given time; just two dozen moderators work as employees with access to benefits and on-the-job training that lasts several weeks. Gaggle executives have sought to downplay contractors’ role with the company, arguing that they use “common sense” to distinguish false flags generated by the algorithm from potential threats and do “not require substantial training.”

While the experiences reported by Gaggle’s moderators resemble those of content reviewers at social media platforms like Meta-owned Facebook, Patterson said his company relies on “U.S.-based, U.S.-cultured reviewers as opposed to outsourcing that work to India or Mexico or the Philippines,” as the social media giant does. He rebuffed former moderators who said they lacked sufficient time to consider the severity of a particular item.

“Some people are not fast decision-makers. They need to take more time to process things, and maybe they’re not right for that job,” he told The 74. “For some people, it’s no problem at all. For others, their brains don’t process that quickly.”

Executives also sought to minimize the contractors’ access to students’ personal information; a spokeswoman said they see only “small snippets of text” and lack access to what’s known as students’ “personally identifiable information.” Yet former contractors described reading lengthy chat logs, seeing nude photographs and, in some cases, coming upon students’ names. Several former moderators said they struggled with “gray areas” when deciding whether something should be escalated as harmful, such as whether a Victoria’s Secret lingerie ad counted as acceptable.

“Those people are really just the very, very first pass,” Gaggle spokeswoman Paget Hetherington said. “It doesn’t really need training; it’s just like if there’s any possible doubt with that particular word or phrase it gets passed on.”

Molly McElligott, a former content moderator and customer service representative, said management was laser-focused on performance metrics, appearing more interested in business growth and profit than in protecting kids.

“I went into the experience extremely excited to help children in need,” McElligott wrote in an email. Unlike the contractors, McElligott was an employee at Gaggle, where she worked for five months in 2021 before taking a position at the Manhattan District Attorney’s Office in New York. “I realized that was not the primary focus of the company.”

Gaggle is part of a burgeoning campus security industry that’s seen significant business growth in the wake of mass school shootings as leaders scramble to prevent future attacks. Patterson, who founded the company in 1999 by offering student email accounts that could be monitored for pornography and cursing, said its focus now is mitigating the pandemic-driven youth mental health crisis. Patterson said the team talks about “lives saved” and child safety incidents at every meeting, and that leaders are open about sharing the company’s financial outlook so that employees “can have confidence in the security of their jobs.”

“We are just expendable”

Facing new federal scrutiny along with three other companies that monitor students online, Gaggle recently told lawmakers that it relies on a “highly trained content review team” to analyze student materials and flag safety threats. Yet former contractors, who make up the bulk of Gaggle’s content review team, described their training as “a joke,” consisting of a slideshow and an online quiz, which left them ill-equipped for a job with such serious consequences for students and schools. As an employee on the company’s safety team, McElligott said she underwent two weeks of training, but the disorganized instruction left her and other moderators “more confused than when we started.”

Former content moderators have also flocked to employment websites like Indeed.com to warn job seekers about their experiences with the company, often sharing reviews that resembled the former moderators’ feedback to The 74.

“If you want to be not cared about, not valued, and be completely stressed/traumatized on a daily basis, this is totally the job for you,” one anonymous reviewer wrote on Indeed.
“Warning, you will see awful, awful things. No, they don’t provide therapy or any kind of support either.

“That isn’t even the worst part,” the reviewer continued. “The worst part is that the company does not care that you hold them on your backs. Without safety reps, they wouldn’t be able to function, but we are just expendable.”

As the first layer of Gaggle’s human review team, contractors analyze materials flagged by the algorithm and decide whether to escalate students’ communications for additional consideration. Designated employees on Gaggle’s safety team are in charge of calling or emailing school officials to notify them of troubling material identified in students’ files, Patterson said.

Gaggle’s staunchest critics have questioned the tool’s efficacy and described it as a student privacy nightmare. In March, Democratic senators Elizabeth Warren and Ed Markey urged greater federal oversight of Gaggle and similar companies to protect students’ civil rights and privacy. In a report, the senators said the tools could surveil students inappropriately, compound racial disparities in school discipline, and waste tax dollars.

The information the former Gaggle moderators shared with The 74 “struck me as the worst-case scenario,” said attorney Amelia Vance, the cofounder and president of Public Interest Privacy Consulting. Content moderators’ limited training and vetting, as well as their lack of backgrounds in youth mental health, she said, “is not acceptable.”

In its recent letter to lawmakers, Gaggle described a two-tiered review procedure but didn’t disclose that low-wage contractors were the first line of defense. Patterson told The 74 that the company “didn’t have nearly enough time” to respond to lawmakers’ questions about its business practices and didn’t want to divulge proprietary information.

Gaggle uses a third party to conduct criminal background checks on contractors, Patterson said, but he acknowledged they aren’t interviewed before getting placed on the job.

“There’s a lot of contractors. We can’t do a physical interview of everyone, and I don’t know if that’s appropriate,” he said. “It might actually introduce another set of biases in terms of who we hire or who we don’t hire.”

“Other eyes were seeing it”

In a previous investigation, The 74 analyzed a cache of public records to expose how Gaggle’s algorithm and content moderators subject students to relentless digital surveillance long after classes end for the day, extending schools’ authority far beyond their traditional powers to regulate speech and behavior, including at home. Gaggle’s algorithm relies largely on keyword matching and gives content moderators a broad snapshot of students’ online activities, including diary entries, classroom assignments, and casual conversations between students and their friends.

After the pandemic shuttered schools and shuffled students into remote learning, Gaggle saw a surge in students’ online materials—and in the number of school districts interested in its services. Gaggle reported a 20% bump in business as educators scrambled to keep a watchful eye on students whose chatter with peers moved from school hallways to instant messaging platforms like Google Hangouts. One year into the pandemic, Gaggle reported a 35% increase in references to suicide and self-harm, which accounted for more than 40% of all flagged incidents.

Waskiewicz, who began working for Gaggle in January 2020, said that remote learning spurred an immediate shift in students’ online behaviors. Under lockdown, students without computers at home began using school devices for personal conversations. Sifting through the everyday exchanges between students and their friends, Waskiewicz said, became a time suck and left her questioning her own principles.

“I felt kind of bad because the kids didn’t have the ability to have stuff of their own, and I wondered if they realized that it was public,” she said. “I just wonder if they realized that other eyes were seeing it other than them and their little friends.”

Student-activity-monitoring software like Gaggle has become ubiquitous in U.S. schools: 81% of teachers work in schools that use tools to track students’ computer activity, according to a recent survey by the nonprofit Center for Democracy and Technology. A majority of teachers said the benefits of such tools, which can block obscene material and monitor students’ screens in real time, outweigh the potential risks. Likewise, the survey found, students generally recognize that their online activities on school-issued devices are being observed, and they alter their behaviors as a result. More than half of student respondents said they don’t share their true thoughts or ideas online because of school surveillance, and 80% said they were more careful about what they search for online.

A majority of parents reported that the benefits of keeping tabs on their children’s activity exceeded the risks. Yet they may not have a full grasp of how programs like Gaggle work, including the heavy reliance on untrained contractors and the weak privacy controls revealed by The 74’s reporting, said Elizabeth Laird, the center’s director of equity in civic technology.

“I don’t know that the way this information is being handled actually would meet parents’ expectations,” Laird said.

Another former contractor, who reached out to The 74 to share his experiences with the company anonymously, became a Gaggle moderator at the height of the pandemic. As COVID-19 cases grew, he said, he felt unsafe continuing his previous job as a caregiver for people with disabilities, so he applied to Gaggle because it offered remote work.
About a week after he submitted an application, Gaggle gave him a key to kids’ private lives—including, most alarming to him, their nude selfies. Exposure to such content was traumatizing, the former moderator said, and while the job took a toll on his mental well-being, it didn’t come with health insurance.

“I went to a mental hospital in high school due to some hereditary mental health issues, and seeing some of these kids going through similar things really broke my heart,” said the former contractor, who shared his experiences on the condition of anonymity, saying he feared possible retaliation by the company. “It broke my heart that they had to go through these revelations about themselves in a context where they can’t even go to school and get out of the house a little bit. They have to do everything from home—and they’re being constantly monitored.”

[In a screenshot provided to The 74 by a former contractor who asked to remain anonymous, and redacted here, Gaggle explains its terms and conditions for contract content moderators.]

Gaggle employees are offered benefits, including health insurance, and can attend group therapy sessions twice a month, Hetherington said. Patterson acknowledged the job can take a toll on staff moderators but sought to downplay its effects on contractors, saying they’re warned about exposure to disturbing content during the application process. He said using contractors allows Gaggle to offer the service at a price school districts can afford.

“Quite honestly, we’re dealing with school districts with very limited budgets,” Patterson said. “There have to be some trade-offs.”

The contractor requesting anonymity said he wasn’t as concerned about his own well-being as he was about the welfare of the students under the company’s watch. The company, he said, lacked adequate safeguards to keep students’ sensitive information from leaking outside the digital environment Gaggle built for moderators to review such materials. Contract moderators work remotely with limited supervision or oversight, and he became especially concerned about how the company handled students’ nude images, which are reported to school districts and the National Center for Missing and Exploited Children. Nudity and sexual content accounted for about 17% of Gaggle’s emergency phone calls and email alerts to school officials last school year, according to the company. Contractors, he said, could easily save the images for themselves or share them on the dark web.

Patterson acknowledged the possibility but said he wasn’t aware of any data breaches. “We do things in the interface to try to disable the ability to save those things,” Patterson said, but “you know, human beings who want to get around things can.”

“Made me feel like the day was worth it”

Vara Heyman was looking for a career change. After working jobs in retail and customer service, she made the pivot to content moderation, and a contract position with Gaggle was her first foot in the door. She was left feeling baffled by the impersonal hiring process, especially given the high stakes for students.

Waskiewicz had a similar experience. In fact, she said the only time she ever interacted with a Gaggle supervisor was when she was instructed to provide her bank account information for direct deposit. The interaction left her questioning whether the company, which contracts with more than 1,500 school districts, was legitimate or a scam.
“It was a little weird when they were asking for the banking information, like, ‘Wait a minute—is this real or what?’” Waskiewicz said. “I Googled them, and I think they’re pretty big.”

Heyman said that sense of disconnect continued after she was hired, with communications between contractors and their supervisors limited to a Slack channel.

Despite the challenges, several former moderators believe their efforts kept kids safe from harm. McElligott, the former Gaggle safety team employee, recalled an occasion when she found a student’s suicide note.

“Knowing I was able to help with that made me feel like the day was worth it,” she said. “Hearing from the school employees that we were able to alert about self-harm or suicidal tendencies from a student they would never expect to be suffering was also very rewarding. It meant that extra attention should or could be given to the student in a time of need.”

“I thought it would just be stopping school shootings or reducing cyberbullying, but no, I read the chat logs of kids coming out to their friends,” one former Gaggle moderator said.

Susan Enfield, the superintendent of Highline Public Schools in suburban Seattle, said her district’s contract with Gaggle has saved lives. Earlier this year, for example, the
