Discord doesn’t want to count strikes when users run afoul of its rules.
As part of a slew of fall product updates, the online chat platform announced a more flexible approach to moderation. Instead of handing out strikes for every policy violation, Discord will tailor its warnings or punishments to fit the crime, while providing steps users can take to improve their standing.
“We think we’ve built the most nuanced, comprehensive, and proportionate warning system of any platform,” Savannah Badalich, Discord’s senior director of policy, told reporters.
Alongside the new warning system, Discord is also launching new safety features for teens: It will auto-blur potentially sensitive images from teens’ friends by default, and it will show a “safety check” when teens are messaging with someone new, asking if they want to proceed and linking to additional safety tips.
In both cases, Discord wants to show that it’s taking safety seriously after years of controversy and criticism. A report by NBC News in June documented how child predators used the platform to groom and abduct teens, while other reports have found pockets of extremism thriving there.
Discord likes to point out that more than 15% of its employees work on trust and safety. As the company expands beyond its roots in gaming, it hopes to build a system that’s more effective at moderating itself.
The no-strike system
Discord’s moderation rules have always been a bit tricky to pin down, perhaps by design.
While individual servers can set their own rules, Discord itself has not laid out a specific number of strikes or infractions that lead to suspension across its platform. Users, in turn, had no way to know where they stood, even as Discord quietly kept a tally of their infractions.

The new system tries to be more transparent while still stopping short of a distinct strike count. When users violate a rule, they’ll get a detailed pop-up describing what they did wrong along with any temporary restrictions that might apply. They can then head to Discord’s privacy and safety menu to see how the violation affects their account standing and what they can do to improve it.
Discord says it will have four levels of account status—including “All Good,” “Limited,” “Very Limited,” and “At Risk”—before users reach a platform-wide suspension. Serious offenses such as violent extremism and sexualizing children are still grounds for an immediate ban, but otherwise Discord isn’t assigning scores to each violation or setting a specific number of violations for each level.
It’s a different approach than what some of its peers are doing. Facebook, for instance, has a 10-strike system with increasing penalties at each level, while Microsoft recently launched an eight-strike system for Xbox users, with some violations counting for more than one strike.
Ben Shanken, Discord’s vice president of product, says the company will treat each type of violation differently, but ultimately it wants to leave more room for subjectivity.
“If your friend is just trying to report a message to troll you a little bit, we don’t want that to result in your account getting banned,” he says. “We’ve built from the ground up to try and be more bespoke about it.”
Early warnings for teens
As for Discord’s new teen safety features, the company says it will use image recognition algorithms to detect and blur potentially sensitive images from friends, and will block those images in DMs from strangers. Teens can then click the image to reveal its contents or head to Discord’s settings to disable the feature. While image blurring will be on by default for teens, adults will have an option to enable it as well.
Meanwhile, Discord will begin sending safety alerts to teens when they get messages from strangers. The alerts will ask whether the teen is sure they want to reply, and will include links to safety tips and instructions on how to block the user.
Discord says the new warnings are part of a broader initiative to make its platform safer for teens. In June, NBC News reported on dozens of kidnapping, grooming or sexual assault cases over the past six years in which communications allegedly happened on Discord. It also cited data from the National Center for Missing & Exploited Children showing that reports of child sexual abuse material on Discord increased by 474% from 2021 to 2022, with the group claiming slower average response times from Discord over that period.
Shanken says Discord started working on the new safety features about nine months ago, and that the company will build on those features over the coming year. The plan is to give teens more control over their communications, while also getting smarter at detecting potential safety issues and flagging them for users.
“We’d much rather have a teenager receive an alert and block a user than just send a report to us, and us having to go figure that out,” he says.
Like other big tech companies, Discord dreams of being able to use AI and automation to build self-moderating systems. But while other technology companies are making cuts to those moderation efforts, Shanken says that hasn’t been the case at Discord. He notes that the team working on safety is Discord’s second-largest technology group.
“It’s true that these parts of the business are pressured in tougher economic times, and that’s not been the case at Discord,” he says. “We’ve only continued to expand our investment over the past couple of years.”