Discord removed thousands of violent extremist and criminal servers in 2020


Photo: Samuel Corum (Getty Images)

Thanks to the endlessly depressing degree to which covid has kept everyone inside, Discord is more relevant than ever. But as the company revealed in its latest transparency report, that relevance has brought new challenges – and improved efforts to overcome other challenges it probably should have put more effort into before.

Discord, reportedly in talks with Microsoft over a potential acquisition, released the transparency report today. Amidst standard operational insights for the second half of 2020, some details stood out. First, the total number of user reports rose quite steadily over 2020 – from 26,886 in January to 65,103 in December – with numbers first climbing in March. This makes sense; people were trapped in their houses, and Discord grew quickly because of that. Spam resulted in the most account deletions (over 3 million), with exploitative content, including non-consensual pornography, a distant second (129,403) and harassment third (33,615).

Discord also pointed out that of the reports it received, it most often took action on issues related to child-harm material, cybercrime, doxxing, exploitative content, and extremist or violent content. “This can be partly explained by the priority the team has given to issues in 2020 that would cause the most damage in the real world,” the company said in the transparency report.

Indeed, according to the report, Discord removed more than 1,500 servers for violent extremism in the second half of 2020, which the report said was “an increase of nearly 93% from the first half of the year.” It cited groups like the Boogaloo Boys and QAnon as examples.

“This increase can be attributed to the expansion of our anti-extremism efforts and growing trends in online extremism,” the company wrote. “One of the online trends seen during this period was the growth of QAnon. We adjusted our efforts to address the movement and eventually removed 334 QAnon-related servers.”

Cybercrime server removals rose in a similar way over the course of 2020, up 140% from the first half of the year. In total, Discord removed nearly 6,000 cybercrime servers in the second half of 2020, which it said followed a significant increase in reports. “More cybercrime spaces than ever before were flagged to Trust & Safety, and more were ultimately removed from our site,” Discord wrote.

Discord also emphasized its focus on methods that enable it to “proactively detect and remove the most harmful groups from our platform,” pointing to its efforts against extremism as an example, but also noting where it made mistakes.

“We were disappointed to learn that one of our tools for proactively detecting [sexualized content related to minors] servers had a bug during this time,” Discord wrote. “As a result, there were fewer overall flags for our team. That bug has since been fixed, and we have continued to remove servers that the tool surfaces.”

The other problem here is that Discord made its concerted effort to remove QAnon content around the same time other platforms did – after most of the damage had already been done. While the removal may have been proactive by Discord’s internal definition, platforms were slow to be even reactive when it came to QAnon as a whole, leading to real and lasting damage in the United States and around the world. In 2017, Discord also functioned as an important staging ground for the Unite The Right rally in Charlottesville, Virginia, which ultimately led to violence and three deaths. While the platform has since tried to clean up its act, it played host to an abundance of abuse and alt-right activity as recently as 2017.

Some transparency is much better than none, but it’s worth noting that technology companies’ transparency reports often provide little insight into how decisions are made and the bigger priorities of the platforms that essentially rule our online lives. Earlier this year, for example, Discord banned the r/WallStreetBets server during the GameStop stonksapalooza. Onlookers suspected foul play – some kind of outside interference. Speaking to Kotaku, however, two sources made it clear that a labyrinthine internal moderation policy ultimately led Discord to make that decision. Bad timing and poor transparency before and after took care of the rest.

This is just a small example of how this dynamic can unfold. There are many more. Platforms may say they are transparent, but in the end they just give people some barely contextualized numbers. It’s hard to say what true transparency looks like in the age of all-encompassing tech platforms, but this isn’t it.
