Twitch’s first transparency report is here – and long overdue

Twitch released its first-ever transparency report today, detailing its efforts to protect the 26 million people who visit the site every day. When it comes to transparency, the ten-year-old Amazon-owned service has a lot of catching up to do.

Twitch saw a 40 percent increase in channels between early and late 2020, buoyed by the popularity of both live streaming and video games during the pandemic. But that explosive growth is also the company’s biggest challenge when it comes to fighting harassment and hate. Unlike recorded videos, live content is often spontaneous and ephemeral. Things just happen, in front of a live audience of thousands or tens of thousands. That could be anything from 11-year-olds streaming Minecraft live – exposing them to potential predators – to now-banned streaming celebrity Guy “Dr Disrespect” Beahm broadcasting from a public restroom at E3.

In its new transparency report, Twitch acknowledges this difficulty and, for the first time, provides specific details on how well it moderates its platform. While the findings are encouraging, what Twitch has historically not been transparent about speaks just as loudly.

Twitch earned an early reputation as a hotbed of toxicity. Women and minorities streaming on the platform received targeted hate from audiences hostile to people they believed deviated from gamer stereotypes. Twitch’s vague guidelines around so-called “sexually suggestive” content fueled self-appointed anti-boob police, who reported female Twitch streamers en masse. Volunteer moderators watched over Twitch’s fast-moving chat to ward off harassment. And for problematic streamers themselves, Twitch relied on user reports.

In 2016, Twitch introduced its AutoMod tool, now enabled by default for all accounts, which blocks viewer messages its AI considers inappropriate. Like other major platforms, Twitch also relies on machine learning to flag potentially problematic content for human review, and it has invested in human moderators to assess that flagged content. Still, a 2019 study by the Anti-Defamation League found that nearly half of Twitch users surveyed reported being harassed. And a 2020 GamesIndustry.biz report quoted several Twitch employees describing how executives at the company did not prioritize safety tools and were dismissive of hate speech.

During this time, Twitch had no transparency report to communicate its policies and inner workings to a user base dealing with abuse. In an interview with WIRED, Twitch’s new head of trust and safety, Angela Hession, says safety was Twitch’s “most important investment” in 2020.

Twitch has learned over the years that bad-faith harassers can weaponize its vague community standards, and in 2020 it released updated versions of its “Nudity and Attire,” “Terrorism and Extreme Violence,” and “Harassment and Hateful Conduct” guidelines. Last year, Twitch also appointed an eight-person safety advisory council, made up of streamers, anti-bullying experts, and social media researchers, to craft policies for improving safety, moderation, and healthy streaming habits.

Last fall, Twitch brought in Hession, formerly the head of safety at Xbox. Under Hession, Twitch finally banned depictions of the Confederate flag and blackface. Twitch is at an inflection point, she says, and there is a great opportunity for it to reimagine what safety looks like. “Twitch is a service designed to encourage users to feel comfortable sharing and entertaining each other,” she says, “but we also want our community to be safe and secure at all times.” Hession says Twitch has quadrupled its content moderators in the past year.

Twitch’s transparency report serves as a victory lap for its recent moderation efforts. AutoMod or active human moderators touched more than 95 percent of Twitch content in the second half of 2020, the company reports. The number of people who reported being harassed via Twitch direct messages dropped by 70 percent over the same period. Enforcement actions rose from 788,000 in early 2020 to 1.1 million by the end of the year, which Twitch says reflects its growth in users. User reports also rose from 5.9 million to 7.4 million over this period, which Twitch again attributes to its growth. The same goes for channel bans, which increased from 2.3 million to 3.9 million.