“Taking action against misconduct that occurs entirely off our service is a novel approach for both Twitch and the industry at large, but it’s one we believe — and hear from you — is crucial to get right,” Twitch wrote in a blog post, detailing its new rules that apply to all Twitch users.
Twitch’s rules previously focused on streamers’ behavior on the platform; while the company had historically taken action against serious misconduct that happened off platform, it didn’t specify this in its guidelines. (Twitch is owned by Amazon, whose founder, Jeff Bezos, owns The Washington Post.) The new update is a response to multiple incidents, including the wave of #MeToo allegations that swept the gaming industry last year. When several women raised concerns over Twitch streamers, alleging misconduct, the company realized its policy on poor behavior that occurs outside the platform needed more clarity, Twitch spokesperson Gabriella Raila told The Post.
The company wrote that “until now, we didn’t have an approach that scaled.” Previously, Twitch would review harassment that happened outside of its platform and look at evidence before taking action, but it didn’t specifically address users who are leaders or members of hate groups or who participate in other extreme behavior.
“There’s something quite provocative about this gesture at a time when major media companies like Twitter and Facebook were years late in deplatforming white supremacists and domestic terrorists who were openly spreading hate speech and inciting violence on their own social media pages,” said Laine Nooney, assistant professor and historian of video games at New York University.
The company has been taking different measures in recent months to clean up its platform. In January, Twitch beefed up its policy against hateful images and explicitly banned the Confederate flag. It also remade a popular gaming emote, PogChamp, after the man pictured in the emote tweeted comments encouraging further violence following the Jan. 6 Capitol riot.
Twitch has hired a third-party law firm to support investigations and has increased the size of its internal team that works with law enforcement, it said. The findings of investigations will be shared with the people involved, but will not be made public by Twitch. Those teams will also look for evidence to verify user reports.
People who report this behavior can submit evidence including direct links to public posts, or uploaded content of the user breaking the rules. Twitch notes that screenshots can be edited, so they need to be supported by other verifiable evidence or confirmed as authentic by the law firm. Evidence could include police reports, rape kits, texts, emails, photos, or corroboration of stories from third parties, according to Twitch’s Raila.
Twitch also stated it currently doesn’t have the capacity to handle other serious offenses that the policy doesn’t mention. The current list of offensive behavior is focused on “the most serious offenses that pose an immediate physical safety threat in order to ensure we are equipped to take action when these impact our community.”
“Imagine being a female streamer who was sexually assaulted by another member of the Twitch community, but that streamer never ‘officially’ breaks Twitch’s rules. Under Twitch’s previous guidelines, that female streamer had to coexist in the same streaming ecology as their assailant, which definitely wouldn’t feel safe,” Nooney said. “Under the revised rules, Twitch can remove the offending party without needing to find proof in their content at all.” Instead, the company would verify misconduct that happened outside the platform.
Nooney added, “In a way, Twitch’s revised guidelines push the platform toward reflecting community norms that are more akin to in-person social relations. I don’t need a friend to be violent in my home to have reason to break off a friendship; I merely need credible evidence that they were violent elsewhere.”