Censorship on YouTube and Twitch: What creators need to know
In August 2019, YouTubers sued Google for discrimination.
LGBTQ+ creators on YouTube took action after several of their videos were demonetized and suppressed in search results. The plaintiffs—including Amp Somers, Lindsay Amer, Chrissy Chambers, and Chase Ross, among others—are either LGBTQ+ sex educators or vloggers who show large audiences various aspects of queer life.
YouTube fought back by defending the quality of its moderation algorithm, saying, “We work incredibly hard to make sure that when our machines learn something—because a lot of our decisions are made algorithmically—that our machines are fair. There shouldn’t be [any automatic demonetization].”
A separate group of YouTubers would beg to differ. In October 2019, several creators reverse-engineered YouTube’s ad revenue bot to determine whether it penalizes queer content—and the results were telling.
After testing more than 15,000 words, the group determined that 33% of videos with LGBTQ+ keywords in their titles were demonetized by the algorithm.
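The group’s exact tooling wasn’t published, but the heart of such an experiment is simple: label videos whose titles contain target keywords, then tally what share of those the algorithm demonetized. Here is a minimal, hypothetical sketch of that tallying step (the keyword list and sample data are illustrative, not the group’s actual dataset):

```python
# Hypothetical sketch: estimate what share of keyword-matching videos
# were demonetized. The sample data below is invented for illustration,
# not the actual dataset from the 2019 experiment.

def demonetization_rate(videos, keywords):
    """videos: list of (title, was_demonetized) pairs."""
    matching = [demonetized for title, demonetized in videos
                if any(kw in title.lower() for kw in keywords)]
    if not matching:
        return 0.0
    return sum(matching) / len(matching)

sample = [
    ("coming out story", True),
    ("my trans journey", False),
    ("gaming highlights", False),
    ("lgbtq history explained", True),
]
rate = demonetization_rate(sample, ["lgbtq", "trans", "coming out"])
print(f"{rate:.0%} of keyword-matching videos were demonetized")
```

Run at scale against thousands of real titles, a tally like this is what let the group put a number—33%—on the algorithm’s behavior.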
Even so, in early 2021, the court sided with YouTube—but not because the platform wasn’t discriminating. A federal judge in San Jose ruled that Google and YouTube are “private entities,” which means they aren’t bound by the First Amendment’s prohibition against restricting speech.
As creators from various groups continue to lose out on AdSense revenue, they’ll need to ask themselves some tough questions about the content they post—which raises the ultimate question: How can creators express themselves and avoid censorship if they’re not hurting anyone?
American censorship vs. the world
Before we dive deep into censorship of creators on YouTube and Twitch, we want to clarify a few things.
YouTube is banned in China, Iran, Syria, and Turkmenistan. Twitch is blocked in China, Vietnam, and periodically in Iran.
Internet censorship is a complex human rights issue that affects people differently all over the world. “Censorship” may mean one thing in the U.S. and something else entirely in other countries.
For the sake of scope, we’ll be focusing on censorship of primarily American creators on YouTube and Twitch. Keep reading to find out:
- How moderation works on YouTube and Twitch
- The gray areas and complexities of each platform’s community guidelines
- How you can make attempts to avoid censorship on each platform
- How you can defend yourself (and other creators) against demonetization, suspensions, or bans on each platform
Censorship on YouTube
YouTube community guidelines
You can find YouTube’s community guidelines here.
For creators, YouTube also has monetization guidelines that outline how to create “advertiser-friendly content”.
How moderation works on YouTube
YouTube handles moderation in two ways: through artificial intelligence-powered moderation algorithms and through human moderators.
In March 2020, YouTube began to rely more heavily on its AI moderation algorithms to remove content, but just six months later it brought back human moderators. As it turns out, the AI moderation algorithm isn’t ready to fly solo—it was responsible for a significant increase in incorrect video removals.
YouTube’s moderation algorithm looks at four main components when evaluating content:
- The use of profanity at the beginning of a video
During the first few hours after a video upload, YouTube’s AI moderation algorithm scans these components to evaluate whether or not a video is family friendly, advertiser friendly, or in violation of community guidelines.
Advertisers have played a major role in shaping monetization guidelines for YouTube. During what creators call the “Adpocalypse”, advertisers began to demand more control in 2016 after PewDiePie posted an antisemitic video that also contained ads.
YouTube has since enacted new policies to allow advertisers to pull their ads from videos they find offensive, which has had disastrous effects on AdSense revenue for many creators who cover sensitive subject matter.
Where YouTube moderation goes wrong
In 2020 and 2021, YouTube demonetized conservative political commentator Steven Crowder for making anti-LGBTQ+ comments about Vox commentator Carlos Maza and for violating YouTube's presidential election integrity policy.
YouTube saw the incident as an opportunity to update its moderation algorithm. In an attempt to curb further far-right extremism on the platform as a whole, YouTube demonetized more far-right content—except it also demonetized and deplatformed channels that discuss these topics in a critical way.
In a nutshell: YouTube’s AI moderation algorithm can’t yet understand nuance within content, and creators often lose out on income as a result.
How to avoid censorship on YouTube
Depending on your subject matter, you may face an uphill battle in avoiding censorship and demonetization of your videos on YouTube. That’s because YouTube’s moderation strategy isn’t perfect and remains subjective.
YouTube’s moderation algorithm is full of flaws that fail to capture nuance and context. And YouTube’s human moderators are just that … human. That means they’re vulnerable to their own biases, as much as they try to remain objective.
That being said, there are some basic things you can do to adhere to community guidelines and avoid censorship. First, you can avoid these topics:
But if your content covers some of these topics with important nuances that YouTube just isn’t getting, you can take some extra measures:
1. Bleep swear words. YouTube probably won’t ban you for swearing on your channel, but you may get flagged as inappropriate if it’s excessive. Don’t feel the need to bleep out the occasional f-bomb, but if you’re swearing a lot, you may want to bleep or include a content warning.
2. Blur any suggestive body parts. YouTube doesn’t allow any nudity or sexually suggestive content, which includes showing genitals, breasts, and butts. If you do need to be naked for whatever creative reason, blur out the naked body parts to avoid being flagged.
3. Categorize age-restricted content. You know your channel—if some videos aren’t appropriate for kids, categorize them as such. But be aware that some advertisers stay away from age-restricted content, so you may see less AdSense revenue on those videos.
4. Be strategic with titles and thumbnails. YouTube’s moderation algorithm pays special attention to titles and thumbnails as a way to minimize clickbait on the platform. Unfortunately this can backfire for creators who post sensitive subject matter, even if it doesn’t outright violate community guidelines. Be careful not to include offensive language and suggestive imagery in titles and thumbnails.
5. Include content warnings at the beginning of your video. Content warnings give your viewers a choice about whether or not they watch a video with sensitive material. When you empower your audience and let them know what to expect, you may avoid a flag from disgruntled viewers who were simply caught off guard by the subject matter.
How to fight demonetization on YouTube
If one of your videos has been marked as “not suitable for most advertisers”, you’ll see a yellow dollar sign next to the video. That’s when you can appeal YouTube’s decision by submitting the video for manual review.
According to YouTube:
“If you think our systems made a mistake, then you can request human review. Your review gets sent to an expert and their decisions help our systems get smarter over time. Deleting the video and re-uploading won't help. Videos can only be submitted for review one time and the review decision cannot be overturned.”
If your appeal doesn’t pan out, you can:
- Edit your video titles and descriptions for ad friendliness
- Delete videos YouTube has flagged as problematic (helpful if your whole channel is demonetized)
- Reapply for monetization, usually after 30 days
But if YouTube still isn’t giving you a break after you’ve made concessions, we recommend the following alternative paths:
- Invest in brand partnerships. If you can score brand deals that sponsor entire videos, your lack of AdSense revenue will hurt much less.
- Build a following on another channel. Many a creator has left YouTube during the “Adpocalypse” to build a following somewhere else. For example, Twitch isn’t just for gamers anymore—you may want to consider testing whether the platform is right for your content.
Censorship on Twitch
Twitch community guidelines
You can find Twitch’s community guidelines here.
Twitch also empowers content creators with two channel moderation features: AutoMod and human moderator promotion.
How moderation works on Twitch
Twitch moderation is different from YouTube’s because Twitch content is delivered live.
A Twitch stream comprises two main components: a livestream from the creator and a chat for participants. Both move at breakneck speed in real time, which can make moderation a challenge.
Twitch leaves chat moderation to the discretion of the channel owner. Creators have two options: Use AutoMod to ban keywords they don’t want to see in the chat and/or promote other users they trust to act as human moderators in the chat.
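Twitch’s actual AutoMod is far more sophisticated than a plain blocklist (it catches misspellings and leetspeak, and ranks messages by severity), but the basic idea, holding chat messages that match creator-defined terms for a human moderator to review, can be sketched like this. Everything here is an illustration, not Twitch’s real implementation:

```python
# Minimal sketch of a keyword-based chat filter, in the spirit of
# Twitch's AutoMod. Illustrative only: real AutoMod also handles
# misspellings, leetspeak, and severity tiers.
import re

class SimpleAutoMod:
    def __init__(self, blocked_terms):
        # Word-boundary match, so a blocked word buried inside
        # another word (e.g. "ass" in "class") doesn't trip the filter.
        pattern = "|".join(re.escape(term) for term in blocked_terms)
        self.blocked = re.compile(rf"\b(?:{pattern})\b", re.IGNORECASE)

    def review(self, message):
        """Return 'hold' to queue the message for a human mod, else 'allow'."""
        return "hold" if self.blocked.search(message) else "allow"

mod = SimpleAutoMod(["slur1", "slur2"])  # creator-defined blocklist
print(mod.review("great stream today!"))  # allow
print(mod.review("you are a slur1"))      # hold
```

Note that even this toy version doesn’t auto-ban anyone: flagged messages go to a human, which mirrors Twitch’s design of keeping final calls with the channel’s moderators.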
But what if a creator is violating community guidelines? Twitch users can flag unwanted behavior to Twitch, and human moderators work with creators, channel moderators, and affected users to determine the best course of action.
With the exception of AutoMod, Twitch doesn’t seem to be using AI algorithms for channel moderation. But keep in mind that this could change as Twitch grows. As of 2021, Twitch has 9.36 million active streamers—whereas 720,000 hours of video are uploaded to YouTube every day.
The larger the platform, the more it needs to rely on AI for moderation. But Twitch isn’t there just yet, and the platform is making an effort to keep moderation as human as possible.
That’s why everyone is watching Twitch’s community guidelines and harassment policies develop almost in real time. In January 2021, for example, Twitch updated its Hateful Conduct and Harassment Policy to strengthen its zero-tolerance stance on online harassment, with a specific focus on sexual harassment and hateful conduct toward marginalized groups.
Where Twitch moderation goes wrong
In 2019, Twitch streamer and The Young Turks host Hasan “HasanAbi” Piker was suspended for saying that “America deserved 9/11”.
Piker was commenting on Texas congressman Dan Crenshaw’s appearance on Joe Rogan’s podcast, during which he defended America’s practice of maintaining military bases in more than 100 countries. Piker expressed disgust at the comment, saying, “We deserved 9/11, dude, I’m saying it…. We brought it on ourselves.”
Piker later said he “used imprecise language” to criticize the American government, but that he doesn’t support terrorism.
How to avoid censorship on Twitch
First, something you should know: In April 2021, Twitch cemented a new policy that holds streamers accountable for their behavior off Twitch.
That means Twitch can ban a streamer for harassing someone on, say, Discord or Twitter. Twitch has announced that it will work with other social media platforms to investigate digital communications that indicate harassment.
But what about the basic rules? Here’s what you should pay special attention to during your streams and off Twitch:
1. Don’t use offensive speech. Use common sense when you speak, and default to kindness. Don’t use any language that discriminates against ethnicity or race, religious beliefs, gender, gender identity, disability, appearance, age, or veteran status.
2. Be careful who you invite to voice comms on your stream. You’re not only accountable for what you say on your own stream, but for what everyone else is saying on your stream, too. You can sustain a suspension or ban for what someone else says on voice comms, so make sure to educate your guests on the rules before you invite them.
3. Set your channel to “mature audiences” if you swear a lot. You can use curse words on Twitch, but if you know you swear like a sailor, categorize your channel appropriately.
4. Don’t even joke about doxxing. Twitch takes serious offense to doxxing, which is when someone publicizes a user’s address and contact information without their consent. Twitch doesn’t even want you to doxx yourself, so don’t tell anyone where you are during your stream.
5. Moderate your Discord server. Now that Twitch is moderating off-service behavior, you’re also responsible for moderating your Discord community and making sure people aren’t harassed on the platform. Use Discord’s moderation tools like you would use Twitch’s AutoMod tool for your chat.
How to fight a suspension or ban on Twitch
If a streamer has violated community guidelines, Twitch will first issue a warning with content removal if the offense isn’t serious. Next steps after that are a suspension and eventually a permanent ban if problems persist. Suspensions range from one to 30 days.
To appeal a suspension or ban, streamers need to contact Twitch at help.twitch.tv. You’ll have the opportunity to state your case as to why Twitch should reverse a suspension or ban.
Keep in mind that Twitch appeals are conducted by actual humans, who can take anywhere from a few days to a month to respond. Twitch does prioritize streamers who are part of their affiliate and partner programs, so if that’s not you it could take longer.
Learn more about account enforcements and chat bans here.
How to fight internet censorship as a creator
The harsh truth is that there isn’t much recourse for creators who feel they’ve been unjustly censored on YouTube and Twitch. Demonetization, suspensions, and bans can have a devastating effect on revenue if appeals have been repeatedly denied.
And as we’ve learned from the LGBTQ+ creators whose censorship lawsuit against YouTube failed, current laws are ill-equipped to deal with the creator economy and freedom of expression on a mass digital scale. Lawmakers simply haven’t caught up to the realities of online communication and content creation—and likely won’t for quite some time.
But that doesn’t mean you can’t contribute your voice to the fight against online censorship. Onlinecensorship.org, a joint project by the Electronic Frontier Foundation and Visualizing Impact, collects reports from creators who have been censored by a variety of social media platforms.
Onlinecensorship.org offers resources, appeal information, and original research on content censorship in an effort to pressure social media platforms to be transparent about why they make certain decisions about content. If you’ve been censored, submit a report here.
Final thought: If you’ve been censored as a content creator, be loud about it. Post about your experience on all your social accounts, and be clear about why you feel wronged.
Things don’t change when people are silent—exposure and accountability on a large scale lead to change in policy, and your voice can help all creators, especially from marginalized groups, maintain freedom of expression.