YouTube is changing its anti-harassment policies

A young woman wearing headphones walks past a billboard advertisement for YouTube on September 27, 2019 in Berlin, Germany.

YouTube has announced that it’s making long-awaited changes to its harassment policy, saying it will tighten rules around what’s considered a threat and toughen punishment for repeat offenders.

For years, the video platform has faced intense scrutiny from critics, including its own employees, who say it’s allowed hate speech and harassment to flourish — particularly with content that targets racial minorities, women, LGBTQ individuals, and other historically marginalized groups.

Controversy around YouTube’s policies hit a high point in June after Vox Media journalist Carlos Maza called public attention to the repeated harassment he was receiving from conservative YouTube commentator Steven Crowder. Over the course of two years, Crowder routinely used racial and homophobic slurs in his widely watched videos attempting to debunk Maza’s work.

After initially saying that Crowder’s videos didn’t violate YouTube’s community guidelines, the company ended up reversing course and penalized Crowder by suspending his ability to earn ad revenue. Still, it stopped short of removing any of his videos from the platform. 

Amid criticism for how it handled the situation, the company promised six months ago that it would take a “hard look” at its policies. Now, we’re seeing the results of those changes, which appear to be a step in the right direction for YouTube, whose critics have long demanded that it do a better job policing harmful content.

The changes announced on Wednesday are incremental and will largely depend on execution rather than policy. If, going forward, YouTube does take down more content that meets a broader definition of harassment, it will undoubtedly provoke controversy, particularly at a time when the company continues to face pressure from Republican leaders such as President Donald Trump over claims that the video platform censors conservative speech.

“One of the goals is to make sure that the free speech and public debate that exists on YouTube platform is not stifled, but that it continues to exist,” Neal Mohan, YouTube’s chief product officer, told Recode about the changes. Mohan said the company is going through an “incubation process” in which the company is training thousands of raters to more accurately identify speech that constitutes harassment under the new policies. Raters are YouTube staff who help determine if content violates the company’s community guidelines.

When asked specifically about the Crowder-Maza controversy, Mohan confirmed that YouTube would take down several videos posted by Crowder in which he attacks Maza. Mohan declined to comment on any other specific videos that YouTube plans to remove from its platform.

In an interview with Recode, Maza shared his reaction to YouTube’s decision to take down the videos in question.

“It doesn’t fix the problem — which is not that these videos exist, but that YouTube is designed to make videos like this in perpetuity for profit,” Maza said, citing the continued popularity of anti-immigrant and anti-LGBTQ content on YouTube despite anti-hate speech policies already being in place. “I’m skeptical about YouTube’s willingness to enforce these policies. The truth is, they should have penalized and prohibited this kind of content already.”

Crowder did not respond to Recode’s request for comment on YouTube’s changes. But in a video posted on his YouTube account on Tuesday, the conservative commentator warned his audience about rumors that an upcoming YouTube policy change would cause a “purge” of conservative content.

He said he refuses to apologize for the “controversy that was generated” by his channel, saying that his goal now is to “keep as much content accessible as possible” and to “find out what the rules are and learn to play by them.”

In June, Vox’s editor-in-chief, Lauren Williams, and head of video, Joe Posner, wrote an open letter asking YouTube to better clarify and enforce its harassment policy.

A spokesperson for Vox Media declined to comment on YouTube’s harassment policy changes and plans to take down certain Crowder videos, and referred Recode to the previous open letter.

Beyond the Crowder situation, the remaining question is exactly how stringently YouTube will enforce these policies across the billions of videos that are watched on its platform every day. While these guidelines help clarify certain scenarios that constitute harassment, there’s still plenty of room for ambiguity around how these rules may or may not be applied.

Broadening the definition of harassment

YouTube is making three significant shifts in its content moderation policies, all of which essentially make it easier to identify and remove videos on the grounds of harassment.

The first change is that it will “no longer allow content that maliciously insults someone based on protected attributes such as their race, gender expression, or sexual orientation,” and it will apply these rules not just for private individuals, but for public figures as well.

The company is also expanding its definition of threats to include “veiled or implied threats,” not just direct ones — for example, someone menacingly holding a knife in a video while talking about a person, even if they don’t actually state a threat.

The next policy YouTube is updating involves repeat offenders. The company will now reserve the right to dish out “strikes” and eventually remove accounts that repeatedly upload content that may not qualify as harassment in any specific instance, but that altogether demonstrates a pattern of targeting and harassment.

Lastly, the company says it’s applying these rules to the comment sections of videos, and that it expects to remove more comments than it currently does (in the third quarter of 2019, YouTube removed 16 million of them). YouTube will be rolling out a tool to help video creators moderate comments by auto-flagging ones that are potentially inappropriate, and giving owners of the video the ability to review them before they’re posted. The feature will be optional, but turned on by default for YouTube’s largest channels with the most active comment sections.

How YouTube made the changes

In its blog post, YouTube said it consulted with video creators and organizations that focus on online bullying and journalists’ rights, as well as “free speech proponents,” and “policy organizations from all sides of the political spectrum,” in drafting this policy. Mohan declined to give specific names of organizations that helped with the process.

Mohan also told Recode that YouTube turned to members of employee resource groups (ERGs), including ones for LGBTQ and black employees.

In the past, Google’s workforce has publicly critiqued how YouTube handled hate speech and harassment. After the Maza harassment controversy, in June, the company’s employees protested Google’s presence at the San Francisco Pride Parade because of the platform’s perceived lack of protection for LGBTQ individuals who were being harassed.

In response to YouTube’s announcement, one former Google employee told Recode, “I do think the most important thing is not what they say — because their policies already seemed to prohibit a lot of the harassment we were complaining about — but what they do. We’ll know in a few months if these changes are in any way meaningful.”


