YouTube Stars Keep Making Money Even After Breaking the Rules


YouTube stars attract millions of eyeballs and generate billions of dollars in ad revenue for the media giant, which pledges to run its business without tolerating hateful and otherwise harmful videos.

But some of the workers hired to flag problematic content accuse YouTube of playing favorites, doling out more lenient punishments for top video creators whose work brings in the most money for the company. Eleven current and former moderators, who have worked on the front lines of content decisions, believe that popular creators often get special treatment in the form of looser interpretations of YouTube’s guidelines prohibiting demeaning speech, bullying and graphic content.

Moderators said that YouTube made exceptions for popular creators including Logan Paul, Steven Crowder and PewDiePie. Google-owned YouTube denies those claims, saying it enforces its rules equally and tries to draw the line in the right places.

YouTube, the world’s largest video platform with nearly 2 billion people logging in monthly, has faced fierce backlash from critics who say it is enabling hateful and inappropriate content to proliferate. With each crisis, YouTube has raced to update its guidelines for which types of content are allowed to benefit from its powerful advertising engine – depriving creators of those dollars if they break too many rules. That also penalizes YouTube, which splits the advertising revenue with its stars.

Creators who break YouTube’s rules can have their channels or videos stripped of ads – or their content removed entirely. But unlike their counterparts at rivals such as Facebook and Twitter, many YouTube moderators aren’t able to delete content themselves. Instead, they are limited to recommending whether a piece of content is safe to run ads against, flagging it to higher-ups who make the ultimate decision.

The moderators interviewed by The Washington Post say their recommendations to strip advertising from videos that violate the site’s rules were frequently overruled by higher-ups within YouTube when the videos involved higher-profile content creators who draw more advertising. Plus, they say, many of the rules are ineffective and contradictory to begin with. The moderators, who spoke on the condition of anonymity to protect their employment prospects, describe a demoralizing work environment marked by ad hoc decisions, constantly shifting policies and a widespread perception of arbitrary standards when it comes to offensive content.

YouTube spokesman Alex Joseph said in a statement that the company conducts a “systematic review of our policies to make sure we’re drawing the line in the right place. We apply these policies consistently, regardless of who a creator is.” YouTube has made nearly three dozen changes to its policies over the last year. He declined requests for an interview with executives overseeing moderation operations.

The moderators who spoke with The Post said they rate videos internally using criteria that focus on advertisers, not viewers. Ratings like G or PG help YouTube decide how to market the videos to users and advertisers, and moderators say the guidelines can be confusing. For example, YouTube policies ban advertising on videos that contain partial nudity, but only if the partially nude image is considered the “focal” point, or main focus, of the video. If the image is just “fleeting,” ads can still run.

Google-built software used to log problematic content frequently stalls or breaks down, and moderators say the outsourcing companies typically set unrealistic quotas of 120 video reviews a day, which often prompts them to skip over longer videos. YouTube says it doesn’t set quotas.

As a consequence, inappropriate and offensive material often stays up longer than it should, they said.

The frustration expressed by the rank-and-file moderators, who work for third-party outsourcing companies at offices across the U.S., also comes at a moment when these types of social media contractors are pushing for better pay and benefits, as well as psychological support to help deal with PTSD caused by their work.

“When I started this job I thought, I’m going to help get bad content away from kids,” said a former moderator for YouTube in Austin. The moderator’s conclusion when she quit her job last year was that the operation was designed instead to protect the source of YouTube’s revenue. “Our responsibility was never to the creators or to the users – it was to the advertisers.”

YouTube acknowledges that it has two sets of standards for conduct on its site. In apparent contrast to the experience described by moderators, the company says it has stricter rules for creators who can benefit from advertising on their videos because they are effectively in business with YouTube. General community guidelines are somewhat looser. Moderators are split into separate teams to police those two groups, Joseph said, because the division makes their work more specialized and efficient.

YouTube’s business model of sharing ad revenue with popular creators also creates distinct operational challenges. Pulling advertising from a controversial creator may help protect a brand’s reputation, maintain advertiser relationships and preserve public trust. But it also costs YouTube revenue, said Micah Schaffer, a technology policy consultant and a former director at YouTube who focused on trust and safety.

“It’s a huge problem to have a double-standard for different users, particularly if you are more lenient with the high-profile users, because they set the tone and example for everyone else,” Schaffer said.

Some creators have long felt that YouTube treats its most lucrative channels differently from smaller, independent ones.

“I don’t get the same respect that some company with a press team does,” said Stephen, a 25-year-old YouTuber who goes by his first name and runs “Coffee Break,” a channel with 340,000 subscribers. “Creators are getting fed up, and demanding the same respect and transparency and evenhandedness from YouTube” that bigger creators receive.

For most of its 14-year existence, YouTube has viewed itself as a platform for free expression rather than a social network or online community. That has led to what some consider an anything-goes approach to policing videos and has made the company slower to develop tools and operations to address harm.

Starting in mid-2017, brands including PepsiCo and Walmart boycotted YouTube after their ads appeared alongside hateful and extremist content, prompting the company to tighten enforcement. YouTube chief executive Susan Wojcicki in December of that year publicly promised to take down content that was “exploiting our openness,” pledging to change the company’s approach and to bring the total number of people monitoring content across Google for policy violations to 10,000 within a year. Included in the 10,000 are many third-party contractors, who also moderate Google’s app store and other Google products. (That compares with about 30,000 safety and security professionals dedicated to reviewing content at Facebook.)

Moderators point to an incident in late 2017 as evidence of arbitrary standards. YouTube star Logan Paul, whose channel currently has more than 19 million subscribers, uploaded a video of himself alongside the body of a Japanese man who had recently hanged himself from a tree in a forest. (The forest, at the base of Mount Fuji, is known as a sacred place and a frequent site of suicides.)

“Yo, are you alive?” Paul asked the corpse.

YouTube punished him by removing his videos from a premium advertising program, and Paul took down the video. But just a few weeks later, Paul posted a video of himself shooting two dead rats with a Taser. Both the rat video and the suicide video violated community guidelines against violent or graphic content, and Paul had a record of previous infractions.

Moderators interviewed by The Post said they expected that a high-profile creator with several egregious infractions would receive a permanent ban on ads across the entire channel, or that the channel could be removed altogether. Instead, Paul’s ads were suspended for two weeks.

“It felt like a slap in the face,” a moderator said. “You’re told you have specific policies for monetization that are extremely strict. And then Logan Paul broke one of their biggest policies and it became like it never happened.”

Paul did not respond to a request for comment. Joseph, the YouTube spokesman, said the company felt the two-week suspension was an especially stringent punishment designed to set an example for the community. While Paul had other infractions, he had never received three strikes within a 90-day period, the threshold that triggers termination.

The YouTube moderators said the rapid hiring growth and frequent policy changes created a disorganized and stressful environment. The shifting rules sometimes made policing content confusing, forcing managers to make one-off decisions about how to interpret them.

Moderators say that earlier this year they internally flagged a viral video by the Miami rap duo City Girls that featured a contest for a form of butt-shaking known as “twerking.” Two of the moderators said the video violated broad prohibitions against advertising on videos that depict buttocks in a “sexually gratifying” way. They reported it as a matter of principle, even though YouTube makes categorical exceptions to its rules for music videos, some of the most highly viewed content on the site. The City Girls video now has 100 million views.

Still, a former team leader in Austin, Texas, who quit last year, said that “the answers [we received from YouTube] weren’t really rooted in the policies we had.” Instead, the team leader suspected the real question was whether YouTube would lose revenue or advertisers would be upset if they couldn’t advertise on certain videos. That person added that policies changed “at least once a month” – including frequent changes to how children’s content is moderated, creating confusion.

Joseph said YouTube is in the process of tightening policies around children’s content and other topics and has already made many changes.

After a public outcry, YouTube executives in June decided to strip advertising off a popular right-wing broadcaster’s channel for repeated verbal abuse of a gay journalist. But some of the company’s content moderators had already been pushing for that for weeks.

An Austin-based team assigned to review videos by the right-wing broadcaster, Steven Crowder, found that many of them violated YouTube’s policies at least a month before the decision. The team held weekly meetings to flag the most egregious violations to its managers and decided to flag Crowder for posting demeaning videos, which the rules prohibit. Crowder has more than 4 million subscribers.

A week later, the manager reported back to the team: YouTube decided not to remove advertising on those videos.

“The consensus on the floor was that the content was demeaning and it wasn’t safe,” said one of the moderators. “YouTube’s stance is that nothing is really an issue until there is a headline about it.”

A month later, after the journalist who was attacked posted his communications with YouTube, the company initially said the videos didn’t violate its policies despite their hurtful language. The next day, executives reversed course and said the company had suspended Crowder’s ability to make money on the platform through Google’s ad services. Later, YouTube added that he would be able to make money again after removing a link to a homophobic T-shirt he sells online. Finally, YouTube clarified once more that Crowder’s demonetization was the result of a “pattern” of behavior and not just the T-shirt.

YouTube’s Joseph declined to comment on decisions on individual videos but said the company had removed advertising on dozens of other Crowder videos before the blanket ban.

“YouTube has well-written policies. But the policies as written are totally anathema to how YouTube actually operates,” said Carlos Maza, the Vox video reporter who faced attacks from Crowder. “Anything they do is only in reaction to crises.”

A current moderator added, “The picture we get from YouTube is that the company has to make money – so what we think should be crossing a line, to them isn’t crossing one.”

© The Washington Post 2019
