Disturbing Videos Show YouTube Has a Serious Graphic Content Issue

A disturbing YouTube video recently highlighted serious problems in the site’s content moderation system. After hiking a few hundred yards into the Aokigahara forest, YouTuber Logan Paul encountered a suicide victim’s body hanging from a tree. At least 247 people attempted to take their own lives there in 2010 alone, according to The Japan Times. Instead of turning the camera off, Paul kept filming, according to Wired. He later uploaded close-up shots of the corpse, with the person’s face blurred out.

“Did we just find a dead person in the suicide forest?” he said to the camera. “This was supposed to be a fun vlog.” He also made several jokes about the victim. Paul’s video joins a wealth of other disturbing content that has slipped onto YouTube recently. That happens for a number of complex reasons. Spoiler alert: Replacing humans with code is part of the problem.

1. The star apologized, but he had already gone too far

Logan Paul | Jesse Grant/Getty Images for VH1 Press

Within one day, more than 6.5 million people viewed the vlog, and the internet just about exploded with outrage. Even though the video clearly violated YouTube’s community standards, it was Paul himself who deleted it and apologized. “I should have never posted the video, I should have put the cameras down,” he said in a video. “I’ve made a huge mistake. I don’t expect to be forgiven.”

While YouTube does not explicitly encourage its vloggers to pull insane stunts, it does profit from them. The company takes 45% of the advertising money generated by creators’ videos. According to SocialBlade, an analytics company that estimates the revenue of YouTube channels, Paul could make as much as $14 million per year.

Next: Did Paul try hard enough to shield his fans?

2. The YouTube star took some steps to protect viewers

A screenshot from Paul’s apology video. | Logan Paul via YouTube

While Paul did demonetize the suicide video, include warnings, and post suicide hotline information, experts say he did not go far enough. “The mechanisms that Logan Paul came up with fell flat,” Jessa Lingel, of the University of Pennsylvania’s Annenberg School for Communication, told Wired. “Despite them, you see a video that nonetheless is very disturbing. … Are those efforts really enough to frame this content in a way that’s not just hollowly or superficially aware of damage, but that is meaningfully aware of damage?”

Next: The platform itself could also have tried harder.

3. YouTube’s own standards and procedures fell short

Part of YouTube’s policy on graphic content. | Screenshot via YouTube

A Google spokesperson sent out a statement responding to the video. “Our hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational, or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information,” it said. “We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center.”

The video also clearly violated YouTube’s own graphic content standards. Those read, in part, “It’s not okay to post violent or gory content that’s primarily intended to be shocking, sensational, or gratuitous. If a video is particularly graphic or disturbing, it should be balanced with additional context and information.”

Next: This isn’t the first time YouTube has encountered this problem.

4. Anti-Semitic views led Disney to fire this vlogger

PewDiePie eventually posted a video defending himself. | Screenshot via YouTube

The Walt Disney Company told the Wall Street Journal it had cut ties with YouTube celebrity Felix “PewDiePie” Kjellberg over a series of anti-Semitic videos in 2017. In one video, Kjellberg, wearing a “Make America Great Again” hat, used a photo of Hitler as a segue between clips. He also ran a clip of a Hitler speech in a video criticizing a YouTube policy, posted swastikas drawn by his fans, and watched a Hitler video while wearing a brown military uniform, among other Nazi references.

Kjellberg remains the most popular YouTuber, with more than 53 million subscribers. Initially, he apologized for his offensive content. Later, after advertisers pulled their support and YouTube canceled the second season of his show, he went on the defensive. He posted a video in which he blamed the media for misrepresenting him and scorned the Journal, which had first brought the videos to the attention of Disney and YouTube. The star further defended his ability to “joke about anything,” according to Polygon.

Next: The next controversy involves showing distressing videos to kids.

5. Disturbing content also appears in children’s videos

Some of the content registers as merely strange, while other videos are genuinely distressing. | Screenshot via YouTube

As James Bridle writes in a Medium post, the popular YouTube Kids app hosts scores of disturbing videos. They vary widely, from mildly unsettling to incredibly creepy, and many appear randomly generated. As a report on the issue from The New York Times notes, the app draws more than 11 million weekly viewers and is supposed to automatically filter child-friendly content from the main site.

Malik Ducard, YouTube’s global head of family and learning content, called the inappropriate videos “the extreme needle in the haystack.” He added, “Making the app family friendly is of the utmost importance to us.” Despite that assertion, some experts say YouTube does not try hard enough.

Next: This problem demonstrates a frightening reality about the way the site gets moderated.

6. Algorithms do not replace human intervention

Sometimes, you just need real people. | Jacoblund/iStock/Getty Images

Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, told The New York Times that inappropriate videos on YouTube Kids demonstrate the hazards of today’s media. “Algorithms are not a substitute for human intervention, and when it comes to creating a safe environment for children, you need humans,” Golin said. His group and the Center for Digital Democracy filed a complaint with the Federal Trade Commission in 2015, accusing YouTube Kids of deceptive marketing to parents based on the inappropriate videos it served.

Google continually defends its errors by pointing to the staggering amount of content it hosts: users upload more than 400 hours of video to YouTube every minute. The company does maintain a staff of moderators, but how well does that system work?

Next: The way Google uses its moderators presents a problem.

7. Moderators work intense, traumatizing jobs

Content moderators work long hours and see terrible things. | Adam Berry/Getty Images

Google reportedly employs “a 10,000-strong army of independent contractors to flag offensive or upsetting content” in search results. One expert estimated that well over 100,000 people, many of them low-paid offshore workers, are paid to police online content across the industry. Wired undertook a deep investigation into how this content really gets monitored.

Jane Stevenson formerly headed the occupational health and welfare department of Britain’s National Crime Squad. Her occupational health consultancy, Workplace Wellbeing, focuses on high-pressure work such as content moderation. “From the moment you see the first image, you will change for good,” Stevenson says. “There’s the thought that it’s just the same as bereavement, or bullying at work, and the same people can deal with it … All of these things are normal things. But is having sex with a 2-year-old normal? Is cutting somebody’s head off — quite slowly, mind you; I don’t mean to traumatize you but beheadings don’t happen quickly — is that normal behavior? Is that something you expect?”

Next: Watching distressing content takes a serious toll on workers.

8. Content moderators suffer intense effects from the job

The job can result in significant stress. | iStock.com/kieferpix

One woman who monitors disturbing content told Wired that constant exposure to the videos made some of her coworkers paranoid. Two of her female coworkers became so wary of the depravity they had seen that they no longer leave their children with babysitters, and they sometimes miss work as a result. “I get really affected by bestiality with children,” she said. “I have to stop for a moment and loosen up, maybe go to Starbucks and have a coffee.”

Psychologists who work with these moderators say they often show signs of PTSD and burn out quickly. While Google has pledged to hire even more human moderators, it must also do more to stem that kind of content in general.

Next: YouTube did pledge to take measurable steps against the problem.

9. Google plans to hire more moderators, improve algorithms

They will at least try to make a difference. | iStock.com/ymgerman

While YouTube can’t control how disturbing content affects its moderators, it has pledged to try to make sure less of it gets through. In a blog post, YouTube CEO Susan Wojcicki said the company plans to increase its content moderation staff to 10,000 people in 2018. YouTube also plans to publish regular reports that transparently detail the flags it receives and how it removes inappropriate content.

TechCrunch also reports that the company plans to more carefully consider which channels and videos become eligible for advertising, using stricter criteria combined with more manual curation. None of these measures will entirely prevent content like Paul’s and Kjellberg’s, and they certainly won’t stop all disturbing content from sneaking into kids’ videos. They do, however, show that Google and YouTube will at least try.
