Will the Logan Paul video be a reckoning for YouTube?

By the time Logan Paul arrived at Aokigahara forest, colloquially known as Japan’s “suicide forest,” the YouTube star had already confused Mount Fuji with the country Fiji. His more than 15 million (mostly underage) subscribers enjoy this sort of comedic aloofness; it makes Paul seem more relatable.

After hiking only a couple hundred yards into Aokigahara (where 247 people attempted to take their own lives in 2010 alone, according to police statistics cited in The Japan Times), Paul encountered a suicide victim’s body hanging from a tree. Instead of turning the camera off, he continued filming, and later uploaded close-up shots of the corpse with the person’s face blurred out.

“Did we just find a dead person in the suicide forest?” Paul said to the camera. “This was supposed to be a fun vlog.” He went on to make several jokes about the victim, while wearing a large, fluffy green hat.

Within a day, more than 6.5 million people had viewed the footage, and Twitter flooded with outrage. Even though the video violated YouTube’s community standards, in the end it was Paul, not YouTube, who deleted it.

“I should have never posted the video, I should have put the cameras down,” Paul said in a video posted Tuesday, which followed an earlier written apology. “I’ve made a huge mistake, I don’t expect to be forgiven.” He didn’t respond to two follow-up requests for comment.

YouTube, which failed to act on Paul’s video, now finds itself wrapped in another controversy over how and when it should police offensive and disturbing content on its platform, and, just as important, over the culture it foments that produces such content. YouTube encourages stars like Paul to garner views by any means necessary, while largely deciding how and when to censor their videos behind closed doors.

‘Absolutely Complicit’

Before uploading the video, which was titled “We found a dead body in the Japanese Suicide Forest…,” Paul halfheartedly attempted to censor himself for his mostly tween viewers. He issued a warning at the beginning of the video, blurred the victim’s face, and included the numbers of several suicide hotlines, including one in Japan. He also chose to demonetize the video, meaning he wouldn’t make money from it. His efforts weren’t enough.

“The mechanisms that Logan Paul came up with fell flat,” says Jessa Lingel, an assistant professor at the University of Pennsylvania’s Annenberg School for Communication, where she studies digital culture. “Despite them, you see a video that nonetheless is very disturbing. You have to ask yourself: Are those efforts really enough to frame this content in a way that’s not just hollowly or superficially aware of damage, but that is meaningfully aware of damage?”

The video still included shots of the corpse, including the victim’s hands, which had turned blue. At one point, Paul referred to the victim as “it.” One of the first things he said to the camera after the encounter was, “This is a first for me,” turning the conversation back to himself.

There’s no excuse for what Paul did. His video was disturbing and offensive to the victim, their family, and to those who have struggled with mental illness. But blaming the YouTube star alone seems insufficient. Both he and his equally famous brother, Jake Paul, earn their living from YouTube, a platform that rewards creators for being outrageous and often fails to adequately police its own content.

“I think that any analysis that continues to focus on these incidents at the level of the content creator is only really covering part of the structural issues at play,” says Sarah T. Roberts, an assistant professor of information studies at UCLA and an expert in internet culture and content moderation. “Of course YouTube is absolutely complicit in these kinds of things, in the sense that their entire economic model, their entire model for revenue creation is created fundamentally on people like Logan Paul.”

YouTube takes 45 percent of the advertising money generated by Paul’s and every other creator’s videos. According to SocialBlade, an analytics company that tracks the estimated revenue of YouTube channels, Paul could make as much as $14 million per year. While YouTube might not explicitly encourage Paul to pull ever more extreme stunts, it stands to benefit financially when he and creators like him rack up millions of views with outlandish episodes.

“[YouTube] knows for these people to maintain their following and gain new followers they have to keep pushing the boundaries of what is bearable,” says Roberts.

YouTube presents its platform as democratic; anyone can upload and contribute to it. But it simultaneously treats enormously popular creators like Paul differently, because they command such massive audiences. (Last year, the company even chose Paul to star in The Thinning, the first full-length thriller distributed via its streaming subscription service YouTube Red, as well as Foursome, a romantic comedy series also offered via the service.)

“There’s a fantasy that he’s just a dude with a GoPro on a stick,” says Roberts. “You have to actually examine the motivations of the platform.”

For example, major YouTube creators I have spoken to in the past said they often work with a representative from the company who helps them navigate the platform, a luxury not afforded to the average person posting cat videos. YouTube didn’t respond to a follow-up request about whether Paul had a rep assigned to his channel.

All Things in Moderation

It’s unclear exactly why YouTube let the video stay up for so long; the delay may have been the result of the platform’s murky community guidelines. YouTube’s comment on the matter doesn’t shed much light, either.

“Our hearts go out to the family of the person featured in the video. YouTube prohibits violent or gory content posted in a shocking, sensational or disrespectful manner. If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age-gated,” a Google spokesperson said in an emailed statement. “We partner with safety groups such as the National Suicide Prevention Lifeline to provide educational resources that are incorporated in our YouTube Safety Center.”

YouTube may have initially decided that Paul’s video didn’t violate its policy on violent and graphic content. But those guidelines consist of only a few short sentences, making it impossible to know.

“The policy is vague, and requires a bunch of value judgements on the part of the censor,” says Kyle Langvardt, an associate law professor at the University of Detroit Mercy Law School and an expert on First Amendment and internet law. “Basically, this policy reads well as an editorial guideline… But it reads terribly as a law, or even a pseudo-law. Part of the problem is the vagueness.”

A meaningful step toward transparency, Lingel says, would be for YouTube to implement a moderation or edit log. There, YouTube could disclose which team screened a video and when; if moderators chose to remove or age-restrict it, the log could also record which community-standard violation prompted the decision. It could be modeled on something like Wikipedia’s edit logs, which show all of the changes made to a specific page.

“When you flag content, you have no idea what happens in that process,” Lingel says. “There’s no reason we can’t have that sort of visibility, to see that content has a history. The metadata exists, it’s just not made visible to the average user.”
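To make Lingel’s idea concrete, here is a minimal sketch, in Python, of what a single entry in such a public moderation log might look like. Everything in it is hypothetical: YouTube has described no such system, and the field names are purely illustrative.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# A hypothetical entry in a public moderation log, loosely modeled on
# Wikipedia's edit history. None of these fields reflect an actual
# YouTube system; they illustrate the kind of metadata Lingel notes
# already exists but isn't surfaced to the average user.
@dataclass
class ModerationLogEntry:
    video_id: str                       # which video was reviewed
    reviewed_at: datetime               # when the review happened
    reviewer_team: str                  # which team (or automated system) screened it
    action: str                         # e.g. "no_action", "age_restricted", "removed"
    policy_cited: Optional[str] = None  # the community guideline behind the decision

# Example: the history a viewer could consult after flagging a video.
log = [
    ModerationLogEntry(
        video_id="abc123",
        reviewed_at=datetime(2018, 1, 2, tzinfo=timezone.utc),
        reviewer_team="trust-and-safety",
        action="age_restricted",
        policy_cited="violent-or-graphic-content",
    ),
]

for entry in log:
    print(f"{entry.reviewed_at:%Y-%m-%d}: {entry.action} "
          f"({entry.policy_cited}) by {entry.reviewer_team}")
```

As with a Wikipedia edit history, the value would lie less in any single entry than in the visible chain of decisions behind a video’s removal or survival.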

Fundamentally, Lingel says, we need to rethink how we envision content moderation. Right now, when a YouTube user flags a video as inappropriate, it’s often left to a low-wage worker to tick through a checklist, determining whether the video violates any community guidelines (YouTube pledged to expand its content moderation workforce to 10,000 people this year). The task is sometimes even left to an AI that quietly combs through uploads for prohibited material, such as ISIS recruiting videos. Either way, YouTube’s moderation process is mostly anonymous and conducted behind closed doors.

It’s helpful that the platform has baseline standards for what is considered appropriate; we can all agree that certain types of graphic content depicting violence and hate should be prohibited. But a positive step forward would be to develop a more transparent process, one centered around open discussion about what should and shouldn’t be allowed, on something like a public moderation forum.

Paul’s video represents a potential turning point for YouTube, an opportunity to become more transparent about how it manages its own content. If it doesn’t take the chance, scandals like this one will only continue to happen.

As for the Paul brothers, they’re likely to keep making similarly outrageous and offensive videos to entertain their massive audience. On Monday afternoon, just hours after his brother Logan issued an apology for the suicide forest incident, Jake Paul uploaded a new video titled “I Lost My Virginity…”. At the time this story went live, it already had nearly two million views.

If you or someone you know is considering suicide, help is available. You can call 1-800-273-8255 to speak with someone at the National Suicide Prevention Lifeline 24 hours a day in the United States. You can also text WARM to 741741 to message with the Crisis Text Line.


A smartphone app that lets kids anonymously report bullying

Amanda Todd was 15 when she committed suicide.

It was October 10, 2012, about a month after she posted a heart-wrenching video on YouTube in which she used a series of flashcards to explain how she had been bullied by classmates and anonymous strangers, online and off, over the years. The post went viral after her death. It’s been viewed more than 10 million times on YouTube and is often cited in the ongoing conversation about the need to criminalize cyberbullying.

But for Todd Schobel, punishing bullies once tragedy strikes isn’t enough. What we need, he says, are more ways to catch bullies in the act.

Schobel first heard Amanda’s story while listening to the radio in his car. He was inspired to launch Stop!t, an app that lets students anonymously report bullying. Since launching in August, Stop!t has been adopted by 78 schools in 13 states, and today the company is announcing it has raised $2.6 million to scale not only in school districts but also on college campuses and in the workplace.

“We all know bullying is never going to go away,” Schobel says, “but we think we can give it a good shot of penicillin.”

The fact is, bullying isn’t what it used to be. The age of the internet has spawned a new type of bully, one who can reach victims anytime, anywhere, with the click of a button. It’s an issue not just for the victims but for bystanders as well.

As bullying continues to cause tragedy after tragedy, schools in particular are increasingly being held accountable for failing to intervene. With Stop!t, Schobel wants to arm both victims and bystanders with a tool that can track bullying no matter where it occurs.

“It used to be if it happened on school grounds, schools needed to take action, but if it happened off school grounds, they weren’t obligated,” Schobel says. “With cyberbullying there is no school grounds anymore. If it affects the learning environment for the students, the school has to take action.”

Schools pay a flat rate of $2 to $5 per student per year to use Stop!t. First, a school must sign up and pre-program a list of trusted adults and administrators who should have access to the reports. Students download the app, enter their school’s unique identification code, and when an instance of cyberbullying occurs, they can take a screenshot of the interaction and anonymously send it to the administrative team. It’s that last part that Schobel says is key.

“Cyber abuse often goes unreported, because people don’t tell people,” he says. “They get embarrassed, or there’s fear of retribution or of being called a snitch.”
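As a rough illustration of how such a flow can preserve anonymity, here is a minimal sketch in Python. Stop!t hasn’t published its internals, so every name and field below is a hypothetical stand-in; the point is only that the report carries the school’s identification code and the screenshot, and nothing that identifies the student.

```python
import uuid
from dataclasses import dataclass
from datetime import datetime, timezone

# A hypothetical anonymous bullying report, sketching the flow the
# article describes. Nothing here is Stop!t's actual data model; the
# key property is that no field identifies the reporting student.
@dataclass
class AnonymousReport:
    report_id: str         # random ID so administrators can track follow-up
    school_code: str       # the school's unique identification code
    screenshot: bytes      # the captured evidence
    submitted_at: datetime

def submit_report(school_code: str, screenshot: bytes) -> AnonymousReport:
    """Build a report that contains no information about the sender."""
    return AnonymousReport(
        report_id=uuid.uuid4().hex,
        school_code=school_code,
        screenshot=screenshot,
        submitted_at=datetime.now(timezone.utc),
    )

# Example: a student submits a screenshot; the pre-programmed team of
# trusted adults at the matching school receives it anonymously.
report = submit_report("EXAMPLE-SCHOOL-001", b"<screenshot bytes>")
print(report.report_id, report.school_code, report.submitted_at)
```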

By reporting anonymously, students can tell administrators who the victims and bullies are without implicating themselves. That has had one important side effect, according to Brian Luciani, principal of David Brearley High School in Kenilworth, New Jersey: Since adopting the app last year, Luciani says, the school has received 75 percent fewer bullying reports.

As Luciani explains it, that’s because the very knowledge that every student has a reporting tool in their pockets is deterring bullies from bullying in the first place. “It would be disingenuous to say it’s all because of the Stop!t app, but I think it was a huge help toward kids thinking twice about what they post and send each other,” Luciani says.

For Schobel, that’s no surprise. “When you increase the likelihood of getting caught, then it becomes a deterrent,” he says.

Schobel is now focusing on ways to get more institutions to adopt Stop!t. He’s looking into working with insurance companies that protect large school districts, which could vastly expand Stop!t’s footprint in schools.

Meanwhile, he and his 17-person team are working on a version of the app that could be easily adapted for other environments like workplaces, college campuses, and even the military. “Unfortunately,” Schobel says, “the market’s gigantic.”