One controversy journalists continue to face is determining what violent or graphic imagery is appropriate to publish. The question of regulating such images arose when YouTube star Logan Paul released a video filmed in Japan's "suicide forest".
Paul posted a video of his travels in Japan with his friends that became his infamous downfall. He had visited Aokigahara, a forest also known as the "Sea of Trees". It is the world's second most common suicide site, increasingly seen as "the perfect place to die," according to The Atlantic.
In December 2017, Paul posted a video showing the body of a man who had died by suicide in the forest. In the video, Paul and his friends can be heard laughing when they encounter the body, treating the victim as a source of comedy. Criticism was immediate and worldwide, with people outraged by Paul's disrespect for the victim and his attempt to make the forest's morbid history comedic. The video became the second most-trending video on YouTube in the days after it was posted, even as people demanded that YouTube take it down. YouTube, however, did not respond until days after Paul deleted the video himself. Paul then published a written apology defending his intentions and released a second, apologetic video on his YouTube channel.
Only after all of this did YouTube comment on the situation. Many criticized the company's slow response and its refusal to ban Paul from posting further videos. YouTube's CEO answered these complaints by explaining that Paul had not violated enough of the company's policies to warrant a ban or the removal of his videos from the site, according to Business Insider.
“When someone violates our policies three times, we terminate. We terminate accounts all the time,” CEO Susan Wojcicki said. “He hasn’t done anything that would cause those three strikes.”
This event raised the question of whether companies such as YouTube and other digital media platforms should tighten regulation of the content posted on their sites. Did Paul have the freedom to post this video without violating any law, or merely without violating ethical standards?
Paul attempted to make amends, later posting a video about suicide prevention efforts and donating $1 million to suicide prevention groups. Still, the question remains whether YouTube should be permitted to censor videos about sensitive topics such as suicide. Should platforms be held responsible for such content, or do individuals need to hold themselves to better ethical standards?
In my opinion, there ought to be regulation of hate speech and of speech that could incite future violence or death. In this case, suicide should be treated with particular sensitivity: because the victim was shown after he had hanged himself, viewers could conceivably learn methods of suicide from Paul's video.
In particular, one issue arising from YouTube's slow response is that the company continued to earn significant revenue while the video remained high on the trending list. According to an article in The Independent, many, even within the UN, are campaigning for platforms to be held responsible for videos that remain posted on their pages, which is one possible way to broker greater regulation of online speech. In addition, Ofcom has suggested that online companies such as Google and Facebook should be classified as publishers of the information posted on their sites, so that they could be held accountable and regulated. However, this strikes me as a slippery slope: the boundaries of what counts as an "inappropriate" post would be vague and open to broad interpretation.
I personally believe the central failure in Paul's case was that YouTube did not delete the video sooner. One way to address this particular problem would be to restrict the opportunity to profit from such vile content. However, that would be difficult to implement, and it is unclear under whose authority YouTube would then fall. Regulating speech and posts in the digital age thus remains a highly difficult issue to resolve, with many actors involved in, and affected by, any decision.