Holding Logan Paul Accountable

Valerie Wu, Features Editor

In recent years, social media networks have come under increasing fire for their unwillingness to regulate content. YouTuber Logan Paul’s recent video of a dead body in a suicide forest, which went viral earlier this year, drew explosive controversy.

Some were outraged over the fact that YouTube had failed to censor the video, especially because the company had flagged videos with far less provocative content before. Others claimed that the system worked the way it was supposed to.

After enough outcry, Paul felt pressured to remove the video and apologize, demonstrating that when content creators go too far, they’ll hear about it.

However, by then the video had already trended for over 24 hours, amassing over 6 million views. This kind of rapid, sensation-driven self-promotion has become a staple of YouTube’s content creators.

It raises the question: who is responsible for inappropriate content, the creator or the platform that carries it?

And it’s not just YouTube that’s been in the spotlight lately. Saturday Night Live star Leslie Jones was the target of vicious online abuse, with many trolls directing racist and sexist hate speech at her for starring in the 2016 Ghostbusters revival.

Milo Yiannopoulos, the alt-right commentator and professional troll, not only directed hateful tweets at Jones but also started a fake account in her name. Twitter eventually banned Yiannopoulos from its network.

There’s more: Trump threatened nuclear war against North Korea on Twitter, leading his critics to suggest he was not only violating Twitter’s terms of service but also breaking the law. Moreover, bullying is prevalent online, especially on social media platforms such as Instagram and Snapchat. Rape and death threats terrorize female gamers, but the anonymous nature of many of these platforms leaves them with little recourse.

Legally, the responsibility for the damage done on social media platforms does not fall to the networks themselves. Section 230 of the Communications Decency Act, a federal law, shields such networks from liability for illegal content posted by their users.

Yet ethically, online networks need to do a better job of regulating their content, especially when it comes to preventing hate speech and threats.

The issue isn’t a matter of freedom of speech, but rather who we’re choosing to give our platforms to. When social media platforms choose to focus on content that is as problematic as it is sensational, the problem becomes not just the content creator, but the platform itself.

YouTube should establish clear consequences for “offensive” content, in the style of mainstream media. Instead of turning a blind eye under the premise of “freedom of speech” or relying on ineffective auto-moderation, staffers at YouTube need to demonetize and remove accounts whose content has been flagged as offensive.

Incidents like Logan Paul’s aren’t isolated and can’t be entirely prevented with the elimination of one account. But when there are fewer influencers who create problematic content, then these incidents can be reduced.

Moreover, platforms need to better enforce their standards of content. Creators have largely been valued more than consumers. Instead of adopting a business practice of “salutary neglect,” YouTube needs to establish consumer protection rules in its terms and conditions, rules that do not currently exist within the contract.

By establishing baseline laws that check the power of social media platforms, we can ensure that platforms vet content before it’s posted and prevent those with power from abusing it.

That’s why how we characterize these “misdeeds” is so important: we should confront the very social structures that enabled them in the first place.

Social media networks need to ensure that the lines between monetized entertainment and reality are clear. In today’s digital age, accountability is more important than ever.