Social Media's Control on Censorship

Should social media be responsible for censorship?

Social media has always been known as a place where freedom of speech is the be-all and end-all. Since the internet was created, it has essentially been governed by the people using it, with little interference from governments or larger political bodies. And if you didn't like something, you simply looked away.

But with more and more people using it, and perhaps even misusing it, who should be the one to monitor it?

With the exclusion of actual illegal activity (meeting minors, posting videos of people being harmed, etc.), should the internet be under any strict rules? And since it's "worldwide," who would end up enforcing those rules?

When Donald Trump was banned from Twitter in January 2021, it was because Twitter was facing pressure from its own users, especially after the Capitol riots. But should the social media site be the one taking responsibility for tweets?

It's a touchy area: do we allow people to say things that could be linked to an increase in violence, or do we hope that the masses are smart enough to think critically about what others say?

In a real-life conversation, we wouldn't simply take things at face value or immediately agree with everything someone says, so we shouldn't do it online either. This is the importance of fact-checking: not believing everything we see, and not allowing harmful information to influence our decisions.

Politics may be where we see online censorship the most, and it could be where censorship does the most harm. Seeing all sides of the story is always important, but even more so when we are trying to decide who is doing a good job running things.

We need to have all of our facts, removed from political bias and personal opinions. We need to be able to form our own opinions, but we can’t do that if we’re only getting partial information.

There is quite a split among people over whether social media companies should be the ones doing the fact-checking, as in the case of Trump and Twitter. People are skeptical of how the sites determine that information is misleading. If there is no evidence to back that up, it becomes a story of "he said/she said."

A better course of action could be what Facebook is doing now: labeling posts with a reminder to read the entire article before you share it. Being aware of all the information is crucial to forming a proper opinion.

66% of Americans say they have not much to no confidence that social media companies can correctly label posts as misleading or harmful, which leaves it up to the "consumer" to decide for themselves.

Especially since most of our information, political information in particular, is gathered online, should anything be censored at all?

It also begs the question: should the internet have any right to silence government officials?

Each country has its own set of rules and regulations that govern it. But the internet is… well, worldwide. So is there anything that could "properly" censor those officials? And if so, should we even do it?

The internet is also always evolving and changing. It’s learning all the time, and so are we alongside it.

But the entire premise of the internet has always been a bit of a free-for-all.

There's also a theory that dark or hateful topics censored on mainstream media sites simply get pushed into far corners of the internet, where they become a bigger problem.

“Our mathematical model predicts that policing within a single platform, such as Facebook, can make matters worse and will eventually generate global dark pools in which online hate will flourish,” the study says. [ X ]

Obviously, a world with no hate speech or hurtful content would be ideal. However, allowing these things to remain visible could create room for growth and learning from the other people involved. Instead of harmful content being shoved into a corner of the internet and festering, if it were out in the open, perhaps others could constructively help that individual see why their views are harmful.

“What do you think would happen if the 20,000 moderators on Facebook were all mental health workers and counselors and people who are actually engaging — as long as it’s not illegal, like true harassment, like that stuff has to go — but for the edge cases, these people who are like disturbed people … what would happen if we had 20,000 people who were productively engaging them?” [ X ]

There isn't a clear-cut solution. A lot of social media can be messy and confusing. But doing our part as individuals to educate ourselves, and to be compassionate and understanding, is sometimes all we can do.

So, until social media creates the "perfect system" to deal with this (which, let's be honest, it probably won't), it's up to us.
