PRO
Gracie Thrush
Staff Reporter
Social media users can post whatever they want, whenever they want. To most, this may seem like an easy way to have their voices heard, but it becomes an issue when individuals use that freedom to post dangerous content. Moderating posts and content is necessary to maintain trust between users and social media sites, and to protect people's privacy.
Although scrolling through social media outlets can be highly entertaining, it is easy to get lost in the overwhelming amount of content. According to Brand Bastion, this can leave users seeing unwanted posts, creating resentment toward the social media site and the brands marketed on it. Users who are not seeing what they want will stop flocking to the apps, and the platforms lose attention and popularity as a whole.
Furthermore, social media websites need regulation to protect their most vulnerable audience: children. While most sites require users to be at least 13 years old, there is no definitive way for sites to ensure that users are being truthful about their age. According to Maastricht University, most social media outlets have policies to protect their younger viewers from harmful content. This form of moderation protects minors from cyberbullying, sexualization and violence.
In order to protect users and their privacy, social media sites have started taking many precautions. According to New Media Services, there are four methods of content moderation through which users and sites can work together: pre-moderation, post-moderation, reactive moderation and user-only moderation. Reactive and user-only moderation in particular let users personalize what they want to see and what they want blocked. This added interaction not only decreases the amount of unwanted content on the site, but also builds trust between the site and its users.
To ensure that users feel safe online and trust social media outlets, the ability to block or restrict certain posts is necessary.
CON
Bennett Bloebaum
Staff Reporter
Everyone has seen social media moderation in action. From Instagram to Pinterest, many accounts are taken down daily. While social media regulation is necessary to keep users safe, it must also be limited to protect freedom of speech.
Moderating users is important in preventing illegal or dangerous activity. However, if these sites can censor anything people say, that is an overreach of power.
Usually, it is not employees going through a set list of criteria to decide which posts come down; instead, an algorithm removes posts and bans people from the site. According to New America, Facebook sends all flagged content into an automated system that decides whether a post should be taken down before an employee ever sees it. This method of moderation can lead to multiple problems, such as posts being flagged and removed even though they did not actually violate Facebook's guidelines.
The inaccuracy of automated flagging was on display when a woman posted about her experience with racial discrimination at a Cracker Barrel. According to USA Today, her posts were taken down after being mistaken for hate speech, when she was actually condemning it.
The final reason for reducing social media moderation is that it runs counter to the First Amendment right to freedom of speech. Legally, the First Amendment does not apply to private companies; they are not required to grant users free speech on their platforms. However, that does not mean this right should be ignored. In many cases, social media is one of the only safe ways to report on dangerous, underreported events, such as the Hong Kong protests, CNBC explains. Because of censorship and moderation, these events cannot be accurately covered.
The importance of free speech outweighs any benefit of censoring social media platforms. Given the many problems with algorithmic moderation, as well as the crucial need for freedom, social media sites should not moderate their content.

“Yes. To a certain extent, it could help filter out false advertisements, fake news, offensive posts and more unsafe content.”

“Yes. They should moderate content as a precautionary measure. It is important that we prevent bots and protect users from inappropriate content.”

“No. Unless the content is offensive or against the law, users should be able to express themselves on social media.”

“Yes. Social media companies should be able to filter what content is uploaded to their sites, especially if it goes against their terms of service.”

“Yes. I think there should be regulations, but social media companies should face federal repercussions if they don’t follow the regulations.”