Dealing with Trolls on the Internet


Let’s face it: there will always be trolls, reprobates, and scoundrels on the internet. Harassment is one of the biggest threats facing the multibillion-dollar social media industry, and even the best programmers on earth have not solved it. For their part, corporations such as YouTube, Twitter, and Facebook have declared war on harassment and abuse, spending billions of dollars creating and training sophisticated algorithms and employing armies of moderators to remove hateful content. Despite these efforts, trolls persist, which leaves us asking how online toxicity can be controlled at all. What if part of the answer is simpler? New research from three universities, the University of Michigan, the Georgia Institute of Technology, and Emory University, suggests a more drastic remedy: rather than targeting bad actors individually or in groups, as current tactics do, identify the spaces where hate speech thrives and shut those spaces down entirely.

The study was conducted on the online message board Reddit and involved the analysis of close to 100 million posts from the platform. Among the forums examined were two that Reddit had already banned: r/CoonTown, which was used to spread racist messages, and r/fatpeoplehate, which was used to harass overweight people. The researchers compiled lists of the hateful terms used in those forums, then tracked how often the terms appeared across the platform and compared users’ behavior before and after the bans to see whether they carried their abuse into other forums. They set out to answer several questions: What happens when a toxic community is shut down? Do its users change their abusive behavior once the group is gone? Does hate speech decline overall, or do the users simply migrate to other forums? Perhaps surprisingly, the study concluded that the bans worked: hate speech from the affected users dropped, and those who wanted to continue largely migrated to other platforms instead.
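The core measurement described above, comparing how often flagged terms appear in users’ posts before and after a ban, can be sketched in a few lines of Python. To be clear, this is an illustrative toy, not the researchers’ actual pipeline: the term list, the posts, and the ban date below are invented stand-ins.

```python
from datetime import date

# Placeholder keywords standing in for the study's real term lists.
FLAGGED_TERMS = {"slur1", "slur2"}

def hate_term_rate(texts, flagged=FLAGGED_TERMS):
    """Fraction of words across `texts` that match a flagged term."""
    words = [w.lower() for t in texts for w in t.split()]
    if not words:
        return 0.0
    return sum(w in flagged for w in words) / len(words)

def before_after(posts, ban_date):
    """Split dated posts around the ban and measure each side.

    `posts` is a list of (date, text) pairs; returns
    (rate_before, rate_after).
    """
    before = [text for d, text in posts if d < ban_date]
    after = [text for d, text in posts if d >= ban_date]
    return hate_term_rate(before), hate_term_rate(after)

# Made-up example posts from one hypothetical user.
posts = [
    (date(2015, 5, 1), "slur1 something slur2"),
    (date(2015, 5, 20), "more slur1 here"),
    (date(2015, 7, 1), "ordinary post after the ban"),
]
rate_before, rate_after = before_after(posts, date(2015, 6, 10))
```

A drop from `rate_before` to `rate_after`, aggregated over many users and checked against control groups, is roughly the kind of signal the researchers looked for when judging whether a ban reduced hate speech or merely displaced it.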
