What is Human & AI Moderation?

Nowadays almost all of us have social media accounts. That is, unless you've been living under a rock or are one of those green-flag Hinge men. Yet even though we use these platforms daily, we rarely stop to think about how content is moderated, or by whom.


So, what exactly is content moderation? Why do platforms need it? How does it work? And what is the role of AI and humans?


Content moderation is the practice of reviewing and monitoring user-generated content on online platforms to make sure it is in line with the platform's guidelines and safety standards. It is how harmful, inappropriate, and illegal content gets filtered out.


Which platforms need content moderation?


The first platforms that typically pop into one's head are social media platforms. Yes, these certainly require moderation; however, there is also a whole host of other online platforms that you may not realise do the same. For example (but not limited to):

  • E-commerce and marketing

  • Gaming

  • Fintech

  • Dating apps

  • Forums & online communities

  • Sharing economy


And many more…


Quite often, content moderation may be viewed as censorship; however, it is really about keeping platforms and users safe.


Why do platforms need moderation?


This one has a straightforward answer. Even though the internet is a wonderful and helpful thing, it is also weird and scary, and there are plenty of bad actors out there. This is coupled with the fact that many people feel the everyday laws of society don't apply online. So we need content moderation to help stop the spread of illegal and harmful content and keep users and platforms safe. That sounds simple in theory, but moderation is a large and complicated industry, one that requires the expertise of online safety experts and forces you to ask difficult, philosophical questions.


How does it work?


How content moderation works depends on multiple factors. Firstly, it depends on the type of platform and its specific moderation needs; for instance, what a marketplace needs is different to what a social media platform may need. Secondly, there are various methods that a platform may choose to moderate with, such as the following (a small sketch of two of these appears after the list):


  • Pre-moderation (reviewing everything before it's uploaded)

  • Post-moderation (reviewing content after it is uploaded)

  • Reactive moderation (users reporting inappropriate content that is then flagged and reviewed)

  • Distributed moderation (when online community members review and vote on content to see if it’s within the regulations)

  • User-only moderation (relies on users to filter out content that may be inappropriate; posts are hidden automatically after being reported several times)
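
To make a couple of these concrete, here is a minimal sketch in Python of user-only and reactive moderation. Everything here (the Post structure, the report threshold of three, the function names) is a hypothetical illustration, not any real platform's implementation.

from dataclasses import dataclass

REPORT_THRESHOLD = 3  # hypothetical: hide a post once it collects this many reports

@dataclass
class Post:
    text: str
    reports: int = 0
    hidden: bool = False

def user_only_moderation(post: Post) -> None:
    """User-only moderation: no staff review at all; a post is hidden
    automatically once enough users have reported it."""
    if post.reports >= REPORT_THRESHOLD:
        post.hidden = True

def reactive_moderation(post: Post, review_queue: list) -> None:
    """Reactive moderation: a reported post is flagged and queued for
    human review rather than hidden outright."""
    if post.reports > 0 and post not in review_queue:
        review_queue.append(post)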


Lastly, there are two primary types of moderation: human and AI. These are most often used in combination, as both have their advantages and disadvantages.


What is human moderation?


This is when a team of humans manually screens and monitors content that is uploaded and reported on an online platform. When human moderators make decisions, they follow the platform's TOS and guidelines. Exactly how human moderation works is platform dependent: some platforms have moderators review all content, while others have them moderate what their AI has flagged or what users have reported.
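
As a rough illustration of that platform dependence, here is a small Python sketch of the three ways a human review queue might be filled. The policy names and post fields are assumptions made up for this example.

from enum import Enum

class ReviewPolicy(Enum):
    ALL_CONTENT = "moderators review everything"
    AI_FLAGGED = "moderators review what the AI flagged"
    USER_REPORTED = "moderators review what users reported"

def build_review_queue(posts: list, policy: ReviewPolicy) -> list:
    """Select which posts human moderators see, based on the platform's policy.
    Posts are plain dicts with hypothetical 'ai_flagged' and 'report_count' fields."""
    if policy is ReviewPolicy.ALL_CONTENT:
        return list(posts)
    if policy is ReviewPolicy.AI_FLAGGED:
        return [p for p in posts if p.get("ai_flagged")]
    return [p for p in posts if p.get("report_count", 0) > 0]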


What is AI moderation?


After the hype of this year, a lot of people are probably sick of hearing about AI, but when it comes to moderation it's amazing what is happening in the AI landscape. Many platforms use AI and machine-learning tools to assist their human moderators and keep their platforms safe. Exactly how this works depends on the company. Most often the AI is used to scan content and immediately approve it, refuse it, or send it to a human if unsure. Other platforms may not have the AI make the takedown decision at all, but instead immediately escalate to a human who makes that call. It varies!
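
Here is a minimal sketch of that approve/refuse/escalate triage. The scoring function is a toy keyword check standing in for a real machine-learning model, and the thresholds are invented for illustration.

APPROVE_BELOW = 0.2  # hypothetical: risk scores under this are auto-approved
REMOVE_ABOVE = 0.9   # hypothetical: risk scores over this are auto-removed

def classify_risk(text: str) -> float:
    """Stand-in for a real ML classifier; returns a harm-risk score in [0, 1]."""
    banned_words = {"scam", "phishing"}  # toy example only
    hits = sum(word in text.lower() for word in banned_words)
    return min(1.0, hits / len(banned_words))

def triage(text: str) -> str:
    """Auto-approve clearly safe content, auto-remove clearly harmful content,
    and escalate everything in between to a human moderator."""
    score = classify_risk(text)
    if score < APPROVE_BELOW:
        return "approved"
    if score > REMOVE_ABOVE:
        return "removed"
    return "escalated to human"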


Overall, how moderation works, and even the right way to moderate, is a debate in itself. It depends on a specific platform's needs and the approach it selects. There is still a need for humans and AI to work together, and it will be interesting to see in the coming years how this industry changes and adapts to improvements in AI, and what that means for human moderators.