By Harrison Jones

The Harsh Reality Of Being A Social Media Moderator

Ever wondered how those gruesome or explicit videos you see on Facebook get removed? It isn't a robot's job. It is done by people who manually review content and decide whether it is safe or unsafe for general viewing.




It is no easy task to sift through thousands of videos showing things like murders, explicit sex, racial abuse, nudity and heavy violence. Facebook moderators have borne the brunt of this work, and they are finally being compensated.



Facebook has agreed to pay out up to $52 million to moderators whose work could trigger PTSD (Post Traumatic Stress Disorder), in a settlement believed to be the first of its kind. As social media continues to grow by the millions of users, so does the likelihood of explicit content being posted online.


The settlement was something every moderator at Facebook wanted. Even though they benefited from the flexibility of working from home, they were earning "basically minimum wage", with the most commonly reported annual income being just $28,000. Facebook has since raised pay across America. For example, in areas like Washington D.C. and New York, moderators are now reportedly earning $20 an hour, so each year they'll be taking home something in the region of $33,000 to $35,000.




What matters most in this field of work is that social media companies like Facebook and YouTube provide every employee with a contract waiver stating that they accept the job, and what it entails, may cause post-traumatic stress. Accenture, which runs moderation teams out of Austin, Texas, makes this very clear in its contracts, with a prominent box, titled in bold black letters, warning that the job can cause mental health issues as a result of direct exposure to such content.

Even with a well-trained and well-staffed workforce, some disturbing videos still make it through to Facebook timelines, where they can stay up for several minutes or even hours until enough reports have been made by Facebook users for them to be removed. The problem is compounded by the fact that Facebook has no real content "filtering" tool and that you only need to be 13 to sign up. Unlike an ordinary Google search, Facebook offers no obvious feature to strictly filter out content that is adults-only or gruesome in any way, so people just entering their teen years can be left visibly traumatised by what they see.
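To illustrate the report-driven removal described above, here is a minimal sketch of how a report-threshold review queue might work. This is a hypothetical simplification, not Facebook's actual system; the threshold value, class names and methods are assumptions made purely for illustration.

```python
# Hypothetical sketch of report-driven content removal.
# NOT Facebook's actual system; the threshold and names are illustrative.

from dataclasses import dataclass

REPORT_THRESHOLD = 5  # assumed number of user reports before human review


@dataclass
class Post:
    post_id: str
    reports: int = 0
    visible: bool = True  # a post stays visible until a moderator hides it


class ModerationQueue:
    def __init__(self) -> None:
        self.posts: dict[str, Post] = {}
        self.review_queue: list[str] = []

    def add_post(self, post_id: str) -> None:
        self.posts[post_id] = Post(post_id)

    def report(self, post_id: str) -> None:
        """Record a user report; queue the post for human review at the threshold."""
        post = self.posts[post_id]
        post.reports += 1
        if post.reports >= REPORT_THRESHOLD and post_id not in self.review_queue:
            self.review_queue.append(post_id)

    def review(self, post_id: str, is_safe: bool) -> None:
        """A human moderator makes the final call; unsafe posts are hidden."""
        self.posts[post_id].visible = is_safe
        self.review_queue.remove(post_id)


# Usage: the post remains visible until enough reports trigger a human review.
queue = ModerationQueue()
queue.add_post("video-123")
for _ in range(REPORT_THRESHOLD):
    queue.report("video-123")
queue.review("video-123", is_safe=False)
print(queue.posts["video-123"].visible)  # False: removed only after review
```

Note the gap this sketch makes visible: between the first report and the moderator's decision, the content stays up, which is exactly the window of exposure described above.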


If you want to get into this kind of work, be warned: it isn't for the faint-hearted, and you will probably have a tough time getting used to the scale of explicit content that comes through the review queue.


Thank you for reading 'The Harsh Reality Of Being A Social Media Moderator' by IT Block. IT Block is an IT support services provider based in Singapore, and we love to share our IT expertise with the world.


#itblock #socialmedia #onlinesafety #facebook #moderator #cyber




