Facebook's content moderators are the people who must decide, in about ten seconds, whether a publication on the social network should be blocked, a judgment they apply to thousands of posts they consider unsuitable for its users.
Last week Facebook was in the news, not for updates to its platform, but for the Facebook Files published by the British newspaper The Guardian. The investigation exposed the controversial policies applied by the so-called moderators, workers who must decide in roughly ten seconds whether a piece of content can remain published.
The leaked Facebook manuals handed to the newspaper revealed the rules moderators must follow; many of them caused commotion in several countries. Among the most controversial are:
- Videos of violent deaths do not always have to be deleted, as they can help raise awareness of specific problems.
- Images of physical non-sexual abuse and bullying of children should not be removed unless they contain sadistic elements.
- Images of abused animals can be published. If the content is disturbing it is classified as such, but it is not deleted.
- Abortion videos are allowed.
- Artistic depictions of sexual activity or nudity can be published if they are made by hand, such as drawings or graffiti.
- Comments like "Someone should shoot Trump" should be removed because the president has protected status.
- Phrases like "fuck you" and "die" can be published because they pose no credible threat.
- Expressions about religion and migration are allowed: "Islam is a religion of hatred," "Migrants are dirty and thieves," and "Fellow immigrant."
- All kinds of sexual language are allowed in comments and publications.
- Self-harm videos are allowed on Facebook Live.
In the same documents, Facebook explains that such violent expressions are a way for users to vent frustration, and that publishing them provides relief. Even so, the social network has drawn heavy criticism these days for effectively allowing the bullying of children and violent acts in live broadcasts, including suicides and murders.
According to the Guardian itself, the whole controversy revolves around the moderators themselves: people who work under extremely stressful conditions, are overburdened, and are paid low salaries; reasons enough to leak the Facebook Files to the press.
"We are underpaid and undervalued," says a content moderator who decided to keep his identity a secret. In addition, the man said that Facebook paid him $ 15 per hour if he eliminated terrorist content, after training two weeks for that work.
Although the problem goes beyond the economic. According to the testimonies revealed, the moderators must attend psychologists on a daily basis, since most of them do not manage to sleep peacefully and show they have nightmares after seeing violent and depraved content for hours.
A company spokesman acknowledged that the work is difficult and said that Facebook is providing moderators with the necessary psychological support, since it is not easy to watch beheadings, child sexual abuse, or abortions. However, an analyst hired by Facebook said that "the training and support provided so far is not enough."
The anonymous moderator also said that many of his colleagues are immigrants with basic English skills, hired to remove content in their mother tongue. The vast majority end up deeply affected after a day's work, yet decide not to ask the company for psychological help for fear of being dismissed or losing their pay, according to the newspaper's source.
After Facebook received letters of outrage for allowing a video of a father in Thailand murdering his 11-month-old baby to circulate for several hours, Mark Zuckerberg decided to hire three thousand new moderators to police publications, especially live broadcasts.
If you were the moderator, would you ban this content? Would it depend on whether you are religious? What variables would sway your judgment?
In January of this year, some Microsoft workers on the "Online Safety Team," who performed duties similar to those of Facebook moderators, sued the company for severe post-traumatic stress caused by exposure to images of sexual abuse and heinous crimes.
Facebook has expressed concern about the issue and, in a brief statement, said that "sometimes we make mistakes, but safety is our highest priority." It also promised measures such as rotating employees through other areas of the company, timely psychological counseling, and incentives to offset the trauma of the difficult but very important work they perform.
Did this post widen your know-how? Great, then please UPVOTE, and if you found it above average, go ahead and RESTEEM. Your comments are appreciated too.