Stuck in the web of evil

Porn, violence, hate speech: even though they are a daily occurrence on Facebook, we are not very likely to encounter them as users. That is because such content is actively deleted by Facebook. But on what grounds? And by whom? Facebook does not want to say. SZ-magazine spoke to a number of employees working for Arvato, the company responsible for deleting Facebook posts, and uncovered a digital world shrouded in secrecy.

It is the summer of 2015 when the following job posting appears on the internet: “Service Center Employees wanted. Would you like to become part of an international team with good career opportunities?” Requirements: knowledge of a foreign language, flexibility and trustworthiness. Job location: Berlin.

As soon as I came across the advert, I smiled to myself: this is my chance. After months of scouring job ads, I had finally found one that didn't require any German language skills.

The person who said this prefers to stay anonymous. Back then, the job she applied for was so new that no fitting title had been invented for it yet. Its description hints at a call centre job, but the reality that awaits successful applicants is far removed from that: some call it 'content moderation', others liken it to 'online rubbish collection'. The mission of the people working here: keeping their client's website clean. Clicking through the hate and horror that online users spread, they are the ones with the final say on whether a post stays or goes. It is a profession barely known to the outside world; most people do not even know it exists.

It was long assumed that such work was handled by external service providers in developing countries such as India or the Philippines. One of Arvato's largest clients is Facebook. The social network, which has 28 million users in Germany and 1.8 billion worldwide, prefers to keep silent on how it removes the dangerous content that its users upload daily. Only in January 2016 did it become publicly known that service provider Arvato, a subsidiary of media company Bertelsmann, employs over one hundred 'content moderators' working for Facebook in Berlin. As a matter of principle, Arvato refuses to say how much it is paid by Facebook, nor will it disclose the criteria its moderators have to meet in order to be selected.

For several months, SZ-magazine spoke with a number of current and former Arvato employees eager to tell their stories, defying their employer's explicit orders not to speak with journalists. Many of them feel poorly treated by their employer and suffer from the images they see online every day. They complain of stress and exhaustion, and believe their working conditions should become known to the wider world. They occupy different positions within the company hierarchy and come from a range of ethnic backgrounds.

Many of the interviewees asked us to use their actual names, as many had either already quit their jobs or were about to do so. We nevertheless decided to publish their stories anonymously, as all of them had signed clauses requiring them to maintain strict secrecy, even after their employment ends. We reproduce their testimonies in italics. The conversations all took place in Berlin, either in person, via Skype, or through an encrypted online communication channel.

The vast majority of applicants are young; they came to Berlin in search of love, adventure or education. Many of them are refugees from Syria. All of them are attracted by the prospect of a contract with a large German company. They are not bothered by the fact that their contracts are usually fixed-term; it is something, at least. Job interviews are brief: applicants are asked what language and computer skills they have. After that, they are asked the following question: "How well do you deal with upsetting or disturbing images?"

On our first day we received introductory training. We were about thirty people, gathered in a lecture hall. There were people from all over the world: Turkey, Sweden, Italy, Puerto Rico, but also many from Syria.

Our trainer entered the room with a broad grin and said: you guys won the lottery: you will all be working for Facebook! All of us started cheering.

As the training course got under way, the new hires learnt what their job at Arvato would be like. Firstly: no one may know whom they are working for. The name Facebook must not appear on anyone's résumé or LinkedIn profile. The new hires are not even allowed to tell their families what they are doing.

"Your job is to keep Facebook clear of content that should not be seen by children", the Arvato trainer tells the new hires. "And by removing such content, you are keeping the platform free from terror and hatred."

One former employee of the firm, speaking to SZ-magazine, called the training session 'indoctrination'. Applicants were supposed to believe that their, in his words, "brainless and monotonous work" was keeping society safe, when in fact all they were serving were the interests of a multibillion-dollar company like Facebook, whose only concern was removing the most unsettling images to keep its users online for as long as possible.

The images shown during the training session were not at all disturbing: mostly penises in all shapes and sizes. Stuff that made us giggle. Pretty funny actually, to watch things like that during working hours. Of course, we would have to remove it afterwards. Same with bare nipples.

One evening, we went drinking with people who had been doing our jobs for longer. After a couple of beers one of them said: let me give you one piece of advice, get rid of this job as fast as you can, it will finish you off.

The office on Wohlrabedamm in the Berlin district of Siemensstadt is drab. Located in a former factory, its walls of exposed brickwork line row after row of small white desks. Black computers with white keyboards, ergonomic chairs and a grey carpet make up the rest of the interior, which offers room to several dozen employees. Mobile phones are contractually forbidden. On the ground floor are two vending machines, one for coffee and hot chocolate, the other selling snacks. Smokers have access to a large courtyard shared with the offices of several other companies located in the same building.

After you log in, you see a waiting list with thousands of reported posts awaiting moderation; you click on one and off you go.

There are software filters that can automatically detect potentially malicious content, one former employee explains, but these filters cannot differentiate between an image of a hospital operation and one of an execution. That is why the majority of the reported content the Berlin team has to browse through has been flagged by Facebook users clicking "report as inappropriate" or "this post should not be on Facebook."

I have seen things that made me seriously question the goodness of human beings. Torture, sex with animals.

Content marked as inappropriate is first seen by the employees at the bottom of the company hierarchy. Their team goes by the name 'FNRP', which stands for "Fake Not Real Person". They are the ones who decide which of the posts, images or videos that users flagged as inappropriate break the rules of Facebook's community standards. Their first priority is to check whether an offending profile name belongs to a real person. If it does not, hence the name 'Fake Not Real Person', the suspicious profile is threatened with removal. If the user cannot plausibly identify themselves, the account gets deleted altogether. An effective way to punish profiles created with the aim of spreading forbidden content.

A typical working week for the FNRP team is around forty hours; staff work in two shifts between 8.30 am and 10 pm. The average gross wage is about 1,500 euros per month, little more than the minimum wage of 8.50 euros per hour.

One level up the hierarchy are the so-called 'content moderators', who also review videos. Particularly complicated cases are forwarded to the 'Subject Matter Experts' for a final say on whether to remove content or not. Above them are the team leads, whose job is less harrowing: they barely get to see any of the disturbing content.

Arvato is a giant. It prides itself on taking on any job other companies would rather outsource: from managing frequent-flyer programmes to running entire call and distribution centres. The outsourcing colossus employs around 70,000 people in over forty countries and is one of the cornerstones of German media empire Bertelsmann: more than half of all Bertelsmann employees work for Arvato. The company's motto, as found on its corporate site, is "How can we help you?"

This service mentality is one of the reasons Arvato picked Berlin as the base from which to serve Facebook. The social media firm faces increasing pressure from the German authorities. Federal minister of justice Heiko Maas demanded that Facebook provide German-speaking contacts able to check and, where necessary, swiftly remove morally questionable posts. Meanwhile, the Munich public prosecutor launched an investigation into Facebook's alleged role in incitement to hatred. The charge: the firm did not remove hate-inciting content, or did so too late.

In the early summer of 2015, a small team of Arvato delegates was invited by Facebook to its European headquarters. The two enterprises were determined to work together: the world's largest social network needed help keeping its online space clean, and it was up to Arvato and its managers to set up a dedicated team for the task. Work on the project began in the autumn of 2015 and was shrouded in secrecy from the very start.

For how long is the contract between Facebook and Arvato valid? How are employees prepared for their work, and does Arvato carry out a risk assessment of the psychological hazards attached to content moderation? SZ-magazine presented Arvato with these and some 19 other questions. In response, Arvato wrote that "our client Facebook reserves the right to deal with all press requests regarding the cooperation with us internally."

Facebook's German branch does not provide any more clarity. Its responses to questions posed by SZ-magazine are limited to "we do not have any information about this". The discrepancies between what Facebook states and what current and former employees told us are huge. Facebook, for example, states that each employee working for it through Arvato has to complete six weeks of training as well as a four-week mentoring programme before starting work. The employees SZ-magazine spoke to mention a two-week training programme, significantly shorter than what Facebook claims.

Arvato separates its moderator teams by language. The floor language is English, but within the teams employees stick to their own tongues: Arabic, Spanish, French, Turkish, Italian, Swedish and of course German. The teams are supposed to only review content that’s in their mother tongue, but generally speaking, the content does not differ very much between countries.

A random selection of images from the pending reported content shows a mixed bag of animal torture, swastikas and penises.

Every team has found its own way of dealing with extremely shocking or disturbing content: the Spanish speakers, for example, discuss it loudly amongst themselves, whereas the Arabic speakers tend to keep what they see to themselves or retreat somewhere quiet. The French remain seated silently in front of their computers.

When we just started working here, we spent our lunch breaks making jokes about all the porn we had seen. But at some point it started to depress us.

Remove, or not remove? As soon as a moderator has made a decision, a new request pops up on their screen. The number of open cases, called tickets, can be tracked via a notification bar on the moderator's monitor.

On the job, the images get much worse, much more shocking than the ones we were shown during the training. But altogether not so different from what we are shown at home in the newspapers. Violence, partly decomposed bodies.

People jumping up, running away or leaving the room in tears are not an unfamiliar sight at Arvato. Some employees described content to SZ-magazine so gruesome that it cannot be reprinted on these pages. The following examples are only a little more bearable.

The dog was tied to a rope. A naked Asian girl tortured the animal with blazing hot irons before pouring boiling water over it. People with a fetish for these kinds of things were supposed to be turned on by it.

Child porn was the worst. There was this little girl, hardly six years old, lying shirtless in a bed. On top of her sits a fat man who is abusing her. It was a close-up.

The people receiving images like these carry out a job that walks a fine line between that of a security guard and an assembly-line worker: this should stay on Facebook, "click", that shouldn't, "click". From the moment they start, a regular FNRP employee has to work through around one thousand tickets per day: one thousand decisions on whether a piece of content violates the complicated web of regulations that Facebook calls its 'community standards', and thus whether it may remain online or must be deleted.

Sometimes we saw beheadings, terror or a lot of nudity. One cock after the other. An endless procession of cocks and always something particularly gruesome. Tough to say how much we saw, it depended on the day. One or two cases per hour for sure. But you would at least see one horrible thing per day.

After a few days I saw my first corpse, a lot of blood, and it terrified me. I removed the image right away. My supervisor approached me afterwards and told me: you did the wrong thing, this image does not violate Facebook's community standards. He told me I should be more diligent next time.

Even though 'community standards' sounds as innocent as a cleaning rota, first impressions deceive. The web of rules is one of the social media firm's best-kept secrets, stipulating in full detail which content may be published or shared and which must be removed. In a way, the manner in which the firm regulates what billions of people are allowed to see constitutes a kind of parallel law on freedom of speech. And it is about more than the question of whether a bare nipple is offensive or not. Facebook is a powerful tool for political persuasion and influence. The content shared here decisively shapes our image of society. The way we view catastrophes, revolutions or demonstrations is shaped by the images we see on our Facebook timelines. Despite this importance, the details of these rules are not made public, nor are lawmakers allowed to know the criteria that dictate why certain content is censored while other content circulates freely.

Social media firms tend to publish only snippets of these rules, and then only the most vaguely formulated ones. Sentences like "In no way do we tolerate behaviour that puts people in danger" are nothing unusual on Facebook. What behaviour the company is referring to is not specified. One former employee explains that the rules are kept secret deliberately, to prevent people from finding ways to bypass them. A strange kind of logic: it is as if a state, out of fear that people might learn to break its laws more skilfully, kept its law books strictly sealed.

Even though Facebook likes to present itself to the world as a transparent and open enterprise whose platform people can use to share whatever information they want, the firm keeps its business practices strictly secret. Gerd Billen, state secretary at the federal justice ministry and chairman of a task force on dealing with unlawful hate speech on the internet, says: "Unfortunately Facebook is showing itself not to be a very cooperative partner when it comes to telling us how it thinks it should deal with criminal content." Even he, as a representative of the federal justice ministry, has so far not been allowed to visit Arvato. "Several times, I have asked for more transparency on the way in which so-called disturbing or criminal content is dealt with: what the specific reasons are for removing content, how many staff are involved in the operation and which qualifications these people have to meet. So far, we have had to content ourselves with off-the-record testimonies only", says Billen. Meanwhile, his ministry is looking into legislation that could force Facebook to become more transparent.

SZ-magazine has access to a large part of Facebook's secret code of conduct. It is the first time these rules have been accessible in such volume. The last time anything comparable happened was at the start of 2012, when the American website Gawker published a 17-page guide with removal criteria from a firm that had been working on behalf of Facebook.

The internal documents that SZ-magazine has access to consist of hundreds of micro-rules, each of them minutely prescribed by Facebook. Particularly interesting are the many examples it provides to illustrate which content should be removed and which should not.

Posts like these, for example, have to be removed:

– An image of a fully naked woman in public, but only in combination with the comment: “Oh my God. You are an adult now. That’s disgusting” (Reason: the comment is seen as bullying as a body part is referred to as disgusting).

– An image of a girl without any text, next to it is another image of a chimpanzee with the same facial expression as the girl (Reason: demeaning image editing through the direct comparison of a human being with an animal).

– A video of a human being getting tortured, but only in combination with a comment like: “I enjoy watching how much pain he’s suffering from.”

These, on the other hand, should not be:

– A video of an abortion (unless it contains nudity).

– An image of a hanged person, underneath it the comment: “hang the son of a bitch” (even though this is considered an approval of the death penalty; it would only have to be deleted if it explicitly offends certain “protected groups” such as gays. For example in combination with a comment like: “hang this faggot by the neck”).

– An image of an extremely underweight woman without any comment (photos of self-harming behaviour without a context are allowed).

A concrete example of how to deal with extreme violence is given in chapter 15.2, on the encouragement of violence. It reads: "We do not tolerate it when people share photos or videos in which people or animals die or get gravely hurt when this type of violence is framed as being encouraged". What is shown in the image is of no relevance; it is the combination of text and image that constitutes a violation. To illustrate this, the code of conduct provides examples of what counts as content that encourages violence. One of them is this: when someone posts a comment like "look at that, how cool" or "fuck yeah" under an image of a dying person, the post may be deleted.

The rules were tough to understand. I asked my team lead: that's impossible, that photo is full of blood and it's so brutal, no one should ever get to see that. He calmly retorted: that is just an opinion. You have to try to think like Facebook, think what they want. You should reason like a machine.

Facebook regularly updates or changes its community standards, something Arvato constantly has to keep track of. The company therefore employs somebody solely for this purpose, which Facebook considers very important. Because in the end, the platform's real interest lies in identifying what may drive people away from it – and, more importantly, keeping people logged in for as long as possible, so they see as many ads as possible and generate as much revenue as possible.

The problem Facebook has to tackle is not an easy one: keep the hatred and madness that people spread online in check, while making sure that important and sometimes equally upsetting events are not kept hidden. Whether to delete content or not is a decision that can have far-reaching consequences, for example when the content concerns journalism or factual reporting.

Hundreds of millions of people use Facebook as their primary news source. Despite that, the company is not considered a media outlet, chiefly because it does not produce its own content. Nevertheless, it does have to concern itself with media ethics: when is the display of violence appropriate, for example? Should gruesome content be allowed in war reporting because it serves a broader goal? For decades, scholars have been trying to answer this question; on social media, it has to be answered in an instant. When, in 2009, a video clip showing the death of Neda Agha-Soltan, a young woman shot during anti-government protests in Tehran, was posted on YouTube, a vehement debate ensued over whether to remove it. In the end, YouTube decided to keep the video online despite its brutality, as a political statement. The posting and the discussion that followed proved an early acid test for Facebook. For many years, companies have tried to set up clear rules for dealing with complex questions such as these. As the secret Facebook documents put it: "videos that display the deaths of people can be disturbing, but can also serve to create awareness of self-harming behaviour, psychological disorders, war crimes or other important topics." When in doubt, Arvato employees are supposed to forward morally questionable videos to their superiors. Even more complex cases are dealt with internally at Facebook's European headquarters in Dublin.

What struck me in particular were the Paris terror attacks last year. Special meetings were called to decide on what should be done with the images recorded live on the scene. The most horrible things were forwarded to us, almost in real-time. Eventually we were told to forward the majority of the content to the French or Arab-speaking teams. I don’t know how they handled it afterwards.

When the Paris attacks began, our supervisors and team leads asked us content moderators to come back to the office from our weekend breaks. They called me and texted me. I worked throughout the entire weekend.

Detailed insight into the number of people moderating Facebook content worldwide is sparse. Speaking at a conference in March 2016, Monika Bickert, head of Facebook's global policy department, let slip that every day over one million Facebook posts worldwide are reported by users as inappropriate. She declined, however, to reveal anything about the number of people involved in deleting reported posts. Sarah Roberts, a media scholar at the University of California, Los Angeles, has been studying this relatively new profession for several years and estimates that around 100,000 people worldwide work in this field. Roberts reckons that the majority of them work for external service providers like Arvato. After interviewing countless moderators from all over the world, Roberts made a worrying discovery: many of them suffer from trauma. That matters, because the mental health of these people plays a crucial role in which content eventually ends up on our timelines. The more worn out a moderator gets after months of exposure to violent, pornographic or hateful content, the more likely they are to let disturbing content through. On top of mental exhaustion, most moderators simply lack the time to do their work properly, making mistakes and slips more likely.

Some videos you have to watch in their entirety. We are not allowed to skip through parts, even when screenshots are available. That is because the offending element can be in the audio. You have to listen to every bit, because the soundtrack can reveal things that are not allowed. Hate speech, for example, or sadistic remarks. Some of these videos are almost movies, sometimes lasting over an hour.

Content moderators often have a hard time processing the images they saw during working hours; they take the images imprinted on their minds home with them. There, many still receive text messages from team leads telling them they are lagging behind, or asking whether they would like to work an extra shift. The workload can be overwhelming, according to one employee who has since quit his job.

Besides pressure from team leads, there is also pressure from other internationals competing for a limited pool of English-speaking jobs, which can make switching jobs impossible. The times when Berlin only attracted Bavarians and Swabians are over; the city is experiencing an influx of thousands of well-educated middle-class people from as far away as India, Mexico and South Africa. They are eager to find jobs, but the jobs are not eager to take them on, as most of them speak little or no German, regardless of their education and affluence.

Being able to adapt and stay flexible in order to survive is a crucial trait for Berliners, especially those who come from abroad and barely speak German. For the latter, life can be bleak: around thirty percent of foreigners living in Berlin are on or under the poverty line. As one former employee put it:

The only thing you can reasonably congratulate Arvato on is picking Berlin as its base. This city is such a melting pot of languages and cultures – where else could you find so many Swedes, Norwegians, Syrians, Turks, French and Spaniards so desperately looking for work?

The majority of those new to the city are torn: they want to live here, but that often means accepting any job to survive, regardless of how overqualified they are for it, or how mind-numbing and deadening it may be.

That explains why so many of Arvato's moderators carry titles as prestigious as doctor or professor, or previously worked as quantum physicists. Many of them are immigrants whose qualifications are not recognised in Germany. One former employee testified that these people were often the hardest to motivate, as they found the work particularly mentally exhausting. It is made worse by the knowledge that good work is rewarded with promotion to content moderator, a daunting prospect, as that means having to check reported videos as well.

One video was enough to completely ruin my life. I knew that. Under no circumstances did I want to be promoted to content moderator. I was genuinely afraid that it would harm me mentally. Content moderators get to see horrendous things, stuff you cannot even imagine. As images and as videos.

On top of that, content moderators are supposed to work faster than the FNRP workers one level below them at the bottom of the hierarchy. Moderators get on average around 8 seconds to judge a case – even though some of the videos they are supposed to watch last much longer. One content moderator explains that he had a daily target of more than 3,000 cases. That number corresponds to what US radio station NPR cited as the daily workload of content moderators in countries outside Germany, a figure Facebook subsequently denied. According to one former employee, all moderation takes place on a dedicated internal Facebook platform, so he believes the social media firm must always be up to date with the latest numbers.

At the same time, it was impossible to properly watch and moderate each video individually. Some of them were so gruesome that you instinctively wanted to look away, even though you were not allowed to. Moreover, you had to evaluate every single detail, while the rules often left it unclear whether the content violated them or not.

You had to reach the daily target; if you didn't, you were in trouble with your supervisor. The pressure was huge.

In the spring of 2016, members of the Spanish-speaking team wrote a letter to Arvato's board of directors about the enormous workload, high pressure and poor working conditions. The letter quickly found its way to all other Arvato employees: "Because of consistent overwork, we asked for the introduction of five-minute breaks (…). Until now, our request has been left unanswered. On top of the problems already mentioned above, we feel the need to tell you of the growing psychological damage caused by the reported cases we are forwarded, often containing extremely shocking or unnerving images." According to employees, nothing has changed since then. Meanwhile, the workload has even increased, with targets raised from one thousand to almost two thousand cases per day for regular FNRP workers. Facebook declined to comment when SZ-magazine tried to verify these numbers.

My team lead told me: if you don’t like this job, you can always quit.

Today, Arvato employs more than 600 people moderating content for Facebook in Berlin, according to one employee, and the number is constantly rising. In March 2016, a second building was acquired a few minutes' walk from the current office. Employees hung a huge Facebook banner in their new office.

It’s such a contradiction: of course we find it cool to be working for Facebook, it’s a company that everyone knows and likes. You just try to hide the bad things.

Despite the grim character of the work, an astonishingly low number of employees quit, one of our sources confided. Perhaps because many need the job, or perhaps because they have already become too desensitised to the work to suffer from it. One member of the Arabic-speaking team told us: "It is bad, but this way I can at least contribute to stopping the spread of violent and horrifying videos from Syria."

But there are also countless examples of videos that force employees to give up.

There was a man with a child, I think the kid was around three years old. The guy turns on his camera and grabs the child. And then he grabs a butcher's knife. I have a child myself, one exactly like that. It could even have been him, in fact. Because I am trying not to let this shit job disturb me mentally, I turned everything off and walked out of the room. I grabbed my bag and walked to the tram, crying.

The scientific definition of a trauma is an event or experience that someone cannot process mentally without outside help. Traumas often result from physical or mental violence and, left untreated, are very likely to lead to some form of post-traumatic stress disorder. Harald Gündel, professor of psychosomatic medicine at the university hospital of Ulm and chairman of the German Trauma Foundation, was given access to some of the transcripts SZ-magazine obtained from its interviews with Arvato employees. In them, Gündel recognises potentially characteristic traits of post-traumatic stress disorder: images from gruesome video clips or photos that keep returning to mind, recurring nightmares, overly nervous reactions to minor triggers that have little or nothing to do with the trauma, pains without any physical explanation, signs of social withdrawal, exhaustion, emotional detachment and loss of libido.

I could have become a nun after I saw the child porn. Since then, I can no longer think of sex. For about a year now, I have not been able to be intimate with my partner. As soon as he touches me, I start to tremble.

Sometimes I realise I am losing little clumps of hair. I notice it after I finish showering, and sometimes it even happens in the office. My doctor told me I need to quit my job.

It happens all the time: people jumping up from their desks and running to the kitchen to pull open a window and catch their breath after having watched a video of a beheading. Many started drinking heavily or smoking weed to recover from the horror.

After being approached by SZ-magazine, Facebook declared: "All employees are entitled to psychological assistance, which they will receive promptly on request, whenever it suits them." Nevertheless, most employees complain of feeling left in the lurch and lament the blatant lack of psychological care when they need it. Adequate care is seemingly absent, as is any targeted prevention of the mental disorders that can result from working with online content too gruesome to look at.

We acknowledge that Arvato indeed offers psychological assistance, but in daily practice it is often impossible to get the right support. They are doing nothing for us.

In 2013, two articles explicitly requiring employers to protect their workers from lasting psychological damage were added to the German labour protection code. "Essentially, an employee's mental health should never be allowed to be damaged in the first place; the aim of the employer, therefore, should be to limit any health risk", says Raphaël Callsen, an employment lawyer in Berlin. He suspects that Arvato violates labour law by not offering content moderators professional psychological support. The employer must proactively take measures to protect the mental health of its staff. Employees should be allowed to take breaks after witnessing something that disturbs them, and they should have direct access to someone with whom they can vent and reflect on the situation – ideally a doctor with whom the employee can speak in complete confidentiality. None of the interviewed Arvato employees know of the existence of any such doctor. Our sources mention open group sessions that they can join at any time to discuss their problems. The sessions are hosted by a social pedagogue, not a certified psychologist, all of them concur. None of the employees we spoke to ever attended these sessions: they feel embarrassed to discuss the intimate problems some of them are struggling with in front of others. One female employee tried several times to book a private appointment with the social pedagogue. She eventually gave up when she was repeatedly told to wait and her appointments kept getting postponed. When asked by SZ-magazine, Facebook refused to say which qualifications, if any, its psychological assistants have to meet, nor whether these people are bound by confidentiality.

Where I am from, a social worker like her would immediately tell my boss everything, after which he would simply let me go. No one in my team has any trust in this company – why should we trust them with our worries?

There are plenty of examples of companies and agencies that do know how to deal professionally with people exposed to cruel and violent media. The so-called Federal Department for Media Harmful to Young Persons is one of them. This government agency is tasked with evaluating which – often equally gruesome – videos, movies and TV shows are fit to be seen by young audiences. The department offers its employees regular training to help them manage exposure to disturbing content. "People working here never have to watch everything in one go, certainly not the more violent movies", says Martina Hannak-Meinke, chairwoman of the department. "They can watch a part, do something else, and then watch the rest later." Staff can also book private appointments with a social worker or a psychologist, or, if preferred, have round-the-clock access to a trauma expert. Other authorities whose people deal with violent content professionally have set up equally strict protective measures: in some, employees may watch no more than eight hours of violent footage per week; in others, they may only watch it accompanied by a colleague, so that the most disturbing parts can be talked through as they happen, to relieve stress; still others employ only specially trained lawyers.

I served in the military in the country I come from. Images of war and death do not really shock me anymore. But what still grinds me down is the unpredictability. There is one video in particular that I cannot get out of my head: it features a woman crushing a small kitten with her heels as part of a sex fetish video. I could never have imagined that people were capable of doing such things.

The kitten video described here is a good example of content that should be removed by a moderator, as it is a clear violation of paragraph 15.1 of the secret code of conduct that SZ-magazine has access to. Deriving erotic pleasure from the suffering of other living creatures – in other words, sexual sadism – is formally prohibited by Facebook.

Navigating this expansive maze of rules demands a lot from the people who have to apply them. For fear of being accused of leaking information received during training, many employees do not dare take notes, several of them told us.

The community standards are changing all the time. Before, a picture of a severed head was allowed as long as the cut was straight. What kind of rule is that, and who came up with it?

Facebook's community standards include a chapter on hate speech that stipulates exactly which insults are permitted and which are not. It reads: "before, Facebook did not remove any instances in which immigrants were attacked, as this demographic was not considered a protected category. This led to a negative portrayal in the media of Facebook and its posting guidelines, which in turn spiralled into a threat by the German government to make Facebook inaccessible in the whole country". This spurred Facebook to update its community standards: almost overnight, immigrants too were considered a protected category. On the one hand, this illustrates that politicians and societal pressure can indeed influence the rules Facebook has set up to determine what content is prohibited or removed. On the other hand, it lays bare one of the existential problems a company like Facebook faces: who or what enjoys special protected status should first and foremost be decided by the constitution, not by the code of conduct of a company worth several billion that changes its rules depending on which way the political and societal winds blow. What, for example, would happen if – in the purely hypothetical scenario of a turn in American public opinion – Facebook suddenly decided to roll back some of the protections it currently offers to Muslims? Would the bullying of Muslims suddenly be punished less severely than the bullying of Jews, Christians or Mormons, who are also listed as protected categories in the code of conduct? The general public would most likely never know the answer, despite the fact that even the slightest change in Facebook's community standards dramatically alters what billions of people around the world see on their timelines.

We are seeing so much suffering – but actually we know nothing about what happened to the people in the videos. How are the abused children doing now, and were the perpetrators prosecuted?

Much of the content Arvato employees have to check does not only cross moral boundaries, it often also breaks German law. Striking a balance in how Facebook should deal with criminal postings is not easy. Under German law, online platforms like Facebook are obliged, as soon as they learn of or come across unlawful content, to remove it immediately or block all access to it, explains Bernhard Buchner, who specialises in media and IT law. If they fail to act, companies like Facebook risk becoming liable themselves. And that is not all: paragraph 138 of the criminal code lists crimes that anyone who learns of a plan to commit them is legally obliged to report. A Facebook post by someone plausibly threatening to shoot their classmates should therefore not only be removed, but also reported to the authorities and to those under threat.

It is well known that Facebook forwards child porn to the American National Center for Missing and Exploited Children (NCMEC). All incoming reports registered by the NCMEC are checked and, where possible, forwarded to criminal authorities in the United States or other countries, the German Federal Criminal Police Office confirmed when asked by SZ-magazine: "If it can be determined that the criminal act took place on German territory, all available information about the case is forwarded to the federal criminal police." When asked which other crimes besides child porn Facebook forwards to the authorities, the social media firm refused to say.

There certainly are people at Arvato concerned with the fate of the content moderators. Facebook's consolation, however, is a stark one: computers will soon be able to detect content that breaks the community standards with the help of artificial intelligence. Facebook, Twitter, Google and Microsoft recently announced plans to store suspected terror propaganda in a shared database, fitted with a digital fingerprint, so that an image removed by Twitter, for example, can also be taken off Facebook without any human intervention. On the one hand, that is a hopeful thought: people will no longer have to be exposed to horrifying content. But it is also frightening to think that in future the final say on which posts billions of people get to see on Facebook will rest with an algorithm. That it will be up to computers to judge what is too gruesome to be shown and what is not, or where satire ends and terror propaganda begins.

I know that someone has to do this job. But it should be done by people that are trained and prepared for the job, that receive help when they need it and that are not left to fend for themselves, like us.

I have this dream that keeps coming back: people falling out of the windows of a burning house. Their bodies smash onto the ground, one after the other landing in a puddle of blood. I am standing below trying to catch some of them, but there are too many and they are too heavy. I have no choice but to walk away before I am crushed myself. I am surrounded by other people who do nothing to help. They just record everything on their phones.

During the course of our research we kept a constant check on the health and well-being of our sources.

One of them finally managed to overcome his frequent nightmares, although some images still recur in his head. When one day he stood on a ladder to replace a light bulb and looked down, he suddenly saw beneath him the ground onto which, in the videos, men accused of homosexuality fell to their deaths after being pushed off a roof by IS executioners. Another source left the country and now lives far away from Germany. One still struggles with seeing animal torturers in the park and child molesters on the beach; she left Arvato and has sought the assistance of a psychologist at a trauma clinic. Another source has started taking German language classes so that he can return to his actual profession in Germany.

None of those still working for Arvato have any serious intentions to stay at the firm.

Behind the walls of silence

Violence, hate speech, child pornography. Several months ago, employees of Bertelsmann-owned company Arvato who inspect and remove abusive content on Facebook, shared their traumatizing stories with SZ magazine. We stayed in contact with them. What has changed?

Mid-December 2016: when the report depicting the working conditions of Facebook's so-called content moderators – who are regularly exposed to the most horrific pictures and videos – was published in SZ magazine, one employee's phone would not stop buzzing. She kept receiving messages from her colleagues: "Have you seen this?" – "I wonder who talked to the press." – "Are we all losing our jobs?" No one knows that she is one of the whistleblowers who talked to journalists to make the working conditions public.

I had to do my best to act as if I was surprised to see the article – as soon as it came out people started looking for the source.

Word got around quickly among the 600 employees of the Bertelsmann subsidiary Arvato who work in shifts to delete content containing violence, hate speech and child pornography from Facebook. They are not allowed to talk to the press; even politicians are refused entry to the office in Berlin – although the work done inside is exceptionally controversial and of high public interest. Content moderators decide which posts reported as abuse get removed and which stay online. They are exposed to the darkest sides of human behaviour: torture videos of humans and animals, executions. Many employees are relieved that the article in SZ magazine triggered a wider public debate about the implications of the traumatizing work they do. And yet many feel intimidated by new internal rules.

Our supervisors demand that we report to them if we see anyone talking to the press.

The day the article came out, rumor had it that Facebook would no longer seek Arvato's services and everyone would be laid off. In the months that followed, current and former employees of Arvato were intimidated by Facebook's influence within the company and felt their fate was in the hands of an omnipotent corporation.

They generate a culture of fear which makes everything worse.
The refugees fear losing their jobs the most.

The Arabic team in Berlin consists largely of employees who fled their home countries because of terror and violence – and are desperate for employment. It is one of the largest teams in the Berlin office, which is divided by language. According to sources, the Arabic-speaking employees are the most vulnerable, not least because Arvato has opened another office in Morocco. Some content removal is already being done there, current and former Arvato employees report: a company called Phone Group, belonging to parent company Bertelsmann, is operating in Casablanca. One of its own job postings describes the client as a 'large social media company'. Insider reports reveal that Arvato executives traveled to Casablanca to give instructions on the rules for removing content on Facebook. Arvato employees report that there is cooperation and coordination between the Morocco and Berlin offices, as some 'Subject Matter Experts' supervising content moderators in Germany work from Morocco.

They constantly threaten with relocation to Africa or other countries in case of trouble.

The LinkedIn profile of the current Berlin office executive indicates that he has managed teams in both Germany and Casablanca, Morocco. The address of the Phone Group in Morocco and that of Arvato's official office there are identical. Arvato currently posts vacancies for content moderators in Morocco, where strict EU labour law does not apply. Neither Arvato nor Facebook responded to requests for details about the working conditions and psychological assistance of employees there.

About a year ago, reporters from SZ magazine first made contact with informants who remove content for Facebook at Arvato in Berlin. The employees were eager to make their working conditions public.

In December 2016 the report was published, partially covering the regulations on removing content.

That day, Minister of Justice Heiko Maas announced by phone that Facebook would come under more scrutiny. Just like everyone else, he was not permitted to enter the office. A bill requiring Facebook to be more transparent is still being fiercely debated. After the publication of the article, Arvato's working conditions were examined by the authorities. Meanwhile, a building currently under construction is planned to house 700 employees by the end of the year.


Many of the Berlin office employees are not contracted with Arvato but hired through recruitment agencies. An employee of one such agency, who wished to remain anonymous, reports that the same instructions were given there: no talking to the press, strictly prohibited by Facebook. She discloses that most people who find work at Arvato through recruitment agencies are desperate for work and have few or no expectations about the kind of work they will do. They usually receive a six-month contract; most of them do not speak German and cannot foresee the psychological damage the job will cause. Allegedly, none of the applicants ask about psychological counsel and support. And even if they wanted to, the recruiters could not provide any information: they have no idea what psychological support within Arvato actually encompasses. They were not allowed to inspect the office space they were sending future employees to.

Some of the informants from our research in 2016 have left their jobs at Arvato since we spoke to them. Other Arvato employees have contacted SZ magazine, confirmed the accuracy of our story and added: no effort has been made to improve the working conditions; instead, there has been an increased effort to isolate the office even more – literally. Glass doors facing the stairway as well as windows have been covered with privacy film so no one can look inside. As soon as someone outside the building talks to an employee, they are reminded by their managers: no one may talk. The office is part of a multi-storey office complex on Wohlrabedamm in Berlin, with a shared canteen on the ground floor. If you run into Arvato's executives, they immediately hand you a printed note that reads: we do not talk to the press, please send your request to our press secretary in Gütersloh.

They have increased security measures immensely: you’ll find more security personnel at the entrances, the no-phone policy is checked more strictly and if you want to use the restroom you’ll need to register.

The informants report that even ex-employees have received calls from former supervisors. They ask: Who talked? And how can we prevent people from talking? Some employees who in the past would point out problems internally now have to sign documents that say something along the lines of: I have not talked to journalists and will refrain from doing so.

Even though I still work there, I have given up. I often do not even glance at the pictures, I just randomly press buttons on the keyboard. Sometimes I remove content, sometimes I don’t. I cannot take any of this anymore.

The regulations on removing content are complex, secret, change constantly and are hardly comprehensible for employees. According to sources, the rules are strictly protected and only accessible via an internal browser, which requires a personal login and thus makes it easier to track who accessed the document and when – another measure to prevent the regulations from being leaked to the public.

SZ magazine sought to research the changes within the Arvato office. Neither Facebook nor Arvato properly responded to multiple requests; they left most points unanswered or gave only vague information. Facebook indicated 'great concern' about the depicted conditions and claims to be planning "a thorough psychological risk assessment to determine the well-being of our employees and if necessary, provide support". However, Facebook fails to explain why such an assessment is only being planned two years after operations started in the Berlin office.

Audit

Around four weeks after the article was published in SZ magazine, in mid-January 2017, auditors from the Authority for Occupational Safety and Health set foot in the Arvato office in Berlin – the first of a total of three audits. Our report and a letter from Würzburg-based attorney Chan-jo Jun, who had sued Facebook several times before, had alerted the authorities to the working conditions at Arvato. Their suspicion: Arvato had exposed its employees to psychologically damaging activities without ensuring any sort of support system.

Occupational safety legislation formally calls this a 'risk assessment', which must be carried out before work begins: if you want to operate an industrial furnace, for example, you must first determine what safety measures anyone operating it needs. The result of such an assessment shows how aware an employer is of the potential risks – and how seriously it abides by the assessment.

This does not only apply to physical work, but also to activities that involve psychological strain. Neither Arvato nor Facebook has disclosed whether such a risk assessment was conducted for the Facebook team.

However much companies like Facebook and Arvato isolate themselves from the public, no one can escape the Authority for Occupational Safety and Health, which must be given access to all documents and offices at all times. The result of the audit: a risk assessment exists, but of very poor quality. It did not, for example, provide a serious framework for psychological support for employees – evidently an indication that Arvato did not take its legal responsibility to provide such support seriously. Even so, the result of the audit had no consequences. A spokesperson for Berlin's Authority for Occupational Safety and Health explains: "Unfortunately we cannot sanction insufficient risk assessments if the shortcomings have been fixed after the authorities have been called in". In other words: a company is not required to deal with the psychological distress of its employees unless the authorities have been alerted and involved.

We are used to the fuss when some representative from Facebook shows up – everyone is expected to pretend like they love being here. I wouldn’t be surprised if it were the same when the authorities show up.

Two more audits followed, one of them unannounced. At the end of March, Berlin's Senator for Labour and Social Affairs issued a statement that the issue of psychological distress among employees had been addressed through "the implementation of offering counseling". Arvato emphasizes that the authorities found "no reason to intervene". The Berlin government assumes that Arvato knows very well how to handle the authorities and how to act in case of an audit: Arvato, in essence a service-providing outsourcing company, did administrative work for the Berlin state government for several years, and the Berlin Senate discloses that Arvato still has active connections within the local government. And yet Robert Rath, head of the Berlin Authority for Occupational Safety and Health, stresses that the fact that Arvato is aware of the psychological burden on its employees "should not be mistaken for acquittal". In order to push the ongoing investigation forward, Rath wants to talk to current and ex-employees in confidence.

Improvements

Despite the unwillingness of Arvato and Facebook to disclose information, employees have contacted SZ magazine and revealed that working conditions have somewhat improved. They report that the greatest risk factor remains the traumatic experience of checking abuse reports on Facebook. After the article was published, a certified psychologist replaced the social worker responsible for providing psychological support. The new in-house psychologist works 40-hour weeks and offers counsel – free of charge and during working hours. The social worker, whom most perceived as not sufficiently qualified, worked mostly in group sessions and did not create a culture of confidentiality. When asked, Arvato confirmed the improved psychological counsel and emphasizes that the counselors are bound by doctor-patient confidentiality.

I have never sought counsel with him, he only speaks German and English, but at least there is someone around now.

In addition, a social worker has been hired to support foreigners who do not speak German – who make up the majority of employees at Arvato.

I believe they have realized themselves that they need to act. At least now the executives listen when we tell them that the work has a tremendous impact on our wellbeing.

And still, many complain about persistently high pressure and an unreasonable workload. Employees are granted a few 15-minute breaks per day, spent away from their desks in a designated quiet area.

Unfortunately, the horrific images won’t leave my thoughts; they keep playing on repeat in my head.

A higher-ranking employee reports that the company “initially screwed up and completely underestimated the risks the work entails. I haven’t worked here long enough to say with confidence whether it was intentional or sheer ignorance.” And yet it should have been clear that removing content on Facebook involves traumatic experiences: as early as 2010 the New York Times reported on the psychological distress that click-workers in the US have to endure, and Wired magazine has repeatedly covered the risks of being a content moderator. Not to mention the German child protection agencies, whose staff have been exposed to violent content for decades and could have shared their experience – though Arvato made no effort to contact them.

Political Pressure

After the article was published in December 2016, Minister of Justice Heiko Maas announced that there would be legal consequences if Facebook failed to address the issues. In March 2017 his ministry duly drafted a bill that would hold companies such as Facebook accountable with fines of up to 50 million euros if criminal content was not removed faster and more carefully. For “obviously criminal content”, the bill prescribes removal within 24 hours of the abuse being reported.

I don’t see how this will work at all. Given the sheer amount of horrific images and videos, people are already overworked.

The ambitions of the bill are disputed. Critics see a potential infringement of freedom of speech: the new regulations could have a chilling effect on companies and lead to the automatic removal of anything that gets reported. Others fear that such state interference privatizes justice, arguing that only courts should rule on whether content is criminal. In fact, German law is already quite clear about what may be shared on the internet. Facebook is certainly not exempt from that law, yet it does not fully comply with it, removing content either too late or not at all. In addition, Facebook’s own “community standards” provide an internal framework for what may be posted. As SZ-magazine’s publication in 2016 revealed, hate speech against certain groups of people – senior citizens, migrants or the unemployed – is treated more mildly than hate speech against people of certain religions or people with disabilities. The details of the framework are not disclosed to the public; even representatives of the Ministry of Justice are denied access.

The bill also aims to make it legally binding for companies like Facebook to become more transparent: “Social networks should be legally required to disclose how they treat their content moderators – not only how they are briefed, but also what psychological support they receive,” explains Gerd Billen, state secretary at the Ministry of Justice. One reason for wanting such a bill lies in the lack of willingness to cooperate on the part of companies like Facebook: “We have been denied access to Arvato’s office in Berlin to this day,” Billen concludes.

Verdi, a labour union seeking to investigate the alleged improvements in working conditions, runs into the same obstacles. “At this point we have neither members nor any sort of contact with Arvato’s works council,” states a Verdi spokesperson. Hoping to uncover how Facebook works with the authorities, the Linke party submitted a request to the government – and received a disappointing reply: we don’t know anything. As if that weren’t alarming enough, Facebook appears negligent even in cases of serious criminal offences. In its reply to the Linke party’s request, the Ministry of Justice soberly states that “there are no accounts of Facebook contacting law enforcement to report criminal content”.

Future

In more uplifting news, former Arvato employees report that their mental health has improved since they left. Some have found new employment.

I try my best to forget everything. The less I think about it, the better I feel.
I have broken off contact with everything that relates to Arvato in even the slightest way.

One of Arvato’s former employees only found the courage to tell her mother what her job involved after the article was published. The mother, who lives abroad, was utterly appalled. Some of the informants plan to sue Arvato, claiming severe psychological damage. Most of them have sought professional help from psychologists, therapists and trauma experts; initial sessions have already revealed a few anomalies.

“Delayed psychological damage like this is entirely unnecessary. Adequate training and qualified counseling at the workplace can prevent people from suffering such trauma,” explains trauma expert Jane Stevenson. In 2002, Stevenson published her ‘Best Practices’, a set of guidelines for treating people who are exposed to traumatizing content. She blames the suffering of content moderators on the capitalist character of social media companies, which cut corners instead of providing an adequate system of support. Outside the corporate world – in secret services, the police and units prosecuting pedophile crime – the problem is widely recognized and proper mechanisms are in place.

In January, the lawsuit brought by former content moderators Henry Soto and Greg Blauert against their former employer Microsoft drew attention to the issue in the United States. Their claim: insufficient support in dealing with the psychological damage caused by their job. Their case shows that the problem goes well beyond Facebook: any company hosting user-generated content online has to deal with hate speech, pedophilia, violence and propaganda. And there is no end in sight.

On YouTube, around 90 percent of reported abusive content gets removed. YouTube’s long-term vision is to have computers automatically recognize and remove abusive content, but algorithms cannot yet make such nuanced decisions, so humans are still needed. Just like Facebook, YouTube has employees around the world removing content. The difference: YouTube already uses computers to analyze content, and in doing so trains AI systems to eventually make some of these decisions without human involvement.

Facebook has had plans to use AI since 2016. And yet the demand for content moderators keeps increasing. Services such as live video require even more attention and stricter oversight, as cases of live-streamed murders and rapes have shown. At the same time, the spread of fake news, hate speech and other manipulative content is growing – content that machines are as yet poorly equipped to detect.

Facebook CEO Mark Zuckerberg recently announced an increase in the number of content moderators from the current 4,500 to a staggering 7,500. Facebook did not say, however, whether these new moderators would be employed by Facebook itself or by external companies such as Arvato. The new content moderators won’t be short of work, as the content posted on Facebook appears to be getting ever more violent, vicious and uncontrollable. The killing of an elderly man in Cleveland. A father in Thailand who murdered his daughter and posted the video online. The New York Times predicts that dealing with such atrocities might become Facebook’s biggest threat. The appearance of such dreadful content in virtual reality applications could be one of the next steps in this technological development.

And Arvato continues to grow. The aim is to employ 700 people in Germany by the end of the year – the Berlin office is preparing by expanding its premises, and the grand opening is approaching. Renate Künast, former Minister of Consumer Protection, is one of many who have asked Facebook for permission to visit the Berlin office. And then, at the beginning of May, a spark of hope: Facebook lobbyist Eva Maria Kirschsieper grants Künast a visit to the office so she can get an impression for herself. They are still looking for a suitable slot in their busy schedules. Good things come to those who wait.