
'No more faith in humanity': A day in the life of Berlin Facebook moderators

DPA/The Local
Inside Facebook's moderation centre in Berlin. Photo: DPA.

For the first time, Facebook opened up its Berlin centre for deleting hateful or violent content, providing journalists with a glimpse into the workers' everyday dealings with decapitation videos, racist propaganda and child pornography.


“I still remember the first beheading video - I turned it off, went outside and wept a little bit,” said one female employee.

But she said this was her only breakdown, because that first time she had been unprepared.

“Now we’re so used to it that it’s not so horrible anymore,” the 28-year-old explained.

This was the first time that journalists were allowed to speak with three workers at Facebook’s deletion centre, though the workers were not allowed to give their names, in order to protect their identities.

In total, 650 people work in this multifaceted operation to examine and delete posts which could be considered illegal, or against Facebook’s own rules.

They alert Facebook when they believe that someone could harm themselves or others. These workers have already been able to prevent suicides through subsequent contact with police, they say.

Among their less stressful tasks is verifying the authenticity of accounts.

Facebook is now facing increased pressure from the German government to crack down on hate speech, after the Bundestag (German parliament) recently passed a law to fine social media companies up to €50 million for not swiftly removing illegal content.

The legislation - one of the toughest in the world - came amid a rise in racist content posted online, often in response to the refugee crisis, which has seen around one million asylum seekers arrive in Germany since 2015.

Holocaust denial, incitement of hatred, as well as racist and anti-Semitic speech are all illegal under German law.

But opponents of the so-called "hate speech law" have cautioned that the fines could stifle free speech, with social networks opting to delete rather than thoroughly vet content out of fear of being punished. Facebook itself condemned the law before its passage for allowing the state to “pass on its own failures and responsibilities to private companies”.

SEE ALSO: Opinion - How Germany's 'hate speech law' will put control of free speech in private hands

In recent months, German media have published critical reports about working conditions at Facebook's Berlin moderation centre, which is run for Facebook by the global services company Arvato, a division of the Bertelsmann media group. Former employees claimed that they felt psychologically strained by the work and were not properly cared for by their employer.

“As a team leader, I do not know if someone needs support or not,” said one employee during the media visit, noting that workers are told to report such issues themselves.

“No one can read minds,” said one of his female colleagues. “And the support has already been available for some time.”

Inside the building, stickers with contact information for psychological support are posted around the workspaces - something Arvato manager Karsten König said was not always the case. He admitted that the company could perhaps have made these offers more prominent from the beginning.

The employees themselves - speaking under the watch of both Arvato and Facebook spokespeople - said they had been hurt by the reports of poor treatment.

“I was really upset about it,” said one of them, noting that it had cast a cloud over the team’s work.

“We save lives, we try to help people.”

Her co-worker chimed in with agreement.

“We feel good about what we do. When I can save someone from seeing something through my work, then I find that really good,” she said, adding that if she had kids, she also would not want them to stumble upon certain content.

The office itself feels like any other at a big company: long rows of desks seating ten to twelve people each, with space for around 60 people per room. But there is also a white wall bearing the iconic Facebook “like” thumbs-up, and fruit and vegetables up for grabs. The office also offers yoga classes, and a “feel good” manager is on hand to help troubled workers.

READ ALSO: Facebook launches battle against 'fake news' in Germany

Each of the three workers speaking with reporters had been at the moderation centre for at least a year. Newcomers are given a week of orientation, followed by several weeks of training for specific tasks, explained Facebook manager Walter Hafner. They also "shadow" a colleague before moving into a new field.

The employee who was shocked by the beheading video said she had already been shown the post during the orientation phase.

Then she was given “high priority” content that needed quick action, such as posts about self-harm or suicide.

“I realized then that I could not handle it very well, and asked to no longer have to deal with it,” she explained.

But how long can workers put up with such tasks?

“Definitely not for years. You of course want to develop yourself further,” she said.

But her colleague said he coped better with the unpleasant content.

“I personally have never been disturbed by the content,” he said, noting that he has seen everything from child porn, to animal abuse, to murders.

“It’s not that I think it’s nice, but that I can always separate between work and personal things.”

He added that he did once go to a psychologist for preventative care.

As human filters for the internet, these workers admit that the job does change people.

“It certainly sensitizes you,” said one woman, noting that someone might not notice a woman on the S-Bahn with scars on her hands unless they were used to looking for signs of self-harm.

“People do terrible things to themselves,” said one woman.

“I personally did not have much faith in humankind beforehand, and now I virtually do not have any.”

Reporting by Andrej Sokolow, DPA


