[dt_button link="http://d165vjqq8ey7jy.cloudfront.net/mp3/33253/se-8091s.mp3" target_blank="true" button_alignment="default" animation="fadeIn" size="small" style="default" bg_color_style="custom" bg_color="#333333" bg_hover_color_style="custom" bg_hover_color="#444444" text_color_style="custom" text_color="#ffffff" text_hover_color_style="custom" text_hover_color="#dddddd" icon="fa fa-cloud-download" icon_align="left"]Download[/dt_button]
[dt_divider style="thin" /]
Transcript:
Voice 1
Welcome to Spotlight. I’m Liz Waid.
Voice 2
And I’m Bruce Gulland. Spotlight uses a special English method of broadcasting. It is easier for people to understand, no matter where in the world they live.
Voice 1
A woman is working on her computer. An image appears on her screen. It is not an image the woman wants to see. In it, a man is killing a dog. The woman quickly dismisses the image. Soon after, another image appears. This image is also difficult to look at. In it, two people are performing violent sexual acts. The woman dismisses this image too. Seeing these images is clearly difficult for the woman. But as she sits at her computer, the images continue to come. She will not turn the computer off or get up and walk away. She dislikes them, but looking at these images is part of her job. She is a content moderator.
Voice 2
Many people work in a job like this. But people rarely think about the people doing these jobs. Today’s Spotlight is on the human cost of content moderation.
Voice 1
Today, almost half of the world’s population uses the internet. People use it to share content like videos, images and ideas. The internet can be a wonderful thing. It can help connect people and spread understanding. But some people also use the internet in negative or illegal ways. This includes sharing violent or offensive things. This kind of content can frighten and disturb people. It can hurt people mentally. Sadly, there are a lot of things like this on the internet. How can people avoid seeing this terrible content?
Voice 2
For many large online companies, content moderation solves this issue. Content moderation involves finding images and words that are criminal or violent. A content moderator decides what content is bad for people to see. Then they remove this disturbing content from websites. Computer programs perform some content moderation. But computers cannot always understand the difference between good and bad content. So, online companies employ people to do this work.
Voice 1
But the policies and rules about content moderation are often kept secret. No one wants to talk about it, but everyone does it. An executive from a company that uses content moderators spoke to the New York Times. He did not share his name. But he said:
Voice 3
“Online companies pay millions a year for content moderation. It provides some control over the internet, which for the most part is not controllable. If businesses do not do it, their chance to make more money will completely disappear.”
Voice 2
Content moderators remove many different kinds of offensive content. This content may include violence, pornography, child abuse, animal abuse, or hateful images. How does a moderator decide what should be deleted? Some of the decisions are simple. But others may be more difficult. They may change depending on the culture or a person’s beliefs. Dave Willner began working at Facebook in 2008. He told The Verge:
Voice 4
“We were told to take down anything that makes you feel bad, that makes you feel bad in your stomach.”
Voice 1
Seeing these terrible things has a cost. For example, a content moderator could see 100 terrible images or videos in a day. Seeing this many horrible images can be very bad for a person. Many moderators are not prepared to see so many terrible things. And it is a job that becomes more and more difficult over time. Rob is a former moderator for YouTube. He told Wired magazine:
Voice 5
“Everyone reaches their limit, usually between three and five months. You just think ‘What am I spending my day doing? This is horrible.’”
Voice 2
Some content moderators develop mental problems. The stories and images that they see change how they think about the world. Some moderators become anxious or fearful. Other moderators become depressed. Rob experienced this. The videos he watched every day made him feel horrible. So he began to drink a lot of alcohol. This made him feel less emotional pain. But it did not really help. He tells Wired:
Voice 5
“If someone made a video of violence against animals, they were usually the one who hurt the animal. That person was proud of it. And seeing it from the eyes of someone who was proud to do a horrible thing, not just a news report about it, it hurts you worse. It just gives you a much worse view of humanity.”
Voice 2
Rob left his job after only a few months. He could not deal with what he had to see.
Voice 1
In 2016, two content moderators accused the company Microsoft of not providing enough mental support for content moderators. Microsoft employed the two men for several years. During that time, the men saw many horrible things. The images and videos from their job invaded their private lives. One of the men was named Henry Soto. During his time at Microsoft, he saw many child pornography videos. These images of sexual acts involving children affected Soto deeply. Now, he cannot even see his own son without thinking about them. Microsoft says that it employed a counselor to talk to Soto about his mental health. But Soto says that this was not enough.
Voice 2
Many content moderators feel that companies do not treat them well. Many are from poor countries and are not paid well. No one warns them of how difficult their jobs will be. Ben Wells is the lawyer representing Henry Soto against Microsoft. He told Al Jazeera why it is important for businesses to address these problems.
Voice 6
“These internet service providers and tech companies make billions of dollars. But like a lot of products, there is a toxic by-product that comes as a result. And that toxic by-product needs to be managed. And the employees that work with that toxic by-product need to be protected.”
Voice 1
Terrible content will always exist on the internet. So websites will continue to need content moderation. Making the internet safe for all people seems like a good policy. But who will make the content moderators safe? There seems to be no easy solution.
Voice 2
What do you think? What kinds of support should businesses provide for content moderators? What is the best way to moderate content on the internet? Is content moderation necessary? You can leave a comment on our website. Or email us at radio@radioenglish.net. You can also comment on our Facebook page at Facebook.com/spotlightradio.
Voice 1
The writer of this programme was Dan Christmann. The producer was Michio Ozaki. The voices you heard were from the United States and the United Kingdom. All quotes were adapted for this programme and voiced by Spotlight. You can listen to this programme again, and read it, on the internet at www.radioenglish.net. This programme is called “The Internet’s Secret Job”.
Voice 1
Look for our free listening app in the Google Play store and in iTunes. We hope you can join us again for the next Spotlight programme. Goodbye.