Content Moderation Case Studies


from the find-the-monsters-among-us dept

Summary: YouTube offers an endless stream of videos that cater to viewers of all ages and has become a go-to content provider for kids and their parents. The market for children’s videos remains wide open, with new competitors surfacing daily and using repetition, familiarity, and keyword strings to put their videos in front of kids willing to spend hours clicking on whatever thumbnails spark their interest, and YouTube is the leader in this market.

Taking advantage of the low expectations of extremely young viewers, children’s YouTube videos are filled with low-cost, low-effort content – videos that use familiar songs, bright colors, and pop culture props to attract and hold children’s attention.

Most of this content is harmless. But a much darker strain of content was exposed by amateur internet sleuths, in a phenomenon quickly dubbed “Elsagate,” borrowing the name of the main character from Disney’s massively popular animated hit, Frozen. On the subreddit r/ElsaGate, redditors catalogued videos aimed at children that contained adult themes, sexual activity, or other content unsuitable for kids.

Some of the decidedly child-unfriendly topics catalogued by r/ElsaGate include injections, blood, suicide, pregnancy, BDSM, assault, rape, murder, cannibalism, and alcohol use. Most of these acts were performed by animated characters (or actors dressed as those characters), including the titular Elsa as well as Spiderman, Peppa Pig, Paw Patrol, and Mickey Mouse. According to parents, users, and members of the r/ElsaGate subreddit, some of this content was accessible via the YouTube Kids app – a child-oriented version of YouTube subject to tighter controls and curated content intended to keep children away from adult subjects.

Additional attention was drawn to the matter by James Bridle’s post on the subject, titled “Something is wrong on the internet.” The post – preceded by numerous content warnings – detailed the sheer amount of disturbing content that easily found its way to younger viewers, mainly thanks to child-friendly tags and innocuous thumbnails.

The end result, according to Bridle, was simply horrifying:

“To expose children to this content is abuse. We’re not talking about the debatable but undoubtedly real effects of film or video game violence on teenagers, or the effects of pornography or extreme images on young minds, which I alluded to in my opening description of my own teenage internet use. Those are important debates, but they’re not what is being discussed here. What we’re talking about is very young children, effectively from birth, being deliberately targeted with content which will traumatise and disturb them, via networks which are extremely vulnerable to exactly this form of abuse. It’s not about trolls, but about a kind of violence inherent in the combination of digital systems and capitalist incentives. It’s down to that level of the metal.” – James Bridle

“Elsagate” also received mainstream coverage. A New York Times article on the topic asked what had happened and suggested the videos had slipped past the YouTube algorithms meant to ensure that the content appearing on its Kids app was genuinely appropriate for children. YouTube’s response when asked for comment was that this content was “the extreme needle in the haystack,” perhaps an immeasurably small percentage of the total amount of content available on YouTube Kids. Needless to say, this response did not satisfy critics, many of whom suggested the online content giant rely less on automated moderation when dealing with content aimed at children.

Company Considerations:

  • How should content review and moderation be different for content targeting young YouTube users?
  • How could a verification process be deployed to monitor users creating content for children?
  • What processes can be used to make it easier to find and remove or restrict content that seems kid-friendly but is actually filled with adult material?
  • When content like that described in this case study slips through the moderation process, what can be done to restore the trust of users, especially those with young children?

Issue Considerations:

  • Should a product targeting children be separated from the main product to ensure content integrity and make it easier to manage moderation issues?
  • Does creating a product specifically for children increase the chances of regulation or direct intervention by government entities? If so, how can a company prepare for this possibility?
  • If you are creating a “restricted” product for children, should it require all content to be fully and thoroughly vetted? If so, would it become prohibitively expensive, making it much less likely for companies to create products for children? Is there a way to balance these things?

Resolution: Immediately after these reports, YouTube purged content from YouTube Kids that did not meet its standards. It removed videos and posted new guidelines for contributors. It also added a large number of new human moderators, bringing its total moderation staff to 10,000. In addition, YouTube deleted the hugely popular “Toy Freaks” channel after users who investigated its content suggested it contained child abuse.

YouTube was not the only entity to act once the “Elsagate” videos made worldwide news. Many of these videos originated in China, which prompted the Chinese government to block certain search keywords to limit local access to the disturbing content, as well as to shut down at least one company involved in creating these videos.

Originally posted on the Trust & Safety Foundation website.

Filed under: content moderation, elsagate, children
Companies: youtube

