YouTube is under fire for allowing troubling videos to get past its filters on an app designed specifically for younger viewers, according to a report this weekend by The New York Times.
The Google-owned website is the largest video site in the world, drawing more than a billion visitors a month. The affected service, YouTube Kids, launched in 2015 as a family-friendly version of the site.
But the kids service reportedly has a dark side. One video shows Mickey Mouse in a pool of blood while Minnie looks on in horror. In another, a claymation version of Spider-Man urinates on Elsa, the princess from "Frozen." The videos were knockoffs depicting the beloved Disney and Marvel characters.
Representatives from The Walt Disney Company, which owns Marvel, didn’t immediately respond to a request for comment.
YouTube called the content "unacceptable" but said it isn't rampant. Over the last 30 days, less than 0.005 percent of videos viewed in the app were removed for being inappropriate, the company said, and it's working to reduce that number further.
“The YouTube Kids team is made up of parents who care deeply about this, so it’s extremely important for us to get this right, and we act quickly when videos are brought to our attention,” a YouTube spokeswoman said in a statement. “We use a combination of machine learning, algorithms and community flagging to determine content in the app as well as which content runs ads. We agree this content is unacceptable and are committed to making the app better every day.”
The videos made it onto YouTube Kids by slipping past its safety filters, whether by mistake or because trolls gamed the software.
The controversy comes as tech giants face intense scrutiny from Congress over the power and influence they wield over what billions of people see online. Google, Facebook and Twitter spent last week answering lawmakers' questions about how Russian trolls abused their platforms to meddle in last year's US presidential election, and lawmakers grilled the companies over accountability for the algorithms they use.
This isn't the first time YouTube has faced a backlash over unsavory content. Earlier this year, advertisers boycotted YouTube after its automated advertising technology placed their ads next to extremist and hate content. Major brands, including AT&T and Johnson & Johnson, pulled their advertising from the platform.
As for the issues with YouTube Kids, the company said parents can use additional controls to limit what their kids see. The controls allow for blocking specific videos or channels and turning off search. YouTube said the app was never meant to be a curated experience, and that parents flagging inappropriate videos would make the app better over time.