YouTube Kids and the corruption of recommendation algorithms

Recommendation engines have been around for years, at least since Amazon started correlating shared purchases (or was it only books back then?) and suggesting products with “since you bought this, you might like this.” It was a good-enough recommendation algorithm that helped shoppers sift through endless options and find what was relevant to them.
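(For the curious, here's a minimal sketch of that co-purchase idea in Python. The baskets and item names are invented, and Amazon's real item-to-item collaborative filtering is far more elaborate than a raw co-occurrence count.)

    # Count how often items are bought together, then recommend the
    # items that most frequently share a basket with a given purchase.
    from collections import Counter
    from itertools import combinations

    baskets = [
        {"book_a", "book_b", "mug"},
        {"book_a", "book_b"},
        {"book_b", "mug", "lamp"},
    ]

    co_counts = {}  # item -> Counter of items bought alongside it
    for basket in baskets:
        for x, y in combinations(sorted(basket), 2):
            co_counts.setdefault(x, Counter())[y] += 1
            co_counts.setdefault(y, Counter())[x] += 1

    def recommend(item, n=2):
        """Since you bought `item`, you might like..."""
        return [other for other, _ in co_counts.get(item, Counter()).most_common(n)]

    print(recommend("book_a"))  # -> ['book_b', 'mug']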

The goals for today’s recommendation engines haven’t changed much from those early years: find the user things they like in order to either sell them more stuff or keep them on the site longer and show them more ads, also known as “engagement.” Yet while today’s recommendation engines share those goals, their methods and results differ wildly. On one hand we have Facebook’s tinkering with its Newsfeed algorithm, where trying to increase engagement has negative results such as filter bubbles and the promotion of extreme content. On the other we have Spotify’s amazing discovery playlists, such as the Daily Mix, which almost always delight me with their selection of new-to-me music. In between we have Netflix, which sometimes gets it right, and more selective stores, like Nordstrom, that do a decent job of suggesting products others bought. For most of these the recommendation engine is a black box for users, and its effects on engagement are measured religiously.

Yet with all recommendation engines, especially when user-generated content is involved, there’s a game between the platform, which decides what to recommend, and content providers, who try to influence that decision. What got me thinking about just how intensely this game is played is a very detailed post from a few weeks ago about how independent producers are gaming YouTube Kids. “Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale,” said the author, James Bridle. At the time, the post seemed too extreme and I waited for other analysts to weigh in. This week, John Biggs at TechCrunch said conclusively: YouTube isn’t for kids. “YouTube is a cesspool of garbage kids content created by what seems to be a sentient, angry AI bent on teaching our kids that collectible toys are the road to happiness. YouTube isn’t for kids. If you give it to kids they will find themselves watching something that is completely nonsensical or something violent or something sexual. It’s inevitable.” It’s as damning a verdict as it gets.

YouTube’s and YouTube Kids’ reach is incredible. Earlier this week, Ofcom published its “Children and Parents: Media Use and Attitudes Report” for the UK, with these numbers:

[Chart: Younger children especially watch a lot of YouTube Kids. Source: Ofcom]

  • YouTube is the content provider that the highest proportion of 12-15s say they ‘ever’ watch (85%).
  • YouTube is the only content provider, of the 14 examples, which is used by a majority of 12-15s to ‘often’ watch content.
  • Use of the YouTube website or app increases with the age of the child: 48% of 3-4s, 71% of 5-7s, 81% of 8-11s and 90% of 12-15s. Use of YouTube has increased since 2016 by 11 percentage points for children aged 3-4, by 17 percentage points for 5-7s and by 8 percentage points for 8-11s. [Note: YouTube Kids was launched in February 2015.]
  • Half of YouTube users aged 3-4 (48%) and a quarter (25%) of those aged 5-7 use only the YouTube Kids app rather than the main YouTube website or app.

These numbers are high but not entirely surprising. Parents trusted Google when it said YouTube Kids was child-friendly: “the app makes it safer and easier for children to find videos on topics they want to explore.” Its availability on every device has made it easy to access from practically everywhere. But with that level of trust, could YouTube Kids have done more to monitor content?

Last week, YouTube issued a response that, judging by the comments on it, didn’t do enough. Of the five changes, only two went beyond guidelines: “tougher application of our Community Guidelines and faster enforcement through technology.” Ads will also be removed from “inappropriate content.” What YouTube didn’t do is allow parents to block specific providers or, as some requested, whitelist channels. The new restrictions simply don’t give parents enough control over what their children watch and what they can block.
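To make the ask concrete, a channel whitelist is not a hard feature to express. Here’s a hypothetical sketch in Python (the channel names and feed structure are invented for illustration):

    # Filter a recommended feed against a parent-managed whitelist of
    # channels -- the control YouTube Kids still doesn't offer.
    ALLOWED_CHANNELS = {"SesameStreet", "NatGeoKids"}

    recommended_feed = [
        {"title": "Counting with Elmo", "channel": "SesameStreet"},
        {"title": "Surprise Eggs #4817", "channel": "RandomToyUnboxer"},
    ]

    safe_feed = [v for v in recommended_feed if v["channel"] in ALLOWED_CHANNELS]
    print([v["title"] for v in safe_feed])  # -> ['Counting with Elmo']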

What this proves, beyond the eternal axiom that we just can’t have nice things, is that once a service allows user-created content, monitoring that content becomes extremely difficult. Beyond that, many creators, not only on YouTube, have figured out how to game the recommendation algorithm to get their content in front of viewers. I wish YouTube had taken a stronger stance here, such as blocking all creators aside from a few hand-picked ones until it figures this out. That might not be the most profitable choice, but it might be the best one from a product perspective. Until then, I have to agree with Mr Biggs: YouTube is not for kids.
