How does the YouTube algorithm actually work?

Grace Sweeney

YouTube users watch a billion hours of video each day — from branded content and tons of music videos to cats and conspiracy theories.

But one thing that has long plagued content creators is the mysterious algorithm. Like other platforms, YouTube aims to show users what they want to see, using an automated system that sorts through the massive selection of content uploaded every second.

According to YouTube, their search and discovery system has two goals in mind. One is to help viewers find videos that they want to watch. Two is to maximize viewer engagement and satisfaction in the long term.

But it seems that there’s more at play here, from how the platform makes recommendations to who gets access to ad revenue.

Here’s a little more about what’s going on behind the scenes.

How the YouTube algorithm works

The algorithm has gone through a few iterations

Initially, YouTube cared about clicks and clicks alone. Which, of course, was a flawed metric. Counting each time viewers clicked “play” did not indicate quality, and creators started gaming the system with clickbait titles and descriptions to drum up higher rankings.

Then, in 2012, YouTube changed it up. Instead of relying on clicks, they opted to measure engagement, or how long people spent watching a video, which is a more accurate measure of whether videos meet viewer expectations.
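To make that shift concrete, here’s a minimal sketch, with made-up numbers and no connection to YouTube’s actual code, of how the same two videos rank under a clicks-only metric versus a watch-time metric:

```python
# Toy comparison of the two ranking signals described above.
# Video data and scoring are illustrative only, not YouTube's.

videos = [
    {"title": "Clickbait compilation", "clicks": 90_000, "avg_watch_seconds": 25},
    {"title": "In-depth tutorial", "clicks": 20_000, "avg_watch_seconds": 540},
]

def rank_by_clicks(vs):
    # Pre-2012 style: whoever attracts the most plays wins.
    return sorted(vs, key=lambda v: v["clicks"], reverse=True)

def rank_by_watch_time(vs):
    # Post-2012 style: rank by total time viewers actually spend watching.
    return sorted(vs, key=lambda v: v["clicks"] * v["avg_watch_seconds"], reverse=True)

print([v["title"] for v in rank_by_clicks(videos)])      # clickbait video first
print([v["title"] for v in rank_by_watch_time(videos)])  # tutorial first
```

The clickbait video wins on raw plays, but the tutorial wins once the ranking accounts for how long people actually stick around.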

In 2016, YouTube released a paper discussing how the algorithm worked. It’s geared toward a more technical audience, but long story short, the algorithm looks at how the audience interacts with video content, using AI that analyzes 80 billion pieces of audience feedback.

That feedback covers likes, dislikes, watch time, how many videos someone watches, and what they do or don’t watch.
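As a rough illustration of what combining those signals could look like, here’s a toy, hand-weighted score. The weights are invented; according to the 2016 paper, the real system learns these relationships with deep neural networks rather than a fixed formula.

```python
# Illustrative only: a hand-weighted score over the feedback signals listed above.
# The real system learns from this data with deep neural networks; these fixed
# weights just show the kinds of signals being combined.

def toy_engagement_score(likes, dislikes, watch_seconds, videos_watched_in_session):
    return (
        2.0 * likes                        # positive explicit feedback
        - 3.0 * dislikes                   # negative explicit feedback
        + 0.1 * watch_seconds              # implicit feedback: time actually spent
        + 5.0 * videos_watched_in_session  # implicit feedback: session depth
    )

print(toy_engagement_score(likes=120, dislikes=4, watch_seconds=3_600, videos_watched_in_session=6))
```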

These days, they’re experimenting with the order in which videos appear in an attempt to boost viewer satisfaction. However, the inner workings are still a big mystery to content creators and viewers alike.

Pew finds that popularity and length play a major role

Pew Research found that 64 percent of recommendations went to videos with more than a million views. And the 50 most recommended videos had been viewed, on average, 456 million times apiece.

As such, the research concluded that when YouTube doesn’t know much about your preferences, it will go ahead and recommend what’s popular. And the more you watch, the longer the suggested videos get.

See, each researcher took a “walk” through the platform, starting from a different point of entry.

On average, those initial videos were about nine minutes long. Videos at the end of the session averaged 14 minutes and 50 seconds, and had a much higher view count.

It seems that YouTube assumes that if a user watches multiple videos, they’re likely to hang around for a while.

Additionally, the research found that only about five percent of the recommendations offered went to videos with fewer than 50,000 views.

That said, the Pew research was conducted by anonymous users who did not have an existing watch history. YouTube makes recommendations based on a user’s habits, more or less getting to know them over time.

So, when the algorithm is working with a clean slate, it’s going to lump you in with the masses until you start offering up preferences — likes, dislikes, specific searches, and so on.
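Here’s a hypothetical sketch of that cold-start behavior, consistent with Pew’s findings but not based on YouTube’s actual code: with no history, fall back to raw popularity; once a history exists, lean on it.

```python
# Hypothetical cold-start logic consistent with Pew's observations,
# not YouTube's actual code.

def recommend(user_history, catalog, k=5):
    if not user_history:
        # Clean slate: fall back to whatever is most popular overall.
        return sorted(catalog, key=lambda v: v["views"], reverse=True)[:k]
    # Otherwise, prefer videos that share tags with what the user has watched
    # (shared tags are a stand-in for real similarity signals).
    seen_tags = {tag for v in user_history for tag in v["tags"]}
    return sorted(catalog, key=lambda v: len(seen_tags & set(v["tags"])), reverse=True)[:k]
```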

Lack of transparency has been a problem for content creators

Last year, several YouTube creators criticized the platform for experimenting with the video delivery system. YouTube stars have had issues with videos failing to display for subscribers or the platform pulling ads from the videos people do see.

The company started testing an algorithm that changed the order in which videos appeared in users’ feeds. Where videos used to display in a subscriber’s feed in chronological order, YouTube said it was testing an approach aimed at delivering the content users actually wanted to watch.
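The difference between the two orderings is easy to sketch. In the toy example below, the “predicted_interest” field stands in for whatever engagement model YouTube actually uses; the videos and scores are made up.

```python
from datetime import datetime

# Two ways to order a subscription feed. "predicted_interest" stands in for
# whatever engagement model YouTube actually uses; the data is made up.

feed = [
    {"title": "Daily vlog #412", "uploaded": datetime(2019, 1, 29), "predicted_interest": 0.2},
    {"title": "Rare deep dive", "uploaded": datetime(2019, 1, 27), "predicted_interest": 0.9},
]

chronological = sorted(feed, key=lambda v: v["uploaded"], reverse=True)
personalized = sorted(feed, key=lambda v: v["predicted_interest"], reverse=True)

print([v["title"] for v in chronological])  # newest upload first
print([v["title"] for v in personalized])   # what the model thinks you'll watch first
```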

The other major change came in the form of an ad-disabling effort, in which YouTube’s AI started flagging offensive content.

The so-called ad-pocalypse was a response to advertiser complaints about their products being shown alongside videos containing hate speech, violence, and other offensive content.

The problem, however, is that many YouTubers found themselves demonetized, with ads pulled over colorful language rather than hate speech or violence. While some of these creators may have violated specific rules, the platform failed to explain how creators could protect themselves or fix the situation.

While the published paper we mentioned above represented an act of transparency, content creators have long been frustrated with a lack of insight into how the platform works.

In response to the ad issues and surprise display changes, YouTubers have increasingly started posting longer videos to squeeze in more ads per video. The idea is that the algorithm will give preference to longer videos, since they bring in more ad dollars.
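The incentive is simple arithmetic. The sketch below assumes, purely for illustration, one pre-roll ad plus one mid-roll slot for every eight minutes of runtime; it doesn’t claim to reflect YouTube’s actual ad policy or payout rates.

```python
# Back-of-the-envelope math for the incentive described above. The one-pre-roll,
# one-mid-roll-per-eight-minutes rule is an assumption for illustration only,
# not YouTube's actual ad policy or payout model.

def estimated_ad_slots(video_minutes, minutes_per_midroll=8):
    pre_roll = 1
    mid_rolls = video_minutes // minutes_per_midroll
    return pre_roll + mid_rolls

for length in (5, 9, 15, 30):
    print(f"{length}-minute video: roughly {estimated_ad_slots(length)} ad slot(s)")
```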

The algorithm can send viewers on a dark path

Take, for example, this article in The Atlantic, which poses the question: does YouTube unwittingly radicalize its viewers? In it, they cite the scholar Zeynep Tufekci, who studies sociology in the age of the internet. She found that after she watched Donald Trump rallies on the platform, autoplay started recommending videos that took a sharp turn away from mainstream political content: think Holocaust denial and conspiracy theories.

Then she tried watching videos of Bernie Sanders and Hillary Clinton, and again found that autoplay began recommending conspiracy videos, this time with a leftist bent.

The same thing happened with apolitical content — looking up videos about jogging eventually led viewers to a selection of videos about running ultramarathons.

Tufekci’s theory is that the algorithm exploits a natural human desire to dig deeper and uncover secrets, sending viewers toward an increasingly extreme menu of content.

Okay, time for yet another pivot. YouTube said today that it will try to show fewer fringe videos. It won’t remove the clips, but the algorithm will try to serve them up less frequently as “watch next” options.
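Serving fringe videos up less frequently without removing them suggests a re-ranking step along the lines of the hypothetical sketch below, where a flagged video stays eligible but takes a heavy score penalty. The flag, the penalty factor, and the relevance scores are all invented for illustration.

```python
# Hypothetical "recommend less often without removing": a flagged video stays
# in the candidate pool but takes a heavy score penalty. The flag, the penalty,
# and the relevance scores are invented for illustration.

def watch_next_ranking(candidates):
    def adjusted_score(video):
        score = video["relevance"]
        if video.get("borderline"):
            score *= 0.1  # still eligible, just far less likely to surface
        return score
    return sorted(candidates, key=adjusted_score, reverse=True)

suggestions = watch_next_ranking([
    {"title": "Mainstream explainer", "relevance": 0.7},
    {"title": "Fringe conspiracy clip", "relevance": 0.9, "borderline": True},
])
print([v["title"] for v in suggestions])  # the mainstream video now ranks first
```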

The Pew report we mentioned above also notes that half of the YouTube users surveyed use the platform to understand the world around them. When you consider that 68 percent of Americans get at least some of their news from social media, that may be a real problem.

Chances are YouTube doesn’t fully understand the algorithm, either

At a certain point, algorithms start taking on a life of their own, absorbing what they’ve learned in the programming stage and from user behavior and kind of, well, running with it. We doubt the platform is actively trying to instill extremist behavior in its users or prevent creators from appearing in the suggested results.

But, it is a problem when you consider that some ads may have been pulled unfairly. It’s also a problem that some people believe the platform is a reliable source of information.

You might be able to learn how to grill salmon in a pan or DIY a new coffee table, but it’s maybe not the best place to learn about global events or your choice of political candidates.

Grace Sweeney

Grace is a painter turned freelance writer who specializes in blogging, content strategy, and sales copy. She primarily lends her skills to SaaS, tech, and digital marketing companies.