Cracking the Algorithm: Taking a Look Inside YouTube

Students find the YouTube algorithm unpredictable and difficult to understand

Kimberly Cui

High school and college students spent increasing amounts of time on YouTube this past summer, and many have found themselves at odds with the platform's algorithm.

With COVID-19 keeping everyone indoors, YouTube became a quick escape for entertainment. Like Instagram’s bottomless Explore page, YouTube’s algorithm is designed to suck users into a “rabbit hole.” However, a deeper investigation into the YouTube algorithm reveals the many problems hidden inside this popular entertainment outlet.

Anyone who spends a considerable amount of time on social media platforms like Instagram, Snapchat or YouTube knows how fast time flies while doomscrolling.

YouTube is an entertainment machine designed to feed its audience tailored videos (videos suited to the individual’s taste) through a complex machine learning system, increasing the time users spend watching on the platform. With every click, the algorithm learns to push media that appeals to the individual so that they will stay active on the platform.
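
The mechanism can be illustrated in miniature. The sketch below is purely hypothetical, not YouTube’s actual code: it shows how a simple recommender that reweights videos by past watch time creates the feedback loop described above, where every view nudges future recommendations toward similar content.

```python
import random

# Hypothetical toy recommender: scores are learned only from watch time,
# so whatever a user watches longest gets recommended more often.
class ToyRecommender:
    def __init__(self, videos):
        # Every video starts with the same neutral score.
        self.scores = {video: 1.0 for video in videos}

    def recommend(self, k=3):
        # Rank videos by current score; the highest scores surface first.
        return sorted(self.scores, key=self.scores.get, reverse=True)[:k]

    def record_watch(self, video, seconds_watched):
        # Every second watched feeds back into the score, so
        # engagement begets more recommendations of the same video.
        self.scores[video] += seconds_watched / 60.0

recommender = ToyRecommender(["cooking", "gaming", "news", "vlogs"])
for _ in range(5):
    pick = recommender.recommend(k=1)[0]
    # Simulate a user who lingers on whatever is shown to them.
    recommender.record_watch(pick, seconds_watched=random.randint(30, 300))

print(recommender.recommend())  # the early favorite now dominates the feed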

Pranav Dhillip, a student at the University of California, Santa Cruz, a Dougherty Valley High School alumnus, and a videographer and video editor at the City on the Hill press, has a YouTube channel where he produces and posts short films. “I have spent years trying to figure out how the YouTube algorithm worked. I still don’t understand it; I feel like it is so random sometimes,” Dhillip reflected.

However, with this algorithm, new problems have emerged. An increasing number of people turn to social media platforms such as YouTube as a primary source of news and information. But with no concrete system for verifying the credibility of the information fed to users, the platform allows misinformation to spread quickly and become popularized. It merely surfaces information based on popularity, a metric prone to deception, as many users do not go to the lengths of fact-checking details.
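
To see why popularity alone is a weak filter, consider a hypothetical ranking function like the one sketched below. It sorts purely by views and likes; the accuracy label exists only in this toy example, and nothing in the ranking ever looks at it, so a viral false video outranks a careful report by construction.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    views: int
    likes: int
    is_accurate: bool  # hypothetical label for illustration only

def rank_by_popularity(videos):
    # Popularity-only ranking: nothing here inspects is_accurate,
    # so credibility never influences what surfaces first.
    return sorted(videos, key=lambda v: v.views + 10 * v.likes, reverse=True)

feed = [
    Video("Careful fact-checked report", views=5_000, likes=200, is_accurate=True),
    Video("Sensational false claim", views=900_000, likes=40_000, is_accurate=False),
]

for video in rank_by_popularity(feed):
    print(video.title, video.views)
# The false but viral video outranks the accurate one every time.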

Guillaume Chaslot, a software engineer who worked on YouTube’s recommendation system from 2010 to 2011, reasons, “This is dangerous because this is an algorithm that’s gaslighting people to make them believe that everybody lies to them just for the sake of watch time.”

The platform must strike a fine balance between user experience and profit. If the algorithm is too profit-oriented, user experience will plummet, along with the platform’s credibility and moral standing. Too user-experience-oriented, however, and the profiteers will not be satisfied.

An algorithm that lacks human judgment and does not understand boundaries, or which lines not to cross, can have dangerous effects. A report published in 2020 by the social-activism nonprofit Avaaz showed just how far misinformation can spread. After collecting more than 5,000 videos, Avaaz found that 16% of the results for the search “global warming” depicted false information. Searches for “climate change” and the even more misdirected “climate manipulation” returned misinformation in 9% and 21% of videos, respectively.

YouTube’s executives have revealed that this recommendation algorithm drives over 70% of watched content on YouTube, creating a tailored, individual experience for each user. After all, more watch time means more revenue, which leads to the problem of neglected user experience (how one interacts with and experiences a product). The platform’s priority is shifting from solely entertaining users to making money while entertaining them.

Recode executive editor and MSNBC contributor Kara Swisher says, “They [YouTube] may not be meaning to do it, but if growth is the goal, then user experience is not the goal.”

What does this neglect of user experience signify?

As these issues have come to light, the algorithm has drawn immense criticism. One possible solution to this increasingly concerning problem is to strip the platform of its recommendation system entirely. However, this is highly unlikely, as the platform depends heavily on recommendations. The next best step might be to increase human involvement in the recommendation system. Although this will not fix the mass of problems underlying the platform, it can at the very least uphold certain morals.

YouTube has numerous problems in the way the platform functions and pushes videos out to viewers. But despite these flaws, many people still come to the platform seeking knowledge and information, which only increases the urgency of solving the algorithm’s problems.