Understanding social media algorithms: what makes your feed look the way it does


Algorithms: you hear about them, you know they’re behind your screens, and you may even be here as a result of one right now. Still high off the anxiety-driven craze that was Netflix’s The Social Dilemma, many have found themselves curious about what exactly algorithms are. You always hear people go on about mindless sheep operating in an echo chamber thanks to the algorithm, but what does this mean?

The conversations surrounding algorithms are more frequent now than in the past because they've entered the mainstream. It's no longer just computer scientists who are aware that computers can learn on their own when given a few simple instructions. That's really all algorithms are: mathematical instructions.

An algorithm is like a recipe for baking a cake. What's startling, though, is that unlike a cake recipe, you sometimes don't have to tell the computer exactly what to do at every step for a result to materialise. Rather than follow only explicitly programmed instructions, some algorithms are designed to let computers learn on their own (i.e., they facilitate machine learning).
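To make the recipe analogy concrete, here is a toy sketch (not any platform's real code) contrasting the two kinds of algorithm the paragraph describes: one that follows fixed, explicit steps, and one that "learns" a preference from data. The function names and the click data are invented for illustration.

```python
def bake_cake(ingredients):
    """A 'recipe' algorithm: a fixed sequence of explicit steps."""
    batter = " + ".join(sorted(ingredients))  # step 1: combine ingredients
    return f"cake({batter})"                  # step 2: bake

def learn_preference(clicks):
    """A crude 'learning' algorithm: infer a user's favourite topic
    by counting past clicks, with no topic hard-coded in advance."""
    counts = {}
    for topic in clicks:
        counts[topic] = counts.get(topic, 0) + 1
    return max(counts, key=counts.get)

print(learn_preference(["politics", "sports", "politics"]))  # -> politics
```

The recipe function produces the same cake every time; the learning function's answer depends entirely on the data it is fed, which is the distinction the paragraph is drawing.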

Finding your algorithm

This capability has trickled down into the election-influencing, 'fake news' fear-mongers that are social media algorithms. The more you search for content, the more the algorithms learn about you. By serving content tailored to your previous searches, algorithms undoubtedly influence, alter or reinforce your world views. Before the switch to algorithmic feeds, most social media platforms displayed posts in reverse chronological order: the newest posts from accounts a user followed showed up first. Now it's all about relevancy.

Take the experiment done by WIRED’s Sinead Bovell, as an example. Bovell created three different accounts on three different laptops (which had their history and cache cleared). On each account, she searched for content that aligned with one US political party over the other: Fox News for more conservative news, MSNBC for more liberal news, and ABC for a more neutral stance.

After just one video search on each of these news stations, Bovell's feeds became noticeably tailored towards similar videos. In this way, social media algorithms keep us tucked away in our own little echo chambers of compatible discourse. Social media companies know that if you support Trump, you'll want to see more pro-Trump content. And they're going to give it to you, to increase traffic and keep advertisers paying.
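The dynamic Bovell's experiment surfaced — one search snowballing into a tailored feed — can be sketched as a simple feedback loop. Again, a toy model under stated assumptions: the greedy "show the highest-weighted topic" policy and the weight values are invented, not a real recommender.

```python
# Start with no preference between three hypothetical content categories.
weights = {"conservative": 1.0, "liberal": 1.0, "neutral": 1.0}

def recommend(w):
    # Greedy toy policy: always show the topic with the highest weight.
    return max(w, key=w.get)

weights["conservative"] += 0.1  # one search tips the balance slightly

for _ in range(20):
    topic = recommend(weights)
    weights[topic] += 1.0  # each view reinforces the topic just shown

# The tiny initial nudge compounds: the feed ends up dominated by it.
print(max(weights, key=weights.get))
```

Because every recommendation feeds back into the weights, a 0.1 head start wins all twenty rounds — a caricature of how one Fox News or MSNBC search can tilt an entire feed.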

It’s not hard to see why social media algorithms are controversial. Algorithms can influence us, even if we’re not aware of it. As the New York Times’ Rabbit Hole podcast explores, YouTube’s recommendation algorithms can drive viewers to increasingly extreme content, potentially leading to online radicalisation. It raises the question: just how many of your thoughts and behaviours are truly your own?

Recently, we’ve been deliberately trying to mess with the algorithms that rule our worlds, without much success. Years of searches have established a solid pattern, and there are only so many videos on why feminism is bad we can click on before wanting to pull our hair out; even the big bad algorithm can’t fully reverse course on that.

Starting with a clean slate would likely make all the difference, but the algorithms around us already have a lot of our data to work from. Shifting that… isn’t easy. At the end of the day, underneath all the algorithms are people. And we influence the algorithms just as much as they may influence us.
