We can all agree that social media is ingrained in our lives. For the Facebook users out there, we scroll through our news feeds almost daily, sometimes more than once a day! But do you ever stop to think about how stories, posts, and videos end up on that feed?

In this post, we discuss how Facebook decides what to show us on our feeds and the ethical implications this has for us.

How does the Facebook News Feed algorithm work?

The video above gives a great explanation of how the Facebook News Feed algorithm works. As it mentions, Facebook takes three factors into account when deciding how to rank content on our news feeds:

  1. Your connections and friends on Facebook
  2. How many likes, comments, and shares the post has
  3. How old the post is

Essentially, this means that Facebook predicts what each user is most likely to engage with and pushes that content towards the top of their feed. For instance, if you liked Page A and a friend shared, liked, or commented on something related to Page A, more likely than not you will see that post near the top of your news feed.

However, many Pages have started to unethically game the system to stay at the top of Facebook users’ feeds. One way this was achieved was by posting articles with headlines that mislead users into clicking through, in order to stay relevant and drive up engagement.

Recently, however, Facebook has added a fourth factor: how long users spend viewing the article. This combats Pages trying to circumvent the ranking system and encourages a higher quality of content on Facebook.
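To make the four factors concrete, here is a minimal toy sketch of how such a ranking score might combine them. The real News Feed algorithm is proprietary; the weights, field names, and formula below are illustrative assumptions, not Facebook's actual method.

```python
# Toy feed-ranking sketch: combines the four factors discussed above.
# All weights and the formula itself are illustrative assumptions.
import math
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float   # factor 1: 0..1, how close you are to the poster
    likes: int               # factor 2: engagement counts
    comments: int
    shares: int
    age_hours: float         # factor 3: how old the post is
    avg_view_seconds: float  # factor 4: time users spend viewing it

def rank_score(p: Post) -> float:
    # Heavier actions (comments, shares) weigh more than likes;
    # log-dampening keeps viral counts from dominating everything.
    engagement = math.log1p(p.likes + 2 * p.comments + 3 * p.shares)
    decay = math.exp(-p.age_hours / 24)        # older posts fade away
    dwell = math.log1p(p.avg_view_seconds)     # rewards content people actually read
    return p.author_affinity * (1 + engagement) * decay * (1 + dwell)

posts = [
    Post(0.9, 10, 2, 0, 2.0, 40.0),    # close friend, fresh, genuinely read
    Post(0.2, 500, 80, 30, 1.0, 3.0),  # viral clickbait, barely viewed
]
feed = sorted(posts, key=rank_score, reverse=True)
```

Under these assumed weights, the well-read post from a close friend outranks the barely-viewed viral post, which is the behaviour the dwell-time factor was added to encourage.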

Should Facebook control what we see on our feeds?

However, Facebook has received a lot of criticism over how it controls what content we see on our feeds. A direct consequence of the News Feed algorithm is that it predicts, fairly accurately, the content we want to see. This means users can become isolated within an online bubble, limited to the opinions of other like-minded individuals, with little diversity of opinion from outside that group.

Moving beyond your personal news feed, should Facebook be able to control sensitive topics in everyone’s feed?

Here’s an example. With the 2016 US Presidential election approaching, Donald Trump has been talked about worldwide for the controversial policies and rallies of his campaign.

In a recent Facebook company poll for questions to ask Mark Zuckerberg at the regularly held CEO Q & A, one of the suggested questions read: “What responsibility does Facebook have to help prevent President Trump in 2017?” While Facebook would be within its rights to withhold Trump-related content from its News Feed, should it? Should we place our trust in a handful of Facebook employees to make these ethical decisions for us, the users?

While there may be no correct answer to that question, Facebook and other social media sites have come under fire in recent years for similar scenarios. For instance, users were outraged that the 2014 Ferguson protests did not show up in the ‘Trending’ topics section on either Twitter or Facebook, despite the ongoing #Ferguson tweets at the time. While the topic’s absence for certain users was explained by Twitter’s algorithm (which works similarly to Facebook’s), it still raises the question of whether social media platforms should be able to block certain content from our feeds.

Why are we more concerned about this than when it happens in other media forms?

Censorship and selective inclusion of content are nothing new. We experience this every day as newspapers and media outlets choose what news we read or watch, which political candidates to endorse, and, essentially, curate the opinions and attitudes they want the public to hold. So why do we feel so strongly when social media platforms such as Facebook do the same to us?


References

Ayres, S. (2014). The Facebook Algorithm Changes, AGAIN!.. Here’s the Scoop. [online] Post Planner. Available at: https://www.postplanner.com/the-facebook-algorithm-changes-again/ [Accessed 11 May 2016].

Gillespie, T. (2014). Facebook’s algorithm — why our assumptions are wrong, and our concerns are right – Culture Digitally. [online] Culture Digitally. Available at: http://culturedigitally.org/2014/07/facebooks-algorithm-why-our-assumptions-are-wrong-and-our-concerns-are-right/ [Accessed 21 May 2016].

Myers, M. (2016). Facebook Employees Want to Stop Trump, Perhaps By Hiding Posts About Him. They Can Do That?. [online] The Mary Sue. Available at: http://www.themarysue.com/facebook-vs-trump/ [Accessed 19 May 2016].

Perez, S. (2014). Why #Ferguson Wasn’t Trending For Some Social Media Users Last Night. [online] TechCrunch. Available at: http://techcrunch.com/2014/08/14/why-ferguson-wasnt-trending-for-some-social-media-users-last-night/ [Accessed 14 May 2016].

Does the way Facebook curates your news feed bother you? Tell us why below, or answer our poll.