As digitalization of news has become pervasive across society, Facebook has become the de facto public forum for sharing media content. As of 2017, 67% of U.S. adults get at least some of their news from social media. Through a supply chain lens, one could consider media content Facebook’s “customer offering” and its recommendation algorithms the supply chain. Rapid advances in machine learning have made this supply chain more efficient via complex algorithms trained on massive data sets, improving Facebook’s ability to connect users with media they choose to engage with. If user engagement is the desired metric, these algorithms have unquestionably improved the Facebook user experience. However, this approach has two drawbacks: the unpredictability that comes from “black box” systems, and the creation of “echo chambers” as users’ feedback loops silo them into “spheres of information where we’re rarely subjected to views that differ from our own.”
Black box models are those where only the outputs are visible; Facebook’s proprietary news recommendation algorithm is an example. These models trade explanatory power and oversight for high performance: “Users have no control over what algorithms decide to show us, and little understanding of why they may learn to show you one piece of content over another.” In fact, even Facebook employees cannot definitively say what factors led to news being displayed a certain way for a certain person, or why tweaking inputs slightly can change result sets massively. Combining black box algorithms with the general user’s “human tendency to select information that confirms their existing beliefs” has a troubling result. Research has also shown that Facebook exacerbates this problem by restricting “cross-cutting” content, or content from outside one’s political sphere: “After ranking [news content], there is on average slightly less crosscutting content.” Facebook is enabling the formation of echo chambers of like-minded users, without knowing how those echo chambers will affect the future profitability of Facebook or human society as a whole.
There are three actors in this equation: content producers, Facebook, and content consumers. Facebook must address this concern by answering three questions: What is their role in selecting which content to show, broadly? What are their intentions toward the users who consume that content? And how transparent should their algorithms be? Currently, there is a lack of clarity on all three fronts. To solve this problem going forward, Facebook must provide transparency in every role they play.
As “Fake News” has entered the collective vocabulary, Facebook’s role in filtering content has been called into question: are they an impartial aggregator, truly indifferent to content, or an editor, emphasizing “Real News” from trustworthy sources? In the short term, Facebook can build a system to flag and filter fake news articles written by trolls, along with other spam-like content. In both the short and long term, Facebook must define its position to maintain the trust of content producers and consumers.
The second question Facebook must answer concerns their intentions and objectives. As Hosanagar wrote, “Why should Facebook be obligated to show content with which we don’t engage?” Is the echo chamber even Facebook’s responsibility to fix? If Facebook concludes that the echo chamber is worth fixing, they can tweak their content delivery system to surface media content popular in other circles, perhaps in communities that hold contradictory views.
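To make the idea of such a tweak concrete, one could imagine a diversity-aware re-ranker. The sketch below is purely illustrative and assumes nothing about Facebook’s actual system: every name here (Item, engagement_score, source_lean, diversity_weight) is a hypothetical stand-in, and the lean scores would have to come from some labeled list of sources.

```python
# Illustrative sketch only: a toy re-ranker that blends predicted
# engagement with a bonus for cross-cutting content. All names and
# numbers are hypothetical, not Facebook's actual algorithm.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    engagement_score: float  # predicted probability of engagement, 0..1
    source_lean: float       # -1.0 (one political pole) .. +1.0 (the other)

def rerank(items, user_lean, diversity_weight=0.3):
    """Score = weighted mix of engagement and distance from the user's lean.

    diversity_weight=0 reproduces a pure engagement ranking; higher
    values promote items whose source lean differs from the user's.
    """
    def score(item):
        cross_cutting_bonus = abs(item.source_lean - user_lean) / 2.0  # 0..1
        return ((1 - diversity_weight) * item.engagement_score
                + diversity_weight * cross_cutting_bonus)
    return sorted(items, key=score, reverse=True)

feed = [
    Item("Op-ed you already agree with", 0.9, -0.8),
    Item("Cross-cutting analysis", 0.6, 0.7),
    Item("Neutral explainer", 0.7, 0.0),
]
for item in rerank(feed, user_lean=-0.8, diversity_weight=0.5):
    print(item.title)
```

With diversity_weight set to zero the toy ranker simply orders the feed by engagement, which mirrors the status quo; turning the weight up lets cross-cutting items outrank the user’s usual fare. The single tunable parameter is the point: making the echo chamber a dial rather than a fixed property of the system.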
The last question Facebook must face is how transparent to be about the nuts and bolts of their actual recommendation algorithms. To this point, the details of the algorithm have been kept under wraps: “It’s the company’s unique intellectual property, a competitive advantage it can’t expose.” However, secrecy is not the only option. Facebook could use an open-source recommendation algorithm to let others peek into the logic behind what they are seeing. They could give users more agency by building tools that surface more cross-cutting content; perhaps only a few users would opt into a more diversified News Feed, but the option would at least force a conscious decision. Finally, Facebook could build and share reporting on what people consume, at both the individual (anonymized) and collective level. By taking these steps to ensure transparency at every stage of the content recommendation process, Facebook can ensure all parties have the trust and resources to consume news as they intend, while facilitating the conversation about the place of echo chambers in society.
Elisa Shearer and Jeffrey Gottfried, “News Use Across Social Media Platforms 2017”, Pew Research Center, September 7, 2017, http://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/, accessed November 2017.
Patrick Kulp, “Escape the echo chamber: How to fix your Facebook News Feed”, Mashable, November 18, 2016, http://mashable.com/2016/11/18/facebook-hacking-newsfeed-well-rounded/#fex3kZADGqq2, accessed November 2017.
Christina Bonnington, “It’s Time for More Transparency in A.I.”, Slate, October 24, 2017, http://www.slate.com/articles/technology/technology/2017/10/silicon_valley_needs_to_start_embracing_transparency_in_artificial_intelligence.html, accessed November 2017.
Eytan Bakshy, Solomon Messing, and Lada Adamic, “Exposure to ideologically diverse news and opinion on Facebook”, Science, Vol. 348, Issue 6239, p. 1130, June 5, 2015.
Kartik Hosanagar, “Blame the Echo Chamber on Facebook. But Blame Yourself, Too”, Wired, November 11, 2016, https://www.wired.com/2016/11/facebook-echo-chamber/, accessed November 2017.