Context: Digitalization of News Media
The emergence of digital technology platforms such as Google/YouTube has fractured the traditional models for creating and distributing media content. Google/YouTube is particularly influential, handling more than 70% of worldwide online search requests. Through its open platform, Google/YouTube changed the conventional rules of content production and ownership, which had long been guarded by capital-intensive, analog technologies, such as production studios and equipment, owned by a few publishing stakeholders.
The value chain of the news supply, spanning development, financing, production, sales, distribution, and consumption, was tightly controlled through the limited availability of distribution channels such as TV, radio, and other public infrastructure. Since YouTube launched, however, an average citizen can quickly create and upload content, and 400 hours of video are now uploaded to YouTube every minute, approximately 1,000 days' worth of video every hour. Yet today, Google/YouTube is increasingly battling the behemoth it has created. Two challenges ensue: 1) how to manage the abundant supply of content, and 2) what role, if any, it should play in controlling access to that content.
Grappling with the Heart of Its Success
The rapidity with which anyone can distribute and surface content, the very feature that made Google/YouTube potent, has crippled the organization's ability to manage the abundant supply of content, with debilitating consequences for society. The lowered barrier to entry gives abundant information airtime and an audience without regard for quality or veracity. Creators with equal access to the platform have discovered ways to manipulate Google/YouTube's ostensibly "neutral," personalization-driven algorithms to proliferate low-quality and false information portrayed as journalism.
In an increasingly polarized global political climate, hateful misinformation that is perceived as journalistic reporting has spread virally. The end consumers of that content unknowingly sort themselves by political ideology, exacerbating political tensions, with potentially deadly consequences. After the June 2017 terrorist attack on London Bridge, YouTube found itself in the limelight amid reports that the attackers had been radicalized by watching sectarian and hateful messages on the platform. Legislators have called for greater oversight of Google, and some political figures, such as Stephen Bannon, have called for regulating Google/YouTube and other private technology companies as public utilities.
The flaw in personalized "neutral" algorithms has undermined Google/YouTube's founding principles of openness and transparency. The implications are enormous for Google/YouTube as it aims to rival television and traditional media as a source of public information. YouTube's feeding of spurious content exposes the shortcomings of the medium despite its scale and accessibility.
Google/YouTube Fights Back
Public scrutiny from government and society has pressured Google/YouTube to act. In the short term, Google has announced an initiative called "Project Owl" to provide "algorithmic updates to surface more authoritative content" and to demote low-quality content. It has devoted more engineering resources to applying machine learning research to "train new content classifiers" that help identify and remove extremist and false content.
It has also announced a set of policies aimed at curbing misinformation and hateful extremist content. It has promised to remove videos that violate its community guidelines. As for more dubious content that does not violate the guidelines, Google will make such videos harder to surface and will demonetize them.
In the long term, Google, which has relied primarily on machine-based video analysis, will greatly increase the number of independent experts in YouTube's Trusted Flagger program. It plans to enlist experts from 63 NGOs to help determine how to categorize videos that may be politically inflammatory.
In conjunction with these actions, I urge management to work more closely with industry collaborators, including other digital technology platforms. Google/YouTube may be the principal enabler of the proliferation of misinformation, but it is part of a much larger digital ecosystem. To stem the virality of misinformation more effectively, a coalition of Facebook, Twitter, and Microsoft, together with governments worldwide, should coordinate their efforts.
While the debilitating effects of pervasive misinformation on societies are unequivocal, what is less clear is how to balance free speech against censorship and what role private companies should play. At what point does monitoring "low quality" content become a ban on free speech and the marketplace of ideas? In the wake of the white supremacist rally in Charlottesville, Virginia, tech companies, including Google/YouTube, blacklisted the neo-Nazi blog the Daily Stormer. They have become less neutral platforms and more "custodians of the public interest." But is that a role for tech companies to play? Furthermore, is it even possible to fashion a democratic social media in a highly divisive culture?
(Word Count: 793)
"Google Inc," Britannica, September 28, 2017, https://www.britannica.com/topic/Google-Inc, accessed November 15, 2017.
Mark Robertson, "500 Hours of video uploaded to YouTube every minute," Tubular Insights, November 13, 2015, http://tubularinsights.com/hours-minute-uploaded-youtube/, accessed November 15, 2017.
Jack Nicas, "YouTube cracks down on conspiracies, fake news," MarketWatch, October 5, 2017, https://www.marketwatch.com/story/youtube-cracks-down-on-conspiracies-fake-news-2017-10-05, accessed November 14, 2017.
Ben Gomes, "Our latest quality improvements for Search," Google Blog, April 25, 2017, https://blog.google/products/search/our-latest-quality-improvements-search/, accessed November 15, 2017.
Camila Schick and Stephen Castle, "'I trusted him': London Attacker was friendly with neighbors," New York Times, June 5, 2017, https://www.nytimes.com/2017/06/05/world/europe/london-attack-theresa-may.html, accessed November 15, 2017.
Daisuke Wakabayashi, "YouTube sets new policies to curb extremist videos," New York Times, June 18, 2017, https://www.nytimes.com/2017/06/18/business/youtube-terrorism.html?_r=0, accessed November 15, 2017.
Kent Walker, "Four steps we're taking today to fight terrorism online," Google Blog, June 18, 2017, https://blog.google/topics/google-europe/four-steps-were-taking-today-fight-online-terror/, accessed November 15, 2017.
Adrian Chen, "The Fake-News Fallacy," The New Yorker, September 4, 2017, https://www.newyorker.com/magazine/2017/09/04/the-fake-news-fallacy, accessed November 14, 2017.