MOOC is an acronym for Massive Open Online Course: a course aimed at unlimited participation and open access for anyone on the internet. Although variations of technology-assisted distance learning existed well before their arrival, MOOCs became the talk of the town in the fall of 2011, when Andrew Ng and Daphne Koller of Stanford University started Coursera.org, anchored by Ng’s famous Machine Learning course. Around the same time, Sebastian Thrun, who had taught the famous course titled “Introduction to AI” with Peter Norvig, started a similar company named Udacity. Right after their launch, these companies attracted a huge user base, with Coursera growing “faster than Facebook” to reach 1.7 million users in its first eighteen months.
MOOCs took the world by storm in their first two years, with aggressive growth in user base, online content, and brand-name university partnerships. All the major platforms listed courses from star professors at prestigious universities such as Stanford, MIT, and the University of Pennsylvania. There was so much buzz and media attention around the concept that people called it the long-awaited disruption of perhaps the most conservative and traditional industry in the US: higher education. The New York Times called 2012 “The Year of the MOOC” and reported that “elite universities are partnering with Coursera at a furious pace.”
However, half a decade out, the results have fallen far short of the forecasts. Student debt in the US is still rising, with colleges comfortably increasing tuition fees at several times the rate of inflation. Moreover, interest in attending conventional university campuses hasn’t shown any signs of subsiding either. The question is: why isn’t this disruptive model, which lets anyone with an internet connection take a course from a top university, giving universities a run for their money? Why won’t these slow bureaucracies in higher education join the ranks of Blockbuster or Kodak any time soon?
The answer is quite unsurprising. These platforms do provide high-quality education. However, traditional college campuses still have three distinct advantages over them that are not going to go away anytime soon. First and foremost, universities grant official degrees that are perceived as prerequisite “certificates” for jobs. These certificates convey several kinds of information to potential employers in the marketplace. For instance, they signal selectivity through the admission contests that elite universities run. For this reason, a Harvard dropout sends the same signal to the marketplace as a Harvard graduate: they were capable enough to pass the bar. This signal carries significant economic value that MOOCs, in their current structure, are not designed to provide.
Secondly, a college degree has become a prerequisite for graduate education. It wasn’t always the case: Charles Eliot, who became President of Harvard in 1869, made the bachelor’s degree a prerequisite for graduate admissions at Harvard. Other colleges followed suit, establishing the bachelor’s degree’s monopoly over the graduate-education market. Since almost all universities with graduate programs also run bachelor’s programs, they have no incentive to cede this “captive” market to MOOCs.
Thirdly, government regulation and private-sector HR practices have traditionally been structured around conventional degrees. In some professions, such as university teaching or medicine, it is illegal to work without a university degree, regardless of how capable you are. Similarly, the whole corporate recruitment machine has been structured around conventional degrees and diplomas. Companies often rely on these degrees for their “information” and “signaling” value, as described in the earlier paragraph: completing 130 credits can attest to a person’s discipline and personal effectiveness, in addition to academic ability.
So, what’s the future for MOOCs? Their main challenge is to close the “credential gap” by providing some means of attestation for their programs. Coursera has already started a “verified certificates” program. Given the vast differences in grading mechanics, the esoteric course codes on transcripts, and the relatively small class sizes (and hence small sample sizes) at traditional colleges, these digital certificates can be encoded with much richer information about how a student compares to the cohort (which happens to be very large in MOOCs). These digital credentials can also store, for years, the actual work and assignments a student has completed; employers can use that data to compare applicants in the final selection pool.
But aside from these “incremental improvements”, digital credentials can be made “machine readable”, allowing algorithms to “mine” for the candidate with the right set of credentials. This can help solve the current problems companies like Google or Microsoft face, tens of thousands of applicants for one role with no objective way of selecting who to interview.
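To make the idea concrete, here is a minimal sketch of what screening machine-readable credentials might look like. The record schema (fields such as "course", "percentile", "cohort_size", "verified") is a hypothetical format invented for illustration, not an actual Coursera or employer standard.

```python
import json

# Hypothetical machine-readable credential records. In a real system these
# might be issued and cryptographically signed by the MOOC platform.
credentials_json = """
[
  {"student": "A", "course": "Machine Learning", "percentile": 97, "cohort_size": 104000, "verified": true},
  {"student": "B", "course": "Machine Learning", "percentile": 61, "cohort_size": 104000, "verified": true},
  {"student": "C", "course": "Machine Learning", "percentile": 92, "cohort_size": 104000, "verified": false}
]
"""

def shortlist(records, course, min_percentile, require_verified=True):
    """Return students whose credential in `course` places them at or above
    `min_percentile` of their (very large) MOOC cohort."""
    return [
        r["student"]
        for r in records
        if r["course"] == course
        and r["percentile"] >= min_percentile
        and (r["verified"] or not require_verified)
    ]

records = json.loads(credentials_json)
# Screen for verified, top-decile students in one pass over the data.
print(shortlist(records, "Machine Learning", 90))
```

Because the cohort is huge, a percentile here carries far more statistical weight than a letter grade from a 30-student college seminar, which is exactly the advantage the paragraph above describes.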
Word Count: 769 (excluding sources)
Carey, Kevin. “Here’s What Will Truly Change Higher Education: Online Degrees That Are Seen as Official.” The New York Times. The New York Times, 07 Mar. 2015. Web. 18 Nov. 2016.
Friedman, Thomas L. “Revolution Hits the Universities.” The New York Times. The New York Times, 26 Jan. 2013. Web. 18 Nov. 2016.
Bothwell, Ellie. “MOOCs Can Transform Education – but Not Yet.” Times Higher Education (THE). Times Higher Education, 20 July 2016. Web. 18 Nov. 2016.
Auletta, Ken. “Get Rich U.” The New Yorker. The New Yorker, 23 Apr. 2012. Web. 18 Nov. 2016.
Pappano, Laura. “The Year of the MOOC.” The New York Times. The New York Times, 03 Nov. 2012. Web. 18 Nov. 2016.
Selingo, Jeffrey J. “Demystifying the MOOC.” The New York Times. The New York Times, 01 Nov. 2014. Web. 18 Nov. 2016.
Title Image: Till Hafenbrak