Partners Healthcare and Machine Learning: Building Efficiencies in Diagnostics and Data Interpretation

How can hospitals better utilize patient data by applying machine learning technologies to augment diagnosis and clinical decision making? Partners Healthcare has launched the Clinical Center for Data Science to lead the charge.

Imagine storing your personal savings under your bed. Most people with access to capital would prefer to earn interest in a bank account or invest their wealth to generate returns.

For years, the American healthcare system has kept medical data stashed under the bed. Instead of growing in value, most of the potential of medical data remains untapped. Because patient data is difficult to access, poorly organized, and time-consuming to mine, the ability to generate insights from it has been relatively limited.

Machine learning offers hope for better utilization of data and images stored by provider networks to improve the accuracy and efficiency of medical decision making. Medical decision making often requires analysis of many data points—including symptoms, comorbidities, risk factors, imaging, blood tests, and biopsied tissue—by several specialists. Yet, data interpretation may be limited by the expertise and experience of providers, as well as the time required for analysis and care coordination.

In 2016, Partners Healthcare established the Clinical Center for Data Science (CCDS) to develop machine learning technologies to improve diagnosis and data interpretation in healthcare. As the largest hospital system in New England, Partners stores an enormous amount of patient data in its electronic health records system. [1] As of 2017, Partners had access to two billion medical images (e.g., radiographs, CT, MRI, and PET scans) that the CCDS could use to build and validate algorithms. [2]

By April 2017, the CCDS had initiated over 20 machine learning projects. [3] In May 2017, the CCDS began a ten-year partnership with General Electric (GE) Healthcare to accelerate the development of machine learning applications for patient care, recognizing the need for multidisciplinary collaboration among physicians, researchers, data scientists, and developers. The shared vision is for Partners, GE Healthcare, and third parties to develop algorithms on an open platform and disseminate advances to other hospitals using the GE Health Cloud. [4,5] Additional collaborators include Nvidia, whose supercomputer and graphics processing units (GPUs) offer the CCDS significant computing power, and Nuance, which developed the first open platform for artificial intelligence innovations in medical imaging. [6,7]

The initial focus for the CCDS has been improving the accuracy and efficiency of diagnosis using medical images. For example, by directing a physician’s attention towards particular regions of a patient’s image, or by scoring the likelihood that an image contains an anomaly, machine learning could allow for quicker decision making in emergent scenarios where diagnosis is the bottleneck before sending a patient to the operating room. [5] Other applications in radiology include early detection of strokes, injuries, and cancer, allowing for more judicious use of physicians’ time and of diagnostic interventions such as biopsies.
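To make the anomaly-scoring idea concrete, here is a minimal illustrative sketch, not Partners or CCDS code, of how such scores might be used downstream: assuming a trained classifier already produces an anomaly likelihood per imaging study, a worklist can be triaged so that the most suspicious studies are read first. The study IDs, scores, and `triage` function are all hypothetical.

```python
# Illustrative sketch (hypothetical, not Partners/CCDS code): triage a
# radiology worklist by a model's anomaly-likelihood score so that the
# studies most likely to contain an anomaly are read first. The score
# lookup stands in for a trained image classifier's output.
import heapq
from dataclasses import dataclass, field


@dataclass(order=True)
class Study:
    priority: float                      # negated score: heapq is a min-heap
    study_id: str = field(compare=False)  # excluded from ordering


def triage(studies, anomaly_score):
    """Return study IDs ordered from most to least likely anomalous.

    `anomaly_score` is assumed to map a study ID to a probability in
    [0, 1]; here it is a placeholder for model output.
    """
    heap = [Study(-anomaly_score(s), s) for s in studies]
    heapq.heapify(heap)
    return [heapq.heappop(heap).study_id for _ in range(len(heap))]


# Placeholder scores standing in for classifier output.
scores = {"ct-001": 0.12, "ct-002": 0.91, "ct-003": 0.48}
ordered = triage(list(scores), scores.get)
print(ordered)  # the highest-scoring study comes first
```

The heap keeps reordering cheap as new studies stream into the queue; a one-off batch could instead simply sort by score.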

The diagnostic medical specialties that require interpretation of images, such as radiology and pathology, are likely to be the first specialties where machine learning will augment and expand clinical capacity. However, these specialties may serve as a pilot for others, as artificial intelligence begins to increase the speed and accuracy of diagnosis and management across diseases and patient populations. In the coming years, Partners aims to develop algorithms that can augment medical specialties beyond radiology, as well as organizational operations. For example, with an eye towards cancer treatment, CCDS is considering how this technology can help track tumor response to therapy. [5]

In developing future machine learning applications, Partners should consider other rich data sources beyond images, such as the patient notes that are written by providers after each clinical encounter. Moving forward, Partners will need to determine how to organize its wealth of data to select appropriate inputs and adequately validate outputs to derive actionable insights. Another challenge will be smoothly integrating these technologies into clinic flow to augment rather than disrupt patient care. Partners should also consider how to help community hospitals and clinics with fewer resources deploy these technologies to minimize disparities.

A number of other vital questions remain. How will these new technologies garner trust among providers and patients, as well as adoption among healthcare organizations? How will providers be educated on how the outputs are derived, so as to appropriately interpret the results? What safety measures and patient privacy regulations will need to be instituted? How can these approaches be distributed to improve global access to care, and how fast should they be scaled? Will these technologies improve access to care and lower costs, or like other approaches to automated image interpretation, will they fail to significantly move the needle?

Pending resolution of these uncertainties, machine learning technology will hopefully offer deeper insight into high risk or at-risk patients, increase efficiency and productivity, allow for better allocation of human capital, increase opportunities for global access to care, and decrease errors by creating a safety net in medical decision making.


[1] “About Partners Healthcare.” https://innovation.partners.org/about/about-partners-healthcare, accessed November 2018.

[2] Tomsho R. “Artificial Intelligence Expert Sees Healthcare Impact.” Mass General Magazine, 2017. https://giving.massgeneral.org/artificial-intelligence-healthcare-impact/, accessed November 2018.

[3] “Bringing the latest advances in artificial intelligence to patient care.” Mass General News, April 7, 2017. https://www.massgeneral.org/News/newsarticle.aspx?id=6264, accessed November 2018.

[4] “A.I. and GE – The disruption accelerates.” Partners Healthcare Innovation, News: Summer 2017, July 11, 2017. https://innovation.partners.org/summer-2017/ge-disruption-accelerates, accessed November 2018.

[5] “The team behind the future of AI in healthcare.” GE Healthcare: The Pulse on Health, Science & Tech, May 18, 2017. http://newsroom.gehealthcare.com/the-team-behind-the-future-of-ai-in-healthcare/, accessed November 2018.

[6] Davenport TH and Bean R. “Revolutionizing Radiology with Deep Learning at Partners Healthcare – and Many Others.” Forbes, November 5, 2017. https://www.forbes.com/sites/tomdavenport/2017/11/05/revolutionizing-radiology-with-deep-learning-at-partners-healthcare-and-many-others/#6e3d04255e13, accessed November 2018.

[7] “Nuance and Partners HealthCare Collaborate to Accelerate Widespread Development, Deployment and Adoption of AI Applications for Diagnostic Imaging.” Globe Newswire, March 5, 2018. https://globenewswire.com/news-release/2018/03/05/1415228/0/en/Nuance-and-Partners-HealthCare-Collaborate-to-Accelerate-Widespread-Development-Deployment-and-Adoption-of-AI-Applications-for-Diagnostic-Imaging.html, accessed November 2018.


3 thoughts on “Partners Healthcare and Machine Learning: Building Efficiencies in Diagnostics and Data Interpretation”

  1. CDSS has massive promise to improve access to and quality of care, especially in countries with fewer experienced and qualified physicians per capita. Having invested in and sat on the boards of multiple CDSS projects, I can attest that interoperability (i.e., the inability to scale the technology across a network of hospitals that each have a unique IT system) is the bottleneck to scale. Sadly, it’s the smaller, resource-constrained hospitals (which arguably would benefit the most) that today lack the resources for integrating these IT solutions into their workflow. Unless we move to a truly standardised HCIT protocol, I doubt that this is going to change.
    The regulatory approval pathway and business models are still nascent. The initial FDA guidance paper showed a lack of appreciation and understanding of CDSS and its role in clinical workflow. I expect many iterations in regulation over the coming years. The lack of a clear regulatory pathway and the aforementioned scalability issues have held companies and investors back from CDSS despite its great promise. I do think that that’s changing in the right direction, and Partners, as well as a couple of European hospitals, are really leading the way, since they have the IT infrastructure, sponsorship, and academic resources necessary for these projects to happen.
    As for natural language processing of EHR notes: it’s done already in some hospitals, but radiologists’ scribing styles can vary widely across hospitals, and therefore NLP models may not be as straightforward to transfer as purely image- and demographics-based systems. Even image-based systems may not be: I’ve talked to some physicians who concur that phenotypic differences across regions might decrease the accuracy of a CDSS that’s trained on a single hospital’s data. I know that there have been efforts across the board to test model transfer across sites and geographies, and I’d be watching this space to see how it pans out.

  2. I agree with the previous comment. The facilities that stand to gain the most from CDSS and similar models are those that are likely under-resourced. The representativeness of training data across patient populations is a huge and interesting issue in healthcare IT. Particularly in the case of rare disease, individual health systems may not see enough patients with a specific condition to accurately train a model to identify the condition and its variants or to recommend specific care pathways.

    Another question raised by this article, and by machine learning applications in healthcare more broadly, is: how good does a model need to be in order to be ‘good enough’ to trust its results? There’s a general public perception (evident in the backlash to relatively rare accidents caused by autonomous vehicles) that we expect algorithms to be perfect when human life is involved. In reality, models should only need to be better than the alternative of MD diagnosis before we adopt them. I personally feel the application of these types of technologies in the next 5-10 years is more likely to be augmented decision making, with machine learning outputs interpreted by providers, rather than purely prescriptive AI in the absence of MD oversight.

  3. I loved this article and feel motivated that Partners, a gold standard in the healthcare industry, is being so forward-leaning with regards to utilizing healthcare data, machine learning technology, and collaborative innovation. Hopefully other hospital systems will follow suit. I would be excited to see whether they are considering partnering with payor entities to add claims data to their treasure trove of images, patient notes, and other data contained within the EHR. I wonder if this is the key to empowering healthcare providers to truly leverage machine learning to diagnose, risk stratify, and even predict potential healthcare outcomes before they happen. Is actual progress limited by the lack of interoperability amidst EHRs and healthcare data sources more broadly? What does it say about our healthcare system that it is faster to print out a patient’s file and walk it across the street to a neighboring hospital than to try to send it electronically?

    Like the author, I am excited to see that Partners is starting somewhere with regards to taking advantage of its “savings under the bed,” and I think that progress, even if initially limited to the field of radiology and the evaluation of x-rays, is a much-needed win for high tech within the US healthcare system.
