Deep learning’s recent successes have mostly relied on exploiting the fundamental statistical properties of image, sound, and video data. In addition, for computers to learn well, there must be sufficient data. Diagnostic radiology, which relies on radiologists to read images (X-ray, CT, MRI, etc.), seems like a perfect testing ground: in 2015 alone, roughly 800 million multi-slice exams were performed in the US, generating roughly 60 billion medical images1. In addition, each report comes with radiologists’ opinions and formal diagnoses. In 2015, IBM purchased 30 billion medical images for Watson to “look at”2. While IBM is on its way to developing the fastest and most powerful radiologist ever, deep learning has already delivered results in another field that relies heavily on image recognition. Earlier this year, computer scientists and physicians at Stanford University teamed up to train a deep learning algorithm on 130,000 images of 2,000 skin diseases. The resulting program performed just as well as 21 board-certified dermatologists at picking out deadly skin lesions3.
These results make AI directly relevant to medical professionals, especially radiologists. But what does the diagnostic radiology workflow currently look like in the hospital, and what changes will AI bring to the system?
Figure 1. Current radiology workflow
Figure 1 shows the current workflow of a radiologist. First, a patient arrives at the hospital, and the primary provider (emergency room physician, hospitalist, etc.) decides that an imaging study is warranted. The image is then obtained and sent digitally to the radiologist’s reading room, where the radiologist reads the images and makes a diagnosis. The results are sent back to the primary provider, who makes decisions accordingly. In this system, the radiologist is often the bottleneck, as there are usually many more primary providers than on-call radiologists.
Figure 2a. First step of AI integration: Radiologist as a user of AI
Figure 2a shows the likely first step of AI integration into radiologists’ workflow. In most hospitals, the radiology division is made up of different anatomical departments: chest, abdomen, neuroradiology, etc. Although many medical imaging AI companies are developing general-purpose solutions, more and more are specializing in particular areas (e.g. Imbio for lung imaging). In addition, some pathologies are notoriously difficult for radiologists to track on images but possibly simple for AI, such as lung nodules. The logical first step of AI integration is to let radiologists use AI to solve these difficult tasks so they can be more efficient. This will reduce congestion at the bottleneck and shorten the time it takes providers to receive a decision.
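To make the Figure 2a arrangement concrete, here is a minimal sketch of an AI pre-screening step feeding a radiologist’s worklist. All names (`Study`, `ai_prescreen`, `build_worklist`) are hypothetical illustrations, not a real PACS or vendor API; a real system would return detector coordinates and confidence scores rather than a text flag.

```python
# Hypothetical sketch of Figure 2a: AI as a tool that pre-screens studies
# for findings that are tedious for humans (e.g. lung nodules), so the
# radiologist reads a prioritized, annotated worklist. Names are illustrative.

from dataclasses import dataclass, field

@dataclass
class Study:
    study_id: str
    body_part: str          # e.g. "chest", "abdomen"
    ai_findings: list = field(default_factory=list)

def ai_prescreen(study: Study) -> Study:
    """Stand-in for a specialized model (e.g. a lung-nodule detector)."""
    if study.body_part == "chest":
        # A real detector would attach coordinates and confidence scores.
        study.ai_findings.append("possible lung nodule (flagged for review)")
    return study

def build_worklist(studies):
    """Surface flagged studies first, easing the radiologist bottleneck."""
    screened = [ai_prescreen(s) for s in studies]
    return sorted(screened, key=lambda s: len(s.ai_findings), reverse=True)

worklist = build_worklist([
    Study("S1", "abdomen"),
    Study("S2", "chest"),
])
print([s.study_id for s in worklist])  # flagged chest study first
```

The key design point is that the radiologist remains the decision-maker: the AI only reorders and annotates the queue.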
Figure 2b. Second step of AI integration: Radiologist as a supervisor of AI
Figure 2b reflects a possible second step of AI integration. As AI continues to improve, it may surpass radiologists in most aspects of image reading. However, radiologists might still be superior in complex or ambiguous cases. For example, in the hospital, radiologists often call the image-ordering provider to clarify certain aspects of patient history in order to narrow down the differential diagnoses. Therefore, a subset of images read by AI may still require human supervision.
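The routing logic in Figure 2b can be sketched as a simple escalation rule: the AI finalizes clear-cut reads and hands ambiguous ones to the supervising radiologist. The threshold value and the inputs (`ai_confidence`, `history_complete`) are illustrative assumptions, not parameters from any deployed system.

```python
# Hypothetical sketch of Figure 2b: AI reads every study, but a subset is
# routed to the supervising radiologist when the model's confidence is low
# or the case needs clinical context (e.g. incomplete patient history).
# The threshold is an illustrative assumption.

CONFIDENCE_THRESHOLD = 0.9

def route_study(ai_confidence: float, history_complete: bool) -> str:
    """Return who finalizes the read: 'ai' or 'radiologist'."""
    if ai_confidence >= CONFIDENCE_THRESHOLD and history_complete:
        return "ai"
    # Ambiguous images or missing history escalate to human supervision,
    # mirroring how radiologists today call the ordering provider to
    # narrow the differential diagnoses.
    return "radiologist"

print(route_study(0.97, True))    # clear case, complete history -> ai
print(route_study(0.97, False))  # missing history -> radiologist
print(route_study(0.60, True))   # low confidence -> radiologist
```

Under this arrangement the radiologist’s workload shrinks to the escalated subset, which is what makes the role supervisory rather than primary.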
Figure 2c. Last step of AI integration: Radiologist now only doing procedures, independent of AI
Figure 2c reflects what is possibly the ultimate workflow, effectively eliminating the need for radiologists under the current architecture: AI runs the department of diagnostic radiology. It is important to point out, however, that in most institutions today, diagnostic radiologists also perform image-guided procedures such as biopsies, injections, and drainage. These procedures are not as easily replaced by AI. Radiologists may therefore become interventionalists as their diagnostic reading work is taken over by AI.
Finally, it is important to do a sanity check of AI integration. How comfortable will patients be having their images read by a computer, even at the same accuracy? How will primary providers interact with AI? In addition, we have to consider how the legal questions of AI integration will play out: can AI be sued if it misses a diagnosis?