In this briefing:
- RSNA launches a new AI journal.
- ACR makes several moves to advance the use of artificial intelligence in the future of medical imaging.
- Researchers publish, in a high-impact journal, the successful use of AI to assess breast density.
- ACR and MICCAI sign agreement to advance AI in medical imaging.
- Geisinger reports successful implementation of an algorithm that improves time-to-diagnosis of intracranial hemorrhage 20-fold.
Stay up to speed in 2 minutes. Radiology AI Briefing is a semi-regular series of blog posts featuring hand-picked news stories and summaries on machine learning and data science.
Several days ago, Rear Adm. Ronny Jackson confirmed he will drop out of the confirmation process for the Veterans Affairs secretary position after Donald Trump fired the then-Secretary David Shulkin last month.
Whoever ends up leading the healthcare of America's veterans bears a heavy burden, and a careful selection, approval, and confirmation process is warranted.
Use the AI Canvas.
Source: A Simple Tool to Start Making Decisions with the Help of AI
In a recent Harvard Business Review article, Ajay Agrawal and coauthors shared a simple tool for thinking about how an AI system might be deployed. Although the tool is no more complex than six boxes of free text, it follows a number of best practices for thinking about data and machine learning in general:
- Always define an end goal – what is the desired outcome?
- Form a hypothesis about what may drive that desired outcome.
- Determine how to present the ML prediction in a way that drives action, not just the data itself.
- Include a feedback mechanism in your data acquisition strategy.
For example, this is how one might fill out the AI Canvas tool in a radiology use case:
- Prediction: Predict whether a brain MRI for a cancer patient shows new or increasing hydrocephalus
- Action: Label the examination as critical, and denote that AI has determined a critical finding. For example, create an “AI-STAT” category on worklist priority.
- Judgment: Compare the cost of interpreting this brain MRI at its usual turnaround time, versus immediately.
- Outcome: Observe whether the action taken in response to a study labeled AI-STAT is correct.
- Input: New brain MRIs as they are performed, and their prior studies.
- Training: Historical brain MRIs
- Feedback: Identify false positives – perhaps the prior study was from 20 years ago, or there has been surgical resection, so that ex vacuo dilation of the ventricles is not hydrocephalus; perhaps there has been recent surgery. Identify false negatives – subtle enlargement of the temporal horns missed by the AI. Use this information to improve the AI.
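At its core, the canvas is just seven labeled free-text boxes. A minimal sketch in Python, assuming nothing beyond the seven labels above (the class name and example text are illustrative, not from the HBR article):

```python
# A minimal AI Canvas: seven free-text boxes, modeled as a small dataclass.
from dataclasses import dataclass, fields


@dataclass
class AICanvas:
    prediction: str
    judgment: str
    action: str
    outcome: str
    input: str
    training: str
    feedback: str

    def summary(self) -> str:
        # Render each box as "LABEL: text", one per line.
        return "\n".join(
            f"{f.name.upper()}: {getattr(self, f.name)}" for f in fields(self)
        )


# The hydrocephalus use case from the bullets above, condensed.
hydrocephalus = AICanvas(
    prediction="New or increasing hydrocephalus on a brain MRI",
    judgment="Cost of usual vs. immediate turnaround",
    action="Label the exam AI-STAT on the worklist",
    outcome="Was the AI-STAT label correct?",
    input="New brain MRIs and their prior studies",
    training="Historical brain MRIs",
    feedback="Review false positives and false negatives",
)
print(hydrocephalus.summary())
```

Filling out one of these per candidate use case is a cheap way to compare ideas before any data work begins.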
How might you use this worksheet to brainstorm AI for your radiology practice?
There’s an origin story for every superhero; even those without superpowers (like Batman – that’s right) got started somewhere. What we sometimes forget is that there is also an origin story for every regular person, every profession, every hobby.
Source: Wikimedia Commons
If you’re a radiologist looking to learn a few things in radiology data science, a simple web search will reveal a seemingly overwhelming amount of material you might have to know.
Fortunately, only a very small subset is necessary to start being productive. Here are a few resources I used to get started.
A well-written framework on Stanford Social Innovation Review describes three distinct forces of transforming a practice.
An agitator brings the grievances of specific individuals or groups to the forefront of public awareness. An innovator creates an actionable solution to address these grievances. And an orchestrator coordinates action across groups, organizations, and sectors to scale the proposed solution.
The key observation is that transformation requires all three in harmony. In medicine, the voices of agitators frequently meet top-down repression or the silence of leadership. “This is just the way we’ve always done it,” they might say.
The Stanford article focuses on building a team consisting of people in all three domains in order to bring about social innovation. In medicine, practices tend to be resistant to change partly due to the higher stakes but also due to the highly regulated climate of modern health care. (This is not necessarily good or bad – it just is.)
Although medicine often places more weight on orchestration – coordination of interdisciplinary care to benefit patient health – it stands to reason that a healthy dose of the other two is also necessary. If you see yourself as an agitator, know that a thorough understanding of stakeholder analysis can help you better differentiate between a simple inconvenience and an opportunity to create value. If you are an innovator, your strength may lie in an intuitive visualization of connections between disparate organizational units. Know that what seems obvious to you is probably opaque to others. In the end:
Agitation without innovation means complaints without ways forward, and innovation without orchestration means ideas without impact.
Artificial intelligence is the hottest topic in medical informatics. The promises of an intelligent automation in medicine’s future are equal parts optimism, hype, and fear.
In this post, Mike Hearn struggles to reconcile the paradox surrounding the supposedly objective, data-driven approaches to AI and the incredibly opinion-charged, ultra-political world from which AI draws its data source.
The post focuses on broader applications, but in medicine, a similar problem exists. If AI is expected to extract insight from the text of original research articles, statistical analyses, and systematic reviews, its “insights” are marred by human biases.
The difference, of course, is that AI may bury such biases inside a machine learning black box. We have a growing body of research on latent human biases, but machine biases are much harder to discover, particularly when they reflect the inherent biases of the data from which the conclusions are drawn. Our own biases.
AI acts as a mirror. Sometimes we don’t like the face staring back at us.
Source: The most dangerous AI – Mike’s blog
This month’s Harvard Business Review has an article highlighting one of the most fascinating emerging trends in quality improvement: the idea that a single “root cause” exists may be a myth. As healthcare QI/QA moves toward eliminating errors and improving metric-based performance, the growing obsession with solving quality problems is laudable but sometimes misguided.
This excellent HBR article focuses on reframing. In short, what you say after encountering a complex problem matters. Before saying “Let’s make a Pareto chart and collect some data!” try inserting a 30-second pause: “Is this the right problem to solve?”
Without spoiling the fun of reading the article, try thinking through this issue before reading – You have received multiple complaints about the speed of your building’s elevators. How would you address this problem?
In fact, the very idea that a single root problem exists may be misleading; problems are typically multicausal and can be addressed in many ways.
Source: Are You Solving the Right Problems?
If you have been paying attention to data science in healthcare, you will have noticed the gradual shift from 2016’s Big Data to 2017’s Machine Learning. Specifically, deep learning techniques attract much of the attention. The FDA recently approved the use of deep learning techniques in cardiac diagnosis. Enlitic promises to automate radiologic diagnosis for medical imaging. And with the advent of wearables, there is an ever-increasing volume of health data that requires “smart” algorithms to separate signal from noise.
A Quora answer/article about data science.
Incidentally, the same five skills are also highly relevant to being a physician-informatician, particularly in radiology. Give it a read.
Source: William Chen’s answer to What are the top 5 skills needed to become a data scientist? – Quora
There is something strangely satisfying about taking things apart and putting them back together. Inspired by the Lego sets of our childhoods, Minecraft brought this sense of wonder to video games.
For those of us who are lifelong tinkerers who happen to be radiologists, I published a DIY on Radiology Data Quest on how to take DICOM apart and manipulate it. All in Python, no less.
DICOM is a pain in the neck. It also happens to be very helpful. As clinical radiologists, we expect post-processing and even take it for granted. However, the magic that occurs behind the scenes…
Source: DICOM Processing and Segmentation in Python – Radiology Data Quest
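The linked post walks through the full pipeline; to give a flavor of the “magic,” here is a minimal sketch of one core post-processing step – converting stored CT pixel values to Hounsfield units via the DICOM linear rescale. The values below are synthetic; in practice the slope and intercept come from the RescaleSlope (0028,1053) and RescaleIntercept (0028,1052) tags, typically read with a library such as pydicom:

```python
import numpy as np


def to_hounsfield(stored: np.ndarray, slope: float, intercept: float) -> np.ndarray:
    """Apply the DICOM linear rescale: HU = stored * RescaleSlope + RescaleIntercept."""
    return stored.astype(np.float64) * slope + intercept


# Synthetic 2x2 array of stored pixel values with typical CT rescale parameters.
stored = np.array([[0, 1024], [2048, 3000]])
hu = to_hounsfield(stored, slope=1.0, intercept=-1024.0)
print(hu)  # air is about -1024 HU, water about 0 HU
```

From there, segmentation becomes ordinary array manipulation – for example, thresholding the HU array to isolate bone or lung.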