AI is Transforming Healthcare, but Cost, Quality, and Access Are So Much More

“Because of its unique combination of data, science, and machines and the interactions with patients’ lives, innovation in the industry requires much more than cool new gadgets or one-off apps that don’t get to the heart of challenges around cost, quality, and access.”

Source: Doctors, data, and diseases: How AI is transforming health care | VentureBeat | by Charles Koontz, GE Healthcare

Data Science and Machine Learning Developments in 2016 – Radiology Data Quest

    As we welcome 2017, it’s time to sum up the key developments in data science and machine learning from the past year so that we can open our eyes to the new year. Here is what you missed …

Source: Data Science and Machine Learning Developments in 2016 – Radiology Data Quest

Innovating in a large health system

One would think that resource-rich organizations are able to foster new ideas better than poor, cash-constrained startups.

However, it is remarkably difficult to innovate within a large health system on an ad hoc basis, for the same reason that it is difficult to innovate in a large corporation.  For one, it’s all too easy to feel like a cog in a large machine.  Fear of failure, perceived lack of reward, and a paucity of institutional support are other reasons why innovation stagnates in otherwise resource-rich organizations.

But little-fish-big-pond problems are not the only ones that plague innovation.  This phenomenon is well recognized as one of the key reasons why disruptive innovations are notoriously difficult to launch from within a corporation.

If you feel this way, you may be an “intrapreneur.”

Continue reading

Check your assumptions with two types of MVP

In an MVP-based approach, flagship design begins with a paper boat. And a mouse.

The concept of MVP applies to research, quality improvement, or innovation projects.   In this case, MVP is not “most valuable player” (although it could make you one) but “minimum viable product.” In a nutshell, rather than the traditional approach of building only after deliberate planning and careful design, the MVP concept focuses on a just-enough set of features.  At first glance, it may seem counter-intuitive: shouldn’t the greatest success come after long-term project planning and thorough discussion of all possible outcomes?

Initially, MVP was developed for anemic startup companies to get a quick infusion of revenue before developing their flagship offering.  It was soon realized that this process of small-target rapid iteration yields not just faster but also better results.  Gantt charts, project timelines, and table-pounding meetings are still important, but real-life experimentation is a higher priority.

When Innovation and Improvement Collide

MVP is an extension of Plan-Do-Study-Act (PDSA). It makes one assumption: “all assumptions are more likely wrong than right.”  In designing a medical research proposal, you have implicitly made assumptions about some aspect of a disease’s biology. In creating a product, you inevitably face the need to make assumptions about customer needs.  In starting a business, you have made assumptions about the market you serve, or even about the fundamental design of your offering.

If these assumptions are more likely wrong than right, then the best next step must be to make as few as possible before having the opportunity to test them.  The MVP approach asks innovators to encapsulate as few concepts as possible into a deliverable and then bring that product to a test market to verify those assumptions.  Since outcome metrics can be very difficult or expensive to obtain – just ask people who run Phase III trials or market research – limiting variables allows you to be sure that acquired data only have a small number of possible interpretations.

Two Ways of Learning from an MVP

Some approaches to software design embrace the MVP concept, one of the better known being Scrum.  Another product-oriented approach is called pretotyping (not to be confused with prototyping) – “faking” as much as possible with the goal of acquiring data before making heavy financial investments.

The venerable Harvard Business Review has more – there are two types of MVPs.  Your MVP can be validating – testing an inferior product to prove a concept.  It can also be invalidating, where the MVP is actually a better product than the one you plan to create.

If your MVP is a worse product than your imagined final version, success validates your idea; failure, on the other hand, doesn’t necessarily invalidate it. If your MVP offers a better experience, then failure invalidates your business model; success doesn’t necessarily validate it.

Hypothetically, if your radiology department is contemplating an investment of $3 million over 5 years on a “virtual radiology consultation” technology to improve communication, the rationale for the purchase may be that busy radiologists cannot satisfy the high clinician demand for collegial discussions, and that live digital discussions would solve that problem.

To test this assumption, you could deploy an invalidating MVP.  For instance, this may take the form of a one-week trial of live, in-person radiologist consultations for all clinician questions.

This solution is obviously a huge waste of valuable radiology resources and unsustainable over time.  But failure would invalidate one key assumption behind the intended purchase.  Even if successful, it may raise important points to resolve: is subspecialist availability necessary?  Does the consultation need to be 24/7 or only during key hours?  The virtual solution might still work, but it would work for reasons other than the ones you assumed, and you would know that at least one of the underlying assumptions needs reassessment.
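
To make this decision logic concrete, here is a minimal Python sketch (my own illustration, not code from the HBR piece) that maps an MVP outcome to what it actually tells you:

    # Minimal sketch of the validating/invalidating MVP logic described above.
    def interpret_mvp(mvp_better_than_final: bool, mvp_succeeded: bool) -> str:
        """Map an MVP outcome to what it says about the underlying idea."""
        if not mvp_better_than_final:
            # Validating MVP: a deliberately inferior version of the idea.
            return ("idea validated" if mvp_succeeded
                    else "inconclusive: an inferior product failing proves little")
        # Invalidating MVP: a deliberately superior version of the idea.
        return ("key assumption invalidated" if not mvp_succeeded
                else "inconclusive: a superior product succeeding proves little")

    # Hypothetical radiology example: live in-person consultations (a better
    # experience than the planned virtual tool) that clinicians still did not use.
    print(interpret_mvp(mvp_better_than_final=True, mvp_succeeded=False))
    # -> key assumption invalidated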

John Donahoe: Dump the Myth of the High Achiever

A gifted young baseball player played Little League through college ball, hitting an average of nine out of 10 pitches. [But] hitting .900 in the Major League is impossible. The best baseball players […] miss two out of three swings. They sometimes strike out in a crucial moment, drop balls, make bad plays, and disappoint their fans.

The difference […] is that world-class baseball players wake up every game day ready to swing the bat.

Source: John Donahoe: Dump the Myth of the High Achiever | Stanford Graduate School of Business

Informatics Sessions at RSNA 2016 You Don’t Want to Miss

The Radiological Society of North America (RSNA) Annual Meeting is a place to expand your knowledge base, both by taking a deeper dive into your core interest and by getting your feet wet with a few new skills.

If informatics is something you’ve been interested in but need a good way to get started, then the RSNA offers some solid opportunities for beginners. Continue reading

Creating a RSNA Word Cloud – Radiology Data Quest

The RSNA conference will take place in Chicago in 1 month. If you’ve already started looking at the meeting program, you might get the same sense of excitement as the rest of us – this …

Source: Creating a RSNA Word Cloud – Radiology Data Quest

When Gut Instinct and Logic Work Together

After the recent presidential election, you are probably either particularly alarmed or especially excited about the outcome.  Regardless of your particular political predilection, it is fair to say that this election turned data science on its head, given how many forecasters got so much wrong.

In an earlier issue of Harvard Business Review, the venerable magazine shared a piece of research from the University of Southern California on forecasting: when forecasting sales, the best estimators use a combination of intuition and logic, with both the logic-heavy and the intuition-heavy forecasters performing less accurately.
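
As a toy illustration (my own sketch, not the study’s actual method), combining the two can be as simple as a weighted blend of a model’s forecast and an expert’s gut estimate:

    # Toy sketch: blend a data-driven forecast with an intuitive one.
    def blended_forecast(model_estimate: float, gut_estimate: float,
                         weight_on_model: float = 0.5) -> float:
        """Weighted average of a statistical forecast and an expert's intuition."""
        return weight_on_model * model_estimate + (1 - weight_on_model) * gut_estimate

    # Hypothetical example: a regression predicts 120 CT studies tomorrow, but the
    # scheduler, expecting a holiday lull, guesses closer to 90.
    print(blended_forecast(120, 90))  # -> 105.0

The point is not the arithmetic but the habit: neither signal is discarded.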

In the age of artificial intelligence and big data, it can be sobering to realize that despite the staggering volume of data we are now collecting, ignoring your gut instincts can take a heavy toll on your decision-making abilities.


Source: “What type of forecaster are you?” Harvard Business Review (March): 26.


6 Elements of a Data-Driven Informatics Solution (2/3)

The first part of this series discussed the heterogeneity of data projects and how a uniform approach can help home in on a solution.  That post also covered the first two elements: refining the question and identifying the right data.  Here we tackle the next two elements.


Plan Your Approach

At this step, we begin to go into the technicalities of data science. This post is not designed to go into the details of each approach, but it will attempt to ask the relevant questions.

How will you process the data that you now possess? In almost all cases, this step will involve data wrangling (also known as data munging or data cleaning). To determine the “clean” form your data must take for proper analysis, it is important to identify the transformations and algorithms your question requires; a minimal wrangling sketch follows below. Continue reading
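
As an illustration of typical wrangling steps, here is a minimal Python/pandas sketch; the file name and column names (radiology_reports.csv, exam_date, accession_number, modality) are hypothetical stand-ins, not data from the original post:

    import pandas as pd

    # Minimal data-wrangling sketch; file and column names are hypothetical.
    df = pd.read_csv("radiology_reports.csv")

    # Normalize column names so downstream code is predictable.
    df.columns = df.columns.str.strip().str.lower().str.replace(" ", "_")

    # Parse dates stored as text; unparseable entries become NaT.
    df["exam_date"] = pd.to_datetime(df["exam_date"], errors="coerce")

    # Drop exact duplicates and rows missing the key identifier.
    df = df.drop_duplicates().dropna(subset=["accession_number"])

    # Standardize a free-text categorical field.
    df["modality"] = df["modality"].str.upper().str.strip()

    print(df.dtypes)

Each of these steps encodes a decision about what “clean” means for your particular question.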

6 Elements of a Data-Driven Informatics Solution (1/3)

Big Data has become a radiology buzzword (machine learning, AI, and disruptive innovation are also up there).

However, there is a real problem with using the term Big Data – it isn’t just one set of data problems.  Big Data is a conglomerate of different data challenges: volume, heterogeneity, and velocity of data are all important dimensions.  Machine learning and the internet of things are other layers superimposed on the big data problem.

Sometimes it is helpful to step back and approach data problems with a common framework – a way to think about which facets of data science fit into a real-life workflow, and how, in the face of an actual problem.

Below is a 6-element framework that helps me think about data-driven informatics problems. The elements are generally in chronological order, but they are not “steps,” because you will frequently find yourself going back and redefining many things.  However, the framework helps you maintain a big-picture outlook. It is also why any sufficiently complex data problem requires a team approach. Continue reading