The first part of this series discussed the heterogeneity of data projects and how a uniform approach can help home in on the solution. That post also covered the first two elements: refining the question and identifying the right data. Here we tackle the next two elements.
Plan Your Approach
At this step, we begin to get into the technicalities of data science. This post is not designed to detail each approach, but it will ask the relevant questions.
How will you process the data that you now possess? In almost all cases, this step will involve data wrangling (also known as data munging or data cleaning). To determine the “clean” form your data must take for proper analysis, first determine the transformations and algorithms your question requires. Continue reading
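As a minimal sketch of what that wrangling step can look like, here is a hedged example using pandas. The DataFrame, column names (`unit`, `age`), and the specific problems (inconsistent casing, missing values, numbers stored as strings) are all hypothetical, chosen only to illustrate common transformations; a real project would derive its cleaning rules from the question being asked.

```python
import pandas as pd

# Hypothetical raw data with common wrangling problems:
# inconsistent casing, missing values, and numbers stored as strings.
raw = pd.DataFrame({
    "unit": ["ward a", "Ward A", "ward b", None],
    "age": ["34", "41", "n/a", "29"],
})

# Normalize text categories to a consistent form.
clean = raw.assign(unit=raw["unit"].str.strip().str.title())

# Coerce numeric fields; unparseable entries ("n/a") become NaN.
clean["age"] = pd.to_numeric(clean["age"], errors="coerce")

# Drop rows missing fields essential to the analysis.
clean = clean.dropna(subset=["unit", "age"]).reset_index(drop=True)

print(clean)
```

Each step encodes a decision about what “clean” means for this particular question; a different analysis might keep the unparseable ages rather than drop them.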
Not long ago, the ability to create smart data visualizations, or dataviz, was a nice-to-have skill. For the most part, it benefited design- and data-minded managers who made a deliberate decision to invest in acquiring it. That’s changed. Now visual communication is a must-have skill for all managers, because more and more often, it’s the only way to make sense of the work they do.
A June 2016 Harvard Business Review article by Scott Berinato discusses the four types of data visualization, in HBR’s traditional “boil complex stuff down to a 2×2 matrix” method no less. In short, what works depends on the level of detail necessary to convey the purpose.
Two axes of data visualization – what works best depends on the purpose
The overall concepts are reminiscent of those of Edward Tufte and his many excellent books on visualization.
The HBR article is worth a read for anyone interested in business intelligence, data analytics, or data visualization (which, as Berinato says, is probably a misnomer – it’s not the visualization that matters, but the question it seeks to answer).
Last October, my team started working on a project to bridge the communication gaps between inpatient general medicine and radiology. Despite having done a full year of internship before starting residency, we quickly realized that as radiologists we knew very little about how healthcare is delivered on the wards. Understanding how well the imaging workflow runs from ordering to reporting, and identifying possible delays by systematically analyzing patient data, seemed straightforward.
Hypothesized imaging workflow for admitted medicine patients. Source: post author
A 2-hour meeting, eight weeks of delay, and several email exchanges later, we now rely mostly on manual data collection. This blog post is about what happened. Continue reading
This recent Deloitte healthcare analytics report focuses on big data. It aims to identify the current state of data analytics as well as emerging trends in healthcare.
The 30-Second Recap?
Most organizations believe data analytics is important, but less than half have a clear strategy for approaching it. Few (5 of 50 surveyed organizations) anticipate an increase in budget. Those who did invest describe the most important drivers as improving clinical outcomes, delivering value-based care, and reducing operating costs.
And then there’s my favorite figure, reproduced from the Deloitte report: Continue reading
In a way, healthcare has been at the forefront of universal connectivity with common objects. In the world of Big Data, healthcare is now uniquely positioned to take the next step.
A few years ago, I needed hand surgery. Shortly after I checked in to the outpatient surgery department, a helpful nurse attached EKG leads to my arms and chest, and a pulse oximeter to my finger. The monitor next to my bed flickered and came to life. Then, colorful telemetric and oximetric tracings appeared as an exact copy on a nursing station computer. A record in the hospital intranet traced my wellbeing over time. Wireless connectivity gave an extra pair of eyes to watch me and ensure that aberrant flickers did not go unnoticed… Continue reading
This article originally appeared in American Journal of Managed Care.
Data is the results section of a scientific paper.
Data is a graph on the dashboard.
Data is a powerful motivator when it puts what we already know about ourselves in numbers.
Data is necessarily biased because it cannot exist in a vacuum.
Data is rarely perfect or complete.
Data is the Wizard of Oz in whom we only see that which we desire to see.
Data is not meaning.
Data is not opinion.
Data is not a mirror mirror on the wall to reveal the hidden truth in it all.
At the end of the day, data is data. It’s people who write the Discussion sections.
People draw conclusions from analytics.
It’s people who create meanings. People who form opinions.
Don’t confuse the two.
If you are at least 25 years old, you probably remember the days when everyone was trying to expedite the speed of information transfer. Messages began with courier services, first by horse, then by car. Then they went digital. The internet began with dial-up, when 56 kbps was deemed state of the art, then broadband. Then we decided that having to sit in front of a computer to transmit data was too slow.
Back then, when you got a piece of wrong information, it was usually a matter of timeliness. Timely data was the business of newspapers, radio, and later television.
At some point, the speed of data transmission became near-instantaneous.
We had thought that faster information meant better information, but speed may come at a cost. Rapid information is raw, and sometimes inaccurate. This is a common occurrence, but like car crashes relative to plane crashes, what makes Twitter newsworthy are the few times it nails the right information seconds after an event, not the hundreds of thousands of times it misfires.
Like the air we breathe, bad information has become so commonplace on Twitter and blogs that its inaccuracy is invisible to us. We can easily grasp the logic of why rapid information is sometimes inaccurate; we just don’t think about it often.
And yes, I am aware of the hypocritical nature of using a blog post to deliver this argument. As it turns out, the burden of verification is on you; I’m just exercising my First Amendment rights. 🙂