A gifted young baseball player played Little League through college ball, hitting an average of nine out of 10 pitches. [But] hitting .900 in the Major League is impossible. The best baseball players […] miss two out of three swings. They sometimes strike out in a crucial moment, drop balls, make bad plays, and disappoint their fans.
The difference […] is that world-class baseball players wake up every game day ready to swing the bat.
Source: John Donahoe: Dump the Myth of the High Achiever | Stanford Graduate School of Business
The Radiological Society of North America (RSNA) Annual Meeting is a place to expand your knowledge base, both by taking a deeper dive into your core interests and by getting your feet wet with a few new skills.
If informatics is something you’ve been interested in but need a good way to get started, then the RSNA offers some solid opportunities for beginners. Continue reading
The RSNA conference will take place in Chicago in 1 month. If you’ve already started looking at the meeting program, you might get the same sense of excitement as the rest of us – this …
Source: Creating a RSNA Word Cloud – Radiology Data Quest
After the recent presidential election, you are probably either particularly alarmed or especially excited about the outcome. Regardless of your particular political predilection, it is fair to say that this election turned data science on its head, with so many getting so much so wrong.
In an earlier issue of Harvard Business Review, the venerable magazine shared a piece of research from the University of Southern California on forecasting. When forecasting sales, the best estimators use a combination of intuition and logic, with both logic-heavy and intuition-heavy forecasters performing less accurately.
In the age of artificial intelligence and big data, it can be sobering to realize that despite the staggering volume of data we are now collecting, ignoring your gut instincts can take a heavy toll on your decision-making abilities.
Source: “What type of forecaster are you?” Harvard Business Review (March): 26.
The first part of this series discussed the heterogeneity of data projects and how a uniform approach can help home in on the solution. The first post also discussed the first two elements: refining the question and identifying the right data. Here we tackle the next two elements.
Plan Your Approach
At this step, we begin to get into the technicalities of data science. This post is not designed to go into the details of each approach, but it will attempt to ask the relevant questions.
How will you process the data that you now possess? In almost all cases, this step will involve data wrangling (also known as data munging or data cleaning). To determine the "clean" form your data must take for proper analysis, it is important to identify the transformations and algorithms your question requires. Continue reading
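As a minimal sketch of what a wrangling step might look like, here is a small Python example that normalizes raw records before analysis. The field names (`modality`, `duration_min`) and cleaning rules are purely illustrative, not from any particular radiology dataset.

```python
# A minimal, hypothetical wrangling step: normalize raw exam records
# before analysis. Field names and rules are illustrative only.

def clean_record(raw):
    """Normalize one raw record (dict) into an analysis-ready form."""
    cleaned = {}
    # Trim whitespace and standardize case on the modality field
    cleaned["modality"] = raw.get("modality", "").strip().upper()
    # Coerce exam duration to a float; treat blanks/bad values as missing
    try:
        cleaned["duration_min"] = float(raw.get("duration_min", ""))
    except ValueError:
        cleaned["duration_min"] = None
    return cleaned

records = [
    {"modality": " ct ", "duration_min": "12.5"},
    {"modality": "MRI", "duration_min": "n/a"},
]
cleaned = [clean_record(r) for r in records]
# cleaned[0] -> {"modality": "CT", "duration_min": 12.5}
# cleaned[1] -> {"modality": "MRI", "duration_min": None}
```

Even a toy example like this makes the point: the transformations you choose (how to handle missing values, how to standardize categories) depend directly on the question you refined earlier.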
Big Data has become a radiology buzzword (machine learning, AI, and disruptive innovation are also up there).
However, there is a real problem with using the term Big Data – it isn't just one set of data problems. Big Data is a conglomerate of different data challenges: the volume, heterogeneity, and velocity of data are all important dimensions. Machine learning and the internet of things are other layers superimposed on the big data problem.
Sometimes it is helpful to step back and approach data problems with a common framework, a way to think about how and which facets of data science fit in a real-life workflow in the face of an actual problem.
Below is a 6-element framework that helps me think about data-driven informatics problems. The elements are generally in chronological order, but they are not "steps," because you will frequently find yourself going back and redefining many things. However, the framework helps you maintain a big-picture outlook. It is also the reason any sufficiently complex data problem requires a team approach. Continue reading
At the turn of the century, Joel Spolsky came up with the idea of a “Joel Test” – a highly irresponsible, sloppy test to rate the quality of a software team.
More recently, one group came up with its own criteria to rate the quality of a data science team. How do the analysts in your radiology department fare?
- Can new hires get set up in the environment to run analyses on their first day?
- Can data scientists utilize the latest tools/packages without help from IT?
- Can data scientists use on-demand and scalable compute resources without help from IT/dev ops?
- Can data scientists find and reproduce past experiments and results, using the original code, data, parameters, and software versions?
- Does collaboration happen through a system other than email?
- Can predictive models be deployed to production without custom engineering or infrastructure work?
- Is there a single place to search for past research and reusable data sets, code, etc?
- Do your data scientists use the best tools money can buy?
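The reproducibility item above (finding and re-running past experiments with the original code, data, parameters, and software versions) can be approximated even without dedicated tooling. Here is a hedged sketch: the function name, fields, and toy "experiment" are hypothetical, but the idea of persisting parameters, seed, and environment alongside every result is the point.

```python
# Hypothetical sketch: record the parameters, random seed, and
# environment details alongside each experiment's result, so a past
# run can be located and reproduced later.
import json
import platform
import random

def run_experiment(params, seed=42):
    random.seed(seed)  # fix the seed so the run is repeatable
    # Stand-in for a real analysis: any computation driven by params
    result = sum(random.random() for _ in range(params["n_samples"]))
    return {
        "params": params,
        "seed": seed,
        "python_version": platform.python_version(),
        "result": result,
    }

record = run_experiment({"n_samples": 100})
# Serialize the record to a shared experiment log for later search
print(json.dumps(record, indent=2))
```

Because the seed and parameters are stored with the result, re-running `run_experiment` with the saved values yields the same number – which is the minimum bar the checklist item sets.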