Can Data Presentation be a Matter of Life or Death?


To my surprise and delight, “communication” topped the list of key skills for data scientists in a CEB Market Insights blog post I read this week.

The post covered the top 10 skills for data scientists and 2 strategies for hiring them. Yet “communication” felt like a lone outlier among a list of highly quantitative skills, like managing structured data, mathematics, data mining and statistical modeling.

But indeed, the Business Broadway study the post cited showed that “communications” recurred most frequently across a variety of data science roles.

When Thomas Davenport and D.J. Patil named “Data Scientist” the sexiest job of the 21st century in Harvard Business Review, they cited an enduring need “for data scientists to communicate in language that all their stakeholders understand – and to demonstrate the special skills involved in storytelling with data, whether verbally, visually, or – ideally – both.”

As a communicator who pivoted into marketing analytics, I find it heartening to see data showing there’s a role and need for effective communication and storytelling skills.

And having led communications, I’ve seen how dramatically the field is improved by data that demonstrates what works and what doesn’t, and that helps predict how various audiences might respond to different communications strategies.

Beyond enabling data-driven decisions, clear communications about data can literally be a matter of life or death. Two fascinating examples crossed my path this morning in an article by Dr. Jenny Grant Rankin called Over-the-Counter Data: the heroics of well-displayed information.

The first example was an early use of data visualization in the summer of 1854, when 500 Londoners died of mysterious causes in a 10-day period. Dr. John Snow made the data user-friendly: he took a neighborhood map and marked the exact location of each death.

This pointed toward a local water pump that was the culprit in the spread of cholera. With this clearly displayed data, Dr. Snow was able to convince authorities to remove the pump’s handle in order to stop the outbreak.

Another example took a much more ominous turn. The night before the Space Shuttle Challenger launched in January 1986, NASA engineers and their supervisors reviewed charts and data on the function of the rocket’s O-rings, the seals that keep hot gases contained. Based on what they saw, the launch was cleared for takeoff.

But the available data was not displayed clearly. It showed failed launches but not successful ones. This led decision makers to overlook a critical piece of information: the O-rings worked properly only when the temperature was above 66 degrees. The day of the Challenger launch was 30 degrees below that, “so cold it does not even fit on the graph.” It’s still heart-wrenching to recall the tragedy that occurred that day.
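The filtering problem at the heart of this example can be sketched in a few lines of Python. The data below is purely illustrative (not the actual NASA launch records): it simply shows how looking only at launches with O-ring incidents obscures a temperature pattern that is obvious once incident-free launches are included.

```python
# Hypothetical launch data, not the actual NASA records.
# Each tuple: (temperature in degrees F, number of O-ring incidents)
launches = [
    (53, 3), (57, 1), (63, 1), (70, 1), (75, 2),   # launches with incidents
    (66, 0), (67, 0), (68, 0), (69, 0), (70, 0),
    (72, 0), (73, 0), (76, 0), (78, 0), (79, 0),   # incident-free launches
]

# Looking only at failures, temperatures span warm and cold days,
# so no trend jumps out of the chart.
fail_temps = [t for t, n in launches if n > 0]
print(min(fail_temps), max(fail_temps))   # prints: 53 75

# Including the successes reveals the pattern: in this illustrative
# data, every incident-free launch occurred at 66 degrees or above.
clean_temps = [t for t, n in launches if n == 0]
print(min(clean_temps))                   # prints: 66
```

The same numbers, plotted or summarized with the successful launches filtered out, tell a very different story than the complete dataset, which is exactly the trap the Challenger decision makers fell into.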

While thankfully the work of data scientists is rarely a life or death matter, these examples underscore the need for clarity in communicating data. For what cannot be understood cannot be implemented.

4 Key Questions About Data


When I started my learning project, the plan was to alternate posts between learning how to learn and learning more about data science.

A data review would show I’ve focused too much on the former and not enough on the latter. The data-driven conclusion? It’s time to shift the balance.

As I’ve worked in a new role the last 6 months focusing on marketing analytics, I’ve drawn heavily on my academic background: economics, with its emphasis on statistics, and communications management, with its reliance on research.

My professional experience is key, too. Leading an employee engagement survey strategy for several years and conducting corporate communications surveys has helped tremendously.

It’s fascinating how many parallels exist between seemingly disparate areas. And problem solving and team leadership are often similar from function to function.

One of the skills I’ve needed to sharpen is thinking critically about data measurements. I’m learning to ask better questions. And I’m learning to anticipate questions from colleagues on how data was collected and analyzed.

Harvard Business Review is a valuable resource in generating good questions – from branding to market insights and from big data to the customer experience.

A March 2016 article by Thomas C. Redman – 4 Steps to Thinking Critically About Data Measurements – gives great tips on asking good questions about data. Here’s a short summary:

  • What do you want to know? Clarify what you want to know. This is similar to asking, “What problem are we trying to solve?” It’s also important to make sure all stakeholders are aligned on the exact nature and outcomes of the measurement process.
  • How does the actual measurement line up with what you want to know? Ask yourself if the measures are good surrogates for what you really want to know. Redman advises to “distinguish ‘pretty close’ from ‘a good-enough indicator’ to ‘not what I had in mind.'” If you’re settling for something less than perfect, you should be aware of it.
  • What are weaknesses in the measurement process? Here Redman advises a thorough understanding of the entire data collection process. He suggests listening to customer calls if you’re measuring customer complaints or going to a factory if you’re measuring factory productivity. This helps to “develop a feel for the weak links.”
  • Have you subjected results to the “smell test”? If results don’t seem right to you, based on other knowledge you have, dig into them. If results come in much better or worse than expected, consider the possibility of bad measurement and investigate further.
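
The last question lends itself to a quick sanity check in code. Here is a minimal sketch of a “smell test” helper; the function name, metric, and 25% tolerance are my own illustrative choices, not from Redman’s article.

```python
def smell_test(metric, value, expected, tolerance=0.25):
    """Flag a result that deviates from expectation by more than
    `tolerance` (as a fraction), signaling possible bad measurement."""
    deviation = abs(value - expected) / expected
    if deviation > tolerance:
        return f"{metric}: {value} is {deviation:.0%} off expectation - investigate"
    return f"{metric}: {value} passes the smell test"

# A result far better than prior knowledge suggests warrants a closer look.
print(smell_test("email open rate", 0.62, expected=0.22))
# A result close to expectation passes.
print(smell_test("email open rate", 0.24, expected=0.22))
```

The point isn’t the threshold itself; it’s building the habit of comparing every new result against what you already know before acting on it.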

Thank you, Thomas Redman, for a few simple litmus tests to think more critically about data.