
When I started my learning project, the plan was to alternate posts between learning how to learn and learning more about data science.

A data review would show I’ve focused too much on the former and not enough on the latter. The data-driven conclusion? It’s time to shift the balance.

As I’ve worked in a new role for the last six months focusing on marketing analytics, I’ve drawn heavily on my academic background: economics, with its emphasis on statistics, and communications management, with its reliance on research.

My professional experience is key, too. Leading an employee engagement survey strategy for several years and conducting corporate communications surveys have helped tremendously.

It’s fascinating how many parallels exist between seemingly disparate areas. And problem solving and team leadership are often similar from function to function.

One of the skills I’ve needed to sharpen is thinking critically about data measurements. I’m learning to ask better questions. And I’m learning to anticipate questions from colleagues on how data was collected and analyzed.

Harvard Business Review is a valuable resource in generating good questions – from branding to market insights and from big data to the customer experience.

A March 2016 article by Thomas C. Redman – 4 Steps to Thinking Critically About Data Measurements – gives great tips on asking good questions about data. Here’s a short summary:

  • What do you want to know? Clarify what you want to know. This is similar to asking, “what problem are we trying to solve?” It’s also important to make sure all stakeholders are aligned on the exact nature and outcomes of the measurement process.
  • How does the actual measurement line up with what you want to know? Ask yourself if the measures are good surrogates for what you really want to know. Redman advises to “distinguish ‘pretty close’ from ‘a good-enough indicator’ to ‘not what I had in mind.’” If you’re settling for something less than perfect, you should be aware of it.
  • What are weaknesses in the measurement process? Here Redman advises a thorough understanding of the entire data collection process. He suggests listening to customer calls if you’re measuring customer complaints or going to a factory if you’re measuring factory productivity. This helps to “develop a feel for the weak links.”
  • Have you subjected results to the “smell test”? If results don’t seem right to you, based on other knowledge you have, dig into them. If results come in much better or worse than expected, consider the possibility of bad measurement and investigate further.
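The “smell test” step lends itself to a quick automated check. Here’s a minimal sketch in Python; the metric names and expected ranges are purely illustrative, not from Redman’s article:

```python
def smell_test(results, expected_ranges):
    """Return the metrics whose values fall outside their expected range.

    results: dict mapping metric name -> observed value
    expected_ranges: dict mapping metric name -> (low, high) band
    """
    suspicious = {}
    for metric, value in results.items():
        low, high = expected_ranges[metric]
        if not (low <= value <= high):
            suspicious[metric] = value
    return suspicious

# Hypothetical example: the complaint rate came in far below its usual band,
# which could mean genuine improvement, or a broken collection process.
results = {"complaint_rate": 0.2, "response_rate": 41.0}
expected_ranges = {"complaint_rate": (2.0, 8.0), "response_rate": (30.0, 60.0)}

print(smell_test(results, expected_ranges))  # only complaint_rate is flagged
```

A flagged metric isn’t necessarily wrong; it’s a prompt to dig into how the number was collected before presenting it.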

Thank you, Thomas Redman, for a few simple litmus tests to think more critically about data.