Written by Roger Jackson,
January 17th, 2019

10 things that annoy me in data discussions

I spend my life in data. I love it. Give me a fact over a guess any day of the week. I know I may be in a minority in that regard, but I think marketing should be a balanced synthesis of science and art. As a musician, I know and love art. Art is most likely where the original idea comes from. But if I am spending loads of someone else’s money, I think a little science is warranted. Science stops me from making bad mistakes… so here are my gripes. Sorry.

1.  No sample size information

Guys, commenting on numbers without knowing the sample size is like a spectator commenting on an LBW decision from the side of the ground in cricket. USA readers, contact me for a translation. And I mean the sample for that particular data point, not the overall study (I know you asked 5,000 people overall, but only ten were represented on this chart!).
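To see why the per-chart sample matters so much: the margin of error on a proportion shrinks with the square root of the sample size. Here is a minimal sketch using the standard normal approximation; the 60% figure and the sample sizes are hypothetical, picked only to show the contrast between a full study and a ten-person sub-sample.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for an observed proportion p
    from a simple random sample of size n (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# The same "60% agree" result, wildly different reliability:
print(f"n=5000: +/-{margin_of_error(0.6, 5000):.1%}")  # roughly +/-1.4 points
print(f"n=10:   +/-{margin_of_error(0.6, 10):.1%}")    # roughly +/-30 points
```

With ten respondents, "60%" could plausibly be anything from about 30% to 90% — which is exactly why the sample size belongs on the chart.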

2.  No idea what the question was

You can’t interpret the answer unless you know exactly what was asked. A chart that doesn’t show the question is as safe for decision-making as asking your Dad whether you should wear the red or the blue dress (please refer to my daughter).

3.  No idea who was asked

Believe me, if you ask different people the same question, you will get different answers. Your sampling has to be representative, or you may as well just ask the person next to you on the bus. It’s not hard to get good-quality fieldwork, so no excuses.

4.  No context

Wow, “80% of our shoppers are satisfied”. Good on you. But what if everyone else’s shoppers show 90% satisfaction? Oops. Comparisons unlock insights; isolated numbers don’t, however good you think they look. And comparisons are easy to understand.

5.  Being told that “the difference is significant”

I am afraid that’s meaningless on its own. One can only say, for example, “this difference meets a statistical test at 95% confidence”. Significance is about confidence levels, not an absolute yes/no (oh, and please don’t say that because something is “not significant it’s directional only”). You can work out a result’s level of confidence, and if it’s less than 50% I suggest it’s not worth using: no better than tossing a coin.
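The point that a difference carries a confidence level, rather than being "significant" or not, can be made concrete with a two-proportion z-test. This is a rough sketch using the normal approximation; the shopper counts (80% of 200 vs 74% of 200) are hypothetical.

```python
import math

def confidence_of_difference(x1, n1, x2, n2):
    """Confidence level at which two observed proportions differ,
    via a two-sided two-proportion z-test (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, via erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
    return 1 - p_value

# 80% of 200 shoppers vs 74% of 200 (hypothetical figures)
small = confidence_of_difference(160, 200, 148, 200)
# Same percentages, but from samples of 1,000 each
large = confidence_of_difference(800, 1000, 740, 1000)
print(f"n=200 each:  {small:.0%} confidence")
print(f"n=1000 each: {large:.0%} confidence")
```

The same six-point gap fails a 95% test on small samples but passes it comfortably on large ones: the difference didn't become more real, we just became more confident in it.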

6.  “We already knew that”

Right. My favorite test is to ask people, before they are shown the data, what they think the figures will show. In reality, most people don’t know, but of course, human nature is not to want to admit that. We all want to seem smart. A little soul-destroying for a researcher (and unfair).

7.  It’s a “nice to know” not a “must know”

Can you give me a page to unpack that one? Any data is only as good as what it’s used for. Data on its own doesn’t achieve anything; decisions are what matter. Any data that helps make a better decision has value. The absolute value depends on a) the size of the decision, b) the risk of the decision, and c) the confidence we already have in the decision. I recommend hypothesis building (“we think XYZ is happening; can you support or refute it?”).

8.  “People lie in questionnaires” Really? Can you prove that? 

Firstly, if you have ever tried to answer a questionnaire, you will know that lying takes far more time and effort than telling the truth. Second, why bother? What’s there to gain? Well-designed questionnaires are simple to complete. In any case, if everyone lied, the answers would all tend to the midpoint (and there’d be no more surveys).

9.  Shoppers don’t know why they do what they do

They just post-rationalize, right? I am getting the sense that a tiny bit of neuroscience knowledge goes a long way. Hmm. Yes, shoppers are subject to many influences, some of which are subconscious. But we are not mindless automatons. We usually do things when we go shopping for good reasons. And if asked sensible questions, we can answer them. Of course, questionnaires shouldn’t ask about genuinely subliminal matters (e.g. “why do you prefer the red or the green?”).

10.  “We want to uncover something genuinely new”

It’s nice if you come up with a world-shaking insight, but such things are usually the rare product of much digging, sweat and argument, often combining many different studies and data sources. I’d also dare to suggest that almost all major innovations that succeed are based on things people already knew (to some extent); they just weren’t leveraged before, probably because they weren’t championed in the business and brought to life in a powerful way, or perhaps the MD didn’t agree. Research isn’t only about “new news”; it’s about the systematic use of data to make better decisions.