Forward to the Future, Back to the Basics

With all of the changes happening in market research, we can’t forget the importance of basic skills. Without the basics, all the shiny new stuff doesn’t work.



Ron Sellers

With all of the attention being paid to emerging research methods, a point too often missed is that, just like the old techniques, many of these new approaches still require basic research skills.

For instance, bulletin board qualitative, mobile MR, Google Surveys, and other less traditional approaches still mean we’re asking questions of people, even if we’re now doing it in different ways. So it’s important that we continue to ask relevant questions that people can actually answer.

Unfortunately, no matter what technique is involved in asking questions, there are still a lot of bad questions being asked.

Example: I just completed a questionnaire (as a respondent) in which I was asked to name my “primary financial institution” (with no further definition of what that means). Trouble was, the questionnaire had already asked about a variety of financial services: loans, credit cards, investments, checking and savings, etc. I use a major national bank for day-to-day transactional needs, but most of my investments (and therefore most of my money) are with a different set of financial services companies, and my personal and business credit cards (which get a heavy workout as daily transactional tools) are with two entirely different firms.

Which one is my “primary” financial institution? Is it the one where I have most of my money, the one where I have my basic checking account, or the ones I use every day for transactions? In answering the survey question, I might define it one way, while another respondent uses a very different definition. That leads to inconsistent data based on different parameters.

Similarly, the questionnaire asked me how likely I would be to consider a number of financial institutions if I wanted to “open a new account or take out a new loan.” Again, for me there’s a big problem trying to answer this. If I were to refinance my mortgage, I would shop for the best rate and not particularly care which financial institution provided it (since I figure it’ll just get sold anyway). If I were to open a new checking account, I would only consider a major national bank that has ATMs all over, because of how much I travel. If I were to open a new investment account, I would not consider a bank at all (not being a novice investor).

My answer to that question doesn’t fit into a nice, convenient box like the researcher wanted. I simply cannot give a blanket answer to this question, because my answer would be very different for different types of financial service products that the questionnaire has lumped together.

In another questionnaire, I was asked whether I consider clothing made out of cotton to be better quality than clothing made out of other materials. Well, it depends – I don’t wear a lot of cotton suits or cotton ties, but I certainly want cotton socks and cotton jeans. And what is meant by “better quality”? Does that mean durability, how it feels against my skin, how others perceive my wardrobe, or something else? Further, what happens if I consider cotton to be better quality than rayon and polyester, but lower quality than silk and wool?

What the researcher obviously wanted was the ability to have one nice, neat number that shows how many people think cotton is superior (or inferior) to other materials. But sometimes you can’t just ask one question and learn everything you need to learn. People don’t work that way. And if people don’t work that way, neither should research.

I see this type of question all the time, and quite frankly, I’ve probably written a few of them in my career. It’s easy to do. But it’s also important to understand that how respondents think, and the lives they live, won’t always conform to the neatly wrapped parameters we desire in order to simplify research. And that fact won’t change whether the respondent is participating in a telephone survey or a Google Survey on his tablet.

This becomes particularly important in our current industry situation. It’s easy to become enamored with a new approach and forget that many of the same rules and standards still apply. Good probing is good probing, whether the respondents are gathered around a conference table, doing laundry as you watch, or staring at you through their webcams. A survey conducted by tablets and smartphones still loses value and relevance if respondents are not quite sure what you’re asking, just as it did when interviewers were marching door to door.

And this all becomes even more important when you consider that some of the people now designing the questions have expertise in data mining or technology rather than in traditional research techniques.

In this respect, research is a bit like medicine. When doctors were making house calls in their Model A’s, they didn’t have CAT scans, MRIs, genetic testing, antibiotics, or many of the wonderful tools available to today’s practitioners. But today’s doctors still have to know basic skills their predecessors used for decades: diagnosing a condition, setting a broken bone, stitching up a wound, and dealing with a scared eight-year-old (or eighty-year-old). The tools are different but many of the basic skills are still the same.

No matter the research method chosen, a biased sample is still a biased sample. A meaningless but statistically significant correlation is still meaningless. A bad question is still a bad question. A bored, disengaged respondent is still failing to give you useful insights. And using bright, shiny, cool new research tools doesn’t change any of these facts.


8 Responses to “Forward to the Future, Back to the Basics”

  1. Scott Weinberg says:

    June 11th, 2014 at 11:59 am

    Nice reminder of the basics we should never take for granted. I blog about this same topic on occasion, especially a recent post on sample quality.

    Two other observations.

    Question: why are MR surveys written so embarrassingly poorly?

    Answer: because the barrier to entry in this industry is exactly zero. Our surveys are written by poli sci majors with zero social science training, let alone formal survey design coursework, which you’ll only get at the grad level.

    Question: why are open end comments rarely if ever used in our surveys?

    Answer: I don’t know. Because you’ll get actual data? Because it takes more effort to process? Because it requires more expense to process? If we were serious about delivering insights via surveys, we wouldn’t have kids writing them, and we’d use far more open ends to try to get something actually useful.

    And we engage in hand-wringing over why our response rates are sub-5%? Guess who’s taking your surveys, folks.

  2. Mlouca says:

    June 11th, 2014 at 10:45 pm

    Word. “The tools are different but many of the basic skills are still the same.”

  3. #FridayFive – Stories from Around the Web | QuestionPro Blog says:

    June 12th, 2014 at 11:01 pm

    […] Forward to the Future, Back to the Basics (GreenBook) […]

  4. Kevin Gray says:

    June 15th, 2014 at 5:12 am

  5. Kevin Gray says:

    June 15th, 2014 at 5:15 am

    Totally agree. “Old” is still very new in the real world of MR.

  6. John Coldwell says:

    June 25th, 2014 at 11:49 am


    Don’t get me started.

    Clearly you are far too patient – not even naming and shaming the organisation!

    We conduct B2B customer satisfaction surveys and far too many clients have a really hard time getting that list of their most important customers together.

    And then there’s the spelling of their customers’ names.

    Keep up the good work, Ron.


  7. Judy Bernstein says:

    June 30th, 2014 at 9:28 am

    “My answer to that question doesn’t fit into a nice, convenient box…”
    The perfect rallying cry for qualitative!

  8. Can Political Polls Really Be Trusted? | GreenBook says:

    October 13th, 2016 at 6:00 am

    […] political polls that are flat-out conducted poorly, just as there is some business research that is misleading garbage.  But many of the supposed “problems” with political polls are that pollsters, pundits, and/or […]
