
August 11, 2015

Heal Thyself! When Will Market Research Get Serious About Sample Quality?

Sample quality from online panels is an issue everyone knows about but no one wants to address.

By Allan Fromen

It’s a really fun time to be in market research. As June’s IIeX conference demonstrated, there is a plethora of innovative start-ups shaking up the industry. I had the pleasure of introducing Jeff Bander from Sticky, a company that aims to bring the previously expensive methodology of eye tracking to the masses via a cloud-based solution. Really cool stuff. To read reviews of the IIeX conference, see these posts by Annie Pettit, Tom Ewing, and Dave McCaughan.

Amidst all the focus on new tools and techniques, I was struck by a presentation that compared traditional research panelists with dynamically sourced respondents. A senior researcher went through slide after slide of data showing significant differences in results depending on whether the respondent was a panel member or was recruited via dynamic techniques.

Someone in the audience aptly asked, “So which respondent type reflects the truth?”

The answer was “We don’t know.”

This turn of events made me sad and reflective at the same time.

We all know that response rates are declining. More worrisome are the studies indicating that some panelists are professional respondents, jumping from survey to survey to goose their incentives. I recall once hearing in a presentation that the average panelist belongs to 5-6 panels. Another presentation at a top conference years ago methodologically demonstrated that cheaters (speeders, straight-liners, etc.) changed the results by only a tiny amount (low single digits), so any concern about bad sample was overblown and misplaced.
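
The kinds of checks that presentation described are straightforward to run in-house. Here is a minimal sketch in Python with pandas – the column names, thresholds, and toy data are my own assumptions for illustration, not anything from that study – of flagging speeders and straight-liners in respondent-level data:

```python
import pandas as pd

# Hypothetical respondent-level data: interview length plus a grid of
# 1-5 rating items. Column names and values are illustrative only.
df = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "duration_sec":  [95, 610, 540, 480],
    "q1": [3, 4, 2, 5],
    "q2": [3, 5, 2, 4],
    "q3": [3, 2, 2, 4],
    "q4": [3, 4, 2, 3],
})

grid_cols = ["q1", "q2", "q3", "q4"]
median_duration = df["duration_sec"].median()

# Speeders: finished in less than a third of the median interview length
# (the one-third cutoff is a rule of thumb, not an industry standard).
df["speeder"] = df["duration_sec"] < median_duration / 3

# Straight-liners: gave the identical answer to every item in the grid.
df["straight_liner"] = df[grid_cols].nunique(axis=1) == 1

print(df[["respondent_id", "speeder", "straight_liner"]])
```

Whether removing such respondents moves the numbers by a little or a lot is, of course, exactly the question that presentation was addressing.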

Panel companies likely follow the classic bell-curve, with some at the top of their game and far superior to their peers. But when you speak to folks over a drink, they admit that panel companies all “borrow” respondents from each other and that quality is an issue everyone knows about but no one wants to address.

At another conference recently, a senior researcher admitted to being an online panel member herself – not in a nefarious way, but as a means to evaluate survey design, user experience, and how surveys are actually being delivered. The two panel companies she named have top-notch brands and are known as leaders in the field, yet her experiences with them could not have been more different. One panel was exactly as you would hope – it only sent her surveys that were targeted based on her demographics and past surveys, and from this panel she received about 2-3 invitations per week. The other panel bombarded her with a steady stream of surveys – 40 or so per week, by her estimation – with seemingly no connection to any data she had shared previously, either at registration or in completed surveys.

Isn’t it ironic that we are so meticulous in our survey construction – “garbage in, garbage out,” we all say – and then throw our carefully constructed babies out into the unknown? In an effort to reduce order effects, we rightly focus on randomizing brands within the questionnaire, yet we have no idea what survey the panel member completed just before ours. Our panelist may have been exposed to some of the very brands we were trying so hard to shield from bias only a minute before starting our survey.
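
The randomization itself is the easy part. A minimal sketch in Python – the brand names are placeholders, and the per-respondent seeding is simply my own assumption about how one might keep the order stable if a survey is resumed – looks something like this:

```python
import random

BRANDS = ["Brand A", "Brand B", "Brand C", "Brand D"]  # placeholder names

def brand_order_for(respondent_id: int) -> list:
    """Return a per-respondent shuffle of the brand list.

    Seeding with the respondent ID keeps the order stable if the person
    pauses and resumes the survey, while still varying across respondents.
    """
    rng = random.Random(respondent_id)
    order = BRANDS.copy()
    rng.shuffle(order)
    return order

print(brand_order_for(101))
print(brand_order_for(102))
```

The point, though, is that none of this care extends beyond our own questionnaire: we control the order within our survey and know nothing about the survey that came right before it.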

None of this is new, unfortunately. Ron Sellers of Grey Matter Research has done an impressive and laudable job detailing some of these issues. See here and here.

I think the reason this is boiling over now is that we as an industry maintain a mantra of innovation: embrace new methods, innovate or die, disrupt or be disrupted. And yet the foundation of our entire industry – the respondent – is somehow exempt from the conversation.

There are some noteworthy innovations in the sampling space, of course. Google Consumer Surveys and Survata catch people where they are browsing, in their natural environment, and don’t rely on opt-in sample. GfK’s KnowledgePanel is the gold standard of probability sampling, and RIWI is doing some really cool work in this space as well.

But these are exceptions to the rule. It seems most of us shrug our shoulders and think if everyone else is doing it, it must not be that bad after all. If the whole industry is sideways, at least I’m not upside down, we tell ourselves.

I am not calling out the panel companies any more than the researchers who use them, myself included. Researchers and clients have created the demand for faster and cheaper, and panel companies have quite reasonably moved to fill that need in the market. We are all guilty of basking in the short-term high that comes from easy and cheap sample. What other course of action is there? Revert to the more expensive techniques that were the norm before the Internet came along? Even if they were viable, we’d have a hard time convincing clients to pay for such rigor now that the genie is out of the bottle.

So what is the solution? I am not sure, but I have one suggestion. Couldn’t an organization such as ESOMAR play the role of Consumer Reports, but focused on the sample industry? Via mystery shops, surveys of buyers, and other methods, it would issue a report rating each company on a number of criteria: overall data quality, the number of surveys sent per respondent in a given timeframe, the quality of the “match” between survey and respondent, and so forth. The same Harvey Balls we’ve all seen a million times could be used to help buyers understand the strengths and weaknesses of various sample sources. Panels with high ratings would undoubtedly be able to charge a premium; panels with lower scores could strive to improve, or position themselves as the low-cost provider. For every BMW, there is a Kia. This would not be a public shaming, but rather a guide to help buyers select a sample provider.
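
To make the scoring idea concrete, here is a minimal sketch in Python; the criteria are drawn loosely from the list above, but the weights, the 0-4 scale, and the example ratings are illustrative assumptions, not a proposed standard:

```python
# Criteria loosely based on the suggestion above; weights are assumptions.
CRITERIA_WEIGHTS = {
    "data_quality": 0.4,
    "invitation_volume": 0.3,   # fewer invitations per respondent scores higher
    "targeting_match": 0.3,     # fit between respondent profile and survey
}

HARVEY_BALLS = {0: "○", 1: "◔", 2: "◑", 3: "◕", 4: "●"}  # 0-4 rating scale

def overall_score(ratings):
    """Weighted average of 0-4 ratings across the criteria."""
    return sum(CRITERIA_WEIGHTS[c] * r for c, r in ratings.items())

# Hypothetical panels and ratings, for illustration only.
panels = {
    "Panel X": {"data_quality": 4, "invitation_volume": 3, "targeting_match": 4},
    "Panel Y": {"data_quality": 2, "invitation_volume": 1, "targeting_match": 1},
}

for name, ratings in panels.items():
    balls = " ".join(HARVEY_BALLS[r] for r in ratings.values())
    print(f"{name}: {balls}  overall {overall_score(ratings):.1f} / 4")
```

However the arithmetic is done, the value would come from the independent measurement behind the ratings, not from the scoring formula itself.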

One of the challenges in using panels today is that such ratings do not exist. How are buyers supposed to evaluate sample sources, other than by reputation? Panels are notoriously reluctant to discuss their recruiting practices, participation rates, or the number of invitations sent (if they even track that data). As Ron Sellers told me, “There really is no way to evaluate the quality of different panels or compare options, other than actual user experience either as a buyer or as a respondent. That’s one reason so many researchers join panels. So lacking any objective measurements or personal experience, choices often come down to the only knowable factors: price and feasibility. That just exerts additional downward pricing pressure, which in turn further impacts quality. It’s a vicious cycle, one which is entirely undesirable for the industry. And I see no end in sight.”


