
May 31, 2011

How Do You Assure Online Panel Quality?

Ron Sellers shares some steps you can take to increase your likelihood of getting quality data from online panels.


by Ron Sellers


In a meeting recently, I was asked a very cogent and interesting question:  “How do you assure the quality of online panel sample?”

The answer is very simple:  I don’t.

I can’t be positive that the panel sample I’m using is of good quality.  There is no single statistical test or piece of software that can tell you whether or not you have a reliable sample.

Dealing with Internet research panels is like securing your house when you’re going away on vacation.  There are all sorts of steps you can take to burglar-proof your home, but the plain fact is that if a burglar wants in badly enough, you’re going to get burglarized in spite of all your careful precautions.  Police officers will tell you that the goal of burglar-proofing your home is not to make it impossible for burglars to break in, but to make it much more difficult for them, and therefore much less likely to happen.

Quality checks on Internet panels work the same way.  I may not be able to ensure panel quality 100%, but there are plenty of steps I can take (and have taken) to make it far more likely that I’m getting good data.

1.    Find out what quality work has been done on various panels.  My own company explored respondent experiences with 12 major panels in the report Dirty Little Secrets of Online Panels.  We found one panel that averaged only two survey invitations per month per panel member, while another one averaged fifty-seven per month.  There’s also the Grand Mean Project, which explores replicability of survey responses on various panels.  Seek out information like this and learn from it.  After completing our Dirty Little Secrets report, there are two major panels I will not consider for use under any circumstances, and others about which I have concerns.

2.    Talk with others in the industry.  It’s interesting how many researchers I know who have personally signed up for a few different panels; their experiences can help inform your decisions about which panels to use and avoid.

3.    When you work with a panel, insist on complete and transparent field statistics.  How many invitations were sent, and how many people responded to the invitations?  How many respondents did they clean off your database due to concerns such as unreasonably fast completion times and straightlining?  (If there are a lot, this calls into question respondent validity; if there are few or none, this calls into question whether they care about these problems.)

4.    When looking at the demographics of the sample, include all attempts rather than just completions.  Take into account the people who quit mid-interview, as well as those who did not qualify for your study.  You may end up with a perfect gender split on the completed sample, but if the incidence was 30% and the attempts ran 70/30 female/male, you have a skew (one the panel company should have recognized and adjusted for, and one you now need to correct through weighting; a simple weighting sketch follows this list).

5.    Evaluate validity on more than just demographics.  There are plenty of other known facts about the US population that can help you determine whether your sample is representative.  For example, in a religious study, we already know what proportion of Americans attend each denomination.  If 5% of American churchgoers are Lutheran and 20% of our sample comes up Lutheran, we have a potential sample problem.  You might measure the proportion of respondents who are Republican, who own a dog, or who drive a domestic vehicle, and compare your figures to known data about the national population (a quick benchmark check along these lines is sketched after this list).

6.    Consider who you’re working with at a panel.  Are you just giving orders to a salesperson, or is your contact highly knowledgeable about their own panel and about research in general?  How do they react to questions or concerns you raise?  Are they constantly trying to explain away anomalies or make excuses for their panels?

7.    Talk in-depth with panel managers.  It’s surprising what you can learn about the panel industry.  Ask about typical response rates and respondent longevity.  Compare their answers with your own experiences with those panels.

8.    Evaluate what each panel emphasizes when talking about itself.  One panel I’m aware of, for instance, makes a big deal about the systems it has in place to stop the same respondent from completing your survey twice if you blend its sample with sample from partner panels.  Think about that for a moment.  Here we have two panels with, say, a million members each (out of around a quarter of a billion American adults), and you’re sampling 500 respondents from each.  Unless your incidence is extremely low (e.g. you’re trying to find skateboarding grandmothers), what are the chances of encountering the same respondent by pulling 500 out of a million members from two different panels?  (The arithmetic is worked out in the sketch after this list.)  Yet that’s really the reason I should choose to work with this panel?  It seems to me their quality efforts are aimed at the wrong thing.

9.    Have a clear understanding about your expectations with any panel you might use.  I will not put a survey into the field if the panel company is using a portal, for example (a portal is where the panel company asks the respondent a few questions of their own prior to your questionnaire, seeking to identify low-incidence populations for other studies).  I don’t want someone else’s questions potentially biasing my own sample.  I also will not accept one panel company getting sample from another panel in order to complete my study without my prior approval.

10.    If you subcontract work to a vendor that is selecting which panel to use, know which one they’re choosing and why.  It’s very easy for clients to turn a blind eye to the fieldwork and focus on things such as questionnaire design and reporting.  But without solid fieldwork, questionnaire design and reporting are meaningless (or worse, misleading due to bad data).

11.    Most of all, have a healthy amount of doubt.  Assume you’re getting garbage on every project until you can take a variety of steps to prove the validity of the data to yourself.  I don’t consider a panel innocent until proven guilty; I consider it guilty until proven innocent – on each project.  Continually prove the data to yourself and you’re far more likely to wind up with something that is actually reliable.
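
To make point 4’s weighting correction concrete, here is a minimal Python sketch with entirely hypothetical numbers: a completed sample that split 50/50 male/female being weighted to a 70/30 target.  Which target you weight to depends on what you believe the true qualifying population looks like; the point is only to show the mechanics.

```python
# Post-stratification sketch with hypothetical numbers.
# Observed completes split 50/50; assumed target split is 70/30 female/male.

completes = {"female": 250, "male": 250}       # hypothetical completed interviews
target_share = {"female": 0.70, "male": 0.30}  # hypothetical target split

n = sum(completes.values())

# Weight for each group = (target share) / (observed share among completes).
weights = {g: target_share[g] / (completes[g] / n) for g in completes}

for g, w in weights.items():
    print(f"{g}: weight = {w:.2f}")
# female: weight = 1.40, male: weight = 0.60
# Weighted bases: 250 * 1.40 = 350 female, 250 * 0.60 = 150 male -> 70/30.
```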
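
Point 5’s benchmark comparisons can be turned into a quick sanity check.  The sketch below (again with hypothetical figures, loosely echoing the Lutheran example) compares a sample proportion to a known population benchmark and flags gaps too large to blame on sampling error alone.  It illustrates the idea; it is not a prescribed test.

```python
import math

def benchmark_check(successes, n, benchmark, label):
    """Flag a sample proportion that strays implausibly far from a known benchmark."""
    p_hat = successes / n
    # Standard error of the sample proportion if the benchmark were the truth.
    se = math.sqrt(benchmark * (1 - benchmark) / n)
    z = (p_hat - benchmark) / se
    verdict = "CHECK SAMPLE" if abs(z) > 3 else "looks plausible"
    print(f"{label}: sample {p_hat:.1%} vs benchmark {benchmark:.1%} "
          f"(z = {z:+.1f}) -> {verdict}")

# Hypothetical figures echoing the article's example: 20% of a 500-person
# churchgoer sample is Lutheran, against a 5% benchmark.
benchmark_check(100, 500, 0.05, "Lutheran churchgoers")
benchmark_check(160, 500, 0.33, "Dog owners")  # made-up benchmark, for illustration only
```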
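
And the back-of-the-envelope arithmetic behind point 8: assuming, hypothetically, that two panels of a million members each overlapped completely, the expected number of people drawn into both 500-person samples is tiny.  With only partial overlap it is smaller still.

```python
# Worst case: two panels of 1,000,000 members whose memberships overlap completely.
panel_size = 1_000_000
sample_a = 500   # completes drawn from panel A
sample_b = 500   # completes drawn from panel B

# Each member has probability (sample_a / panel_size) * (sample_b / panel_size)
# of landing in both draws, so the expected number of duplicates is:
expected_overlap = panel_size * (sample_a / panel_size) * (sample_b / panel_size)
print(f"Expected duplicate respondents: {expected_overlap:.2f}")  # 0.25

# Even under total membership overlap, you expect roughly one duplicate
# complete every four studies of this size.
```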

These are just a few ideas for working with online panels.  They’re the equivalent of stopping your newspaper, installing a burglar alarm, putting lights on timers, and asking your neighbors to watch your house while you’re away:  they don’t guarantee quality, but they sure make it much more likely.


