
More Dirty Little Secrets of Online Panel Research


Editor’s Note: When I was just starting this adventure in social media, Ron Sellers sent me a copy of the first Dirty Little Secrets of Online Panels report. It was chock-full of the type of information that I felt everyone in the MR community should know, and I did my best to help it get visibility. That was also one of the reasons I asked Ron early on to join me as a contributing author on the GreenBook Blog; he has a real gift for cutting through to the heart of issues and taking a pragmatic approach to addressing them. When Ron told me he was going to revisit the issue of panel quality, I thought it was a great idea, and I support the effort 100%.

That said, I need to post a disclaimer here: GreenBook has no official position on this project, nor have we endorsed it. We support having an open and honest dialogue about all issues that impact our industry, and we believe that only through total transparency can we achieve that. In that spirit, we support Ron’s efforts to share what he has found through his research, and we hope it can be used to further the conversation about the future of market research. If you’re an online sample provider, you might not like what you read here, but I hope you’ll use it as an opportunity to engage with the industry on these issues and help us all develop new practices that support our collective vision of a dynamic and growing field.

By Ron Sellers

Online access panels.  Love them or hate them, the reality is that if you’re in quantitative research, you’ll probably use them sooner or later.

In 2009, Grey Matter Research ran a little internal test on a few panels we had used or were considering.  We arranged for a selection of mystery shoppers to sign up for each panel and be typical respondents for a month.

What we found encouraged us to expand our test to include 12 major panels, and take the findings public.  The result was the report Dirty Little Secrets of Online Panels, which burned up Twitter feeds and LinkedIn comments, and was requested by researchers from as far away as Finland, Japan, and South Africa.

Well, we’re at it again.  A few panel mergers, plus requests about panels we didn’t include the first time, and it’s time for More Dirty Little Secrets of Online Panel Research.  e-Rewards.  Toluna.  Clear Voice.  Surveyhead.  Opinion Outpost.  MySurvey.  These and six more were evaluated from the perspective of the typical panel member.

Why should researchers care much about what panel members are experiencing?  We pay a panel provider or a panel broker, get our N size, toss out the obvious cheaters, and use the data.  Right?  Well…

Imagine you’ve crafted a relatively short, engaging questionnaire that respects my time as a respondent.  However, yours is the tenth questionnaire in a row that I’ve completed that morning, and many of the others were long, boring, and irrelevant.  I’m tired and inattentive.  Now just how reliable is your data?

Or let’s say that I’ve attempted 12 different questionnaires this morning before trying yours.  One of them asked me ten minutes’ worth of questions before telling me I wasn’t qualified (and tossing me out with no reward).  One of them froze when I was mostly done.  Another one told me I wasn’t qualified and kicked me out before I could answer a single question.  Two more were actually called “surveys” but were trying to get me to compare car insurance rates.  Five of them were already closed by the time I tried to respond, even though the invitations were all sent yesterday or today.  I was disqualified from two more because I don’t own a pet, even though I stated in my panel profile that I have no pets.  I’m tired, I’m frustrated, I’m annoyed, and now I’m evaluating a new product concept that you really hope I’ll like.  Now just how reliable is your data?

These aren’t just hypothetical situations – these are real situations we found in our work with these panels.  Plus, multiple other problems:

  • The panel that gave us opportunities to complete 50 to 60 questionnaires in a row, non-stop
  • The panel on which more than four out of ten studies closed less than 24 hours after invitations were sent – some in as little as one or two hours
  • The panel that sent two of our panelists 61 survey invitations in just one month
  • The panel that pays its respondents the equivalent of $2.67 per hour
  • The panel that sent one of our panelists 15 survey invitations over a two-day period
  • The panel that carries advertising on its website – are panelists seeing your competitors’ ads before they answer your surveys?

Of course, there were also much better situations, such as the panel that actually prevents panelists from completing more than one questionnaire per week…the panel that pays respondents an average of over $8 an hour in incentives…and panels that invite people to eight or ten surveys a month, rather than 50 or 60.  It’s not all bad news.
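(For the curious, those per-hour figures are simple arithmetic: the incentive paid divided by the time a questionnaire takes.  Here’s a minimal sketch of that calculation in Python – the incentive amounts and survey lengths below are hypothetical, chosen only so the math lands near the rates quoted above; none of them come from the report itself.)

    # Hypothetical illustration of how an effective hourly rate is derived.
    # The dollar amounts and survey lengths are assumptions, not report data.

    def effective_hourly_rate(incentive_dollars: float, survey_minutes: float) -> float:
        """Incentive paid, divided by the fraction of an hour the survey takes."""
        return incentive_dollars / (survey_minutes / 60.0)

    print(f"${effective_hourly_rate(0.89, 20):.2f}/hour")  # $0.89 for 20 minutes -> $2.67/hour
    print(f"${effective_hourly_rate(2.00, 15):.2f}/hour")  # $2.00 for 15 minutes -> $8.00/hour

Seen that way, it’s easy to understand how a small incentive on a long questionnaire quickly turns into sub-minimum-wage work for the respondent.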

It’s very easy to gloss over fieldwork or let someone else worry about it.  Let’s face it: finding and interviewing respondents is not the most exciting element of research, whether it’s RDD dialing, focus group recruiting, or access panel interviewing.

But always keep in mind that you are depending on these people to give you input you will use in critical business decisions.  Paying them pennies; giving them boring, lengthy, or irrelevant surveys; frustrating them with one closed study after another; and bombarding them with opportunity after opportunity is most definitely not how you want to treat people upon whom your success depends.  And if you or your research vendor isn’t paying attention, this is exactly what may be happening in your research.

This post has addressed some of the problems that exist in panels.  In my next blog post, I’ll focus on what we as researchers can do to avoid some of these pitfalls and give our research a better opportunity for success.

And if you’d like a copy of More Dirty Little Secrets of Online Panel Research, shoot me an e-mail at ron AT greymatterresearch.com (in the normal e-mail format, of course).


12 Responses to “More Dirty Little Secrets of Online Panel Research”

  1. Johannes Schaefer says:

    January 31st, 2012 at 12:30 pm

    Hi Ron,

    I work as a researcher at a full service market research company and deal with these online panels quite frequently. I’m interested in both articles, the dirty little secret and the latest one. Thank you so much for putting them together. It is much appreciated.

    Best

    Johannes

  2. Bob Ceurvorst says:

    February 2nd, 2012 at 9:37 am

    As a stat guy, this is an issue I’ve been harping on since the 1980s, when I first saw 24-page surveys (legal-size paper with small fonts), 258 attitude statements (there are far more interesting patterns than flat-lining), the first 24-brand × 60-attribute association grid (which counts as “1 question”), etc. I have implored researchers and clients to take their own survey before they ask hundreds or thousands of respondents, who are doing us a huge favor and could get paid better for their time by working at McDonald’s. There is a limit to how much energy people can devote to a survey before they start answering without caring. Take your survey before you ask respondents to do it!

  3. Ron Sellers says:

    February 2nd, 2012 at 5:29 pm

    Bob, couldn’t agree more. That’s an issue whether we’re talking about phone, online, mail, or anything else. I was a member of an old mail panel way back when, and was flabbergasted at some of what I was asked to do. Twelve-page surveys, legal-size pages, 10-point font – and a $1 bill shoved into the envelope. I remember one survey that asked me to rate a bunch of different hotel brands on a bunch of different attributes “no matter how familiar you are with each brand” – I counted 480 total check boxes. That was ONE PAGE of the 12-page survey. It went into the trash.

    The dollar bill I kept.

    Unfortunately, technology allows us to migrate terrible practices to new platforms…

  4. Nicola Peck says:

    February 8th, 2012 at 8:24 am

    These poor practices are only dirty little secrets if we allow them to stay hidden.

    Armed with ESOMAR’s 26 Questions to Help Research Buyers of Online Samples, you can quickly expose panels that have given little thought to respondent engagement, data quality, or acting professionally as an extension of their client’s brand.

    I work on the panel side and take pride in the care and attention we give to protecting the respondent’s experience whilst ensuring we consider our clients’ end objectives and data integrity.

    http://www.esomar.org/knowledge-and-standards/research-resources/26-questions.php

  5. Catherine Giordano says:

    February 8th, 2012 at 1:07 pm

    I work hard to make sure my online surveys are short overall and that attribute lists are also short. Fortunately, my clients take my advice.

    I hope you name names in your report because all the panel suppliers say that they follow “best practices.”

  6. Ron Sellers says:

    February 8th, 2012 at 5:29 pm

    Nicola, I wish it were that simple. I’ve reviewed some of the ESOMAR responses, and it’s amazing how little some of these panel providers actually say. They sometimes couch responses in language such as “While actual contact frequency varies from one respondent to the next, our goal is…” – which basically means that while they’ve set up certain goals, they’re free to exceed them any time they want.

    In addition, some lie outright. One panel was carrying clear sales messages and marking them as “surveys” – even though it is a CASRO member and CASRO clearly rejects this practice as unethical.

    Then there are the companies where very few people actually know what’s going on. I’ve spoken with numerous employees at panel companies, and I’m amazed at how many from the worst offenders will talk about how much they focus on quality. Others make honest guesses as to which panel they are in the report, and their guesses are so far off base as to be laughable.

    There are good panels and good people out there. Just not enough of them.

  7. Gayle Marshall says:

    February 9th, 2012 at 6:18 pm

    Ron,
    Would like to receive both of your “Dirty Little Secrets of Online Panel Research” reports!

    Thank you!

    Gayle

  8. All panels are equal, but some panels are more equal than others | YouthSight says:

    May 28th, 2012 at 12:46 pm

    […] Sellars, summarises these issues brilliantly in his Dirty Little Secrets… series. In it, he exposes the bad and even shameful practices of some panel suppliers and the […]

  9. People not respondents | Life on the Dark Side says:

    March 22nd, 2013 at 5:16 am

    […] a lot of researchers I have used online panels and like many researchers (probably) I have asked how 1,000 people answer a 27 minute questionnaire with 9 different […]

  10. B2B Market Research Study? - Don't Hire that Panel Firm - Cascade Insights says:

    August 25th, 2015 at 10:23 am

    […] The market research community has even commented on the challenges that exist in working with panel providers. Just some examples of titles that you come across when searching the subject: “Garbage In, Garbage Out Part Deaux (aka Panels Suck),” “More Dirty Little Secrets of Online Panel Research.” […]

  11. agnes josef says:

    September 21st, 2015 at 8:20 am

    Hi Ron,

    Very useful information. Thanks for sharing…. Bookmarked:)

  12. daniel says:

    April 13th, 2016 at 3:47 am

    Online panels are really helpful in many kinds of research.
