Online Panels: The Black Sheep of Market Research, Part 1.

By Adriana Rocha

Anyone following the GreenBook Blog, or reading the numerous articles posted about online panels recently, will probably agree with me that the market research industry has labeled online panels the “black sheep” of the industry. It seems everybody agrees that the golden days of panel companies are over, that panels are responsible for the poor data quality of online surveys, that they are the ones that created professional survey takers, and the list of bad things about online panels goes on and on….

The more of those articles get published, the more pissed off I become, and that’s why I’ve decided to write this piece. It seems people have lost sight of what a market research panel truly is. With the proliferation of “scam panels”, routers, and many easy ways to drive online traffic to surveys, tons of companies calling themselves “panel companies” have sprung up with no market research background, no panel background, little care for the experience respondents have when taking surveys, and even less for the quality of the data collected.

Yes, “respondents” is still what the industry calls people who dedicate a few minutes of their precious time (well, OK, most of the time many minutes) to participate in market research studies. Oh, and “respondents” are just the good ones. The bad ones are “speeders”, “straight-liners”, etc., and market researchers have equipped themselves with advanced technologies to find such bad respondents, identify fraud, eliminate duplicates, and so on…

However, researchers are missing the point by forgetting to fix the root of this whole mess: poor user experiences.

My advice: give people a good user experience when they respond to your surveys. Respect their time. Be transparent. Give them a real purpose (not necessarily monetary) to contribute to your research study and you’ll be surprised by the results. Instead of buying poor sample services from cheap providers (the ones who care least about your data) and then spending money later to clean your database or redo your fieldwork, look for reliable companies that respect the people who participate in their panels and in your studies.

In any other industry, companies that build a quality database of subscribers and maintain a long-term, trusted relationship with their users are highly valuable. That’s what true panel companies do. So instead of killing off the few that are still fighting to survive in this industry, it would be smarter to work more closely with them. They are not the black sheep of the industry. They are the ones with the background, knowledge, tools, and technologies that can help online market research be great again.


16 responses to “Online Panels: The Black Sheep of Market Research, Part 1.”

  1. Hi Adriana:

    I think there are still many ways to use panels and probably many things that are called “panels”. The real problem started years ago with Greenfield, which abused both researchers and panelists, as well as with researchers in the early days who tried to use panels the same way they had calibrated telephone response.

    We’ve all learned a few things along the way, starting with the way we use panels (experience driven, not demographic driven), and it’s really our fault that the incentive process became the carrot. We can’t change those things. More importantly, panelists, respondents, people in general see the Internet as a two-way street where personal equity matters. They understand the value of response and influence.

    People respond to questions either because they want to have input or because they want an incentive. Both have problematic areas and bias, but then again so does every questioning method. What that means for panel providers is that they have to be willing to create profiles that are more than self-directed. They have to go the extra mile to “marry” profiles with validated customer data for a particular segment, i.e., custom panels or the ability to create proprietary segments within a master panel. Secondly, they have to create value outside of the survey or qualitative experience that lessens the need for individual incentives.

    Lastly, panels have to continuously recruit people, not only to offset attrition but also to bring in new perspectives. Many panels have “go to” panelists who always respond and, believe it or not, they are fairly easy to recognize in a data set.

    A lot of panels these days are also supplementing with river sample, and unless it is well managed it is problematic. I had a half-million-dollar project blow up over one cell that was clearly bogus and not caught until the project was finished. The spammers got one survey through and then repeated the same answers from various IP addresses based in China. It was a US-only study, and it was a low-incidence quadrant they filled in three days. The aggregated response varied by more than 60% from all other quadrants. Nobody noticed. It is hard to forget something like that – ever.

    The good news for panel companies is that clients are using panels differently now, more for either census sampling, which is less stringent and more directional, or for very specific things like crowdsourcing among validated purchasers, so the recruits are easier. The bad news is that those projects are a lot less profitable for a panel company, hence all the migration toward pseudo full service.

    I admire your dedication and defense of your business and believe you genuinely care about the results. I hope the rest of your organization has the same passion. While you can’t solve the problems of an industry, you can show how you are different by offering your clients quick pretests to validate qualifications and by offering customized services in addition to providing specs on qualifying panelists. You’d be surprised how many companies don’t offer those services. Secondly, I would consider working with a social media company that can create profiles that extend beyond demos and show influence, if you don’t already.

    There is always a place for companies who are passionate about their services and there will always be clients willing to pay extra for quality. The real key is understanding that quality trumps quantity, even in the panel business.

  2. This is a valid post, indeed an insightful one, in the majestic history of the MEA CULPA tradition of market research.

    Alas, we market researchers have only ourselves to blame, and how well we know it. We want to defy the laws of economics and have our ‘raw material,’ the respondents, cost nothing. And so it is a race to the bottom. But crying about it makes us feel good. Sort of a confession, allowing us to return to sinning, which is fun in the world of ethics and leads to increased profits in the world of business.

    Twenty-eight years ago I was giving a presentation at one of the HBA companies. I mentioned that I ordinarily ‘pay respondents’ to participate in an extended interview lasting an hour or two. I did this for concepts and products. I cannot say much about the quality of the data, other than that it seemed to be better. Respondents were more careful with their responses, which may or may not generate better results.

    There is more to the story. One particularly hyper person in the meeting, J, jumped up, full of righteous fury, and stated that ‘paying the respondent was destroying the business.’ Essentially, J was stating that we have a G-d given right to zero cost of goods in our production work, and that the only cost should be the researcher’s efforts.

    Well, J was right. We try to have zero cost of goods. And zero quality of our work product. But there is a silver lining. We simply appeal to increasingly sophisticated analyses, which we believe can wreak the necessary miracle, transforming dross into gold, bored respondent data into insights.

    We have met the enemy. The enemy is everyone else but us. We are the alchemists.

  3. Thanks for this, Adriana! I began to notice a serious decline in data quality about 10 years ago, so much so that in extreme cases even basic analytics was meaningless. I’ve long felt that there are big opportunities for quality panels. Doing things on the cheap doesn’t pay – it costs. And, my goodness, we’ve got to clean up our own act and raise our questionnaire design standards.

  4. @Ellen, thanks very much for your comments and feedback. Much appreciated!

    Sorry to hear about the US/China study episode. Unfortunately, that is a sad reality, and many researchers still pay for unreliable data without even noticing the possible fraud. My recommendation is that it’s worth paying a little extra for quality and, as previously mentioned, valuing the real people who will be on the other side participating in your survey.

    I would also recommend that anyone reading this pick up Pink’s book “Drive: The Surprising Truth About What Motivates Us” and reflect on what will keep the new generations participating in market research studies. Here, our entire team is passionate and very committed to our values and purpose. Sometimes it’s frustrating because we don’t control the entire value chain, but I believe we’re doing our part by supporting our clients and trying to create something that we believe can help fix this situation. I’ll share some ideas on that in the 2nd part of this article. 🙂

  5. @Kevin, thanks for your comments. I agree with you that there are many opportunities for quality panels, and I’m sure the industry will end up realizing that.

    Yes, please, let’s launch a campaign to raise questionnaire design standards! Good panel companies and people responding to surveys will be very thankful! 😉

  6. @Howard, so honored by your feedback! Thanks for sharing the story about the HBA companies. As alchemists, we can still change the formula and fix the situation. I’m sure there is a brighter future for online market research.

  7. Thank you for your thoughtful post, Adriana. I wanted to weigh in with my thoughts on the state of the panel industry. Today’s poor panel quality is primarily the result of three levels of mismanagement: (i) too many layers built on top of a system that lacks any standardization around respondent profile management, (ii) an inadequate respondent rewards structure, and (iii) a lack of truly mobile-compatible survey templates.

    Let’s start by looking at a survey from its creation and follow it along its path to obtaining its required number of responses. A survey customer visits one of many survey template providers to build a survey. Upon completion, their survey is ported into an exchange, which manages relationships with many panel companies, each of which in turn provides access to a pool of respondents. The exchanges effectively manage the supply of surveys that are fed to any given panel. In order to maximize their supply, most panels integrate with multiple exchanges.

    Each exchange has its own set of profile questions that it uses to qualify a respondent for a survey. Now, a survey will be ported into an exchange to get responses from several panel companies; however, depending on the level of targeting and the resulting incidence rate, an exchange may not be able to obtain enough respondents. It in turn purchases responses from another exchange. The problem here is that these exchanges are not working off a uniform set of profile questions for a respondent. The result is that Exchange B may be presenting a respondent with a survey it obtained from Exchange A. The lack of compatibility results in a breakdown in the ability to effectively target a respondent. So how do the exchanges handle this? Well, rather than only inviting respondents to surveys they are truly qualified for, they let a respondent begin a survey only to disqualify them along the path to completion (see the sketch following this comment). What you find is that oftentimes a respondent will start a survey only to be disqualified two out of every three attempts. Now this certainly frustrates a respondent. How would you feel if you repeatedly invested 5+ minutes answering a survey only to be kicked out and handed no reward? This encourages users to speed through surveys given their frustration with the process.

    A byproduct of having so many layers involved (template provider > exchange A > exchange B > panels > respondents) is that most respondents are not being well compensated for their time; too many middlemen are taking a cut of the revenue. A survey that originates at $4.00/response may compensate the respondent less than $0.50.

    Another concern from the respondent’s perspective is the quality of the templates being provided to them and the length of the surveys. Given that the exchanges source their surveys from so many different template providers, there is no standardization in the user interface presented to the respondent. Even worse, most surveys are deemed “mobile compatible” when they are nothing of the sort. Not only are the templates painful to complete on mobile, the length of the surveys themselves makes the process unbearable.

    So what is the end result of all of this? Respondents are effectively commoditized and abused. Panels lure them in with misleading rewards structures. Most respondents try a handful of surveys before giving up. A continual churn of respondents means only the most desperate ones fight through in an attempt to earn a reward that equates to a fraction of the minimum wage for their time.
    How do we fix this? If we want quality data, we need to respect the respondent by catering to their user experience and adequately compensating them for their time. Quality in = quality out.

    Disclaimer: I am the founder of http://www.centiment.co. We act as both a survey originator and a panel company. We function as a fundraising platform where parents answer surveys to generate proceeds for their child’s school. By owning our own survey customer base, we control the quality of our templates, effectively target users without disqualifications, and can compensate our respondents adequately for their time.
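
To make the targeting mismatch concrete, here is a minimal sketch of the pre-qualification idea described in the comment above: invite a respondent only when the stored profile already satisfies the survey’s targeting, rather than letting them start a survey and disqualifying them partway through. The profile keys, survey spec, and respondent records below are hypothetical illustrations, not any exchange’s actual data model.

```python
# Minimal sketch of pre-qualification: invite a respondent only when the
# stored profile already satisfies the survey's targeting criteria,
# instead of disqualifying them partway through the survey.
# Field names (country, age_range, owns_car) are hypothetical.

SURVEY_TARGETING = {
    "country": {"US"},
    "age_range": {"25-34", "35-44"},
    "owns_car": {True},
}

def is_prequalified(profile, targeting):
    """Return True only if every targeted attribute is known and matches."""
    for attribute, allowed_values in targeting.items():
        if attribute not in profile:
            # Missing profile data: do NOT invite-and-screen-out-later.
            return False
        if profile[attribute] not in allowed_values:
            return False
    return True

respondents = [
    {"id": "r1", "country": "US", "age_range": "25-34", "owns_car": True},
    {"id": "r2", "country": "US", "age_range": "18-24", "owns_car": True},
    {"id": "r3", "country": "CA", "age_range": "35-44"},  # owns_car unknown
]

invited = [r["id"] for r in respondents if is_prequalified(r, SURVEY_TARGETING)]
print(invited)  # ['r1'] -- only r1 is invited; nobody is screened out mid-survey
```

In this setup, a respondent with an unknown or mismatched attribute simply never receives the invitation, which is the behavior the comment argues for; the trade-off is that panels and exchanges need richer, better-maintained profiles up front.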

  8. Thanks a lot for yet another fine piece.
    The online panel industry is indeed on its knees.
    @Kurt, I think you have also summed up the thorniest of issues. With so many players in the food chain, one has inadequate control of the sample supply. The links in the food chain are so crossed that it is hard even to validate data. E.g., if you have yet to invest in advanced tech that allows you to screen IP and MAC addresses, you may end up in a situation where provider A supplies you x sample while providers B, C, D, etc. also buy the same sample to satisfy your supply; since they are in a rush to convert more business, they do not detect that they are supplying the same project to different firms. Since zealous panelists will always take a survey opportunity, the result is that you end up with 100+ homogeneous, bogus records where you would otherwise have had only strays (see the sketch following this comment).
    On the other hand, if you have the latest survey applications that give only one-time access to each IP or MAC address, the same panelists will still be invited more than once for supposedly “different” surveys, the later of which they will definitely be screened out of. Not only does this decrease the overall IR and increase the fielding cost, it greatly demoralizes the respondents.
    I run one of the few budding panels in Africa, so once in a while I stop at the Panel Manager’s desk. Seven out of ten panelist inquiries/complaints are about late screen-outs. Since this is a relatively new concept to these people, a newcomer panelist who gets screened out 20 minutes into a survey will dismiss the entire panel as just another Nigerian 419 (advance-fee scam).
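
As a rough illustration of the kind of cross-provider screening described in the comment above, the sketch below flags completes that share an IP address or an identical answer pattern, whichever provider supplied them. The record layout (provider, ip, answers) is a hypothetical example rather than any real panel’s schema; production systems would add device fingerprinting, geo checks, and fuzzier pattern matching.

```python
# Rough sketch of cross-provider duplicate screening: flag completes that
# share an IP address or an identical answer pattern, regardless of which
# sample provider supplied them. Record layout is a hypothetical example.

from collections import defaultdict

completes = [
    {"id": "c1", "provider": "A", "ip": "203.0.113.7",  "answers": (3, 1, 4, 2)},
    {"id": "c2", "provider": "B", "ip": "203.0.113.7",  "answers": (3, 1, 4, 2)},  # same device, resold
    {"id": "c3", "provider": "C", "ip": "198.51.100.9", "answers": (3, 1, 4, 2)},  # new IP, same pattern
    {"id": "c4", "provider": "A", "ip": "192.0.2.44",   "answers": (2, 4, 1, 5)},
]

def flag_duplicates(records):
    """Return the ids of completes that collide on IP or on answer pattern."""
    by_ip = defaultdict(list)
    by_pattern = defaultdict(list)
    for rec in records:
        by_ip[rec["ip"]].append(rec["id"])
        by_pattern[rec["answers"]].append(rec["id"])
    flagged = set()
    for group in list(by_ip.values()) + list(by_pattern.values()):
        if len(group) > 1:  # more than one complete from the same IP / pattern
            flagged.update(group)
    return flagged

print(sorted(flag_duplicates(completes)))  # ['c1', 'c2', 'c3']
```

Running the check across all suppliers at once is the point: each provider on its own saw only one complete and had nothing to flag.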

  9. Adriana: Thank you for a thoughtful — and thought-provoking column. In addition to the market research industry itself, part of the problem lies with some researchers who think that it is okay to torture survey participants with poorly designed, extremely lengthy surveys and then expect them to complete these in a highly involved manner just because they are being compensated for their time. This goes against the very principle that survey participants are often motivated not only by the rewards they receive but also by a desire to shape the marketplace of goods and services.

  10. I’m late to the party here… great thread. We started providing panel for online research over 20 years ago. To the best of my knowledge, we were the first to develop an online panel for market research purposes. And yes, how times have changed. In 1995, the people who took these online surveys were completely engaged. There was very little to do online and this was a cool and totally new thing. Bad data was the result of the skew… the online population was high income/high education. Not general population. Fast forward 21 years and there seems to be more time spent online than off.

    And reflecting on the state of panels and their use in research, I share Adriana’s frustrations. The research industry has to participate in the blame for poor quality data. You can’t support “cheap” sample and expect the results to be quality. You can’t run a respondent through an aggregator site and let them route through multiple surveys in one go. You can’t pay in micro-payments and expect to keep people engaged. And you can’t have table upon table of 20 rows/columns of questions and then be surprised people straight line.

    We need to respect the respondent and respect the respondent experience. Thoughtful responses and respondents are of value and need to be treated as such. This shouldn’t be a commodity business. The burden of respondent quality, and hence data quality, is on the panel companies – and we fight tooth and nail to keep things clean. We (panel companies) need to subscribe to services to stay white-listed with ISPs, and we need to subscribe to services to validate our members. We need to geo-track and address-match. We do all of this and still there are cheats. But we can remove the cheats. What we can’t do is enforce engagement. And to this end, we need the help of those producing the surveys: the people writing and designing the research.

    We need to work together. Because this is a great methodology – preferred by those taking the surveys. Phone and mail are out. This internet thing is here to stay.
