Ever Thought of Researching Ethnic Minorities Online?

A treasure trove of insights awaits those who have the confidence to challenge mistaken assumptions and to give a proper voice to ethnic minorities.

By Dr. Marie-Claude Gervais

I simply cannot recall the number of times clients have ruled out conducting online research with people from ethnic minority backgrounds on the grounds that, somehow, ‘ethnic minorities are not online’. And since face-to-face research is too expensive, clients quickly give up altogether on the idea of conducting research specifically with ethnic minority customers. Sadly, this means that a whole segment of the population – one whose needs and experiences may be very specific – ends up excluded from research, and companies and service providers fail to reach and engage this lucrative group of consumers.

It is puzzling that assumptions about digital non-participation should persist, especially when robust evidence shows that, in the UK, people from ethnic minority backgrounds are more likely to have broadband, to own a smartphone, to be active online and to have positive attitudes towards new technologies. Surprised? Well, this is true of all the main ethnic minority communities in the UK, whether they are of Indian, Pakistani, Bangladeshi, Black African or Black Caribbean descent, or are more recent migrants from Eastern Europe.

This evidence is not new either. An authoritative report produced by OFCOM in 2013 showed that internet penetration is deeper within these groups, with considerably more people from ethnic minority backgrounds owning a broadband connection. This is often because possessing a good internet service is essential to maintaining family ties and connecting with people ‘back home’ (on Facebook or Skype) in order to get international news and remain linked to their culture and society. This early internet adoption has increased their technological confidence; usage is more frequent, innovative and complex, as indicated by the fact that people from ethnic minority backgrounds are more likely to connect to WiFi hotspots. Importantly, ethnic minority people are also significantly more influenced by comments and reviews posted online than White British respondents. I know from my own research that people from all ethnic minority backgrounds place a greater importance on community networks and word of mouth, as a source of social and cultural capital. The fact that minority experiences are often marginalised, misrepresented or ignored in the media and elsewhere means that people develop the habit of turning to their network of trusted family and friends for advice, to share experiences and to access a perspective on the world (e.g. events, politics, brands and products) with which they can identify, and to which they can relate.

A similar picture emerges with respect to usage of mobile phones. Mobile phone ownership is near-universal, with virtually every ethnic minority household having at least one mobile phone and fewer people relying on a fixed landline. And while that data is based on households rather than individuals, we also know that a greater proportion of ethnic minority individuals own mobile phones.

Looking at the attitudes which people from ethnic minority backgrounds have towards the technology that enables and enriches online research sheds no further light on why they are currently often excluded from digital research. Indeed, all ethnic minority groups report being ‘less confused by computers’, ‘loving gadgets’ more and generally being keener to welcome ‘the latest technology’.

With all this in mind, why would we not seek to engage with these connected, forward-thinking, technology-loving consumers online? The answer is more likely to be found in the minds of research buyers than in the attitudes and lifestyles of the consumers. A treasure trove of insights awaits those who have the confidence to challenge mistaken assumptions and to give a proper voice to these consumers. Online research communities enable qualitative researchers to work with people from ethnic minority backgrounds to explore their world, wherever they are (not just in London, Birmingham or Bradford!), in a cost-effective and meaningful way.

I certainly can recall the number of times that clients, who have used online research communities with ethnic minority consumers, have expressed their delight at the outcomes of these projects. These experiences not only had a positive impact on their business but, perhaps more importantly, also on their mindset.


Five Challenges to Overcome for Successful Market Research Online Communities

As with every methodology, there are some potential challenges to overcome with online communities to ensure you generate strong results.

By Lisa Boughton

Market research online communities are a popular, fast growing research method with numerous benefits for researchers and participants alike. When well-managed, MROCs can deliver unbeatable insights instantly, giving researchers access to participants’ thoughts that they just didn’t have before. Not only that, but because they take place online in a safe and secure setting, participants feel comfortable voicing their opinions and can do so anytime, anywhere – meaning that MROCs easily fit in around respondents’ busy lifestyles. When you add in the fact that market research online communities can often be quicker and more cost efficient than traditional methods, it’s clear to see why they are such a hit in the market research world. However, as with every methodology, there are some potential challenges to overcome to ensure you generate the results you deserve…

  1. Eliminating recruitment risks and the danger of dropouts

If you have certain quotas and specific segments you need to adhere to, it can be a struggle to find respondents – and when you do find them, you’ll want to do everything you can to keep them! There are several strategies you can enlist during recruitment to ensure that your MROC runs smoothly and your respondents stay engaged. Firstly, make sure you let them know exactly what is expected of them so there are no surprises. Do they need to dedicate 30 minutes a day for a month? Or is your project more ad-hoc and flexible? Either way, outlining the requirements from the beginning in a clear document will make a dramatic difference. Additionally, we always recommend an over-recruit of 20-30% so if the worst does happen and you do have dropouts on the day, you’ll be prepared.
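If you want to build that over-recruit cushion into your planning spreadsheets or recruitment tools, the arithmetic is simple to script. A minimal sketch (the function name and default buffer are illustrative, not a standard formula):

```python
import math

def over_recruit(target_n: int, buffer: float = 0.25) -> int:
    """Number of respondents to recruit so that, even after dropouts,
    roughly target_n remain. `buffer` is the dropout cushion
    (0.20-0.30 per the recommendation above)."""
    return math.ceil(target_n * (1 + buffer))

# A 20-person community with a 20% and a 30% cushion:
print(over_recruit(20, 0.20))  # -> 24
print(over_recruit(20, 0.30))  # -> 26
```

Rounding up with `math.ceil` errs on the side of a spare respondent rather than one too few.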

  2. Ensuring your onboarding goes smoothly

Not having a sufficient onboarding process can impact the whole study. From problems logging onto the platform to being asked to carry out tasks they weren’t expecting to complete, not having an onboarding strategy can result in disgruntled respondents dropping out. Don’t worry, though – there are a few things you can do to prevent this happening, such as allowing plenty of time for onboarding before the community starts (up to 48 hours where possible) and including an ice-breaker to help respondents feel more comfortable with each other – which in turn leads to more honest and insightful responses in the tasks.

  3. Encouraging participation and boosting engagement

It’s important that your participants feel valued throughout the project to keep them engaged, and offering bonus incentives in addition to the initial incentive received for taking part in the community can be a fantastic way to do this. You can also enlist a Community Manager to deal with any concerns and answer any questions to help boost engagement. It’s important to keep in contact with your respondents, so think about how long your community will last and plan how often you will follow up with your participants – if your research community is lasting a week, for example, you might want to reach out to participants every day. A phone call is generally the best way to do this as it means any issues can be addressed immediately. You should also think about how you will remind participants to complete their tasks – a gentle nudge via text message can work well to remind them of upcoming deadlines.

  4. Overcoming tricky technology issues

One of the biggest challenges to overcome is choosing the right software, with problems such as a lack of support or software that is difficult to use resulting in dropouts. That’s why it’s important to ask yourself exactly what you need from the research and tailor your software accordingly. It’s also worth considering using an app to make life easy for your participants, enabling them to access the community and complete tasks on the go. Before the community starts, be sure to provide respondents with an information sheet outlining what to do and who to contact, and you should also demo the software as well as testing it on respondents beforehand to iron out any issues. To combat any problems once the community is up and running, make sure there is a contact number for respondents to call both in and outside of office hours, so any issues are dealt with swiftly. Find out more about choosing the right software here.

  5. Increasing engagement through interesting tasks

Another issue with market research online communities is ensuring you choose the right tasks. By creating a variety of different tasks, you can keep the project exciting and your respondents engaged. From blogging and discussion tasks to idea generation and picture books, making sure you offer fun, frequent and flexible activities that appeal to your participants can ensure your respondents are keen to get involved and share their opinions with you. Have a look here for some inspiration on how to choose exciting tasks to keep your community engaged.

The High Cost of Cheap Sample

The fact is, bad or "cheap" sample can give you information that is dead wrong.

By Rob Berger

“At least it will give us directional information. I mean, how wrong can it be?” I’ve heard those remarks and many like them bantered about when people rationalize using cheap sample sources.

The fact is, bad or “cheap” sample can give you information that is dead wrong. So wrong that the information it provides is directional—it’s just pointing in the wrong direction.

We’re into the third year of a study where we test the reliability and validity of a popular and well known consumer survey service. We compare the findings of that research with the same data tracked by the Pew Research Center, who are well known for their methodological rigor. The question we are tracking is about the use of social media sites and apps.

The consumer survey results and the Pew Research Center findings could hardly be more different. For starters, the consumer survey suggests social media usage is half the level that Pew and many other sources report.

The consumer survey source says, for example, that 39% of online Americans are using Facebook. Pew puts that number at 79%. For other social media sites the differences are equally stark. Furthermore the consumer survey data would suggest great volatility in the use of social media—with vast surges and declines. Pew shows a slow steady growth.

Why this discrepancy? We believe a lot of it has to do with why people are completing the survey. The consumer survey source obtains respondents by working with publishers to intercept people who are seeking to access “premium content” on their sites. Potential respondents are asked to answer a few questions in order to get access to content. In a whitepaper on this topic we consider a host of potential reasons, but the one that seems to be the most important is respondent motivation.

These people are not taking the survey because they want to. They are doing it to get to their desired content. They don’t have any stake in how they answer—in fact, the question is just a nuisance. When you treat people like that, it is no surprise that the data they give you may be inaccurate. Why would they bother? After all, they have just been frustrated in their pursuit of something they want.

That’s why we take respondent engagement so seriously. Whether it be on our Market Communities – recruited to be representative of the US and Canadian populations – or our clients’ specially recruited Insight Communities, we take care to ensure respondents know that their opinion is valued.

We’ve researched why people respond, and we work hard to let respondents know their opinion is important to us and that their input makes a difference. We strive to provide them with feedback on what we’re learning and we try to expose them to interesting new things.

When we invite people to join our communities we mean community in the fullest sense of the word. That’s an important part of what allows us to collect accurate and consistent information to inform our clients’ decision making.

People conduct research to help them make better decisions. When the research is wrong, they make bad, even terrible, decisions. In those cases the data is hurting rather than helping them. That makes cheap sample very expensive.

To learn more, download our whitepaper The High Cost of Cheap Sample: Evaluating the Reliability and Validity of a Publisher-driven Online Sample Source.

The NPS Recoding Trick: The Smart Way to Compute the Net Promoter Score

The Net Promoter Score is most people’s go-to measure for evaluating companies, brands, and business units. However, the standard way of computing the NPS is a bit of a pain.

By Tim Bock 

The Net Promoter Score is most people’s go-to measure for evaluating companies, brands, and business units. However, the standard way of computing the NPS – subtract the percentage of detractors from the percentage of promoters – is a bit of a pain. And, in most apps, you cannot use it in stat tests, so you are never really sure when the NPS moves whether it reflects a change in performance, or just some random noise in the data.

The standard way of computing the NPS


The table to the right shows data for Apple. In this sample, 14.4% said they were Not at all likely (0 out of 10) to recommend Apple, 2.6% gave a rating of 1 out of 10, 2.0% gave a rating of 2, etc. If we add up the first seven categories (0 through 6), 51% of people in this data set are Detractors. Adding up categories 7 and 8 gives us 31% Neutrals, and the remaining 18% are Promoters. So, in this data set, Apple’s NPS is -33.3, which is not great. (Among Apple customers, the NPS is much higher.)

A smarter way

The table below shows the raw data for the 153 people in the data set. The actual ratings, on the 0-10 scale, are shown in the Recommend: Apple column. The second column shows the recoded data, where the original values are replaced by new values. The trick is to replace values of 0 through 6 with -100, values of 7 or 8 with 0, and values of 9 or 10 with 100.

Why is this recoding clever? Once you have recoded the data this way, you can compute the NPS by computing the average, and you get exactly the same answer as you do when using the standard way.
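A quick way to convince yourself of the equivalence is to compute the NPS both ways on the same data. The sketch below uses a handful of made-up ratings rather than the Apple data set:

```python
def nps_standard(ratings):
    """Standard NPS: % promoters (9-10) minus % detractors (0-6)."""
    n = len(ratings)
    promoters = sum(r >= 9 for r in ratings) / n
    detractors = sum(r <= 6 for r in ratings) / n
    return 100 * (promoters - detractors)

def recode(r):
    """The recoding trick: 0-6 -> -100, 7-8 -> 0, 9-10 -> 100."""
    return -100 if r <= 6 else (0 if r <= 8 else 100)

def nps_recoded(ratings):
    """NPS as a plain average of the recoded values."""
    recoded = [recode(r) for r in ratings]
    return sum(recoded) / len(recoded)

ratings = [0, 3, 6, 7, 8, 9, 10, 10, 5, 2]  # hypothetical 0-10 ratings
assert abs(nps_standard(ratings) - nps_recoded(ratings)) < 1e-9
print(nps_recoded(ratings))  # -> -20.0
```

Five detractors, two neutrals, and three promoters give (30% − 50%) = −20 either way.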

Why the smarter way is so much smarter

The genius of the smarter way is that we can now compute NPS using any analysis app that is able to compute an average. For example, I have used the multiway table feature in Displayr to compute the NPS for Apple by age and gender, by just selecting the three variables (age, gender, and the recoded NPS variable).

The multiway table is created using Insert > More > Tables > Multiway Table.
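The same trick works outside Displayr in any tool that can group and average. For instance, a pandas sketch (the column names and data here are hypothetical, purely for illustration):

```python
import pandas as pd

# Hypothetical survey responses; column names are illustrative.
df = pd.DataFrame({
    "age":       ["18-34", "18-34", "35-54", "35-54", "55+", "55+"],
    "gender":    ["F", "M", "F", "M", "F", "M"],
    "recommend": [9, 6, 10, 8, 3, 7],   # 0-10 likelihood to recommend
})

# Recode 0-6 -> -100, 7-8 -> 0, 9-10 -> 100; NPS is then a plain mean,
# so any group-by/aggregate tool can produce NPS by segment.
df["nps_recoded"] = df["recommend"].apply(
    lambda r: -100 if r <= 6 else (0 if r <= 8 else 100))

print(df.groupby("age")["nps_recoded"].mean())
```

The grouped mean is the NPS for each segment, ready for standard significance tests on means.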

Doing this with your own data using Displayr

The fastest way to do this is to start using Displayr, and then:

  1. Import a data set: Home > Data Set (Data). If you want to play around with a live example where the data is already in Displayr, click here.
  2. Drag the variable containing the likelihood to recommend data from the Data tree onto the page, so that it creates a table (like the first table in this post)
  3. Select the table, and select Data Manipulation > Utilities (far right) > Compute > Net Promoter Score. This will add the NPS to the bottom of the table.
  4. (Optional) Select the variable (in the Data tree, bottom-left), and change the Structure to Numeric (in the Object Inspector on the right). This will mean that you only ever see the NPS, rather than seeing both the NPS and the percentages in each category.

Alternatively, if you want to do the calculations “by hand”:

  1. Import a data set: Home > Data Set (Data). If you want to play around with a live example where the data is already in Displayr, click here.
  2. Drag the variable containing the likelihood to recommend data from the Data tree onto the page, so that it creates a table (like the first table in this post).
  3. Select the variable in the Data tree.
  4. Select Data Manipulation > Values (Data Values). 
  5. Change the entries in the Value column so that they look like those to the right, and press OK.
  6. Change the Structure to Numeric (in the Object Inspector on the right). This will cause the table to show the Average, which gives us the NPS.


Jeffrey Henning’s #MRX Top 10: GRIT, SMEs, and Other Research Stories

Of the 3,001 unique links shared on the Twitter #MRX hashtag over the past two weeks, here are 10 of the most retweeted...

By Jeffrey Henning

Of the 3,001 unique links shared on the Twitter #MRX hashtag over the past two weeks, here are 10 of the most retweeted…

  1. Participate in the GRIT Study – A call to market research professionals to take the twice-annual GreenBook Research Industry Trends survey.
  2. The Tragic Tale of Research Participants – Melanie Courtright of Research Now discussed past GRIT research that found only 15% of surveys are optimized for mobile. Worse, under 10% of researchers considered it to be of significant importance for research participants to have a positive impression of market research after participating, and even fewer felt it was important for participants to speak highly of their research experience. As Melanie writes, “And we wonder why respondent rates are falling?”
  3. Pepsi’s Ad Failure Shows the Importance of Diversity and Market Research – Did you know the Kendall Jenner ad was produced by an in-house creative team rather than an outside agency? Interviewed by Marketing Week, Andy Nairn of agency Lucky General commented, “If you run an in-house creative department like Pepsi does you need to really interrogate your own approach and make sure you’re not blinkered by your own …”
  4. MRS Introduces Market Research Training Simulation – The Market Research Society has introduced a simulator where students design a research program to support a brand launch, using qual and quant techniques across six …
  5. The State of the SME Nation – Microsoft sponsored a pan-European study of 13,000 SMEs (Small and Medium-sized Enterprises): “69% of SMEs want to be known for the quality of the service they provide, but despite this, interactions with their customers rely on low-tech solutions, such as face-to-face interactions (26%), or phone …”
  6. ComRes Launches Brexit Unit – The agency ComRes has created a new unit that will aggregate public research about the impact of Brexit, supplemented by its own proprietary …
  7. Analysis Shows Long-Term Effect of Newsbrands – Research Live discussed Peter Field’s research into advertising with newspapers (online and offline): “Campaigns using newsbrands were 43% more likely to deliver very large market share growth, 36% more likely to deliver profit, and 85% more likely to drive customer …”
  8. Better Marketing for Market Researchers: Lessons from Insights Marketing Day – Stefanie Mackenzie recaps presentations from Dan Brotzel of Sticky Content, “7 Ways to Nudge Your Company Towards More Conversions”, and Tom Ewing of System 1, “The Cat’s Lifejacket: Thought Leadership in a Thoughtless …”
  9. What’s the Story with Market Research? Q&A with Author and Leading Business Storytelling Expert Paul Smith – Brian Izenson of Dialsmith interviewed Paul Smith, who provided two useful templates for research storytelling, the “How We Got Here” story and the “Discovery Journey” story – “Explain in story form: context, challenge, conflict, …”
  10. Time and Convenience Biggest Opportunities for Engaging Post-Millennials – IGD ShopperVista surveyed 500 British 18-25 year olds. The top 5 time-savers for grocery are: “self checkouts (77%); buying prepared food in jars, tins, packets or cartons (69%); spending less time cooking (68%); shopping in smaller convenience stores (67%); and buying food-to-go (64%).”

Note: This list is ordered by the relative measure of each link’s influence in the first week it debuted in the weekly Top 5. A link’s influence is a tally of the influence of each Twitter user who shared the link and tagged it #MRX, ignoring retweets from closely related accounts. The following links are excluded: links promoting RTs for prizes, links promoting events in the next week, and links outside of the research industry (sorry, Bollywood).

Food for Thought: Challenges and Ideas for Addressing Bias

If given the right tools, bias can be overcome.

By Katja Cahoon

In a previous article, I outlined a topic that is rarely talked about. Bias impacts all of us – even, and sometimes especially, experts – as Kahneman and others detail in their writing. I promised ideas for how to address it. Daniel Kahneman himself states that it is very hard, if not impossible, to change bias and cognitive errors[1], and I am humble enough not to think that I can outdo the Master! Research has even shown that certain kinds of bias training do not lead to noticeable change and can have the opposite effect.

On the other hand, I believe in our innate ability to change if given the right tools. Innovation guru Stephen Shapiro states: “Branch out and start studying tangents. I learn about innovation by studying magic.” I am not going to draw on magic but on my training and experience as a psychotherapist as well as having worked with dozens of wonderful, smart, inspiring insights and brand teams all over the world. As a psychotherapist, I am acutely aware of the amazing human ability to change unhelpful thought patterns and behaviors, in some cases quite fundamentally and profoundly.

Specifically, I am going to use the 12 Steps of Alcoholics Anonymous. Bear with me, please! They are actually very applicable as they deal with recognizing maladaptive patterns of thought and behavior, working to change those, and finally integrating new ways of thinking into one’s daily routine. The 12 Steps break down neatly into three distinct categories, which I will translate here into a business context.

Market research and marketing is of course not as dangerous or even lethal as addiction, but mistakes can certainly be costly (as in losing an election or expensive product or communication missteps). So, let’s look at the steps, how they create a structure for change, and end with some practical and inspiring ideas:

1) In AA, steps 1 to 3 are called the surrender steps. This is where a person realizes and accepts that they have a problem. We as an industry know that consumers are biased (it is pretty well established at this point). Most of us know that this applies to all human beings, marketers included. But, have we truly accepted it?

Knowing is different from accepting. The former happens at a cognitive, detached level. It is not that hard to say, “of course we are all biased.” It is even easier to point the finger at bias and we tend to forget that in those cases three fingers are pointing back at us. Acceptance happens at a deeper, more emotional level. It is unpleasant to realize that I have been biased, as I also discuss in my previous article. And more importantly, acceptance does not just theoretically call for some type of action but rather makes that action necessary. Acceptance requires more work, reflection, discussion, and practice.

  • The fundamental question is, what would it take to move you and/or your organization – meaning the people within it – from knowledge to acceptance and consequently action?
  • One starting point is to challenge yourself and encourage others to do so. Harvard’s Implicit Bias project offers a variety of well-designed tests that are pretty quick to take. They are eye-opening with regard to well-established cultural and societal biases (gender, race, ageism, etc.) – you will be surprised one way or another.
  • Another one is the BigThink article testing how rational you are – as with the IAT, these are quick, fun, and eye-opening.

Core Task: Move from knowledge to acceptance.

2) Steps 4 to 9 are called the working or action steps. These are designed to “clean house” and change patterns of behavior. If I have accepted that I am biased, I am open to a “Kaizen” approach about it: continuous learning and improvement, resulting, over time, in different associations, attitudes, and behaviors.

  • One of the action steps calls for a personal inventory. What would it look like if you took a “bias inventory?” What are the most common biases you, your team, your organization experience? Does it differ by context, e.g., is there a risk of confirmation bias skewing your views when observing a focus group or indeed any research?

It goes beyond the scope of this article to provide a comprehensive overview of common biases and heuristics. The following are important with regard to insights and research: anchoring bias, availability heuristic, bandwagon effect, confirmation bias, stereotyping. Here is a helpful quick list of the biases I mention and a few others.

With regard to the workplace in general it is helpful to understand performance bias, competence/likeability bias, and information bias. Information bias refers to the need to seek more and more data without the information impacting action. It is particularly worthy of attention in the context of market research as it is rampant and costly. Is your company culture one in which you are more likely to run with a small amount of quality research, or problematically, biased and surface level research, or is it a company that tends to postpone and delay by getting ever more research that either goes unused or cripples action?

Core Task: Make a bias inventory and use it for learning and improvement.

3) Steps 10 to 12 are called the maintenance steps. They are designed to integrate new behavior and thought patterns in order to keep the house “clean.” This does not call for dramatic, one-time action but rather continued diligence, awareness, and a practical tool kit. Small changes over time have a significant impact, a point also reiterated by the well-done and publicly accessible Facebook Bias Training.

  • Attune yourself to bias before important insight or brainstorming meetings. My former co-worker, Olson Zaltman Account Director Jessica Kukreti, starts many of her client meetings with a quick exercise around common biases among research professionals. She thereby integrates bias language, thinking, and awareness into the discussion, which also gives people permission to call it out.
  • Focus groups are viewed increasingly critically, among other things for their high bias risk. Interestingly, many brainstorming meetings are sort of like focus groups. One simple trick is to write down ideas, opinions, judgements about important questions/topics before the discussion and make sure to hear all voices, especially dissenting ones.
  • Put yourself into consumers’ shoes – literally! One of my clients used to make her team use their product at home, while on the road, and in other situations. That does not sound very impressive until you realize that this involved incontinence products. Yes, the team wore diapers to create more empathy and consumer understanding!
  • And that is my last point: work on seeing consumers as holistic, diverse, complex human beings. As one participant stated during a study on Millennials we conducted a while ago: “I want companies to realize that we are people just like them. Don’t look at me and my friends as a just another profit but people with voices. People with dreams, aspirations and goals. If they can see that then we can create dialogue to better deliver the products and services we need.”

Having true empathy is a way around bias, and a way to build that empathy is to understand consumer complexity and multiplicity. I will write more about this soon.

Core Task: Integrate bias busting processes into your routine.

To wrap it up, I am honored to be speaking about this and related topics at IIEX North America (June 12-14). If you are attending please come and see me: I would love to hear your thoughts, experiences, learning, and stories about this important topic. Let’s work together to tackle bias, one day at a time!

[1] Daniel Kahneman, Thinking, Fast and Slow

Who Are The 50 Most Innovative MR Industry Firms? Take the Latest GRIT Survey & Tell Us!

We’d like to invite you to participate in the Q1-Q2 2017 GreenBook Research Industry Trends (GRIT) Survey which helps us write the GRIT Report.


The GRIT Report was created to help insights professionals like you better understand where the industry is heading so you can make the right decisions for your organization. Click here to check out the most recent GRIT Report.

This edition of the GRIT Report features the always-popular GRIT Top 50 – a tracker of the 50 suppliers and clients perceived to be the industry’s most innovative.

Plus, we’ve updated the survey to include some of the most hot-button topics in our industry today, including:

  • The Role of Sample Providers
  • Sample Quality
  • The Future of Sampling
  • Adoption of Automation Technologies/Strategies
  • Post-Secondary Education Programs in Market Research
  • US MMR Programs Brand Ranking
  • Skillsets for MMR Program Graduates
  • Project Spending/Revenue in 2017
  • Industry Benchmarking:
    • Success Factors Of The “Perfect Study”
    • Importance of Knowledge, Influence & Marketing Impact for Studies
    • Organization Ratings and Benchmarking
    • Organization Technology Strategies
    • Methodology Prioritization Factors
  • Prediction Market Exercise: Emerging Methods Adoption

Will you support our profession by completing the survey? It shouldn’t take you more than 15 minutes.

Take the survey now.

As a thank you, we’ll send a PDF copy of the report straight to your inbox as soon as it hits the (virtual) shelves so you’ll be among the first to see it.

Prefer to take the survey in Chinese, German, Japanese, or Spanish? You’ll be able to select one of those languages at the start of the questionnaire.

Thanks in advance for giving back in support of our profession!

Thanks to Our GRIT Partners

Research Partners
Ascribe, AYTM – Ask Your Target Market, Bakamo Social, Consensus Point, G3 Translate, Gen2 Advisors, Lightspeed, Michigan State University, MROC Japan – Community Solutions Company, mTAB, Multivariate Solutions, NewMR, OfficeReports, Research Now, Researchscape International, Stakeholder Advisory Services, Virtual Incentives

Sample Partners
A.C. Nielsen Center for Marketing Research at The Wisconsin School of Business, AIM, AMAI, American Marketing Association New York, Asia Pacific Research Committee (APRC), ARIA, Australian Market & Social Research Society (AMSRS), BAQMaR, BVA, MRS, Next Gen Market Research (NGMR), OdinText Inc., Provokers, Qualitative Research Consultants Association, The Research Club, The UTA MSMR Alumni Association, University of Georgia | MRII, Women In Research



Seeing is not Believing

In a study, Hotspex discovered that online testing is as effective as laser eye-tracking and, in the process, they uncovered a disruptive truth.

By Gera Nevolovich

Physical eye tracking at shelf, as measured by laser technology, has been the long-standing measure of success in packaging design research. A Hotspex study set out to determine whether our click-based online attention tracking tool is comparable to laser eye tracking. We discovered that online testing is as effective as laser and, in the process, we uncovered a disruptive truth.

But first, some key observations…

1. Online attention tracking and laser eye tracking results are comparable

Across the five categories included in the study, there is a good correlation (.67) between attention as measured by laser eye tracking and clicks, with some variability by category. Both methodologies show that respondents usually begin their exploration by focusing on the center of the shelf. Their attention then gravitates outward to other areas of the shelf.
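As a rough illustration of what such a comparison involves (this is not Hotspex's actual method, and the per-region attention shares below are invented), agreement between the two measures can be quantified with a Pearson correlation across shelf regions:

```python
import numpy as np

# Hypothetical share of attention per shelf region (invented numbers,
# not data from the Hotspex study) for the two methodologies.
laser_attention = np.array([0.30, 0.22, 0.15, 0.12, 0.10, 0.11])
click_attention = np.array([0.28, 0.25, 0.12, 0.14, 0.09, 0.12])

# Pearson correlation between the two attention measures across regions;
# a value near 1 means the methods rank the regions similarly.
r = np.corrcoef(laser_attention, click_attention)[0, 1]
print(f"correlation between laser and click attention: {r:.2f}")
```

With real study data, the same calculation would be run per category, which is where the reported .67 figure and the per-category variability would come from.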

2. Online attention tracking is a better predictor of choice

This validation study included a measure of market share based on virtual shopping: ‘products purchased on shelf’. Online tracking is superior, correlating more closely than laser to products purchased on shelf in 4 out of the 5 categories, and is comparable with laser in the pasta sauce category.

The underlying reason is that the physical act of ‘seeing’ a package on shelf is fundamentally different from the act of ‘clicking’, which involves mental processing of stimuli. This second scenario (clicking) more closely imitates the real-life shopping experience, which involves the same level of mental processing, because it culminates in a choice.

The ‘seeing + thinking’ dynamic intrinsic to online attention tracking is even more important when consumers are faced with highly-cluttered shelves. While shelf clutter and diversification were the most prevalent in the gum category in this study, retail shelves in general are becoming busier, forcing consumers to spend more time ‘processing’ before making a choice. Laser tracks physical eye movements, but the heart and mind do not always follow the eye.

The relevance of these findings is underscored by a consistent pattern in our study: click data correlates more closely with purchased-on-shelf data than laser data does.

3. Disruptive truth: Online attention tracking is a better indicator of market share.

a) What are the implications for online packaging testing?

The purchase ‘sweet spot’ is where a package design achieves high scores on both claimed purchase intent as well as high rates of on-shelf purchase in the virtual shopping exercise. To get into this space, a package needs to be ‘noted’ (attention) and then be ‘processed’ (connection).

Our internal research and development studies suggest that consumers’ emotional relationships with a product account for at least 50% of consumer choice drivers. This underscores the importance of making a connection with the consumer at shelf in order to influence choice.

Online, click-based eye tracking captures both attention and connection: respondents tend to click first on what they like rather than on what they see first, and what they like is what they tend to buy.

b) Online is on-the-money

Technology has leveled the playing field for packaging testing methodologies with interactive technology, graphical power and global scalability. Online packaging design tests can be conducted in multiple countries with hundreds of consumers per design, providing a much more comprehensive evaluation of a design’s performance on-shelf (shelf test) and in isolation (concept test) in a fairly quick survey at a fraction of the price of in-facility testing.

About this study

A total of 5 shelves were tested. The categories tested provided a good mix of package sizes, number of SKUs on shelf, category purchase cycle frequency, and repertoire vs. impulse categories: pain relief, laundry detergent, gum, wood cleaners, and pasta sauce.

Per category, a total of n=75 interviews were conducted using laser eye-tracking (in-facility), and n=200 using click-tracking technology. Respondents were aged 18+ with quotas to ensure age distribution was proportionate to each category. 70% were female and 100% were New Jersey, USA residents for both in-facility (laser) and online interviews. Other qualifiers included: primary grocery shopper or shared responsibility for shopping, recently purchased items from qualifying category, and standard industry exclusions.

Traditional isn’t always better. Measure what matters.

A pack is the essence of the brand – its role is to introduce, communicate, engage, reinforce, and remind, as well as house the product. These many tasks are given to a relatively small space which needs to activate the target emotional states that drive consumer choice and relationship. The best packaging effectively balances these elements.

Do you know whether it is time to change your packaging? If it is, are you leveraging System 1 implicit measurement and aligning it with System 2 explicit measurement to truly understand what to say and convey to drive brand growth?


Voted among the top global insights consultancies for 3 years in a row in the annual GreenBook Research Industry Trends (GRIT) most innovative survey, Hotspex is working with 15 of the Top 20 advertisers in over 30 countries because we leverage the most innovative approaches from behavioral sciences, combining System 1 and System 2 measures to truly understand WHY consumers behave the way they do. We then apply marketing sciences, such as the Laws of Growth, to help you find out HOW to apply your insights in an actionable way to build distinct and coherent brands that accelerate growth.

For more information, contact Jonathan La Greca here.

Growing the Industry by Funding More Research – Part Six

Collaborata is the first cost-sharing research platform, stretching clients’ budgets by saving upwards of 90% on each project. We’ve asked Collaborata to feature projects they are currently funding on a biweekly basis.

By Peter Zollo

Editor’s Note: Welcome to our next post featuring two insights projects currently offered on Collaborata, the market-research marketplace. GreenBook is happy to support a platform whose mission is to get more research funded. We believe in the idea of connecting clients and research providers to co-sponsor projects. We invite you to Collaborate!

Collaborata Featured Project #1:

“Pushing Past Stereotypes: Understanding the Plus-Size Woman and How to Better Connect with Her”

Purpose: Plus-size women are often misunderstood, underserved, and overlooked. Until now. This research will provide a rich understanding of the psyche and behaviors of this critical audience.

Pitch: Women’s clothing brands and retailers have a problem. Some 65% of American women are considered “plus size,” but this category represents only about 18% of apparel sales. Learn how to more resonantly connect to and deliver for this audience.

Deliverables: Detailed report to include qualitative and quantitative findings and an edited consumer video. As a sponsor, you’ll receive a special section of the full report tailored specifically to your brand. Fifty apparel brands and retailers will be measured and evaluated.

Who’s Behind This: Big Squirrel, a boutique research agency with an expertise in branding.

To watch a 60-second video on the study: click here

To purchase this study or for more info: click here or email info@collaborata.com

Collaborata Featured Project #2:

“Politics and Purchase Power: Do People Really Put Their Money Where Their Party Is?”

Purpose: Americans are more aware (and vocal) than ever about companies’ political stances, but little is known about how such perceptions play out at a transactional level. In such a heated political climate, it’s critical for brands to understand how to navigate such potential pitfalls.

This study will gather insights into how, when, and why people’s political beliefs influence their purchase decisions, allowing brands to better understand if and when they are vulnerable, as well as where opportunities may exist.

Pitch: The 2016 election has taken pocketbook politicking to a new level, with anti-Trumpers launching “Grab Your Wallet” to protest such brands as L.L. Bean and New Balance, and the pro-Trump camp boycotting the likes of PepsiCo, Oreos and even the entire state of Hawaii.

This research will uncover how purchases are influenced by beliefs, demographics, and situations.

Deliverables: An insight-rich report that includes the PoliPower Index, which will measure political impact on category and brand purchase volume across political affiliation, demography, and geography.

Who’s Behind This: The Halverson Group, experts at uncovering and quantifying insights into people’s lives and the behaviors, beliefs, attitudes, situations, and cultures that influence their choices through the use of their proprietary Jobs to Be Won™ method.

To watch a 60-second video on the study: click here

To purchase this study or for more info: click here or email info@collaborata.com.

What could $20,000 do for your company? Submit to the Insight Innovation Competition!

The Insight Innovation Competition helps entrepreneurs bring disruptive ideas to life while connecting brands to untapped sources of competitive advantage through deeper insights.

Editor’s note: Submissions and voting are now open for the newest round of the Insight Innovation Competition, to be held at IIeX North America 2017, June 12-14 in Atlanta.

Imagined and organized by GreenBook, and made possible by our sponsors at Kantar and Lowe’s, the Insight Innovation Competition helps entrepreneurs bring disruptive ideas to life while connecting brands to untapped sources of competitive advantage through deeper insights.

This is how it works:

  1. Innovators submit a great idea that will change the future of marketing research and consumer insights.
  2. The market research industry votes on the ideas that have merit.
  3. Five finalists with the most votes (and possibly a couple of wildcard entrants selected by the judging committee) are invited to pitch their ideas to a panel of judges at Insight Innovation eXchange North America 2017.
  4. The winner gets $20,000, mentoring, fame and exposure to potential funding partners.
  5. The industry benefits from a great new solution that improves how companies understand consumers.

The Competition has been a huge success story. It’s a win for everybody: entrepreneurs with great ideas for improving the business of insights, investors looking for new opportunities in the insights space, and the corporate-side end-users of market research who are looking for new solutions.

Past winners have gone on to great success.

Stephen Phillips, CEO of past IIC winner ZappiStore, said winning “not only helped us feel great about what we were doing but also helped us attract both clients and potential investors.”

Similarly, according to David Johnson of Decooda, winning the IIC “helped accelerate the entrance of our company into the marketplace, gave us massive visibility to potential partners and clients, and led very directly to new business.”

The $20,000 cash prize is sponsored by Kantar and Lowe’s, and of course another benefit of participation is the ability to connect with and explore possible relationships with these two (and other) potential partners, including the many IIeX Corporate Partners who will be in attendance at the conference.

Submissions and voting take place on the Insight Innovation Competition website at http://www.iicompetition.org. Both close Friday, May 12th. Please note: the sooner you submit, the more time your entry will have to gather votes.