
The Qual Hackathon – A Chance To Co-Create Cutting Edge Qualitative Solutions

To facilitate innovation in Qualitative Research, IIeX has created the Qual Hackathon, an all day, interactive, co-creation session where participants will be challenged to create new solutions to real business problems. The event is being run as part of IIeX Europe, and will take place 19 February in Amsterdam.


By Ray Poynter 

Many people seem to think innovation starts with tech, but we believe it starts with people, and in particular innovations in qualitative research are all about people, interactions, and imagination. To facilitate this innovation IIeX has created the Qual Hackathon, an all day, interactive, co-creation session where participants will be challenged to create new solutions to real business problems. The event is being run as part of IIeX Europe, and will take place 19 February in Amsterdam.

This highly interactive session will be facilitated by leading qualitative researchers from across Europe, giving participants the chance to learn by doing and creating cutting edge qualitative solutions.


The day will be split into four sections, with each section having a goal:

  • Methods of defining a problem and designing an outline solution.
  • Adding detail to a solution and working out how the data/artefacts/information are going to be collected/acquired.
  • How is the information going to be assessed/analysed/processed? How are the insights going to be created/discovered?
  • How are the insights going to be communicated? Strategies for making the stories come alive.

The results of the hackathon will be posted on the GreenBook website and will be available as ‘idea starters’ for anybody interested – the session is deliberately open-source and collaborative, and the ideas are intended to be shared, borrowed, and improved.

The Qual Hackathon is being run as part of IIeX Amsterdam. You can come along just for the Hackathon, make it part of your two days at IIeX, or come for part of the day and spend the rest of your time in the other IIeX streams.

To book a place or for more information visit http://iiex-eu.insightinnovation.org/


2015 Insights-Driven eCommerce Resolutions

Customers are the reason for ecommerce success. And today, we have access to the tools that connect our businesses directly to those customers’ insights. So get the most out of 2015 and the research technology you have at your fingertips.



By Malcolm Stewart

The ecommerce landscape is vastly different today than 10 years ago. As technology and communications advance, ecommerce experiences need to keep up, and it can be a challenge to pinpoint exactly what steps will keep you ahead of the game.

With the new year almost upon us, now is a great time to identify some resolutions for propelling your ecommerce strategy. To do that, ecommerce leaders need to be people-driven, not just number-driven. On that note, we’ve come up with three ways to help you build ecommerce experiences that are centered on the customer, not the quantitative data.

1. Invest in Technology That Drives Real Customer Insights

Consider the prediction that CMOs will be spending more on IT than CIOs by 2017. Sadly, much of this investment is not efficient. Making smarter technological investments in data and research can mean wiser ecommerce strategies.

It oftentimes feels like there’s a new “big data” analytics tool on the market every month. Before you know it, ecommerce leaders are managing half a dozen different tools – dashboard after dashboard, graph after graph. Ultimately, this data tells you the same information, just from different points along your customer’s journey.

We’re setting ourselves up to fail when we add more data, not better data.

The addition of qualitative research data is a smart investment when so much is on the line. Qualitative research can help you understand the “why” behind your KPIs, instead of just adding a dozen more KPIs to track.

2. Treat Customers as People, Not Data Points

That’s right: Stop treating customers as data points and start treating them as people. With the plethora of big data, it’s understandable that ecommerce leaders inadvertently start shaping images of a customer based on numbers rather than human feedback, but the habit is a dangerous one.

In order to really leverage big data, you must connect to the humans behind the numbers.

Today, quantitative data plus “instinct” drives a lot of our ecommerce campaigns. Because we can collect massive amounts of quantitative data so easily, qualitative data still falls by the wayside. Until recently, large-scale qualitative research simply did not exist.

The hard truth is that quantitative data can plateau – a fact reflected in the very existence of “standard” conversion rates. So next year, avoid getting wrapped up in the numbers and missing out on real customer insight: use qualitative data as a complement to your other analytics tools.

3. Develop an eCommerce Experience That Creates a Story

In 2015, resolve to create an experience that actually lays out a story that engages your customers versus just trying to get them to click one button after another. This is about being a seamless part of the “cognitive funnel” – the cognition process that a target customer experiences when making purchasing decisions.

Kyle Henderson, founder of YouEye, discusses this idea in an interview that you can read here. He points out how qualitative research can help ecommerce brands be a part of this decision-making process.

Through qualitative research, ecommerce brands can discover meaningful offerings that align with the e-commerce cognitive funnel; one example Kyle gave highlighted that qualitative research was able to identify that the confirmation of free shipping very early in the shopping process “led to additional confidence and eagerness to purchase from a vendor that was no longer in question.”

A great e-commerce experience is going to build a story, help inspire confidence and establish trust. It’s about discovering the order in which information is presented to the customer. Big data oftentimes points ecommerce leaders to create a checklist of all the different things that they need to have on their website in order to be at “best practices.”

But the problem with big data is that it doesn’t tell you what order each item on that checklist needs to be in to create a story. And that’s what qualitative research can do.

Make Ecommerce More Human in 2015

Customers are the reason for ecommerce success. And today, we have access to the tools that connect our businesses directly to those customers’ insights. So get the most out of 2015 and the research technology you have at your fingertips. Your customer – and your bottom line – will thank you.


7 Tricks For Cracking Trackers

Start cracking your tracker and make its data more impactful for marketers and end users. Combining the focus on relevant brand performance dimensions with the additional context deep dives will allow you to finally bring your tracker data to life again.


Many brands continuously measure their performance on several key performance indicators. Yet, if you work with tracker data, whether as a researcher or a research user, it will not come as a surprise that trackers often lead to many frustrations among their users.

One of the key frustrations is that tracker data – coming from your brand or customer experience tracking – is often too static. The data does give an impression of brand performance across different key performance dimensions, but tracking this over time – whether on a monthly, quarterly or yearly basis – often does not reveal anything new. Most tracker reports tell a similar story over and over again. On the other hand, when you do see a disruption in the data pattern, the traditional measurement questions do not allow you to capture the ‘why’ behind the movement.

Should brands just ditch their tracker studies? Considering the above-mentioned shortcomings of traditional tracker research, both researchers and research users need to start evaluating their existing trackers critically. Here are 7 tips to become a real ‘Tracker Cracker’:


Tip 1: Stop tracking for the sake of tracking

Firstly, you need to ask yourself which key performance indicators are worth tracking over time. Evaluate the significance of each performance criterion in your survey critically: does the item directly influence your brand strength and impact your strategy? If the answer is ‘no’, discard that question from your survey. The fact that you have always measured your brand performance through traditional brand-funnel thinking does not mean you should continue doing so. The same goes for those 3 touchpoint NPS scores circulating around the organization: don’t be afraid to let go of irrelevant KPIs and focus on the core. So stop tracking for the sake of tracking and revise your tracker survey question by question. Furthermore, it is essential that each metric is owned by a stakeholder in your organization. Assign people to be responsible for the different metrics; ownership is key to guaranteeing tracker relevance!

Tip 2: Is it possible for consumers to give a truthful answer?

Once you have critically evaluated your tracker to ensure it contains only relevant questions, the next step is to assess whether it is actually possible for consumers to provide you with an accurate answer. Recall of information depends on one’s memory, which is imperfect and therefore often unreliable. Avoid questions that require participants to look too far back in time; it is better to limit your question’s timeframe. Use the repurchase timeframe of your product or service as a guide.

Tip 3: Can your next-door neighbor answer this?

Subsequently, you need to make sure that each question is written in clear consumer language. Always ask yourself if your average consumer can answer your questions. Researchers and marketers need to keep in mind that the average consumer is not like them. Avoid using marketing jargon. Here’s a trick: simply ask yourself whether your next-door neighbor would be able to answer your survey questions.

Tip 4: Be available in relevant context or moments

Especially when your brand experience is lived across different customer touchpoints, you need to make sure your survey is available throughout those touchpoints, allowing consumers to provide in-the-heat-of-the-moment feedback. Ensure that your tracker survey works on any device your consumers use. These days mobile is part of the device usage of any target group, so design your questionnaire smartly to fit the mobile screen.

Tip 5: It is not just what you collect, but also how you collect it

It is essential that you also think about how your data is collected: depending on your objectives, you can choose either continuous or pulse tracking. With continuous tracking, where you have a continuous inflow of interviews during your field period, you can smooth out (wanted or unwanted) marketing effects (e.g. media exposure, PR actions, etc.), ensuring comparability across waves. Continuous tracking contrasts with pulse tracking, where fieldwork is condensed into field waves (e.g. a two-week field period every quarter). Pulse tracking makes it easier to assess specific marketing actions, but it can be strongly influenced by a marketing event happening close to or during fieldwork; events happening between field periods may even go unnoticed.

Tip 6: Building a modular approach

Apart from making sure that your tracker consists only of relevant performance indicators – those criteria which need continuous follow-up and determine your brand health – you also need to make sure there is room to put your data into perspective. Trackers need some repetition to allow comparability (that is what the key performance indicators are for), but they should not be entirely static. Trackers can truly benefit from a modular build-up, combining a fixed performance-tracking part with a variable module that depends on your content needs. These deep-dive modules let you zoom in on specific marketing efforts, bringing additional insights that help explain variations in your tracker data.


Tip 7: From tracking to structural collaboration

Modular trackers allow you to continuously measure your brand performance while leaving room for deep-dive moments to measure ad hoc marketing effects. But we can take this one step further by combining this continuous validation stream with an ongoing collaboration approach, using a structural Consumer Consulting Board (also known as an online research community). The latter empowers consumers to help you understand them better, detect insights, develop innovations, strengthen your brand power, optimize your go-to-market strategy or improve customer experience, every single day. This new ecosystem will allow you to better comprehend your tracker data, while at the same time permitting you to validate the consumer input from your community in the modular tracker component. The interaction between the inspirational content from these boards and the continuous validation forms a powerful tool for decision-making at brand, product and strategic level.



Start cracking your tracker and make its data more impactful for marketers and end users. Combining the focus on relevant brand performance dimensions with the additional context deep dives will allow you to finally bring your tracker data to life again. Because, let’s be honest… how many marketers actually read those tracker reports?


CASRO Transformation Series: Morpace – Back to the Future

This month’s Transform blog takes us to the Motor City, an appropriate venue for a discussion on transformation with Morpace, an integral part of the automobile industry.



By Jeff Resnick of Stakeholder Advisory Services


This month’s Transform blog takes us to the Motor City, an appropriate venue for a discussion on transformation given the re-emergence of the auto industry as well as the rehabilitation aspirations of one of America’s storied cities. Morpace is an integral part of the automobile industry, which accounts for over half of its revenue. During our interview, Duncan Lawrence, Morpace’s President and COO, reflected on the how and why of its business transformation.

Morpace’s story is about a return to its roots of success intertwined with a firm vision of the future. Duncan characterizes its transformation as evolutionary, not revolutionary. Today, Morpace is one of the fastest growing “traditional” market research firms, with double-digit growth in each of the past five years. Here is his advice based on Morpace’s journey.

Start by establishing an “all in” culture.  As for many in the MR industry, 2009 was a difficult year. Financial stability for Morpace, and many in our industry, was the immediate focus. Sacrifices were made by everybody on many fronts. Finger pointing was replaced with group-generated solutions. Communication was a strategic imperative, with regular Town Halls to discuss the status of the company. Because everyone was integral to successfully addressing the adversity of 2009, a shared culture with common goals developed among the staff instead of major fractures appearing throughout the organization. The result was the ability to capitalize on the recovery that began in 2010 – a year Morpace grew by over 40%. Annual growth has remained in double digits since then, with a strong 2014 also expected. As Duncan pointed out, the catalyst for establishing an “all in” culture will differ from one organization to the next. However, getting everyone on the ‘same page’ is essential.

Map the future.  The senior team began mapping the future for Morpace in 2011.  Top leadership candidly discussed Morpace’s strengths, weaknesses and the brand they wanted to bring to market.   The soul searching took Morpace back to its roots of providing deep industry expertise to clients with an eye toward accomplishing this on a much larger scale.  This became the ideological foundation for launching two strategic initiatives – becoming more consultative with clients and growing globally.  These strategic decisions had far reaching implications for Morpace – from the people it hired to the infrastructure it built.   But by making these choices, the road map for Morpace’s transformational journey was put in place.

Look from the outside in.   One of the skills that Duncan looks for in professionals joining the firm is their ability to put themselves in the shoes of their client.  An accurate understanding of the client, their needs and challenges is crucial. Understanding client needs is a mandate for the transformational process.  For Morpace, it confirmed decisions to expand globally and move in a consultative direction. 

Focus on what you are best at, find partners for everything else.  Historically, Morpace owned all necessary services to complete client work.   However, it became clear many services were best left to specialist firms where ongoing development was a core investment priority.   This decision allows Morpace to assign internal resources in a way that delivers true value and insights, enhanced by tools and services readily available through partners.

Don’t just talk about transformation, get people doing it.  Duncan emphasizes that while transformation has to be driven from the top, it also has to be owned by employees.  When required change is identified, Duncan encourages leadership to assign the ownership and implementation of that change to a team.  This embeds the mechanics of transformation into the fabric of the organization.

For Morpace, business transformation is an ongoing journey, yet, the recognition of the need to change and the willingness to do so are clearly two powerful factors in the success it is achieving today.  That is not to say there won’t be bumps in the road.   Reward is not without risk.  Growth is not without setbacks.  However, establishing a culture of ongoing transformation can only have positive outcomes in an industry where the winds of change are constant.

Morpace’s focus moving forward is to become faster, smarter and more innovative.  If it makes progress in these areas both now and in the future, continued success is within its grasp.

One final word – congratulations to the entire Morpace team on being named CASRO’s 2014 Research Organization of the Year.


About Morpace Inc.

Morpace Inc. is a full-service market research and consulting organization that helps businesses solve their most pressing marketing challenges. It specializes in the automotive, healthcare, financial services, retail and consumer goods, and technology industries.

Established in 1941, Morpace was named the 2014 CASRO Research Organization of the Year. It is one of the fastest growing U.S.-based research organizations since 2009. 

With an experienced team of industry professionals and an array of innovative research tools, Morpace has the “Creative Minds” and “Intelligent Solutions” to help its clients make smarter business decisions. Morpace is headquartered in Detroit with offices in Los Angeles, London, and Shanghai. Visit www.morpace.com for more information.


SMB 101: How To Plan Your Market Research Budget For 2015

Every business owner wants to make the most out of their budget, and here’s how you can do it (at least in regard to market research).

Image Source: ShutterStock.com


Editor’s Note: Too often we forget that the largest part of the global economy is made up of small to medium sized businesses, and their research challenges (and internal resources) can be very different from those of larger companies with extensive insight organizations – although even those firms are feeling the need to do more with less and develop “cheaper, faster, good enough” solutions to a variety of insights needs. This trend has helped many new suppliers enter the marketplace in the last few years, with many of the “DIY”-type companies becoming major companies in their own right due to the demand for access to their solutions.

This trend is so prevalent that last year I conducted a webinar for SMBs, “How to conduct market research like a pro for (almost) free”, where I explored some of the solutions available to budget-conscious companies for conducting great research:



In today’s post, business blogger Megan Ritter plays a variation on this theme by exploring how SMBs, especially those in the ecommerce sector, can develop a budget for conducting market research in 2015, even if that budget is minimal. It’s a great “Business 101” post and serves as a great reminder that every company can now benefit from great market research, regardless of budget size.


By Megan Ritter

Market research tends to get the short end of the stick when it comes to budget planning. In fact, it is usually the first item that gets cut when there are limitations on time or finances (or both). After all, it’s not as necessary as something like web hosting or credit card processing, but then again, you do need to understand your audience, their desires, their pain points, and their general online behavior in order to make sure that your company will profit and grow.

Market Research’s Place In Your Ecommerce Marketing Budget

There’s a lot to be said for the value of market research, and the positive role it can play for your ecommerce business. Which is why it should be considered an integral part of your budget, not an afterthought. And there are plenty of ways to accomplish your market research goals that are relatively cost effective.

Working With A Market Research Firm Versus Doing It In-House

There are plenty of market research tools available these days, ranging from social media monitoring software like Radian6 or Spiral16 to survey apps such as SurveyMonkey – and the internet can essentially be considered a giant focus group, in a way. But segmenting populations, gathering information, and analyzing the resulting data can take a significant amount of time and effort.

Granted, a market research firm can perform this work for you – but if you have the time or human resources to do the job in-house, it could be worth the effort. And after all, perhaps no one knows what questions to ask better than your own team!

Use Your Existing Resources

If you have an existing business, your current customers, your employees or coworkers, and even your friends and family might be your best source of information – as might potential customers, if your main demographic is already established. Set up interviews or send out a survey to see what they think about your business, competitors in your vertical, and anything else you wish to know.

Keep in mind that you don’t necessarily need to be in the same room or even the same country as your interviewees – there’s plenty of technology that will help you hold market research interviews remotely at everyone’s convenience.


Image Source: ShutterStock.com

Maximizing The Value Of Your Market Research Dollar

Every business owner wants to make the most out of their budget, and here’s how you can do it (at least in regard to market research). Start by determining the end goals of the process:

  • What do you want to learn about your target audience?
  • Where do they interact online?
  • How do they discover and recommend products?
  • What are their pain points and how can you improve their lives?
  • Do you need to discover who the prospective customers for your idea or concept are, and how they interact with brands and make purchasing decisions?

Knowing what questions you want to answer is the most important part of getting the most out of your market research efforts – and of potentially generating capital that helps improve all aspects of your business in the future.

Syndicated Research Or A Custom Project?

Sometimes, the work has already been done for you by a market research firm or similar company. For instance, if your demographic is fairly well established and you know what questions you’re looking to answer, you may be able to purchase white papers, reports, or other documentation – which frees up time and resources on your side. After all, you don’t necessarily have to re-invent the wheel.

On the other hand, you may have unique targets or other goals that simply require a custom project – which also doesn’t necessarily need to be incredibly expensive.

Understand What You Need To Learn

In the end, knowing what you want matters in any marketing endeavor – but it is especially important in the market research realm. Planning out the questions you want to ask – and the answers you want to uncover – is the first step towards success. Remembering that you can almost always factor it into your budget is the second: market research shouldn’t be the first item to cut. It’s vital in the long run!


Harnessing The New Marketing ATOM

I believe that our digital, always-on lifestyles offer the second golden age of brand-building, creating ways to convert always-on behaviors into brand equity beyond what traditional media can contribute. Here’s how to unleash the power of the marketing ATOM.



By Joel Rubinson

The truly transformational part of marketing in a digital age is that brands can now become Media, attracting their own audiences. The transformation follows what I call the A.T.O.M. model…Audience, Technology, Operating practice, Metrics.

Yet so few brands seem to be fully acting on this model.

The new marketing ATOM is shaped by the always-on lifestyle we are addicted to. Any slow point in our lives – while watching TV, sitting on the toilet, or waiting in line – immediately gets filled by reaching for the nearest screen. In research I conducted with Civic Science, we found that half of us multi-task while watching TV, but 80% of that activity is unrelated to the show itself. A series of studies I worked on for AOL adds to what we know about the new always-on lifestyle: over two-thirds of mobile usage isn’t mobile, and we often go to a shopping site not to shop but to alleviate boredom. Nearly all of us engage in shopping behaviors daily or weekly with no purchase in mind – it’s a form of entertainment. Therefore, retailers like Amazon, Walmart.com and Macys.com have become media companies as well as retailers. When we go to Wholefoods.com we immediately see they get it – that values beyond functional purpose matter: they are a lifestyle portal offering entertainment, information, and a sense of belonging.

I believe that our digital, always-on lifestyles offer the second golden age of brand-building, creating ways to convert always-on behaviors into brand equity beyond what traditional media can contribute.

Here’s how to unleash the power of the marketing ATOM.

Audience. Brands can now build their own audiences, where millions or even tens of millions visit the brand.com website regularly and register to transact, receive newsletters or get offers, and therefore voluntarily engage in a one-to-one relationship. Audiences can be built via website visitation, frequent-shopper IDs, and liking or following in social media. A brand builds a large audience if it finds values it can stand for beyond functional purpose and expresses them via consistent and valuable content. Yes, audience building is a content game, for media publishers and for brands who become publishers.

A brand audience gives you an amazing runway to the most important consumer group there is…those who like your brand but who could be buying your brand a higher percent of the time. Starbucks is the best example I know of for illustrating marketing based on brand audiences.

“The brand [Starbucks] has more than 35 million fans on Facebook, well over 4 million followers on Twitter, 6.5 million members of its Starbucks Rewards program, and nearly 17 million mobile app downloads…The point is that Starbucks can reach tens of millions of people any time it wants to.”

Ad Age, 2013

Technology. Understand that when you commit to building a brand audience, you are committing to fully leveraging first party data for marketing advantage and therefore to the technologies of big data, personalization, and programmatic ad buying. Beyond the communications runway, consumers are also granting you access to amazingly powerful data about what is relevant to them, what lifestage they are in, what they might be shopping for, what interests they have, and more. You can now segment your audience at scale…using clickstream and transactional data to go beyond attitudinal segmentation.

Operating practice. Using the ATOM model, the way marketing goes about its job must fundamentally change. Brands must offer valid reasons to encourage registration and log-in so they can link consumers across screens, match them to their social media profiles, send e-mails, and connect to behavioral targeting ad products throughout the web. All brand communications must be thought of as content. And all content must be highly strategic, reinforcing the values that brands share with their audience and using search engine optimization to make your brand findable. If your content strategy is effective, it will lead to people spending much more time with your brand story and even sharing its content with friends.

Metrics. Digital marketing at its best is a game of measurement, experimentation and predictive analytics. And yet, marketing research remains locked in a survey-based brand tracker mentality. Most do not integrate the telltale digital and social signals of brand success against the ATOM model. We track awareness but not audience, equity but not digital behaviors, attribute ratings but not the user characteristics that lead to higher click-throughs and view-throughs. Our research reporting cycles remain slow as the world speeds up around us. Old tracker approaches continue to reinforce the old rules and practices of marketing and undervalue the contribution of building brand audiences; I believe this is a main cause of big brands having such a spotty track record at building audiences. Research is not lighting the path to show the importance of those audiences.

One final note about brand audiences: your brand presence will be distributed throughout the web, but you still need to choose your center of gravity. Three years ago marketers were being told to abandon websites for Facebook pages, which I counseled against. Today we see that Facebook has all but killed organic impressions. If you work the ATOM model, you will understand that the power is in the communications runway that YOU control, the first party data YOU own, and research that follows a contemporary view to provide useful brand-building guidance.

Note: The use of the acronym “ATOM” is unrelated to any marketing, research, or agency use of this word.

Market Research: 2014 In Review & 2015 In Anticipation

Last Friday a panel of leaders from the research industry shared their thoughts about 2014 and the projections for 2015. Here is a summary of the key points.


By Ray Poynter

Last Friday a panel of leaders from the research industry shared their thoughts about 2014 and their projections for 2015. The panel comprised Lenny Murphy (GreenBook, Gen2 Advisors, IIeX), Simon Chadwick (Editor of Research World and head of consultancy Cambiar), Kristin Luck (ESOMAR Council member and President/CMO of Decipher), Jeffrey Resnick (CASRO board member and Managing Partner at Stakeholder Advisory Services) and myself.




You can hear the full recording here (90 minutes).

In summary the key points were:

2014 Main Themes

  1. Mobile is mainstream; it is everywhere, and it will soon represent more than 50% of all surveys.
  2. The Cambiar data on investment, mergers and acquisitions showed that 2013 was a very big year for big data companies, but 2014 was probably a much weaker year – the bubble may have burst for a while.
  3. Automation is a key theme, ZappiStore is the poster child, but there are many other examples.
  4. The emergence of large Japanese companies in the mergers and acquisitions scene is an important indicator of change. Companies like Macromill, Intage, and Rakuten are shaping up to be major regional and global players.
  5. The MR industry is much happier than it has been for years, partly because of the improved economy, but in many cases the positive vibe is based on new solutions, new tools, and younger people.
  6. After a couple of years in the doldrums, neuroscience and biometrics have seen major growth in interest.
  7. Recognition that MR needs to do more to attract new talent into the industry, more to train people, and more to change working practices to make them attractive to millennials.

Things that did not happen in 2014

  1. Salesforce not buying somebody like SurveyMonkey, to expand their breadth of offer, was seen as surprising.
  2. Text analytics and big data did not establish any bridgehead with the core business of market research.
  3. Lack of key acquisitions by GfK and Ipsos, unlike Nielsen and Kantar.

Key predictions for 2015

  1. Web messaging is going to be the biggest mobile growth area, although location, passive, and in the moment are all going to be strong growth areas.
  2. Japan is going to emerge as a much bigger player, although the mergers are also going to create some major challenges.
  3. More key mergers and acquisitions (and perhaps IPOs).
  4. More automation, more cost-cutting, doing more with less.
  5. Moving budgets from established markets to emerging markets.
  6. Text analytics and social media analytics will start to build a bigger platform of use.
  7. The line between qual and quant continuing to blur with the growth of quant approaches to data that have traditionally been qual, and the growth of data collection modes that are equally suitable for qual and quant (such as communities and mobile ‘in the moment’ research).

Challenges for 2015

  1. Privacy problems! It would only take one more failure/scandal on the part of data handling organisations to create a public and legislative backlash that would make life for market researchers very hard.
  2. Building trust – the Global Research Business Network (GRBN) recently published the Trust and Personal Data Survey, based on 23,000 respondents, showing that on average one in three people do not trust the market research industry to safeguard their personal data. Without this trust, we have a future with limited prospects.
  3. The risk of an economic downturn, with Europe, China, and Japan all being likely causes. This is a possibility, not a probability, but the downside would be major.
  4. Unintentional legislative impacts. Laws relating to net neutrality, terrorism, do-not-call registries, and distance selling all pose a risk of impacting market research, often unintentionally.
  5. The large companies, such as GfK and Ipsos, need to establish a clear migration path from the old world to the new world.

Can Neuroscience Save the Market Research Industry?

After a day at NIMF, I believe we are on the cusp of finally cracking the unconscious, and accelerating the market research industry into a new dawn of innovation, creativity, and relevance.



By Allan Fromen 

The Market Research industry is approaching a frightening crossroads. On the one hand, DIY tools like SurveyMonkey and Google Consumer Surveys have been the recipients of well-deserved attention and kudos, with the former receiving a $1.4 billion valuation last year, and the latter getting a boost from Nate Silver, who praised GCS as more accurate than many traditional polls in the last presidential election. Today, anyone – literally anyone – can easily create a free survey and even find sample for a modest sum. DIY is taking off, and research firms are left vying for more complex studies that require their expertise and scale. Client-side researchers are equally challenged, as budgets dry up and internal clients expect insights that are faster, cheaper, and actionable.

There is also a growing consensus that consumers simply cannot tell us what we want to know, for example, why they selected a certain product or brand. With the explosion of behavioral economics, books like Predictably Irrational and Thinking, Fast and Slow have codified the fact that consumers are highly emotional and make decisions for reasons that are often not accessible to their consciousness, and by extension, not accessible via classic market research techniques. I recently heard a C-level executive talk about System 1, which demonstrates that what was once the domain of psychologists and academics – namely, that consumers are often driven by non-conscious triggers – has officially reached a tipping point and is now part of the mainstream business lexicon.

Amidst all this disruption, I recently attended the NIMF conference, a full day dedicated to neuroscience (I use the term neuroscience loosely here, to mean all techniques that seek to tap into non-rational, automated thinking). It was highly refreshing to witness innovative market research companies commercializing the exploration of the non-conscious as a means of delivering insights.

Here are some of my key takeaways:

There are many, many hammers: EEG, fMRI, metaphors, galvanic skin response, eye tracking, implicit measures, and more. While the sheer number of techniques is exciting, it still feels a little like the Wild West. We desperately need guidance as to which methods are most appropriate for specific research questions.

Validation has started…: Some truly fascinating case studies were presented. In one, Aaron Reid of Sentient Decision Science noted how implicit measures combined with a classic conjoint exercise yielded a .90 correlation with sales data. This synergistic effect of neuroscience and traditional research is very encouraging, and worthy of replication.

But more is needed…: Neuroscience is no panacea, nor is it the right tool for every research question. We are in the early stages and need to marry our excitement with methodological and academic rigor, to put our clients at ease and help inform the industry. The ARF has taken steps in this direction and noted that fMRI is the “gold standard” for advertising research. However, we need many more such validation studies, before neuroscience techniques can be a regular part of the research toolkit.

Heed the clarion call: No doubt, things are changing. We need to adapt and innovate in order to stay relevant in a commoditized market research industry. To that end, we should embrace the change, even if (or especially!) it is scary or uncomfortable. Will Leach said it best, when he challenged the room to stop delivering insights and start focusing on behavior change. This was a refreshing reminder that while insights have value, they are only a means to an end. If we can move up the stack and demonstrate how to change actual behavior, we will earn the proverbial seat at the table.

As a psychologist, I have long believed that humans are far from the rational agents we aspire to. After a day at NIMF, I believe we are on the cusp of finally cracking the unconscious, and accelerating the market research industry into a new dawn of innovation, creativity, and relevance.


Personal Data: A Threat Or An Opportunity For Our Industry

The recent GRBN survey on Trust and Personal Data tells us that, across the globe, people are already concerned about the possible misuse of their personal data, and as many as 36% are very concerned about this issue.



Editor’s Note: In 2011 The World Economic Forum and Bain & Company released a report “Personal Data: The Emergence of a New Asset Class”. This wide ranging and comprehensive study, with contributions from leaders in every global industry, captured the essence of the personal data economy in their introduction:

“We are moving towards a “Web of the world” in which mobile communications, social technologies and sensors are connecting people, the Internet and the physical world into one interconnected network. Data records are collected on who we are, who we know, where we are, where we have been and where we plan to go. Mining and analyzing this data give us the ability to understand and even predict where humans focus their attention and activity at the individual, group and global level…..  Increasing the control that individuals have over the manner in which their personal data is collected, managed and shared will spur a host of new services and applications. As some put it, personal data will be the new “oil” – a valuable resource of the 21st century. It will emerge as a new asset class touching all aspects of society.” 

Indeed, this report gave birth to a new program called the Rethinking Personal Data Initiative which has already defined the value of personal data as an asset class and reinforced the value of trusted data flow. The current stage of the program brings together data experts with practitioners in different commercial environments so that together they can drive results that are practical, implementable, and can be widely communicated. They are looking at how to create and implement the right rules, tools, frameworks and business models to bring about the emergence of a personal data ecosystem where people have greater control over the collection, use, sharing and monetization of their personal data.

Conspicuously absent from the steering group are any representatives from the consumer insights or even marketing organizations. No Nielsen, Kantar, WPP, Publicis, Omnicom, Dentsu, etc… or any other connected organization. And that is a missed opportunity for our industry, for as we can see from the GRBN Trust & Personal Data Report, in some markets the insights industry has an edge in the trust of consumers vs. virtually any other industry, and globally is at least average.

We have a stake in the global dialogue around personal data, and perhaps even a larger one than many other industries since consumer data is the driving force of market research. Whether it’s surveys or focus groups or Big Data analytics and applied neuroscience, our industry has historically worked hard to utilize even the most personal and intimate information in a respectful and beneficial way, and as new technologies blur the lines between market research and marketing via single source channels, digital advertising, social media analysis and mobile tracking we have a unique opportunity to apply our historical role as the advocate for consumer empowerment via data sharing with the broader world.

It’s a differentiation that we should continue to work hard to develop and a perspective that our trade bodies need to share with both consumers and organizations like the WEF. This report is an important piece of evidence to help us shape that broader discussion, and hats off to GRBN for leading the charge to understand where we stand in the new data-driven world we live (and work) in.

But enough from me. Today Andrew Cannon, Executive Director of the GRBN, shares his take on the results of the GRBN report and what it means for our industry. It’s good stuff and an important topic.


By Andrew Cannon

I think there is no denying that over the next few years the amount of personal data people will need to share will continue to grow, as access to services, as well as to information, will take place more and more via digital channels.

At the same time, the number of security breaches, both inadvertent and criminal, is also likely to continue to climb. According to a recent report by PwC: “The total number of security incidents…climbed to 42.8 million this year, an increase of 48% from 2013.”*

The recent GRBN survey on Trust and Personal Data** tells us that, across the globe, people are already concerned about the possible misuse of their personal data, and as many as 36% are very concerned about this issue. I believe that the increased number of breaches, as well as the increased media coverage of such breaches, will only serve to raise this already high level of concern people feel about the risks.

In turn, I believe people will become more and more careful with whom they share what, where, when, how and why. In this environment, one of the key questions for the market research industry is: how will people’s willingness to share their data, personal and otherwise, with us change over the next few years?

A key determinant of this, in my opinion, will be how much trust people will have in our industry to protect and appropriately use their personal data. According to the same GRBN study, for every one person who highly trusts our industry to do so, there are three who have a very low level of trust in us.

Whilst this may sound alarming, it is in fact an average level of trust on this issue (based on the 17 different types of organisations included in the survey), and it is clearly better than the level of trust people express in, for example, media companies or social media companies.

I believe that our glass is half full on the issue of personal data and trust. I believe we, as an industry, already act responsibly with personal data, and have a fantastic window of opportunity to use the issue of personal data to help build a strong relationship with the general public; a relationship built on trust. For in an environment of high concern, it is with those whom people trust that they will share their data. As our industry relies heavily on people’s voluntary participation, and as people do not need to give us their data to access services and/or information, we have much to gain, and indeed much to lose, by gaining, or losing, people’s trust.

But, I also believe that this window of opportunity will not stay open for long, and that our industry needs to act now.

We need to actively address and alleviate people’s concerns. We need systems and procedures in place to ensure that we are responsible with the collection, handling, protection and use of personal data. We need to ensure that the risks of inadvertent abuse, as well as security breaches, are minimized. We need to be transparent in our dealings with the general public and to communicate effectively with the public, as well as other stakeholders, such as clients and the authorities, about this issue.

In short, we need to build trust with the general public.

Do you share my views on this opportunity for our industry? If not, why not? If so, what do you think our industry should do in order to seize it? 

Andrew Cannon

Executive Director, GRBN



**A survey of over 23,000 adults across 24 countries, conducted by the GRBN in co-operation with Research Now. To access the reports as well as an interactive dashboard, please visit http://grbn.org/initiatives/index.php?pid=35


The Conversion Story Of A Cognitive Researcher To An Emotional Researcher

Posted by Todd Powers Thursday, December 4, 2014, 10:01 am
Emotions are important determinants of choice behavior, and we now have systematic means of measuring those very emotions during the process, so we have a great opportunity to combine our right- and left-brain influences to better understand these marketplace dynamics.



By Todd Powers, PhD

If you are like me, you have probably avoided the whole question of emotions in your market research career.  I am perfectly willing to admit that, for the most part, I have prayed each morning to the God of Logic as the ultimate arbiter in the battle for market insights.  Routinely, I produced and pledged allegiance to statements like, “Males are significantly more likely than their female counterparts to cite horsepower as an important determinant of brand choice in pickup trucks.”  And I had proof, right there in the sample of 500 recent buyers of trucks who had answered questions I asked them about that very issue.

Oh, I knew that people often let their emotions get the best of them, but I also assumed that these were largely secondary factors in decisions that they made.  Well, no more.  I may be a slow learner, but I am now thoroughly convinced of the dominant influence that emotions have in the purchase process, and more importantly, of the mandate that we have to measure and understand that influence.

So how did this happen?  I now realize that the main reason that I (dare I say “most of us” researchers) studied cognitive effects — to the exclusion of emotions — in decision-making was because emotions were just so damn difficult to measure.  Really.  We tried asking people what they were “feeling” when they bought that refrigerator, but the results were always less than satisfactory.  People gave answers that were more a function of social desirability than any real sense of introspection.  Qualitative techniques like laddering helped us to understand that platinum-version credit cards made us feel important, and superior, and condescending when we slapped that baby down on the countertop at the hotel check-in, but that stuff is difficult and expensive in a survey of n=500.  So we didn’t.  Instead, we just asked our respondents to rate each of 8 or 10 different competitive brands of say, toothpaste, on things like price and availability and flavor.

Consider this:  One of the more common models of choice behavior, the weighted-additive (compensatory) model, can be represented by the following (simplified) equation.

U(brand) = w1·x1 + w2·x2 + … + wn·xn

Here xi is the brand’s rating on attribute i, and wi is the importance weight assigned to that attribute.
Presumably, we make choices by rating all of the options on a set of attributes, and then assigning weights to each attribute according to how important they are.  If having good gas mileage, and leather seats, and four doors, and a high resale value are important to me when buying a car, then the brands that rate highly on those features will be the ones I am likely to choose.  So … If we want to predict brand choice, we just ask people to rate brands on those attributes, then we ask the same people to tell us how important each feature is to them, and presto, we get accurate predictions, right?  The answer is “No.”  Those efforts proved to be fruitless, for the most part.  And that’s because the importance ratings are driven by our emotions, not our cognitions.  And we got pretty much meaningless answers to questions about those importances, or weights.

Instead, we adopted a practice of asking for attribute ratings, then observing choices, and (using various modeling techniques) inferring the weights, the importance scores.  Essentially, that’s what conjoint/discrete choice models do for us.
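That inference step can be sketched in a few lines of Python. The brands, ratings, and weights below are invented for illustration, and real conjoint/discrete choice estimation is considerably more involved, but the logic is the same: observe choices (here, simulated utilities) and solve backwards for the importance weights rather than asking respondents to state them.

```python
import numpy as np

# Hypothetical example: 4 brands rated on 3 attributes (1-10 scale).
ratings = np.array([
    [8.0, 6.0, 7.0],   # brand A
    [5.0, 9.0, 4.0],   # brand B
    [7.0, 7.0, 8.0],   # brand C
    [4.0, 5.0, 9.0],   # brand D
])

# True (unobservable) importance weights, used only to simulate the
# utilities that would drive observed choices.
true_w = np.array([0.5, 0.2, 0.3])
observed_utility = ratings @ true_w

# Infer the weights from ratings and observed outcomes via least squares,
# instead of asking people to report their own importance scores.
inferred_w, *_ = np.linalg.lstsq(ratings, observed_utility, rcond=None)

print(np.round(inferred_w, 2))  # recovers approximately [0.5, 0.2, 0.3]
```

In practice the observed data are discrete choices rather than clean utilities, so logit-style models replace plain least squares, but the "infer the weights, don't ask for them" principle is what matters here.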

I guess I could have been perfectly happy working under this set of neatly logical rules.  Let’s measure what we CAN measure, and do it really well.  But it still bothered me.

And then I encountered two very different academic discussions that convinced me we needed to measure our emotional reactions in decision-making more directly.

First, I read about the work of Antoine Bechara, a psychiatrist working with patients trying to recover from traumatic brain injury due to car accidents and the like.  These were people who had lost the functioning in their pre-frontal cortex, the portion of the brain largely responsible for processing of emotions.  Repeatedly, he observed that his patients could perform all of the normal activities of day-to-day life, and could manage the cognitive tasks that he asked them to address.  What they could not do was “make a decision.”  Bechara cited the example of patients who, when asked at the end of the counseling session just when they might be available to come back next week for follow-up counseling, would start listing all of the reasons that Monday was good (or bad), or that Wednesday had potential, or Thursday might or might not work out.  But they were unable or unwilling to actually choose a day and time for their next appointment.  They struggled deeply with pulling that trigger.

Bechara concluded that choice was highly dependent on emotional processing, due to the individual, personalized nature of weighing the options, and it is hard to argue differently.

A second perspective came to me as I was reviewing some of the considerable body of work that Kahneman and Tversky pioneered in behavioral economics, work that won Kahneman a well-deserved Nobel prize.  His book, Thinking, Fast and Slow, has become required reading for researchers of human behavior, and the principles of System1 and System2 thinking have wide-ranging implications for decision theory.

A key idea is that we have two dramatically different modes of thinking.  Our System2 thinking handles the complex, cognitive tasks where we need to concentrate and draw conclusions based on our efforts.  So, for example, if I ask you to multiply 17 times 36 in your head, you could do this.  But it would take some work.  And some time.  It is slow and laborious, and we tend to avoid it like household chores and homework.  System1 thinking, on the other hand, is fast and intuitive, and happens with little effort.  In the course of our day, we make something like 20,000 individual decisions, and the vast majority is made by System1, with little or no direct attention.  Behavioral economists point out that our emotions are handled by our System1 brain.  We don’t have to concentrate to be happy, or embarrassed, or remorseful.  It just happens.

Furthermore, Kahneman and his followers have been able to demonstrate that System1 thinking invades all manner of decision-making.  Even the highly involved and complex decisions, like buying a new lawnmower, give way to System1 processes whenever our lazy System2 brain can unload the work.  [In fact, I have personally come to believe that this is one of the important functions of brands – to move product choice decisions to System1 processing whenever possible – but that discussion is best pursued elsewhere.]

Earlier, I might have easily acknowledged that emotions play a strong role in purchases centered on fashion.  But on function?  No way.  If I’m buying a new sport coat, where style is more important than, say, protection from the elements, then sure, my emotional needs loom as important determinants of choice.  But for that lawnmower, I’m all about performance, and this comes in the form of System2 considerations of quality and materials and sale price and such.  Yes?  Well … yes, but only to an extent.  Experimental evidence from behavioral economics has shown consistently that System1 thinking is also pervasive.

Examples like those from Bechara and Kahneman, cited above, left me convinced that we needed a more formal and exact means of incorporating emotional influences into our study of decision-making in general, and purchase decisions in particular.  But until recently, that need was largely unfulfilled.  This glaring gap was nowhere more apparent than in the Mauss and Robinson article published in Cognition & Emotion in 2009 summarizing the various measurement techniques deployed to measure emotion.  The table below, drawn from that article, classifies the basic approaches to emotional measurement, and reflects the general inability to capture emotional specificity using the techniques in play at the time.


Response System | Measure | Sensitivity
Subjective Experience | Self-report (surveys) | Valence; Arousal
Physiology | Autonomic nervous system (ANS) | Valence; Arousal
Physiology | Startle response | Valence (at high arousal)
Physiology | Central (EEG, fMRI, PET) | Approach/Avoidance
Behavior | Vocal amplitude, pitch | Arousal
Behavior | Facial behavior (observed) | Valence (some specificity)
Behavior | Facial behavior (EMG) | Valence
Behavior | Whole body (observed) | Some emotion specificity


As a rule, the emotional measurement systems we had been using could get valence (or sentiment, which groups emotions into positive, negative and neutral components) or various dimensions of emotions.  A good example is the PAD theory of emotions from Mehrabian and Russell (1974), which used the three dimensions of Pleasure, Arousal and Dominance to describe emotions.  But consider the model as depicted in the figure below.




Here you can see that fear is low in pleasure, high in arousal, and low in dominance.  When we are fearful, the feeling is unpleasant, it definitely gets our attention, and makes us retreat.  You can see from this cube where bliss and triumph would show up on the dimensions.  But where would you put guilt?  It is, like fear, characterized as low pleasure, high arousal, and low dominance, so it could occupy much the same place in the cube.  But we all know that these two emotions – fear and guilt – are quite different experiences.
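The limitation is easy to see if you sketch a purely dimensional model in code. The PAD coordinates below are invented for illustration (they are not taken from Mehrabian and Russell’s actual scales), but they show the problem: fear and guilt land almost on top of each other in the cube, even though they are very different experiences.

```python
# A toy sketch of a PAD (Pleasure-Arousal-Dominance) representation.
# Coordinates are illustrative only, not Mehrabian & Russell's values.
pad = {
    "fear":    (-0.6, 0.8, -0.7),
    "guilt":   (-0.6, 0.7, -0.7),
    "bliss":   ( 0.9, 0.3,  0.4),
    "triumph": ( 0.8, 0.7,  0.8),
}

def distance(a, b):
    """Euclidean distance between two emotions in PAD space."""
    return sum((x - y) ** 2 for x, y in zip(pad[a], pad[b])) ** 0.5

# Fear and guilt are nearly indistinguishable in this space, illustrating
# why a purely dimensional model cannot separate distinct emotions.
print(distance("fear", "guilt") < distance("fear", "bliss"))  # True
```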

It was at this point that my colleagues at Converseon, an award-winning company focused on the technologies in social listening, shared with me the work that they had been pursuing using Robert Plutchik’s psycho-evolutionary theories of emotion.  Plutchik and his colleagues at the Albert Einstein College of Medicine had classified emotions into eight basic, or fundamental, emotions that developed in humans due to evolutionary (Darwinian) pressures.  So fear, for example, gave rise to the “fight or flight” response, and helped humans with this capability to either triumph or escape, and thus be less likely to be meals of certain predators.  The eight basic emotions are:

  • Fear
  • Trust
  • Joy
  • Anticipation
  • Anger
  • Disgust
  • Sadness, and
  • Surprise

Now, Plutchik’s psycho-evolutionary theories are not new, having emerged in the 1980s.  But they provided a sense of organization – a taxonomy – to the entire world of emotion which, ironically, made such logical sense to me.  It is instructive to look at the “flower” or “wheel” depiction of the Plutchik structure of emotion shown below.




In this figure, emotions are arranged so that the relationships are meaningful.  The eight basic emotions appear as the middle of three concentric circles on the wheel.  Start with Joy, at the top, and go around that middle ring to see all eight.  Each of these eight is the anchor for a spoke on the wheel, varying in intensity.  In this manner, Rage is the more intense version of the basic emotion, Anger.  And Annoyance is the less intense emotion.  Also, the placement of the spokes is important, with Joy being opposite of Sadness, for instance.  And some emotions are combinations of the more basic emotions.  Love, as we all know, is a complex and “many-splendored” thing, but you can see how it can be construed as the combination of Joy and Trust.
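The wheel’s structure lends itself to a simple sketch as data. The entries below cover only a few emotions from the wheel described above and are illustrative, not a complete encoding of Plutchik’s taxonomy.

```python
# A minimal sketch of Plutchik's wheel as data: each basic emotion has an
# opposite, an intensity spoke, and adjacent basic emotions combine into
# dyads. Only a handful of entries are shown.
opposites = {
    "joy": "sadness", "trust": "disgust",
    "fear": "anger", "surprise": "anticipation",
}
spokes = {
    "anger": ("annoyance", "anger", "rage"),  # low -> high intensity
}
dyads = {
    frozenset(["joy", "trust"]): "love",
}

def combine(a, b):
    """Return the dyad formed by two adjacent basic emotions, if known."""
    return dyads.get(frozenset([a, b]))

print(combine("trust", "joy"))    # love
print(opposites["joy"])           # sadness
```

Using a frozenset as the dyad key captures the fact that the combination is unordered: Joy plus Trust is the same emotion as Trust plus Joy.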

The Plutchik taxonomy and validation of his theoretical explanations for emotion are much more complex than the simple wheel shown above, of course, but the basic premise is captured well, and this was what changed my concept of how to think about emotional effects.  It finally made sense to me.

But I think that the reason that Plutchik’s view of emotion had never come to my attention was the fact that it provided structure alone, without a means of measuring the emotions in the nicely ordered wheel.  And without this tool, the mental construct could not be applied effectively to the study of product purchases, and other pursuits.

The Converseon team essentially solved that problem.  The time frame was 2012 – 2013, and as Chief Research Officer of the Advertising Research Foundation (ARF), I had initiated a study we called “Digital and Social Media in the Purchase Decision Process.”  We had recognized the growing importance of digital technologies, like search engines and social media, on how people went about buying goods and services, and we were looking to get a better grasp on that process.

For the study, we wanted to explore the purchase process across a broad range of product types.  So Kraft joined the effort, and we looked at packaged meats and cookies (Oscar Mayer and Oreo brands) on their behalf.  At the other end of the spectrum, GM came aboard to study compact cars.  And somewhere in between was Motorola, who tested smartphones.  Google (our digital experts) and Young & Rubicam (Y&R, our advertising consultants) were also sponsors.  Besides Converseon, who conducted social listening as part of the study, we had comScore (online panel research), Firefly/Millward Brown (qualitative research) and Communispace (online communities) as research partners donating professional services to the cause.  Duke University’s Fuqua School of Business served as our academic advisor.

During the course of the study, the research team acknowledged that we needed to understand the emotional effects at play in the purchase process, and both Firefly and Communispace jumped on that challenge.  But so did Converseon.  In fact, they proposed that we could capture emotions, as expressed in online social conversations occurring in the different basic stages of the purchase process.  And we used the Plutchik organizational taxonomy to show the emotional journey that buyers experience as they seek to acquire cookies, or smartphones or cars.

It’s important to know the process that Converseon uses in their approach to social listening.  First, they have access to the full range of data on the Internet, including social media (Facebook, Twitter, Google+, etc.), blogs, company websites, wikis, review sites and so forth.  The volume is huge, and we gathered over 50,000 posts about Oreos in about a week.  It’s 10 to 20 times that for all cookies in general.  At that point a combination of machine- and human-based coding is used to make sense of the posts.  In our case we classified social commentary into a) the stage of the purchase process that the author was in at the time, and b) the predominant emotion expressed in the post (anger, excitement, etc.).




The figure above depicts one perspective of the purchase process.  Another viewpoint that you see frequently is that of a “funnel.”  Now, it is shown here as a linear process, but there is ample evidence these days to indicate that people do not go necessarily in a straight line through this process.  They bounce around.  They may get half-way through the process and just up and decide they are not really in the market.  But this model is a nice heuristic, nonetheless.  And that’s because, at any given moment in time, a consumer actively on the purchase journey is engaged in the behaviors associated with these distinct phases.  So we classified our posts into these five phases.

We also classified the posts into specific emotions.  The Converseon method of classification does not use the common rules-based approach via Boolean logic and similar techniques popular with many social monitoring firms.  Instead human coders do the initial classifications, and the machine “learns” from their efforts.  Only posts with strong inter-rater reliability (all coders agree on the proper designation) are fed to the machine for the learning process, and only when the computer classification outcomes reach acceptable levels of precision are they allowed to proceed.  Throughout the process, individual posts that are deemed “uncertain” are given to human coders for final arbitration.

The figure below shows the results for all products in the ARF study combined.  Finally, we have an accurate and consistent means of measuring emotions, and we can use this information to better understand things like the purchase process.





It is clear from the figure that consumers experience a range of emotions in their purchase journeys.  Anticipation is a common emotion in the problem definition phase, and in the purchase decision phase, as well.  Joy predominates in post-purchase commentary.  But people express many emotions, both positive and negative.

Interestingly, when we separated the data in the figure above into our sponsor product categories, we discovered that the emotional journeys for packaged meats, cookies and automobiles were quite similar, but that smartphones were unique.  Go figure.  The smartphone results are depicted below.




For smartphones, we found that negative emotions, like Sadness, were prevalent in the early, problem definition phase.  Inspection of the relevant posts revealed that users were reluctant to give up their old phones, as they had become comfortable and quite attached to their devices.  In many cases, they were being forced to purchase new phones because the older models were no longer being supported.  Once they started active shopping, in the information-search phase, that negativity turned to Interest as potential buyers began to encounter the wonderful array of new phones with many new features.

The real surprise came, however, when we looked at post-purchase comments.  Many of the posts expressed Joy, but there were an equal number of negative comments, reflecting Anger, Disgust, Annoyance and Sadness.  Again, we dove into the data and learned that many new smartphone buyers were upset and frustrated, since they could not easily figure out how to use their new devices.  One comment I felt summed it up:  “Just bought me a new smartphone.  Turns out the damn thing ain’t so smart after all.”

This was valuable insight to our sponsor, Motorola.  And based on the learnings, we gave them two pieces of advice:

  1. Don’t just throw your phones over the fence to new buyers. Educate them.  Train them.  Offer courses to all new buyers and show them exactly how to accomplish all of the things they were accustomed to performing on their old phones.
  2. Connect the groups via social media. Hook the joyful buyers up with the disgruntled ones, and let them exchange information.  The joyful will feel proud and quite pleased to share their knowledge, and the unhappy will welcome input from regular users (instead of helpline employees).

Knowing the specific emotions associated with decision processes can be illuminating indeed.  As the results above indicate, marketers can leverage this insight to their competitive advantage.  And certainly, this can inform the marketing/communications strategy.

Emotional measurement can actually shed important light in many areas.  Interestingly, at the conclusion of the ARF study, I was thinking about possible contexts for interpretation of the emotional data.  I knew that my longtime colleague, Amy Shea, had just opened a consultancy focused on the fundamental concepts of Story.  I approached Amy with my challenge:  Can we use the principles of story-telling to help interpret findings in the brand/advertising/purchase-process environment?  She spent a day looking over the various findings and reported back:  “Absolutely!”

If you are interested in how story-telling actually works, you should read the book, aptly named Story, written by the noted screenwriter Robert McKee.  He has described 52 separate and distinct genres of stories, and he has shown how the fundamentals of plot, character development, sub-stories and the like are assembled beautifully in the great films of our times.  Amy is a big fan, and she uses the principles of story to understand the dynamics of consumers and marketplaces.

I will not go into detail here (and there is plenty of it) but just share with you two key learnings that came out of the Converseon data and emotional coding.  First, we have realized that, despite what the brand pundits often claim, consumers have not gained ownership of brands.  Social media has not allowed savvy consumers to wrest control of brands, moving them in directions they see fit.  It is more of a joint venture these days.  What consumers do control is the category story.  They decide what the story is around running shoes, or frozen pizza.  The brands actually play the roles of different characters in those category stories.  That’s how it works.  And it is only by capturing the emotional journeys of consumers in their buying process that we can see all of that unfold.

Second, our review of the smartphone data revealed that the story in that category is a love story genre, with a disillusionment plot.  Think about it.  People just love their smartphones.  They take them to bed at night.  They keep them in their breast-pocket, next to their hearts.  Heck, most people would rather part with their wallets than their phones.  But they occasionally get jilted by their lovers, told that they no longer support the relationship.  So sad.  Now these jilted consumers take to the streets, hoping to rekindle with a new partner.  And for some, the result is indeed Joy.  Their new partner is terrific.  For others, however, the new mate is not so great.  Anger and Sadness prevail.  Heartbreaking.

There is much more to the story-based interpretation, as I’m sure you can imagine.  I just wanted to share enough to underscore the point that the new approach to measuring emotions that I have described can generate value in many ways.

We now have a new and informative way to measure emotions.  It is by no means perfect.  It relies on social media data, for instance, and Lord knows that source is not exactly “representative.”  Many people do not have access.  And while some people may post only once a month, others post 10 times a day.  That’s a problem if you want to generalize your findings to any known population.  But if you are trying to determine what people are likely to encounter when they go online for information and advice, then it is the ideal source, of course.  And it is an “in the moment” source, as well.  In surveys, we often ask people to remember what they were thinking or feeling when they bought that flat-screen TV.  Such recall is often biased and/or inaccurate.  By looking at social posts that were made at the time of the experience, we get a better picture of true emotions.

There is much more to this pursuit of emotional effects in purchase decisions.  For example, I have been delving into the relative importance of emotions vs. cognitions in the various stages of the “path to purchase,” and this work is showing early promise.  Also, there are other measurement techniques, like neuro-methods, that are producing considerable insight.  But my argument here is simple:  emotions are important determinants of choice behavior, and we now have systematic means of measuring those very emotions during the process, so we have a great opportunity to combine our right- and left-brain influences to better understand these marketplace dynamics.