
The Next Economy: After ‘MarketWorld’

In his new book, The Zero Marginal Cost Society, Jeremy Rifkin argues that capitalism’s own “extreme productivity” is speeding it toward its next iteration, a decentralized market economy operating globally in the online “collaborative commons.”



Editor’s Note: We live in an amazing era, where the possibility of widespread disruption of traditional economic and business models grows with every day. The technological leaps taking place are simply astounding, and one of the single best examples of this trend is in the emergence of the “Maker Economy/Sharing Economy” taking place right now. This fundamental shift in production and distribution has profound implications for every aspect of human civilization, and therefore is something MR should be paying close attention to. Luckily Robert Moran is on the case.  Bob leads Brunswick Insight in the Americas, writes and speaks on emerging trends and serves on the Board of Directors for the World Future Society. He is a globally recognized futurist, and most certainly is the preeminent thinker on future trends in MR.

Here is his take.


By Robert Moran

When the Berlin Wall fell in 1989 and took communism with it, the well-respected consulting firm Global Business Network (GBN) was asked to paint alternative scenarios for a post-Communist world. They developed three scenarios:

New Empires, a kind of regional, neo-mercantilism
Change Without Progress, hi-tech gangster capitalism
MarketWorld, a fast-paced, globally networked finance capitalism

Of course, GBN correctly identified the “MarketWorld” that we have been living in since 1989.

But, what comes next? What comes after “MarketWorld”?

And how will the next economy change American politics?

These are big questions that very little contemporary American public opinion research addresses.

Fortunately, we do have some data on the outlines of the next economy (shared, networked, automated) and we have Jeremy Rifkin’s sweeping portrait of market capitalism’s next act.

In his new book, The Zero Marginal Cost Society, Rifkin argues that capitalism’s own “extreme productivity” is speeding it toward its next iteration, a decentralized market economy operating globally in the online “collaborative commons.” In economic terms, Rifkin analyzes the trends leading us toward a “near zero” marginal cost economy, one in which the cost of producing each additional unit of a product or service is essentially zero, making products incredibly cheap while eroding profit.

If this sounds familiar, it should be. This is already happening in communications, publishing and entertainment. The marginal cost of another download is virtually zero.
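The arithmetic behind “near zero” marginal cost is easy to sketch. As an illustration (the numbers are invented, not from Rifkin): with a fixed cost F to create a product and a marginal cost m per additional unit, the average cost per unit is F/q + m, which falls toward m as volume q grows:

```python
# Illustrative sketch: why digital goods trend toward "near zero" cost.
# With fixed cost F and marginal cost m, average cost per unit is F/q + m,
# which approaches m (here, fractions of a cent) as volume q grows.

def average_cost(fixed_cost, marginal_cost, quantity):
    """Average cost per unit at a given volume."""
    return fixed_cost / quantity + marginal_cost

# A hypothetical digital good: $1M to produce the first copy,
# ~$0.001 of bandwidth per download thereafter.
for q in (1_000, 1_000_000, 1_000_000_000):
    print(q, round(average_cost(1_000_000, 0.001, q), 4))
```

At a thousand downloads the first-copy cost dominates; at a billion, the average cost is essentially the marginal cost of one more download.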

The business model disruption in communications, publishing and entertainment today will happen in energy (distributed solar power generation), education (MOOCs), banking (peer to peer lending), computing power (see Moore’s Law), and manufacturing (3D printing) tomorrow.

But, it’s not just a story of “extreme productivity,” it is also the story of:

1. production and power shifting from large, capital intensive enterprises to the individual “prosumer”
2. the rise of “sharing” business models and the trend toward “disownership”

There is now a growing body of opinion research in these spaces:

Prosumer
The term “prosumer” describes any consumer engaging in household production activities. These could include renting out assets (home, auto, etc.), household energy production (a growing threat to existing utilities), and household creation of products and services as seen in microtasking sites like TaskRabbit and in the future of home-based 3D printing.

Sharing Economy
The trend toward many consumers now valuing access over ownership has been well-documented. This is especially true for millennials and cars.

A 2013 Harris Interactive survey found that in the past two years, 52 percent of Americans rented, borrowed or leased items instead of buying them.

A 2014 Chubb Group survey found that 36 percent of Americans would rent a private home for vacation and that nearly a quarter would rent out their vehicles.

But the best data on the sharing economy comes from the 2013 “Sharing is the New Buying” study by Vision Critical and Crowd Companies. This survey of 90,112 adults in the US, UK and Canada segmented the population into non-sharers, re-sharers (those who buy and sell used goods online) and neo-sharers (those already using advanced online sharing networks). Critically, some 40 percent of the US population is now engaged in the collaborative, sharing economy. The breakdowns are as follows:

Non-Sharers: 60% of the US population
Re-Sharers: 16% of the US population
Neo-Sharers: 23% of the US population

But, why should public opinion researchers, pollsters and elected officials pay attention to these trends? There are two reasons.

First, because the next economy is more likely to be driven by networks of prosumers and e-lancers than by large corporations. In the 21st century the people are the heroes now. And second, because the rise of the next economy that Rifkin describes will undoubtedly create flash points with 20th century businesses. We see this already with:

1. Conflicts over tax treatment of traditional business models vs. sharing models

2. Attempts by traditional businesses to insulate themselves from the heightened competition of sharing models.

3. Public shock that anyone could simply purchase a 3D printer, download the plans for a firearm and print themselves a gun.

4. Electric utilities seriously analyzing the threat from home-based solar power. Read the Edison Electric Institute’s report on “disruptive challenges.”

3D Printing
Take 3D printing as an example. Peer-to-peer music sharing reshuffled the music industry: Napster brought the marginal cost of music to near zero. 3D scanners and printers will do the same for many of the physical objects we purchase today. It won’t happen tomorrow; 3D printing and scanning technology is still improving. But it will happen. We will have a Napster for things. If you doubt it, just take a look at Thingiverse.

The public policy implications of this are very challenging. Makers will design, share and print their own products. They will borrow from existing designs, challenging business control of intellectual property. And they will utterly disrupt the current manufacturing, distribution and retailing system. In a sense, the workers will own the means of production, and the act of making will color their political views. In the future we will closely study the political attitudes and behaviors of prosumers and makers. A good start may be an examination of the White House’s first ever Maker Faire.

Polling and most market research is a “snapshot in time” — a clear picture of opinion the moment the survey was taken. This is good for helping us understand the opinions driving politics today. But, if we want a clearer picture of emerging trends and the next economy, we need to better understand the effects of automation on employment, sharing on existing business models and household production on political views. A good start is Rifkin’s Zero Marginal Cost Society.


Data Data Everywhere…What Price An Insight?

Are clientside researchers, and the industry overall, in good shape – or are we simply one of many industries being disrupted by web-based technologies?


By Edward Appleton

Market Research is going through exciting times – mobile, communities, biometrics are just some of the new ways technology is helping us get closer, and for longer, to the audiences we wish to understand.

DIY providers are another exciting innovation – for those with the time and ability, it’s easy to link up to online access panels, use survey software at very low cost and reduce the price-per-complete radically. Zappistore and Gutcheck are two of the higher-profile DIY providers addressing the perception of research as slow and expensive.

Google Consumer Surveys is another major entrant – making the microsurvey a serious contender for insights gathering, and giving mobile a considerable boost in the process as a data collection mode.

Where does this leave the concept of insights? Is it flourishing, with all the exciting array of data points available almost immediately? Is the circle effectively squared – better tools + lower price points = better insights?

Are clientside researchers, and the industry overall, in good shape – or are we simply one of many industries being disrupted by web-based technologies?

I fear the two concepts – masses of ever-cheaper data and insights excellence – may not be in synch with one another, possibly even going in opposite directions – data virtually omnipresent, insights increasingly rare, not to say endangered. Here’s my take.

1. Insights Departments are Challenged.

Clientside research departments are – according to consistent anecdotal evidence – not growing. On the contrary, fewer client researchers are often expected to handle more work and more datapoints, with budgets that are often flat, perhaps in slight decline.

This equation often doesn’t add up: MR turnaround times are becoming ever shorter and volumes are increasing, leaving less time to add value strategically.

This is a double-edged sword – a commodity trap that can lead to the erosion of perceived value at higher levels of an organization.

Outsourcing is a potential option – but there is a limited number of people on the agency side with the all-round business understanding and interpersonal skills to make a significant impact.

2. More Data doesn’t Necessarily Help

We are currently witnessing an extreme, almost frightening drop in the costs needed to access many forms of “insight” work – qual. and quant. This leads to a surge in volume – and an expansion of the market overall, as more companies can afford to execute a form of market research. It doesn’t automatically mean an increase of value delivered, and it doesn’t per se lead to an explosion of insights.

The option of having easy access to more data can actually be a distraction – find out this quickly, include that additional survey question last-minute… perspiration for sure, but inspiration?

Asking the right business questions, spending more time working on the implications, comparing with alternative data streams, focusing on critically relevant data – these added-value areas easily get squeezed out. The MR discipline is in danger of becoming increasingly tactical. We need to regain strategic weight – which leads me to the next point.

3. Insights Culture – How Deeply Ingrained?

Many companies state that they are driven by the customer – it is definitely included in corporate literature, and a VOC commitment reads well as a necessary part of corporate philosophy.

The reality is often somewhat different. The “customer” is often a major corporate account rather than the end user, with the voice of the salesperson the dominant factor. Operations – finance, sales, supply chain – often hold more weight internally than those paid to represent the outside view, such as Insights.

There are many reasons for this – a topic for a future blog. The critical question is: what does it take to create a strong Insights culture? I’d suggest the following:

  • the attention and support of senior management
  • a marketing-infused culture
  • a strong departmental profile for insights
  • an insights attitude characterized more by goal- rather than process-orientation
  • the ability to speak the language that resonates amongst budget-owners.

How many of the seminars and congresses we attend look at these issues, and how many still focus on methodology?

Returning to my original question: is it an exciting time for insights? I see the glass as half full.

Market Research no doubt has the potential to be an immensely powerful tool; we have a more refined understanding of the limits of direct questioning techniques, seemingly unlimited access to data, and an enhanced toolkit. All this should add up to more impact – if we can consistently demonstrate and communicate the impact insights have. Improved tools on their own won’t generate superior insights – it’s our skillsets that have to improve and deliver.

As economies emerge from a recessionary climate and companies’ confidence to invest increases, we need to grab all opportunities to add value, not just reduce cost, and ensure that insights is a discipline that gets at least its fair share of the marketing budget.

Curious, as ever, as to others’ views.


Jeffrey Henning’s #MRX Top 10: Research Lectures, Confidential Concepts, and Mobile Qual and Quant

Of the 2,451 unique links shared by the #MRX community the past two weeks, here are 10 of the most retweeted.



By Jeffrey Henning

Of the 2,451 unique links shared by the #MRX community the past two weeks, here are 10 of the most retweeted.

1. April Lecture Series 2014 – If you’ve missed any of the ongoing NewMR webinar series, you can watch recordings of Pete Cape of SSI discussing questionnaire design, Betsy Leichleiter of Open Minds presenting on global innovations, and yours truly discussing the representativity of online surveys.

2. Consumer Collaboration around the World: Europe – Tom de Ruyck looks at the state of European collaboration, contrasting practices in the UK, Western Europe, and Eastern Europe.

3. Research Participants and Confidentiality: The Balance of Rights and Responsibilities – Maya Middlemiss of Saros Research presents a 15-minute webinar looking at the responsibilities of research participants to keep the concepts they are evaluating confidential, and the greater difficulty of doing this in a world with social media.

4. Mobile Qualitative: How Does It Fit In the Research Toolkit? – Edward Appleton profiles Revelation, a pioneer of mobile qualitative research whose founder, Steve August, argues that most of the interesting things that happen in participants’ lives occur when the researcher isn’t there.

5. Discord on Business Rules – Ipsos Mori reviews the reaction in the UK to David Cameron’s pro-business policies from MPs, business leaders, and the general public.

6. The State of Research Careers – AMA TV offers a 5-minute video focusing on a number of stories, including growth in market research jobs, with more demand for research directors. The video begins with a profile of how Charity Water uses social media.

7. A Millennial Problem in Market Research – Chris Ruby laments that many of today’s researchers “just fell into it”, and argues that the industry must do more outreach to Millennials. In a recent survey of Millennials, not a single Honomichl 50 research company was listed as a preferred employer.

8. Einstein’s Secret to Amazing Problem Solving (and 10 Specific Ways You Can Use It) – Einstein said, “If I had one hour to save the world, I would spend 55 minutes defining the problem and only five minutes finding the solution.” Yet most of us would reverse this. Luciano of Litemind offers 10 tips for taking advantage of this advice.

9. How TNS Is Validating Mobile Globally – Edward Appleton continues his mobile research series with a profile of ongoing TNS research, some of it contrarian: for instance, there is little evidence that smartphones are informing the retail shopping experience. However, mobile surveys are often better surveys: “shorter, more relevant surveys have a higher predictive validity than longer ones.”

10. From the Client Side: Interview with Stacey Symonds, Senior Director of Consumer Insights for Orbitz Worldwide – Ron Sellers of Gray Matter interviews Stacy Symonds about how she evaluates methodologies and vendors.


Note: This list is ordered by the relative measure of each link’s influence in the first week it debuted in the weekly Top 10. A link’s influence is a tally of the influence of each Twitter user who shared the link and tagged it #MRX.


Research? No Thanks, I’m Only Human! A Client-side Perspective

No one is immune from making daft decisions and our reliance on IQ and educational qualifications as an indicator of competence can be a recipe for disaster.



By Neal Cole

Business people pride themselves on their decision making, and many businesses embed market and competitor research into this process. However, because business people are prone to the same human frailties as the rest of us, this can discourage the use of research and insight.

Behavioral science suggests that as people gain experience and knowledge in their area of expertise they have a tendency to become overconfident and complacent about their ability to understand the past and predict the future. Our brains assume that we are living in a simpler, more predictable world than is really the case.

This is one of the most useful insights of behavioral economics, yet professionally it is a difficult truth to acknowledge when we like to be seen as experts in our field. Indeed, we are sometimes informed that decisions have been made on the advice of an ‘expert’, as if this guarantees the quality of the process.

As humans we are certainly prone to the illusion of understanding. Our minds create narrative fallacies from our continuous attempt to make sense of the world.

We notice the small number of unusual events that happen rather than the multitude of events that failed to occur.

Our memory is selective and biased by the workings of our mind. We construct vivid accounts of the past based on memories that change every time we recall them but believe they are a true reflection of past events.

We suffer from a tendency to like (or dislike) everything about a person. This helps generate a simpler and more coherent representation of the world than is really the case. We fill gaps in our knowledge about a person using guesses that fit our emotional response.

Short-term emotions are probably the most powerful force in our decision making arsenal. Many of our judgements and decisions are directly influenced by feelings of liking and disliking rather than rational deliberation.

We hate uncertainty and suppress ambiguity because inconsistencies slow our thought processes and interfere with the clarity of our feelings. People are attracted towards confidence and we prefer decision makers that demonstrate such qualities above someone who may be equally competent but wants to think through a decision before giving an answer.

People are heavily influenced by the What You See Is All There Is (WYSIATI) rule. We naturally work with the fragmentary information that we have access to as if it were all there is to know. The paradox is that it is easier to construct a consistent story when you have little knowledge. People make fallible guesses from incomplete information by making a leap of faith about how things should work.  Steven Pinker points out our only defence is that it worked sufficiently well in the world of our ancestors.

“Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.” Daniel Kahneman, Thinking, fast and slow.

This can lead us to define our choices too narrowly and consequentially reduces our options. Research and working collaboratively can help by widening our horizons and introducing new insights to challenge our perception of the topic.

We are also very good at changing our beliefs after an unpredicted event without being aware of it. We often unconsciously adjust our view of the world and find it difficult to recall what we believed before the event. This leads us to evaluate the quality of decisions by the nature of the outcome rather than the process by which the decision was made.

“Asked to reconstruct their former beliefs, people retrieve their current ones instead.” Daniel Kahneman, Thinking, fast and slow.

The danger here is that people get blamed for a decision that resulted in a negative outcome despite the unpredictability of the event. In corporate decision making this can result in people relying on bureaucratic solutions to avoid blame which leads to extreme risk aversion.

It can also result in business people receiving unjustified rewards (e.g. bonuses) for being irresponsible with risk and simply getting lucky. This was evident prior to the 2008 financial crisis, when banks and other financial institutions were paying risk takers massive remuneration packages for activities that put the whole financial system at risk. Due to the complexity of some of the assets, their AAA ratings proved to be illusory.

Kahneman asserts that any comparison of how successful or not companies have been is to a large extent a comparison between how lucky or not they have been. In every story of a successful company there will have been moments when the destiny of a firm could easily have turned in an instant.

So, is the analysis of the situation more important, or is it the process that is the key? Research conducted by Dan Lovallo and Olivier Sibony studied 1,048 major business decisions over five years. They found that “process mattered more than analysis by a factor of 6”.

I have no use whatsoever for projections or forecasts. They create an illusion of apparent precision. The more meticulous they are, the more concerned you should be. We never look at projections … —Warren Buffett

This does not mean that analysis is unimportant or should not be undertaken. Rather, we should treat it as only part of the jigsaw. When making decisions, it is essential that we explore uncertainties and encourage discussion of opinions that may contradict the views of senior stakeholders.

Intelligence and a high IQ are not normally associated with stupidity. But research suggests that our propensity to make rash, foolish or irrational decisions is often not related to our IQ. No one is immune from making daft decisions and our reliance on IQ and educational qualifications as an indicator of competence can be a recipe for disaster. When the business culture gives too much reverence to people with certain qualifications and skills this can lead to rewarding decisions based mainly upon intuition rather than evidence.

CAVE Men! Colleagues Against Virtually Everything. People invest a lot of time and effort in their existing strategy or ideas. Dan Ariely calls this the not-invented-here bias. People have a tendency to value their own ideas significantly more than others’. This can result in an obsessive focus on poor ideas and probably explains some of the less successful decisions we come across in business.

Confirmation bias also means that we tend to ignore information that does not align with our existing beliefs. We subconsciously seek and are drawn to evidence that confirms our view of the world. People are very good at overlooking facts that undermine their opinions and will follow the crowd that most closely supports those beliefs.

So where does this leave the customer insight professional? It demonstrates the need for a comprehensive strategy for promoting the use of insight and collaboration to facilitate innovation and evidence based decision making.

  • Stakeholder management is essential not just to obtain the buy-in and support of senior management, but also to counteract many of the myths about how research and insight is undertaken.
  • Use storytelling to engage people at an emotional level. Our brains become more active when we are told a story, not only the language processing part of our brain, but also other areas we would normally use to experience the events of the story in real life. Some evidence suggests that our brains can synchronize with the brains of the person telling a story.
  • Spend time getting to understand your audience and their preconceptions. “What You See Is All There Is” tends to be strongly influenced by survey research when it comes to insight.
  • Never underestimate the importance of how choices are presented and ensure you are fully prepared so that you avoid uncertainty about your recommended approach.
  • Immerse yourself in the customer-facing side of your business by meeting and observing how your organization interacts with your customers. Don’t rely on third parties or management to identify the real challenges customer-facing staff have to deal with.
  • Identify information gaps to highlight the need for research and insight. Our illusion of understanding sometimes needs reminding how little we really know about the world.
  • Challenge default methods of conducting research. Examine the potential for alternative approaches to insight, including experiments, observation and collaborative methods.
  • Encourage a culture of experimentation. For instance use A/B and multivariate testing on your website to understand what content most engages and motivates your existing and potential customers.
  • To counter hindsight bias, always ask key stakeholders before you commission a project what they expect the outcome/findings to be. You can then use these as hypotheses to prove or disprove their views.
  • Encourage all areas of the business to share insights and engage with the research process. It shouldn’t just be marketing and customer services that buy-in to customer insight. This helps avoid group think by bringing diversity into the decision making process.
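The A/B testing suggestion in the list above can be made concrete. As a minimal sketch (the conversion numbers are made up for illustration), a two-proportion z-test is a common way to judge whether the difference between two page variants is more than noise:

```python
# Minimal A/B test sketch: compare conversion rates of two page variants
# with a two-proportion z-test. |z| > 1.96 is roughly significant at the
# 5% level (two-sided). Numbers below are hypothetical.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant A: 90 conversions from 1,000 visitors; variant B: 120 from 1,000.
z = two_proportion_z(90, 1000, 120, 1000)
print(round(z, 2))
```

With these figures z comes out above 1.96, so the gap is unlikely to be chance; multivariate testing extends the same logic to several page elements at once.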


Further reading: Thinking, fast and slow by Daniel Kahneman, Herd by Mark Earls (@Herdmeister),  Decoded by Phil Barden (@philbarden), How to Get People to Do Stuff by Susan Weinschenk (@thebrainlady).


Innovation Or Sales Pitch?

As a marketing science and analytics person I am bombarded by sales pitches of various sorts, frequently pertaining to “new” or “innovative” methodologies or software. I’d like to share a few thoughts about how to separate the wheat from the baloney.


By Kevin Gray

Openness to change is not blind acceptance of claims.

“What you have is old.  I have something new and better.”  This may well be true but it is also a tried-and-true sales pitch.  Talking about “new” is O-L-D.  Though I strongly feel we should keep our eyes, ears and minds open for things that will help us live happier, fuller and more productive lives, there is no need to believe everything we read or hear.  If it seems too good to be true, most likely it is.

The fact is things can be old and good, old and bad, new and good or new and bad.  What’s more, something positioned as innovative might actually be an old idea that has been repackaged and recycled.  (If you point this out, you may be greeted with a retort offering quite trivial modifications in support of the argument that this time it really is new.)

Some innovations don’t diffuse very far simply because they aren’t very good ideas.  On the other hand, new ideas can fail, not because they’re bad ideas, but because they are difficult to comprehend or put into practice.  New products or processes may address real and important needs but may be too complicated for the intended user.  Others fail because they have been poorly marketed.

As a marketing science and analytics person I am bombarded by sales pitches of various sorts, frequently pertaining to “new” or “innovative” methodologies or software.  I’d like to share a few thoughts about how to separate the wheat from the baloney, and I think they will apply generally, not just to my areas of specialization.

One tipoff that a claim is suspect is when the status quo is criticized… and the party leveling the criticism has gotten the status quo wrong. How can you think outside the box if you don’t know what’s in it? We shouldn’t take for granted that another person knows our job better than we do. Generalizing from the exception is another tactic to watch out for, and sometimes very poor practice is presented as standard practice. Clearly, many things will be an improvement over incompetence. Consumer surveys, some of which are very badly designed and executed, are a case in point. In general, though, they do work and are still essential even though they are “old.”

Be on the lookout for the words “advanced” and “world class.”  Like “innovative” and “revolutionary” they are hackneyed and have lost much of their meaning.  One example is “advanced analytics” software that, in reality, only offers cross tabulations and graphics, and another is software that mainly consists of standard routines wrapped in a flashy package.  They may be solid products, but no better than what you already have on hand.  Don’t allow yourself to be dazzled; this of course applies generally, not only to software.

“We are the only ones who can…” should get your guard up as much as “99% accurate.” One is tempted to wonder if the reason no one else does it is that it doesn’t work. Ostensible benefits of a new technology are often really camouflaged claims about its hypothetical potential, not what it has in fact been proven to deliver. “Validated” is another word to be wary of. How is valid defined? Who did the validation and what process did they follow? Has the validation been replicated? More to the point, can the new product or system really do what the folks pitching it claim it can do? I recall a rather caustic but telling comment made by another marketing science person: “This algorithm is fantastic! Can it also forecast how many suckers will be born in the next hour?”

Dubious claims sometimes conceal themselves behind academic or scientific authority.  While endorsements from true experts are impressive, should any substantial investment be required please take it upon yourself to find out who the real experts really are and what they are really saying.  Also, don’t be taken in by impressive paper credentials; ethics are not correlated with mathematical prowess or programming skills.

Some pitches attempt to bewilder us with complexity, perhaps in the hope we won’t look too closely or ask too many questions. What is being pushed apparently is not only new but so complex and sophisticated that the ordinary Joe will never be able to get his head around it. (Therefore, if he buys it he joins an elite cadre!) Those adopting this sales strategy typically lean heavily on jargon and tend to dodge specifics. Don’t allow yourself to be intimidated; try to pinpoint concretely what this new product, service or process is supposed to be able to do and whether there is genuine evidence it can deliver on these promises. Or just ignore it. This classic Monty Python skit is a wonderful parody of the tendency of the chattering classes to over-intellectualize.



Popular business media are another excellent source of nonsense. “Companies that do XYZ are more profitable than companies that don’t do XYZ” is not evidence that XYZ works.  It is merely a sentence written in English.  A few obvious questions should come to mind.  Specifically, how is XYZ defined?  How is “more profitable” defined?  How did the two groups of companies differ before it was adopted?  What about performance over time? Average profitability for companies doing XYZ might actually have decreased since they adopted it!
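The selection-bias point above can be demonstrated with a tiny simulation (all numbers hypothetical): if firms that were already profitable are simply more likely to adopt XYZ, adopters will outperform non-adopters even when XYZ itself has zero effect.

```python
# Hypothetical simulation of the "companies that do XYZ are more profitable"
# fallacy. Adoption correlates with prior success, but XYZ does nothing:
# post-adoption profit equals baseline profit, yet a large gap appears.
import random

random.seed(1)
baseline = [random.gauss(10, 3) for _ in range(10_000)]  # pre-existing margins
# Already-profitable firms (margin > 10) adopt XYZ 80% of the time; others 20%.
adopt = [random.random() < (0.8 if p > 10 else 0.2) for p in baseline]

adopters = [p for p, a in zip(baseline, adopt) if a]
non_adopters = [p for p, a in zip(baseline, adopt) if not a]

def mean(xs):
    return sum(xs) / len(xs)

gap = mean(adopters) - mean(non_adopters)
print(round(gap, 2))  # a sizable "XYZ premium" with zero causal effect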

Some claims are self-repudiating almost to the point of farce, for instance, eloquently-written pieces asserting that humans cannot express themselves well verbally.  We may be told in-depth interviews don’t really work but, inexplicably, text-mining Twitter with their software does.  It seems we’ve been deceiving ourselves all these years.  Some pitches for biometrics make similar sorts of contentions, neglecting that their development may have required exhaustive interviews with test subjects.

Along similar lines, that humans are not perfectly rational is not news, nor was it when Sigmund Freud was a lad. My reason for bringing this up is that every few years, it seems, we are informed once again that it has been discovered that humans do not always shop very scientifically, and therefore that conventional marketing thinking and practice are wrong. I suspect, however, that tail fins were not installed on automobiles in the 1950s for purposes of aerodynamics.

Silly or misleading claims and outright falsehoods can discourage adoption of useful new tools, and overhype risks a backlash.  Nevertheless, something slickly marketed or hyped to an irritating degree may in fact work well and be worth its price tag.  Being open-minded means being open-minded, and our decision ultimately should boil down to: "Will I get what I'm expecting, and will our investment, including my time and my staff's time, pay off under the constraints we work within?"

Look closely and ask hard questions.


The 4 Futures Of Marketing Research

How is the marketing research industry doing when it comes to adjusting itself to the ever-accelerating changes in the marketplace? Not all that great.



Editor’s Note: It’s been a while since we posted a good “future of research” piece here, but it was worth the wait because Kristof De Wulf, CEO of InSites Consulting, has delivered a doozy that I think is very close to the mark. I would add a few gradations of new sectors I think will begin to emerge in a more fragmented, data-rich future, and I think we’ll see consolidation from platform players, consultancies and marketing agencies that will usurp the role of many MR firms, but overall Kristof’s 4 futures scenario is a great road map of where the industry is heading at a very fast pace.

This post is important folks, and I hope you give it the attention it deserves.


By Kristof De Wulf

We all know the future will be fast and furious. It is fast: according to Richard Foster of Yale University, the average lifespan of a company has decreased by more than 50 years over the last century, from 67 years in the 1920s to just 15 years today. It is furious, as new technologies are expected to further drive massive economic transformations and disruptions in the coming years. There is no escaping it: no matter what industry you are in, you can run but you cannot hide.

So how is the marketing research industry doing when it comes to adjusting itself to the ever-accelerating changes in the marketplace? Not all that great. Just recently, McKinsey reported that the research & consulting industry is at the bottom of the list in comparison with other industries when it comes to economic mobility. With traditional surveys and focus groups still dominating the scene, the speed at which our industry is changing is similar in many ways to the speed at which companies adjusted to the invention of electricity. It took well over a decade for companies to realize they no longer needed to build their factories near water, still considering water to be their prime source of power. We cannot afford a similar slow response now, with the speed of environmental change being far higher today than at the time electricity was invented.

Using data from the most recent GRIT study, Bottom-Line Analytics derived an interesting mapping of the top 20 innovators in marketing research.[1]



This mapping tells us something important about the most likely scenarios for the future of marketing research. I expect the industry to take 4 possible strategic directions over the coming years, adjusting itself to the conditions of today’s economy:

  1. Lifting on DIY tools, driven by speed and tech infrastructure
  2. Making sense from ever bigger data, driven by size and tech infrastructure
  3. Bypassing the rational brain, driven by methodology
  4. Tapping into consumer collaboration, driven by trust, diversity, innovation and creativity


  1. Lifting on DIY tools

In the midst of an unfolding economy of individuals, anyone talented and creative can make it happen, with neither money nor power standing in the way. In an age of unprecedented power for individuals, where we can 3D-print just about anything, we are about to witness a wave of disintermediation in our industry, with clients directly taking over tasks that used to be performed by marketing service providers. Many of the presentations shared at the latest IIeX conference in Amsterdam touched upon powerful new technologies which enable clients to do just that, shaking up the entire industry. Google Consumer Surveys is a striking case in point, currently rolling out its business model across more than 40 countries. Lifting on a huge base of more than 1 billion Android users who get the Google Consumer Surveys application pre-installed on their device, Google claims that its approach generates more reliable data, as it taps into audiences that are less skewed towards market research (compared to research access panels) and demands less fill-out time from consumers given the limited number of questions. No wonder Google has just popped up in the top 10 GRIT list of most innovative research agencies in the world.

  2. Making sense from ever bigger data

The speed at which we collect new data is several times faster than the speed at which we can structure, analyze and interpret that same data. Until 2003, mankind had collected about 5 exabytes of data during its entire history. Today, we collect new data at a rate of about 10 zettabytes a year, or 2,000 times more in one year than in the whole of prior human history. Gartner just recently published a study forecasting that about one third of the Fortune 100 companies are about to face an information crisis by 2017, unable to convert data into insights. They may be right. Just think about the mysterious disaster of Malaysia Airlines flight MH370, a symbol of the violation of our faith in the information society. While we are flooded with satellite data, we struggle to get to the right type of information. I expect to see a bright future for companies who manage to find more needles in ever-growing haystacks, applying more creative, sophisticated and powerful techniques than we do today.

  3. Bypassing the rational brain

As human beings, we are all victims of our own limitations. We suffer from an illusion of knowledge bias, thinking we know more than we actually do. We suffer from a false consensus bias, starting from our own vision of the world, believing that everybody thinks like us and would make the same choices. We suffer from an observational selection bias, making us find new evidence to support our own false beliefs. We suffer from an agnosticism bias, not knowing what we do not know, focusing too much on what we do already know. The Nobel Prize winner and intellectual godfather of behavioral economics, Daniel Kahneman, summarized a lifetime of research on human thinking in his book “Thinking, Fast and Slow”. His groundbreaking work teaches us that people primarily make choices according to ‘system 1’ thinking (fast, intuitive and emotional) as opposed to ‘system 2’ thinking (slower, deliberative and rational). A major part of the future of marketing research will lie in finding new ways to bypass consumers’ rational thinking and uncovering the real reasons behind behavior.

  4. Tapping into consumer collaboration

In the age of the Consumer-Innovator, where consumers in the UK already spend more time and money on innovation than all consumer product firms combined, marketing research can finally benefit from tapping into the consumer collaboration domain. The people who were formerly known as consumers have turned into contributors and volunteers, composing a world full of problem solvers who are creating billions of dollars’ worth of value without even being paid for it. With consumers having increasing access to powerful online, social and mobile technologies, clever organizations can build on consumers’ desire, enthusiasm and ability to collaborate with brands if they create the required conditions for it. Research agencies have a unique strategic advantage, having worked with consumers for ages, putting them in a far more powerful position than any business strategy consultant, advertising agency or innovation firm willing to jump on the consumer empowerment bandwagon. The expertise and capabilities that research agencies have built over time provide a strong backbone and springboard to claim the consumer collaboration space.

I’m interested in hearing what you think about the future of marketing research!

[1] Mapping was revamped by the InSites Consulting design team


Market Research Is Like…. A Collection Of Some Of The Best MR Analogies

Here are some of the best marketing research analogies, shared by some of the brightest minds in marketing research.



By Isaiah Adams

Have you ever had someone try to explain something to you that didn’t make sense no matter how many times they explained it? Now, how many times has the “light bulb” turned on once they shared a good analogy with you? Analogies are a powerful way to explain complex ideas. Researchers and marketers have long struggled with communicating research methodologies and strategy; marketers have a hard time understanding “Research-ese”. One of the best ways to bridge that gap is by using analogies. Below are some of the best marketing research analogies, shared by some of the brightest minds in marketing research.

A Strong Foundation

Using an analogy of a house foundation, marketing research can be viewed as the foundation of marketing. Similar to how a well-built house needs a strong foundation to remain sturdy, marketing decisions require the support of research in order to be perceived favorably by customers and to stand up to competition and other external pressures. Therefore, all areas of marketing and all marketing decisions should be supported with some level of research.

Application of Market Research

Some use research as the drunkard uses the lamppost, for support rather than for illumination.

- David Ogilvy

Marketing Research is like panning for gold. You must sift through the dirt in order to identify the golden opportunities.

-Heather Hinman – Salford Systems (@SalfordSystems)

Starting a business without doing market research is like stepping out onto a tightrope without bothering to check the tightness of the knots that are holding the rope in place. You’re halfway across when the knots loosen, the rope wobbles; you lose your balance, and fall to the ground with a splat.

The problem is they’re afraid their market research will tell them what they don’t want to know, that their big idea – their baby – ain’t so cute after all. They fall victim to what I call “Ugly Baby Syndrome.”

It’s a tough pill to swallow when market research tells you that your baby is about as attractive to the buying public as the south-end of a north-bound mule.

-Tim Knox – entrepreneur, author, speaker, and radio host (@timknox)

Marketing Research is like having a root canal. No one wants to do it, but it is absolutely necessary. You’ll be glad you did it in the end.

-Heather Hinman – Salford Systems (@SalfordSystems)

 Importance of Quality Research

“Doing good research makes you like a one eyed man in the land of the blind.” and its corollary: “Doing bad research is like sticking a lawn dart in your eye.”

-Mark Moody


Outliers – the folk at the back of the hall who know something you don’t, and you don’t want to listen to them.

-Dr. Brian Monger (@SmartaMarketing)

Stimulus-Response Measurement

A comedian does not simply walk on stage and say “I’m a really funny person.” They continually measure the crowd reaction as they alter different ways of delivering a joke. They may change how the punch line is delivered, the length of the pause before the punch-line, etc. By testing various ways of telling the same joke, the comedian is able to identify the best way to tell the joke.

Conjoint Analysis Methodology – Booking a flight

Every day we make a series of choices (or trade-offs). There are usually many elements involved in each choice and we typically don’t take the time to weigh-out each element in our decision process. We just choose.

To illustrate how we make trade-offs in purchasing decisions, think of when you book a flight.  You don’t pick just the cheapest flight; you also consider the time, number of layovers, frequent flyer miles, baggage fees, etc.  The idea is that traditional surveys ask people rational questions to which they give rational responses (i.e., “What factor is most important in choosing an airline?” – most people would answer price).  However, purchasing decisions are far more complex and irrational. When you book a flight, you choose the flight that best fits your desires. By analyzing a series of choices during a conjoint exercise, the individual importance of each element is derived.
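The idea of deriving importance from choices can be sketched in a few lines of Python (a toy, not a real conjoint package: the flight attributes, the single "respondent" and the hidden utility rule are all invented). We simulate pairwise flight choices driven by a hidden preference, then recover the relative appeal of each attribute level purely by counting how often it appears in the chosen option.

```python
from collections import Counter
from itertools import product

# Hypothetical flight profiles: two attributes, three levels each.
prices = [200, 300, 400]
stops = [0, 1, 2]
profiles = list(product(prices, stops))

def choose(a, b):
    # Hidden rule the "respondent" uses (unknown to the analyst):
    # every stop hurts about as much as $100 of price.
    utility = lambda p: -p[0] - 100 * p[1]
    return a if utility(a) >= utility(b) else b

shown, chosen = Counter(), Counter()
for a in profiles:
    for b in profiles:
        if a == b:
            continue
        winner = choose(a, b)
        for profile in (a, b):
            shown[("price", profile[0])] += 1
            shown[("stops", profile[1])] += 1
        chosen[("price", winner[0])] += 1
        chosen[("stops", winner[1])] += 1

# Share of appearances in the chosen option: a crude part-worth per level.
share = {level: chosen[level] / shown[level] for level in shown}
```

Nobody was ever asked "is price important?", yet the counts reveal that cheaper and fewer-stop levels win far more often; real conjoint software estimates this with choice models rather than raw counting.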

Conjoint Analysis Structure – Doughnuts

I like to think of Conjoint Analysis as a “doughnut shop”. The features (attributes) of a doughnut shop are the things it uses to make and sell doughnuts; such as ingredients, flavors, pricing, and time of day to sell.

A feature at the doughnut shop (e.g. Flavor) can be broken down into levels (e.g. powdered sugar, chocolate, cinnamon sugar, plain, or maple.)

-VP of Client Services at Survey Analytics – Esther LaVielle (@surveyanalytics)

MaxDiff and Food

MaxDiff is like a refrigerator full of food. There’s one thing in the refrigerator you love the most and can’t get enough of, and there’s one thing you wouldn’t touch with a three-foot pole.  For my husband, it’s the cold beer he loves best from the fridge, and my smelly homemade kimchi in a jar that he can’t stand.

-VP of Client Services at Survey Analytics – Esther LaVielle (@surveyanalytics)
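The simplest MaxDiff analysis, so-called count analysis, can be sketched directly from the refrigerator analogy (the tasks and responses below are invented): tally how often each item is picked as best and as worst across the sets it appeared in, and score it by (best − worst) divided by times shown.

```python
from collections import defaultdict

# Hypothetical MaxDiff tasks: (items shown, picked best, picked worst).
tasks = [
    (["beer", "kimchi", "milk", "eggs"], "beer", "kimchi"),
    (["beer", "milk", "cheese", "kimchi"], "beer", "kimchi"),
    (["milk", "eggs", "cheese", "beer"], "beer", "eggs"),
    (["kimchi", "cheese", "eggs", "milk"], "cheese", "kimchi"),
]

shown = defaultdict(int)
best = defaultdict(int)
worst = defaultdict(int)
for items, b, w in tasks:
    for item in items:
        shown[item] += 1
    best[b] += 1
    worst[w] += 1

# Best-minus-worst score, normalized by exposure: +1 = always best, -1 = always worst.
scores = {item: (best[item] - worst[item]) / shown[item] for item in shown}
ranked = sorted(scores, key=scores.get, reverse=True)
```

The beer ends up at the top of the ranking and the kimchi at the bottom; production tools typically fit a choice model on top of these counts, but the intuition is the same.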

Market Research – The Football Coach

The modern marketing researcher is the football coach: providing each of the team’s players with the right level of intel at the right time, before, during and after the match; so that all players can focus on what they’re hired to do: winning the game!

-Hans Lingeman- CEO / Partner at Winkle (@hanslingeman)

Crowdsourcing: Difference between closed and open innovation

Imagine that you are planning a big surprise party. You want it to be entertaining, spectacular, memorable and different. You could plan and project manage every element of the party yourself: the theme, venue, music, food, drink, entertainment, games, diversions etc. Or you could involve a number of people to help you with their ideas and their skills. One person could manage all aspects of the venue, someone else could design special decorations, another person could put together a music mix and so on.

If you do it all yourself then you are in complete control, you have sole responsibility and you can keep the whole thing a surprise, but you have to remember to do everything and it is only as good as your ideas. If you bring in a group of friends and experts to help then you can harness their imaginations; you can bounce ideas off each other. You have to delegate tasks, which involves collaboration, supervision, letting go and an element of risk. Keeping the whole thing a surprise is more difficult but can be done. The choice between doing it all yourself and doing it with a group is the choice between a closed and an open model.

-Paul Sloane – Inexorable Rise of Open Innovation and Crowdsourcing

Focus Group Facilitator (Moderator)

“To use the analogy of exploration, the facilitator is not so much the expedition leader; rather, a combination of navigator and cartographer, ensuring that the group heads in the right direction but happy to investigate new paths if relevant to the purpose of the expedition”

-Bob Gates and Mary Waight (2007: 113, Focus Group Methodology: Principles and Practice)

Data Mining

Finding the proverbial needle in the haystack.

Data Mining: The danger of the model “over-fitting”

“Say you are at the tailor’s, who will be sewing an expensive suit (or dress) for you. The tailor takes your measurements and asks whether you’d like the suit to fit you exactly, or whether there should be some “wiggle room”. What would you choose?

The answer is, “it depends how you plan to use the suit”. If you are getting married in a few days, then probably a close fit is desirable. In contrast, if you plan to wear the suit to work throughout the next few years, you’d most likely want some “wiggle room”… The latter case is similar to prediction, where you want to make sure to accommodate new records (your body’s measurements during the next few years) that are not exactly identical to the current data. Hence, you want to avoid over-fitting.”

- Galit Shmueli

How product innovations actually spread through a market

The chance that an individual tree will be consumed in a fire depends upon its neighboring trees: how many of them burn, how long the tree is exposed to high temperatures, and whether the tree is isolated, without nearby burning trees to provide a source of ignition. In dense forests, most trees will burn, but in sparse forests, few trees will burn.

Translated into marketing insights, if potential customers are surrounded by people who have adopted a product innovation, and if this adoption is repeatedly brought to their attention, it’s likely they will also adopt the innovation. Multiplied over an entire market, this innovation will be a success. But if the reverse conditions prevail, the innovation will not reach a level of acceptance necessary to remain in production. It will fail.

-Robert Winsor (1995). “Marketing Under Conditions of Chaos: Percolation Metaphors and Models,” Journal of Business Research, 34, pp. 181-189.
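The percolation metaphor is simple to simulate (a toy model; the grid size, density values and random seed are arbitrary choices): scatter trees on a grid with a given density, ignite one edge, let the fire spread to adjacent trees, and compare the burned fraction in sparse versus dense forests.

```python
import random

def burned_fraction(density, size=30, seed=42):
    rng = random.Random(seed)
    # True = a tree is present at this cell.
    forest = [[rng.random() < density for _ in range(size)] for _ in range(size)]
    trees = sum(row.count(True) for row in forest)
    # Ignite every tree in the left-most column.
    frontier = [(r, 0) for r in range(size) if forest[r][0]]
    burned = set(frontier)
    # Fire spreads to the four adjacent cells that contain an unburned tree.
    while frontier:
        r, c = frontier.pop()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < size and 0 <= nc < size and forest[nr][nc] and (nr, nc) not in burned:
                burned.add((nr, nc))
                frontier.append((nr, nc))
    return len(burned) / trees if trees else 0.0

sparse, dense = burned_fraction(0.3), burned_fraction(0.8)
print(f"burned fraction: sparse forest {sparse:.0%}, dense forest {dense:.0%}")
```

Below the site-percolation threshold (roughly 0.59 on a square grid) the fire stays local; above it, it sweeps nearly the whole forest, the same threshold dynamic the article maps onto product adoption succeeding or failing.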

Optimal Marketing-Mix

An optimized marketing mix is like a recipe for chocolate cake.  You can take a few basic ingredients (flour, eggs, milk) and mix them in one set of proportions, add some flavoring, put it in the oven and you get chocolate cake.  Take the same ingredients and mix them another way, put them in a frying pan, and you get pancakes.  There is no “one answer” – the recipe depends on what you’re trying to do.

-Jeff Ewald – CEO at Optimization Group

What are some of your favorite marketing research analogies?



Nominations Open For 2014 Ginny Valentine Badge of Courage Awards

The Research Liberation Front (RLF) announces that it is accepting nominations for the 2014 Ginny Valentine Badge of Courage awards.




The awards, an industry first, were launched in London in 2012 to recognize exceptional bravery in the market research industry. They are named after the late Ginny Valentine, the “ultimate revolutionary” who established semiotics in the UK as a legitimate market research tool in the 1990s. In this, their third year, the awards travel to Atlanta as a collaboration between the Research Liberation Front and GreenBook, a champion of innovation in research.

In the video below, John Griffiths describes why an award was established in Ginny’s name:

The winners will be announced in Atlanta in June 2014 at the Insight Innovation Exchange of market research innovators.

Unlike conventional market research awards, which typically celebrate marketing success or the rigor of a clever methodology, the Ginny Valentine Badge of Courage is awarded to those who fought against long odds and showed exceptional determination to produce great market research that informs and inspires.  The awards are designed to recognize researchers wherever they work – within client companies or as suppliers.

You can submit nominations and vote on current nominations using a crowdsourcing ideation approach here: http://kl.grupthink.com/.

Nominations are open through April 30th, 2014.

Bravery takes many forms: developing a new approach; struggling with a difficult brief; undertaking cultural research methodologies such as semiotics and ethnography; setting up a new venture; pushing through controversial projects; pioneering new approaches in a client organization; even facing physical danger, to name a few. In all cases, the panel will look for nominees with persistence, drive, and guts.

Anyone can nominate someone for a bravery award. From anywhere in the world. As long as the work is research. You can even nominate yourself!  Nominators need to explain in up to 500 words why they think the nominee should receive a bravery award, and post this on our crowdsourcing board at: http://kl.grupthink.com/.

The public will then vote on which nominees make it to the shortlist. The final decisions will be made by our fearless panel:

  • Ana Alvarez, Senior Director Insights, PepsiCo Latin American Beverages
  • Catherine Willis, Customer Research Manager, Delta Air Lines
  • Betty Adamou, CEO and Founder, Research Through Gaming
  • Alison White, Managing Director, Face Facts

The judges’ decisions will be revealed at the Ginny Valentine Badge of Courage awards ceremony at the Insight Innovation Exchange in Atlanta.

This award is ideal for managers and research sponsors looking to recognize the work of team members or other stakeholders. You can find more information about the 2013 awards on the crowdsourcing site or at www.researchliberationfront.com.

Speaking on the launch of the third year of the awards, John Griffiths from the Research Liberation Front commented:

“The Ginny Valentine Badge of Courage awards won a new group of supporters after the ceremony in New York last year, when winners included a researcher at Unilever who had launched a global accreditation program to improve standards, which wasn’t universally welcomed by suppliers, and a researcher who was bombed out of her offices in Pakistan because of the healthcare research she was doing.  Another winner, Catalina Mejia, who was running community surveys in Colombia under the noses of FARC guerrillas, is speaking at IIeX LATAM this week in Santiago and will be on a panel at the event to encourage the nomination of brave researchers now that nominations are open. These awards are really quite unlike other industry awards, and the winners have the satisfaction of knowing they will inspire others in the industry to do better, braver work.”


Who’s Afraid Of The Big Bad Algorithm?

Captain America: The Winter Soldier deals with many of the very real concerns we have in our very real lives today, including what might be modern society’s new unmentionable “A”-word: Algorithm.



Editor’s Note: Like any good geek, I went to see Captain America: The Winter Soldier this weekend. Suffice to say I LOVED IT, but for reasons I didn’t expect going in. It was a very smart movie in all ways, but perhaps smartest was the integration of smart tech, Big Data, and predictive analytics into the major plot of the story. It was actually called “Operation Insight” in the movie! After the mandatory discussion about the flick with my family who went with me, I broadened  the discussion on social media. Immediately Eric Swayne started tweeting back about his take on the movie along the same lines as mine and we both agreed it would make a great blog post. Since I knew there was no way I could fit it into my schedule this week, Eric volunteered to do the honors and it is an excellent post! I hope it’s the first of many from Eric: he is brilliant. Enjoy!

By Eric Swayne

WARNING: The following post will contain copious references to Captain America: The Winter Soldier, some of which may reveal key plot points in the movie. Proceed at your own personal superhero-movie preferred level of risk.


Saw the aforementioned movie this past weekend, and I must admit, it was awesome. And I’m not alone: the movie has already set a box office record for April (beating Fast and Furious 5), and has received critical acclaim from even the most staunch of fanboys. Sure, there were Easter Eggs and references galore, including one brilliant nod to Samuel L. Jackson’s iconic role in Pulp Fiction. But what makes this movie really pack a punch isn’t your standard fare of pecs and abs – I think it wins with audiences at a deeper level because it deals with many of the very real concerns we have in our very real lives today. Drones and electronically-controlled death from the sky? In the movie. Government being the entity you can’t trust anymore, because of their hidden agendas? Got it. Questions about the nature of privacy and the terrifying power of data mining? Check.

But beyond those, you see discussed what might be modern society’s new unmentionable “A”-word: Algorithm.


In the movie, a mad scientist (and I’ll leave the description there) creates an ultimate algorithm that can predict which individuals will be dangerous to the “bad guys” in the future, thus giving them targets to attack in their nefarious scheme. Within the movie, it’s stated that this algorithm uses the detritus of our digital lives to accomplish its evil machinations: credit card statements, phone calls, text messages, social networks, et cetera. Of course, this level of data collection sounds a lot like some recently-unveiled REAL government programs that are rocking the international community, so you immediately see the monsters in the shadows the film’s creators are implying. Moreover the concept implies that, given enough data about an individual, an algorithm has almost infinite powers of clairvoyance with deadly accuracy. Here is where fiction diverges from reality, because while the data may be limitless (tremendous privacy issues aside), algorithms have considerable limits they don’t discuss on the silver screen:

1. Algorithms are based on assumptions.

The most common assumption baked into most algorithms is that past performance predicts future behavior. This is often the case, but isn’t correct in all cases for all time. In the film’s case, the algorithm assumed people were binary – either enemies or not. Pay close attention when assumptions hit binary “chokepoints” like these when dealing with human data – at an aggregate level, humans are extremely messy from the perspective of their data. In fact, assumptions one makes about the world could be very correct at the time they are made, but can rapidly become obsolete. One classic example of this is in Natural Language Processing for sentiment analysis: most solutions currently on the market force their answers into discrete buckets of “positive,” “negative,” or “neutral.” This already presupposes the content has a sentiment, and that it can be categorized for the whole of the analyzed segment. It also assumes certain linguistic patterns are reliable clues for that sentiment, when we all know language evolves rapidly, and words with a negative connotation can change or even reverse their value. Sentiment algorithms are great for high-level directional measurement, but at ground level can be insufficient.

It’s critical to be self-aware when baking these assumptions into any equation, because they necessarily limit the outcomes you can create. After all, if the world is only black and white, it’s extremely hard to see color.
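The three-bucket assumption is easy to see in a toy lexicon classifier (the word lists and example texts are invented; real tools are far more sophisticated, but share the same structural assumption):

```python
# Hypothetical word lists; real lexicons contain thousands of entries.
POSITIVE = {"great", "love", "awesome"}
NEGATIVE = {"terrible", "hate", "sick"}

def sentiment(text):
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    # The baked-in assumption: every text must land in one of three buckets.
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this brand"))     # "positive"
print(sentiment("that new ad is sick"))   # slang praise, scored "negative"
```

The moment "sick" flipped to slang praise, the lexicon's assumption quietly became obsolete, exactly the drift described above.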

2. Algorithms are probabilistic.

Most behavioral algorithms are based on statistical models, using past events to find patterns that appear with a significant level of consistency, then applying those patterns to newly gathered data to score potential outcomes for the future. The key word there is “score” – very often, algorithms provide a probability, not an “answer.” These scores define a range of futures, some more likely than others, but all of which could exist. For the ultimate sport of stat nerds – baseball – ESPN often provides probabilities for a given team winning a given game at a given time. Even though these may say my beloved Rangers have a 99% chance of winning, there’s always a chance for the other team to find that 1% – usually with a ball somewhere in the bleachers. I have a lot of certainty before that point, but that certainty isn’t absolute.
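That a score is not an answer can be checked in a few lines (the seed and sample size are arbitrary): a 99% favorite still loses about one game in a hundred.

```python
import random

rng = random.Random(7)
p_win = 0.99
games = 100_000
# Each game: the favorite wins with probability 0.99.
losses = sum(rng.random() >= p_win for _ in range(games))
print(f"{losses} upsets in {games} 'sure-thing' games")
```

Over enough games the 1% shows up reliably, which is why a probabilistic algorithm can be well calibrated and still "wrong" on any given prediction.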

3. Algorithms are reactionary.

Let’s say you wanted to create an algorithm to predict what time I get home from work every weekday. How would you start? Like any good behaviorist, you’d like a set of information about my habits to start from. Especially useful would be data that correlate highly with the event you want to predict – things like what time I pass the gas station down the street every day. But the important nuance here is that you have to start from data that are informative about this event, not just any old data. For example, I could tell you my car is grey, has a V6 engine, and the right-front tire is about 4 psi low. All very personal data points, but totally useless for this purpose. They’re not an effective data set for training an algorithm on my behavior. In fact, it’s very possible to create a bad algorithm using this data, using an assumption (there’s that word again) that people who own grey cars consistently arrive home after 5pm. This can be entered into an algorithm, but it’s still totally incorrect. Every algorithm created is reactive to the data set available to its creator, and cannot be pulled out of thin air.

Algorithms are tools. Like all tools, they carry no inherent “good” or “bad.” And, like all tools, they carry the flaws of the humans that create them. So the next time you see movies (or real life) treat algorithms as some omniscient source of clairvoyance, just remember they’re only that way in the comics.


Is Your Focus Strategic or Tactical? Why Not Both?

As mobile usage becomes more dominant the options for better, faster and more useful information grow exponentially.



By Ellen Woods

In most research organizations the research is primarily tactical, with the more strategic initiatives left to other areas. If you’ve been at this awhile, you have probably sat through more than your share of meetings where horns were locked over the best approach, with those who have a strategic voice often at odds with the more tactical assessments.

In fact, that is one of the biggest reasons why digital marketing has changed the face of most marketing organizations and why, in many organizations, market research finds itself struggling to stay relevant.

Tactical research is far cheaper to execute and provides answers to specific questions. It usually has little impact beyond the question or project at hand unless normative data are involved, and execution is usually quick. Therefore, it wasn’t a big surprise that it became the basis for tracking and the method of choice when Internet research made frequent data collection cheaper and quicker and, at least by “time to the boardroom” standards, better.

Therein lies the rub.

Cheaper, quicker, better worked for research up until the time data analytics entered the stage. By design, the research was meant to be directional, and it was quickly discovered that most people would barely sit through a fifteen-minute instrument, let alone a longer one, without an incentive. What happened next is a reality we are living, but it’s important to understand that even in the glory days of trackers and long surveys, there were market scientists who were looking for more. The short-term solution was self-administered surveys that provided insight into more specific questions and allowed dwindling outside panels to be reserved for the larger surveys. Then came communities, the power and value of which were solely in the hands of the administrator. Planning was hard and, in many cases, respondents became bored. Surveys came fast and furious from check-outs, pop-ups, special requests and direct mail.

The need for speed accelerated and as it did, mobile technology changed the playing field and tactical surveys became even shorter and less effective.

The strategist, on the other hand, being the tortoise chasing the hare, decided to invest in the data. As data analytics became widely accepted, first with real-time transaction measurements, the power of existing data began to flourish. The camps, now fully divided, took their corners, and their cases moved to the boardroom.

In all fairness, data analytics can never replace tactical evaluations. While “data” can provide context, on its own it can never answer the all-important why or how questions. Tactical research does a really good job of identifying what doesn’t work, but not such a good job of identifying what does. Neither tells us how or why choices were made.

The strategist understood this dilemma far earlier than those of us seeing the trees rather than the forest. Enter stage left, behavioral analytics.

Behavioral measurements have the same problems that plague tactical research, because humans often aren’t logical. Measurements exist largely in snapshots, and aggregation is iffy at best.

Meanwhile, back at the ranch, digital marketing was advancing rapidly and taking market analytics along with it. Geo-location, search incorporation and the general big-brother nature of data measurements were advancing rapidly into a science known as predictive marketing.

Many researchers, still stubbornly stuck in quagmires of quadrants, were now trying to understand neuroscience and patterning to create relevancy.

Strategists understood what they didn’t know, and they knew many of their answers were in the data: lots of data. By harnessing IT to “sort” the data, a new model emerged. Applying behavioral measurements to data began to yield a new kind of segment, the kind that exists in real time and is relevant to the problems at hand. With enough data, they reasoned, it becomes possible to predict at least the range of reactions for some very specific populations.
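As a toy illustration of what this pairing of behavioral measures and predicted reaction ranges might look like, consider the sketch below. Every name, threshold, segment label and reaction range in it is invented for illustration; a real predictive model would be estimated from historical response data, not hand-coded.

```python
# Hypothetical sketch: deriving behavioral segments from simple transaction
# measures and attaching an expected reaction range to each segment.
# All segment names, thresholds and ranges are illustrative assumptions.

def assign_segment(visits_per_month, avg_basket):
    """Bucket a customer into a behavioral segment from two simple measures."""
    if visits_per_month >= 8 and avg_basket >= 50:
        return "loyal-heavy"
    if visits_per_month >= 8:
        return "frequent-light"
    if avg_basket >= 50:
        return "occasional-big"
    return "low-engagement"

# Assumed reaction ranges (e.g., likelihood of trying a new offer) that a
# strategist might estimate from past response data for each segment.
REACTION_RANGE = {
    "loyal-heavy":    (0.60, 0.80),
    "frequent-light": (0.40, 0.60),
    "occasional-big": (0.25, 0.45),
    "low-engagement": (0.05, 0.20),
}

def predict_reaction(visits_per_month, avg_basket):
    """Return the segment and its expected reaction range for one customer."""
    segment = assign_segment(visits_per_month, avg_basket)
    return segment, REACTION_RANGE[segment]

print(predict_reaction(10, 75))  # a loyal, heavy-spending customer
print(predict_reaction(2, 12))   # a low-engagement customer
```

The point of the sketch is the shape of the output, not the numbers: the strategist’s side supplies a range of likely reactions per segment, which is exactly the gap tactical research then narrows for a specific product and audience.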

The best part: it could match activities in real time. As most researchers and strategists understand, people often say what they think is wanted in a survey, interview or community, or they have an agenda. Now we know what they actually do. When the pieces of the puzzle are connected, we know why.

But there is still a piece of the puzzle missing. We don’t know to what extent. That’s where the tactical aspect of market research loops back into the picture. Short surveys, communities and tactical assessments (taste tests, IDIs, etc.) yield a great deal of insight into the potential success of products and services, and they tell us in real time how we are doing.

Concept, product and advertising tests yield an assessment of the degree of potential success. Since we know the range of possibilities within our strategic assessments, we can now understand the degree within a specific circumstance and with a very discrete audience. We are one step closer to an ROI and a lot closer to meaningful assessments.

As mobile usage becomes more dominant, the options for data collection grow ever narrower, but the options for better, faster and more useful information grow exponentially. The biggest danger in any new method is the damage it does to the consumer or corporate buyer. The next big frontier for market research may be in determining the responsible use of data, especially if the trend toward more localized purchasing continues to accelerate.