
The Conundrum of Changes in Market Research and Predictive Analytics


By Tony Cosentino

On November 14th, Research For Good held a webinar on "Predictions for 2013" with Simon Chadwick of Cambiar, Reineke Reitsma of Forrester, Leslie Townsend of Kinesis Survey Technologies, and Lenny Murphy of GreenBook / Gen2 Advisors to discuss their thoughts on what 2013 may hold for the insights industry. As a market research practitioner and a technology industry analyst covering big data and business analytics, I enjoyed listening to other analysts discuss the market research industry in a webinar. My own research augments and sometimes contrasts with that of the webinar participants.

In the webinar Simon Chadwick spoke about data mining and the need to focus on the analytic skills gap. I’d maintain that businesses need to focus on traditional hypothesis-driven statistics in addition to data mining, especially when we start talking about predictive analytics. While data mining is directed by the data itself, statistics may be thought of as more hypothesis-driven. If you’re familiar with the SPSS tools that I recently assessed, you can see the difference by comparing SPSS Statistics and SPSS Modeler. Market research practitioners have traditionally been trained in the former, but not in the latter. I go into more detail in a joint webinar I did with IBM on predictive analytics.
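The distinction can be sketched in a few lines. This is an illustrative example in Python, not anything from the webinar: the synthetic survey data and variable names are my own. A hypothesis-driven analysis tests one pre-specified question; a data-mining pass lets the data nominate the driver.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 10))  # 10 candidate driver variables
# Synthetic outcome: variable 3 truly drives purchase, plus some noise.
purchase = (X[:, 3] + rng.normal(scale=0.5, size=n)) > 0

# Hypothesis-driven (the SPSS Statistics mindset): test ONE pre-specified
# hypothesis, e.g. "variable 3 differs between buyers and non-buyers."
t, p = stats.ttest_ind(X[purchase, 3], X[~purchase, 3])
print(f"pre-specified test: t={t:.2f}, p={p:.4g}")

# Data-mining (the SPSS Modeler mindset): let the data nominate the driver
# by scanning every variable for the strongest separation.
scores = [abs(stats.ttest_ind(X[purchase, j], X[~purchase, j]).statistic)
          for j in range(X.shape[1])]
print("strongest driver found by mining: variable", int(np.argmax(scores)))
```

In practice the two mindsets converge on the same answer here, but note the difference in posture: one arrives with a question, the other leaves with one.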

Folks who know data mining are in limited supply. They often come from the data warehouse or business intelligence worlds, which have not traditionally churned out deep analytical expertise. Data warehousing and business intelligence are often the domain of database administrators or folks who have a good understanding of structured query language (SQL). Some technology companies are trying to fill the skills gap related to big data (read: unstructured data) by taking advantage of these SQL skillsets. Teradata is moving in this direction with its Aster Data integration, as is Karmasphere with its toolset, but SQL is a declarative language, and while it fills some gaps in organizations’ ability to access big data, it has its own limitations.
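To make the declarative point concrete, here is a minimal sqlite3 sketch (the table and column names are hypothetical). SQL shines at set-based questions; iterative model fitting has no natural declarative form and usually falls back to procedural code.

```python
import sqlite3

# Declarative: you state WHAT you want; the engine decides HOW to get it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clicks (user_id INTEGER, page TEXT)")
conn.executemany("INSERT INTO clicks VALUES (?, ?)",
                 [(1, "home"), (1, "pricing"), (2, "home")])

rows = conn.execute(
    "SELECT page, COUNT(*) AS n FROM clicks GROUP BY page ORDER BY n DESC"
).fetchall()
print(rows)  # set-based aggregation is SQL's sweet spot

# By contrast, iterative work such as gradient descent or tree induction
# has no natural SQL expression; that is where the limitations begin.
```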

To bring home the point of how important advanced analytics skillsets are to an organization, our benchmark research shows that companies are more satisfied (70% versus 59%) when their predictive analytics initiatives are led by analytics professionals rather than by the data warehouse team. (As an aside, our benchmark research into predictive analytics shows predictive to be a key area where marketing and sales are focusing their efforts right now, with social media analytics, attrition, response and attribution modeling as key components of the strategy.)

Since neither the technology industry nor the market research industry has trained analytics professionals especially well, we have a big shortage of folks who can lead big analytics initiatives. This skills gap is driving significant funding for companies such as Mu Sigma, Absolute Data and GoodData. Not surprisingly, these companies target the marketing and market research client-side professional. It’s not a big leap for analytics professionals in the market research industry who already know hypothesis-driven statistics to move into data mining and data modeling, but these skills generally exist on the market research supplier side, not inside client organizations themselves. For this reason, I’d argue that market research supplier firms that are savvy in advanced analytics, such as Probit Research and Definitive Insights (now YouGov), are in a great position to help companies drive their analytics strategies.

Technologists and market research practitioners have long lived in parallel universes.  In technology, we deal with tables, joins and the ETL process. In market research and analysis, we deal with datasets, merges and data preparation. When you think about it, these are the same things! The subtle difference is that technologists have had a data mining mindset and market researchers have had a hypothesis-driven mindset.
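The parallel can be made concrete in a few lines. A minimal sketch using pandas (the dataset names and columns are hypothetical): the researcher's merge and the technologist's join are literally the same operation.

```python
import pandas as pd

# The researcher's "datasets"...
respondents = pd.DataFrame({"resp_id": [1, 2, 3],
                            "segment": ["A", "B", "A"]})
answers = pd.DataFrame({"resp_id": [1, 2, 2, 3],
                        "q1_score": [7, 4, 5, 9]})

# ...merged on a key, are exactly the technologist's inner join:
#   SELECT * FROM respondents r JOIN answers a ON r.resp_id = a.resp_id
merged = pd.merge(respondents, answers, on="resp_id", how="inner")
print(merged)
```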

As analytics and data environments come together, market researchers need to get more into data mining and more comfortable with data modeling. At the same time, technologists need to get more into hypothesis-driven analytics. In the webinar, Chadwick mentioned the massive advances in technologies, which are apparent when we look at trends in in-database analytics and embedded analytics. We’re seeing a rise in the availability of complex algorithms and open source languages such as R; nevertheless, the three most used algorithms in enterprises today are still the simpler ones: logistic regression, linear regression and decision trees. These should sound very familiar to those of you in the market research industry.
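Those three workhorses are each a few lines in any modern toolkit. A sketch using scikit-learn on synthetic data (the outcome names, spend and churn, are illustrative assumptions of mine):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 3))
spend = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=300)  # continuous outcome
churn = (X[:, 1] > 0).astype(int)                        # binary outcome

# Linear regression: predict a continuous quantity such as spend.
lin = LinearRegression().fit(X, spend)
print("spend coefficients:", lin.coef_.round(1))

# Logistic regression: predict the probability of a binary event such as churn.
logit = LogisticRegression().fit(X, churn)
print("churn accuracy:", logit.score(X, churn))

# Decision tree: rule-based splits stakeholders can read directly.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, churn)
print("tree accuracy:", tree.score(X, churn))
```

These are the same regression and classification tools researchers have used for decades; only the surrounding data environment has changed.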

Another part of the discussion focused on web analytics. I’d extend that to digital analytics and a new class of tools that go beyond cookie-driven web analytics by pulling from machine data, which is exposed in a variety of ways. The net impact is that rather than just looking at numbers of hits, click-through rates or transactions, we’re beginning to be able to see into the customer journey – sort of like doing a shop-along in retail, but in a digital space. This gives us great insight into the purchase funnel and competitive dynamics, the likes of which we just didn’t see before. As we marry this technology with analysis of offline behavior, things get even more interesting, but also more complex. We start to deal with privacy issues unless we invoke faceless types of analysis, but that limits our ability to market at an individual level. At the same time, attribution modeling becomes more complex given the increase in the number of both promotional and transaction channels.
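The attribution point is visible even with two toy models. In this sketch the journeys and channel names are hypothetical, and last-touch versus linear (equal-credit) are just two of many attribution schemes; the takeaway is that the credit assigned to a channel depends entirely on the model chosen.

```python
from collections import defaultdict

# Hypothetical customer journeys: ordered channel touches before a purchase.
journeys = [
    ["display", "search", "email"],
    ["search", "email"],
    ["social", "display", "search"],
]

last_touch = defaultdict(float)  # all credit to the final touch
linear = defaultdict(float)      # equal credit to every touch
for touches in journeys:
    last_touch[touches[-1]] += 1.0
    for ch in touches:
        linear[ch] += 1.0 / len(touches)

print(dict(last_touch))
print({ch: round(v, 2) for ch, v in linear.items()})
```

Search earns one conversion under last-touch but more under linear credit; multiply this ambiguity across dozens of channels and the complexity the speakers described becomes clear.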

Finally, my one disagreement with the webinar speakers is their assertion that there’s not enough investment in education going into schools. On the contrary, every major vendor in the technology space I speak with highlights the schools they are aligned with, and most of these folks talk about investing in these schools with respect to analytics training. I agree with webinar participant Lenny Murphy that academia is often slow to change; I often compare market research with academia, in fact. But schools are getting more private funding as the government pulls back its spending, and for this reason schools are becoming more responsive to the needs of private enterprise. It’s here where the skills gap will begin to be filled as schools move away from classical education underpinnings to be more aligned with the needs of the 21st century.

 

Reposting courtesy of Ventana Research

 


2 Responses to “The Conundrum of Changes in Market Research and Predictive Analytics”

  1. Martin Silcock says:

    January 2nd, 2013 at 3:00 pm

    I found it interesting that “consumer” was used only once in the article. All these different techniques: what value do they really add to understanding? Where does the law of diminishing returns kick in? Does the extra complexity help when, paradoxically, decision makers crave simplicity? Infographics and visualisation are probably not the answer either.

  2. Tony Cosentino says:

    January 2nd, 2013 at 4:01 pm

    Good questions, Martin. They focus us on the broader discussion, which is around successful decision making. Decision making (whether B2C or B2B) should be based on a combination of behavioral, attitudinal, and profile information. Unfortunately, we have a diversity of data types and sources (both internal and external) in our companies (especially large companies) that makes things a little more complex than we would like. This is especially true when we think about Big Data. The term is so broad, and the use cases are very narrow. For this reason, outcome-driven approaches are the way many companies are thinking about Big Data and analytics. I study and write a lot about outcome-driven initiatives because this is the one way that companies won’t face the diminishing returns problem you mention. At the end of the day, everyone wants a user-friendly, integrated app like Yelp! to make their decisions in the enterprise. Unfortunately, we’re at an early stage and there is no clear direction on exactly how our systems will be integrated in the enterprise environment (it’s much easier in a consumer environment for a number of reasons). For this reason, I’d suggest the answers to your questions are somewhere in the middle: some complexity is needed, but as little as necessary.
