
Hacking The Gap Between Clients & Suppliers

Editor’s Note: At IIeX Europe we introduced a new feature of the events: Hacking Market Research. Working with Mark Earls (author of Herd and I’ll Have What She’s Having) and John Willshire of Smithery, we split the audience between clients and suppliers and engaged each group in an experiment to figure out how to fix the problems they thought existed in our industry, with an emphasis on bridging the gap between clients and suppliers. The results were one of the highlights of the event and unveiled a whole new way to add impact to the conferences by continuing to focus on delivering solutions to the industry.
Today Mark and John give a view of the experience from their perspective as well as a detailed view on the problems and solutions uncovered during the event. They also detail a continuation of the process in partnership with Jon Puleston of GMI to bring the Hack process into an online environment. You can participate in that here: http://qsdc.gmisurveys.com/srv/?p=jwgwl1.
We’ll be doing more with this in the future, but for now enjoy this in depth review of an important new initiative to move the industry forward.
By Mark Earls & John Willshire

“If we knew what it was we were doing, it would not be called research, would it?” - Albert Einstein

Introduction

I’d like to let you into a little secret. I don’t know much about market research. Of course, if you were at IIeX in Amsterdam and witnessed Mr Mark Earls and myself onstage, it’s hardly likely to be a secret. If Mark’s a Sherpa in the mountain range of market research, I’d still be standing in the foothills, frantically staring at the map on my phone. The great and the good of MR no doubt quickly drew this conclusion themselves. Well, the Qually ones did, that is; the Quants no doubt asked a significant number of people from a random distribution across the room before deciding… (See, I’m even trying to discombobulate by throwing in a few MR references, veiling exactly how much, or more accurately little, I do know about market research.)

Yet the shallowness of my market research education is not for the want of trying. In 2000, fresh of face and impenetrable of accent, I spent a year as a graduate executive in the media division of BJM in London. Part of this entailed riding the methodological roller-coaster of a training regime; any kind of research the company conducted, the graduates had to get hands-on with. I found myself doing everything from interrogating people in hall tests for acrid-tasting energy drinks in Camden to traipsing the streets of Hounslow one wet autumn, conducting door-to-door paper surveys about washing powder, where my untempered Glaswegian tones did not help my cause.

Being in the media division, though, I spent a lot of the time doing the preparation for JICREG surveys: cutting out regional and local newspaper mastheads with scissors, sticking them onto paper, and photocopying them. The most important part of the exercise, as I was instructed, was the randomization. So as not to introduce bias in the placement order of the titles, each of the six mastheads on each prompt sheet was rotated into all the different possible positions, and an equal number of each variation was made up to distribute to the field force. Fast forward 14 years and I wonder if, in a way Mr. Miyagi would be proud of, this simple principle has had a significant impact on the design of Artefact Cards, which Mark and I used to run the Hacking Market Research experiment at IIeX. For those of you who weren’t there, a brief recap: at the start of the client/supplier streams, we asked everyone to take an Artefact Card and write on it the thing they thought was ‘broken’ about Market Research.
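The rotation principle described above can be sketched in a few lines of code. This is an illustrative reconstruction, not the actual JICREG procedure: the masthead names are placeholders, and the sketch simply rotates six titles through every cyclic position so that each title occupies each slot on an equal number of prompt sheets.

```python
import random

# Placeholder titles standing in for real newspaper mastheads.
MASTHEADS = ["Title A", "Title B", "Title C", "Title D", "Title E", "Title F"]

def cyclic_rotations(items):
    """Return every cyclic rotation of the list, e.g. ABCDEF, BCDEFA, ..."""
    return [items[i:] + items[:i] for i in range(len(items))]

def build_prompt_sheets(items, copies_per_rotation):
    """Make an equal number of sheets for each rotation, shuffled for the field force."""
    sheets = []
    for rotation in cyclic_rotations(items):
        sheets.extend([list(rotation)] * copies_per_rotation)
    random.shuffle(sheets)
    return sheets

# With six cyclic rotations, every masthead appears in every position
# exactly once per rotation set, so order bias is balanced out.
sheets = build_prompt_sheets(MASTHEADS, copies_per_rotation=10)
```

(A full Latin-square design would balance which titles precede which as well; cyclic rotation is the simpler version the anecdote describes.)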


We grouped them as client things and supplier things, including sub-groupings of interest, then displayed them on the wall. For the rest of IIeX, we talked about the territories with all delegates, and together we made quick ‘three card hacks’ – take three cards from anywhere across the wall, and make quick solutions for them. You can read what some of them are on a Tumblr we set up on the day – http://iiex2014.tumblr.com/. The full details of what we found that day are below as well.
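The draw itself is simple enough to sketch. The card texts below are invented examples, not cards from the event; the point is only the mechanic: pool every card on the wall, regardless of grouping, and pull three at random to combine into a quick solution.

```python
import random

# Invented example cards; at the event these were delegates' own
# 'what's broken in MR' Artefact Cards, grouped on the wall.
client_cards = ["Research ROI is unclear", "Debriefs are unreadable"]
supplier_cards = ["Procurement squeezes price", "Tenders exclude specialists"]

def three_card_hack(*groups):
    """Pool every card across all groupings and draw three distinct cards."""
    wall = [card for group in groups for card in group]
    return random.sample(wall, 3)

# One random draw to seed a quick 'three card hack' solution.
hack = three_card_hack(client_cards, supplier_cards)
```

The cross-group pooling is the cut-up technique at work: combinations nobody would have chosen deliberately are exactly the ones that force a fresh framing.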

Building From Here

In hindsight, what emerged from the Hacking Market Research experiment at IIeX? Firstly, the experimental nature of the exercise itself is important. It was something we hadn’t done before, and the participants hadn’t done before; it was research of a method, as well as of a topic. Everybody we talked to there was willing to get hands-on, to embrace experimentation. This was ably demonstrated by Jon Puleston from GMI, who came up to us on the day and volunteered to use a simple platform they had to pull the cards into an online survey format, which we’d love you to take part in here –

http://qsdc.gmisurveys.com/srv/?p=jwgwl1

Both audience participation and client engagement before, during and after were really high. Given that both of these issues surfaced strongly in the ‘what’s broken in MR’ material we collected, that’s an interesting thing to take forward; instead of selling a finished methodology, should you invite clients and participants in to explore new ones together?

Secondly, of course, there’s the randomization factor, taught as a principle to me all those years ago at BJM: eliminate potential bias in the answers by rotating the elements. The principle remains in the Hacking Market Research experiment, yet the methodology is different. It’s of the same school as the ‘cut-up technique’ used by the Dadaists, David Bowie, and of course William S Burroughs (“when you cut into the present, the future leaks out”). And whilst it’s something that the community clearly knows in principle, it is perhaps too embedded in existing methodologies to be truly useful in the current age. What is a principle in market research, and what is a method? Perhaps the two have become too entwined in a lot of cases. Think about the GMI survey above; what in it is principle, and what is method? And how might we create a better version that tells us more?

Which brings us to our last point: it is much, much easier to innovate when you’re looking at fixing something wrong than it is starting from scratch. And to do that, you’ve got to make things, put them in front of people, and work through the issues that become apparent. When you take the GMI survey, it pops up two of the broken things next to each other. But don’t choose either/or; do something different. Combine the two you see, frame a problem, and solve it, quickly. Make a version. Show it to someone else. Start solving the broken things. Because whilst I may not know much about market research, you do. And there’s nobody better placed to fix it. If I can help you in any way though, feel free to reach out! John V Willshire http://smithery.co @willsh

Here are the results of the IIeX Hacking MR exercise. Take the online version for more solutions and join us at IIeX North America as we build on this with the audience there as well!

Research ROI

In the face of the fire hose of data which now washes through many client organizations, MR expenditure increasingly needs to justify itself. Unfortunately, neither client buyers nor research suppliers are good at the language of business, and in particular the language of economic value. Is there a way that we can start to frame MR projects to identify their value? To talk usefully about the ROI of a given research project? What if we were to identify the value of getting a given decision right or wrong (think Coke Original or the Tropicana redesign)? What is it worth to the company to get the prediction right? And to build research rewards on the back of it? This would not only serve to align the research client to the broader business outcomes, but also align the research supplier and how it designs the project.

Talent Agencies

Whatever else they do, procurement procedures don’t seem to be helping small and niche suppliers – often highly specialist, premium-price businesses. Tendering is expensive and cumbersome for these types of business and is often seen to encourage low pricing. Many expert suppliers thus find themselves excluded from working with clients with whom they have worked for many years, and feel they have little scope for gaining access to them in the future. What’s more, the financial implications of such large tenders are significant: they squeeze suppliers’ cash flow by delaying payments. Over the fence, many clients feel that existing roster arrangements and procurement processes can exclude them from working with specialists when they need them. And the bigger suppliers can struggle to access the latest expert thinking and practice – in their trim, well-run businesses built around such large tenders, they cannot carry what is often expensive overhead. What if we adopted models from industries like aerospace: a handful of big suppliers act as talent agents for smaller expert ones, subcontracting to the talent where relevant and managing both the financial and the contractual arrangements as appropriate? Everyone would win – small suppliers, clients and bigger suppliers. What’s not to like?

Partnerships

It’s clear that in many cases, clients are buying the people, not the method sold by those people. However, many suppliers’ focus is on the method and the business of that method rather than the people piece. Suppliers’ understanding of the client’s business, its priorities and processes, and how research might impact them is often seen to need improvement. Agencies have long talked up strategic partnerships as a solution to this, but these work only some of the time – nice to talk about but incredibly difficult to implement in many circumstances, not least because of the downsides of “preferred supplier” status and procurement’s desire to drive price down again and again. What if client and research agency were to build new businesses together? To use some of the tools and practices from start-up culture to build businesses that involve MR? To replace the research report with a business plan (at least symbolically)? This might offer some opportunity to hedge against the relationship being merely transactional, and encourage both parties to look to create long-term mutual value.

Do:Think vs Think:Do

In many corners of the business world (most notably in software development) more “agile” practices have emerged in recent years. All of us recognize the need to be more experimental – to try things out rather than merely talk about them. However, research practice is still largely stuck in the old paradigm of Think:Do, allying itself with the thinkers rather than the do-ers. This runs directly against the pressures that researchers and their clients often feel: on the one hand, the need for speed is pushing us to conclusions before we are ready – conclusions which can be held against us later; on the other, despite (or perhaps even because of) the explosion of data in many organizations, more decisions are being taken on the basis of hunches and intuitions rather than data. At the same time, while fast results are sought by research users, is research practice able to handle the frequent short iterations demanded by “agile” processes? What if we set out to reconfigure research practice so that we seek to gather insights AFTER the business acts rather than before? To spend more time in the wild (c/o Mike at Instagram) than in the office? To learn from the changing responses to frequent iterations of a prototype rather than from big set-piece research projects? To be more rough and ready than thoroughly professional?

Procuring Dull Stuff

Both sides acknowledge the pains of procurement processes (tendering forever and, as many feel, on the basis of price rather than quality) and the unhelpful outcomes they can create in hindering those who want to work together from doing so (whatever else they do for financial management). Some feel that procurement can also reinforce unhelpful or undesirable practices (suppliers feeling duty-bound to over-engineer and over-complicate projects in order to keep the price up). Rather than abandoning the procurement process and the value it creates for the organization, we encourage clients to invert its terms: could researchers be asked to bid – as academics do for grants – on the basis of the desired outcome rather than the deliverable service and its costs? Could the hoop-jumping be done after the pitch rather than during it? What would these new boundaries look like for procurement? What kinds of practice could such a new approach be modeled on?

Minute Masterpieces

The dusty research report is emblematic of what is wrong with MR today. Unread (and even unreadable in some cases), reports take a lot of time and effort to make but serve no obvious purpose (apart from making it easy for suppliers to justify pricing, given that procurement sees them as standard research practice and thus easy to price). What’s more, they are written in jargon-heavy, impenetrable text. What if suppliers never actually sent the report but instead used it as a “shooting script”? Doing the work of compiling the report, but using it not to communicate the findings of the research project (in exhaustive detail) but to decide what the key points to get across actually are. This would enable the supplier to focus much more on how best to articulate what the audience needs to know, and make it more likely that the points would land. Imagine six one-minute masterpiece videos that make the key points and get watched and shared, rather than one 120-page report that doesn’t get read and doesn’t get shared.

The Language of MR

Perhaps the issue of client boredom lies in the way we communicate research, research practices and their outputs. The language of MR tends to be serious, reserved and professional. However, this can be boring and reduce engagement. It is dominated by verbal rather than visual communication, which we know is not the easiest to process (“System 2” heavy?). The solution might involve embedding the skills of journalists and designers into the practice of research. Infographics are increasingly used by smart research practitioners, but not widely enough yet. More training in communication skills – especially in non-text-based media such as film, design and audio – would help, as would feedback mechanisms (what did the client organization hear when we told them x…?). Perhaps a focus on reducing the takeaways to, say, one card per project would help? A “thumbnail” debrief, if you like.

Small, Sharp, Shareable

Within their own organizations, clients are too often seen as PowerPoint pushers, foisting information in huge, unwieldy decks on unsuspecting colleagues. A research debrief is a thing of dread. It’s frustrating for everyone involved, as the vital information gets stuck at this gatekeeper to the organization, and all the insight doesn’t turn into anything useful. The burden of change lies with how the information is supplied. If the client only has small letterboxes to post the right information through, then don’t send them shipping containers full of information. Think about how to create small, razor-sharp, shareable pieces of information for the client to deal out to the organization. Treat these precise, desirable pieces of deep insight with the respect they deserve, and people will start asking ‘is there any more where that came from?’

The Next Generation

We heard from both sides about the young, inexperienced people who are part and parcel of both client and supplier sides of the equation. Combine that with the fact that everyone complains about how long-winded, wordy and boring the outputs of market research can be, and you can see an interesting possible solution. Younger generations are more likely than ever to have developed their skills in communicating ideas in short-form, mixed-media formats: short expressions in text, quick snapping and editing skills in photos and visuals, a feel for how to make video deliver ideas in quick, compelling ways. The industry can and should make more of this innate talent pool. Don’t look for a generation that can replicate the work that currently leads to the issues described; look for people who can take those crucial insights for client businesses and design them in a way that will travel.

One Response to “Hacking The Gap Between Clients & Suppliers”

  1. Ellen Woods says:

    March 4th, 2014 at 9:37 am

    This article resonates in many ways. First of all, thanks for acknowledging the elephant in the room. Secondly, many issues have been identified and at least for this group, prioritized. I wonder what others in a company would say if they were given the same exercise?

Most likely they would focus on the insights or the ability to apply the information. The issue in research has rarely centered on methods; sample, maybe, but not methods. Long before there was an abundance of information, there was an issue with data translation. Data by definition is reflective. When you add sampling and survey length into the mix, the combination of lag time, audience and attention spans makes it a less than perfect tool, and that’s without considering factors outside of the survey. There are just faster and better ways to understand the market now. That pretty much explains why Google Surveys and communities have become so mainstream and such a large part of the focus of research. So, what happens next?

If you believe the conclusions here, then the answer lies in being faster and communicating better. I would agree with the communication, but it still seems like the elephant is there waiting to be heard. It doesn’t matter how you communicate if the message stops short of delivering any real insight or, worse, provides no basis for action. Perhaps the first question should always be “What do you want to do with the answers from this survey?” Ask a question and you will get an answer, but does it simply answer the question or does it allow you to complete the task that generated the question?

    Chances are that the survey can only partially answer the question. So, how do you enhance the survey data? Better yet, is there a way you can answer the question without a survey? Probably, unless you are in innovation or product development. Even then there is likely data that will speed up the process and allow a more qualitative orientation. Perhaps the best answer for both suppliers and researchers is to understand the business question, then the technique becomes obvious and the data more meaningful and the report more visual because you can “see” the answer.
