
Information Assassination – How They’ll Simply Get the Research Wrong

 

By Ron Sellers

It happened again.  Grey Matter Research released survey information to the media, citing a study we did for Russ Reid Company.  The study found “rather than being in competition for the donor dollar, charitable organizations and places of worship may actually complement each other in fundraising.”  In short, people who give heavily to a place of worship also tend to give heavily to separate non-profit organizations.  In a variety of ways, giving begets giving.

In the news release, we were very careful to use words such as “this suggests,” “the findings may be saying,” or “tend to support,” rather than definitive statements such as “this proves” or “the findings demonstrate.” We could present the actual data, but part of the release depended on our analysis of that data.

So the news release hits the media.  What do we get?  Headlines such as “Churches, Charities Don’t Compete for Dollars.”

Not that this should be a surprise. Our news release was filled with a careful explanation of how we did the analysis and how we arrived at our conclusions. A major news service picked it up and condensed it to eliminate all the “uninteresting” stuff (like the details). Various media outlets then picked up that synopsis and condensed it further, making our research sound like inarguable fact rather than an analysis of the available information.

This isn’t a rant about the media. They’re unlikely to give as much attention to all the details as researchers do. But this experience is a good reminder of two things.

First, when you read about research in the media, realize what is often being done to the data.  It’s being shortened, restated, spun, and sometimes even blatantly misinterpreted or misapplied (we’ve had that happen, as well).  Rather than use any statistics you read about in an article, you’d be wise to go back to the original source and find out what the study really said.

Second, what the media tends to do is no different from what your clients are probably doing with each report. You write a detailed, carefully worded 25-page analysis. The marketing director then shortens that to three key pages of bullet points. Her boss only wants one page, and the CEO will then give it 60 seconds in the monthly marketing meeting, so it gets shortened to a single paragraph. And that’s what the decision-makers see.

It’s true that important nuance and detail are being lost, but it’s also a fact that non-researchers generally aren’t going to give the same attention to those details that researchers do.

So what to do?  There really isn’t a solution, but there are steps researchers can take to mitigate the problem:

  1. Realize it’s inevitable that this will happen, and attempt to control for it.  Coordinate with the client to write those brief summary conclusions yourself, rather than allowing non-researchers to control the process (and possibly lose or misstate critical details).
  2. At the very least, offer to review what’s been written, and provide input.
  3. Work with the client to learn what type of reporting will be most valuable.  If they’re only going to use a one-page summary, provide them with a fantastic one-page summary.
  4. Explain things in lay terms.  As soon as most non-researchers see things such as “probability sampling” or “confidence interval,” they’re likely to skip that paragraph entirely, possibly losing critical detail.
  5. Remind, remind, remind – when non-researchers are observing focus groups, I usually go through a brief spiel reminding them to concentrate on the “why” responses rather than worrying about how many people held a particular opinion.
  6. Be a broken record if necessary.  After my focus group reminder (from #5), if I return to the back room and hear someone saying, “But six of those respondents liked that name,” I gently remind the observers that the number is meaningless, and they need to focus on why the six liked it, and why the other four didn’t.
  7. I hate to say it, but there’s also a certain amount of CYA necessary.  As the researcher, I have only so much control over how the client uses the findings, but I can certainly put any necessary caveats up front in clear language.  If they then misuse the findings, it won’t be because I left any room for doubt about how those findings can (and cannot) be used.


3 responses to “Information Assassination – How They’ll Simply Get the Research Wrong”

  1. To your list, Ron, you might also add a caution about the ability to generalize to the larger population given that at least part of the sample appears to be from a volunteer panel. A good standard sentence to include in the methodology disclosure might be: Due to its opt-in nature, this online panel (like most others) does not yield a random probability sample of the target population. As such, it is not possible to compute a margin of error or to statistically quantify the accuracy of projections.

  2. Reg, you raise a very valid but very different issue – is there such a thing as a random probability sample any more? With cell-only households, call blocking, caller ID, etc., is the phone even representative any more?
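To make the first comment above concrete, here is the standard textbook calculation that Reg’s suggested disclosure rules out for opt-in panels (this is a generic illustration, not a figure from the original study). For a genuine simple random sample, the margin of error on a reported proportion is approximately

$$\text{MOE} \approx z\sqrt{\frac{p(1-p)}{n}}$$

where $n$ is the sample size, $p$ is the observed proportion, and $z = 1.96$ for a 95% confidence level. With $n = 1{,}000$ respondents and a worst-case $p = 0.5$, that works out to roughly ±3.1 percentage points. It is exactly this calculation that has no statistical footing when respondents are self-selected from an opt-in panel, which is the point of the methodology sentence suggested above.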
