When the Conclusion Isn’t Supported by the Methodology (and why RateStars is utter nonsense)
Editor’s Note: The friend who got phished that Kathryn mentions in her post was me, so take heed of this one, folks! I think there are a few lessons here:
- Just because an invite comes from a friend doesn’t mean it’s legit.
- Use common sense and if something seems off, verify it.
- As Forrest Gump said, “Stupid is as stupid does.” Sometimes that means what looks like a bad idea is just that: a bad idea.
Of course, the broader issue Kathryn raises shouldn’t be glossed over due to my personal pique. New techniques and approaches that are within the wheelhouse of MR are emerging regularly, and we need to investigate each one based upon its merits and context and learn what we can from them, even if the lesson is “well, that was pretty dumb.”
By Kathryn Korostoff
Sure, we all know to question whether a conclusion is supported by the data. But what about the data collection methodology itself? We sometimes see cases where the data and the conclusion seem to match, but a closer look reveals that the methodology was fundamentally unable to support the conclusion. Today’s case in point: RateStars.
RateStars is a new website that promises to identify “The Top 100 in any Industry.” Sounds cool, right? Being on that list would be a sign of prestige. If I were looking for a consultant in a specific field, I could now find the top 100 to choose from, right?
Not so fast.
First, let’s step back. If I wanted to create a list of the top 100 market researchers, how would I accomplish this?
- A survey of peers?
- Nominations by an esteemed council of recognized experts?
- A calculation based on number of magazine/journal articles published in the past 5 years with points for public speaking, books published, and awards?
Any of those methods would result in a list that could be credibly called “the Top 100 market researchers.”
So, how does RateStars do it? Well, I looked. And I know this may be shocking, but they didn’t use any of the methods I thought of.
Here is how RateStars works:
- First, the candidates have to allow it to access their LinkedIn account (or install the Facebook app), which I did temporarily only so I could peek under the hood.
- The candidate nominates themselves to be on the list by creating a profile and selecting what category they want to be listed in.
- The candidate then asks their friends and connections to rate them using a scale of not just stars, not just smileys, but stars AND smileys (yes, really).
- Reviewers can rate anonymously but must sign in for the review to be published. Reviewers must also offer at least 20 words about the candidate.
- The candidates are now listed in the top 100 of their category, in order of number of ratings.
So unless more than 100 people ask to be listed, a candidate is guaranteed a place on the list. And because there are many, many related categories, a candidate could place themselves on many lists—becoming a bigger fish in a smaller pond in order to gain a more desirable position.
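For readers who like to see the math, the guaranteed-placement problem can be sketched in a few lines of code. This is pure speculation about how a ranking like this would work, based only on the steps described above; the function and field names are my own invention, not anything from RateStars itself:

```python
# Hypothetical sketch of a "rank by number of ratings" list, as described
# above. All names here are illustrative assumptions, not RateStars code.

def top_100(candidates):
    """Sort a category's candidates by raw rating count; keep the top 100."""
    ranked = sorted(candidates, key=lambda c: c["rating_count"], reverse=True)
    return ranked[:100]

# A niche subcategory with only 40 self-nominated candidates:
category = [{"name": f"Candidate {i}", "rating_count": i} for i in range(1, 41)]

listed = top_100(category)
print(len(listed))  # 40 -- every single candidate makes the "Top 100"
```

The point the sketch makes: with fewer than 100 self-nominees in a subcategory, the cutoff never applies, so even a candidate with a single rating is “Top 100.”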
Within Market Research, a niche to begin with, there are over 40 subcategories.
So what would a reader conclude? That Joe Schmo is on the Top 100 list of “Market Research Project Directors”? And is in the top 10 of “Project Coordinators”? Really?
Don’t Trust the Methodology? Don’t Trust the Results
So to be precise, RateStars does not identify the top 100 in any industry. It identifies the top 100 people who want to be listed, who took the time to campaign with their friends and connections, who have friends willing to connect to applications, and who have friends who buy into this methodology (or are too nice to say no). It is just another online popularity contest.
There is nothing wrong with a popularity contest. But to claim it is anything more is just silly.
[Apologies to the friends who have asked me to rate them on RateStars; I don’t mean to be harsh, just honest. Frankly, everyone who has asked me is a great researcher; your bios already demonstrate this.]
[UPDATE: Just heard from a friend who was phished by RateStars; his LinkedIn account was hijacked to send out ratings requests without his knowledge. Since I signed in to the site to peek at the details, I am hoping I did not inadvertently trigger the same treatment!]