June 21, 2021
Layering a quality-control approach produced high-quality data.
Data quality is a persistent and critical theme in market research. As companies increasingly turn to multiple data sources to make business-critical decisions, the pressure is on to deliver quality outcomes from primary research.
Knowing that quality begins with survey responses, we at DM2, along with our partners at Rep Data and Research Defender, decided to tackle current issues head-on by conducting in-depth research-on-research. During the study, we assessed the efficacy of applying different screening and data quality techniques in a survey setting. Our hypothesis was that the sweet spot for delivering quality would require a coupling of soft skills such as expert, consultative project management with techniques such as agnostic sampling for representative audiences and advanced fraud mitigation methods.
For the study, we conducted a survey in early Q2 2021 among n=2,002 gen pop consumers. Rep Data sourced samples equally from four of the research industry’s larger online sample providers. Completes were evenly distributed across five cells, with providers delivering n=100 to each cell with consistent age and gender quotas. This provided a basis for data comparison among five overall cells using various quality assurance techniques including Research Defender’s proprietary digital fingerprinting, fraud identification, text analytics, and respondent-level tracking.
The five cells in the study included:
We measured our results with a previously established quality scoring methodology that has context and benchmarks – something we’ve dubbed the “Qscore.” This methodology leverages trackable, quality-oriented question sets that have been used for many years to assess respondent quality and characteristics. The longevity of these question sets yielded meaningful benchmarks for the United States, drawn from 50K+ interviews in the past year alone. In addition, some standard questions from sources such as the U.S. Census were included to provide a foundation for outside comparisons.
We calculated Qscores at the respondent level to provide a baseline for comparison, and reviewed aggregate scores by demo group, provider, and data quality technique. A key finding was that layering data quality techniques positively impacts research outcomes, and can lead to a clean, healthy and efficient market research ecosystem.
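The Qscore formula itself is proprietary, but the aggregation step described above – computing respondent-level scores and then rolling them up by provider, cell, or demo group – can be sketched as follows. All field names and values here are hypothetical, for illustration only:

```python
# Illustrative sketch only: the real Qscore calculation is proprietary.
# Assumes each respondent record already carries a numeric quality score
# plus grouping labels (provider, cell); shows the roll-up step.
from collections import defaultdict
from statistics import mean

# Hypothetical respondent-level records
respondents = [
    {"provider": "A", "cell": 1, "qscore": 78},
    {"provider": "A", "cell": 2, "qscore": 64},
    {"provider": "B", "cell": 1, "qscore": 71},
    {"provider": "B", "cell": 2, "qscore": 69},
]

def aggregate(records, key):
    """Average qscore grouped by the given field (e.g. provider or cell)."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r["qscore"])
    return {k: mean(v) for k, v in groups.items()}

by_provider = aggregate(respondents, "provider")  # {"A": 71, "B": 70}
by_cell = aggregate(respondents, "cell")          # {1: 74.5, 2: 66.5}
```

Comparing the same aggregate across cells that differ only in which quality techniques were applied is what makes the layering effect visible.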
Adding to the quality equation, we were able to infer that unbiased sourcing delivers more representative results and that expert project management for fieldwork eliminates common challenges in the data collection process. These findings reinforce the complexity of the issues facing researchers today, and how a deft balance of techniques during the data collection and fieldwork stage is needed to produce the very best research outcomes.
Disclaimer
The views, opinions, data, and methodologies expressed above are those of the contributor(s) and do not necessarily reflect or represent the official policies, positions, or beliefs of Greenbook.