The InnovateMR Blog

OMG Survey Panel Pains – WTH is Really Going On?


LOS ANGELES, March 23, 2016 — In the last few weeks, I’ve read numerous blog posts from researchers and panel companies about the sad, sad (yes double sad) state of online panels.

 

You can see those posts here (from Adriana Rocha) and here (from Dave McCaughan) and one from Innovate here. If you know me then you know I just can’t stay silent on hot panel and sampling topics. Panels are no longer responsive, email click-through rates are abysmal, panelists must try 10 surveys before earning a significant reward; I could go on and on and on. Is there a solution for online panel quality or are we doomed for all eternity?


The sad state of online survey panels is really no one’s fault! Here’s why: panel companies build databases of people interested in participating in surveys to earn a reward. For the most part, these people genuinely want to provide honest data in exchange for that reward. Panelists take surveys in their spare time and rarely weigh that time against what their hourly rate at work would be. While building these databases, the panel sites ask each user for basic demographic, geographic and psychographic information.

 

For complete background, I’ll start off by stating the obvious. When researchers have a study that needs sample, they send a specific request for the types of people to be targeted. Sometimes the sampling firm can match 100% of the client’s requested data fields to its panelist data, but more often than not it can’t. Why? Because there are millions of possible questions that could be asked of a respondent. For example: “I want men, 35-54, living in California, who have psoriasis but don’t actively treat it with medication.” A sampling firm may know everything except whether the respondent uses medication. So either the sampling firm asks the user to answer extra questions before being redirected to the client survey, or it sends the user into the client survey to be screened. The latter increases the chance of a poor user experience, because the user will most likely answer many other questions before hitting the specific termination point about medication usage.
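To make that concrete, here’s a rough sketch of the matching problem in Python. The field names and the matching logic are mine, purely for illustration; this is not how any particular sampling platform actually works:

```python
# Illustrative sketch: compare a client's targeting spec against a panelist's
# stored profile and flag which criteria still need to be screened.
# All attribute names and rules here are hypothetical.

from dataclasses import dataclass, field


@dataclass
class MatchResult:
    matched: dict = field(default_factory=dict)       # criteria already satisfied by stored data
    unknown: list = field(default_factory=list)       # criteria the panel has no data for
    disqualified: list = field(default_factory=list)  # criteria the profile fails outright


def match_panelist(profile: dict, spec: dict) -> MatchResult:
    """Compare a panelist's stored profile against a client's targeting spec."""
    result = MatchResult()
    for attribute, wanted in spec.items():
        if attribute not in profile:
            # The sampling firm simply doesn't know this yet: it must either
            # pre-screen the panelist or let the client survey screen them.
            result.unknown.append(attribute)
            continue
        value = profile[attribute]
        satisfied = wanted(value) if callable(wanted) else value == wanted
        if satisfied:
            result.matched[attribute] = value
        else:
            result.disqualified.append(attribute)
    return result


# The psoriasis example from above: medication usage isn't in the profile.
spec = {
    "gender": "male",
    "age": lambda a: 35 <= a <= 54,
    "state": "CA",
    "has_psoriasis": True,
    "treats_with_medication": False,   # not stored in the panel database
}
profile = {"gender": "male", "age": 41, "state": "CA", "has_psoriasis": True}

print(match_panelist(profile, spec))
# Everything matches except 'treats_with_medication', which lands in `unknown`.
```

In this toy example, everything matches except medication usage, and that one unknown attribute is exactly what forces either an extra screener question or an in-survey termination.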


What’s not so obvious is the lack of technical integration between sampling firms and survey software. I’ve written about this topic many times over the years. Whether the sample company is programming the survey or just sending panelists to a client-hosted survey, there’s almost no technical integration between sampling and survey software. Meaning, sampling firms usually cannot pass the demographic or geographic data they already know about a panelist into the survey software, and sampling systems usually have no real-time visibility, via API, into which quotas are still open. So a panelist clicks into a survey only to be told, “Sorry, you don’t qualify” or “Sorry, the survey is closed.” That notification usually arrives after the panelist has already answered many questions, most of whose answers the panel company already knew. And because sampling firms are paid only for each user who completes a survey, there are financial constraints on how much a panelist can be paid when they don’t qualify.
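Here’s the kind of integration I’m talking about, sketched in Python. The quota endpoint, parameter names and response format are all made up for illustration; no survey platform I know of exposes exactly this API, which is rather the point:

```python
# Hypothetical sketch: before redirecting a panelist, check open quotas via a
# (made-up) survey-platform API and pass along the demographics we already
# know as URL parameters, so the survey doesn't have to re-ask them.

from urllib.parse import urlencode
import requests


def build_redirect(panelist: dict, survey_base_url: str, quota_api_url: str):
    """Return a pre-filled survey URL, or None if no matching quota is open."""
    # Ask the survey platform which quota cells are still open (imagined endpoint).
    quotas = requests.get(quota_api_url, timeout=5).json()

    still_open = any(
        q["remaining"] > 0
        and q["gender"] == panelist["gender"]
        and q["age_min"] <= panelist["age"] <= q["age_max"]
        for q in quotas
    )
    if not still_open:
        return None  # tell the panelist up front, not after ten questions

    # Pass what we already know, so the survey can skip those questions.
    params = {
        "pid": panelist["id"],
        "gender": panelist["gender"],
        "age": panelist["age"],
        "state": panelist["state"],
    }
    return f"{survey_base_url}?{urlencode(params)}"


url = build_redirect(
    {"id": "abc123", "gender": "male", "age": 41, "state": "CA"},
    "https://survey.example.com/s/psoriasis2016",
    "https://survey.example.com/api/quotas/psoriasis2016",
)
print(url or "Survey closed for this panelist")
```

With something like this in place, “Sorry, the survey is closed” happens before the click, and the panelist never re-types data the panel already holds.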

 

The solution is multi-faceted. We (The Research Industry) need better integration between survey software and sampling technology platforms AND we need agreement on best practices for survey design that help panelists provide valuable data. Years ago there was an industry-wide initiative for research agencies and sampling companies to create a shared data warehouse, but I don’t think there was much interest in bearing the costs of managing the platform. I still think this would be a valuable asset for the entire industry.

 

Lastly, mobile mobile mobile. Do you check your email on your mobile phone? So do panelists! Do you want to take a 20+ minute survey staring at your phone? Neither do panelists! If you write and design surveys, you have to ask yourself, “Would I enjoy taking this survey?” When the answer is “Yes,” then we’ll all be helping to build better quality databases of respondents. I think the saying is: “Help me, help you.”

 

Oh, one last thing. Mark, I told you I’d mention you in a blog post. So there you go. You know who you are.

 

About Matt Dusig:

Matt Dusig is a serial entrepreneur from Los Angeles and has co-founded three pioneering digital sampling firms since 1999: goZing, uSamp/Instantly and most recently, Innovate. Matt’s technical know-how and business experience help drive his companies. Matt envisions the ideal market research world where every consumer has a positive experience with surveys, thereby expanding the universe of people willing to participate. Connect with Matt on: LinkedIn or Twitter.

 

About Innovate:
Real People. Quality Data.™ Innovate is a global online sampling firm, generating high-quality data from engaged panelists. Founded by Matt Dusig, Gregg Lavin and George Llorens, Innovate provides 24/7 client service to thousands of market researchers and research departments around the world. Innovate pioneered Human-Powered Sampling, which promotes responsive communication for client satisfaction, and created the First Class Panel, a heavily screened, profiled and engaged audience of survey respondents.

 

Follow Innovate on LinkedIn, Facebook, Twitter or Google+.