Kantar's Profiles Blog

Transitioning to Online: Differences in Marketing Research Data

Posted by Susan Frede on Oct 13, 2015

Everyone hates data transitions, but sometimes they are necessary. In most of the world, marketing research has made the transition from telephone or face-to-face interviewing to online. When these transitions happen, we typically see data differences: some can be measured, calibrated, and explained, while in other cases the root cause is harder to pin down.

Differences in population: One set of differences that is easy to measure stems from differences in the populations being sampled. Lower Internet penetration means an online sample is less representative because not all members of the population are online. Regardless of country, certain key groups are generally less likely to be online: older adults, the less affluent, those in more rural areas, and those with less education. As Internet penetration increases, the online population becomes more like the total population of the country. It is important to consider whether a particular target group can be reached with an online survey; if the group reached online differs from the group previously reached with an offline methodology, results may differ as well.

Differences between self-completion and interviewer-assisted modes: A second, well-documented factor is data differences between self-completion and interviewer-assisted modes. These differences can sometimes be attributed to how questions are asked: interviewer-assisted modes generally rely on oral presentation of questions, while self-completion modes rely on visual presentation. This often means questions can't be asked in exactly the same way. For example, online it is possible to use a check-all-that-apply format, but on the phone each item must be read aloud and answered yes or no. Complicating this further is the handling of "don't know": a "don't know" answer is often accepted on the phone and face-to-face even though it is not a stated answer choice, whereas the only way to offer it online is to include it in the list as an explicit answer choice, which can change responses. Bottom line: with online research, survey design and layout can impact results, and it is essential that questions be written for self-completion with clear instructions.

Social desirability bias: Latin America, China, and several other Asian countries are now in the throes of the transition to online. Researchers are busy trying to explain data differences, and I had the pleasure of contributing to a scholarly work showing that social desirability bias is a key explanation for differences between online and phone. Under social desirability bias, respondents are less likely to report negative things and more likely to report positive things. The respondent wants to look good: for example, they are less likely to admit they smoke and more likely to claim they exercise. The presence of an interviewer on the phone increases social desirability bias.

This new research used data from the Advertising Research Foundation's Foundations of Quality 2 project (ARF FoQ 2). In this study, 17 online sample providers supplied nonprobability-based online samples, and one sample provider supplied a probability-based random-digit-dial (RDD) phone sample. We developed a multivariate model that predicts the direction and magnitude of social desirability bias. Applying the resulting correction factor brings phone and online results much more in line. When the model was applied to data not used in its development, it likewise showed significantly less difference between the two modes. This research and analysis suggest that online results may be less biased, especially for sensitive questions.
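
The model itself is detailed in the paper cited below. Purely as an illustration of the general idea, here is a minimal Python sketch, with hypothetical question names and made-up correction factors, of shifting phone estimates by a per-question social-desirability offset so they can be compared with online results.

```python
# Illustrative sketch only: hypothetical questions and made-up offsets,
# not the published ARF FoQ 2 model. Assumes a simple additive
# social-desirability correction per question, estimated elsewhere
# (e.g., via a multivariate model comparing modes).

# Observed incidence (proportion answering "yes") by phone.
phone_estimates = {
    "smokes": 0.18,      # under-reported on the phone (undesirable behavior)
    "exercises": 0.62,   # over-reported on the phone (desirable behavior)
}

# Hypothetical correction factors: positive where phone respondents
# under-report, negative where they over-report.
sd_correction = {
    "smokes": +0.04,
    "exercises": -0.05,
}

def apply_correction(estimates, corrections):
    """Shift each phone estimate by its social-desirability offset,
    clamping the result to the valid proportion range [0, 1]."""
    return {
        q: min(1.0, max(0.0, round(p + corrections.get(q, 0.0), 4)))
        for q, p in estimates.items()
    }

corrected = apply_correction(phone_estimates, sd_correction)
print(corrected)  # {'smokes': 0.22, 'exercises': 0.57}
```

The corrected phone figures can then be compared directly against online results; in the published work, this kind of adjustment substantially narrowed the gap between the two modes.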

Lightspeed GMI has a wealth of experience moving research from offline to online. That experience can be put to work to ease the transition in Asia and Latin America.

Gittelman, S., Lange, V., Cook, W.A., Frede, S.M., Lavrakas, P.J., Pierce, C., & Thomas, R.K. (2015). Accounting for social-desirability bias in survey sampling. Journal of Advertising Research, 55(3), 242-254.

 

Topics: Marketing Research Data, Emerging Markets
