What does the length of a survey have to do with the quality of information?
This question is increasingly being asked in market research circles, and it is a topic I have spent a lot of time on, for the simple reason that long, complex surveys make it harder to collect valid and reliable information.
How long should a survey be?
Surveys range from a quick one-question poll to marathons of questions and exercises that test everyone's patience. The authors of these surveys seem to expect respondents to give them 40 minutes of undivided attention, but this has the opposite effect: most respondents are simply not motivated to do so.
What do we know about those who are willing to sit through lengthy exercises? For starters, they turn out to be very different from typical customers. These are highly engaged internet users who tend to search and purchase online more frequently than the rest of the population.
What other evidence is there? Pew Research has reported that its phone survey response rates fell from 36% to 9% between 1996 and 2012. ESOMAR's 2014 Global Market Research report showed a decline in the proportion of research budgets spent on survey-based research. There has also been an increase in respondents who, rather than answering thoughtfully, simply do whatever they can to complete the survey and collect the incentive.
A 2011 study reported that the more questions a survey contains, the less time respondents spend answering each one, on average. Long surveys push respondents toward a mechanical, disengaged way of responding.
Then there is the rise of the mobile respondent. Surveys designed for a computer or tablet do not translate well to the smaller screens of mobile devices. The experience needs to be simplified if mobile users are going to respond to surveys at all.