By Josh Zagorsky on June 1, 2016
Researchers worry a lot about respondents dropping out and failing to complete a survey. So for as long as we’ve been building text message survey technology, we’ve heard one question over and over again: “How long should SMS surveys be?” Nearly everyone, including us, has assumed that SMS surveys need to be short to keep respondents from getting annoyed. We’ve sometimes suggested that SMS surveys should be kept under 10 questions. But is that true? What is the real upper limit on the number of questions people are willing to answer?
One daring survey researcher set out to push the limits of human knowledge on this subject. Brian McDonald, Associate Director of High Point University’s Survey Research Center, used Instant Census to run a pioneering SMS survey in December 2015. One unknown he wanted to test was how many questions an SMS survey could ask before people started dropping out in droves. So he created an unusually long text message survey: 25 questions (19 yes/no/skip questions, 4 multiple choice questions, and 2 open response questions). He sent out the survey to 246 randomly-selected North Carolina adults who several months earlier had agreed to be texted a survey for no compensation.
The results astounded us.
45% of people started the survey
89% completed the entire survey
11% attrition rate
If we compare this to completion rates for SurveyMonkey’s web surveys, the Instant Census text message survey actually did slightly better. SurveyMonkey typically sees an 89% completion rate for 10-question surveys, and it drops off slowly from there; for 25-question surveys, they expect only about an 86% completion rate, which is 3 percentage points lower than what we saw. What this means is that if you’re running a medium-length survey, text messaging is actually as suitable a mode as the web.
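The percentages above reduce to simple arithmetic. Here is a minimal sketch of that calculation; note that the raw respondent counts used below are hypothetical for illustration, since the post reports only the rates:

```python
def completion_stats(started, completed):
    """Return (completion rate, attrition rate) as percentages,
    given how many respondents started and finished a survey."""
    completion = 100.0 * completed / started
    return completion, 100.0 - completion

# Hypothetical counts: if ~111 people started and ~99 finished,
# the completion rate is roughly the 89% reported above.
completion, attrition = completion_stats(111, 99)
print(round(completion, 1), round(attrition, 1))  # ~89.2 and ~10.8

# SurveyMonkey's expected web completion rate for 25 questions: ~86%.
sms_completion = 89.0
web_completion = 86.0
print(sms_completion - web_completion)  # 3.0 percentage points
```

The point of the comparison is simply that attrition is the complement of completion, and that the gap between modes is measured in percentage points, not a ratio of the two rates.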
I took the survey (although because I’m one of the engineers on Instant Census, I was removed from the dataset), and it took me about 10 minutes, an average of about 25 seconds per question. The survey was fast: I got every question a few seconds after I’d answered the previous one, and I was in Mexico, on a Mexican cellular network; despite texting a US number internationally, the lag was unnoticeable. It felt like texting a friend: a friend who types at lightspeed and has already decided what restaurant we’re meeting at.
One aspect of the text message survey did feel a little slower than a web survey: because each question was self-contained in a message, I was forced to think about each question individually. Sometimes on web surveys I scan the page and answer a mass of questions at once, especially if they’re in a matrix. I couldn’t do that easily on the SMS survey, which made it feel more like a CATI-powered phone survey in which the interviewer makes me answer one question at a time. So if the SMS survey was slower than a web survey, the positive consequence is that I was concentrating a lot more on the individual questions.
There’s another aspect of text message surveys that may have contributed to the high completion rate: it’s very easy to stop and restart an SMS survey. When you take a mobile- or desktop-web survey, if you leave the browser tab for too long and try to come back, that survey is often gone because your session expired, your data were cleared, or the page reloaded and lost everything. With SMS surveys, when you stop the survey and switch tasks to something else, the last unanswered question is still lurking there at the end of the text message thread, and you can easily return to it at any time. That’s something we saw people do a lot in High Point’s 25-question survey: many people answered several questions in a row, took a break for a few minutes or a few hours, and came back and answered the rest. The median time to complete the survey was 1 hour and 14 minutes, but the range ran from 8 minutes to 16 hours. This also means that a large portion of respondents will break an SMS survey into more than one sitting. That makes the survey more convenient for the respondent and may improve the total completion rate, but we also don’t yet know what effect (if any) it has on data quality.
More research needs to be done on the effects of SMS survey length. We’d love to see additional studies on how survey topic, recruitment method, question styles, and other factors affect the relationship between survey length and completion rate. But based on Brian McDonald’s research, it looks like the conventional wisdom is wrong in believing that text message surveys need to be short. Because it’s easy for respondents to pause SMS surveys and pick up later right where they left off, if a text message survey has too many questions, respondents may break it up on their own without the ultimate completion rate suffering. Text message surveys may be able to ask as many questions as web surveys; we may someday be taking comprehensive 50-100-question text message surveys!