By Chris McCarthy on October 26, 2015
SMS is a great way to conduct a survey. It can help you reach consumers who are normally difficult to contact, can get responses in real time, and is so unobtrusive that people are willing to keep answering day after day, month after month, year after year. With all of that said, not all SMS surveys are created equal. Similarly, not all SMS survey tools are created equal. If either the design or the deployment of your survey fails, you are not going to maximize its potential. So, let’s take a look at how you can make your SMS survey the most equal of them all.
Let’s start with the obvious: there is a major technical limitation with SMS surveys. You can’t fix it, Instant Census can’t fix it, and anyone who says they can fix it is lying right to your face. This limitation should inform basically every design choice you make for an SMS survey: the hard and fast limit on the length of a message is 160 characters (70 if you’re using Cyrillic or another non-Latin alphabet). No single text message can be longer than that. As a result, no single question you send should be longer than that limit, and no question should aim to elicit a response longer than that.
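To make that limit concrete, here is a minimal Python sketch of a pre-flight check on question length. The 160/70 split between the GSM 7-bit alphabet and UCS-2 encoding comes from the SMS standard itself; the character set below is only a rough approximation, and the function name is invented for illustration.

# Minimal sketch: check whether a question fits in a single SMS segment.
GSM7_LIMIT = 160   # single-segment limit for the GSM 7-bit alphabet
UCS2_LIMIT = 70    # single-segment limit once any non-GSM character forces UCS-2

# A rough approximation of the GSM 7-bit basic character set.
GSM7_CHARS = set(
    "@£$¥èéùìòÇ\nØø\rÅåΔ_ΦΓΛΩΠΨΣΘΞÆæßÉ !\"#¤%&'()*+,-./0123456789:;<=>?"
    "¡ABCDEFGHIJKLMNOPQRSTUVWXYZÄÖÑÜ§¿abcdefghijklmnopqrstuvwxyzäöñüà"
)

def fits_in_one_message(question: str) -> bool:
    """Return True if the question can be sent as a single SMS."""
    if all(ch in GSM7_CHARS for ch in question):
        return len(question) <= GSM7_LIMIT
    return len(question) <= UCS2_LIMIT  # Cyrillic, emoji, etc. force the 70-char limit

print(fits_in_one_message("On a scale of 1-5, how was your day? Reply with a number."))  # True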
Now, you’re a smart, savvy person. You may be thinking at this point, “That’s fine; I’ll just split my super long questions up into multiple messages and send them one after another.” Technically, you’re correct. You could do this, but you wouldn’t want to, for a number of reasons. First and foremost, SMS does not guarantee delivery order of messages. This is another hard and fast mechanical limitation of the system itself. If you send out three messages at once, they could arrive in any order. If you send out three messages one shortly after the other, they could still arrive in any order. And if you’re sending out one incredibly long question split across several messages, there’s no better way to leave the user confused and unable to answer usefully than for them to receive that question in a random order, reducing your finely crafted data-gathering instrument to incoherence, which will probably elicit similar incoherence in response (if the user even bothers to answer). That’s why the key to keeping a survey engaging and healthy is to send one message, receive an answer, and only then send another. This ensures that the user has received your previous communication and will have that context when the next one arrives.
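Here is a minimal sketch of that send-one, wait-for-one flow, with send_sms() standing in for whatever SMS gateway call you actually use; the class and function names are invented for illustration, not any particular tool’s API.

# Minimal sketch of the one-question-at-a-time flow described above.
from dataclasses import dataclass, field

@dataclass
class Conversation:
    phone: str
    questions: list
    answers: list = field(default_factory=list)

    def next_question(self):
        """Return the next unanswered question, or None when the survey is done."""
        if len(self.answers) < len(self.questions):
            return self.questions[len(self.answers)]
        return None

def send_sms(phone: str, text: str):
    print(f"-> {phone}: {text}")  # stand-in for a real SMS gateway call

def on_reply(convo: Conversation, reply: str):
    """Record the answer to the outstanding question, then (and only then) send the next one."""
    convo.answers.append(reply)
    nxt = convo.next_question()
    if nxt:
        send_sms(convo.phone, nxt)

convo = Conversation("+15555550123", ["Did you vote today?", "How long did you wait in line?"])
send_sms(convo.phone, convo.next_question())   # ask the first question
on_reply(convo, "yes")                          # reply arrives -> second question goes out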
This process works both ways, incidentally. If a user sends in more than 160 characters in response to a question, the reply will get chopped up into multiple messages, and you could get those messages back in any order. Making sense of data is troublesome enough without having to figure out the correct order of someone’s SMS manifesto. Of course, to some extent you cannot control how many messages a user will send. If someone wants to send in their 30,000-word thesis on why Donald Trump will lead America into a glorious new future, you can’t stop them. But you can decrease the likelihood of them doing that by asking simple, direct questions that elicit one- or two-word responses.
So, to reiterate: do not make questions longer than 160 characters, and do not attempt to elicit answers longer than 160 characters.
And as long as we’re talking about keeping things short, it’s also worth mentioning that no single SMS survey should be particularly long. The reason people respond so readily, so quickly, and so often to SMS surveys is that they aren’t very intrusive. That can quickly change if you abuse the system and spam people with question after question. Keep the surveys short, maybe a half dozen questions at most, and people will keep responding. If you need to ask more questions than that, no problem. Just split the survey up over the course of a few days, or maybe even weeks if it is long and difficult enough. Better to ask someone one question a day for twenty days than twenty questions on the same day. One maintains the leisurely, low-impact pace that makes the system work. The other feels like a strenuous intrusion on people’s lives.
Having dealt with the ramifications of SMS’ inherent technical limitations, you might think it’s smooth sailing going forward. But, as I said at the outset, not all SMS survey systems are created equal, and many come with their own, often incredibly detrimental, technological limitations. The key to SMS’ success is how easy, unobtrusive, and fluid it is as a medium, and when a tool stands at cross purposes with that, your survey will suffer.
Let’s take the study put forward by Johan Hellström of Stockholm University in the International Journal of Public Information Systems, vol 2015:1. Hellström attempted to use SMS communication to address the democratic deficit in Uganda. A noble goal, and one that underscores SMS’ ability to reach often underserved populations. However, upon inspection, there are a number of ways the survey deployment system failed to work as effectively as possible.
To begin with, the survey required a user to answer every question in exactly the right way or the user was simply dropped from the survey. The inability of users to indicate that they wished not to answer a question is obviously undesirable, from both a survey design and a technical implementation standpoint, and it led to a drop-off in responses as the survey went on. This was compounded by the fact that the template for responses was needlessly complex. To indicate a willingness to participate, a simple yes or no would not do (in fact, it would result in the user being dropped from the system due to an improper response); instead, the user was prompted to respond with “PART
As Hellström himself says, “this meant that the number of respondents dropped with every question, which must be considered normal”. The first part of that is beyond doubt: requiring a needlessly complex response template, prohibiting the skipping of a question, and being unable to handle variations in responses will certainly cause the number of respondents to drop off. But you should not settle for that and consider it “normal”. Your SMS survey tool should be smart enough to handle people responding as if they were people. Someone should be able to respond to a yes-or-no question with just “yes” or “no”, or even “yep” or “nope”, without being dropped from the survey for responding in an incorrect way.
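As a rough illustration, a response parser along these lines is enough to accept the obvious variations without dropping anyone; the word lists are illustrative, not exhaustive, and this is not any particular tool’s actual implementation.

# Minimal sketch: accept common variations of "yes" and "no" instead of
# dropping respondents for not matching an exact template.
YES_WORDS = {"yes", "y", "yep", "yeah", "yea", "sure", "ok", "okay"}
NO_WORDS = {"no", "n", "nope", "nah"}

def parse_yes_no(reply: str):
    """Return 'yes', 'no', or None if the reply can't be interpreted."""
    cleaned = reply.strip().lower().rstrip(".!")
    if cleaned in YES_WORDS:
        return "yes"
    if cleaned in NO_WORDS:
        return "no"
    return None  # re-prompt or flag for review rather than dropping the respondent

print(parse_yes_no("Yep!"))   # -> "yes"
print(parse_yes_no("nah"))    # -> "no"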
Incidentally, the fact that users must begin their response with some kind of identifying tag suggests that, without one, his system wouldn’t know which question an answer was responding to. Timestamps alone, or some other feature running behind the scenes, ought to be able to handle that. There is no reason that work should get pushed onto the end user, as is the case here.
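One way to do that behind the scenes, sketched here with invented names, is simply to track which question each phone number was last asked and attribute the next inbound reply to it, so the respondent never has to tag anything.

# Minimal sketch: the system, not the respondent, tracks which question an
# inbound reply answers, keyed by phone number. No "PART"-style prefix needed.
pending_question = {}   # phone number -> id of the question awaiting an answer
responses = {}          # (phone number, question id) -> answer text

def ask(phone: str, question_id: str):
    pending_question[phone] = question_id
    # ...send the question text via your SMS gateway here...

def handle_inbound(phone: str, text: str):
    """Attribute the reply to whatever question this number was last asked."""
    question_id = pending_question.get(phone)
    if question_id is None:
        return  # unsolicited message; log it rather than guessing
    responses[(phone, question_id)] = text

ask("+15555550123", "q1_turnout")
handle_inbound("+15555550123", "yes")
print(responses)   # {('+15555550123', 'q1_turnout'): 'yes'}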
Furthermore, Hellström’s survey itself remains rather rudimentary. It doesn’t employ skip logic, answer piping, or any other more complex survey design feature. It also asks the next question immediately after receiving an answer to the previous one, and there are 10 questions. All of these might be intentional, well-thought-out design choices. Sometimes you need to ask everyone the same questions back to back. But that should be a design choice, not a design limitation. A system that forces you to do so, that doesn’t allow skip logic and can’t pipe in previous answers, severely limits your ability to field a good survey.
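For a sense of what skip logic and answer piping might look like under the hood, here is a small sketch; the dictionary format, question IDs, and field names are invented for illustration, not any particular tool’s API.

# Minimal sketch of skip logic and answer piping in a question definition.
SURVEY = {
    "q_voted": {
        "text": "Did you vote today?",
        "next": lambda answer: "q_wait" if answer == "yes" else "q_why_not",  # skip logic
    },
    "q_wait": {
        "text": "About how many minutes did you wait in line?",
        "next": lambda answer: "q_thanks",
    },
    "q_why_not": {
        "text": "What kept you from voting?",
        "next": lambda answer: "q_thanks",
    },
    "q_thanks": {
        # answer piping: reuse an earlier answer in later wording
        "text": "Thanks! We've recorded that you waited {q_wait} minutes.",
        "next": lambda answer: None,
    },
}

def render(question_id: str, previous_answers: dict) -> str:
    """Fill any piped answers into the question text."""
    return SURVEY[question_id]["text"].format(**previous_answers)

answers = {"q_voted": "yes", "q_wait": "20"}
print(render("q_thanks", answers))   # Thanks! We've recorded that you waited 20 minutes.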
What all this boils down to is that your SMS survey tool should be smart. While computers cannot yet fully parse human language, a rudimentary ability to handle slightly malformed responses should come standard with any system. Similarly, there should be tools built right into the system that let you start asking people the most germane questions as quickly as possible; surveys are not a one-size-fits-all affair. Finally, it should be able to sustain interaction over the course of not just one survey, but multiple surveys over multiple days. Once you have an important datum about a user, you should be able to use it in not just the current survey, but in any survey you ever send that user. Your SMS tool should be doing the grunt work, not you, and certainly not your users.
Want to know how adding SMS surveys to your research methods can help you reach new audiences and improve your data collection efforts? Get in touch with Instant Census today!