By Elyse Desmarais on November 11, 2015
This past October, the Massachusetts Bay Transportation Authority (MBTA) set out to give its subway trains a paint scheme makeover. In an effort to engage with riders, the MBTA publicly released a survey with exterior design options for each subway line’s trains for customers to vote on. While this marketing stunt was most likely meant to make good with riders who suffered service interruptions this past winter, it was also an opportunity for the MBTA to show customers it listens. Sounds like a good idea, right?
Unfortunately, due to poorly designed survey technology, the MBTA ended up looking disorganized to survey participants and the general public. After initially releasing the paint scheme winners for the Red, Orange, and Green Line subway cars, the MBTA is now withholding the data after strange results were identified in the Red and Green Line voting.
Boston Globe reporter Jack Newsham cried foul on the initial Red and Green Line results, calling them “rather suspicious”: both received more than 177,000 votes, while the Orange Line received only 25,000. Adding to the suspicion, the Red and Green Line design winners won by overwhelming margins:
90% of the vote went to the winning Red Line option, featuring a large “T” on the side of the car.
89% of the vote went to the winning Green Line paint scheme, with the design exhibiting a large section of green along the side.
However, the Orange Line winner only took 41% of the vote, with the simple design almost matching the current paint scheme.
It’s clear the Red and Green Line vote totals were heavily skewed, most likely because of the survey technology the MBTA used. According to a Boston Globe report, “hundreds of votes appeared to have originated from the same computer,” with a single computer submitting up to three survey responses per second. Votes also spiked toward the end of the survey period, an unusual trend. MBTA spokesman Joe Pesaturo said the agency was contacting SurveyMonkey, the survey vendor used for the project, about these inconsistencies.
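The red flags the Globe describes (many votes from one source, bursts of several per second) are straightforward to detect programmatically. Here is a minimal sketch, assuming vote records arrive as (source identifier, timestamp) pairs; the data below is hypothetical and illustrative only, not the MBTA’s actual logs:

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical vote records: (source_id, timestamp).
# Real logs would come from the survey vendor's export.
votes = [
    ("10.0.0.5", datetime(2015, 10, 30, 12, 0, 0)),
    ("10.0.0.5", datetime(2015, 10, 30, 12, 0, 0, 400000)),
    ("10.0.0.5", datetime(2015, 10, 30, 12, 0, 0, 800000)),
    ("10.0.0.7", datetime(2015, 10, 30, 12, 5, 0)),
]

# Flag any source that submitted more than one vote.
counts = Counter(source for source, _ in votes)
repeat_sources = {s for s, n in counts.items() if n > 1}

# Flag bursts: three or more votes from one source within one second,
# matching the "three responses per second" pattern the Globe reported.
def has_burst(timestamps, window=timedelta(seconds=1), threshold=3):
    ts = sorted(timestamps)
    for i in range(len(ts) - threshold + 1):
        if ts[i + threshold - 1] - ts[i] <= window:
            return True
    return False

by_source = {}
for source, t in votes:
    by_source.setdefault(source, []).append(t)

burst_sources = {s for s, ts in by_source.items() if has_burst(ts)}
print(repeat_sources)  # {'10.0.0.5'}
print(burst_sources)   # {'10.0.0.5'}
```

Checks like these are cheap to run before publishing results, which is exactly the step that appears to have been skipped here.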
Even without knowing the whole story behind why SurveyMonkey’s survey technology failed the MBTA, it’s clear from the initial data that participants were allowed to vote as many times as they wanted, skewing the results. In other words, there was no way to verify that each participant took the survey only once. The MBTA should have ensured with SurveyMonkey that each participant could cast only one vote per subway line - but hindsight’s 20/20.
With Instant Census SMS surveys, you don’t have to worry about participants fraudulently voting more than once and skewing your results. The beauty of text message surveys is that each phone number (participant) can take a survey only one time. In fact, this safeguard is built into our software to make our customers’ lives easier. This means you can count on accurate results and data every time, without worrying about fraudulent survey respondent behavior.
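The one-response-per-number guarantee boils down to deduplication keyed on the phone number. The following is a simplified sketch of that idea, not Instant Census’s actual implementation; the function name and sample numbers are hypothetical:

```python
# Simplified sketch of per-phone-number deduplication.
# This illustrates the concept only; it is not Instant Census's code.
numbers_seen = set()
results = {}

def record_response(phone_number, answer):
    """Accept the first response from each phone number; reject repeats."""
    if phone_number in numbers_seen:
        return False  # duplicate: this number already responded
    numbers_seen.add(phone_number)
    results[answer] = results.get(answer, 0) + 1
    return True

record_response("+16175550100", "Design A")  # accepted
record_response("+16175550100", "Design A")  # rejected as a duplicate
record_response("+16175550123", "Design B")  # accepted
print(results)  # {'Design A': 1, 'Design B': 1}
```

Because a phone number is tied to a real SIM, it is a far harder identifier to forge at scale than a browser session or IP address, which is what makes this check effective.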
Don’t let inaccurate, skewed results derail your next survey. Make sure your survey technology can deliver accurate data every time. With Instant Census automated SMS surveys, you won’t have to think twice.