Chris Antoun, winner of the 2013 James N. Morgan Fund for New Directions in Analysis of Complex Interactions, is studying the potential impacts of completing surveys on smart phones.
They’re familiar sights. A man walking down the street, a woman riding on a bus, both staring fixedly at the smart phones on which they’re avidly tapping. They could be sending a text, or answering an email, or buying a pair of pants. Or they could be responding to a web survey. In which case, does it matter that a bike just cut in front of the man, or that a friend just got on the same bus as the woman, or that their phones just rang?
These are some of the questions Chris Antoun is wrestling with as he investigates the impact of completing surveys on mobile devices. “Does the device that people use to complete web surveys matter?” asks Antoun, a doctoral student in the Program in Survey Methodology at the University of Michigan. “If it doesn’t, that’s great for researchers because then respondents should be able to choose the device they use to access these web surveys.”
To learn more, Antoun conducted research late last year through the LISS panel in the Netherlands, an online panel consisting of 5,000 Dutch-speaking households. For Antoun’s study, about 700 participants were invited to fill out a questionnaire using a computer, while another 700 were invited to complete the same questionnaire using a smart phone. A month later, the two groups swapped technology in completing a second questionnaire.
Antoun hopes the results will help answer three main questions. First, he wants to see if the device used to complete a survey affects how thoughtful respondents are in providing answers. If they’re multitasking or interrupted by distractions, how might that affect the quality of the information they provide?
Second, he hopes to learn whether the device changes respondents’ willingness to disclose sensitive information. Phone users often go online in public places, after all, which could make them feel more exposed, or make the answers they give seem less private.
And third, he wants to look at nonresponse: Are the people who don’t respond to surveys on smart phones different in some way from those who do? For example, do older people avoid completing surveys on phones? If so, findings based only on those who do participate will be biased, failing to reflect the full population of potential respondents.
It’s on the final question that Antoun will be using SEARCH, a statistical analysis program developed at the Institute for Social Research by founder James Morgan and others. This part of Antoun’s research will be supported by the first-ever award from the James N. Morgan Fund for New Directions in Analysis of Complex Interactions, an award intended to spur innovative uses of SEARCH.
A more traditional method of statistical modeling requires the researcher to identify specific hypotheses ahead of time, and then to test them. SEARCH, by contrast, “searches through all possible predictors to find the one that’s most predictive of your outcome,” Antoun explains. “So it can look for these very complicated interactions, which would be hard to do using other techniques.” Antoun will use both a traditional method and SEARCH, and then compare the two approaches to see which performs better, “at least for this particular survey.”
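The core idea Antoun describes can be sketched in a few lines of code. The toy example below is purely illustrative, not the actual SEARCH program or Antoun's data: it exhaustively tries every candidate predictor and split point, keeping whichever split best explains the outcome. The variable names and the tiny dataset are hypothetical.

```python
# Illustrative sketch of the exhaustive-split idea behind programs like
# SEARCH: scan every predictor and threshold, keep the split that most
# reduces the outcome's squared error. Not the real SEARCH implementation.

def sum_sq_error(ys):
    """Total squared error of the values around their mean."""
    if not ys:
        return 0.0
    mean = sum(ys) / len(ys)
    return sum((y - mean) ** 2 for y in ys)

def best_split(rows, outcome, predictors):
    """Try every predictor/threshold pair; return the split (predictor,
    threshold, error_reduction) that best predicts the outcome."""
    base = sum_sq_error([r[outcome] for r in rows])
    best = (None, None, 0.0)
    for p in predictors:
        for t in sorted({r[p] for r in rows}):
            left = [r[outcome] for r in rows if r[p] <= t]
            right = [r[outcome] for r in rows if r[p] > t]
            if not left or not right:
                continue  # a one-sided split explains nothing
            reduction = base - (sum_sq_error(left) + sum_sq_error(right))
            if reduction > best[2]:
                best = (p, t, reduction)
    return best

# Hypothetical data: does age or screen size predict survey break-off?
rows = [
    {"age": 25, "screen_in": 4.7, "broke_off": 0},
    {"age": 31, "screen_in": 4.0, "broke_off": 0},
    {"age": 58, "screen_in": 4.0, "broke_off": 1},
    {"age": 64, "screen_in": 4.7, "broke_off": 1},
    {"age": 70, "screen_in": 5.5, "broke_off": 1},
]
predictor, threshold, gain = best_split(rows, "broke_off", ["age", "screen_in"])
print(predictor, threshold)  # → age 31
```

Here the search discovers on its own that age, not screen size, separates the break-offs, without the researcher hypothesizing that in advance; real segmentation programs repeat this step recursively, which is how they surface the complicated interactions Antoun mentions.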
Antoun is now analyzing the results of the LISS surveys, and the findings will form the core of his dissertation; he expects to complete his doctoral studies in 2015. What he learns, he says, should just be the first step in understanding and making the best use of mobile devices in survey research. “I can imagine future studies comparing tablets to smart phones, and tablets to PCs, and different questionnaire design formats, and the effect of touch screen input compared to a physical keyboard and mouse,” he says. “It would be really unfortunate if social science researchers didn’t know how to take advantage of this change in how people go online.”