Charles (Charlie) Cannell was a founder of the Institute for Social Research (ISR), serving the U-M community from 1946 until his passing in 2001. He was a leader in the field of survey research, and was deeply committed to mentorship. Charlie exemplified the values with which the Survey Research Center (SRC) was launched: the belief that high-quality data on society are essential for the formulation and adoption of wise and humane policies. Since most such data are based on respondents’ self-reports, it is critical that research participants understand their role and are motivated to fulfill it. Charlie devoted his career to understanding and improving the quality of self-reports.
His many publications — books and chapters, journal articles, and technical reports — document the growth and development of that methodological research. The overarching theme was understanding the factors that affect the quality and validity of self-reported survey data, in particular the interaction among the respondent, the interviewer, and the survey questions. The designs were varied: experimental and nonexperimental, quantitative and qualitative, observational and self-reported. Almost all were jointly authored by Charlie and one or more of his colleagues and students, who also became his friends. It was in this spirit that Charlie created the Charles Cannell Fund in Survey Methodology to support emerging researchers at the University of Michigan.
The Charles Cannell Fund in Survey Methodology furthers research in survey methodology at the Survey Research Center, especially graduate student research and training in an array of issues related to measurement and response quality. The Fund particularly seeks to support research on the interaction among the interviewer, the respondent and the instrument and their effects on the quality of survey data.
Graduate students, postdocs, and faculty at the University of Michigan are eligible to receive funds. Awards to senior faculty should be for the purpose of supporting research by graduate students. Faculty and postdocs can also apply for funds that will cover the creation and implementation of courses on new data collection methodologies for the ISR Summer Institute.
SRC leadership, in collaboration with a faculty committee, will review potential uses of the fund, ensuring the funds are used to further this important area of research. If you have any questions about the award process, please send an email to firstname.lastname@example.org.
The application for funding from the Charles Cannell Fund can be accessed here: InfoReady.
- At the one-year mark, you will be asked to share an update on the status of your research.
- Acknowledge ISR and this award’s support in publications and presentations.
Cannell Fund Recipients and Their Projects
Tiffany Neman is a doctoral student in sociology at the University of Wisconsin-Madison. Using a racially and ethnically diverse sample of respondents, her Cannell-funded research examines interviewer and respondent interactional behaviors across a set of respondent and question characteristics, with a particular focus on question sensitivity and respondents’ race or ethnicity. She hopes that her research will help identify the potential impact of interactional behaviors on the data quality of sensitive versus non-sensitive items. Neman plans to publish her findings in 2021.
Mariel McKone Leonard is a doctoral student in the Graduate School of Economic and Social Sciences at the University of Mannheim in Mannheim, Germany. As part of her dissertation, she proposes to develop and test an instrument to measure emotional labor of interviewers when conducting surveys on sensitive topics and how emotional labor can potentially lead to vicarious trauma and burnout in interviewers. She plans to share her findings at conferences in 2020, as well as in her dissertation.
Antje Rosebrock is a doctoral student in the Graduate School of Economic and Social Sciences at the University of Mannheim in Mannheim, Germany. Her research aims to evaluate whether eye-tracking data can help researchers identify potential problems with survey questions prior to the field stage, as certain eye-tracking data has been shown to be linked to cognitive burden. Additionally, she will investigate learning effects of interviewers over the field period on a cognitive level, particularly whether interviewers process questionnaire content differently in later interviews. She will share her findings in her dissertation.
Yuliya Kosyakova is a researcher in Migration and International Labour Studies at the Institute for Employment Research in Nuremberg, Germany. Her current research aims to develop tools to detect interviewer falsification, to evaluate existing methods of detection, and to compare those methods with the new tool she develops through the use of multilevel modeling. She received her Ph.D. in Political and Social Sciences from the European University Institute in 2016. She and her team have published this report: Development of Tools to Detect Interviewer Falsification.
Kristen Olson and Jolene Smyth, associate professors at the University of Nebraska-Lincoln, were awarded funds to support junior researcher attendance at Interviewers and Their Effects from a Total Survey Error Perspective Workshop in Lincoln, Nebraska, February 26-28, 2019.
Isabel Anadon is a doctoral candidate in sociology at the University of Wisconsin-Madison. Her Cannell-funded research examines respondents’ response latencies during the administration of a series of items measuring trust or mistrust in medical research among a diverse group of respondents. She hopes that the analysis of the interaction will contribute to our understanding of disparities in health-related outcomes. Anadon plans to publish her findings in 2018.
Felicitas Mittereder is a doctoral candidate in the Michigan Program in Survey Methodology. Her dissertation investigates respondent-researcher communication in online surveys. She is examining respondent interaction with the survey instrument based on so-called paradata (passively collected data about the response process, such as time spent on a question, skipped questions, etc.). She is using her Cannell funds to test her prediction model of breakoff (incomplete surveys) in real-time data collection. She hopes to shed light on the breakoff process in order to improve respondent-researcher communication in web survey research. Mittereder intends to defend her dissertation in 2019.
Shelley Feuer is a doctoral candidate in cognitive psychology at the New School for Social Research. Her Cannell-funded project is her dissertation, which examines video-mediated interviewing (e.g., via Skype) as a cost-effective potential mode for data collection. Her study investigates the effects of self-view on respondents’ levels of disclosure and feelings of comfort in the interview. She will also use eye tracking to determine how self-view affects disclosure. Feuer plans to share her findings in late 2017.
Jamie Griffin is an Assistant Research Scientist with the Panel Study of Income Dynamics (PSID) at the Institute for Social Research. Using Cannell Award funds, she will evaluate the interviewer-respondent interaction during the administration of the PSID event history calendar in the main interview. Specifically, she aims to document the prevalence of key interviewer and respondent verbal behaviors associated with respondent recall, respondent uncertainty about the timing of residential and employment-related events, and interviewer reactions to respondent uncertainty. Griffin shared her findings at AAPOR in 2016. The work is available as a technical paper on the PSID website.
Simon Kühne is a doctoral student at the Berlin Graduate School of Social Sciences, Humboldt-Universität zu Berlin and a Research Associate with the Socio-Economic Panel Study at DIW Berlin. As part of his dissertation, the Cannell Award funds allowed him to collect and combine data on both self-reported attitudes and mutual estimates of each other’s attitudes of the interviewer and respondent. Kühne shed light on how interviewer attitudes and mutual interpersonal perceptions of interviewers and respondents affect respondents’ answers related to attitude questions in face-to-face interviews. Kühne shared his results at AAPOR in 2017 and his presentation slides are available here.
Kristen Cibelli Hibben’s Cannell-funded project examines the effect of respondent commitment, tailored feedback, and the use of contextual recall cues on the quality and accuracy of reported health care utilization in an online survey of the parents or guardians of child patients at the University of Michigan Health Service. Her study extends the existing research by examining the effect of these techniques in increasing response accuracy, validating responses against medical record information. The survey was fielded in March 2015. She completed her dissertation, The Effects of Respondent Commitment and Feedback on Response Quality in Online Surveys, and received her Ph.D. in 2017. Cibelli Hibben plans to present results from the Cannell-funded survey at the 2017 AAPOR National Conference. She currently works part-time as a Survey Methodologist at the Survey Research Center’s International Survey Operations at the Institute for Social Research at the University of Michigan and on various projects as an independent consultant. (Read Kristen Cibelli Hibben’s Profile.)
Hanyu Sun is investigating whether rapport can be established in video-mediated interviews as it is in computer-assisted personal interviews (CAPI), whether video-mediated interviews increase disclosure of moderately sensitive information to the same extent as CAPI, and whether the interaction between the interviewer and respondent in the preceding module (CAPI or video-mediated) has an effect on the reporting of sensitive information in the subsequent ACASI module. Sun completed her dissertation, Rapport and Its Impact on the Disclosure of Sensitive Information in Standardized Interviews, and received her Ph.D. in 2014. At the 2015 AAPOR National Conference she presented “The Impact of Rapport on Data Quality in CAPI and Video-Mediated Interviews: Disclosure of Sensitive Information and Item Nonresponse.” She is currently a survey methodologist at Westat.
In 2013, the Cannell Fund co-sponsored the Interviewer-Respondent Interaction Workshop, May 15-16, in Boston, immediately preceding the 68th Annual Conference of the American Association for Public Opinion Research. For more information about the workshop, visit the Workshop website.
Dana Garbarski, 2012-2013 Cannell Fund winner.
Dana Garbarski studies the interaction between interviewers and respondents regarding end-of-life treatment preferences and those of their spouses in order to better understand interactional “rapport” both conceptually and empirically. She received her Ph.D. in Sociology and an M.S. in Population Health Sciences from the University of Wisconsin-Madison in August of 2012. Garbarski is currently an Assistant Professor in the Department of Sociology at Loyola University Chicago. Garbarski, along with colleagues Nora Cate Schaeffer and Jennifer Dykema, edited a special issue for Survey Practice on “Interviewer-Respondent Interaction.” They have also published “Interviewing Practices, Conversational Practices, and Rapport: Responsiveness and Engagement in the Survey Interview” in Sociological Methodology. (Read Dana Garbarski’s profile.)
Julie Korbmacher and Ulrich Krieger, researchers at the Munich Center for the Economics of Aging, which is part of the Max Planck Institute for Social Law and Social Policy, are examining interviewer influence on cooperation rates and survey quality in the Survey of Health, Ageing and Retirement in Europe (SHARE). They will interview interviewers who all work on the same survey but in different countries. Of special interest to their research is how interviewers’ attitudes toward SHARE influence their performance on response rates and nonresponse to sensitive questions, areas where interviewers have been shown to have some influence. Korbmacher is a Ph.D. student in statistics at LMU Munich. Krieger is in the last year of his Ph.D. in sociology at the University of Mannheim.
Brady West’s research attempts to fill in some of the gaps left by nonresponse, asking, “Is the Collection of Interviewer Observations Worthwhile in an Economic Panel Survey? New Evidence from the German Labor Market and Social Security (PASS) Study.” West received his Ph.D. in Survey Methodology from the University of Michigan in 2011. He is now a Research Associate Professor in the Survey Methodology Program at the University of Michigan. (Read Brady West’s profile.)
Jessica Broome received her doctorate in 2012 from the Michigan Program in Survey Methodology. Her dissertation (PDF 1.2 MB) explores vocal characteristics, speech and the behavior of telephone interviewers. She has also published “How Telephone Interviewers’ Responsiveness Impacts their Success” and “First Impressions of Telephone Survey Interviews.” In 2015, Broome also had an opinion piece published in the Data Freaks section of Forbes Magazine, “Scripting Cold Calls Is a Bad Idea: Why most Call Center Managers Have it Wrong.” Broome is currently a research consultant for clients in the private and nonprofit sectors.
Rebecca Rosen’s research, “Effects of Mood and Interviewing Mode on Self-Disclosure by College Students,” (PDF 1.2 MB) seeks to assess the effect of mood and interviewer characteristics when face-to-face interviews are conducted about sensitive issues. Her study focuses on depressed college students and will be carried out at the New School for Social Research. Rosen received her Ph.D. in Clinical Psychology in 2011.
Ashley Clark’s research, “An Investigation of the Effects of Job Attitudes on Interviewer Turnover and Quality of Job Performance in U.S. and Canadian Centralized Telephone Interviewing Facilities,” is an initial step toward filling the gap in our understanding of the effects of interviewer job attitudes on job outcomes, as well as on the cost and quality of the data collected. Her research design includes both a quantitative phase and a qualitative phase to study these effects in several centralized telephone interviewing facilities. Clark expects to receive her Ph.D. from the University of Michigan Program in Survey Methodology. She is currently the Director of the Center for Survey Research and a Clinical Assistant Professor in the School of Public and Environmental Affairs at Indiana University.
2009 – 2010
Stephanie Eckman’s dissertation, “Errors in Housing Unit Listing and Their Effects on Survey Estimates,” (PDF 924K) explores the mechanisms of error in interviewer-created housing unit listings, using original data collection in conjunction with the National Survey of Family Growth. Eckman received her Ph.D. from the University of Maryland’s Joint Program in Survey Methodology in 2010. She is currently employed as a Senior Researcher at the Institute for Employment Research in Nuremberg, Germany.
Ipek Bilgen’s research, “Is Less More & More Less…? The Effect of Two Types of Interviewer Experience on ‘Don’t Know’ Responses in Calendar and Standardized Interviews,” (PDF 2 MB) explores the effect of interpersonal communication dynamics and retrieval strategies on item nonresponse in an interviewer-administered telephone survey. This study also appraises the influence of survey-specific and general interviewer experience on interviewers’ behavior and perception change. Standardized and calendar interviewing techniques are the two methods explored in this study. Bilgen received her Ph.D. from the University of Nebraska-Lincoln, and is currently a Survey Methodologist with NORC at the University of Chicago. She presented “The Effect of Interviewer Experience on Item Non-Response: A Verbal Behavior Study” (PDF 283K) at AAPOR in 2011.
2008 – 2009
Matthew Jans’s dissertation, “Can Speech Cues and Voice Qualities Predict Item Nonresponse and Inaccuracies in Answers to Sensitive Questions?,” (PDF 2 MB) explores the impact of speech and voice quality (change in pitch, pauses, and repairs) on the quality of responses to sensitive questions in surveys. Jans received his Ph.D. from the University of Michigan Program in Survey Methodology in 2009. He presented “Using Respondent Verbal Paradata to Predict Income Nonresponse: How They Say It Can Predict What They’ll Say,” (PDF 303KB) at AAPOR in 2010. Jans is currently the Data Quality and Survey Methodology Manager for the California Health Interview Survey at the UCLA Center for Health Policy Research.
Brooke Foucault Welles’s research uses a series of studies to explore rapport between the interviewer and the respondent. Her goal is to develop a more robust understanding of rapport, including a detailed understanding of which components and surface-level behaviors increase socially desirable responses. In 2009, she and her colleagues presented “Nonverbal Correlates of Survey Rapport” at AAPOR. She earned her Ph.D. from Northwestern University and is currently an Assistant Professor in the Department of Communication Studies at Northeastern University. She and her team have published the following: Nonverbal Behavior in Face-to-face Survey Interviews: An Analysis of Interviewer Behavior and Adequate Responding.
2007 – 2008
Rachel Davis’s dissertation, “Whatever it Means to You: Ethnicity, Language, and the Survey Response in Telephone-Administered Health Surveys of African Americans,” (PDF 475KB) explores the impact of race and ethnic identity on health survey data, interviewer race preferences, and ratings of interviewers among African American telephone survey respondents. Davis received her Ph.D. from the University of Michigan School of Public Health, Department of Health Behavior and Health Education in 2008. She is currently an Assistant Professor in the Department of Health Promotion, Education, and Behavior at the Arnold School of Public Health, University of South Carolina. Research publications based on her Cannell-funded research include “Preferences for Interviewer Dialect Use and Race among African American Health Survey Respondents” (PDF 192K) and “Interviewer Effects in Public Health Surveys.”
Laura Lind’s research, “The Use of Animated Agents in Surveys: How Does Manipulating the Level of Animation and Interactivity of a Computerized Interviewing Agent Affect Respondents’ Answers to Sensitive Survey Questions?” is a lab-based experimental study of how respondents’ answers to sensitive survey questions are affected by four different modes of survey administration (A-CASI administration, low-end animated agent, high-end animated agent, and traditional face-to-face interviews). Lind received her Ph.D. from the New School for Social Research in Cognitive, Social, & Developmental Psychology in 2008. The research publication “Why Do Survey Respondents Disclose More When Computers Ask the Questions?” is based on her Cannell-funded research.
No awards were made.
2005 – 2006
David Wilson’s research, “An Experimental Approach to Estimating Race of Interviewer Effects in Telephone Interviews,” is an experimental study of how the perceived race of the interviewer in telephone surveys affects responses to different kinds of interview content. Wilson received his Ph.D. from Michigan State University in 2005 and is currently an Associate Professor of Political Science and International Relations at the University of Delaware. “Statistical Profiles of Race of Interviewer Perceptibility in National Surveys” (PDF 96K) is based on Wilson’s work funded by the Cannell Fund.
Lindsay Benstead is currently an Assistant Professor of Political Science at the Mark O. Hatfield School of Government at Portland State University. “Effects of Interviewer Gender and Hijab on Gender-Related Survey Responses: Findings from a Nationally-Representative Field Experiment in Morocco” (PDF 373K) was made possible in part by the award from The Cannell Fund.
Jennifer Dykema received support for digitizing tape-recordings of interviewer-respondent interactions that allowed her to continue her research on how these interactions affect the quality of survey responses. Dykema received her Ph.D. from the University of Wisconsin in Sociology in 2004. She is a Senior Scientist and Survey Methodologist at the University of Wisconsin Survey Center. Along with colleagues Nora Cate Schaeffer and Dana Garbarski, Dykema edited a special issue for Survey Practice on “Interviewer-Respondent Interaction Coding: Current Issues and Recent Findings.” She serves on the editorial board of BMC Medical Research Methodology and is the 2017 Conference Chair for the American Association for Public Opinion Research (AAPOR).
2004 – 2005
Patrick Ehlen’s research, “The Dynamic Role of Some Conversational Cues in the Process of Referential Alignment,” focuses on conceptual alignment in answers to survey questions, or how well the concept held by the respondent matches that intended by the designer of the question. The goal is to identify speech behaviors that might indicate conceptual misalignment in order to improve the question-and-answer process. As a post-doc at CSLI at Stanford University, Ehlen subsequently worked on the telephone survey implications of the widespread use of mobile phones in the United States. “Cellular-only Substitution in the United States as Lifestyle Adoption: Implications for Telephone Survey Coverage” was published in Public Opinion Quarterly in 2007, and presented at the 2008 Annual AAPOR Conference. His most recent project involved NSF-funded research on multimodal aspects of the interviewer/respondent relationship, undertaken as a joint effort among ISR, the New School for Social Research, and AT&T Labs. Results from this research were presented at AAPOR 2012 and SIGDIAL 2013, and published as “Precision and disclosure in text and voice interviews on smartphones” in PLOS ONE in 2015. Ehlen is currently Chief Scientist at Loop AI Labs, an artificial intelligence company based in San Francisco.
Frauke Kreuter’s research, “Interviewer Effects as a Function of Respondents, Interviewer and Question Type,” developed a theoretical model to help identify, measure, and reduce interviewer effects in surveys by taking into account the interaction of the respondent, the interviewer, and the properties of the survey question. Frauke Kreuter is a Professor in the Joint Program in Survey Methodology at the University of Maryland, Professor of Statistics and Methodology at the University of Mannheim, and head of the Statistical Methods Research Department at the Institute for Employment Research (IAB) in Nürnberg, Germany.
To support this fund and make future awards possible, please visit our Next Generation Giving Page.