
Leads participants through the process of designing their own survey. Examines the major decisions faced by a health researcher who wants to design and implement a survey. Explores the potential sources of bias associated with alternative approaches to sample design, respondent recruitment, data collection methods (in-person or telephone interviews, computer-assisted interviews, or mail surveys), instrument design, and field administration. Participants prepare a defensible proposal for a survey that they would like to conduct. Emphasizes population surveys, though not exclusively.

FREE

This course includes:
    • Hours of video: 10 hours, 35 minutes
    • Units & Quizzes: 12
    • Unlimited lifetime access
    • Access on mobile app
    • Certificate of Completion

Course Objectives

Upon successfully completing this course, students are able to:
  • Identify the primary sources of error in surveys, and discuss the consequences of each type of error for survey findings
  • Critically evaluate the design, construction, and implications of studies based on survey research
  • Formulate survey strategies that minimize error

Readings

Textbook

Groves RM, et al. (2009). Survey Methodology, 2nd Edition. Wiley-Interscience.
Schedule of Sessions and Readings
Introduction
Session 1: Overview of Course and the Survey Design Process
Readings: Groves et al., Chapters 1-2
Session 2: Choosing Your Own Survey Research Project
Class Preparation Questions: Describe a research question you would like to explore in this class, any initial thoughts you have about the survey you would like to conduct, and any special design challenges you already foresee.
Readings: Groves et al., Chapter 11
Module 1: From Whom Should You Collect Information?
Session 3: Selecting Target Population and Sampling Frame
Class Preparation Questions: What is (are) the unit(s) of analysis for your study? To what population would you like to generalize findings?
Readings: Groves et al., Chapters 3-4; Iannacchione VG (2011). The Changing Role of Address-Based Sampling in Survey Research. Public Opinion Quarterly 75(3): 556-575.
Examples of Sample Designs (read one of the following):
    • Ezzati TM, Hoffman K, et al. (1995). A dual frame design for sampling elderly minorities and persons with disabilities. Stat Med 14(5-7): 571-583.
    • Harris KM, Florey F, Tabor J, Bearman PS, Jones J, Udry JR (2003). The National Longitudinal Study of Adolescent Health: Research Design.
    • Montaquila JM, Mohadjer L, et al. NYC HANES: Design of the Community Health and Nutrition Examination Survey.
    • Frerichs RR, Shaheen MA (2001). Small Community-Based Surveys. Annu Rev Public Health 22: 231-247.
    • Luman ET, et al. (2007). Comparison of two survey methodologies to assess vaccination coverage. International Journal of Epidemiology 36: 633-641.
    • Herbenick D, Reece M, Schick V, Sanders SA, Dodge B, Fortenberry JD (2010). Sexual Behavior in the United States: Results from a National Probability Sample of Men and Women Ages 14-94. J Sex Med 7(suppl 5): 255-265.
    • Puma M, Bell S, Cook R, Heid C (2010). Head Start Impact Study: Final Report. Rockville, MD: US Department of Health and Human Services. pp. 2.01-2.11.
Session 4: Hard-to-Reach Populations
Class Preparation Questions: Are there hard-to-reach populations that are of interest to you? How might you collect sufficient information from them?
Readings:
    • Heckathorn DD, Semaan S, Broadhead R, Hughes JJ (2002). Extensions of Respondent-Driven Sampling: A New Approach to the Study of Injection Drug Users Aged 18-25. AIDS and Behavior 6: 55-67.
    • Robinson WT, Risser JMH, McGoy S, Becker AB, Rehman H, Jefferson M, Griffin V, Wolverton M, Tortu S (2006). Recruiting Injection Drug Users: A Three-Site Comparison of Results and Experiences with Respondent-Driven and Targeted Sampling Procedures. Journal of Urban Health, 24 August 2006.
    • Meyer I, Rossano L, Ellis JM, Bradford J (2002). A Brief Telephone Interview to Identify Lesbian and Bisexual Women in Random Digit Dialing Samples. Journal of Sex Research 39(2): 139-144.
Session 5: Reducing Non-Response
Class Preparation Questions: How big a problem will non-response be? How could you minimize non-response?
Readings: Groves et al., Chapter 6. Read at least one of the following:
    • Singer E, Van Hoewyk J, et al. (1998). Does the payment of incentives create expectation effects? Public Opinion Quarterly 62(2): 152-164.
    • VanGeest JB, Wynia MK, et al. (2001). Effects of different monetary incentives on the return rate of a national mail survey of physicians. Med Care 39(2): 197-201.
    • Teitler JO, et al. (2009). Costs and Benefits of Improving Response Rates for a Hard-to-Reach Population. Public Opinion Quarterly 67: 126-138.
    • Martin E (2009). Can a Deadline and Compressed Mailing Schedule Improve Mail Response in the Decennial Census? Public Opinion Quarterly 73(3): 361-367.
Session 6: Discussion of Class Participants' Sample Plans
Readings: None
Module 2: How Should Information Be Collected?
Session 7: Overview and Face-to-Face Interviewing
Class Preparation Questions: If you chose face-to-face interviewing to collect data, what would be the advantages and disadvantages?
Readings: Groves et al., Chapter 5
Session 8: Telephone Interviews
Class Preparation Questions: If you chose telephone interviewing to collect data, what would be the advantages and disadvantages?
Readings:
    • Groves R (1990). Theories and methods of telephone surveys. Annual Review of Sociology 16: 221-240.
    • Lavrakas PJ, et al. (2007). The state of surveying cell phone numbers in the United States: 2007 and beyond. Public Opinion Quarterly 71: 840-854.
    • Brown D (2009). Cellphones' Growth Does a Number on Health Research. Washington Post, Monday, January 12, 2009, p. A04.
    • Tomlinson M, et al. (2009). The use of mobile phones as a data collection tool: A report from a household survey in South Africa. BMC Medical Informatics and Decision Making 9: 51.
Read one of the following:
    • Blumberg SJ, Luke JV (2007). Coverage Bias in Traditional Telephone Surveys of Low-Income and Young Adults. Public Opinion Quarterly 71(5): 734-749.
    • Houtkoop-Steenstra H, Van den Bergh H (2000). Effects of introductions in large-scale telephone survey interviews. Sociological Methods and Research 28(3): 281-300.
    • Brick JM, Dipko S, Presser S, Tucker C, Yuan Y (2006). Nonresponse Bias in a Dual Frame Sample of Cell and Landline Numbers. Public Opinion Quarterly 70(5): 780-793.
Session 9: Mail, Self-Administered, Web-based Questionnaires
Class Preparation Questions: If you chose mail or web-based surveys to collect data, what would be the advantages and disadvantages?
Readings:
    • Chang L, Krosnick JA (2009). National Surveys via RDD Telephone Interviewing versus the Internet: Comparing Sample Representativeness and Response Quality. Public Opinion Quarterly 73(4): 641-678.
    • Dillman D (1991). The design and administration of mail surveys. Annual Review of Sociology 17: 225-249.
    • Couper M, et al. (2001). Web Survey Design and Administration. Public Opinion Quarterly 65(2): 230-253.
Read one of the following:
    • Fricker S, Galesic M, Tourangeau R, Yan T. An Experimental Comparison of Web and Telephone Surveys. Public Opinion Quarterly 69(3): 370-392.
    • Battaglia MP, et al. (2008). An evaluation of respondent selection methods for household mail surveys. Public Opinion Quarterly 72(3): 459-469.
Session 10: Computer Assisted Interviewing
Class Preparation Questions: If you chose computer-assisted interviewing to collect data, what would be the advantages and disadvantages?
Readings: Tourangeau R, Smith TW (1996). Asking sensitive questions: The impact of data collection mode, question format, and question context. Public Opinion Quarterly 60(2): 275-304.
Read one of the following:
    • Couper MP, Singer E, et al. (2003). Understanding the effects of Audio-CASI on self-reports of sensitive behavior. Public Opinion Quarterly 67(3): 385-395.
    • Perlis T, Des Jarlais D, Friedman S, Arasteh K (2004). Audio-computerized self-interviewing versus face-to-face interviewing for research data collection at drug abuse treatment programs. Addiction 99(7): 885-896.
    • Villarroel MA, Turner CF, Eggleston E, Al-Tayib A, Rogers S, Roman AM, Cooley PC, Gordek H (2006). Same-Gender Sex in the United States: Impact of T-ACASI. Public Opinion Quarterly 70(2): 166-196.
    • Harmon T, Turner CF, Rogers SM, et al. (2009). Impact of T-ACASI on Survey Measurements of Subjective Phenomena. Public Opinion Quarterly 73(2): 255-280.
Session 11: Discussion of Class Participants' Sampling and Data Collection Proposals
Readings: None
Module 3: Constructing the Instrument
Session 12: Instrument Construction: Events
Class Preparation Questions: What will your main outcome and independent variables be? How will your measures be influenced by your data collection strategy and the people you are studying?
Readings: Groves et al., Chapter 7. Read one of the following:
    • Gaskell GD, O'Muircheartaigh CA, et al. (1994). Survey questions about the frequency of vaguely defined events: The effects of response alternatives. Public Opinion Quarterly 58(2): 241-254.
    • Gaskell GD, Wright DB, et al. (2000). Telescoping of landmark events: Implications for survey research. Public Opinion Quarterly 64(1): 77-89.
Session 13: Instrument Construction: Opinions
Class Preparation Questions: What types of variables are your main outcome and independent variables: attitudes, knowledge, behavior, events, demographic characteristics, or something else? What problems may respondents face trying to answer questions about these variables?
Readings: Schaeffer NC, Presser S (2003). The Science of Asking Questions. Annual Review of Sociology, 65-88. Read one of the following:
    • Christian LM, Parsons NL, Dillman DA (2009). Designing Scalar Questions for Web Surveys. Sociological Methods Research 37: 393-425.
    • Krosnick JA, Holbrook AL, et al. (2002). The impact of "No Opinion" response options on data quality: Non-attitude reduction or an invitation to satisfice? Public Opinion Quarterly 66(3): 371-403.
    • Shoemaker P, Eichholz M, Skewes E (2002). Item non-response: Distinguishing between don't know and refuse. International Journal of Public Opinion Research 14(2): 193-201.
    • Streb MJ, et al. (2008). Social Desirability Effects and Support for a Female American President. Public Opinion Quarterly 72(1): 76-89.
    • Schuldt JP, Konrath SH, Schwarz N (2011). "Global Warming" or "Climate Change": Whether the planet is warming depends on question wording. Public Opinion Quarterly 75(1): 115-124.
Session 14: Evaluating Survey Questions
Class Preparation Questions: How will your measures maximize the quality of the information collected?
Readings: Groves et al., Chapter 8.
    • Presser S, Couper M, Lessler J, Martin E, Martin J, Rothgeb J, Singer E (2004). Methods for testing and evaluating survey questions. Public Opinion Quarterly 68(1): 109-130.
    • Mathiowetz NA (1998). Respondent expressions of uncertainty: Data source for imputation. Public Opinion Quarterly 62(1): 47-56.
Session 15: Discussion of Class Participants' Instruments
Readings: None
Session 16: Interview Error
Class Preparation Questions: What management and implementation strategies do you plan to use to ensure the quality of your data?
Readings: Groves et al., Chapter 9.
    • Davis DW (1997). Nonrandom measurement error and race-of-interviewer effects among African Americans. Public Opinion Quarterly 61(1): 183-207.
    • Schober MF, Conrad FG (1997). Does conversational interviewing reduce survey measurement error? Public Opinion Quarterly 61(4): 576-602.
    • Fowler FJ (1991). Reducing interviewer-related error through interviewer training, supervision, and other means. In: Measurement Errors in Surveys. John Wiley and Sons.

Course Curriculum

    • Lecture 1: Overview of Course and the Survey Design Process 00:55:00
    • Lecture 2: Choosing Your Own Survey Research Project 00:55:00
    • Lecture 3: Selecting Target Population and Sampling Frame 00:55:00
    • Lecture 5: Reducing Non-Response 00:55:00
    • Lecture 7: Overview and Face-to-Face Interviewing 00:55:00
    • Lecture 8: Telephone Interviews 00:55:00
    • Lecture 9: Mail, Self-Administered, Web-based Questionnaires 00:55:00
    • Lecture 10: Computer Assisted Interviewing 00:55:00
    • Lecture 12: Instrument Construction: Events 00:55:00
    • Lecture 13: Instrument Construction: Opinions 00:55:00
    • Lecture 16: Interview Error 00:55:00
    • Issues in Survey Research Design 00:30:00