“Reconciling Public Participation Rate Differences in Census Bureau versus Academic/Commercial Survey Estimates” - The Survey of Public Participation in the Arts (SPPA) was established in 1982 to provide accurate benchmarks of Americans’ participation in the arts. When the 1997 SPPA was conducted by a commercial survey firm (Westat) rather than the US Bureau of the Census, the arts participation estimates were notably higher than in the previous three surveys conducted by Census in 1982, 1985, and 1987 (as well as in the subsequent SPPA surveys conducted by Census in 2002 and 2008). Depending on the arts activity, the Census figures were up to 13 percentage points lower than those from Westat and other surveys. This research explores several possible explanations for these higher figures. This paper was presented at the May 2014 American Association for Public Opinion Research conference in Anaheim, California. If you would like to download the slides from this presentation, click HERE!
“When proxy interviews are acceptable: Does it help to speak with the spouse or partner?” - In most situations you want to question your targeted respondent directly; however, many surveys allow a knowledgeable person to answer on behalf of others. There are three main reasons why surveys accept proxy responses. First, there is proxy out of necessity, used when the individuals you would like to interview cannot be reached or are unable to respond on their own behalf because they are too young or too old or because of a physical or mental condition. Second, there is proxy to increase the efficiency of a survey, which occurs when you accept proxy responses for people capable of providing their own information in order to save time and money. Third, there is proxy to improve the quality of the data, used in studies where it is believed that proxy information would be as good as or perhaps even better than the information you would obtain from a direct interview. We know that in general proxy responses have the potential to be less accurate, but less is known about the degree of proxy reporting inaccuracies. This presentation looks at recent survey studies to explore proxy out of necessity, proxy used to increase efficiency, and proxy reporting by design, to see what information can be reliably collected via proxy and whether spouses or partners are generally good proxy respondents. This research was first presented at the May 2012 International Field Director's and Technology Conference in Orlando, Florida, and later presented at the 2014 AAPOR conference in Anaheim, California. If you would like to download the slides from this presentation, click HERE!
“Alternate Strategies for Obtaining Cell Phone Samples: Preliminary Comparisons of Cell Phone Respondents Attained from RDD and ABS Samples in Massachusetts” - It is now widely agreed that a telephone survey relying entirely on a landline random digit dial (RDD) sample for a general population study is likely to miss a majority of potential respondents. This research was presented at the May 2011 AAPOR conference held in Phoenix, Arizona. The purpose of this research was not to further investigate the decline of landline RDD, but rather to compare two promising alternative solutions. The first solution is to select a sample using the United States Postal Service (USPS) delivery sequence file (DSF). This sampling method is referred to as address-based (AB) sampling and is becoming popular because virtually all households have an address and almost all households receive mail from the USPS. Using reverse-matching databases, telephone numbers can be obtained for many addresses so that a telephone survey can still be conducted, but AB sampling also requires a multi-mode survey approach in order to reach people in households for which phone numbers cannot be found (see the sketch below). The second solution is to conduct the telephone survey using both a landline RDD sampling frame and a cell phone RDD sampling frame.
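As a rough illustration of the reverse-matching step just described, the following Python sketch routes sampled addresses by match status. The address, phone number, and routing rules here are hypothetical; the actual matching services and multi-mode protocol used in the Massachusetts study are not documented on this page.

    # A minimal sketch, assuming a simple address-to-phone lookup:
    # matched addresses get the phone protocol, unmatched addresses
    # fall into the multi-mode (mail/web) protocol. Illustrative only.
    def assign_mode(address, phone_lookup):
        phone = phone_lookup.get(address)
        return ("phone", phone) if phone else ("mail_or_web", None)

    lookup = {"12 Elm St, Boston MA": "617-555-0123"}  # hypothetical match file
    print(assign_mode("12 Elm St, Boston MA", lookup))     # ('phone', '617-555-0123')
    print(assign_mode("9 Oak Ave, Worcester MA", lookup))  # ('mail_or_web', None)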
“Survey Design of the Study of Congregational Giving for International Development & Relief” - This methods research was presented at the 2010 Association for Research on Nonprofit Organizations and Voluntary Action (ARNOVA) conference. The presentation described the methodology and key findings from the 2009 survey of U.S. congregations. The main goal of this nationally representative survey was to estimate how much international relief is given, both directly and indirectly, by congregations of all sizes and denominations. In addition to describing how much is given and who is providing international assistance, the presentation also provided information on where the money is being sent. If you would like to download the slides from this presentation, click HERE!
Papers and Conference Presentations 2000 to 2009:
“Comparing Random Digit Dial (RDD) and United States Postal Service (USPS) Address-Based Sample Designs for a General Population Survey: The 2008 Massachusetts Health Insurance Survey” - This paper was presented at the 2009 AAPOR conference in Hollywood, Florida. It compares respondents from a Random Digit Dial (RDD) sample design with respondents from a United States Postal Service (USPS) address list sample design for a general population study conducted to gather information on health insurance coverage in Massachusetts. The research provides insight into the coverage and cost/benefit trade-offs that researchers can expect from RDD sample designs that conduct all interviews over the phone versus a full or combined USPS address sample design that utilizes a multi-mode (phone, web, mail) data collection approach.
"Can Your Spouse Accurately Report Your Activities? An Examination of Proxy Reporting" - This research was presented at the May, 2009 AAPOR conference held in Hollywood, Florida. The 2008 Survey of Public Participation in the Arts (SPPA) is sponsored by the National Endowment for the Arts and was conducted as a supplement to the May 2008 Current Population Survey (CPS). A big challenge in being a supplement to the CPS is that respondent selection procedures for the SPPA supplement differ from the CPS procedures. The CPS is administered to any person 16 or older who can report employment information on all persons 16 years or older in the household. While the SPPA collects information on all adults 18 or older, it is felt that many of the questions on the survey require self-reports rather than proxy reports. In 2002, the Census Bureau interviewers attempted to complete the SPPA supplement with all persons 18 or older, but after 4 call attempts, they accepted proxy reports. To make the SPPA a better fit for the CPS protocol, rather than attempt to interview all adults in the household , the 2008 SPPA accepted proxy responses for spouses or partners (for many of the questions). This change in design makes it much easier to measure the impact of proxy reports, given that they were collected by design rather than out of necessity. This research explores the extent to which proxy reporting may have resulted in over or under reporting participation. This paper has been published in the online survey research journal Survey Practice
"Estimates of the Uninsurance Rate in Massachusetts from Survey Data : Why Are They So Different?" - Researchers from the Urban Institute and the State of Massachusetts explored why existing surveys generate very different estimates of the uninsurance rate in Massachusetts. The surveys they examined are the Current Population Survey (CPS), the Behavioral Risk Factor Surveillance System (BRFSS), the Massachusetts Health Insurance Survey, and the Massachusetts Health Reform Survey (MHRS). This brief described how estimates may vary because of differences in the wording of the insurance questions asked in the surveys, differences in question placement and context within the survey, differences in survey design and fielding strategies, differences in accounting for missing data and other data preparation, and differences in survey fielding time frames. The analysis concludes that there has been no single survey in Massachusetts that is clearly superior across all of these important dimensions. This paper was revised in August 2008, if you want to download this paper click HERE!
"Finding low-income telephone households and people who do not have health insurance using auxiliary sample frame information for a random digit dial survey" - This research was presented at the May, 2007 American Association for Public Opinion Research Conference held in Anaheim, California and also at the August, 2007 DC-AAPOR seminar in Washington, D.C. This paper describes the results of oversampling low-income areas in Massachusetts by separating telephone numbers into high, medium and low-income strata based on census tract information for each telephone exchange’s 100 banks of telephone numbers. If you would like to download the slides from this presentation click HERE!
"Using an E-Mail Invitation to Screen Survey Respondents" - This research was presented at the May, 2004 American Association for Public Opinion Research Conference held in Phoenix, Arizona. Internet surveys can be designed so that the respondent can simply click on a link that indicates that they do not want to fill out the survey. The link could be embedded in the e-mail invite or for on-line invite surveys they could be included on the invite page. The decline option would be appropriate for those respondents that are not actually the end-user and, thus cannot answer most of the questions. This option can potentially improve your response rate estimate as well as provide additional information about your respondents. However, there is the concern that the decline option would provide an easy out for legitimate respondents. This research analyzes the effect the decline option had on the response rate and survey responses. If you want to download this paper click HERE!
"Double Sampling: A Method for Reducing Interviewing Cost " - This research was presented at the March, 2004 Council for Marketing and Opinion Research (CMOR) respondent cooperation workshop held in Las Vegas, Nevada. While we are currently seeing a growing demand for survey data, we seemed to be faced with reduction in respondents' willingness to supply survey data. Are there new methods of collecting survey data from respondents that have the potential of improving the willingness of people to provide data? At this time there does not appear to be any method that can fix the problems of a much more skeptical public that is reluctant to answer questions via any survey mode. However, we can look to established methods that we are not taking enough advantage of, such as sampling nonrespondents. Sampling nonrespondents saves money that can be reallocated to the overall sample size or towards more effective refusal conversion. Sampling nonrespondents also boosts morale as interviewers have to deal with less negative respondents. In addition, the staffing of interviews becomes easier since strong interviewers who often are used to convert refusals will have more time to work on other parts of the sample. The downside is that you will need to include a weighting adjustment that will increase the variance of your survey estimates due to having sampled nonrespondents. However, with the increasing percentage of respondents refusing and the growing costs of interviewing difficult respondents the likelihood that a survey will benefit for sampling nonrespondents is increasing. If you would like to download this paper click HERE!
"Determining the Probability of Selection for a Telephone Household in a Random Digit Dial Sample Design is Becoming Increasingly More Difficult" This research was presented at the May, 2003 American Association for Public Opinion Research Conference held in Nashville, Tennessee. For many years, researchers using a RDD sample design could estimate the total number of residential telephone numbers in a household by simply asking one, sometimes two, and at most three questions. The 2002 National Survey of America's Families (NSAF) is a telephone survey that relies primarily on a large RDD sample design using over 400,000 telephone numbers. For the 2002 study a more complex set of questions was asked of each household which included learning more about what these additional phone numbers were being used for. This paper compares the results of these questions with other large RDD studies, with previous rounds of NSAF, and discusses the impact these questions have on the probability of selection adjustments. If you want to download this paper click HERE!
"Comparing Incentives at Initial and Refusal Conversion Stages On a Screening Interview for a Random Digit Dial Survey" This research was presented at the May, 2003 American Association for Public Opinion Research Conference held in Nashville, Tennessee. This paper describes the results of an experiment that tested the use of pre-paid incentives to increase response rates at the initial household contact on a random digit dial (RDD) survey. The experiments were conducted as part of the National Survey of America's Families (NSAF), a large RDD effort, sponsored by a number of private foundations to assess the impact of changes in federal policy on social programs. The goal of the experiment was to assess the relative effects of sending money prior to the initial interview or at the refusal conversion stage. Sending money prior to the initial call should increase the initial cooperation rate and thereby reduce the amount of time spend converting refusals. Conversely, waiting to send money at refusal conversion may more effectively target those persons for whom an incentive will make the biggest difference.
"Success and Failures of Various Approaches People Have Been Using to Try and Maintain Decent Response Rates" This panel discussion held at the May, 2003 International Field Director's and Technology Conference in Nashville, Tennessee. This expert panel discussion was put together to discuss the problems of survey nonresponse. Special emphasis being placed on finding consensus of what things seem to work as well as what does not work.
"Using a Short Follow-up Survey to Compare Respondents and Nonrespondents" This research paper analyzes the potential for nonresponse bias in the 1999 National Survey of America's Families (NSAF) survey. The NSAF is primarily a random digit dial (RDD) telephone survey, consisting of a short screener interview to determine household eligibility and a longer extended interview during which survey items of interest are gathered for sampled household members. In order to examine the potential for nonresponse bias, a follow-up survey of a sample of respondents and refusals from the NSAF screener interview was conducted by a different survey organization than the one which conducted the main survey. The follow-up survey contained key items from the main survey, which were used to examine differences between respondents and nonrespondents on these measures. In addition, the follow-up survey also contained questions on subjects thought to be correlated with willingness to participate in a survey, such as attitudes towards surveys and government, and perceptions of being busy. This research was presented at the August, 2002 Joint Statistical Meeting held in New York City. If you want to download this paper click HERE!
"Effects on Survey Estimates from Reducing Nonresponse in the National Survey of America's Families" This research was presented at the May, 2002 American Association for Public Opinion Research Conference held in St, Petersburg, Florida. This poster session showed the results of research conducted to analyze the effects of the extensive efforts to reduce potential nonresponse bias in NSAF survey estimates.
"Collecting Time Diary Data Using a Web Survey - Does it Produce Similar Results?" This research was presented at the May, 2002 International Field Director's and Technology Conference in Clearwater, Florida. Within minutes the time diary information that is entered can be converted into the traditional minutes per day data file. However, how does this data compare with traditional paper diary forms or telephone diaries that researchers have traditionally used? This presentation evaluated both the advantages and disadvantages of the web diary, using student diary data for comparison purposes. Click "HERE" for a one page summary of the findings.
"What is Gained from Additional Call Attempts & Refusal Conversion and What are the Cost Implications?" (Updated November 2002) This paper is the latest summary of my research findings in the area of call attempts and refusal conversion. This research uses data from more than 20 random digit dialing studies that have been conducted at the Survey Research Center over the past 10 years. Research looks closely at call attempt patterns, outcomes of call attempts, and response rates. The research is useful in making informed decisions on when is the best time to call certain types of households, how useful is refusal conversion, and how to best design an auto scheduler. I have done many Conference presentations on optimizing call attempts using results from this research. If you want to download this paper click HERE!
"How Long Should You Wait Before Trying to Convert a Refusal?" How long should you wait before attempting to convert a telephone refusal? Often you hear the argument that you should allow a cool down period of a few weeks. However, project schedules often force us to make refusal conversion well before this two to three week period. Perhaps more importantly, there is not any real quantitative evidence that two or three weeks is necessary to improve the chances of successful refusal conversion. This presentation was done at the May, 2001 American Association for Public Opinion Research Conference in Montreal, Canada. If you want to download this paper click HERE!
"Comparing An E-mail Survey With A Web Survey" This presentation compared a government employee satisfaction survey conducted using e-mail a few years ago and most recently conducted using a Web form. The presentation discussed advantages and disadvantages of both modes of data collection with recommendations for future employee satisfaction surveys. This presentation was done at the May, 2001 International Field Technology Conference in Montreal, Canada.
"Internet Data Collection - What Have We Learned and What Do We Do Next?" This presentation reviewed previous e-mail studies conducted at the University of Maryland Survey Research Center. After this historical review the presentation went on to discuss the University of Maryland's current and future plans for implementing e-mail and web surveys. This presentation was done at the May, 2000 International Field Technology Conference in Portland, Oregon.
Papers and Conference Presentations 1982 to 1999:
"A Comparison of Mail and E-Mail for a Survey of Employees in U.S. Statistical Agencies" Published in the Journal of Official Statistics March 1999 (with Mick P. Couper and Johnny Blair). This research described the procedures and difficulties that occurred in using E-mail for collecting data from a large sample of Federal employees. In addition, the paper analyzed the differences between e-mail data collection and traditional mail data collection. If you want to download this paper click HERE!
"A Transition from Paper Training Manuals to On-Line Training Manuals" This presentation demonstrated how the Survey Research Center has begun the process of implementing the use of HTML formatted telephone supervisor manuals. This presentation was done at the May, 1999 International Field Technology Conference in St. Petersburg, Florida.
"Changing Patterns of Telephone Call Attempts" Organized, presented and chaired this panel discussion on the changing pattern of telephone call attempts in RDD studies and how these changes are affecting budgets, response rates and auto-scheduling algorithms. This panel was held at the May, 1999 American Association for Public Opinion Research in St. Petersburg, Florida. The Panelist were Marla Cralley, The Arbitron Company; Michael Link, Research Triangle Institute; Tom Piazza, University of California, Berkeley; Timothy Triplett, University of Maryland; and Mike Weeks, Research Triangle Institute
"Using Groupware to Improve Questionnaire Design" This presentation outlined and demonstrated how the Survey Research Center uses Lotus Notes to improve questionnaire design. This presentation was done at the May, 1998 International Field Technology Conference in St. Louis, Missouri.
"Results From the CASES Users Survey" This was a presentation of the final results from the electronic mail questionnaire that was sent out to many of the organizations who use the CASES software. This presentation was done at the May, 1997 Field Technology Conference.
"To Minimize Call Attempts: How Many Times Should a Phone Number be Tried" This was a presentation done at the May, 1997 International Field Directors Conference held in Norfolk, Virginia. This presentation used results from a number of survey projects to determine what strategy would minimize the call attempts needed to complete interviews. In addition, the presentation went over how this strategy would effect the final sample distribution.
"Trials and Tribulations: Using E-mail for Conducting Surveys" This was a presentation done at the May, 1997 AAPOR conference held in Norfolk, Virginia. Presentation described the procedures and difficulties that occurred in using E-mail for collecting data from a large sample of Federal employees. This presentation was also done at the May, 1997 International Field Technology Conference also held in Norfolk Virginia.
"A Study of Residents and Employer Attitudes and Awareness Concerning Air Quality in the Washington D.C. and Baltimore, MD Metropolitan Areas" Report prepared for the Baltimore and Washington Metropolitan Council of governments. Report summarized the findings of an ozone awareness survey conducted by the University of Maryland's Survey Research Center (with Clifford Fox) (March 1997)
"Initial Cooperators versus Converted Refusals are there Differences in Data Quality?" American Statistical Association 1996 Proceedings of the Section on Survey Research Methods, Volume II (Timothy Triplett, Johnny Blair, Teresa Hamilton, Yun Chiao Kang). This paper compares the quality of survey data provided by respondents who initially refused to participate with those respondents who never refused to participate.
"Using a parallel CASES instrument to edit call record information and remove incorrect data ( A description of the SRC fixit program)" This paper was first presented at the March 1996 Data Editing Workshop and Exposition held at the Bureau of Labor Statistics. Also Presented at the May 1996 International Field Technology conference held in Salt Lake City, Utah. The presentation described both how the fix-it program works and how other organizations could easily develop their own in-house editing program.
"Initial Cooperators versus Converted Refusals are there Differences in Data Quality?" First Presented at the 1995 International Field Directors/Technologies Conference. The presentation was revised then presented at the May 1996 AAPOR (American Association of Public Opinion Research) in Salt Lake City, Utah. This presentation showed that respondents who initially refused but eventually agree to complete an interview provide less information than those who cooperate from the start.
"Call Attempts and Refusal Conversion Cost Analysis" Presented at the 1994 International Field Directors/Technologies Conference. This paper focused on the costs of making the extra effort. This extra effort being measured in terms of call attempts and refusal conversion.
"Design and Implementation of a Survey Cost Information Data Base" Presented at the 1994 International Field Directors/Technologies Conference. This paper explained the procedures and design plans for the comprehensive management information system that is currently in use at the Survey Research Center. Both the objectives of tracking costs and survey performance for all projects and technical programming issues were discussed in this presentation.
"How Important are Additional Call Attempts" A paper presented at the 1993 Field Directors Conference. This paper examined both the benefits and costs of making additional call attempts and attempting to complete interviews with respondents who initially refused to participate. In addition to presenting this paper, served as chair for the session on refusal conversion, in which this paper was presented.
"Automated Management of Two Stage RDD" A paper presented at the 1993 International Field Director's and Technology Conference. This paper describes the programs developed to improve the efficiency in releasing and managing a two-stage RDD sample with replacement.
"Survey Sponsorship, Response Rates, and Response Effects" (Stanley Presser, Johnny Blair and Timothy Triplett) Social Science Quarterly September 1992. This article looks at the effects on people's opinions depending who they are told is sponsoring the survey.
"Report on the Mean Call Attempts to Complete an Interview" A paper presented at the 1992 Field Directors Conference. The paper measures the difficulty of reaching various demographic groups in terms of number of call attempts needed to complete an interview.
"An Alternative Respondent Selection Process for Random Digit Dialing Surveys" A paper presented at the 1989 Field Directors Conference. The paper documents the respondent selection process used on many of the surveys conducted at the University of Maryland's Survey Research Center.
"Activity Pattern Differences Between Telephone and Non-Telephone Households" Presented at the International Conference on Telephone Survey Methodology, Charlotte, North Carolina, 1987. The paper looked at time diary records of houses with and without telephones. (with John P. Robinson)
"The Flotilla Entrants" Cuban Studies, Volume 1, January, 1982 (with Robert Bach). This article provides a description of the Cuban entrants based on demographic data collected at the time of entry into the country.