Measuring partisanship in Europe: How online survey questions compare with phone polls
We compared three different online survey methods in several European countries to see which would most closely replicate our phone results.
In our surveys, people are much less likely to skip questions online than when speaking to interviewers in person or on the phone; we explore how offering a “Don’t know” option in online surveys affects results.
In this piece, we demonstrate how to conduct age-period-cohort analysis, a statistical technique used to isolate the effects of generation.
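For readers who want a sense of what such an analysis can look like in practice, here is a minimal sketch in Python using pandas and statsmodels. The data, variable names and generation cutoffs are invented for illustration and are not the Center's actual code; grouping birth years into generations is one common way to work around the exact collinearity among age, period and cohort.

```python
# Minimal age-period-cohort sketch on synthetic data (illustrative only).
# Because age = period - cohort, the three effects are not separately
# identified when all are measured in the same units; grouping birth
# cohorts into generations (coarser than single years) is one common
# way to break the exact collinearity.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Hypothetical repeated cross-sections: survey year (period) and respondent age.
n = 5000
df = pd.DataFrame({
    "period": rng.choice(np.arange(2000, 2021, 5), size=n),
    "age": rng.integers(18, 80, size=n),
})
df["birth_year"] = df["period"] - df["age"]

# Map birth years to generation labels (cohort groups).
bins = [1900, 1945, 1964, 1980, 1996, 2012]
labels = ["Silent", "Boomer", "Gen X", "Millennial", "Gen Z"]
df["generation"] = pd.cut(df["birth_year"], bins=bins, labels=labels)

# Simulated outcome with small age, period and cohort components plus noise.
gen_effect = df["generation"].cat.codes * 0.05
df["y"] = (0.01 * df["age"]
           + 0.02 * (df["period"] - 2000)
           + gen_effect
           + rng.normal(0, 1, n))

# Linear model with a continuous age term and period and generation dummies.
model = smf.ols("y ~ age + C(period) + C(generation)", data=df).fit()
print(model.summary())
```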
To test whether machine transcription would be practical for our 2019 and 2020 studies of sermons, we compared human and machine transcriptions of snippets from a random sample of 200 audio and video sermons.
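One standard way to quantify the gap between a human transcript and a machine transcript is word error rate (WER), the edit distance between the two word sequences divided by the length of the human reference. The snippet below is an illustrative implementation, not necessarily the measure used in that study.

```python
# Illustrative word error rate (WER) calculation for comparing a human
# ("reference") transcript with a machine ("hypothesis") transcript.
# WER = (substitutions + deletions + insertions) / words in reference,
# computed here with a standard dynamic-programming edit distance.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref = reference.lower().split()
    hyp = hypothesis.lower().split()

    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words.
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(
                dp[i - 1][j] + 1,         # deletion
                dp[i][j - 1] + 1,         # insertion
                dp[i - 1][j - 1] + cost,  # match or substitution
            )
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

# Example: compare one snippet's transcripts (hypothetical text).
human = "blessed are the peacemakers for they shall be called children of God"
machine = "blessed are the peace makers for they shall be called children of god"
print(f"WER: {word_error_rate(human, machine):.2f}")
```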
In 2022, we experimented with a new question in cross-national surveys to capture the international equivalent of U.S. partisan “leaners.”
Demographic characteristics and other factors, such as the devices that respondents use to take surveys, are tied to Americans’ willingness to engage with open-ended questions.
Given the complexities of geopolitics, how might wording affect responses to a question about a hypothetical conflict between China and Taiwan?
While there is no magic length that an online survey should be, Pew Research Center caps the length of its online American Trends Panel surveys at 15 minutes.
In our March 2021 survey, we decided to take a fresh look at the consent language we used when asking Americans to give us their Twitter handles.
Having a sample of adult Twitter users allows researchers to filter out bots, minors, institutional accounts and international users.