Michael Hanmer, Antoine Banks, and Ismail White have a new paper in Political Analysis that returns to a longstanding problem in voting and survey research: overreporting bias among survey respondents.
From the abstract:
Voting is a fundamental part of any democratic society. But survey-based measures of voting are problematic because a substantial proportion of nonvoters report that they voted. This over-reporting has consequences for our understanding of voting as well as the behaviors and attitudes associated with voting. Relying on the “bogus pipeline” approach, we investigate whether altering the wording of the turnout question can cause respondents to provide more accurate responses. We attempt to reduce over-reporting simply by changing the wording of the vote question by highlighting to the respondent that: (1) we can in fact find out, via public records, whether or not they voted; and (2) we (survey administrators) know some people who say they voted did not. We examine these questions through a survey on US voting-age citizens after the 2010 midterm elections, in which we ask them about voting in those elections. Our evidence shows that the question noting we would check the records improved the accuracy of the reports by reducing the over-reporting of turnout.
What is neat about this paper is that the authors suggest a relatively simple way to reduce (but not eliminate; see the attached graphic) the bias.
It’s also notable that the research comes out of TESS (Time-sharing Experiments for the Social Sciences), an innovative and low-cost project funded by the Political Science Program of the National Science Foundation (Congress: are you listening?).
Just kidding.
Doug Chapin at the Election Academy highlights a report out of Ohio showing that, of the 210 cases described, 40 (all from Franklin County) were “referred for more investigation,” and only two resulted in any prosecution: one of a man who voted for President in another state but on local initiatives in Ohio, and a second of a petition gatherer who falsified names on a petition.
The latter case, of course, does not constitute voting fraud.
The result is that 1 of 210 cases, or roughly half of one percent, constituted actual voter fraud. Zero cases of voter impersonation at the polls. Zero cases of illegal immigrants voting. Zero cases of organized voter fraud at all. As one Republican county prosecutor put it: “There’s a couple of isolated incidents of people making bone-headed decisions.”
I don’t expect to see many news stories helping to educate skeptical Americans that vote fraud is not, in fact, rampant in Ohio or in other states.
Links are courtesy of the Public Policy Institute of California, an active policy research shop in Sacramento. PPIC has a broad portfolio that includes high quality work on elections, election administration, and voter turnout.
- January 23: California’s Future: Voter Turnout.
  DATE: January 23, 2014
  TIME: 12:00 to 1:30 p.m. (registration 11:45 a.m.)
  LOCATION: CSAC Conference Center, 1020 11th Street, second floor, Sacramento, CA
- Brief video summarizing PPIC research findings on California voter turnout at YouTube.
This looks like a nice effort by Susan, Claire, and others at US Votes and the Overseas Vote Foundation:
https://www.overseasvotefoundation.org/files/OVF_research_newsletter_vol4_issue2_winter2014.pdf
The Current Population Survey’s Voting and Registration Supplement is the gold standard for understanding voter turnout in the United States. The study is the largest ongoing survey of voting participation in the country, and is not only used by political scientists, election lawyers, and civil rights advocates, but also cited by Supreme Court Justices.
Michael McDonald of the United States Elections Project has been warning for years that CPS turnout estimates were beginning to deviate in worrisome ways from data collected from exit polls, validated surveys, and official election returns.
New research in the Public Opinion Quarterly by Aram Hur and Christopher Achen validates McDonald’s claims.
From the abstract:
The Voting and Registration Supplement to the Current Population Survey (CPS) employs a large sample size and has a very high response rate, and thus is often regarded as the gold standard among turnout surveys. In 2008, however, the CPS inaccurately estimated that presidential turnout had undergone a small decrease from 2004. We show that growing nonresponse plus a long-standing but idiosyncratic Census coding decision was responsible. We suggest that to cope with nonresponse and overreporting, users of the Voting Supplement sample should weight it to reflect actual state vote counts.
Important reading for anyone who uses the CPS.
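Hur and Achen’s suggested fix, reweighting the Voting Supplement so that weighted self-reported turnout matches actual state vote counts, amounts to a simple ratio adjustment. A minimal sketch, using made-up numbers and my own function and variable names (not actual CPS data or Census code):

```python
# Adjust survey weights so that, within each state, the weighted count of
# self-reported voters equals the official ballots-cast total.
# All figures below are illustrative, not actual CPS or election data.

def turnout_adjusted_weights(records, official_votes):
    """records: list of (state, base_weight, reported_voted) tuples.
    official_votes: dict mapping state -> official ballots cast.
    Returns new weights; self-reported voters are scaled by the state's
    correction ratio, while nonvoters keep their base weight in this
    simple version."""
    # Weighted count of self-reported voters per state
    reported = {}
    for state, w, voted in records:
        if voted:
            reported[state] = reported.get(state, 0.0) + w

    adjusted = []
    for state, w, voted in records:
        if voted:
            ratio = official_votes[state] / reported[state]
            adjusted.append(w * ratio)
        else:
            adjusted.append(w)
    return adjusted

# Toy example: the survey overstates turnout in "PA"
records = [("PA", 100.0, True), ("PA", 100.0, True), ("PA", 100.0, False)]
weights = turnout_adjusted_weights(records, {"PA": 150.0})
print(weights)  # voters' weights shrink from 100 to 75 each
```

In practice the full correction also has to deal with nonresponse coding (the Census decision Hur and Achen discuss), but the core move is this state-by-state rescaling.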
I just received an interesting set of proposals for improving election administration in California, courtesy of California Forward (I have no affiliation with this organization, but the leadership appears to be non-partisan).
Among the ideas they support:
- California Forward Action Fund (CFAF) supported Assemblymember Mullin’s AB 1135 which expands tools that are used to verify signatures on vote-by-mail ballots.
- Senator Padilla’s SB 360 will enable California to move forward with the development of new voting systems that reflect today’s electorate.
- The group is called Future of California Elections (FOCE). California Forward is a member of this group because, in its words, modernizing our elections system is a cornerstone critical to restoring a vibrant and responsive democracy in California.
- California was the first in the country to designate the state’s Health Benefit Exchange as a voter registration agency under the National Voter Registration Act (NVRA).
- The League of Women Voters of California is leading a study to develop a Best Practices Manual for Official Voter Information Guides (http://ca.lwv.org/announcement/2013/dec/open-call-voter-information-guides).
A full list of their ideas and proposals is here: http://www.cafwd.org/reporting/entry/year-in-review-voting-and-elections
An interesting new article by Keith Bentele and Erin O’Brien at the University of Massachusetts, Boston came out in the December 2013 Perspectives on Politics. The article, titled “Jim Crow 2.0? Why States Consider and Adopt Restrictive Voter Access Policies,” should be of interest to everyone on the political science, law, and policy side of election administration. (Hat tip to the Monkey Cage, which features a guest post by the authors.)
Recent years have seen a dramatic increase in state legislation likely to reduce access for some voters, including photo identification and proof of citizenship requirements, registration restrictions, absentee ballot voting restrictions, and reductions in early voting. Political operatives often ascribe malicious motives when their opponents either endorse or oppose such legislation. In an effort to bring empirical clarity and epistemological standards to what has been a deeply-charged, partisan, and frequently anecdotal debate, we use multiple specialized regression approaches to examine factors associated with both the proposal and adoption of restrictive voter access legislation from 2006–2011. Our results indicate that proposal and passage are highly partisan, strategic, and racialized affairs. These findings are consistent with a scenario in which the targeted demobilization of minority voters and African Americans is a central driver of recent legislative developments. We discuss the implications of these results for current partisan and legal debates regarding voter restrictions and our understanding of the conditions incentivizing modern suppression efforts. Further, we situate these policies within developments in social welfare and criminal justice policy that collectively reduce electoral access among the socially marginalized.
Courtesy of Pennsylvania DMV
I’m often asked, particularly by junior faculty, about how social scientists get involved in litigation. A good way to learn how statistical reasoning gets used in legal cases is to review cases. The “All About Redistricting” website maintained by Justin Levitt at Loyola Law School, and the “litigation page” at the Moritz College of Law at The Ohio State University, for example, are treasure troves of case materials.
The recent Pennsylvania trial court ruling striking down the state’s voter ID law is only the most recent instance where a court relied heavily on evidence produced by social scientists and statisticians. (All the page numbers referenced below refer to this decision.)
The full set of documents pertaining to the case can be found at the Moritz site (unfortunately many of the documents are low quality scans and can’t be searched).
The Findings of Fact are a good place to start (pg. 54 of the decision) because they summarize the sources of the evidence submitted to the court and can often be used to quickly identify expert witnesses.
The Moritz site is pretty comprehensive for this case, including most of the expert witness reports. The social scientists used in the case were:
Aspiring expert witnesses can learn at least two lessons from this case.
Learn the tools: Siskin’s report is a virtual manual for matching complex databases to estimate racial and ethnic disparities. A key piece of evidence was Siskin’s estimate of the number of PA citizens who did not have valid photo IDs. The work involved fuzzy matching of Penn DOT and voter registration databases (pg. 62 “Scope of Need”; pg. 17 of the expert witness report); he used the “BISG” methodology to estimate racial disparities even though his data sources did not contain racial or ethnic identifiers (pg. 20 of the report); and he relied on OpenStreetMap data to estimate drive times for residents without IDs to the closest driver’s license office (pg. 27 of the report).
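The BISG approach mentioned above (Bayesian Improved Surname Geocoding) combines Census surname race probabilities with the racial composition of a voter’s neighborhood via Bayes’ rule. A toy sketch with invented probabilities (the real method draws P(race | surname) from the Census surname list and uses block-group population counts; here the neighborhood composition stands in for the geographic likelihood):

```python
# Toy BISG posterior:
#   P(race | surname, geography) ∝ P(race | surname) * P(geography | race)
# Here the block group's racial composition is used as a stand-in for the
# geographic likelihood term. All probabilities below are illustrative.

def bisg(surname_probs, geo_composition):
    """surname_probs: P(race | surname), e.g. from the Census surname list.
    geo_composition: racial shares of the voter's block group.
    Returns the normalized posterior P(race | surname, geography)."""
    unnormalized = {race: surname_probs[race] * geo_composition[race]
                    for race in surname_probs}
    total = sum(unnormalized.values())
    return {race: p / total for race, p in unnormalized.items()}

# Hypothetical surname with mixed race probabilities, for a voter living
# in a predominantly Hispanic block group.
surname_probs = {"white": 0.30, "black": 0.10, "hispanic": 0.60}
geo = {"white": 0.20, "black": 0.10, "hispanic": 0.70}
posterior = bisg(surname_probs, geo)
# The posterior sharpens toward "hispanic" relative to the surname alone.
```

Aggregating these individual posteriors across a matched file is what lets an expert estimate racial disparities in ID possession even when neither database records race.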
You don’t necessarily need to use the most advanced technology (Siskin uses SPSS for all of his statistical estimation), but your methodology must be scientifically sound.
Honor scientific standards of evidence: Wecker was hired by the defendants solely to, in the words of the Court, “refute Dr. Siskin’s work.” The court’s treatment of Wecker’s evidence illustrates what happens when testimony can be criticized for not following conventional scientific practice: the court refers to it as “flawed and assumption laden.”
Compare this to the court’s treatment of Marker, and by implication Barreto, both of whom followed valid scientific standards.
A nice introduction to expert witness work was penned by Dick Engstrom and Mike McDonald in 2011. It’s a nice exercise to read Engstrom and McDonald’s useful essay and then review expert witness reports in this and other cases.