A Blast From the Past: The PACEI “Report” on Vote Fraud (and my reactions at the time)

Some enterprising news organizations have found this blog posting from September 2017: https://electionupdates.caltech.edu/2017/09/11/report-on-voter-fraud-rife-with-inaccuracies/

A bit of nostalgia I would have thought, but apparently Kris Kobach still hasn’t figured this one out.

 

New EVIC Research Director

Election geeks! I’m pleased to announce that Paul Manson, a PhD student and Senior Research Assistant at the Center for Public Service at Mark O. Hatfield School of Government, Portland State University, will be joining EVIC as research director.

Paul is a policy researcher who focuses on public involvement and participation technologies – systems used to collect, manage, and synthesize public perceptions and interests. His research uses these technologies to understand how environmental debates are framed by the application of representational technologies. He has also researched the role of public involvement in resilience and disaster recovery policy efforts in Oregon. Paul manages a series of election research projects with the Center for Public Service, with a focus on vote-at-home methods and voter registration efforts.

Some of you will know Paul as our event organizer for the 2017 Election Science, Reform, and Administration conference. Paul also collaborated on a recent study of the turnout effects of vote by mail in Utah.

I’m thrilled to have Paul on the research team as we move forward on positioning EVIC as a west coast hub for information and research on election sciences and election administration.

Reader Update: 2018 Democracy Fund / Reed College LEO Survey

The Early Voting Information Center at Reed College is honored to have been selected as a research partner for the 2018 Democracy Fund / Reed College Local Election Official Survey. As Natalie Adona of the Fund wrote in April at the Democracy Fund blog:

We have two main motivations for the survey. First, we want to better understand LEOs’ views about the roles, responsibilities, and challenges of their work. By tapping into their experience and deep knowledge of election administration, we hope to uncover new ideas to improve the capacity and quality of elections, and address LEOs’ most urgent needs.

Second, we want to amplify the voices of LEOs in national, regional, and state conversations about election administration, integrity, and reform. Far too often, these conversations don’t consider the “street view” realities of election administration. The insights of LEOs from across the country are vital and should be considered in the national dialogue about improving and securing our elections.

While it is too soon to release detailed findings, we want to share with the elections community some things we’ve learned about surveying local election officials. We will follow up with some (very preliminary) results this week.

  1. Election Officials Support Our Work, But Elections Are Their #1 Priority
    We sent out 3,000 emails in mid-May, inviting LEOs to respond to our brief (10 minute) survey. We immediately got some very pointed comments about the elections calendar in a number of states. Thank you! We recognize that this is a busy time for local election officials. We altered the timing of our outreach so as not to conflict with the primary calendar of any given state. This is a practice we plan to improve upon in the future. We’re thrilled that our response rates so far are approaching 20% for the online survey and over 30% overall including the print survey (more on this below). This compares favorably with the 31% response rate reported by Kimball and Baybeck in their 2013 Election Law Journal paper. (We actually hope to beat the 31% rate.)
  2. You Got (Postal) Mail! LEOs Recognize The Importance of Cybersecurity
    We hoped to complete a substantial portion of our interviews via an online survey platform. However, it soon became apparent that many, many local officials are rightly wary of clicking on email links that they don’t recognize. (The survey link connects to Qualtrics.com, a trusted site but one not well-known to many outside of the survey community.) We’d like to thank those officials who filled out the online survey, but we’ve taken the step of following up with print surveys, and we are, shall we say, overwhelmed with the response from officials. One of the most interesting things we found is that at least a hundred officials, after receiving the print instrument, followed up to say that they will now fill out the online version. Lesson learned!
  3. Election Officials Have Opinions About Improving Elections, So Let’s Ask Them!
    We provided a number of “open-ended” items on the survey, giving election officials a chance to tell us in their own words how elections in the United States can be improved. Over 67% of the respondents so far have written something in the free response category. We asked officials to enter their names and emails if they would be willing to allow us to follow up for elaboration and further comments, and over half have done so.

    This is great news! It shows that LEOs are interested, engaged, and want to share their viewpoints about election reform.

We want to personally thank officials in these states who have responded in high numbers.  The table below is based on our online numbers only. We will update this later in the week, incorporating the print surveys.

Response Rates
State Response Rate
AK 100.0%
OR 68.0%
DE 66.7%
WA 48.1%
OH 38.3%
PA 37.3%
VA 34.6%
NC 32.9%
RI 31.8%
NM 31.6%

We hope many more local officials will find the time to respond over the coming weeks. If you have any questions about the survey, please feel free to email dfrcleosurvey@reed.edu.

CRS Elections Analyst Position

Sounds like a good opportunity!

The Congressional Research Service (CRS) Government and Finance Division is seeking an Analyst in American National Government to analyze public policy issues related to the regulation and administration of elections and voting in the United States. The focus of the Division’s work in this area is on the role played by various institutions, policies, and procedures in shaping electoral processes and practices. The issues may include, but are not limited to, election administration, voter registration and turnout, apportionment and redistricting, voting rights, and other election policies and practices.

https://www.usajobs.gov/GetJob/ViewDetails/481538900/

Voter “fraud” report for Presidential commission rife with errors and inaccuracies

(Crossposted to electionupdates.caltech.edu)

I look forward to a more detailed analysis by voter registration and database match experts of the GAI report that will be presented to the Presidential Advisory Commission on Election Integrity, but even a cursory reading reveals a number of serious misunderstandings and confusions that call into question the authors’ understanding of some of the most basic facts about voter registration, voting, and elections administration in the United States.

Fair warning: I grade student papers as part of my job, and one of the comments I make most often is “be precise”. Categories and definitions are fundamentally important, especially in a highly politicized environment like that currently surrounding American elections.

The GAI report is far from precise; it’s not a stretch to say at many points that it’s sloppy and misinformed. I worry that it’s purposefully misleading. Perhaps I overstate the importance of some of the mistakes below. I leave that for the reader to judge.

  • The report uses an overly broad and inaccurate definition of vote fraud.

American voter lists are designed to tolerate invalid voter registration records, which do not equate to invalid votes, because to do otherwise would lead to eligible voters being prevented from casting legal votes.

But the report follows a very common and misleading attempt to conflate errors in the voter rolls with “voter fraud”. Read their “definition”:

Voter fraud is defined as illegal interference with the process of an election. It can take many forms, including voter impersonation, vote buying, noncitizen voting, dead voters, felon voting, fraudulent addresses, registration fraud, elections officials fraud, and duplicate voting.8

Where did this definition come from? As the source of the definition, they cite the Brennan Center report “The Truth About Voter Fraud” (https://www.brennancenter.org/sites/default/files/legacy/The%20Truth%20About%20Voter%20Fraud.pdf).

However, the Brennan Center authors are very careful to define voter fraud in a way that directly warns against an overly broad and imprecise definition. From pg. 4 of their report:

“Voter fraud” is fraud by voters. More precisely, “voter fraud” occurs when individuals cast ballots despite knowing that they are ineligible to vote, in an attempt to defraud the election system.1

This sounds straightforward. And yet, voter fraud is often conflated, intentionally or unintentionally, with other forms of election misconduct or irregularities.

To be fair to the authors, they do not, in their analysis, conflate situations such as being registered in two places at once with “voter fraud”. But the definition is sloppy, isn’t supported by the report they cite, and reinforces a highly misleading claim that voter registration errors are analogous to voter fraud.

David Becker can describe ad nauseam how damaging this misinterpretation has been.

  • The report makes unsubstantiated claims about the efficacy of Voter ID in preventing voter fraud.

Regardless of how you feel about voter ID, if you are going to claim that voter ID prevents in-person vote fraud, you need to provide actual proof, not just a supposition. The report authors write:

GAI also found several irregularities that increase the potential for voter fraud, such as improper voter registration addresses, erroneous voter roll birthdates, and the lack of definitive identification required to vote.

The key term here is “definitive identification”, a term that appears nowhere in HAVA. The authors either purposely or sloppily misstate the legal requirements of HAVA. On pg. 20 of the report, they write that HAVA has a

“requirement that eligible voters use definitive forms of identification when registering to vote”

The word “definitive” appears again, and a bit later in the paragraph, it appears that a “definitive” ID, according to the authors, is:

“Valid drivers’ license numbers and the last four digits of an individual’s social security number…”,

But not according to HAVA. HAVA requirements are, as stated in the report:

“Alternative forms of identification include state ID cards, passports, military IDs, employee IDs, student IDs, bank statements, utility bills, and pay stubs.”

The rhetorical turn occurs at the end of the paragraph, when the authors conclude that these other forms of ID are:

“less reliable than the driver’s license and social security number standard”

and apparently not “definitive”, and hence prone to fraud.

Surely the authors don’t intend to imply that a passport is “less reliable” than a driver’s license and social security number. In many (most?) states, a “state ID card” is just as reliable as a driver’s license. I’m not familiar with the identification requirements for a military ID—perhaps an expert can help out? [ED NOTE: I am informed by a friend that a civilian ID at the Pentagon requires a retinal scan and fingerprints]—but are military IDs really less “definitive” than a driver’s license?

If you are going to claim that voter fraud is an issue requiring immediate national attention, and that states are not requiring “definitive” IDs, you’d better get some of the most basic details of the most basic laws and procedures correct.

  • The authors claim states did not comply with their data requests, when it appears that state officials were simply following state law

The authors write:

(t)he Help America Vote Act of 2002 mandates that every state maintains a centralized statewide database of voter registrations.14

That’s fine, but the authors seem to think this means that HAVA requires that the states make this information available to researchers at little to no cost. Anyone who has worked in this field knows that many states have laws that restrict this information to registered political entities. Most states restrict the number of data items that can be released in the interests of confidentiality.

Rather than acknowledging that state officials are constrained by state law, the authors claim non-compliance:

In effect, Massachusetts and other states withhold this data from the public.

I can just hear the gnashing of teeth in the 50 state capitols. I am sympathetic with the authors’ difficulties in obtaining statewide voter registration and voter history files. Along with the authors, I would like to see all state files made available to researchers for a low or modest fee.

There is no requirement that the database be made available for an affordable fee, nor that it be available beyond political entities. These choices are left to the states. It is wrong to charge “non-compliance” when an official is following statutes passed by their state legislature.

I don’t know whether the report authors didn’t have subject matter knowledge or were purposefully trying to create a misleading image of non-cooperation with the Commission.

  • The report shows that voter fraud is nearly non-existent, while simultaneously
    claiming the problem requires “immediate attention”.

But let’s return to the bottom line conclusion of the report: voter fraud is pervasive enough to require “immediate attention.” Do their data support this claim?

The most basic calculation would be the rate of “voter fraud” as defined in the report. The 45,000 figure (total potential illegally cast ballots) is highly problematic: it rests on suspect calculations in 21 states, then is imputed to 29 other states without considering even the most basic rules of statistical inference.

Nonetheless, even if you accept the calculation, it translates into a “voter fraud” rate of 0.000324 (45,000 / 139 million), or about three hundredths of one percent.

This is almost exactly the probability that you will be struck by lightning across your whole lifetime (a chance of 1 in 3,000: http://news.nationalgeographic.com/news/2004/06/0623_040623_lightningfacts.html).
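The arithmetic is easy to verify. A quick check using only the figures quoted above (45,000 potential ballots, roughly 139 million votes cast, and the 1-in-3,000 lightning odds) — shown here in Python, though any calculator will do:

```python
# Figures from the GAI report and the National Geographic lightning statistic
potential_fraud = 45_000          # report's total of potentially illegal ballots
ballots_cast = 139_000_000        # approximate turnout used as the denominator

fraud_rate = potential_fraud / ballots_cast
print(f"{fraud_rate:.6f}")        # 0.000324, about three hundredths of one percent

lightning_rate = 1 / 3000         # lifetime odds of being struck by lightning
print(f"{fraud_rate / lightning_rate:.2f}")  # 0.97 -- same order of magnitude
```

The two probabilities differ by only a few percent, which is the point of the comparison.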

I’m not the first one to notice this comparison—see pg. 4 of the Brennan Center report cited above. And here I thought I found something new!


There are many, many experts in election sciences and election administration who could have helped the Commission conduct a careful scientific review of the probability of duplicate registration and duplicate voting. A report written by Lorraine Minnite more than a decade ago lays out precisely the steps that need to be taken to uncover voter fraud and how statewide voter files should be used in this effort. Many others in the field, both those worried about voter fraud and those skeptical of it, have been calling for just such a careful study.

Unfortunately, the Commission instead chose to consult a “consulting firm” with no experience in the field, which in turn relied on database companies that also had no expertise in the area.

I’m sure that other experts will examine in more detail the calculations about duplicate voting. However, at first look, the report fails the smell test. It’s a real stinker.


Paul Gronke
Professor, Reed College
Director, Early Voting Information Center

http://earlyvoting.net

A little election sciences knowledge can go a long way: tooting the horn for Jay Lee

Just wanted to take a chance to toot the horn for Jay Lee, a rising junior Math – Statistics major at Reed College who helped the Open Elections team finish wrangling the precinct level elections results from North Dakota.

Jay got interested in elections work after taking my class on US Elections in the Spring and taking my co-taught Election Sciences course, offered in partnership with Andrew Bray.

Those of you who have been following this blog may remember that Jay, along with Matthew Yancheff, has also released an R package, “RCV”, to help process and report on ranked choice voting results. They worked on the project in our Election Sciences class, and, supported by funding from the College, produced the R package this summer.

#electionsciences2017 starting this week!

The first Election Sciences, Reform, and Administration Conference is happening this week in Portland, OR!

I’d like to thank Phil Keisling and Paul Manson of the Center for Public Service at Portland State University for helping to organize, and the Reed College Department of Political Science, the MIT Election Data and Science Lab, the National Science Foundation, and the Elections Team at the Democracy Fund for making this event possible.

Follow the link above, or point your browser to electionsciences.net for more information.

Ranked Choice Voting Package Available on CRAN

This announcement from Jay Lee, Matthew Yancheff, and Mia Leung, three Reed students who were in the Data and Election Sciences course that I taught along with Prof. Andrew Bray this spring.  They have released the results of their work to CRAN.

Thanks to Rob Richie and Theo Landsman of FairVote for helping push this forward.


Hello,

Just wanted to let you know that the first version of our RCV package is now submitted to CRAN, the R package archive! Going forward we’ll be updating this work, so if you have any comments or bug fixes please feel free to submit a pull request or issue to our GitHub repository, or just email us directly.
I’ve included a few lines of code at the bottom of this email to install the package locally and go through an example election (San Francisco Board of Supervisors, District 7). You’ll need at least version 3.3 of R installed to run these. If you don’t have this installed and don’t want to, some of the examples are available at our GitHub repo (scroll down to the README).
Again, thank you so much for your interest in our work and any help you’ve given us in regards to this project. We look forward to hearing any comments or critiques you might have on your experience using our package.
Thank you,
Jay Lee
Reed College
install.packages("rcv")
library(rcv)
sf_cleaned <- clean_ballot(ballot = sf_bos_ballot, b_header = T, 
                        lookup = sf_bos_lookup, l_header = T, 
                        format = "WinEDS")
results7 <- rcv_tally(sf_cleaned, "Board of Supervisors, District 7")

The results table for this election is stored in the `results7` object. It can be printed in the console with the first line of code provided below, or viewed in the RStudio window with the second line:
results7
View(results7)
We also have a functionality for producing an interactive type of flowchart called a Sankey diagram. This is done with the networkD3 package, which you must install separately to produce the visualization. The code for that is again provided here, but if you don’t want to install it we have an example on our GitHub repo.
install.packages("networkD3")
library(networkD3)
d3_7 <- rcv::make_d3list(results = results7)
networkD3::sankeyNetwork(Links = d3_7$values, Nodes = d3_7$names,
                         Source = "source", Target = "target",
                         Value = "value", NodeID = "candidate", units = "voters",
                         fontSize = 12, nodeWidth = 20)