Some enterprising news organizations have found this blog posting from September 2017: https://electionupdates.caltech.edu/2017/09/11/report-on-voter-fraud-rife-with-inaccuracies/
A bit of nostalgia I would have thought, but apparently Kris Kobach still hasn’t figured this one out.
Election geeks! I’m pleased to announce that Paul Manson, a PhD student and Senior Research Assistant at the Center for Public Service at Mark O. Hatfield School of Government, Portland State University, will be joining EVIC as research director.
Paul is a policy researcher who focuses on public involvement and participation technologies – systems used to collect, manage, and synthesize public perceptions and interests. His research uses these technologies to understand how environmental debates are framed by the application of representational technologies. He has also researched the role of public involvement in resilience and disaster recovery policy efforts in Oregon. Paul manages a series of election research projects with the Center for Public Service, with a focus on vote at home methods and voter registration efforts.
Some of you will know Paul as our event organizer for the 2017 Election Science, Reform, and Administration conference. Paul also collaborated on a recent study of the turnout effects of vote by mail in Utah.
I’m thrilled to have Paul on the research team as we move forward on positioning EVIC as a west coast hub for information and research on election sciences and election administration.
The Early Voting Information Center at Reed College is honored to have been selected as a research partner for the 2018 Democracy Fund / Reed College Local Election Official Survey. As Natalie Adona of the Fund wrote in April at the Democracy Fund blog:
We have two main motivations for the survey. First, we want to better understand LEOs’ views about the roles, responsibilities, and challenges of their work. By tapping into their experience and deep knowledge of election administration, we hope to uncover new ideas to improve the capacity and quality of elections, and address LEOs’ most urgent needs.
Second, we want to amplify the voices of LEOs in national, regional, and state conversations about election administration, integrity, and reform. Far too often, these conversations don’t consider the “street view” realities of election administration. The insights of LEOs from across the country are vital and should be considered in the national dialogue about improving and securing our elections.
While it is too soon to release detailed findings, we want to share with the elections community some things we’ve learned about surveying local election officials. We will follow up with some (very preliminary) results this week.
- Election Officials Support Our Work, But Elections Are Their #1 Priority
We sent out 3,000 emails in mid-May, inviting LEOs to respond to our brief (10 minute) survey. We immediately got some very pointed comments about the elections calendar in a number of states. Thank you! We recognize that this is a busy time for local election officials. We altered the timing of our outreach so as not to conflict with the primary calendar of any given state. This is a practice we plan to improve upon in the future. We’re thrilled that our response rates so far are approaching 20% for the online survey and over 30% overall, including the print survey (more on this below). This compares favorably with the 31% response rate reported by Kimball and Baybeck in their 2013 Election Law Journal paper. (We actually hope to beat the 31% rate.)
- You Got (Postal) Mail! LEOs Recognize The Importance of Cybersecurity
We hoped to complete a substantial portion of our interviews via an online survey platform. However, it soon became apparent that many, many local officials are rightly wary of clicking on email links that they don’t recognize. (The survey link connects to Qualtrics.com, a trusted site but one not well known to many outside of the survey community.) We’d like to thank those officials who filled out the online survey, but we’ve taken the step of following up with print surveys, and we are, shall we say, overwhelmed by the response from officials. One of the most interesting things we found is that at least a hundred officials, after receiving the print instrument, followed up to say that they will now fill out the online version. Lesson learned!
- Election Officials Have Opinions About Improving Elections, So Let’s Ask Them!
We provided a number of “open-ended” items on the survey, giving election officials a chance to tell us in their own words how elections in the United States can be improved. Over 67% of the respondents so far have written something in the free response category. We asked officials to enter their names and emails if they would be willing to allow us to follow up for elaboration and further comments, and over half have done so. This is great news! It shows that LEOs are interested, engaged, and want to share their viewpoints about election reform.
We want to personally thank officials in these states who have responded in high numbers. The table below is based on our online numbers only. We will update this later in the week, incorporating the print surveys.
Response Rates

| State | Response Rate |
|-------|---------------|
| AK | 100.0% |
| OR | 68.0% |
| DE | 66.7% |
| WA | 48.1% |
| OH | 38.3% |
| PA | 37.3% |
| VA | 34.6% |
| NC | 32.9% |
| RI | 31.8% |
| NM | 31.6% |
We hope many more local officials will find the time to respond over the coming weeks. If you have any questions about the survey, please feel free to email dfrcleosurvey@reed.edu.
Sounds like a good opportunity!
The Congressional Research Service (CRS) Government and Finance Division is seeking an Analyst in American National Government to analyze public policy issues related to the regulation and administration of elections and voting in the United States. The focus of the Division’s work in this area is on the role played by various institutions, policies, and procedures in shaping electoral processes and practices. The issues may include, but are not limited to, election administration, voter registration and turnout, apportionment and redistricting, voting rights, and other election policies and practices.
Just wanted to take a chance to toot the horn for Jay Lee, a rising junior Math – Statistics major at Reed College who helped the Open Elections team finish wrangling the precinct level elections results from North Dakota.
Jay got interested in elections work after taking my class on US Elections in the Spring and taking my co-taught Election Sciences course, offered in partnership with Andrew Bray.
Those of you who have been following this blog may remember that Jay, along with Matthew Yancheff, has also released an R package, “RCV”, to help process and report on ranked choice voting results. They worked on the project in our Election Sciences class, and, supported by funding from the College, produced the R package this summer.
The first Election Sciences, Reform, and Administration Conference is happening this week in Portland, OR!
I’d like to thank Phil Keisling and Paul Manson of the Center for Public Service at Portland State University for helping to organize, and the Reed College Department of Political Science, the MIT Election Data and Science Lab, the National Science Foundation, and the Elections Team at the Democracy Fund for making this event possible.
Follow the link above, or point your browser to electionsciences.net for more information.
This announcement comes from Jay Lee, Matthew Yancheff, and Mia Leung, three Reed students who took the Data and Election Sciences course that I taught along with Prof. Andrew Bray this spring. They have released the results of their work to CRAN.
Thanks to Rob Richie and Theo Landsman of FairVote for helping push this forward.
Hello,
# Install and load the rcv package from CRAN
install.packages("rcv")
library(rcv)

# Clean the raw WinEDS ballot image, using the accompanying lookup table
sf_cleaned <- clean_ballot(ballot = sf_bos_ballot, b_header = TRUE,
                           lookup = sf_bos_lookup, l_header = TRUE,
                           format = "WinEDS")

# Tally the ranked choice results for District 7
results7 <- rcv_tally(sf_cleaned, "Board of Supervisors, District 7")
results7
View(results7)

# Visualize round-by-round vote transfers as a Sankey diagram
install.packages("networkD3")
library(networkD3)
d3_7 <- rcv::make_d3list(results = results7)
networkD3::sankeyNetwork(Links = d3_7$values, Nodes = d3_7$names,
                         Source = "source", Target = "target", Value = "value",
                         NodeID = "candidate", units = "voters",
                         fontSize = 12, nodeWidth = 20)
(Crossposted to electionupdates.caltech.edu)
I look forward to a more detailed analysis, by voter registration and database match experts, of the GAI report that will be presented to the Presidential Advisory Commission on Election Integrity, but even a cursory reading reveals a number of serious misunderstandings and confusions that call into question the authors’ understanding of some of the most basic facts about voter registration, voting, and elections administration in the United States.
Fair warning: I grade student papers as part of my job, and one of the comments I make most often is “be precise”. Categories and definitions are fundamentally important, especially in a highly politicized environment like the one currently surrounding American elections.
The GAI report is far from precise; it’s not a stretch to say at many points that it’s sloppy and misinformed. I worry that it’s purposefully misleading. Perhaps I overstate the importance of some of the mistakes below. I leave that for the reader to judge.
American voter lists are designed to tolerate invalid voter registration records, which do not equate to invalid votes, because to do otherwise would lead to eligible voters being prevented from casting legal votes.
But the report follows a very common and misleading attempt to conflate errors in the voter rolls with “voter fraud”. Read their “definition”:
Where did this definition come from? As the source of the definition, they cite the Brennan Center report “The Truth About Voter Fraud” (https://www.brennancenter.org/sites/default/files/legacy/The%20Truth%20About%20Voter%20Fraud.pdf).
However, the Brennan Center authors are very careful to define voter fraud, on pg. 4 of their report, in a way that directly warns against an overly broad and imprecise definition:
To be fair to the authors, they do not conflate in their analysis situations such as being registered in two places at once with “voter fraud”, but the definition is sloppy, isn’t supported by the report they cite, and reinforces a highly misleading claim that voter registration errors are analogous to voter fraud.
David Becker can describe ad nauseam how damaging this misinterpretation has been.
Regardless of how you feel about voter ID, if you are going to claim that voter ID prevents in-person vote fraud, you need to provide actual proof, not just a supposition. The report authors write:
The key term here is “definitive identification”, a term that appears nowhere in HAVA. The authors either purposely or sloppily misstate the legal requirements of HAVA. On pg. 20 of the report, they write that HAVA has a
The word “definitive” appears again, and a bit later in the paragraph, it appears that a “definitive” ID, according to the authors, is:
But not according to HAVA. HAVA requirements are, as stated in the report:
The rhetorical turn occurs at the end of the paragraph, when the authors conclude that these other forms of ID are:
and apparently not “definitive” and hence prone to fraud.
Surely the authors don’t intend to imply that a passport is “less reliable” than a driver’s license and Social Security number. In many (most?) states, a “state ID card” is just as reliable as a driver’s license. I’m not familiar with the identification requirements for a military ID; perhaps an expert can help out? [ED NOTE: I am informed by a friend that a civilian ID at the Pentagon requires a retinal scan and fingerprints.] But are military IDs really less “definitive” than a driver’s license?
If you are going to claim that voter fraud is an issue requiring immediate national attention, and that states are not requiring “definitive” IDs, you’d better get some of the most basic details of the most basic laws and procedures correct.
The authors write:
That’s fine, but the authors seem to think this means that HAVA requires that the states make this information available to researchers at little to no cost. Anyone who has worked in this field knows that many states have laws that restrict this information to registered political entities. Most states restrict the number of data items that can be released in the interests of confidentiality.
Rather than acknowledging that state officials are constrained by state law, the authors claim non-compliance:
I can just hear the gnashing of teeth in the 50 state capitols. I am sympathetic to the authors’ difficulties in obtaining statewide voter registration and voter history files. Like the authors, I would like to see all state files made available to researchers for a low or modest fee.
There is no requirement that the database be made available for an affordable fee, nor that the database be available beyond political entities. These choices are left to the states. It is wrong to charge “non-compliance” when an official is following statute passed by their state legislature.
I don’t know whether the report authors didn’t have subject matter knowledge or were purposefully trying to create a misleading image of non-cooperation with the Commission.
But let’s return to the bottom line conclusion of the report: voter fraud is pervasive enough to require “immediate attention.” Do their data support this claim?
The most basic calculation would be the rate of “voter fraud” as defined in the report. The 45,000 figure (total potential illegally cast ballots) is highly problematic: it is based on suspect calculations in 21 states, then extrapolated to the other 29 states without regard for even the most basic rules of statistical inference.
Nonetheless, even if you accept the calculation, it translates into a “voter fraud” rate of 0.000324 (45,000 / 139 million), or about three hundredths of one percent.
This is almost exactly the probability that you will be struck by lightning at some point in your lifetime (a chance of about 1 in 3,000: http://news.nationalgeographic.com/news/2004/06/0623_040623_lightningfacts.html).
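The arithmetic can be sanity-checked in a few lines. This is a minimal sketch using only the figures quoted above (45,000 suspect ballots, roughly 139 million votes cast, and the 1-in-3,000 lightning odds):

```python
# Back-of-the-envelope check of the report's implied "voter fraud" rate,
# using the figures quoted in the post.
suspect_ballots = 45_000
total_votes = 139_000_000

rate = suspect_ballots / total_votes
print(f"Implied fraud rate: {rate:.6f} ({rate:.4%})")  # ~0.000324, i.e. ~0.03%

# Lifetime odds of a lightning strike, per the National Geographic figure
lightning = 1 / 3_000
print(f"Lightning odds:     {lightning:.6f}")  # ~0.000333

# The implied fraud rate is actually slightly *lower* than the lightning odds
print(rate < lightning)
```

Even taking the report's own numbers at face value, the implied rate sits just below the lifetime odds of being struck by lightning.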
I’m not the first to notice this comparison; see pg. 4 of the Brennan Center report cited above. And here I thought I had found something new!
There are many, many experts in election sciences and election administration who could have helped the Commission conduct a careful scientific review of the probability of duplicate registration and duplicate voting. A report written by Lorraine Minnite more than a decade ago lays out precisely the steps that need to be taken to uncover voter fraud and how statewide voter files should be used in that effort. Many others in the field, both those worried about voter fraud and those skeptical of it, have been calling for just such a careful study.
Unfortunately, the Commission instead chose to consult a “consulting firm” with no experience in the field, which in turn consulted database companies that also had no expertise in the field.
I’m sure that other experts will examine in more detail the calculations about duplicate voting. However, at first look, the report fails the smell test. It’s a real stinker.
—
Paul Gronke
Professor, Reed College
Director, Early Voting Information Center
http://earlyvoting.net