Montana’s chief campaign finance regulator has deemed the Montana field experiment a campaign finance violation.
I’m not sure any other outcome was expected, politically, and I’m also not sure this will stand up legally (or be pursued at all by the state attorney; they may just leave this statement of a violation as the appropriate slap on the wrist).
But as I wrote back when the controversy erupted, the bigger concern for academia is how we got here at all.
This has sparked a lot of discussion within academia, including a number of panels at MPSA and the CASBS, with more to come (including a symposium in PS if I can get my act together!). The most positive outcome may be a bit of circumspection and modesty among academic researchers.
A new article in Electoral Studies by Professor Francisco Cantu and Omar Garcia-Ponce examines perceptions of electoral integrity during the Mexican presidential election of 2012.
They have an interesting design, combining pre-election and post-election surveys with an exit poll; the pre- and post-election surveys allow them to evaluate the impact of voting for the winner (or loser) on changes in perceptions of integrity.
The measures of integrity are as follows:
Our dependent variable measures citizens’ confidence in the integrity of the election. For the pre[post]-electoral survey, we use the following question: “In your opinion, how clean will [were] the presidential elections be [held last July 1st]?” Respondents chose among the following options: “Very clean,” “Somewhat clean,” “A little clean,” and “Not clean at all.” From the exit poll, we measure the voter’s confidence that her vote will be counted using the following question: “In general, how confident are you that the vote you cast for president will be respected and counted for the final result?” Respondents chose among the following options: “Very confident,” “Somewhat confident,” “A little confident,” and “Not at all confident.”
The summary of findings is below. One note (not in the quote): the presence of election observers had no impact on perceptions of integrity.
On one hand, we show that confidence in the electoral process among supporters of the incumbent party decreased only after realizing that their candidate had lost. This change in the perceptions of electoral integrity responds to a pure “losers’ effect,” in which supporters of a losing candidate try to explain her defeat as a consequence of a poor electoral administration. On the other hand, we show that the discredit of electoral integrity among supporters of a party that has never won the presidential election is consistent over time. In this case, the skepticism from leftist partisans arose from both the systematic manipulation against left-wing parties during the twentieth century, and the discourse of electoral distrust expressed by left-wing parties during recent presidential campaigns.
The full paper is available on early release.
Ok, this is just unfair:
At least he humored me by laughing at Doug Chapin’s “Nerd-Vana” joke.
Full story here: http://www.oregonlive.com/music/index.ssf/2015/03/krist_novoselic_brings_wit_and.html
I’m not sure if this legislation will go anywhere, but S0626 in the Rhode Island State Senate would allow for early in-person voting in the state.
The bill includes:
- Early in-person voting starting 21 days before the general election and ending the Saturday before Election Day (13 days for primaries)
- Voting would occur “at locations to be determined by each local board and approved by the state board”
- Hours for early voting would be 9-4:30 on Saturday, Sunday, Monday, Tuesday, and Wednesday, and 12-8 on Thursday and Friday
My three pieces of advice to the Legislature, should they move forward:
- Past work, including my own research, has shown a marked preference for voting on the last Sunday prior to an election. Given that Rhode Island is a small state, and given provisions in the bill that specify that ballots are to be collected each day by an official from the state board, I’m not sure why they chose to end early voting on the final Saturday rather than extending it through that last Sunday.
- The bill is confusing about what technology is going to be used. Part (f) specifies that “the state board shall provide the local boards with the ballots, ballot applications, tabulation equipment, ballot storage boxes, voting booths, instructions as to voting, and other supplies necessary to effectuate the provisions of this section.” But part (e) specifies that the ballots will be filled out and sealed in an envelope, i.e., not processed or tabulated: “The early voter shall be provided with a voting booth identical to the voting booths used on the regularly scheduled election days. Once the early voter has completed the ballot, the early voter shall place the ballot in the ballot envelope and seal the envelope. An official of the local board shall mark the envelope with the appropriate voting precinct designation and return the envelope to the early voter. The early voter shall place the envelope in the ballot box.” The implication is that the early in-person voting technology is actually “in person” absentee. Charles Stewart and I show that this will result in higher residual vote rates (see the short sketch after this list), since voters are not given any immediate feedback about any errors on the ballot. Why not have an optical scan machine at each early in-person voting location?
- The legislation makes no statement about how many early in-person voting locations will be required. This can lead to inequities during the early in-person voting period. Some states establish population floors or have other formulas in place that help local officials determine how many early in-person locations they are expected to put in place.
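For readers outside the field, here is a minimal, illustrative sketch of what the residual vote rate measures: the share of ballots cast that record no countable vote (undervotes plus overvotes) for a given contest. The precinct figures below are hypothetical, purely for illustration.

```python
# Illustrative sketch only: residual vote rate = share of ballots cast that
# contain no countable vote (undervotes + overvotes) for a given contest.
def residual_vote_rate(ballots_cast, countable_votes):
    return (ballots_cast - countable_votes) / ballots_cast

# Hypothetical precinct: 1,000 ballots cast, 978 countable votes for president.
print(f"{residual_vote_rate(1_000, 978):.1%}")  # 2.2%
```

The point of the bullet above is simply that when voters get no immediate feedback (as with envelope-sealed, “in person” absentee ballots), more ballots end up in that numerator than when a scanner can flag errors on the spot.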
The annual announcement for the ICPSR Summer Program, what we used to call “summer camp for social scientists”, came across the transom. The summer program used to be a place where advanced graduate students and faculty gathered each summer in Ann Arbor, Michigan, on the campus of the University of Michigan, to learn new statistical skills and polish up on old techniques.
But it’s grown to be much more than that, with lots of short one-week courses on specialized topics, many of which would be of interest to technical support staff in elections offices, lawyers, advocates, and others who work in the community.
The announcement below particularly struck me as potentially of interest to state elections officials, regional elections associations, non-profits engaged in elections data collection, and of course the EAC.
Many of these entities already engage in some form of data archiving, but it seems to me that little attention is paid to curation for re-use. Not everyone has the time or resources to send a staff member to the workshop below, but it strikes me that it would be very valuable for the elections community to begin to build bridges with the community of data librarians. There are pretty obvious areas of shared interest.
The ICPSR Summer Program is offering a five-day workshop on Curating and Managing Research Data for Re-Use, July 27-31, 2015. This workshop is for individuals interested or actively engaged in the curation and management of research data for sharing and reuse, particularly data librarians, data archivists, and data producers and stewards with responsibilities for data management.
Instructors Louise Corti (UK Data Archive), Jared Lyle (ICPSR), and Veerle Van den Eynden (UK Data Archive) will discuss best practices and tools for data curation, from selecting and preparing data for archiving to optimizing and promoting data for reuse. ICPSR social science quantitative datasets and UK Data Archive qualitative and cross-disciplinary data collections will serve as case studies and participants will track the datasets as they make their way through the data assessment, review, processing and curation pipeline.
Participants will learn about and gain proficiency in the full range of life cycle activities: data review and preparation; confidential data management; effective documentation practices; how to create, comply with, and evaluate required data management plans; digital repository requirements and assessment; and running user support and promotional activities for data. Emphasis will be placed on hands-on exercises demonstrating curation practices and on discussion for sharing local experiences and learning from others. Additional context and expertise will be provided through invited keynote lectures by research data experts.
Participants will leave with knowledge and experience of how to review, assess, curate, and promote data collections for long-term preservation and access.
Enrollment is limited to 25 participants. Registration is available through the ICPSR Summer Program Web site.
Questions? Contact the Summer Program at sumprog@icpsr.umich.edu or (734) 763-7400.
Here is the report from Jeff Mapes of the Oregonian. Unfortunately for advocates of efficient and effective elections systems, the bill passed on a nearly straight party vote (one Democrat voted nay).
The text of House Bill 2177 is contained here.
If you’re in the field, the RFPs at the Multnomah County Elections website are interesting reading. They provide some insight into what a large, fully vote-by-mail county is looking for in order to move to a new generation of election technology.
https://multco.us/purchasing/opportunities/elections-ballot-tally-system-replacement
Working my way through Electoral Studies’ recent releases (thanks for the RSS feed!), I came across an interesting analysis by Shaun Bowler of a survey conducted by the Public Policy Institute of California (PPIC).
The survey asked a series of detailed questions about whether or not voters were happy with the amount of information that they had about a set of initiatives. But perhaps more interesting, the survey provided the opportunity for voters to explain the reasons they voted for some of the initiatives, and the results are pretty encouraging for those who argue that citizens can accumulate enough information to cast an informed ballot.
[Figure: Percent saying “don’t know” across 34 initiatives that were on the ballot from 2000-2012]
Most striking to me were two things.
First, the relatively low percentage of “don’t knows” across the 34 initiatives that had been on the ballot, shown in the figure above (hey Shaun, look up the “s1mono” scheme in Stata).
Second, voters were provided a list of reasons that they voted for the initiative legalizing marijuana. The reasons were, well, reasonable, and interest group information (the most oft-cited source of voter information) ranks very low on the list. (Sorry about the poor quality of the screen grab.)
All in all, a nice piece summarizing a lot of survey results over a decade, looking at voting on referenda and initiatives in California.
Full piece available here.