Order Code RL34363

Election Reform and Local Election Officials: Results of Two National Surveys

Updated February 27, 2008

Eric A. Fischer
Senior Specialist in Science and Technology Policy
Resources, Science, and Industry Division

Kevin J. Coleman
Analyst in Elections
Government and Finance Division

Election Reform and Local Election Officials: Results of Two National Surveys

Summary

Local election officials (LEOs) are critical to the administration of federal elections and the implementation of the Help America Vote Act of 2002 (HAVA, P.L. 107-252). Two surveys of LEOs were performed, in 2004 and 2006, by Texas A&M University; the surveys were sponsored and coordinated by CRS. Although care needs to be taken in interpreting the results, they may have implications for several policy issues, such as how election officials are chosen and trained, the best ways to ensure that voting systems and election procedures are sufficiently effective, secure, and voter-friendly, and whether adjustments should be made to HAVA requirements. Major results include the following:

The demographic characteristics of LEOs differ from those of other government officials. Almost three-quarters are women, and 5% are minorities. Most do not have a college degree, and most were elected. Some results suggest areas of potential improvement, such as training and participation in professional associations.

LEOs believed that the federal government has too great an influence on the acquisition of voting systems and that local elected officials have too little. Their concerns about the influence of the media, political parties, advocacy groups, and vendors increased from 2004 to 2006.

LEOs were highly satisfied with whatever voting system they used but were less supportive of other kinds. However, their satisfaction declined from 2004 to 2006 for all systems except lever machines. They also rated their primary voting systems as very accurate, secure, reliable, and voter- and pollworker-friendly, no matter what system they used. However, the most common incident reported by respondents in the 2006 election was malfunction of a direct-recording electronic (DRE) or optical scan (OS) voting system. The incidence of long lines at polling places was highest in jurisdictions using DREs.

Most DRE users did not believe that voter-verified paper audit trails (VVPAT) should be required, but nonusers believed they should be. However, the percentage of DRE users who supported VVPAT increased in 2006, and most VVPAT users were satisfied with them.

On average, LEOs mildly supported requiring photo identification for all voters, even though they strongly believed that it would negatively affect turnout and did not believe that voter fraud is a problem in their jurisdictions.

LEOs believed that HAVA is making moderate improvements in the electoral process, but the level of support declined from 2004 to 2006. They reported that HAVA has increased the accessibility of voting but has made elections more complicated and has increased their cost. LEOs spent much more time preparing for the election in 2006 than in 2004. They also believed that the increased complexity of elections is hindering recruitment of pollworkers. Most found the activities of the Election Assistance Commission (EAC), which HAVA created, only moderately beneficial to them.
They were neutral on average about the impacts of the requirement for a statewide voter-registration database. Contents Who Are Local Election Officials? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2 Voting Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9 Current Voting System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9 Influence of Stakeholders on the Acquisition of Voting Systems . . . . . . . . 11 Attitudes toward Voting Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14 Electronic Voting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19 The Help America Vote Act (HAVA): Impacts and Attitudes . . . . . . . . . . . . . . 27 Election Assistance Commission . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33 Voter Registration Database . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36 Voter Identification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38 Election Administration Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41 2006 Election . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41 Use, Training, and Experience of Pollworkers . . . . . . . . . . . . . . . . . . . . . . 47 Nonpartisan Election Officials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52 Possible Caveats . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53 Potential Policy Implications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54 Appendix. Notes on Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57 List of Figures Figure 1. Age Distribution of LEOs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4 Figure 2. Length of Tenure of LEOs in Their Current Positions . . . . . . . . . . . . . 4 Figure 3. Level of Education Reported by LEOs . . . . . . . . . . . . . . . . . . . . . . . . . 5 Figure 4. Distribution of Memberships among LEOs Who Belong to One or More Professional Associations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6 Figure 5. Assessments by LEOs of the Quality of the Training They Have Received . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8 Figure 6. Agreement/Disagreement of LEOs on Statements about Technology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8 Figure 7. Percentages of Jurisdictions Using Different Kinds of Primary Voting Systems as Reported by LEOs in 2004 and 2006 . . . . . . . . . . . . . . . 9 Figure 8. Average Length of Use of the Current Voting System as Reported by LEOs, 2004 and 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10 Figure 9. Reactions of LEOs to Statements about the Influence of Various Stakeholders on Decisions about Selection of Voting Systems . . . 12 Figure 10. Support of LEOs for the Use of Different Kinds of Voting Systems, 2004 and 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15 Figure 11. Overall Satisfaction of LEOs with Their Primary Voting System and with the Performance of the System in the 2004 and 2006 Elections . . . . . . . . . . . . . . . . . 
. . . . . . . . . . . . . . . . . . . . . . 16 Figure 12. Average Levels of Agreement among LEOs That Their Current Voting System Is the Best Available, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . 17 Figure 13. Characteristics of the Primary Voting System, 2004 and 2006 . . . . . 18 Figure 14. Assessment by Users and Nonusers of Electronic Voting Systems of the Strictness of Standards for Those Systems, 2006 . . . . . . . . 20 Figure 15. Views of DRE Users and Nonusers about DREs . . . . . . . . . . . . . . . 21 Figure 16. Views of Users and Nonusers of Optical Scan (OS) Voting Systems about OS Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22 Figure 17. Support for VVPAT among Users and Nonusers of DREs, 2004 . . . 24 Figure 18. Attitudes among DRE Users about Whether DREs Should Produce VVPATs, 2004 and 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24 Figure 19. Reasons Chosen by LEOs for Disagreeing or Agreeing That DREs Should Print a VVPAT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25 Figure 20. Reactions to VVPAT by Users, 2006 . . . . . . . . . . . . . . . . . . . . . . . . 26 Figure 21. Assessment by LEOs of Whether HAVA Is Improving the Election Process in Their Jurisdictions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27 Figure 22. Assessment of HAVA Provisions as Advantage or Disadvantage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28 Figure 23. Perceived Level of Difficulty by LEOs in Implementing HAVA Provisions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30 Figure 24. Response of LEOs to Questions about Funding Effects of HAVA . . 32 Figure 25. Reactions of LEOs to Statements about the Impacts of HAVA, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33 Figure 26. Perceived Importance by LEOs of Selected EAC Responsibilities, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34 Figure 27. Perceived Overall Helpfulness of the EAC to LEOs, 2006 . . . . . . . . 34 Figure 28. Perceived Degree of Benefit to LEOs from EAC Functions, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35 Figure 29. Sources of Funds Reported by LEOs for Additional Local Staffing for the Voter Registration Database Required by HAVA, 2006 . . 36 Figure 30. Agreement/Disagreement of LEOs with Statements about the Voter Registration Database, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37 Figure 31. Frequency Distributions of Responses by LEOs to Questions about Voter Identification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40 Figure 32. Percentage of LEOs Reporting Various Occurrences in Their Jurisdictions on Election Day 2006, by Primary Voting System . . . . 43 Figure 33. Percentage of Votes LEOs Reported as Cast via Absentee Voting, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46 Figure 34. Agreement/Disagreement by LEOs with Statements about Absentee and Early Voting, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46 Figure 35. Relationships between Kinds of Voting Systems Used and Selected Characteristics of Jurisdictions, 2006 . . . . . . . . . . . . . . . . . . . . . . 48 Figure 36. 
Views of LEOs on the Responsibility of Inadequate Pollworker Training for Problems with Election Administration . . . . . . . . . . . . . . . . . 49 Figure 37. Views of LEOs on the Need for Improvement of Pollworker Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49 Figure 38. Number of Hours of Pollworker Training Reported by LEOs . . . . . 50 Figure 39. Areas of Training for Pollworkers Reported by LEOs, 2006 . . . . . . 51 Figure 40. Level of Concern Reported by LEOs about the Negative Impact of Increased Election Complexity on Pollworker Recruitment, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51 Figure 41. Assessments by LEOs about Aspects of the Election Administration Environment, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52 Figure 42. Views of LEOs about Whether Election Administration Should Be Part of the Civil Service in Their States, 2006 . . . . . . . . . . . . . . 53 Figure 43. Frequency Distribution of the Number of Local Election Jurisdictions in the States . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58 Figure 44. Frequency Distribution of Response Rates by State, 2004 and 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59 Figure 45. Kinds of Jurisdictions Administered by Survey Respondents, 2004 and 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60 List of Tables Table 1. Comparison of Selected Demographic Characteristics of LEOs from the 2004 and 2006 Surveys . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3 Table 2. Selected Election Administration Responsibilities Reported by LEOs, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7 Table 3. Training Reported by LEOs, 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7 Table 4. Assessment by LEOs of Advantageousness of HAVA Provisions in 2004 and 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29 Table 5. Distribution of Responses of LEOs to Statements about the Impacts of HAVA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31 Table 6. Percentages of Jurisdictions Accepting Different Forms of Identification for Registration and Voting for All Voters, 2006 . . . . . . . . . 39 Table 7. Percentage of LEOs Reporting Various Events in Their Jurisdictions on Election Day 2006 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42 Election Reform and Local Election Officials: Results of Two National Surveys U.S. elections are highly decentralized, with much of the responsibility for election administration residing with local election officials (LEOs). There are thousands of such officials, many of whom are responsible for all aspects of election administration in their local jurisdictions -- including voter registration, recruiting pollworkers, running each election, and choosing and purchasing new voting systems. These officials are therefore critical not only to the successful administration of federal elections, but also to the implementation of the Help America Vote Act of 2002 (HAVA, P.L. 107-252). Nevertheless, there has been little objective information on the perceptions and attitudes of LEOs about election reform. 
This report discusses the results of two scientific opinion surveys of principal local election officials1 that were designed to help fill that gap in knowledge. The surveys were performed pursuant to two projects sponsored by the Congressional Research Service (CRS). The projects were developed in collaboration with, and the surveys performed by, faculty and students at the George Bush School of Government and Public Service at Texas A&M University. The Bush School team developed and administered the surveys, in consultation with CRS, to a sample of LEOs from all 50 states. The responses to each survey, from approximately 1,400 LEOs, were analyzed by CRS for purposes of this report. Methodological details are described in the appendix.

The surveys were administered following the 2004 and 2006 federal elections. While they were not identical, many of the questions were the same, and comparisons of the results are discussed where appropriate.2 The findings may be useful to Congress as it considers funding for HAVA, oversight of its implementation, and possible revisions.

The report begins with a description of some characteristics of local election officials and their jurisdictions. That is followed by a discussion of perceptions and attitudes of LEOs about the different kinds of voting systems used in different jurisdictions -- lever machines, punchcard ballots, hand-counted paper ballots, central-count optical scan (CCOS), precinct-count optical scan (PCOS), and direct-recording electronic (DRE) systems such as "touchscreens." The report then describes how HAVA has affected local jurisdictions and the opinions LEOs expressed about the law. The section after that discusses three other topics covered in the 2006 survey -- issues related to the 2006 election, characteristics of pollworkers, and attitudes about nonpartisan election administration. The final sections discuss caveats to consider in interpreting the results, and potential policy implications of the findings.

1 The survey was aimed at officials with primary responsibility for elections within a local jurisdiction -- for example, a town clerk or county election director.
2 For discussion of results from the 2004 survey, see also CRS Report RL32938, What Do Local Election Officials Think about Election Reform?: Results of a Survey, by Eric A. Fischer and Kevin J. Coleman.

Who Are Local Election Officials?

There are about 9,000 local election jurisdictions in the United States.3 In most states, they are counties or major cities, but in some New England and Upper Midwest states, they are small townships -- for example, more than 1,800 townships in Wisconsin. The number of registered voters and polling places in a jurisdiction also varies greatly. The average reported was 40,000 voters, ranging from fewer than 100 to more than 1 million, and 32 polling places,4 ranging from 0 to almost 1,000,5 with 16% of jurisdictions having only one and 14% more than 50. The number of election personnel working in a jurisdiction, in addition to the local election official, also varied greatly, from none to more than 10,000.

3 Source: Election Reform Information Project, [http://www.electionline.org].
4 As is typical with such skewed distributions, the medians were smaller: 12,000 voters and 13 polling places. Not surprisingly, the number of polling places was strongly correlated with the number of registered voters.
5 Oregon is a vote-by-mail state and does not generally use polling places.

Given such diversity and other differences among states -- such as wealth, population, and the role of state election officials -- responsibilities and characteristics of LEOs are likely to vary greatly. Nevertheless, some patterns emerged from the survey.

The demographic characteristics of LEOs differ from those of other government officials.
According to the survey results, the typical LEO is a white woman between 50 and 60 years old who is a high school graduate. She was elected to her current office, works full-time in election administration, has been in the profession for about 10 years, and earns under $50,000 per year. She belongs to a state-level professional organization but not a national one, and she believes that her training as an election official has been good to excellent.

As with any such description, the one above does not capture the diversity within the community surveyed: about one-quarter of LEOs are men, about 5% belong to minority groups, 40% are college graduates, and 8% have graduate degrees (see Table 1). They range from 21 to more than 80 years of age, and have served from 1 to 45 years. About one-third were appointed rather than elected to their posts.6 Reported salaries range from under $10,000 to more than $120,000. About three-quarters belong to at least one professional organization.

6 This result is similar to the figure of 37% reported from an independent study in David C. Kimball and Martha Kropf, "The Street-Level Bureaucrats of Elections: Selection Methods for Local Election Officials," pp. 1257-1268.

The demographic profile of LEOs is unusual, especially for a professional group, and differs from that of other local government employees. For example, according to U.S. Census figures, while women comprise a higher proportion of the local government workforce than men overall,7 men comprise a higher proportion of local government general and administrative managers.8 About 20% of those managers are members of minorities.9 The patterns do not appear to be a result of the fact that most LEOs are elected, as the demographic characteristics of legislators appear to be largely similar to those of local government managers.10

7 Women make up about 60% of that workforce: see U.S. Census Bureau, "2000 Supplementary Survey Summary Table P068," available at [http://factfinder.census.gov].
8 About 53% of the managers are men: see U.S. Census Bureau, "Census 2000 EEO Data Tool," available at [http://www.census.gov/eeo2000/index.html].
9 Ibid.
10 Ibid.

Table 1. Comparison of Selected Demographic Characteristics of LEOs from the 2004 and 2006 Surveys

Percentages of LEOs who...                                      2004   2006
were elected.                                                     65     58
worked full-time.                                                 66     76
had served for more than 10 years in current position.           47     44
spent more than 20 hours per week on election duties.            41     47
did not belong to an association of election professionals.      30     26
had a salary under $40,000.                                       47     39
were women.                                                       75     77
were older than 50.                                               63     62
were not college graduates.                                       60     59
were not white.                                                  5.6    5.4
professed a conservative political ideology.                      50     47

Source: Analysis by the Congressional Research Service (CRS) of data from studies performed collaboratively by CRS and Texas A&M University.
Note: Bold type denotes statistically significant differences between the two surveys.

The average tenure in the current position declined by about one year from 2004 to 2006, with the proportion of LEOs who had served for two years or less in their current positions rising to 15% in 2006 from 11% in 2004 (see Figure 2). Thus, there appeared to be a small increase in job turnover between the two elections.11 However, there was no significant change in average age (Figure 1).
11 The cause of this change is not clear. However, the pattern is consistent with the contention by some observers that the changes in election administration brought about by HAVA could increase turnover.

Figure 1. Age Distribution of LEOs
[Bar graph of the percentage of LEOs in each age group, from under 30 to 81-90, in 2004 and 2006.]
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: Throughout this report, bar or column graphs comparing results between the two surveys show data for 2004 in light gray (black and white copies) or blue (color) bars and data for 2006 in dark gray or burgundy bars.

Figure 2. Length of Tenure of LEOs in Their Current Positions
[Bar graph of the percentage of LEOs by years in their current position (0-2, 3-5, 6-10, 11-20, more than 20), 2004 and 2006.]
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

The survey was not designed to identify the causes of such changes, but they appear to be consistent with the impacts of federal and state election reform on local jurisdictions. That reform led to increased funding for election administration, changes in voting systems used by many jurisdictions, and an increased workload for election officials. For example, the survey found that those who reported that they worked full-time on election administration increased from 66% in 2004 to 76% in 2006, while those who reported that they spent more than twenty hours per week on election duties increased from 41% to 47%.

The increasing complexity of elections and the increased federal role after the passage of HAVA have focused more attention on the role of professionalism in election administration. Given that change, it might be expected that election officials who began serving more recently would have more formal education than those who have served for longer periods. Such a pattern could yield a statistical association between the highest education level attained and the number of years in service as an election official. In fact, there was a small but significant relationship, with LEOs who did not have a college degree averaging 11-12 years of service and those with graduate degrees averaging 9 years. However, there was no significant change in the distribution of maximum education level between the 2004 and 2006 surveys (Figure 3).

Figure 3. Level of Education Reported by LEOs
[Bar graph of the percentage of LEOs at each level of education, from some high school to graduate degree, 2004 and 2006.]
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Fewer than half of LEOs belonged to a national or international association.

The survey also examined other factors related to election administration as a profession. About three-quarters of LEOs belonged to at least one professional association.12 About 40% of those belonged to a national or
international association, with 60% belonging only to a state or regional association (see Figure 4).13 Those results did not change significantly from 2004 to 2006.

12 The proportion is an estimate determined by comparing the number of LEOs who answered this question with the number answering the gender question, which was in the same section of the survey. Such a comparison was necessary because LEOs were asked only to indicate the organizations to which they belong, not whether they belong to any organization. That question was chosen for the comparison because only 13 LEOs in the 2006 survey answered the question on membership but not the question on gender, fewer than for any other question in that section. Using the other questions in the section -- on age, race, education, political ideology, and salary -- yields estimates of 21-27% for 2006, and 24-29% for 2004. Using the total number of respondents yields 36% for 2006 and 33% for 2004, but those are almost certainly overestimates.
13 The number for state association membership in Figure 4 is higher because it includes LEOs who belong to more than one organization, such as a state association plus NACRC.

Figure 4. Distribution of Memberships among LEOs Who Belong to One or More Professional Associations
[Bar graph of the percentage of respondents belonging to each association (NASED, NASS, NACRC, IACREOT, state association, regional association, and other), 2004 and 2006.]
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: Abbreviated names of associations are as follows: NASED = National Association of State Election Directors; NASS = National Association of Secretaries of State; NACRC = National Association of County Recorders, Election Officials and Clerks; IACREOT = International Association of Clerks, Recorders, Election Officials and Treasurers. The choice of regional association was new for the 2006 survey. The data used in this graph include only those LEOs who indicated that they belonged to at least one professional association. See text.

In 2006, the percentage of LEOs reporting that they had a written job description was 43% for those who had been elected and 70% for those who had been appointed.

Most LEOs reported a broad range of election-administration responsibilities beyond solely running elections. Most are also responsible for budgeting, personnel, and purchasing, for example (Table 2). Most LEOs received some initial training specifically designed to prepare them for their duties, but for most that training was less than 20 hours, and only one-fifth of LEOs were required to pass an examination (Table 3). Most have also received additional training. More than two-thirds of LEOs assessed that their training was good to excellent and resulted in moderate to substantial improvement in their effectiveness and ability to solve problems. More than four-fifths believe that training and experience are equally important in ensuring a successful election.
Table 2. Selected Election Administration Responsibilities Reported by LEOs, 2006

Responsibility                                                              % Reporting
Managing poll workers and other election administrators                         90
Serving as a liaison between my jurisdiction and state and federal
  election officials                                                             90
Overseeing an election recount when necessary                                   88
Authorizing and adhering to a budget                                             83
Hiring poll workers and other election administrators                            83
Reporting inappropriate conduct by voters or politicians at polling place        82
Maintaining contact with vendors                                                 80
Maintaining the voter registration database                                      80
Purchasing election equipment                                                    78
Maintaining an electronic voting system                                          76
Purchasing an electronic voting system                                           63
Additional duties not listed                                                     57

Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: LEOs were asked to check all applicable items in the list of responsibilities presented in the table. The data presented may be overestimates. They are percentages of the 1,406 LEOs who responded to the question; 7% of LEOs who responded to the survey did not answer this question. Using the total number of 1,506 survey respondents would reduce the percentages by 4-6 points but would probably constitute underestimates.

Table 3. Training Reported by LEOs, 2006

                                                  Kind of Training
Percentage of LEOs who...                     Initial       Additional
received any training.                           78              82
received > 20 hours of training.                 43              52
received certification from training.            45              36
received mandatory training.                     54              35
were required to pass an exam.                   19             n/a

Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: n/a = not applicable. The question was not asked about additional training.

LEOs were less satisfied with their training in 2006 than in 2004.

This result, shown in Figure 5, might reflect the impact of HAVA requirements, most of which went into effect in 2006. For example, election officials might have felt less well prepared by their training to implement HAVA in 2006 than in 2004, but the survey did not address that possibility. Other possible factors include increasing public attention to problems in election administration, and recent controversies about the reliability and security of voting systems.

Two-fifths of respondents to the 2006 survey commented on additional training needs. The most common suggestions were for more training in technical and legal aspects of elections, and more "hands-on" training.

Figure 5. Assessments by LEOs of the Quality of the Training They Have Received
[Bar graph of the percentage of LEOs rating their training as Excellent, Good, Adequate, or Poor, 2004 and 2006.]
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Figure 6. Agreement/Disagreement of LEOs on Statements about Technology
[Graph of mean levels of agreement, from strongly disagree to strongly agree, in 2004 and 2006, with the following statements:
The use of new information technologies can dramatically improve government services.
Governments should move cautiously when adopting new technology.
The benefits of new technologies greatly outweigh the risks.
When it comes to new technologies, I think it is best to wait until all the bugs have been worked out.]
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: Error bars on graphs in this report denote upper and lower 95% confidence limits for the average response (arithmetic mean).
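The note above describes the error bars as 95% confidence limits for the mean response. The report does not specify the computation further; purely as an illustration, the short Python sketch below shows a conventional normal-approximation way of computing such limits. The function name, the z value of 1.96, and the example scores are assumptions made for this sketch, not survey data or the study's actual procedure.

```python
# Illustrative only: conventional 95% confidence limits for a mean survey
# response. The scores below are invented, not data from the LEO surveys.
import math

def mean_ci(responses, z=1.96):
    """Return (mean, lower, upper) normal-approximation 95% limits for the mean."""
    n = len(responses)
    mean = sum(responses) / n
    # sample variance (n - 1 in the denominator)
    var = sum((x - mean) ** 2 for x in responses) / (n - 1)
    half_width = z * math.sqrt(var / n)  # z times the standard error of the mean
    return mean, mean - half_width, mean + half_width

# Example with made-up 7-point agreement scores
# (1 = strongly disagree, 7 = strongly agree):
scores = [5, 6, 4, 7, 5, 6, 3, 5, 6, 4]
m, lo, hi = mean_ci(scores)
print(f"mean = {m:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

With samples as large as those in these surveys the normal approximation is standard practice; for small subgroups a t-based interval would be slightly wider.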
Given the increasing role of technology in elections, both surveys asked LEOs questions about their attitudes toward technology (Figure 6). Respondents believed that technology can be useful for government services, but were cautious about implementation. They were only slightly positive on average about whether the benefits outweigh the risks. They held those views somewhat more strongly in 2006 than in 2004.

Voting Systems

Current Voting System

The kinds of voting systems used in the United States changed significantly between 2004 and 2006, with a substantial increase in the use of precinct-count optical scan (PCOS) and direct-recording electronic systems (DREs).

Respondents reported that the percentage of jurisdictions using lever machines, punchcards, hand-counted paper ballots, and central-count optical scan (CCOS) as their primary voting system decreased substantially, while the percentage using PCOS and DREs increased (see Figure 7). These changes are consistent with results from other sources.14 The trends conform with expectations arising from HAVA requirements that emphasized improved usability and accessibility of voting systems for voters.15

Figure 7. Percentages of Jurisdictions Using Different Kinds of Primary Voting Systems as Reported by LEOs in 2004 and 2006
[Bar graph of the percentage of jurisdictions using each type of primary voting system (Lever, Punch, Paper, CCOS, PCOS, DRE, Other), 2004 and 2006.]
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: Types of voting systems listed are as follows: Lever = mechanical lever machines; Punch = punchcard ballots; Paper = hand-counted paper ballots; CCOS = central-count optical scan systems; PCOS = precinct-count optical scan systems; DRE = direct-recording electronic systems; and Other = cases where the respondent checked "Other" and the primary voting system could not be determined from the written response -- for example, the respondent wrote "DRE and OS." That might indicate, for example, that DREs were used only for accessibility, or that OS (optical scan) was used only for absentee ballots.

Jurisdictions appeared reluctant to change the kinds of voting systems they use.

The average length of time jurisdictions have been using a particular kind of voting system varies greatly with the kind of system (Figure 8). The average length of use varies with the length of time a voting system has been available for use. At one extreme, jurisdictions with hand-counted paper ballots have used them for 80 years, on average. At the other, jurisdictions with DREs have had them under 10 years on average.

Figure 8. Average Length of Use of the Current Voting System as Reported by LEOs, 2004 and 2006
[Bar graph of the mean number of years the current voting system has been in use, by type of system (Lever, Punch, Paper, CCOS, PCOS, DRE), 2004 and 2006.]
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: See note for Figure 7 for an explanation of types of voting systems. Data on punchcard users is not presented for 2006 because only 4 LEOs reported using them.

14 See, for example, Election Data Services, "Almost 55 Million, or One-Third of the Nation's Voters, Will Face New Voting Equipment in 2006 Election," October 2, 2006, [http://www.edssurvey.com/images/File/ve2006_nrpt.pdf].
15 The results described here refer to the primary or main voting system used in a jurisdiction -- the one that most voters would use.
HAVA also requires that every polling place have at least one fully accessible voting system such as a properly equipped DRE. As a result, many jurisdictions using other kinds of voting systems also had one DRE per polling place. CRS-11 The pattern of use shown in Figure 8 suggests that jurisdictions do not readily change the kinds of voting systems they use. On the one hand, such reluctance to change creates stability that may be beneficial to voters and administrators. On the other hand, it may mean that a particular kind of technology is used far longer than it should be, with increasing risks of negative consequences. For example, many of the problems associated with the 2000 presidential election were attributed to the continued use of outmoded or flawed technology, such as the punchcard systems in use at the time. The causes of such long-term use patterns are complex and may include factors such as legal and budgetary constraints and various forms of transaction costs that would be incurred with any change. Such factors, if they continue to be important, may impede jurisdictions from taking advantage of the kinds of improvements that are likely to occur in voting technology over the next decade. Influence of Stakeholders on the Acquisition of Voting Systems Most LEOs play a role in decisions on what voting systems to use in their jurisdictions (see Table 2 above). Many other stakeholders may also influence those decisions. To help provide an understanding of how LEOs assess the appropriateness of the roles other stakeholders play, the survey asked respondents to what extent they agreed or disagreed with statements about the influence of those stakeholders on the decision-making process. Two examples are "The federal government has too great an influence," and "Local level, elected officials should have greater influence." LEOs believed that the federal government has too great an influence on the acquisition of voting systems and local elected officials have too little. The results are presented in Figure 9. On average, in fact, LEOs felt more strongly about the role of local elected officials than any other stakeholder. LEOs were largely neutral about the level of influence of state election officials and the public, and did not believe that nonelected officials, professional associations, and independent experts should have greater influence than they do now. LEOs have become more concerned about the influence of the media, political parties, advocacy groups, and vendors. Some of the differences between the 2004 and 2006 results are notable. In 2004, LEOs were largely neutral about the influence of the media, political parties, and various advocacy groups.16 In 2006, they thought those groups had too much influence. They also agreed more strongly than in 2004 that elected local officials should have more influence. Also, in 2006 more LEOs believed that vendors have too great an influence than in 2004, and fewer believed that the public and independent experts should have greater influence. Their views did not change on the roles of the federal government, elected state officials, professional associations, and nonelected state and local officials. 16 Specifically, LEOs were asked about the statement, "Public interest groups/civil rights groups/advocates for the disabled have too great an influence on the process." CRS-12 Figure 9. Reactions of LEOs to Statements about the Influence of Various Stakeholders on Decisions about Selection of Voting Systems Have too great an Influence? 
2004 Federal Government 2006 Media Advocates Political Parties Vendors Should have greater influence? Elected Local Officials Elected State Officials The Public Professional Associations Non-elected State Officials Independent Experts Non-elected Low-level Officials Strongly Strongly Disagree Agree Mean Level of Agreement Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Overall, the observed patterns of response are not surprising. LEOs generally either report to elected local officials or are elected themselves. The concerns of local officials about the influence of the federal government are well-known in many areas, not just election administration, and many may have resented the HAVA requirements that led to changes in long-used voting systems.17 Also, it is not surprising that LEOs have become more concerned about the roles of stakeholders such as the media, advocates, and political partisans, who are closely associated with the recent controversies about the reliability and security of voting systems. There has also been debate and uncertainty specifically about the role and influence of voting system manufacturers and vendors in the selection of voting systems by local jurisdictions. Some observers have argued that vendors have undue influence in what voting systems jurisdictions choose. Others believe that such concerns are unwarranted. But little has been known previously of how LEOs view vendors and their relationships with them. The results of the 2004 survey were mixed with respect to the importance of vendors. (These questions were not included in the 2006 survey.)18 LEOs in 2004 17 Many respondents commented that they should not have been required by the federal government to change voting systems or to add accessible ones. 18 Several questions in the 2004 survey were omitted in 2006 to make room for additional questions about election administration and the impacts of HAVA. Nevertheless, the 2006 (continued...) CRS-13 appeared to have high trust and confidence in vendors but did not rate them as being especially influential with respect to decisions about voting systems. Fewer than 10% believed that there was insufficient oversight of vendors by the federal government and states, but about one in six believed that local governments did not exercise enough oversight. Most jurisdictions using computer-assisted voting reported in 2004 that they had interacted with their voting-system vendors within the last four years.19 More than 90% of LEOs considered their voting system vendors responsive and the quality of their goods and services to be high.20 They felt equally strongly that the recommendations of those vendors could be trusted. However, about a fifth of respondents thought that vendors were willing to sacrifice security for greater profit, although 60% disagreed. Also, a quarter felt that vendors provide too many elements of election administration.21 When LEOs were asked in 2004 what sources of information they relied on with respect to voting systems, state election officials received the highest average rating, with about three-quarters of LEOs indicating that they rely on state officials a great deal. Next most important were other election officials, followed by the EAC and advocates for the disabled. About one-third of LEOs stated that they relied on vendors a great deal, a level similar to that for professional associations. 
Only 2% of LEOs rated vendors higher than any other source, whereas 20% rated state officials highest. Interest groups were rated lower than vendors, and political parties and media received the lowest ratings. When LEOs were asked in 2004 about the amount of influence different actors had on decisions about voting systems, the overall pattern of response was similar to that for information sources. Once again, state, local, and federal officials were judged the most influential,22 and political parties and the media the least, with vendors in between. An exception was that local nonelected officials were considered less influential on average than vendors. Both voters and advocates for the disabled were rated as more influential on average than vendors. No LEOs rated vendors as more influential than any other source. 18 (...continued) survey had more than twice as many questions as the 2004 instrument. 19 Not surprisingly, the lowest interaction (13% of LEOs) was in paper-ballot jurisdictions, and the highest was in optical scan and DRE jurisdictions (about 85%). 20 However, in the 2006 survey, about one in eight reported that vendors did not provide the expected level of support on election day (discussed later in this report). 21 This question explored the views of LEOs about the concern that some observers have raised that the range of services vendors provide in some jurisdictions may amount to a kind of privatization of election administration. 22 For this question, LEOs were also asked to rate their own influence, which received the highest average score. The question also asked about the influence of some other actors, such as courts and voters, and it listed elected and nonelected state and local officials but not election officials specifically, except the respondents themselves and the EAC. CRS-14 Those results contrast with the views of LEOs described above about whether the levels of influence of stakeholders are too little or too great (Figure 9). Of the three actors considered most influential, LEOs believed that local elected officials should have more influence and the federal government has too much, and they were neutral about state officials. They did not believe on average that those considered least influential should have more. Congress may find it useful to take these attitudes into account in conducting oversight of HAVA implementation and in considering additional election-reform legislation. Attitudes toward Voting Systems LEOs were highly satisfied with whatever voting system they were using but were less supportive of other kinds of systems. LEOs had strong opinions about the different kinds of voting systems used in the United States. Those whose jurisdiction used a particular kind of system, whatever it was, supported its use more strongly than any other system (see Figure 10).23 Thus, users of lever machines strongly supported their use, showed some support for the use of DREs, were neutral about optical scan systems, and were opposed to the use of punchcard and hand-counted paper ballot systems. In general, except for those using them, LEOs opposed the use of lever machines, punchcard systems, and paper ballots. Those views changed little across the two surveys. However, there was a slight but significant decrease in the level of support for DREs among users of optical scan and DRE systems. DREs were the only voting system for which support of users dropped between 2004 and 2006, although it still remained very high. 
It was not possible to determine if the change in support for users of DREs resulted from changes in the views of long-time users or from lower initial support among those who used DREs for the first time in the 2006 election. Satisfaction with the voting systems LEOs used declined from 2004 to 2006. Overall, and consistent with the above results, LEOs reported a high level of satisfaction with their voting systems and assessed that they performed very well during the most recent election. On a scale of 1-10, average ratings were 8 or higher for each of those questions in both surveys (Figure 11). However, ratings for satisfaction with and performance of optical scan and DRE systems were significantly lower in 2006. Ratings for performance were also lower for paper systems. There was no difference in ratings between years for lever machines in 23 For this question, LEOs were asked to rank how they felt about the use of different types of voting systems for elections in the United States, on a scale of 1 (strongly oppose) to 7 (strongly support). The types of voting systems listed were lever machines, punchcard systems, hand-counted paper ballots, central-count optical scan, precinct-count optical scan, DRE, Internet, and other. Only 10% of LEOs supported Internet voting, and since this type of system has not been used in public elections in the United States (except experimentally on occasion), it is not discussed further in this report. The category "other" is not discussed because the response rate was very low (<5%). CRS-15 satisfaction or performance.24 Figure 10. Support of LEOs for the Use of Different Kinds of Voting Systems, 2004 and 2006 Lever Machine Users Central Count OS Users Strongly Support Neutral Strongly Oppose Lever Punch Paper CCOS PCOS DRE Lever Punch Paper CCOS PCOS DRE Punchcard Users Precinct Count OS Users Strongly Support Neutral Strongly Oppose Lever Punch Paper CCOS PCOS DRE Lever Punch Paper CCOS PCOS DRE Hand-Counted Paper Ballot Users DRE Users Strongly Support Neutral Strongly Oppose Lever Punch Paper CCOS PCOS DRE Lever Punch Paper CCOS PCOS DRE 2004 Voting System Voting System 2006 Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Note: The X-axis variable (Voting System) is categorical. The data are presented as line, rather than bar, graphs purely as a visual aid to facilitate comparison. The lines do not denote any relationship among the categories. See note for Figure 7 for an explanation of types of voting systems. Each of the six graphs presents the views of LEOs who primarily use the particular kind of voting system denoted on the graph. Data on punchcard users is not presented for 2006 because only four LEOs reported using them. LEOs who used DREs and precinct-count optical scan systems were more satisfied with them in 2004 than LEOs who used lever machines, paper ballots, or central-count optical scan, but in 2006, there were no significant differences in satisfaction among users of different voting systems. However, users of PCOS systems were slightly more satisfied overall than users of either CCOS or DRE 24 Too few jurisdictions used punchcards in 2006 to permit meaningful statistical comparisons. CRS-16 systems.25 There were also no significant differences in rated performance of different voting systems in either 2004 or 2006, despite the striking difference between the two years. Figure 11. 
Overall Satisfaction of LEOs with Their Primary Voting System and with the Performance of the System in the 2004 and 2006 Elections
[Two-panel graph of mean ratings of overall satisfaction and of performance in the most recent election, by voting system (Lever, Punch, Paper, CCOS, PCOS, DRE), 2004 and 2006.]
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: LEOs were asked to rate overall satisfaction on a scale from 0 (not satisfied at all) to 10 (extremely satisfied), and performance from 0 (not well at all) to 10 (extremely well). Note that the scale on the graph is 7-10, not 0-10. The number of LEOs using punchcard systems in 2006 was too low to calculate meaningful error bars for that data point. See note for Figure 7 for an explanation of types of voting systems. See also note for Figure 10 on the use of line graphs.

To assess more directly how LEOs rated their own voting systems in 2006, they were asked whether their current system is the best available, and what voting system they believed is best overall. Almost 80% agreed with the statement that their current voting system is the best available, although the level of agreement was somewhat higher among optical scan and DRE users (Figure 12). The same percentage believed that their current voting system is the best overall, with a significantly higher percentage of PCOS users holding that view than users of other systems.

25 This conclusion is the result of a statistical comparison from a separate question and is not shown in the graph.

Figure 12. Average Levels of Agreement among LEOs That Their Current Voting System Is the Best Available, 2006
[Bar graph of mean agreement, on a scale from 1 to 7 with 4 as neutral, for users of Lever, Paper, CCOS, PCOS, and DRE systems; plotted means range from 5.1 to 5.9.]
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: LEOs were asked how strongly they agreed with the statement, "The voting system in my jurisdiction is the best available," on a scale from 1 (strongly disagree) to 7 (strongly agree). See note for Figure 7 for an explanation of types of voting systems. Data on punchcard users is not presented because only four LEOs reported using them in 2006.

LEOs rated their primary voting systems as very accurate, secure, reliable, and voter- and pollworker-friendly, no matter what voting system they used.

To further assess voting system preferences, both surveys asked LEOs to assess their primary voting systems on fifteen specific characteristics (Figure 13). The high ratings for accuracy, security, reliability, and usability changed little from 2004 to 2006. For other characteristics, there were substantial differences both among voting systems and between the two surveys. For most of those, LEOs were less happy with performance in 2006 than 2004, especially with respect to optical scan and DRE systems, which they rated lower for cost, size, storage requirements, and machine error in 2006 than 2004. Ratings for usability were also slightly lower, but those for multilingual capacity were higher. Optical scan systems, both central- and precinct-count, were rated higher for accessibility in 2006 than in 2004. The reasons for this change are not clear.26 All systems were rated lower for machine and voter error in 2006 -- LEOs switched from positive to fairly neutral about these performance characteristics.
26 The change seems surprising on its surface, because hand-marked optical scan ballots of either type are not accessible to persons with disabilities in the sense used in HAVA. However, at least one manufacturer has marketed an accessible ballot-marking machine. CRS-18 Figure 13. Characteristics of the Primary Voting System, 2004 and 2006 Multilingual Capacity Excellent Acquisition Costs Security Poor Excellent Maintenance Costs Sociodemographic Counting Speed Impact on Groups Poor Excellent Counting Accuracy Voter Usability Physical Size Poor Excellent Voter Accessibility Requirements Machine Error Storage Poor Excellent Pollworker Usability Voter Error Reliability 2004 2006 Poor Lever Punch Paper CCOS PCOS DRE Lever Punch Paper CCOS PCOS DRE Lever Punch Paper CCOS PCOS DRE Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Note: See note for Figure 7 for an explanation of types of voting systems. See also note for Figure 10. CRS-19 It was not surprising that DREs received the highest ratings of any system for accessibility and ability for use in multiple languages, or that hand-counted paper ballots were rated lowest for counting speed. Some of the comparisons among voting systems, however, did yield surprising results. The ratings for reliability, security, accuracy, and ease of use by voters were very high and were similar for all voting systems. Given media reports about problems with the reliability and security of electronic voting, somewhat different outcomes might have been expected -- namely, that DREs would have been rated lower in reliability and security. Also, given that modern DREs are often described as more voter-friendly than other systems, and certainly have the capability of providing higher levels of usability than other types, the lack of difference in ratings for usability is somewhat surprising. With respect to accuracy, a lower rating might have been expected for punchcards, given the difficulties with recounts that were prominent during the 2000 presidential election. It is possible that such confidence exists because few jurisdictions use punch cards now, and those that do have them declined to replace them after 2000. Those jurisdictions kept the system despite intense negative media coverage of system limitations and opted not to take part in the punchcard buyout program offered through the Help America Vote Act. The relative lack of difference in ratings of optical scan and DRE systems for acquisition and maintenance costs, and size and storage requirements, appears to run counter to widely held views. Many observers regard DREs as the most expensive voting systems, given that several machines may be needed for each polling place, whereas optical scan systems usually require one machine per polling place (PCOS) or none (CCOS). These differences from expectation suggest that LEOs' perceptions of how their voting systems perform may differ substantially in some ways from public perceptions about those systems. If the perceptions of election officials are accurate, then several of the criticisms leveled at specific voting systems could lead, if acted upon, to unnecessary and even counterproductive regulation and expenditure. For example, if in fact there is little difference in security between an optical scan system and a DRE, then requirements for paper trails may be unnecessary. 
If, however, LEOs' perceptions are inaccurate, then understanding and addressing the causes of those inaccuracies may be beneficial. Electronic Voting Much of the recent controversy about election reform has focused on electronic voting systems. Questions about the security and reliability of those systems were a relatively minor issue until 2003. Two factors led to a sharp increase in public concerns about them: (1) HAVA promoted the use of both PCOS and DREs through its provisions on preventing voter error and making voting systems accessible to persons with disabilities; and (2) the security vulnerabilities of electronic voting CRS-20 systems, especially DREs, were widely publicized as the result of several studies released in 2003.27 Both surveys asked several questions designed to elicit the views of LEOs about aspects of that controversy. When asked whether current federal and state guidelines and standards about electronic voting systems (both optical scan and DRE systems) are strict enough, most LEOs, about 60%, replied in the affirmative. Those who did not were fairly evenly split among officials who believed that the current standards are too strict and those who believed they are not strict enough. There was no significant difference in average assessment between users and nonusers of electronic voting systems, but nonusers were slightly more likely to believe that the standards are either too strict or not strict enough (Figure 14). Figure 14. Assessment by Users and Nonusers of Electronic Voting Systems of the Strictness of Standards for Those Systems, 2006 % of LEOs Choosing Level 60% 40% Nonusers Users 20% 0% Too strict Just strict Not strict enough enough Perceived Level of Strictness Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Note: LEOs were asked, "Do you believe that the state and federal standards for electronic voting systems are too strict or not strict enough?" using a scale from -5 (too strict) to 0 (just strict enough) to 5 (not strict enough). The three categories in the graph show the summed percentages who chose -5 to -1, 0, and 1 to 5, respectively. DRE users differed more from nonusers in their views about their voting system than optical scan users differed from nonusers. In both surveys, LEOs were asked to what extent they agreed with several statements about DRE and optical scan systems. In 2004 those questions were asked of all LEOs, but in 2006 they were asked only of those who used DREs and optical scan as their 27 See CRS Report RL33190, The Direct Recording Electronic Voting Machine (DRE) Controversy: FAQs and Misperceptions, by Eric A. Fischer and Kevin J. Coleman. CRS-21 primary voting systems. Also, two questions asked in 2004 were not asked in 2006 (See Figures 15 and 16). Figure 15. 
Views of DRE Users and Nonusers about DREs I understand how DREs operate I have adequate information on DREs to assess whether they are a good choice for my jurisdiction I consider certification procedures by NASED and the EAC to be adequate I consider state certification procedures to be adequate Any security concerns about DREs can be adequately addressed by good security procedures DRE software is vulnerable to being hacked DREs are more vulnerable to tampering than other types of voting systems DRE software is vulnerable to viruses and other malicious software DRE software should be available for public inspection (an open-source approach) The public should have greater trust in DREs I follow news regarding DREs in the media The media reports too many criticisms of DREs Strongly Strongly Disagree Agree Level of Agreement Nonusers, 2004 DRE Users, 2004 DRE Users, 2006 Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Note: See text for explanation of the question. CRS-22 Figure 16. Views of Users and Nonusers of Optical Scan (OS) Voting Systems about OS Systems I understand how OS voting systems operate I have adequate information on OS systems to assess whether they are a good choice for my jurisdiction I consider NASED certification procedures to be adequate I consider state certification procedures to be adequate Any security concerns about OS systems can be adequately addressed by good security procedures OS systems are vulnerable to being hacked OS systems are more vulnerable to tampering than other types OS system software is vulnerable to viruses and other malicious software OS system software should be available for public inspection (an open-source approach) The public should have greater trust in OS systems I follow news regarding OS systems in the media The media reports too many criticisms of OS systems Strongly Strongly Disagree Agree Level of Agreement Nonusers, 2004 OS Users, 2004 OS Users, 2006 Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Note: See text for explanation of the question. CRS-23 Not surprisingly, the opinions of nonusers of either kind of system were generally less strong than those of users. Nonusers were neutral on average with respect to several statements about DREs, including their level of knowledge about the systems, vulnerabilities to tampering, and the need for more public trust. LEOs whose primary voting systems were precinct-count optical scan were more neutral about DREs than were users of other voting systems.28 Users of DREs, in contrast, generally agreed that they had sufficient knowledge about the voting system, that certification procedures were adequate, that DREs are not vulnerable to tampering and security concerns can be addressed with good procedures, that the public should have greater trust in DREs, and that the media report too many criticisms of that voting system. Those views were similar in both surveys. Nonusers were less neutral about optical scan (OS) systems, but users nevertheless held stronger views than nonusers about these systems, except for the statement about media criticism, about which both users and nonusers were neutral on average. 
LEOs whose primary voting systems were DREs were less neutral about OS systems than users of other voting systems.29 The controversy about the security and reliability of DREs has led to widespread calls for the adoption of a paper trail of the ballot choices that a voter can verify before casting the ballot. These paper trails, printed as separate ballot records that the voter can examine, are usually called voter-verified paper audit trails, or VVPAT. LEOs whose primary voting system is a DRE were asked several questions in both surveys about VVPAT. The percentage who used them doubled to 36% in 2006, from 18% in 2004 . About one-third of LEOs whose jurisdictions used DREs as their primary voting system stated that voters who did not wish to use a DRE had the option of using a paper ballot instead. However, it was not possible to determine which of those jurisdictions permitted that choice in the polling place rather than through the use of "no excuse" absentee balloting.30 Most DRE users did not believe that VVPAT should be required, but nonusers believed they should be. In the 2006 survey, only DRE users were asked if VVPAT should be required. However, in the 2004 survey, both users and nonusers were asked. Among DRE users, only 14% supported such a requirement, whereas among nonusers 68% did (Figure 17). The percentage of DRE users who believed that VVPAT should be used increased in 2006. In 2004, 47% of respondents strongly disagreed, and only 5% strongly agreed that DREs should produce a VVPAT, while in 2006 the numbers were 36% strongly disagreeing and 12% strongly agreeing (Figure 18). 28 This conclusion is the result of a statistical comparison of responses from users of all voting systems in 2004 and is not shown in Figure 15. 29 This conclusion is the result of a statistical comparison of responses from users of all voting systems in 2004 and is not shown in Figure 16. 30 States increasingly offer absentee ballots to any voter requesting them, rather than requiring a reason such as disability or absence from the jurisdiction on election day. CRS-24 Figure 17. Support for VVPAT among Users and Nonusers of DREs, 2004 50% 40% % of LEOs 30% DRE Users Nonusers 20% 10% 0% Strongly Strongly Disagree Agree DREs Should Have VVPAT Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Figure 18. Attitudes among DRE Users about Whether DREs Should Produce VVPATs, 2004 and 2006 50% % of LEOs Choosing Level 40% 30% 2004 2006 20% 10% 0% Strongly Strongly Disagree Agree Level of Agreement/Disagreement Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. In 2006, LEOs were also asked if they would be willing to use a VVPAT if reimbursed for the costs by the federal government, and 57% answered in the affirmative. However, even those respondents (DRE users and nonusers) who CRS-25 expressed support for VVPAT were generally willing (65%) to spend only $300 or less for the feature. LEOs were asked to choose one or more of several reasons for disagreeing or agreeing that DREs should produce a VVPAT (Figure 19). The most frequent reasons chosen were the risk of printer failure, the complexity of implementation, and risks to voter privacy. Among the choices available in both surveys, LEOs were more concerned in 2006 about costs and the risk of printer failure, and less concerned about the risk of tampering with the VVPAT. Figure 19. 
Reasons Chosen by LEOs for Disagreeing or Agreeing That DREs Should Print a VVPAT
(Figure: two panels -- reasons for disagreeing and reasons for agreeing -- showing the percentage of LEOs choosing each reason in 2004 and 2006; the reasons include complexity/time consumption, risk to voter privacy, risk of printer failure, risk of tampering, cost of paper, size of the VVPAT, improved voter confidence, accuracy and integrity, need for recounts, and serving as a check on the DRE.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Most VVPAT users in 2006 were satisfied with them. About three-quarters of LEOs who used a VVPAT were somewhat to very satisfied with it. However, about one-fifth were dissatisfied. More than four-fifths of LEOs had confidence in their accuracy, with fewer than one-tenth expressing concerns. More than two-thirds thought that voters reacted positively to them, but about one-quarter thought that voters were neutral (Figure 20).

Figure 20. Reactions to VVPAT by Users, 2006
(Figure: three panels showing the percentage of LEOs choosing each rating -- satisfaction, from not satisfied at all to extremely satisfied; confidence in accuracy, from not confident at all to extremely confident; and assessment of voter reaction, from very displeased to very pleased.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

The Help America Vote Act (HAVA): Impacts and Attitudes

Most LEOs, about 90%, considered themselves familiar with and knowledgeable about HAVA's requirements in both surveys. The level of familiarity increased from 2004, when about 20% considered themselves "very familiar" with the law, to 2006, when almost 40% were very familiar. Those who were "not familiar at all" with HAVA decreased from 4% in 2004 to 0.1% in 2006. About 90% of respondents believed that almost all jurisdictions in their state were in full compliance with HAVA provisions in 2006.

Figure 21. Assessment by LEOs of Whether HAVA Is Improving the Election Process in Their Jurisdictions
(Figure: percentage of LEOs choosing each level of improvement, from no improvement to major improvement, in 2004 and 2006.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

LEOs believed that HAVA is making moderate improvements in the electoral process overall in their jurisdictions. However, more LEOs believed that the law resulted in no improvements than in major improvements, and the level of support was lower in 2006 than in 2004 (Figure 21). Most LEOs regarded the major provisions of HAVA as advantageous, although the level of support varied both among the provisions and between the two surveys. LEOs were most supportive of federal funding and least supportive of the requirement for provisional voting and the creation of the Election Assistance Commission (Figure 22). However, provisional voting received substantially higher negative ratings than any other provision in both surveys (Table 4).
Figure 22. Assessment of HAVA Provisions as Advantage or Disadvantage
(Figure: mean rating by LEOs, from disadvantage to advantage, in 2004 and 2006, of the provision of federal funds to states; facilitating participation for military or overseas voters; requirements for centralized voter registration; requirements for voter-error correction; provision of information for voters; the process for certification of voting systems; codification of voting system standards in law; requirements for disabled access to voting systems; identification requirements for certain first-time voters; the state matching requirement for federal funds; creation of the Election Assistance Commission; and the requirement for provisional voting.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

The level of support for HAVA, while positive, declined among LEOs from 2004 to 2006. While remaining positive overall, the level of support declined for all provisions except the voter registration and identification requirements, which were unchanged, and provisional voting, where support in 2006 was higher than in 2004. This was the only provision for which the percentage of negative ratings declined between the two surveys. The steepest decline in support was for the state matching-fund requirement. The decline in support for HAVA from 2004 did not result from a change in the perceived difficulty of implementation. In general, LEOs reported in both surveys that implementation of HAVA provisions was moderately difficult (Figure 23).

Table 4. Assessment by LEOs of Advantageousness of HAVA Provisions in 2004 and 2006
(Percentage of LEOs choosing each assessment)

                                                         Advantage           Neutral          Disadvantage
HAVA Provision                                        2004  2006    Δ     2004  2006   Δ     2004  2006   Δ
Provision of federal funds to states                   90    81    -9       6    12    6       4     7    3
Facilitating participation for military or
  overseas voters                                      82    72   -10      11    18    7       7    10    3
Requirements for centralized voter registration        71    70    -1      16    17    1      13    13    0
Requirements for voter-error correction                78    68   -10      13    22    9       8    11    3
Provision of information for voters                    79    67   -12      15    25   10       5     8    3
Process for certification of voting systems            79    67   -12      15    21    6       7    13    6
Codification of voting system standards in law         74    64   -10      19    25    6       8    11    3
Requirements for disabled access to voting systems     76    64   -12      13    18    5      11    17    6
Identification requirements for certain
  first-time voters                                    68    64    -4      16    20    4      16    16    0
State matching requirement for federal funds           74    57   -17      14    24   10      12    20    8
Creation of the Election Assistance Commission         62    48   -14      23    31    8      15    21    6
Requirement for provisional voting                     49    51     2      17    20    3      35    30   -5

Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: Δ = change from 2004 to 2006. LEOs were asked to rate the provisions on a scale of 1 (disadvantage) to 7 (advantage). Entries for the Advantage column include respondents who chose 5-7, for the Neutral column 4, and for the Disadvantage column 1-3.
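The note's binning rule can be made concrete with a short sketch. The following Python snippet is purely illustrative -- the function name and the sample ratings are hypothetical, not taken from the survey data -- and simply applies the 1-3 / 4 / 5-7 categorization described above.

```python
from collections import Counter

def summarize_ratings(ratings):
    """Bin 1-7 advantage ratings the way Table 4 does:
    1-3 = disadvantage, 4 = neutral, 5-7 = advantage."""
    def bucket(r):
        if r <= 3:
            return "disadvantage"
        if r == 4:
            return "neutral"
        return "advantage"

    counts = Counter(bucket(r) for r in ratings)
    n = len(ratings)
    return {k: round(100 * counts[k] / n)
            for k in ("advantage", "neutral", "disadvantage")}

# Hypothetical responses for one provision, not actual survey data.
sample = [7, 6, 5, 4, 4, 3, 6, 7, 2, 5]
print(summarize_ratings(sample))
# {'advantage': 60, 'neutral': 20, 'disadvantage': 20}
```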
The perceived difficulty of implementing most HAVA provisions declined from 2004 to 2006. The level of difficulty declined for all but two provisions:31 The assessed level of difficulty increased for the process for certification of voting systems, and there was no significant change in perception about the difficulty of implementing provisions to facilitate participation by military and overseas voters.

31 This conclusion holds despite a small inadvertent change in this question between the two surveys. In 2004, LEOs were asked to rate the difficulty on a scale of 0 (not difficult at all) to 10 (extremely difficult). In 2006, the scale began at 1. However, that change should have caused a slight increase, not a decrease, in the scores -- the opposite of the observed change for all but the two items discussed in the text.

Figure 23. Perceived Level of Difficulty by LEOs in Implementing HAVA Provisions
(Figure: mean level of difficulty, from not difficult at all to extremely difficult, reported in 2004 and 2006 for requirements for disabled access to voting systems, requirements for centralized voter registration, the process for certification of voting systems, the requirement for provisional voting, facilitating participation for military or overseas voters, requirements for voter-error corrections, identification requirements for certain first-time voters, and provision of information for voters.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

The comparatively large drop in support for the state matching-fund requirement suggests that the decrease in support for HAVA provisions overall in 2006 may have resulted in part from perceptions about costs and funding. Their importance is also supported by the responses to three questions in the 2006 survey:

! How has HAVA affected the cost of elections in your jurisdiction?
! To what degree is the funding your jurisdiction has received to implement HAVA requirements sufficient for their implementation?
! How concerned are you that limited funding in the future will leave you unable to comply with HAVA requirements for election administration?

The results are presented in Figure 24. Most LEOs reported that HAVA has increased the cost of elections, and they are concerned about future funding. About 90% of respondents believed that HAVA has increased the cost of elections, and only 2% believed the costs have decreased. LEOs were fairly evenly divided on whether current funding is sufficient to implement the requirements, but most expressed concerns about the sufficiency of future funding, with 30% stating that they were "extremely concerned."

LEOs reported that HAVA has increased the accessibility of voting but has made elections more complicated to administer. LEOs were also asked in 2006 to respond to a set of statements about the impacts of HAVA (Figure 25). While agreeing on average that HAVA has made elections more accessible for voters, they disagreed that the law has made elections fairer or more reliable. They did not believe that HAVA requirements are inconsistent with state requirements, but they strongly believed that the law has made elections more complex to administer. As Table 5 shows, with the exception of the statement on complexity of elections, responses were fairly evenly distributed, with about one-quarter to one-third of respondents expressing a neutral position.

Table 5. Distribution of Responses of LEOs to Statements about the Impacts of HAVA
(Percentage of LEOs who disagreed, were neutral, or agreed)

Statement                                                        Disagreed   Were Neutral   Agreed
HAVA has made elections more accessible for voters                  26%          23%         51%
HAVA has made elections more fair                                   40%          31%         30%
HAVA has made elections more complex to administer                   7%           8%         85%
HAVA has made elections more reliable                               42%          28%         29%
HAVA requirements are not consistent with state requirements        44%          33%         23%

Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: LEOs were asked to rate their level of agreement or disagreement on a scale of 1 (strongly disagree) to 4 (neutral) to 7 (strongly agree). Entries for the Agreed column include respondents who chose 5-7, for the Were Neutral column 4, and for the Disagreed column 1-3.

Figure 24. Response of LEOs to Questions about Funding Effects of HAVA
(Figure: three panels showing the percentage of LEOs choosing each response -- How has HAVA changed the cost of elections? (decreased, the same, increased); Is funding sufficient to implement HAVA requirements? (not sufficient at all to entirely sufficient); How concerned are you that funding limitations will prevent compliance? (not concerned at all to extremely concerned).)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Figure 25. Reactions of LEOs to Statements about the Impacts of HAVA, 2006
(Figure: mean level of agreement or disagreement, from strongly disagree to strongly agree, with the statements that HAVA has made elections more accessible for voters, more fair, more complex to administer, and more reliable, and that HAVA requirements are not consistent with state requirements.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Election Assistance Commission

When HAVA created the Election Assistance Commission, the law gave it several specific responsibilities. The EAC carries out grant programs, provides for voluntary testing and certification of voting systems, studies election issues, and issues voluntary guidelines for voting systems and guidance for the requirements in the act. The EAC has no rule-making authority (other than very limited authority under the National Voter Registration Act, the "motor-voter" law, P.L. 103-31) and does not enforce HAVA requirements. In the 2006 survey, LEOs were asked about the EAC's responsibilities, helpfulness, and benefits. They were asked to rank the importance of the following four EAC responsibilities:

! Provide guidance to local election officials,
! Research issues related to election administration,
! Certify voting systems, and
! Ensure that local jurisdictions are in compliance with federal law.

Most LEOs found the activities of the EAC only moderately beneficial to them. The results are presented in Figure 26. LEOs regarded guidance to them as the most important of the listed responsibilities and ensuring compliance by them as the least. Research and certification were rated in the middle, and the ratings for them did not differ significantly. However, more than 60% of LEOs reported that the EAC had not helped them understand or perform their duties during the preceding year. About 6% found the EAC to be "extremely helpful" to them overall (Figure 27), whereas 13% found the agency "not helpful at all."

Figure 26. Perceived Importance by LEOs of Selected EAC Responsibilities, 2006
(Figure: percentage of LEOs choosing each importance rating, from most essential to least essential, for guidance, research, certification, and compliance.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Figure 27. Perceived Overall Helpfulness of the EAC to LEOs, 2006
(Figure: percentage of LEOs choosing each rating, from not helpful at all to extremely helpful.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
LEOs were also asked how they had benefitted from the four functions listed above plus the distribution of federal funds for use by local jurisdictions. The ratings (Figure 28) generally reflect the pattern seen in the responses on overall helpfulness. On average, LEOs responded that they had benefitted only moderately overall. However, while they considered guidance as the most important responsibility, they rated it lowest in benefit, along with compliance, which they regarded as the least important responsibility. About a quarter rated EAC guidance as "not beneficial at all," with about 7% rating it "extremely beneficial." Perceived benefits from research and certification were somewhat higher, and funding, not surprisingly, was rated highest. Figure 28. Perceived Degree of Benefit to LEOs from EAC Functions, 2006 Guidance Research Certification Compliance Funding Not Beneficial Extremely at All Beneficial Mean Rating Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. The discrepancy in the ratings for EAC guidance have several possible explanations. For example, it could reflect frustration with the delays in start-up of the EAC and consequently in the issuance of guidance. It could reflect difficulties in understanding the guidance that was issued. It might reflect the fact that the purpose of the guidance is to assist states, not local jurisdictions, in meeting the title III requirements (§311(a)). Or it could simply be an expression of opposition to or uncertainty about the requirements themselves. Individual comments from LEOs suggest a diversity of views: - A clear and concise plan needs to be formulated as to what the EAC must do and definite timelines attached to the responsibilities. - Rating this committee is somewhat unfair; once finally appointed, funding was delayed; they really haven't had an opportunity to function in the capacity anticipated. - All I have received from them have been brochures that come too close to an election to be of any real use. - The EAC's information on their website can be very helpful. - At the local level we only deal with the Secretary of State and not with the EAC. CRS-36 - EAC commissioners and staff are very well aware of their situation and environment. I work closely with them on a regular basis and know they are doing the best they can, as a federal agency with no enforcement powers.... - Exempt cities or other entities with less than 2,000 voters from the very expensive HAVA equipment requirements. - Get rid of it. Elections...should be free of federal control. - I believe they need more power to correct election problems. Voter Registration Database HAVA required each state to implement a statewide, computerized voter registration list before the 2006 election. A few states were unable to meet that deadline, and that is reflected in the survey, with 6% of respondents indicating that their states had not yet met the requirement. Most LEOs were familiar with their state's database, with about a third assessing themselves as "very familiar." Given the concerns expressed in the first survey about the burdens of HAVA implementation, the second survey asked LEOs whether the implementation of the computerized list had required the hiring of additional staff in the local jurisdiction. Four-fifths responded that it had not. Those that did hire additional staff were asked to identify all sources of funds. 
More than three-quarters received funding from local governments (Figure 29), with about 70% receiving only local funding. Figure 29. Sources of Funds Reported by LEOs for Additional Local Staffing for the Voter Registration Database Required by HAVA, 2006 100% % of Jurisdictions 75% 50% 25% 0% Federal State Local Non-Govt Source Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Note: There were 234 jurisdictions that reported requiring additional staffing. LEOs were neutral on average about the impacts of the requirement for a statewide voter-registration database. To explore perceptions about the effectiveness of the computerized statewide voter registration database, LEOs were asked about security, contingency plans in case of failure on CRS-37 election day, and agreement or disagreement with a series of statements.32 Respondents were very confident about both security and contingency plans. Figure 30. Agreement/Disagreement of LEOs with Statements about the Voter Registration Database, 2006 The voter register or electronic poll book could be moved from the polling station by an unauthorized person The voter registration database could be accessed by an unauthorized person Agree It is difficult to match driver's license and social security numbers with the new voter registration database The new voter registration database is more difficult to update than the previous record keeping system Because of the centralized, computerized voter registration database, fewer provisional ballots will be needed Identity theft is more of a risk with a centralized, computerized voter registration database Voters could be inadvertently removed from the voter registration database Neutral The new voter registration database is more accurate than the previous record keeping system The new voter registration database represents a significant improvement over the previous record keeping system. The centralized, computerized voter registration database makes elections more fair. The voter registration database places a heavy administrative burden on local governments Disagree A power outage on Election Day would compromise election administrators' ability to access the voter registration database. Strongly Strongly Disagree Agree Mean Level of Agreement/Disagreement Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Note: The graph is divided into three segments: statements with which LEOs agreed on average, those on which they were neutral, and those with which they disagreed. Grouping was based on statistical analysis (see Appendix). 32 The number of LEOs who responded to these questions was unusually small, because of an error in the survey instrument that caused most respondents to this question to be only those who answered the staffing question in the affirmative -- about 250 respondents. Therefore, additional caution in interpreting the significance of these answers is warranted. CRS-38 The responses to the statements (Figure 30), however, appear to conflict with the responses to the question on security, in that most LEOs agreed that an unauthorized person could remove the register from the polling station and access the database, although they were neutral about the risk of identity theft. 
LEOs also expressed concerns about matching drivers' licenses and Social Security numbers, and the difficulty of updating records in the new system, but they did not believe that the system places a heavy burden on local governments overall. They were neutral about whether the new systems would improve the election process. Voter Identification Issues relating to voter identification have been controversial.33 HAVA requires that first-time voters who register by mail must present a specified form of identification, either when registering or when voting. It does not require photo identification, although a few states have such requirements, and many states require some form of identification document.34 The kinds of identification accepted for all voters to register and to vote, as reported by respondents, is shown in Table 6. About one quarter of LEOs reported no identification requirement whatsoever, and about one-third stated that signature comparison or personal information was sufficient. LEOs supported requiring photo identification for all voters, even though they believed it will negatively affect turnout and did not believe that voter fraud is a serious problem in their jurisdictions. One of the principal policy35 arguments for tightening voter-identification requirements is concern about the risk of significant levels of voting by ineligible voters. Opponents counter that those risks are small and that requiring identification, especially photo IDs, would effectively disenfranchise eligible voters who would have difficulty obtaining such documents. To help determine the views of LEOs about this issue, the 2006 survey asked several additional questions about voter identification: ! As a local election official, how supportive are you of requiring all voters in your jurisdiction to provide valid photo identification? ! How often do non-eligible persons attempt to vote in your jurisdiction, either in person or by absentee ballot? ! Do you agree or disagree that deliberate voter fraud is a serious problem in your jurisdiction? 33 For more information on this issue, see CRS Report RS22505, Voter Identification and Citizenship Requirements: Overview and Issues, by Kevin J. Coleman and Eric A. Fischer. 34 See, for example, electionline.org, "Voter ID Laws," September 18, 2007, [http://www.electionline.org/Default.aspx?tabid=364]. 35 Some observers also believe that views about voter identification are also influenced by nonpolicy considerations such as perspectives relating to partisan advantage from different kinds of requirements -- that some kinds of requirement may be thought to suppress turnout disproportionately with respect to the political party affiliation of voters. CRS-39 ! Do you believe that requiring photo identification of all voters would make elections more secure, less secure, or have no impact on election security? ! Do you believe that asking for photo identification of all voters would increase turnout, decrease turnout, or have no impact on turnout? Table 6. 
Percentages of Jurisdictions Accepting Different Forms of Identification for Registration and Voting for All Voters, 2006

Kind of Identification                                          Percentage of Jurisdictions
                                                                Registration      Voting
Government issued photo identification                               60             48
Other government documents that show the name
  and address of the voter                                            45             38
Current utility bill                                                  48             33
Bank statement                                                        34             22
Government check                                                      28             21
None                                                                  21             27
Other proof of address                                                31             16
Paycheck                                                              26             17
Signature comparison                                                 n/a             33
Personal information (address, date of birth, etc.)                 n/a             30
Other                                                                 26             10

Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: n/a means that this option was not available to respondents as a separate choice. Other includes such alternatives as identification numbers (e.g., driver's license, Social Security), birth certificates, attestation, and voter registration cards (for voting). Total percentages do not add to 100 because LEOs were asked to check all forms of identification accepted.

The results are presented in Figure 31. On average, LEOs mildly supported a requirement for photo identification. However, 29% of respondents chose "extremely supportive," 12% "do not support at all," and the choices of the other 60% were spread across the scale of possible responses. Two-thirds also believed that requiring such identification would make elections more secure. These views do not, however, appear to be based on concerns about ineligible voters or voter fraud, which few believe are problems in their jurisdictions. In addition, 41% believed that requiring photo IDs would depress turnout, while 56%, almost all the rest, believed it would have no impact.

The causes of this apparent discrepancy are unclear. It is possible that, however low the risk of fraud, LEOs believe reducing it outweighs any negative impact on turnout. There might also be other reasons that the survey did not explore. In any case, the range of perspectives in the responses to the questions shows that the controversy is not settled, even among local election officials.36

36 For more information on this issue, see CRS Report RS22505, Voter Identification and Citizenship Requirements: Overview and Issues, by Kevin J. Coleman and Eric A. Fischer.

Figure 31. Frequency Distributions of Responses by LEOs to Questions about Voter Identification
(Figure: five panels showing the distribution of responses -- support for a photo ID requirement, from do not support at all to extremely supportive, with the mean response marked; frequency of voting attempts by ineligible voters, from not at all often to very often; agreement that voter fraud is a serious problem in the jurisdiction, from strongly disagree to strongly agree; predicted impact of photo ID on election security (less secure, no impact, more secure); and predicted impact of photo ID on election turnout (decreased turnout, no impact, increased turnout).)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Election Administration Issues

2006 Election

LEOs spent much more time preparing for the election in 2006 than for the one in 2004.
The 2006 election was the first in which all HAVA requirements were in effect.37 Consistent with the perception of LEOs that HAVA has made elections more complex to administer (Figure 25), three-quarters found that they spent more time preparing for the 2006 election than for the 2004 election, with 40% spending much more time. This perception was supported by comparing the number of hours per week LEOs reported spending on election duties in the 2004 and 2006 surveys. On average, the time spent increased 15%, from 21 to 24 hours. In 2006, LEOs also stated that they worked an additional 20 hours per week in the month before the election. This difference may be especially significant given that 2006 was not a presidential election year, which would have entailed the additional work required for that contest.

In addition, there were prominent issues of concern in 2006, such as voting-system malfunctions and problems with pollworkers, vendors, long lines, media coverage, and timely and accurate reporting of results. The survey therefore presented a list of 16 potential problems and other events and asked LEOs to indicate which, if any, had occurred. The results are presented in Table 7 and Figure 32.

The most commonly reported incident in the 2006 election was malfunction of a DRE or optical scan system. Not surprisingly, this was most commonly reported by LEOs using DREs as the primary voting system (Figure 32), but the differences were relatively small. Among DRE users, 53% reported that at least one repairable malfunction occurred, and 12% that at least one malfunction occurred that could not be repaired. More such machines would be used on average in jurisdictions where DREs are the primary voting system (as opposed to those where only one is used per polling place to meet the HAVA accessibility requirement). Therefore, the chance of at least one malfunction would be expected to be higher on average than in jurisdictions using another kind of primary system, such as precinct-count optical scan, where typically only one OS machine is used in a precinct.38 However, if DREs had lower failure rates per machine than optical scan systems, the difference would be correspondingly lower.39

37 One HAVA requirement (§301(a)(3)(C)) went into effect January 1, 2007, but it applies only to voting systems purchased with funds made available under title II after that date.

38 The survey asked LEOs to indicate only whether a particular event had occurred, not how many times. So if a DRE and a precinct-count optical scan system have similar failure rates, then a jurisdiction using 1 DRE and 1 OS unit per polling place will probably have a lower incidence of failures than a jurisdiction that uses 10 DRE units per polling place. If the rate of failure per unit is 5%, the polling place using 1 OS and 1 DRE would have a 10% chance that at least one unit would fail, and the polling place using 10 DREs would have a 40% chance.

39 For example, if the failure rate for DREs were 1% and that for OS 5%, a polling place using 1 OS and 1 DRE would have a 6% chance that at least one unit would fail, and the polling place using 10 DREs would have a 10% chance.
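The arithmetic in footnotes 38 and 39 follows from assuming that each machine fails independently at its per-unit rate. The short Python sketch below simply reproduces those illustrative figures; the failure rates are the hypothetical ones used in the footnotes, not estimates from the survey.

```python
def p_at_least_one_failure(failure_rates):
    """Probability that at least one machine fails, assuming each machine
    fails independently with the given per-unit failure rate."""
    p_none = 1.0
    for p in failure_rates:
        p_none *= (1.0 - p)
    return 1.0 - p_none

# Footnote 38: equal 5% failure rates per unit.
print(p_at_least_one_failure([0.05, 0.05]))   # ~0.10 for 1 OS + 1 DRE
print(p_at_least_one_failure([0.05] * 10))    # ~0.40 for 10 DREs

# Footnote 39: 1% for DREs, 5% for OS.
print(p_at_least_one_failure([0.05, 0.01]))   # ~0.06 for 1 OS + 1 DRE
print(p_at_least_one_failure([0.01] * 10))    # ~0.10 for 10 DREs
```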
Table 7. Percentage of LEOs Reporting Various Events in Their Jurisdictions on Election Day 2006

Event                                                                    %
Repairable electronic voting system malfunction                         43
Unrepairable electronic voting system malfunction                       11
Electronic voting system was hacked                                      0
Vendors did not provide the support expected                            13
Insufficient supply of paper ballots                                     3
Excessively long lines                                                  12
Polling places failed to accurately report election results              2
Polling places failed to report election results in a timely manner      4
Central office failed to report election results in a timely manner      3
Unfair media coverage of election administration                        10
Poll workers did not understand their jobs                              21
Poll workers did not report for duty                                    10
A close race (2-3% margin of victory)                                   23
A race resulting in an election recount                                 19
A race resulting in a legal challenge                                    2
Deliberate election fraud                                                1

Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: The percentages in this table are based on the total number of respondents who reported the kind of voting system they used (1,360). This base was chosen because it seemed most likely to reflect the number of respondents who considered the question. The percentages would have been different if another denominator had been used: (1) if the number of respondents to this question (1,029) were used, the percentages would have been higher, but those results would be overestimates of the true percentage, since LEOs who had no problems at all would not have responded to the question (the question did not have an option for LEOs to check if they had no problems whatever); (2) if the total number of LEOs responding to the survey were used, the percentages would have been lower, but those results would have been underestimates, since the denominator would likely have included LEOs who had problems but skipped the question. For example, under alternative (1), the estimates would be higher by a factor of 1.3 (e.g., 57% rather than 43% for the first event), and under (2), lower by a factor of 0.9 (39%). However, the effects of such changes on the significance of the results are negligible.
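As a rough illustration of the denominator sensitivity described in the note above, the following sketch recomputes the first entry under alternative (1); the count of affirmative reports is back-calculated from the published 43% of 1,360, so it is an approximation rather than a figure taken from the survey data.

```python
# Hypothetical re-basing of Table 7's first entry; the affirmative count is
# inferred from the published 43% and 1,360, so it is approximate.
reports = round(0.43 * 1360)          # ~585 LEOs reporting a repairable malfunction

base_reported_system = 1360            # respondents who reported their voting system
base_answered_question = 1029          # respondents who answered the incident question

print(reports / base_reported_system)      # ~0.43, the published figure
print(reports / base_answered_question)    # ~0.57, alternative (1) in the note
print(base_reported_system / base_answered_question)  # ~1.32, the "factor of 1.3"
```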
In fact, the incidence of such occurrences was almost equally as high for users of both precinct- and central-count optical scan systems as their primary systems (47% and 36%, respectively, for repairable malfunctions, and 12% and 15% for unrepairable ones). In comparison, the reported failure rates in jurisdictions using lever machines and paper ballots were much lower (9% and 10% for repairable malfunctions, and 5% and 6% for unrepairable ones). About one in seven users of optical scan systems and DREs as their primary systems were disappointed in the level of support provided by vendors. Those LEOs were twice as likely to have experienced unrepairable malfunctions of their voting systems as LEOs who were not disappointed with vendor support.

Figure 32. Percentage of LEOs Reporting Various Occurrences in Their Jurisdictions on Election Day 2006, by Primary Voting System
(Figure: percentage of LEOs reporting each of the following events, shown separately for lever machine, paper (hand-counted), central-count optical scan, precinct-count optical scan, and DRE jurisdictions: repairable electronic voting system malfunction; unrepairable electronic voting system malfunction; electronic voting system was hacked; vendors did not provide the support expected; insufficient supply of paper ballots; excessively long lines; polling places failed to accurately report election results; polling places failed to report election results in a timely manner; central office failed to report election results in a timely manner; unfair media coverage of election administration; poll workers did not understand their jobs.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: Only one LEO reported that an electronic voting system was hacked (see text).

The reports of malfunctions of electronic voting systems by users of lever machines and hand-counted paper ballots may seem puzzling. However, many of those jurisdictions use DREs to meet HAVA accessibility requirements, and lever-machine jurisdictions may also use CCOS to process absentee ballots. The results suggest that current optical scan systems may not be significantly more reliable than DREs. They also contrast strikingly with the uniformly high ratings all users gave for the reliability of their voting systems (see Figure 13 above). LEOs did not appear to assess the malfunctions as being the result of tampering. In fact, only one reported a system being hacked, and that was a precinct-count optical scan user.40

The incidence of long lines at the polling place was highest in jurisdictions using DREs. Another notable result was the fairly high incidence of LEOs, 12%, who reported excessively long lines at the polling place. The prevalence was much higher in jurisdictions using DREs primarily, occurring in about one-quarter of them. In those using other kinds of voting systems, long lines occurred in only about 6% (Figure 32). Jurisdictions using DREs also reported more unfair media coverage (19%) than users of other systems (6% on average).

The incidence of problems with accurate and timely reporting of election results was low and did not differ among users of the different kinds of voting systems. Reports of deliberate election fraud of any kind were also few -- 8 LEOs, one out of every 170 jurisdictions or 0.75%. Such a low rate might nevertheless be considered unacceptably high, depending on such factors as the seriousness of the offense, the impact of such attempts at fraud on the election, and the degree to which election officials are able to detect all such attempts.

LEOs noticed no change on average in residual votes (overvotes plus undervotes plus spoiled ballots) from 2004 to 2006. About 60% reported no change, and about 20% each reported an increase or a decrease. This result suggests that the decreased confidence LEOs had in 2006 in the ability of voting systems to reduce voter error was not a result of a noticeable increase in such error. Alternatively, the decrease in confidence might have resulted from sources such as changes in media coverage of voting-system problems.

The number of provisional ballots used varied greatly among jurisdictions in 2006. About 30% of that variability is explainable by the number of voters in the jurisdiction.
Thus, jurisdictions with fewer than 1,000 registered voters used about 10 provisional ballots on average and those with more than 100,000 voters used 1,500. Across all jurisdictions, one provisional ballot was used for every 140 registered voters on average. About a quarter of jurisdictions, mostly small, used no provisional ballots, and about 4% used more than 1,000, with a maximum of 15,000 in a jurisdiction with about half a million voters. When asked whether these ballots were easier to use than in 2004, about three-quarters of LEOs reported no change, but more found them easier (16%) than harder (9%) to use in 2006. Three-quarters of jurisdictions used optical scan systems for absentee ballots, and most of the rest used hand-counted paper ballots. More than half of respondents 40 Since many such users also use DREs to meet the HAVA accessibility requirements, it was not possible to determine whether it was an optical scan system or a DRE that the LEO assessed as having been hacked. CRS-45 indicated that their jurisdictions offered early voting. About a third each of those offering it used optical scan, a third DREs, and under 10% hand-counted paper ballots. The rate of absentee voting has been increasing nationally over the last several elections, as the number of states offering early and "no excuse" absentee voting has increased.41 The survey asked LEOs to provide information on the percentage of all votes cast by absentee voting in 2006. On average, respondents reported that about 14% of votes were cast by absentee ballot, with 1-5% being most commonly reported (Figure 33).42 The average rate is very similar to the one reported in the EAC's election day survey (14.2%).43 Some observers have expressed concerns about early and "no excuse" absentee voting, arguing, among other things, that they do not increase turnout and pose some security risks. These concerns were largely not shared by LEOs (Figure 34). Three-quarters agreed that absentee voting should be considered a voter's right, and more than half that early voting should be. Three-quarters also agreed that absentee voting is worth the costs, and that verification of authenticity is not difficult for those ballots. However, they were equivocal about whether early voting is worth the costs. Both absentee and early voting reduce the pressures of election day administration; it is possible that election officials support absentee voting over early voting because it is easier to administer in the pre-election period. Problems with pollworkers were common. About 10% of jurisdictions experienced one or more instances of pollworkers not reporting for duty. Since the average jurisdiction used more than 150 pollworkers, the impact may be small on average (although not in the affected polling places). Nevertheless, absenteeism among pollworkers has been cited as a significant problem on election day.44 Factors that might contribute include long hours, low pay, poor training, and age, but analysis of pay and training data from the survey did not point to those factors as being significant.45 41 Historically, most states have required voters to provide a reason such as illness, disability, or absence from the jurisdiction on election day as part of an application for an absentee ballot. However, most states now offer early voting, "no excuse" absentee voting, or both (for specifics, see electionline.org, "Pre-Election Day and Absentee Voting by Mail Rules," October 22, 2007, [http://www.electionline.org/Default.aspx?tabid=474]). 
42 The survey also asked about early voting, but the results were ambiguous and therefore are not reported here.

43 Election Assistance Commission, "The 2006 Election Administration and Voting Survey: A Summary of Key Findings," December 2007, available at [http://www.eac.gov/News/press/clearinghouse/2006-election-administration-and-voting-survey]. The EAC reported a domestic civilian absentee-voting rate of 13.8% and an overseas-voter rate of 0.4%.

44 electionline.org, "Helping Americans Vote: Poll Workers," September 2007, [http://www.electionline.org/Portals/1/Publications/ERIPBrief19_final.pdf].

45 The survey did not include questions on the age or number of hours worked by pollworkers.

Figure 33. Percentage of Votes LEOs Reported as Cast via Absentee Voting, 2006
(Figure: percentage of jurisdictions reporting each range of absentee-vote share, from 0 through 81-90%.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Figure 34. Agreement/Disagreement by LEOs with Statements about Absentee and Early Voting, 2006
(Figure: mean level of agreement or disagreement, from strongly disagree to strongly agree, with the statements that early voting should be considered a voter's right, that absentee voting should be considered a voter's right, that the popularity of early voting is on the rise, that the benefit of early voting outweighs its cost, that the cost of absentee voting outweighs its benefits, and that the authenticity of an absentee ballot is difficult to verify.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

More than 20% of LEOs reported instances of pollworkers who did not understand their jobs.46 The lowest rate, 5%, was in jurisdictions using hand-counted paper ballots. Results from LEOs using other kinds of voting systems ranged from 17% to 25%, but those differences were not statistically significant. It seems unlikely that the differences between the results for paper and those for other voting systems arose purely from differences in the roles of technology in the different voting systems, since the technology-related tasks of pollworkers in jurisdictions using central-count optical scan are unlikely to be much greater than those in jurisdictions using hand-counted paper ballots. There are several other possible factors. For example, the average total number of pollworkers, polling places, and registered voters reported by LEOs is far lower for jurisdictions using hand-counted paper than for any other voting system (see Figure 35 in the next section).

Use, Training, and Experience of Pollworkers

The 2006 survey included several questions about pollworkers. All but 3% of LEOs reported using one or more pollworkers, with a mean number of 164 in a jurisdiction47 and a maximum of 4,000. The number of pollworkers in the jurisdictions was strongly correlated with the number of registered voters reported, as was the total number of polling places. The kind of voting system used also varied with the number of registered voters. Overall, jurisdictions using hand-counted paper ballots had the smallest numbers of registered voters, polling places, and pollworkers, and those using DREs and lever machines the highest (Figure 35). On average, there were 5-6 pollworkers per polling place. Jurisdictions using paper ballots had the highest average number, and those using lever machines the lowest. Compensation of pollworkers also varied substantially.
About 60% of respondents reported paying them a lump-sum amount for work on election day, $100 on average. The remainder of respondents reported an hourly wage of $7.25 on average. Very few respondents reported paying nothing to pollworkers, and few likewise reported paying more than $200 per day or $12 per hour. The results suggest that there is some regional variation. For example, the average rate of pay by state varied in New England from $50 to $106 per day, and in the West from $70 to $155. While LEOs who reported problems with pollworker performance paid them $5-10 less per day on average, the effect of pay on performance was not statistically significant. However, the survey did not explore potentially influential demographic factors such as the age of pollworkers or the average cost of living.

46 Note that this result does not mean that 20% of pollworkers did not understand their jobs, but that 20% of LEOs reported that lack of understanding had occurred often enough for them to consider it a problem.

47 The median was 50.

Figure 35. Relationships between Kinds of Voting Systems Used and Selected Characteristics of Jurisdictions, 2006
(Figure: four panels showing, for lever, paper, CCOS, PCOS, and DRE jurisdictions, the number of registered voters (in thousands), the number of polling places, the number of pollworkers, and the number of pollworkers per polling place.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.
Note: The X-axis variable (Voting Systems) is categorical. The lines in the graph are provided purely as a visual aid to facilitate comparison and do not denote any relationship among the categories. See note for Figure 7 for an explanation of types of voting systems.

Figure 36. Views of LEOs on the Responsibility of Inadequate Pollworker Training for Problems with Election Administration
(Figure: percentage of LEOs choosing each degree of responsibility, from not responsible at all to frequently responsible.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Figure 37. Views of LEOs on the Need for Improvement of Pollworker Training
(Figure: percentage of LEOs choosing each level of improvement needed, from none to a great deal.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Perhaps more surprisingly, the amount of training pollworkers received was also not associated statistically with reports of performance problems.48 However, more LEOs than not believed that inadequate training was responsible for problems with election administration, and most believed that training needs significant improvement (Figures 36 and 37). Not surprisingly, those views were strongly correlated: LEOs who believed more strongly that inadequate training caused problems also tended to believe more strongly that improvements in training were needed.

48 This result does not necessarily mean that no relationship exists, only that none was detected. While little research is available on this topic, available evidence supports the contention that training and performance are related (see, for example, Thad Hall, J. Quin Monson, and Kelly D. Patterson, "Poll Workers and the Vitality of Democracy: An Early Assessment," PS: Political Science and Politics, Vol. XL(4), October 2007, pp. 647-654, available at [http://www.vote.caltech.edu/journals/PS-ThadHall.pdf]).

On average, pollworkers received 3.5 hours of training in 2006 (Figure 38).
In about 10% of jurisdictions, training was 1 hour or less. In three-quarters, it was 2-4 hours, and in only 5% was it one day or more. Nevertheless, 70% of LEOs considered pollworker training "extremely important," and only a few considered it "not important at all."

Figure 38. Number of Hours of Pollworker Training Reported by LEOs
(Figure: percentage of jurisdictions reporting each length of training, in hours: 1, 2, 3, 4, 5, 6-7, 8-10, and 11-40.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

There appeared to be substantial uniformity among respondents in the areas in which pollworkers were trained (Figure 39), with more than 90% of pollworkers being trained in voter check-in, accessibility, election laws, operation of voting machines, and election integrity. LEOs were not asked what areas of training should be improved, but another study that surveyed pollworkers in New Mexico found that many desired more training in voting-machine operation and election laws.49 Interestingly, that finding reflects the views of many LEOs about their own training, as discussed earlier in this report.

49 R. Michael Alvarez, Lonna Rae Atkeson, and Thad E. Hall, The New Mexico Election Administration Report: The 2006 November General Election, August 2, 2007, p. 20, available at [http://www.vote.caltech.edu/reports/NM_Election_Report_8-07.pdf].

Figure 39. Areas of Training for Pollworkers Reported by LEOs, 2006
(Figure: percentage of pollworkers trained in each area: administering voter check-in procedures; assisting handicapped voters; adhering to federal, state, and local election laws; protecting the integrity of the election; operating voting equipment; verifying voter identification; administering provisional ballots; resolving conflict with problem voters; reporting election results; accessing the electronic voter registration list; responding to the media; and other.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

Figure 40. Level of Concern Reported by LEOs about the Negative Impact of Increased Election Complexity on Pollworker Recruitment, 2006
(Figure: percentage of LEOs reporting each level of concern, from not concerned at all to extremely concerned.)
Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

LEOs also believed that HAVA is changing the nature of pollworker training, with 20% reporting that the changes were "substantial." As reported earlier (see Table 5 and Figure 25 above), most LEOs believed that HAVA has made elections more complex to administer. Most also expressed concern that the increased complexity of elections will have a negative impact on recruitment of pollworkers, and more than a third of respondents were "extremely concerned" (Figure 40).

Nonpartisan Election Officials

Some observers have suggested that the environment in which election officials operate is too politically contentious and that steps should be taken to make election administration more nonpartisan. For example, some believe that state election officials should not be permitted to be involved in political campaigns other than those for their own positions. The 2006 survey asked LEOs several questions about this issue. In general, LEOs were satisfied with election administration at the state level (Figure 41), with only about 10% expressing significant dissatisfaction.
More LEOs than not also believed that election administration in their state is independent of partisan politics. However, more than half of elected LEOs (57%) indicated that they communicated their party affiliation during their election.50 Figure 41. Assessments by LEOs about Aspects of the Election Administration Environment, 2006 Level of Satisfaction with State-Level Election Administration Degree of Independence of Election Administration from Partisan Politics Degree of Political Contentiousness of the Election Administration Environment Low High Rating Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. There was more variation in the views of LEOs about the political contentiousness of the election administration environment, with about 18% believing it is "not contentious at all," and 9% that it is "extremely contentious." Nevertheless, on average LEOs rated the level of contentiousness relatively low. Finally, LEOs were asked whether election administration should be a civil service function in their state. About half had no opinion, but significantly more elected LEOs were opposed to the idea than favored it. Appointed LEOs were evenly divided (Figure 42). 50 According to another study, about one-fifth of local jurisdictions are administered by Republicans and one-quarter by Democrats, with about two-fifths nonpartisan and the remainder bipartisan (Kimball and Kropf, "Street-Level Bureaucrats," p. 1262). CRS-53 Figure 42. Views of LEOs about Whether Election Administration Should Be Part of the Civil Service in Their States, 2006 75% Elected LEOs Appointed LEOs % of LEOs 50% 25% 0% In Favor No Opinion Opposed Preference Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Possible Caveats As with any survey, care needs to be taken in drawing inferences from the results. One question that could arise is whether the sample is representative of LEOs as a whole. For example, simply drawing the sample at random from the nationwide pool of election administrators would have resulted in a disproportionately large number of jurisdictions from New England and the upper Midwest, where elections are administered by townships rather than counties.51 Steps were taken in the design of the studies to minimize the risk that the sample would not be representative (see the appendix on methodology below). Overall, neither the sample design nor the characteristics of the responses suggest that the results are unrepresentative of the views and characteristics of local election officials. Another potential caution for interpretation relates to the inherent limits of surveys such as these. In particular, there is no way to guarantee that the responses of the election officials correspond to their actual beliefs. In addition, there is no way to be certain that any particular belief corresponds to reality. The question on voting- system characteristics (see Figure 13) provides an illustration of the possibility for disparity. For several reasons, LEOs might be reluctant to rate their voting systems 51 For example, Maine ranks 37th among states in population, with 1.3 million residents, but it ranks 4th in the number of election jurisdictions, with 518. CRS-54 low in reliability, accuracy, and security, despite the anonymity of the results. 
Alternatively, they might truly believe that their voting systems are highly reliable, accurate, and secure, even if independent evidence does not support that view.

Also, some caution is needed in assigning cause and effect. The mere existence of an association or correlation between a factor and an effect does not necessarily mean that the factor caused the effect. For example, the survey showed a strong association between the kind of voting system used in a jurisdiction and the number of pollworkers (see Figure 35). However, while the kind of voting system may have some independent effect, a more important factor is the number of registered voters.

A final caution involves how survey results might be used to inform policy decisions. On the one hand, the results could be used to support the shaping of policy in directions expressed by LEOs in their responses. In many cases, such policy changes might be appropriate. On the other hand, it is possible that at least some of those desired changes would not in fact yield the most effective or appropriate policies. In such cases, the results might more constructively be used to help policymakers identify issues for which improvements in communication and understanding are needed.

Potential Policy Implications

The survey results may have policy implications for several issues at the federal, state, and local levels of government. Some issues that may be relevant for congressional deliberations are highlighted below.

Election Officials. Many observers have commented favorably on the experience and dedication of the nation's local election officials. Survey results are consistent with that view. At the same time, other observers, including some election officials, have called for increased professionalism in election administration. Some survey results suggest areas of potential professional improvement, such as in education and in professional involvement at the national level. Congress could address this potential need by several means, for example, by facilitating educational and training programs for LEOs and by promoting professional certification of election officials by entities accredited through the EAC.

The seemingly unique demographic characteristics of LEOs as a group of government officials may have other policy implications, but those implications are not altogether clear. However, some observers may argue that efforts should be undertaken to ensure that LEOs reflect the diversity of the workforce or voting population as a whole, especially with respect to minority representation.

The issue of partisanship among election officials has been controversial for several years. Most national attention has been on state officials, but, given that most LEOs are elected and only about half the local jurisdictions in the United States are administered on a nonpartisan or bipartisan basis, policymakers may wish to consider the influence of partisanship among LEOs.

Voting Systems. Since the enactment of HAVA, controversy has arisen over whether DRE voting systems are sufficiently secure and reliable. The survey revealed that LEOs who have experience with DREs are very confident in them, consider them superior for accessibility, and do not generally support the addition of a voter-verified paper audit trail (VVPAT) to address security concerns, although those who use a VVPAT are satisfied with its performance. However, LEOs using other systems are much less confident in DREs and more supportive of VVPAT.
The strongly dichotomous results suggest that, as Congress considers whether to require changes in the security mechanisms used in voting systems, it might be useful to determine whether DRE users are overconfident in the security of their systems and procedures in practice, or, alternatively, whether nonusers might need to be better educated about the reliability and security of DRE systems.52

The Help America Vote Act (HAVA). The survey results suggest that HAVA is in the process of achieving several of its policy goals. The general support of HAVA provisions -- including those, such as the creation of the EAC and the provisional ballot requirement, that have been somewhat controversial -- implies that LEOs are in agreement with the goals of the act and are active partners in its implementation. The overwhelming choice of new voting systems that assist voters in avoiding errors indicates that the HAVA goal of reducing avoidable voter error is in the process of being met. The areas of concern expressed by LEOs -- such as how to meet the costs of ongoing implementation of HAVA requirements -- raise issues that Congress may wish to address as it considers HAVA appropriations and reauthorization. In addition, the decline from 2004 in levels of support for HAVA and the EAC, while small, together with broader concerns about the effectiveness of the EAC, may raise concerns for Congress.

The close relationship between LEOs and the vendors of their voting systems seems unlikely to change as a result of HAVA. However, with the codification by HAVA of the voting system standards and certification processes, the influence of the federal government in decisions about new voting systems might be expected to increase in relation to that of vendors and others. The increased concern of LEOs in 2006 that vendors, the media, political parties, and advocacy groups have too much influence on such decisions may also warrant congressional attention.

Research Needs. Scientific opinion surveys of local election officials are rare,53 and additional research may be useful to address some of the matters raised by these studies. For example, a survey of state election officials might provide useful information and might additionally be helpful in assessing the most appropriate federal role in promoting the effective implementation of HAVA goals at all levels of government.

One common suggestion of LEOs for improving HAVA was to provide a means of adjusting requirements to fit the needs of smaller jurisdictions. To determine what, if any, such adjustments would be appropriate, it may be useful to have specific information on how the needs and characteristics of different jurisdictions vary with size -- something that was beyond the scope of these surveys. It could also be useful to identify how the duties of LEOs vary with size and other characteristics of the jurisdiction.

52 For discussion of the DRE security issue and proposals for resolving it, see CRS Report RL22190, The Direct Recording Electronic Voting Machine (DRE) Controversy: FAQs and Misperceptions, by Eric A. Fischer and Kevin J. Coleman; and CRS Report RL32139, Election Reform and Electronic Voting Systems (DREs): Analysis of Security Issues, by Eric A. Fischer.

53 The Government Accountability Office surveyed a sample of about 600 LEOs nationwide by mail and about 160 by telephone following the 2000 federal election (see Government Accountability Office, Elections: Perspectives on Activities and Challenges Across the Nation, GAO-02-3, October 2001). That survey focused largely on issues of election management, such as the availability of poll workers and the processing of absentee ballots. While results of the two surveys are not generally comparable because of differences in focus and methodology, the GAO survey did find that a high percentage of local officials expressed satisfaction with the performance of their existing voting systems, a finding consistent with the results of the current survey.
In many jurisdictions, election administration is only part of the LEO's job. It is not known to what degree these other responsibilities might affect election administration -- negatively or positively. Finally, these surveys have provided only snapshots of LEO characteristics and perceptions over a two-year period. It might be beneficial to perform similar surveys periodically to identify trends and explore new questions and issues.

Appendix. Notes on Methodology

The results presented and analyzed in this report are from two surveys sponsored by CRS as part of its Capstone program and performed by graduate students and faculty at the George Bush School of Government and Public Service at Texas A&M University. The principal investigators were Donald P. Moynihan and Carol Silva for the 2004 study and Carol Silva for the 2006 study. Ten graduate students participated in the first survey,54 and six in the second.55 For both studies, the CRS project manager was Eric Fischer and the project liaison was Kevin Coleman.56

The topics for the two surveys were developed collaboratively by the CRS and Texas A&M participants. The major factor in choosing the topics was the potential usefulness of the results for Congress. The Bush School team developed and administered the survey instrument in consultation with CRS and provided the authors with the data used in performing the analyses.

The two surveys were conducted after the November 2004 and 2006 federal elections, between December and the following March. For each survey, a sample of approximately 3,800 LEOs was drawn from the roughly 9,000 election jurisdictions in the 50 states.57 To ensure that LEOs from all states were included, but that states with large numbers of LEOs were not disproportionately represented (see Figure 43), a modified random-sampling regime was used, as follows: surveys were sent to all LEOs in states with 150 or fewer local jurisdictions; for the ten states with more than 150 LEOs, a sample of 150 was chosen at random from the local jurisdictions, and surveys were sent to those LEOs.58

54 The students were Jennifer Gray, Marshall Gray, Joshua Hodges, Jeff Jewell, Marcia Larson, Ryan Mitchell, Erin Murello, Steve Murello, Alice Reeves, and Julie Siddique.

55 They were Brock Ramos, Robert Thetford, Trait Thompson, Staci Thrasher, Shavonda Johnson, and Carlos Cruz-Fernandez.

56 The authors wish to thank the many people who devoted time and energy to this project. Most important among them were the nearly 1,500 local election officials who took the time from busy schedules to answer the many questions in the two surveys. Doug Chapin and Sean Greene of the Election Reform Information Project (electionline.org) provided the original data set of local election officials. The skills and dedication of the principal investigators and students at Texas A&M University were essential to the successful completion of the project.

57 Privacy requirements prevented the inclusion of the District of Columbia, which has only one LEO.

58 The number of LEOs per state varies greatly, from fewer than 10 to more than 1,000. The number varies much more strongly with the way states have chosen to organize their election jurisdictions than it does with variables such as the voting-age populations of the states. Consequently, a simple random sample of the total number of election officials in the United States would have caused states with more decentralized election administration to be disproportionately represented in the set of responses. Alternative approaches that attempted to weight the data (by state, voting-age population, or portion of LEOs, for example) would also have had weaknesses in addressing questions of representativeness. There is no simple solution to this problem, and the sampling strategy used in the two surveys was chosen as a way to strike a reasonable balance between population-based and geographic representation. In combination with the unweighted statistical analyses performed for this report, the strategy has the effect of increasing the relative influence of the four-fifths of states with fewer than 150 LEOs while ensuring a relatively strong influence of states with large numbers of LEOs.
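The modified random-sampling regime described above is straightforward to express in code. The following sketch (in Python) is not part of the original study; the 150-jurisdiction cutoff is taken from the text, while the function name, the dictionary of state jurisdiction lists, and the seed are illustrative assumptions.

```python
import random

SAMPLE_CAP = 150  # states with more than 150 jurisdictions are sampled down to 150 (from the text)

def draw_sample(jurisdictions_by_state, cap=SAMPLE_CAP, seed=None):
    """Illustrative sketch of the modified random-sampling regime: include every
    LEO in states with `cap` or fewer jurisdictions, and a simple random sample
    of `cap` LEOs in states with more than `cap` jurisdictions."""
    rng = random.Random(seed)
    sample = []
    for state, leos in jurisdictions_by_state.items():
        if len(leos) <= cap:
            sample.extend(leos)                   # census of smaller states
        else:
            sample.extend(rng.sample(leos, cap))  # random sample of 150 in larger states
    return sample

# Hypothetical usage (state lists are placeholders, not survey data):
# sample = draw_sample({"ME": maine_leos, "TX": texas_leos}, seed=2006)
# For the jurisdiction counts described above, len(sample) would be roughly 3,800.
```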
Most surveys were administered electronically, with respondents visiting a website to enter their responses. The remainder were paper surveys sent via the U.S. Postal Service. LEOs who did not respond were sent reminders or contacted by telephone.

Figure 43. Frequency Distribution of the Number of Local Election Jurisdictions in the States (number of states by number of local election jurisdictions per state). Source: CRS analysis of data provided by the Election Reform Information Project (electionline.org) and other sources. Note: Data are from 2004, but the distribution of jurisdictions did not change significantly for 2006.

For each survey, the overall final response rate was 40% of the sample, or about 17% of all jurisdictions in the United States. Respondents answered 85-90% of questions, on average.59 The response was sufficiently high to permit statistical analysis and comparison of the results between the surveys. Individual response rates per state were between 25% and 50% for about three-quarters of states (see Figure 44). The remainder were evenly split between those for which under 25% of LEOs responded and those for which the rate was greater than 50%. Response rates were similar among states across the two surveys and did not vary significantly for either survey with the number of local election jurisdictions in a state or its voting-age population.

Figure 44. Frequency Distribution of Response Rates by State, 2004 and 2006 (number of states by percentage of jurisdictions responding, shown separately for 2004 and 2006). Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University.

About 70% of respondents worked in county election jurisdictions, with most of the remainder working in townships (Figure 45). The small difference between the two years in those choosing "town/township" and in those choosing "other" was almost certainly a result of a small change in the structure of the question for the 2006 survey.60

59 This number is for questions that applied to all LEOs. Some questions were targeted to specific groups, such as users of DREs.
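As a quick arithmetic check (not part of the original analysis), the response figures above are mutually consistent: a 40% response from a sample of roughly 3,800 implies roughly 1,500 respondents, matching the "nearly 1,500 local election officials" acknowledged in footnote 56, and that number is about 17% of the roughly 9,000 jurisdictions. A minimal sketch, assuming only the rounded numbers given in the text:

```python
SAMPLE_SIZE = 3_800          # approximate number of LEOs surveyed (from the text)
TOTAL_JURISDICTIONS = 9_000  # approximate election jurisdictions in the 50 states (from the text)
RESPONSE_RATE = 0.40         # overall final response rate reported for each survey

respondents = SAMPLE_SIZE * RESPONSE_RATE           # about 1,520 responding LEOs
share_of_all = respondents / TOTAL_JURISDICTIONS    # about 0.17, i.e., roughly 17%

print(f"~{respondents:,.0f} respondents, ~{share_of_all:.0%} of all jurisdictions")
```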
All the results presented in this report are from analyses by CRS of data provided from the surveys by researchers at Texas A&M University. The raw data were first examined for errors, and corrections were made where necessary in a few cases, such as when a LEO claimed to work more hours per week than is physically possible.61 Where the correct answer could be reasonably discerned, the response was corrected.62 Otherwise, it was discarded.

Once cleaned, the data were analyzed using standard parametric methods, mainly analysis of variance, linear regression, and Student's t-tests, as appropriate. Three kinds of hypotheses were tested:

- differences between groups, such as whether results for 2004 differed from those for 2006;
- differences from a hypothetical value, such as whether LEOs were neutral about, agreed with, or disagreed with a particular statement; and
- tests for associations, such as whether the number of pollworkers in a jurisdiction was correlated with the number of registered voters.

Figure 45. Kinds of Jurisdictions Administered by Survey Respondents, 2004 and 2006 (percentage of respondents administering county, town/township, and other jurisdictions, shown separately for 2004 and 2006). Source: Analysis by CRS of data from studies performed collaboratively by CRS and Texas A&M University. Note: In each survey, the choices for kind of jurisdiction were county, town, township, borough, and other. For this graph, the replies for town and township were combined, as were the replies for borough and other.

Statistical significance was determined using a significance level (α) of .01. However, for display purposes, graphs with error bars were drawn showing 95% confidence intervals for the means. Most tests yielded highly statistically significant results -- p-values much lower than the significance level (p << .01). For tests where statistically significant effects were not found, the lack of effect is noted in the text, for example, by stating that no change was found between 2004 and 2006 for a particular survey item. Additional methodological details can be provided upon request.

60 In each survey, the choices for kind of jurisdiction were county, town, township, borough, and other. In 2006, LEOs could write in the kind of jurisdiction they administered in the "other" category, and almost all of those indicated their jurisdiction as a city. The option to write in a response did not exist in the 2004 survey, and the pattern of response strongly suggests that most LEOs with city jurisdictions chose "town" or "township" as the most closely matching category.

61 This was only an issue for those few questions where LEOs provided "ad-lib" answers rather than choosing from among a range of options.

62 For example, when asked how many additional hours per week LEOs worked in the four weeks preceding the election, the responses of five LEOs were recorded in the database as impossibly large numbers such as 1015 or 2530 (there are 168 hours in a week). Those responses were clearly incorrect. Given the structure of those responses, the intent was interpreted as a range, 10-15 and 25-30 in the examples, and the number of hours was corrected to the midpoint of the range, 12.5 and 27.5.
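For illustration only, the following sketch (not part of the report's analysis) shows how the correction rule described in footnote 62 and a Student's t-test at the report's .01 significance level might look in code; the sample responses, the function name, and the use of SciPy are assumptions made for the example.

```python
from scipy import stats  # assumed available; any standard t-test implementation would do

HOURS_IN_WEEK = 168  # upper bound on plausible "additional hours per week" responses
ALPHA = 0.01         # significance level used in the report

def clean_hours(raw):
    """Illustrative version of the correction rule in footnote 62: an impossibly
    large ad-lib answer such as 1015 is read as the range 10-15 and replaced by
    its midpoint (12.5); values that cannot be interpreted are discarded (None)."""
    if raw <= HOURS_IN_WEEK:
        return float(raw)                      # plausible as reported
    digits = str(int(raw))
    if len(digits) == 4:                       # e.g., 1015 -> 10 and 15, 2530 -> 25 and 30
        low, high = int(digits[:2]), int(digits[2:])
        if low < high <= HOURS_IN_WEEK:
            return (low + high) / 2            # midpoint of the implied range
    return None                                # otherwise discard the response

# Hypothetical 2004 and 2006 responses (illustrative numbers, not survey data):
hours_2004 = [clean_hours(x) for x in (20, 25, 1015, 30, 18)]
hours_2006 = [clean_hours(x) for x in (35, 2530, 40, 28, 33)]
a = [x for x in hours_2004 if x is not None]
b = [x for x in hours_2006 if x is not None]

# Student's t-test for a difference between the 2004 and 2006 groups.
t_stat, p_value = stats.ttest_ind(a, b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}, significant at alpha = {ALPHA}: {p_value < ALPHA}")
```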