Order Code RL33246 Reading First: Implementation Issues and Controversies Updated February 6, 2008 Gail McCallion Specialist in Social Legislation Domestic Social Policy Division Reading First: Implementation Issues and Controversies Summary The Reading First program was authorized as part of the Elementary and Secondary Education Act (ESEA) through the No Child Left Behind Act of 2001 (NCLBA). The NCLBA was signed into law on January 8, 2002, and will expire at the end of FY2008 (including the automatic General Education Provisions Act one-year extension). It is expected that the 110th Congress will consider legislation to reauthorize the ESEA. Reading First was drafted with the intent of incorporating scientifically based reading research (SBRR) on what works in teaching reading into improved and expanded K-3 reading programs, in order to address concerns about student reading achievement and to reach children at younger ages. By the end of October 2003, all states and the District of Columbia had received their FY2002 and FY2003 Reading First awards. The U.S. Department of Education's (ED) April 2007 report on state performance data, a February 2007 Government Accountability Office report, and a 2007 Center on Education Policy report, Reading First: Locally Appreciated, Nationally Troubled, have all provided relatively positive information about states' and local school districts' opinions of the impact of Reading First on student achievement. However, state assessment measures and cut-off scores for determining reading proficiency vary from state to state, making it difficult to draw definitive conclusions on Reading First's performance from these data. Criticisms of the program have centered on its perceived "overprescriptiveness" as it has been administered, on perceptions of insufficient transparency regarding ED's requirements of states, and on allegations of conflicts of interest between consultants to the program and commercial reading and assessment companies. Controversies have also arisen regarding the application of the SBRR requirements in the NCLBA to the Reading First program. Three groups representing different reading programs filed separate complaints with ED's Office of Inspector General (OIG), asking that the program be investigated. In September of 2006, the OIG issued a report on Reading First's grant application process. Subsequent OIG audit reports were issued on ED's administration of selected aspects of the program, on the RMC Research Corporation's Reading First contracts, and on several states' administration of the program. The OIG reports were highly critical of ED's implementation of the Reading First program, and essentially validated many of the concerns that had been raised in complaints filed with the OIG. In response to the controversy surrounding Reading First, the program's funding was cut from $1 billion in FY2007 to $393 million in FY2008. The Administration has requested that the program's funding be restored to $1 billion for FY2009. This report will be updated periodically. Contents Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1 Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2 Implementation Issues . . . . . . . . . . . .
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4 Reading First Implementation Evaluation: Interim Report . . . . . . . . . . 4 Center on Education Policy Reports . . . . . . . . . . . . . . . . . . . . . . . . . . . 6 February 2007 GAO Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7 Scientifically Based Research and Reading First . . . . . . . . . . . . . . . . . . . . . . . . . . 9 Scientifically Based Research Requirements in the No Child Left Behind Act . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10 SBRR Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11 Application of Scientifically Based Reading Research to the Reading First Program . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11 Limitations of Existing Research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12 Identifying Relevant Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12 Office of the Inspector General Audits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15 OIG Final Inspection Report: The Reading First Program's Grant Application Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16 OIG Final Audit Report: The Department's Administration of Selected Aspects of the Reading First Program . . . . . . . . . . . . . . . . . . 19 OIG Final Audit Report: RMC Research Corporation's (RMC) Administration of the Reading First Program Contracts . . . . . . . . . . . 20 Congressional Oversight and Legislation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21 Reading First: Implementation Issues and Controversies Introduction1 The Reading First program was authorized as part of the Elementary and Secondary Education Act (ESEA) through the No Child Left Behind Act of 2001 (NCLBA). The NCLBA was signed into law on January 8, 2002, and will expire at the end of FY2008 (including the automatic General Education Provisions Act one- year extension). It is expected that the 110th Congress will consider legislation to extend the authorization of the ESEA as amended by the NCLBA. The NCLBA included three new reading programs: Reading First, Early Reading First, and Improving Literacy Through School Libraries. The NCLBA also reauthorized the William F. Goodling Even Start Family Literacy Programs. This report focuses on the Reading First program. Reading First was drafted with the intent of incorporating scientifically based research on what works in teaching reading to improve and expand K-3 reading programs to address concerns about student reading achievement and to reach children at younger ages. The Reading First program includes both formula grants (states are allocated funds in proportion to the estimated number of children, aged 5 to 17, who reside within the state from families with incomes below the poverty line) and targeted assistance grants to states.2 For the first two years of the program, 100% of funds, after national reservations, was allocated to states as formula grants. States then competitively award grants to eligible local educational agencies (LEAs). LEAs that receive Reading First grants shall use those funds for the following purposes: 1 For a discussion of Reading First's funding history and program requirements, see CRS Report RL31241, Reading First and Early Reading First: Background and Funding, by Gail McCallion. 
2 The NCLBA specifies that beginning with FY2004, 10% of funds in excess of the FY2003 appropriation or $90 million, whichever is less, be reserved for targeted assistance state grants. Targeted assistance grants are intended to reward schools that are achieving the goals of increasing the percentage of 3rd graders who are proficient readers and improving the reading skills of 1st and 2nd graders. CRS-2 ! selecting and administering screening, diagnostic, and classroom- based instructional reading assessments; ! selecting and implementing a learning system or program of reading instruction based on scientifically based reading research that includes the essential components of reading instruction; ! procuring and implementing classroom instructional materials based on scientifically based reading research; ! providing professional development for teachers of grades K-3, and special education teachers of grades K-12; ! collecting and summarizing data to document the effectiveness of these programs; and accelerating improvement of reading instruction by identifying successful schools; ! reporting student progress by detailed demographic characteristics; and ! promoting reading and library programs that provide access to stimulating reading material. LEAs may use Reading First funds for the Prime Time Family Reading Time program;3 for training parents and other volunteers as reading tutors; and for assisting parents to encourage and provide support for their child's reading development. Implementation Status The Reading First program required significant start up time on the part of states. Because the program is complex and many of its requirements are new, it took time for states and LEAs to put together the necessary staff, curriculum, assessment, and evaluation components for the program. By the end of October 2003, all states and the District of Columbia had received their FY2002 and FY2003 Reading First awards. The Virgin Islands received its first Reading First funds in September of 2004. Reading First state grants are awarded for a six-year period, pending a satisfactory midterm review. According to the U.S. Department of Education (ED), only two states were able to distribute Reading First money to LEAs for the 2002-03 school year. Twenty-seven states conducted their first distribution of Reading First funds to LEAs for the 2003-04 school year, and for the 2004-05 school year, 24 additional states awarded their first Reading First grants to LEAs.4 The Virgin Islands awarded its first grants for the 2005-06 school year. Puerto Rico's situation is unique because it did not spend the first Reading First funds it received (for FY2003), and it declined funds for FY2004 because of disagreements with ED over instruction and methods to be employed. Puerto Rico's application for FY2005 funds was not found acceptable by ED. Puerto Rico reapplied for FY2006 funds; however, its application was not approved. Puerto Rico received the Reading First Advisory Committee's comments on its FY2006 application in November of 3 The Prime Time Family Reading Time program is a 6-8 week program of storytelling and discussion held at public libraries based on award winning children's books. 4 Based on an October 10, 2005, conversation with Sandi Jacobs, employed at that time by the U.S. Department of Education. CRS-3 2007. 
ED has notified Puerto Rico that it may revise its application to incorporate responses to the Committee's comments and resubmit it for FY2007 funds.5 The NCLBA specifies that a midterm peer review of states' performance in the Reading First program be conducted after the completion of the program's third grant period (which would have meant a review in the fall of 2005). Because of the time involved in initial implementation of the program, ED adjusted the timeline to provide states with sufficient time to have participated in three grant cycles, as envisioned by the statute, before undergoing a midterm review. ED established November 2006 as the deadline for states' submission of their midterm progress reports. These state reports are being reviewed by the Reading First Advisory Committee. On the basis of the Committee's comments, ED will determine whether states have made sufficient progress to continue receiving their Reading First grant funds.6 The awarding of the first targeted assistance grants was delayed so that more states could meet the requirement of having one year of baseline data and two years of follow-up data showing improvement. States that wished to be considered for the first round of targeted assistance grants were required to have submitted an application by July 30, 2005. The first Reading First targeted assistance award (of approximately $3 million) was made to Massachusetts in September of 2005 (out of FY2004 funds).7 The second round of targeted assistance applications was due to ED by July 30, 2006. Tennessee was the only state to receive an FY2005 targeted assistance grant; it received $4.8 million. FY2006 awards were given to Massachusetts ($950,000), Tennessee ($1.4 million), and Virginia ($1.2 million). The Reading First program is required to meet relatively extensive standards. In addition to midterm reviews of states' performance, LEAs are required to track the progress of individual students, and states are required to submit annual evaluations to ED with data on overall school, LEA, and state progress. ED has also contracted to have several evaluations of Reading First conducted. These evaluations include 5 ED published a notice in the Federal Register on March 1, 2007, announcing the establishment of a Reading First Advisory Committee. This panel evaluated the FY2006 RF application submitted by Puerto Rico. The panel is also evaluating state Reading First applications and midterm progress reports. The Committee is made up of individuals selected from each of the following agencies: ED, the National Institute for Literacy, the National Research Council of the National Academy of Sciences, and the National Institute of Child Health and Human Development. The committee members will serve for three years or until the date of reauthorization of the ESEA, whichever comes first. 6 States must provide information on progress being made by the state and LEAs in reducing the number of students in grades 1, 2, and 3 reading below grade level. States must also provide evidence that the state and LEAs have significantly increased the number of students reading at grade level or above, as well as the percentages of students (by specified demographic categories) reading at grade level or above. 7 The Massachusetts award was the only targeted assistance award for FY2005.
The state annual performance report also served as an application for the targeted assistance grants, but the July deadline required states to complete their annual report on an expedited schedule (the annual report was not otherwise due until November 30, 2005). CRS-4 an impact study of Reading First's effect on student achievement. The first report from this study, which is being conducted by Abt Associates and MDRC, is expected to be available in early 2008. In addition, ED has contracted with Abt Associates for an implementation study of Reading First based on a nationally representative sample of schools participating in Reading First. The interim implementation report was issued in July of 2006 (discussed in more detail below); the final implementation report is expected to be issued in the summer of 2008. There will also be a follow-up evaluation of the implementation of RF; data collection will occur in the 2008-2009 school year. ED is also conducting a descriptive study of the relationship between a school's receipt of Reading First funds and its rate of learning disabilities. It is anticipated that a report from this study will be issued in 2008. Another study is investigating how well prospective teachers are prepared to teach the essential components of reading instruction -- a report from this study is anticipated in the summer of 2008. Finally, ED contracted with RMC Research Corporation to sample grades K-3 in 20 states to see how well reading standards are aligned with the five essential components of reading delineated in Reading First. RMC issued its report in December of 2005. Implementation Issues ED's April 2007 report on state performance data, a 2007 Center on Education Policy report, Reading First: Locally Appreciated, Nationally Troubled, and a 2007 GAO report have all provided relatively positive information about states' and local school districts' opinions of the impact of Reading First on student achievement. However, state assessment measures and cut-off scores for determining reading proficiency vary from state to state, making it difficult to draw definitive conclusions on Reading First's performance from these data. ED's report, The Reading First Annual Performance Report Data, which is based on state data, found improvements in the percentages of students reaching proficiency in reading fluency and comprehension on state measures. According to these data, on average, between 2004 and 2006, the 26 states with baseline data increased the percentage of students meeting or exceeding proficiency on fluency outcome measures by 16% for 1st graders, 14% for 2nd graders, and 15% for 3rd graders. In addition, these 26 states also increased the percentage of students meeting or exceeding proficiency on comprehension outcome measures by 15% for 1st graders, 6% for 2nd graders, and 12% for 3rd graders.8 Reading First Implementation Evaluation: Interim Report. The first of two implementation reports, prepared for ED by Abt Associates, was issued in 2006. The Reading First Implementation Evaluation: Interim Report (Interim Report) was based on data collected during the 2004-2005 school year through surveys of teachers, principals, and reading coaches chosen from a nationally 8 The Reading First Annual Performance Report Data is available online at [http://www.ed.gov].
ED has also issued a report providing profiles of state implementation of Reading First, including data on the level of funding and the numbers of LEAs, schools, students, and teachers who have participated in the program. This report, titled The Reading First State Data Profiles, is also available on ED's website. CRS-5 representative sample of Reading First and non-Reading First Title I schools; and through interviews of Reading First state coordinators and a review of states' Reading First applications. The report also drew on other existing data sources.9 The interim report addressed two questions -- how Reading First is being implemented by LEAs and schools, and whether instruction in Reading First schools differs from that in non-Reading First schools. Questions related to student achievement in Reading First schools will be addressed in the final implementation report, after a second round of data has been collected. Overall, the interim report found that Reading First was being implemented by schools as intended by the NCLBA, and that there are differences between Reading First schools and non-Reading First schools (such as the presence of reading coaches) that have the potential to improve student achievement in reading. The Interim Report summarized its key findings as follows. ! Reading First schools appear to be implementing the major elements of the program as intended by the legislation. ! Reading First schools received both financial and nonfinancial support from a variety of external sources. ! Classroom reading instruction in Reading First schools is significantly more likely to adhere to the Reading First legislation than classroom reading instruction in Title I schools. ! Reading First teachers in three grades (kindergarten, second, and third) were significantly more likely than their counterparts in other Title I schools to place their struggling students in intervention programs. ! Assessment plays an important role in reading programs in both Reading First and non-Reading First Title I schools. ! Principals in Reading First schools were significantly more likely to report having a reading coach than were principals of non-RF Title I schools. ! RF staff received significantly more professional development than did Title I staff.10 9 U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service, Reading First Implementation Evaluation: Interim Report, Washington, D.C., 2006. 10 The Implementation Report includes this caveat: "We can make comparisons between RF and non-RF Title I samples, but because the two samples are not matched they cannot be assumed to be equivalent. Thus, the differences between groups discussed in this report cannot be attributed to the Reading First program." U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Program Studies Service, Reading First Implementation Evaluation: Interim Report, Washington, D.C., 2006. CRS-6 Center on Education Policy Reports. Information from an October 2007 Center on Education Policy (CEP) report on Reading First indicates that many states and districts believe that the professional training, reading instruction, and assessments provided through Reading First have been important causes of increases in student achievement.
However, the CEP report notes that "these responses represent the views of state and district officials, rather than a cause and effect relationship between Reading First and achievement."11 The 2007 CEP report is based on annual surveys of states and districts, and on in-depth case studies. According to the CEP report, in 2006, 82% of states indicated that Reading First professional development was very or moderately effective in increasing achievement in reading. In 2006, 78% of states indicated that Reading First curriculum and assessments were very or moderately effective in increasing student achievement in reading. Of districts reporting increases in reading achievement, 69% indicated that Reading First assessments were an important or very important factor, and 68% indicated that Reading First instruction was an important or very important factor.12 The CEP report also noted that 80% of states and 75% of districts indicated that they coordinated Reading First and Title I. In addition, more than half of Reading First districts indicated that they used elements of Reading First in non-Reading First schools. A June 2005 CEP study examined ED's administration of the state application process for Reading First grants, among other things.13 The 2005 report is based on a review of all state Reading First applications, an in-depth review of 15 randomly selected state applications, and a review of revisions to state applications from 10 representative states, in addition to state and district surveys and case studies. The CEP found that many states were required to revise their initial application for Reading First grants one or more times before ultimately having their application accepted. In addition, it found that "states are remarkably consistent in their selection of specific instruments for assessing students' reading progress." It noted that in their final applications, almost all states included the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) in their list of approved assessments, and used A Consumer's Guide to Evaluating a Core Reading Program Grades K-3: A Critical Elements Analysis (Consumer's Guide) to evaluate and choose a reading 11 Reading First: Locally Appreciated, Nationally Troubled, Center on Education Policy, October 2007. 12 Reading First: Locally Appreciated, Nationally Troubled, Center on Education Policy, October 2007. 13 In addition to state and district surveys and case studies, the June 2005 CEP report was also based on an overview of all state Reading First applications, an in-depth review of fifteen randomly selected state applications, and a review of revisions to state applications from ten representative states. Scott, Caitlin and Tom Fagan, Ensuring Academic Rigor or Inducing Rigor Mortis? Issues to Watch in Reading First, Center on Education Policy, June 2005. CRS-7 curriculum.14 CEP analysis of a sample of original and final applications from 10 states found that some modified their original applications to adopt these specific instruments: In each case, 4 of the 10 states added DIBELS and the Consumer's Guide to their applications after initial review, and none dropped either item. In all, 9 of 10 states are using DIBELS and 8 of 10 are using the Consumer Guide.15 Additionally, the CEP study found that state recommendations of specific reading programs appear to have influenced districts' choice of reading programs.
The survey of districts receiving Reading First funds indicated that half changed the reading programs used by the district to qualify for a grant from their state. February 2007 GAO Report.16 GAO focused on three Reading First issues: ! whether there have been changes in reading instruction as a result of Reading First; ! the criteria used by states to award subgrants and the difficulties states have had in implementing Reading First; and ! the guidance, assistance, and oversight provided to states by ED and its contractors. The GAO report was written in response to a September 23, 2005, Senate Committee on Health, Education, Labor, and Pensions request for an investigation of questions related to the implementation of the Reading First program. The GAO report was based on ED data, a web survey of 50 states' and the District of Columbia's Reading First Directors, 12 in-depth interviews, and four site visits. In addition, GAO interviewed federal, state, and local education officials as well as Reading First Technical Assistance Center administrators and providers of reading programs and assessments. GAO's findings generally support the findings of ED's performance report data, the CEP study, and the interim Reading First evaluation. The GAO report included information on state responses to a variety of Reading First implementation issues. ! Forty-eight states reported that ED staff were helpful or very helpful in addressing their implementation-related questions. ! Thirty-nine states reported that ED staff were helpful or very helpful in addressing their application-related questions. 14 Both publications were produced by researchers at the University of Oregon. 15 Scott, Caitlin and Tom Fagan, Ensuring Academic Rigor or Inducing Rigor Mortis? Issues to Watch in Reading First, Center on Education Policy, June 2005. 16 GAO-07-161, Reading First: States Report Improvements in Reading Instruction, but Additional Procedures Would Clarify Education's Role in Ensuring Proper Implementation by States, February 2007. CRS-8 ! Ten states reported receiving suggestions that they eliminate specific programs or assessments, and four received suggestions to adopt specific programs or assessments. ! Forty-eight states modified their Reading First grant applications at least once. ! Most states reported changing the assessments they used, and most indicated that they had included multiple assessment tools on their approved list. ! DIBELS was the assessment program most frequently listed on states' approved lists (48 states). ! Twenty-two states developed a state-approved list of reading programs for districts to select from. GAO reported the following findings on Reading First. ! States reported changes and improvements in reading instruction, including more emphasis on the five key components of reading, assessments, and professional development. ! Reading First schools made more use of reading coaches and increased the amount of time devoted to reading. ! Sixty-nine percent of states reported great or very great improvement in reading instruction. ! Eighty percent of states reported great or very great improvement in professional development, and approximately 75% reported an increase in resources for this purpose. ! However, GAO also found that ED had not developed written policies and procedures to guide ED officials and contractors in dealing with states, districts, and schools to ensure compliance with statutory requirements regarding local control of curriculum. !
In addition, GAO found that ED had not developed written procedures governing its monitoring visits, which caused confusion among states regarding monitoring procedures, timelines, and expectations for taking corrective actions. GAO recommended that ED take the following actions. ! Establish control procedures to guide ED officials and contractors in their interactions with states, districts, and schools. CRS-9 ! Develop and distribute guidelines regarding its monitoring procedures so that states and districts are made aware of their roles, responsibilities, and timelines. Scientifically Based Research and Reading First There has been considerable debate in the field of education research on the value of different research methodologies, and on what types of research should receive priority for federal dollars. Many researchers argue that the type of research that is appropriate varies with the question that is being asked.17 However, many have also argued that scientifically based research (SBR), and randomized controlled trials (RCTs) in particular, are the "gold standard" in research. RCT research protocol requires random assignment -- with participants assigned randomly to either an experimental group that receives the treatment under investigation, or a control group that does not.18 RCTs are viewed by many as the most credible way to verify a cause-effect relationship when the RCT study employs a well-designed and well-implemented methodology with a large sample size. Nevertheless, RCT studies do not necessarily provide a one-size-fits-all solution to all educational research needs. A CRS report analyzing RCTs included a summary of some of the potential limitations of placing too much emphasis on RCTs: ... RCTs are occasionally seen as impractical, unethical, requiring too much time, or being too costly compared to other designs that also seek to assess whether a program causes favorable outcomes. Finally, there is wide consensus that RCTs are particularly well suited for answering certain types of questions, but not others, compared to other evaluation research designs. For example, RCTs typically do not assess how and why impacts occur, how a program might be modified to improve program results, or a program's cost-effectiveness. RCTs also typically do not provide a full picture of whether unintended consequences may have resulted from a program or indicate whether a study is using valid measures or concepts for judging a program's success. Many of these kinds of questions have been considered to be more appropriately addressed with observational or qualitative designs.19 17 "The scientific enterprise depends on a healthy community of researchers and is guided by a set of fundamental principles. These principles are not a set of rigid standards for conducting and evaluating individual studies, but rather are a set of norms enforced by the community of researchers that shape scientific understanding." Richard Shavelson and Lisa Towne, Eds., Scientific Research in Education, National Research Council, National Academy Press, 2002. 18 CRS Report RL33301, Congress and Program Evaluation: An Overview of Randomized Control Trials (RCTs) and Related Issues, by Clinton Brass, Blas Nuñez-Neto, and Erin D. Williams. 19 CRS Report RL33301, Congress and Program Evaluation: An Overview of Randomized Control Trials (RCTs) and Related Issues, by Clinton Brass, Blas Nuñez-Neto, and Erin D. Williams.
CRS-10 Scientifically Based Research Requirements in the No Child Left Behind Act The NCLBA has endorsed the use of SBR in funded activities, including over 100 references to the use of SBR in choosing instructional and assessment programs, as well as for professional training programs, and other NCLBA funded activities. The emphasis is on experimental research, particularly RCTs.20 Programs in the NCLBA affected by the requirement that funded educational interventions be based on SBR include Title I, Part A, grants for the education of the disadvantaged, Reading First, Early Reading First, Even Start, Literacy Through School Libraries, Comprehensive School Reform, Improving Teacher Quality State Grants, Mathematics and Science Partnerships, English Language Acquisition State Grants, and Safe and Drug-Free Schools and Communities. This discussion focuses on the application of SBR to the Reading First program. The NCLBA language authorizing Reading First makes clear that the intent of the program is to require recipients of Reading First funds to implement programs which are based on scientifically based reading research (SBRR). The definition of SBRR in the NCLBA, is as follows: The term "scientifically based reading research" means research that (A) applies rigorous, systematic and objective procedures to obtain valid knowledge relevant to reading development, reading instruction, and reading difficulties; and (B) includes research that (I) employs systematic, empirical methods that draw on observation or experiment; (ii) involves rigorous data analyses that are adequate to test the stated hypotheses and justify the general conclusions drawn; (iii) relies on measurements or observational methods that provide valid data across evaluators and observers and across multiple measurements and observations; and (iv) has been accepted by a peer-reviewed journal or approved by a panel of independent experts through a comparably rigorous, objective, and scientific review.21 ED's application of SBRR to the Reading First program draws extensively on the work conducted by the National Reading Panel (NRP). In 2000, the NRP issued a report, Teaching Children to Read. The NRP was convened by the National Institute of Child Health and Human Development (NICHD) in consultation with ED in response to a congressional charge to review the literature on reading and use it to assess the effectiveness of different techniques for teaching reading, and whether these techniques were ready to be applied to classroom settings. Based on the NRP's 20 Some authors argue that in the context of encouraging basic educational research, SBR must be interpreted more broadly, in contrast to the more prescriptive definition of SBR contained in the NCLBA, "narrowly conceived for service providers trying to justify their use of federal dollars." Margaret Eisenhart and Lisa Towne, Contestation and Change in National Policy on "Scientifically Based" Education Research, Educational Researcher, vol. 32, October 2003. 21 Elementary and Secondary Education Act of 1965, Section 1208. CRS-11 research, the NCLBA incorporated five essential components of reading as requirements for reading instruction funded under the Reading First program. These essential components are defined in the NCLBA as ... 
explicit and systematic instruction in -- (A) phonemic awareness; (B) phonics; (C) vocabulary development; (D) reading fluency, including oral reading skills; and (E) reading comprehension strategies.22 SBRR Implementation Issues Application of Scientifically Based Reading Research to the Reading First Program. This section summarizes major implementation issues that have arisen regarding the application of SBRR to the Reading First program. Issues discussed here include ED's implementation of the SBRR requirements, and the implications of the current state of SBRR for states and LEAs trying to navigate and apply existing research and resources to their educational programs while maintaining local autonomy in choosing curricula. Implementing SBRR. Some criticisms have been raised regarding ED's application of SBRR to the Reading First program. For example, Robert Slavin, of the Success for All Foundation, has argued that the NCLBA's requirement that interventions be based on SBR does not differentiate between programs that have themselves been rigorously evaluated and programs that have not been rigorously evaluated for efficacy but can cite SBR that supports their interventions. The Success for All Foundation has also argued, in a letter to the Office of the Inspector General of the U.S. Dept. of Education (OIG), that ED has inappropriately narrowed the definition of scientifically based research in its implementation of the Reading First program: In essence, through the implementation of Reading First, the U.S. Department of Education has narrowed the definition of SBRR to the five "essential components" of reading as identified by the National Reading Panel. Research on program efficacy has been ignored. Because Reading First was so closely managed by the U.S. Department of Education, and because it contains such a strong focus on the use of scientifically based research, it is paving the way for how states, districts and schools are coming to understand the meaning of SBR, and how they will apply it to other Federal programs.23 As a consequence of this alleged "narrowing" of the definition of SBRR, critics of ED's implementation of Reading First argue, states have been unnecessarily limited in their choices of reading programs, assessments, and professional development packages. 22 P.L. 107-110, Section 1207. [20 U.S.C. 6367]. CRS Report RL32145, Early Intervention in Reading: An Overview of Research and Policy Issues, by Gail McCallion. 23 Robert Slavin, Letter to U.S. Department of Education, The Success for All Foundation, May 27, 2005. CRS-12 Limitations of Existing Research. Some of the controversies that have surrounded implementation of SBRR in the Reading First program reflect the current state of SBRR and the difficulties of applying existing research to concrete educational interventions. Some observers have noted that there are many areas of education research with few, if any, RCT studies to draw upon. Robert Boruch, who served on the National Research Council committee that produced the book Scientific Research in Education, stated in an interview with Education Week that "One cannot just demand controlled experiments ... That's akin to asking people to levitate."24 Some have argued that navigating the existing array of resources is difficult for states and LEAs because much of the research is academic.
In addition, although there is more user-friendly material available than ever before, evaluations of the application of SBRR to concrete educational interventions are still limited, and there is no single federal website or resource that currently catalogs and evaluates all the available user-friendly resources. The following discussion summarizes some of the resources that are currently available. Identifying Relevant Resources. There are a variety of federally funded offices and resources that provide information and/or technical assistance on SBR to states and LEAs. There are also guides intended to provide user-friendly information on SBR that states and LEAs can access through ED websites and publications. Online resources include an NCLBA website with information on SBR and related resources, a searchable ERIC database on education research, and access to educational statistics and National Assessment of Educational Progress (NAEP) data on ED's National Center for Education Statistics website.25 The Institute of Education Sciences (IES) has made publications and other resources available on SBR. In December of 2003, IES published a report, Identifying and Implementing Educational Practices Supported by Rigorous Evidence: A User Friendly Guide. In addition, ED has awarded 20 five-year grants to comprehensive centers to provide advice to states and LEAs on meeting the requirements of the NCLBA. There are also ten regional centers with functions defined in the Education Sciences Reform Act of 2002.26 One of these centers, the Mid-continent Research Center for Education and Learning, in conjunction with the Education Commission of the States (ECS), published a February 2004 report, A Policymaker's Primer on Education Research: How to Understand, Evaluate and Use It. ECS has also published user-friendly guides on teacher issues and maintains a 50-state database on teacher preparation, recruitment, and retention. Another of the regional centers funded by ED, the North Central Regional Educational Laboratory, published a report 24 Lynn Olson, "Law Mandates Scientific Based for Research," Education Week, January 30, 2002. 25 See the following: [http://www.ed.gov/nclb], [http://www.ed.gov/about/pubs/intro/pubdb.html], [http://www.nces.ed.gov]. 26 The mission of the regional centers includes serving regional needs, disseminating SBR, providing professional training and technical assistance, and responding to the needs of stakeholders to ensure the academic success of all students. Responding to Regional Needs and National Priorities, Regional Educational Laboratories, 2004 Annual Report. CRS-13 in its Spring 2003 edition of Learning Point, A Call for Evidence: Responding to the New Emphasis on Scientifically Based Research. These resources are, however, not all centralized in one location, and relatively few provide analysis of specific educational instruction or assessment packages that might meet the SBR requirements of the NCLBA. It can be difficult for states and LEAs to sift through the volume of information that is available and find what they need to choose effective curriculum and assessment programs. Ellen Lagemann was interviewed by Education Week on the topic of SBR while working for the Spencer Foundation. She stated: We have tended to think that if you do research and get results, that will be useful to practitioners. There's an intermediary step. You have to take the results of research and build it into toys, tools, tests, and texts.
You have to build it into things that practitioners can use. They can't use the conclusions of a study.27 ED's IES created a What Works Clearinghouse (WWC) to address this need for clear user-friendly information on SBR, including evaluations of specific educational interventions. The WWC publishes reviews of educational interventions that have SBR to back up their efficacy claims on education topics that the WWC has identified as priorities. Initially the WWC intended to issue only topic reports, but in May of 2006, the WWC modified its website to include new intervention reports.28 These intervention reports have been introduced so that potentially useful 27 Ms. Lagemann is a professor at the Harvard Graduate School of Education. Lynn Olson, "Law Mandates Scientific Based for Research," Education Week, January 30, 2002. 28 The following definitions are taken verbatim from:[http://whatworks.ed.gov/]. "Intervention Reports: Intervention reports are produced for interventions that had one or more studies that met WWC evidence standards. The reports provide key findings from each of the studies pertaining to the particular intervention. Each report describes the intervention (for example, program, product, practice, or policy) and has a brief description of each outcome study. The report also presents in a single table the findings from the WWC-vetted studies. These reports are released as soon as they are produced, typically at the same time as the topic report. Intervention reports cannot be prepared for interventions whose studies do not pass WWC Standards. Topic Reports: Each topic report briefly describes the topic and each intervention that the WWC reviewed. The report covers only interventions that had studies passing WWC Standards. Topic reports are usually released at the same time as intervention reports because each topic level report provides a compilation of completed intervention reports. The topic report describes how the WWC searched the literature, describes the key features of interventions at the time they were studied, and presents the findings. The topic report also notes the over-all strength of the research base for each intervention, providing an accessible picture of interventions that met WWC evidence standards. The topic report links to all related intervention reports. The What Works Evidence Standards identify studies that provide the strongest evidence of effects: primarily well conducted randomized controlled trials and regression discontinuity studies, and secondarily quasi-experimental studies of especially strong design. In addition, the standards rate other important characteristics of study design, such as intervention fidelity, outcome measures, and generalizability." CRS-14 information can be made available as quickly as possible. After an intervention that meets WWC standards is reviewed, an intervention report will be posted on the website. After all such interventions on a specific topic have been reviewed, a topic report will be posted on the website. The information provided in intervention reports includes program descriptions, costs of implementing the programs, and ratings of program effectiveness -- including a category of "potentially positive" for promising results. Resources on SBRR specifically targeted to the Reading First program have also been provided by ED. 
These include information and links to additional resources provided in the Reading First and NCLBA websites.29 ED sponsored Reading First Leadership Academies to assist states with understanding and applying for Reading First grants, and it has issued nonregulatory guidance on Reading First.30 In addition, ED established a National Center for Reading First Technical Assistance to provide training to states and districts to assist with Reading First.31 According to ED in its March 1, 2004, issue of the Achiever, Administrators and teachers will receive training in scientifically based reading research and instruction; assistance in reviewing reading programs and assessments; critiques of Reading First sub-grant applications and methods of scoring them; and training in using assessment data to improve student reading performance.... Technical assistance will be provided through a range of learning opportunities, including national and regional conferences, institutes and seminars; training and professional development; on-site, telephone and e-mail consultations; and links to national reading experts. The National Institute for Literacy (NIFL) is charged with the mission of disseminating information on SBRR as it relates to children, youth, and adults. NIFL is also to disseminate information on specific reading programs supported by SBR and information on effective classroom reading programs that have been implemented by states and LEAs. NIFL publications are available for downloading on their website.32 Local Control. Perhaps in part because of the difficulties in finding specific information on SBRR based educational interventions that meet the requirements of the NCLBA, many states have chosen to rely upon a limited number of instructional, assessment and professional training programs. This has raised concerns by some about what they call the "overprescriptiveness" of ED's application of SBRR to Reading First and the potential infringement on states' and LEAs' ability to choose 29 See [http://www.ed.gov/programs/readingfirst/index.html], [http://www.ed.gov/nclb]. 30 ED does not endorse any particular program and has stated in print that there is no approved list of reading programs. However, the Reading Recovery Council, among others, cites the naming of particular programs as acceptable in RF Leadership Academies as an indication of ED's preference for particular programs. Investigation of Reading First Implementation Requested, Reading Recovery Council of North America, August 23, 2005. 31 The RMC Corporation is currently administering this Center and three Regional Centers; however, the RMC contract will expire in 2008 and a new contract competition will be held. 32 [http://www.nifl.org]. CRS-15 curricula. Some argue that this "overprescriptiveness" is not consistent with section 9527 of the No Child Left Behind Act. This section states the following: (a) GENERAL PROHIBITION -- Nothing in this Act shall be construed to authorize an officer or employee of the Federal Government to mandate, direct, or control a State, local educational agency, or school's curriculum, program of instruction, or allocation of State or local resources, or mandate a State or any subdivision thereof to spend any funds or incur any costs not paid for under this Act. (b) PROHIBITION ON ENDORSEMENT OF CURRICULUM. 
-- Notwithstanding any other prohibition of Federal law, no funds provided to the Department under this Act may be used by the Department to endorse, approve, or sanction any curriculum designed to be used in an elementary school or secondary school.33 The 2005 CEP study discussed earlier in this report did find that states were "remarkably consistent" in their choice of programs. For example, the 2005 CEP study found that many states were required to revise their initial application for Reading First before it was accepted. CEP found that in their final accepted applications, almost all states included DIBELS on their list of approved assessments, and used the Consumer's Guide to evaluate and choose a reading curriculum. Additionally, the CEP study found that state recommendations of specific reading programs appear to have influenced districts' choice of reading programs. The survey of districts receiving Reading First funds found that half changed the reading programs used by the district to qualify for a grant from their state. Office of the Inspector General Audits Three groups representing different reading programs filed separate complaints with ED's OIG, asking that the Reading First program be investigated. The three groups that filed complaints are Dr. Cupp's Readers and Journal Writers, Success For All, and the Reading Recovery Council of North America. In response, the OIG has conducted several audits of the Reading First program. It issued its first report on the federal Reading First program, specifically on Reading First's grant application process, in September of 2006. In addition, several audits of state Reading First programs have been issued, and audits have been conducted on ED's administration of the Reading First program and on the RMC Research Corporation's Reading First Contract.34 These three reports essentially validated many of the concerns that had been raised in complaints filed with the OIG. ED concurred with the OIG's recommendations in all three reports and has addressed the recommendations. 33 Elementary and Secondary Education Act of 1965, Section 9527. 34 The state audits were issued on October 3, 2005 (Alabama), October 20, 2006 (Wisconsin), November 3, 2006 (New York), and January 18, 2007 (Georgia). CRS-16 OIG Final Inspection Report: The Reading First Program's Grant Application Process The OIG report on the Reading First application process was highly critical of ED's implementation of the Reading First program. The major findings included in this report are summarized below. ! The OIG found that the expert review panel that reviewed state applications for Reading First grants was not selected as required by the NCLBA. Section 1203(c)(2)(A) of the NCLBA requires the peer review panel to include at a minimum, three individuals selected by each of the following agencies: the Secretary of the U.S. Department of Education, the National Institute for Literacy (NIFL), the National Research Council of the National Academy of Sciences (NAS), and three individuals selected by the National Institute of Child Health and Human Development. ED created 16 subpanels to review state applications, and according to the OIG, a majority of the panelists on 15 out of the 16 subpanels had been nominated by ED. In addition, none of the subpanels included a nominee from each of the other organizations specified in Section 1203(c)(2)(A) of the NCLBA. 
Moreover, the OIG found no evidence that the subpanels met to review applications as a whole before recommending that the Secretary approve or disapprove a state's application. The OIG's report states that "Because the Department did not meet the requirements at Sections 1203(c)(2)(A), it raises the question of whether any of the applications were approved in compliance with the law."35 ! Although not required to do so by law, ED screened potential panelists for conflicts of interest. However, the screening process used was ineffective, according to the OIG. The OIG reviewed resumes provided to ED by 25 Reading First panelists, and found that six of the panelists had significant professional connections to a specific reading program. ! ED failed to follow its own guidance (Reviewer Guidance for the Reading First Program) for conducting the peer review process. The OIG found that the review panelists provided constructive comments in the Panel Chair Summaries submitted to ED that would have been useful to states whose applications were not approved in making needed modifications to their applications. However, ED did not share these panel summaries with the states; instead, the Reading First director and his assistant used these panel summaries to write their own reports, which were then provided to states. 35 All of the discussion contained in this section is based on the OIG's inspection report and on ED's responses to the OIG report included as attachments to the report. U.S. Dept. of Education, Office of Inspector General, The Reading First Program's Grant Application Process, FINAL INSPECTION REPORT, September 2006. CRS-17 According to the OIG, these reports did not always accurately reflect the Panel Chair Summaries -- sometimes the Reading First director and his assistant changed or omitted panelists' comments, and sometimes they added their own comments. As a consequence, states sometimes lacked adequate information to correct their applications and were required to submit amended applications several times before they were approved. In addition, the OIG found that five state applications were approved without documentation that these states had met the required criteria, or that the subpanels had approved these applications. ! Some of the criteria required by the department for panelists to approve a state's application were not based on requirements included in the NCLBA. ED provided panelists with 25 criteria to be rated in each state application (Reading First: Criteria for Review of State Applications). Three rating categories were established for each criterion: "Exemplary," "Meets Standard," and "Does Not Meet Standard." The "Meets Standard" category was the bar all states were expected to meet for application approval. The "Exemplary" category was applied to conditions, above and beyond "Meets Standard," that were believed to result in the highest-quality programs. However, the OIG found that some of the requirements in the "Meets Standard" category were not requirements contained in the NCLBA, and as a consequence, "State applications were reviewed based upon standards that were not required by statute." ! Finally, the OIG found that "program officials tried to purposely obscure the content of the statute (the ESEA) and otherwise took actions to disregard Congress' direction and intent." The OIG also found that ED's "actions demonstrate that the program officials failed to maintain a controlled environment that exemplifies management integrity and accountability."
Further, the OIG found that ED's actions may have violated prohibitions in the Department of Education Organization Act (DEOA) and the ESEA against federal endorsement of particular curricula. The OIG recommended that the Assistant Secretary of ED's Office of Elementary and Secondary Education (OESE) take the following actions. ! Implement procedures to ensure OESE staff know when to solicit advice from the Office of the General Counsel (OGC); as well as procedures to resolve disputes that might arise between OESE staff and the OGC to "ensure that programs are managed in compliance with applicable laws and regulations." ! In consultation with the OGC, make improvements to strengthen procedures for evaluating potential conflicts of interest in panel review processes. CRS-18 ! Review all Reading First applications to ensure all necessary criteria were met. ! Make changes, as appropriate, to the management and staff structure of the Reading First program to ensure that Reading First's implementation is consistent with NCLBA requirements. ! Ask the OGC to provide guidance on what is prohibited by Section 3403(b) of the Department of Education Organization Act. ! Rely upon an internal advisory committee (which includes representatives from OESE programs, the OGC, and ED's Risk Management Team) to ensure that future initiatives are appropriately implemented and coordinated with other ED programs. ! Request that the internal advisory committee evaluate whether "the implementation of Reading First harmed the Federal interest," and whether any remedial actions are required. In addition, request that the internal advisory committee ensure that ED has internal controls in place so that future programs do not have problems similar to those that occurred with Reading First. ! Establish a discussion with state and local education representatives "to discuss issues with Reading First as part of the reauthorization process." The Secretary of the U.S. Department of Education (ED) responded in writing that she agreed with all of the recommendations of the OIG, and would take immediate action to implement these recommendations. However, ED also responded that it did not agree with all of the findings reached by the OIG. ED noted that it has no information to indicate that its peer review process adversely affected any state. It also noted that screening for conflicts of interest was not required -- but it took this extra effort and made reasonable efforts to adapt conflict-of-interest procedures to the Reading First program. Regarding its screening of panelists, ED stated that "We know that while additional steps could have been taken, the steps we took were effective and more than what was required by law." ED also indicated that the statute did not specify the role of peer review comments, and that it had not replaced a process required by the NCLBA. In addition, its further review of Reading First staff summaries of these comments found that, overall, " the summaries did not deviate significantly from the reviewers' comments." ED also stated that the peer review panel was advisory, and that it was not practical to have the panel review every resubmitted state application. In addition, ED noted that the Reading First criteria it issued to panelists was intended to "encourage high-quality projects that go beyond the minimum standards of the statute." 
ED stated that "Overall, the Reading First guidance has proven to be helpful and it is consistent with the law, and consistent with helping ensure the submission of high quality applications." Finally, ED stated that "We are not aware of information showing inappropriate actions to require particular programs or approaches."

OIG Final Audit Report: The Department's Administration of Selected Aspects of the Reading First Program

This audit focused on ED's administration of several aspects of the Reading First program: the Reading First Leadership Academies (RLAs) held in January and February of 2002; the Reading First website; ED's April 2002 Guidance for the Reading First Program; and ED's monitoring of conflicts of interest in its technical assistance contracts.36 The major findings included in the OIG's report are summarized below.

! The April 2002 Guidance for the Reading First Program and ED's administration of its Reading First website were consistent with the law.

! ED did not ensure that the RLAs complied with curriculum provisions contained in the DEOA and the NCLBA. In particular, the Theory to Practice sessions provided during the RLAs focused on a select number of reading programs, and the RLA Handbook and Guidebook appeared to promote DIBELS.

! ED did not adequately address issues regarding bias and objectivity when hiring technical assistance providers.

The OIG recommended that the Assistant Secretary of ED's OESE take the following actions:

! Establish controls to ensure that ED complies with all DEOA and NCLBA curriculum requirements in department-sponsored events.

! Establish controls to ensure that ED does not promote (or appear to promote) any specific curriculum in department-sponsored conference materials.

! In consultation with ED's Chief Financial Officer, establish procedures to ensure that all department contractors have been adequately assessed for bias and objectivity.

The Secretary of the U.S. Department of Education responded in writing that she agreed with all of the recommendations of the OIG, and would take immediate action to implement them. However, ED also responded that it did not agree with all of the findings reached by the OIG. ED said that the OIG report did not provide a balanced perspective of the activities discussed, and failed to mention the positive elements of these activities. ED argued that it was necessary to discuss specific reading programs in the Theory to Practice sessions held at the RLAs in order for these sessions to be useful to participants. Furthermore, ED noted that participants at the RLAs were told that the purpose of the sessions was not to endorse any particular reading program. In addition, ED noted that simply having expertise in a particular program should not disqualify an individual from serving as a provider of technical assistance, so long as the individual does not have a financial interest in the areas for which he or she provides advice.

36 All of the discussion contained in this section is based on the OIG's audit report and on ED's responses to the OIG report, included as attachments to the report. U.S. Department of Education, Office of Inspector General, The Department's Administration of Selected Aspects of the Reading First Program, FINAL AUDIT REPORT, February 2007.
OIG Final Audit Report: RMC Research Corporation's (RMC) Administration of the Reading First Program Contracts

This audit focused on RMC's Reading First technical assistance contracts. RMC was issued three contracts by ED.37 The first two contracts were to provide technical assistance to SEAs in preparing their Reading First applications and in transitioning to program implementation. The third contract was for RMC to manage three regional technical assistance centers that provide technical assistance to SEAs and LEAs during the program implementation phase. The OIG's major findings are summarized below.

! RMC did not adequately monitor its staff and its subcontractors' staff to ensure that there were no conflicts of interest or potential bias.

! In two instances, a particular assessment may have been inappropriately promoted to SEAs.

! RMC did not include ED's required conflict-of-interest clause in its contracts, and it did not adequately screen the technical assistance providers it used for affiliations with particular reading programs.

The OIG recommended that the Assistant Secretary of ED's OESE require RMC to work with the department to take the following actions:

! Implement formal conflict-of-interest procedures to be applied to all current and future contracts with ED.

! Investigate and try to remedy any instances of bias on the part of technical assistance providers on the National Technical Assistance Center contract.

! Develop and implement a conflict-of-interest certification form for all technical assistance providers.

RMC concurred with the OIG recommendations, and has consulted with ED and taken action to improve and strengthen its conflict-of-interest requirements.

37 All of the discussion contained in this section is based on the OIG's audit report and on RMC's responses to the OIG report, included as attachments to the report. U.S. Department of Education, Office of Inspector General, RMC Research Corporation's Administration of the Reading First Program Contracts, FINAL AUDIT REPORT, March 2007.

Congressional Oversight and Legislation

The House Committee on Education and Labor has held two oversight hearings on Reading First. The first hearing was held on April 20, 2007. Witnesses at the hearing included ED's Inspector General, John Higgins; the Director of the Reading First program (until September 2006), Chris Doherty; and three members of the Committee on Reading Assessments (Roland Good, Edward Kame'enui, and Deborah Simmons). The focus of the hearing was on the administration of Reading First under Doherty's leadership and on the connections between the DIBELS assessment program and the three panelists who had served on the Committee on Reading Assessments. The purpose of the second hearing, held on May 10, 2007, was to receive testimony from the Secretary of the U.S. Department of Education, Margaret Spellings, on the Reading First program and on the student loan program. On May 9, 2007, the Senate Committee on Health, Education, Labor and Pensions issued a report indicating that four out of five Reading First Technical Assistance Center (TAC) directors had financial ties with publishers while serving as TAC directors.
In its conclusion, the report notes that

    The Chairman's investigation reveals that four Reading First Technical Assistance Center directors -- subcontractors to the Department -- had substantial financial ties to publishing companies while simultaneously being responsible for providing technical assistance to states and school districts seeking guidance in selecting reading programs that would help them secure federal grants. These findings are troublesome because they diminish the integrity of the Reading First program. Congress should act to ensure that future conflicts of interest are identified and addressed.38

The report agreed with all of the OIG recommendations. In addition, it recommended that Congress adopt new requirements regarding financial disclosure to prevent future conflicts of interest by federal employees and others involved in the administration or implementation of K-12 education programs, as well as by those providing technical assistance.

H.R. 1939 (McKeon), the Reading First Improvement Act, was introduced on April 19, 2007, and referred to the House Committee on Education and Labor. The legislation would establish procedures for setting up a Reading First Advisory Committee and potential subcommittees, and would prohibit any one entity or individual from nominating a majority of the committee or subcommittee members. The bill would also require ED to establish stronger conflict-of-interest procedures; provide guidance on how the advisory committee and any subcommittees are to review and provide feedback on state applications; and ensure that decisions are well-documented and available to the public. The legislation would also prohibit ED from awarding a contract or subcontract for program evaluation to any entity that received a contract or subcontract to implement any aspect of Reading First. Additionally, it would require conflict-of-interest screening by contractors and subcontractors of all employees involved in the contract or subcontract.

38 U.S. Senate Health, Education, Labor and Pensions Committee, The Chairman's Report on the Conflicts of Interest Found in the Implementation of the Reading First Program at the Three Regional Technical Assistance Centers, May 9, 2007.