2019 Federal Standard of Excellence
U.S. Department of Education
Leadership (Score: 8)
Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY19?
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
- The Commissioner for the National Center for Education Evaluation and Regional Assistance (NCEE) serves as the Department of Education’s (ED) Evaluation Officer. ED’s Institute of Education Sciences (IES), with a budget of $616.5 million in FY19, is primarily responsible for education research, evaluation, and statistics. The NCEE Commissioner plans and oversees ED’s major evaluations and also supports the IES Director. IES employed approximately 160 full-time staff in FY19, including approximately 20 in NCEE.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
- ED has a designated Chief Data Officer, who serves as the Director of the Policy and Program Studies Service (PPSS) in the Office of Planning, Evaluation and Policy Development (OPEPD). PPSS has a staff of about 20 and serves as the Department’s internal analytics office. Historically, PPSS has performed data analysis, conducted short-term evaluations that support continuous improvement of program implementation, fostered a culture of data management and transparency, and worked closely with program offices and senior leadership to inform policy decisions with data and evidence.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
- The Assistant Secretary for the Office of Planning, Evaluation and Policy Development (OPEPD) reports to, and advises, the Secretary on matters relating to policy development and review; program performance measurement and evaluation; and the use of data and evidence to inform decision-making. The Director of IES reports to, and advises, the Secretary on matters related to research, evaluation, and statistics and coordinates education research and related activities carried out by IES with those carried out elsewhere in government.
- Officials from OPEPD and IES participate in the Department’s Policy Committee, which is led by the OPEPD Assistant Secretary. OPEPD officials serve a policy leadership function, managing the Secretary’s policy priorities, including evidence, while IES focuses on (a) bringing extant evidence to policy conversations and (b) suggesting how evidence can be built as part of policy initiatives. OPEPD plays a leading role in forming the Department’s policy positions, including on evidence, as expressed through annual budget requests and grant competition priorities. Both OPEPD and IES provide technical assistance to Congress to ensure evidence appropriately informs policy design.
- The Evidence Leadership Group (ELG) supports program staff who run evidence-based grant competitions and monitor evidence-based grant projects. It advises Department leadership and staff on how evidence can be used to improve Department programs and supports staff in the use of evidence.
- Upon official designation, the Evidence Act’s named officials began meeting regularly to improve coordination. ED’s new Data Governance Board, established and chaired by the Chief Data Officer in late FY19, similarly includes participation by both the Evaluation Officer and the Statistical Official.
Evaluation & Research (Score: 7)
Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence-building plan), and did it publicly release the findings of all completed program evaluations in FY19?
2.1 Did the agency have an agency-wide evaluation policy? (Example: Evidence Act 313(d))
- ED has a scientific integrity policy to ensure that all scientific activities (including research, development, testing, and evaluation) conducted and supported by ED are of the highest quality and integrity, can be trusted by the public, and contribute to sound decision-making. In January 2017, IES published “Evaluation Principles and Practices,” which describes the foundational principles that guide its evaluation studies and the key ways in which the principles are put into practice. That document is expected to serve as the foundation of ED’s formal evaluation policy, under development by the Evaluation Officer for consideration by the Evidence Leadership Group and, subsequently, senior ED leadership.
2.2 Did the agency have an agency-wide evaluation plan? (Example: Evidence Act 312(b))
- Since the passage of ESSA, IES has worked with partners across ED, including the Evidence Leadership Group, to prepare and submit to Congress a biennial, forward-looking evaluation plan covering all mandated and discretionary evaluations of education programs funded under ESSA (known as ED’s “8601 plan”). The current plan covers FY18 and FY19, and the process by which it is developed serves as the foundation for ED’s work on both its forthcoming Learning Agenda and Annual Evaluation Plan.
2.3 Did the agency have a learning agenda (evidence-building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including, but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda? (Example: Evidence Act 312)
- To develop its draft Learning Agenda, ED has expanded the question generation and prioritization process from ESSA-funded programs, operated by the Office of Elementary and Secondary Education (OESE), to all programs operated by all of its programmatic principal offices. To help align the draft Learning Agenda with ED’s Strategic Plan, the Evidence Leadership Group has been expanded to include a member from ED’s Performance Improvement Office, and work has begun to ensure that evidence needs from Strategic Plan Goal Teams are actively solicited. ED anticipates seeking external stakeholder feedback on the draft Learning Agenda in mid-2020.
2.4 Did the agency publicly release all completed program evaluations?
- ED’s FY 2018 Annual Performance Report and FY 2020 Annual Performance Plan includes a list of ED’s current evaluations in Appendix D, organized by topic. IES also maintains profiles of all its evaluations on its website, including key findings, publications, and products. IES publicly releases all peer-reviewed publications from its evaluations on the IES website and in the Education Resources Information Center (ERIC). IES regularly conducts briefings on its evaluations for ED, the Office of Management and Budget, Congressional staff, and the public.
2.5 What is the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts? (Example: Evidence Act 315, subchapter II (c)(3)(9))
- To develop its draft Annual Evaluation Plan, ED will expand its current 8601 plan format to include the broader set of evidence activities implied by the Learning Agenda. This includes evaluation studies, as is typical for the 8601 plan, as well as planned evidence-building activities beyond evaluations, such as significant administrative data collections, improvements to ED’s performance monitoring activities, use of evidence in grant competitions, relevant policy studies, and advanced statistical and analytic activities (e.g., predictive modeling of student, borrower, and institutional behavior at Federal Student Aid).
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
- The IES website includes a searchable database of evaluations, including those that use experimental, quasi-experimental, or regression discontinuity designs to determine impact. As of August 2019, IES has published 41 experimental studies, 1 quasi-experimental study, and 5 regression discontinuity studies. The What Works Clearinghouse lists studies by design: its database currently includes 10,646 studies, of which 1,115 meet WWC standards for internal validity, including randomized controlled trials (currently 747), quasi-experimental studies (currently 217), regression discontinuity designs (currently 4), and single-case designs (currently 49). A minimal sketch of the random assignment logic behind such studies appears below.
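To make the distinction among these designs concrete, here is a minimal, hypothetical sketch of a randomized trial analyzed by a difference in group means. The sample size, score scale, and effect size are invented for illustration and do not come from any IES study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated trial: 1,000 students, each randomly assigned to the intervention
# with probability 0.5; outcomes carry a hypothetical +5-point true effect.
n = 1000
treated = rng.integers(0, 2, size=n).astype(bool)
outcome = rng.normal(250, 30, size=n) + 5 * treated

# Because assignment is random, the difference in group means is an unbiased
# estimate of the intervention's average effect.
effect = outcome[treated].mean() - outcome[~treated].mean()
t_stat, p_value = stats.ttest_ind(outcome[treated], outcome[~treated], equal_var=False)
print(f"estimated impact: {effect:.2f} points (p = {p_value:.3f})")
```

Quasi-experimental and regression discontinuity designs approximate this comparison when randomization is not feasible, at the cost of stronger assumptions.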
Resources (Score: 5)
Did the agency invest at least 1% of program funds in evaluations in FY19? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance; rigorous evaluations, including random assignment studies)
3.1. ____ (Name of agency) invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY19 budget.
- ED invested $53.5 million on evaluations, evaluation technical assistance, and evaluation capacity-building, representing 0.11% of the agency’s $47.9 billion discretionary budget (not including Student Financial Assistance and administrative funds) in FY19.
- This total reflects a targeted definition of program funds dedicated to evaluation, including impact studies and implementation studies. The timing of evaluation projects and the types of research projects proposed by the field cause year-to-year fluctuations in this amount; such fluctuations do not reflect a change in ED’s commitment to evaluation. (A quick arithmetic check of the 0.11% figure follows.)
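As a quick arithmetic check of the 0.11% figure (a one-off sketch; the dollar amounts are taken directly from the bullets above):

```python
# Reported figures: $53.5 million invested in evaluation activities out of a
# $47.9 billion FY19 discretionary budget.
evaluation_investment = 53.5e6
discretionary_budget = 47.9e9

share = evaluation_investment / discretionary_budget * 100
print(f"{share:.2f}% of the discretionary budget")  # -> 0.11%
```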
3.2 Did the agency have a budget for evaluation and how much was it? (Were there any changes in this budget from the previous fiscal year?)
- In FY19, ED spent $53.5 million on evaluation-related activities, an increase from the $39.7 million spent in FY18.
- This amount included $39.7 million spent on evaluations in FY19, a slight increase from the roughly $38 million spent in FY18. While ED does not have a specific budget solely for evaluation, ESEA authorizes it to reserve up to 0.5% of ESEA program funds for evaluation activities. Other funding sources include the IES budget and program funds that require evaluations.
- The FY20 President’s Budget proposed a new pooled evaluation authority in the Higher Education Act (HEA), similar to that of the ESEA, that would permit the Department to reserve up to 0.5% of the funding appropriated for each HEA program (except the Pell Grant program) to support rigorous independent evaluations and the collection and analysis of data on student outcomes across all HEA programs.
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
- Since FY15, IES has supported a cohort of 16 state grantees, with awards totaling approximately $24 million, under the Statewide Longitudinal Data Systems (SLDS) grant program. These grants support efforts related to (1) increasing the use of data for decision-making; (2) conducting training on data use, data tools, or accessing data and reporting systems; and (3) utilizing research and analysis results. In FY19, IES announced a new round of SLDS funding totaling $26.1 million.
- The Regional Education Laboratories (RELs) provide extensive technical assistance on evaluation and support research alliances that conduct implementation and impact studies on education policies and programs in ten geographic regions of the U.S., covering all states, territories, and the District of Columbia. Congress appropriated $55.4 million for the RELs in FY19.
- Comprehensive Centers provide support to States in planning and implementing interventions through coaching, peer-to-peer learning opportunities, and ongoing direct support. The State Implementation and Scaling Up of Evidence-Based Practices Center provides tools, training modules, and resources on implementation planning and monitoring.
Performance Management / Continuous Improvement (Score: 8)
Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY19?
(Example: Performance stat systems, frequent outcomes-focused data-informed meetings)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- ED’s FY18-22 Strategic Plan includes two parallel objectives, one for P-12 and one for higher education (Strategic Objectives 1.4 and 2.2, respectively), that focus on supporting agencies and educational institutions in identifying and using evidence-based strategies and practices.
- The Department’s FY 2018 Annual Performance Report and FY 2020 Annual Performance Plan contains FY18 performance results, including the evidence metrics under Strategic Objective 5.3 of the previous Strategic Plan, for which most established targets were met.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- The newly formed Grants Policy Office in the Office of Planning, Evaluation and Policy Development (OPEPD) works with offices across the Department to ensure alignment with the Secretary’s priorities, including evidence-based practices. The Grants Policy Office looks at where the Department and the field can continuously improve by building stronger evidence, making decisions based on a clear understanding of the available evidence, and disseminating evidence to decision makers. Specific activities include: strengthening the connection between the Secretary’s policies and grant implementation from design through evaluation; supporting a culture of evidence-based practices; providing guidance to grant-making offices on how to integrate evidence into program design; and identifying opportunities where the Department and field can improve by building, understanding, and using evidence.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
- As described in 4.2, the newly formed Grants Policy Office in OPEPD supports a continuous improvement cycle across the Department’s grant-making: strengthening the connection between the Secretary’s policies and grant implementation from design through evaluation, guiding grant-making offices on integrating evidence into program design, and identifying opportunities for the Department and the field to improve by building, understanding, and using evidence.
Data (Score: 6)
Did the agency collect, analyze, share, and use high-quality administrative and survey data, consistent with strong privacy protections, to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers’ programs in FY19? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies; data-use policies)
5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
- ED’s FY18-22 Performance Plan outlines strategic goals and objectives for the Department, including Goal #3: “Strengthen the quality, accessibility and use of education data through better management, increased privacy protections and transparency.” This currently serves as a strategic plan for the Department’s governance, protection, and use of data while it develops the Open Data Plan required by the Evidence Act. The plan tracks measurable performance on a number of metrics, including the public availability of machine-readable datasets and open licensing requirements for deliverables created with Department grant funds. ED will continue to expand its open data infrastructure to improve how stakeholders find, access, and manage the Department’s public data, including by establishing an enterprise open data platform that will make the Department’s public data discoverable from a single location and easily searchable by topic. As required by the Evidence Act, the Department will publish its Open Data Plan in 2020 within the agency’s Information Resource Management Strategic Plan.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
- Information about Department data collected by the National Center for Education Statistics (NCES) has historically been made publicly available online. Prioritized data is further documented or featured on the Department’s data page.
- In FY20, the Department will launch an open data platform designed for improved public engagement and tailored to meet the requirements of the comprehensive data asset inventory described in the Evidence Act. (A sketch of querying federal open data catalogs programmatically appears below.)
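Until that platform launches, ED’s public datasets are catalogued on data.gov, which exposes CKAN’s standard search API. The sketch below is illustrative only: the organization slug "ed-gov" is an assumption to verify against the catalog, not a value documented in this report.

```python
import requests

# Query catalog.data.gov's CKAN package_search endpoint for a sample of
# Department of Education datasets. NOTE: the organization slug "ed-gov"
# is an assumption; confirm the Department's actual identifier first.
resp = requests.get(
    "https://catalog.data.gov/api/3/action/package_search",
    params={"fq": 'organization:"ed-gov"', "rows": 5},
    timeout=30,
)
resp.raise_for_status()
for dataset in resp.json()["result"]["results"]:
    print(dataset["title"])
```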
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
- In ED’s FY18-22 Performance Plan, Strategic Objective 3.3 is to “Increase access to, and use of education data to make informed decisions both at the Department and in the education community,” and the plan outlines actions taken in FY18. In 2019, IES generated analysis and reporting of a new type of data produced by the NAEP digitally based assessments, including a detailed statistical guide (via an R package) that facilitates external researchers’ computation of analytic weights (see the sketch after this list). Additionally, IES added 12 longitudinal data sets to the DataLab system, improving access to NCES sample survey data. In 2018, ED publicly released 126 data sets in machine-readable formats, and as of September 2019 the Department is on track to exceed its 2019 target for publicly released machine-readable datasets.
- Through the leadership of NCES, ED continues to invest in clearer, more defined standards for education data. The Common Education Data Standards (CEDS) have been developed over the past 10 years through an open process that engages a broad range of data stakeholders, including local and state education agencies, postsecondary institutions, and interested organizations. CEDS establishes a common vocabulary, a data model, and technical tools to help education stakeholders understand and use education data.
- ED has also made concerted efforts to improve the availability and use of its data with the release of the revised College Scorecard, which links data from NCES, the Office of Federal Student Aid (FSA), and the Internal Revenue Service. In FY19, the Department released provisional data describing student debt at the field-of-study level. ED plans to integrate additional field-of-study data into its College Scorecard consumer site and FSA’s NextGen student tools.
- In September 2019, the Department established an agency-level Data Governance Body (DGB), chaired by the Chief Data Officer (CDO), with participation from relevant senior-level staff in agency business units. The DGB will assist the CDO in assessing and adjudicating competing proposals aimed at achieving and measuring desirable Departmental data outcomes and priorities.
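The analytic-weights guide mentioned above ships as an R package from IES; as a language-neutral illustration of the underlying idea, here is a minimal Python sketch of a weighted estimate on simulated data. The scores and weights are invented, and real NAEP analysis additionally involves plausible values and replicate weights for variance estimation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical assessment scores and analytic (sampling) weights: each weight
# reflects how many students in the population a sampled student represents.
scores = rng.normal(260, 35, size=500)
weights = rng.uniform(0.5, 3.0, size=500)

# Weighting corrects for unequal selection probabilities in the sample design.
weighted_mean = np.average(scores, weights=weights)
unweighted_mean = scores.mean()
print(f"weighted mean: {weighted_mean:.1f}, unweighted mean: {unweighted_mean:.1f}")
```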
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
- The Disclosure Review Board, the EDFacts Governing Board, the Student Privacy Policy Office (SPPO), and SPPO’s Privacy Technical Assistance Center all help to ensure the quality and privacy of education data. In FY19, the ED Data Strategy Team also published a user resource guide for staff on disclosure avoidance considerations throughout the data lifecycle.
- In ED’s FY18-22 Performance Plan, Strategic Objective 3.2 is to “Improve privacy protections for, and transparency of, education data both at the Department and in the education community,” and the plan outlines actions taken in FY18. ED’s Student Privacy website helps stakeholders protect student privacy by providing official guidance on FERPA, technical best practices, and answers to frequently asked questions. ED’s Privacy Technical Assistance Center (PTAC) responded to more than 3,200 technical assistance inquiries on student privacy issues and provided online FERPA training to more than 57,000 state and school district officials. FSA conducted a postsecondary institution breach response assessment to determine the extent of a potential breach and to provide institutions with remediation actions for protecting FSA data, along with associated cybersecurity best practices.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
- InformED, ED’s primary open data initiative, works to improve the Department’s capacity to make public education data accessible and usable in innovative and effective ways for families, policy makers, researchers, developers, advocates, and other stakeholders.
- In FY18, ED developed and released a series of three data stories focused on the characteristics, educational experiences, and academic outcomes of English learners. ED also updated its data story on chronic absenteeism and released a new data story on career and technical education. These data stories pair interactive graphics with narrative text to promote better access to, and use of, Department data by a wider variety of stakeholders. Also in FY18, the Office of Special Education and Rehabilitative Services supported technical assistance centers in conducting several conferences to assist states in using their Individuals with Disabilities Education Act (IDEA) data to make informed decisions.
Common Evidence Standards / What Works Designations (Score: 9)
Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY19? (Example: What Works Clearinghouses)
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- ED has an agency-wide framework for evidence that is based on ratings of studies’ internal validity. ED evidence-building activities are designed to meet the highest standards of internal validity (typically randomized controlled trials) when causality must be established for policy development or program evaluation purposes; when random assignment is not feasible, rigorous quasi-experiments are conducted. The framework was developed and is maintained by IES’s What Works Clearinghouse™ (WWC).
- Since 2002, ED, as part of its compliance with the Information Quality Act and OMB guidance, has required that all “research and evaluation information products documenting cause and effect relationships or evidence of effectiveness should meet the quality standards that will be developed as part of the What Works Clearinghouse” (see ED’s Information Quality Guidelines). Those standards, currently in their fourth version, are maintained on the WWC website, along with a stylized representation of the standards and information about how ED reports findings from research and evaluations that meet them.
6.2 Did the agency have a common evidence framework for funding decisions?
- ED’s evidence standards for its grant programs, as outlined in the Education Department General Administrative Regulations (EDGAR), build on the What Works Clearinghouse™ (WWC) research design standards. ED employs these same evidence standards in all of its discretionary grant competitions that use evidence to direct funds to applicants proposing projects that have evidence of effectiveness and/or that would build new evidence through evaluation.
6.3 Did the agency have a user friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- ED’s What Works Clearinghouse™ (WWC) identifies studies that provide valid and statistically significant evidence of the effectiveness of a given practice, product, program, or policy (referred to as “interventions”) and disseminates summary information and reports on the WWC website. The WWC has reviewed more than 10,600 studies, which are available in a searchable database, and has committed to reviewing all publicly available evaluation reports generated under i3 grants. In spring 2019, the WWC tagged each study in its database to indicate whether its findings met EDGAR (and therefore ESSA) Tier 1/Strong Evidence or Tier 2/Moderate Evidence standards, making it easier for users to identify evidence-based interventions; a simplified sketch of that tier logic follows.
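For readers unfamiliar with the tiers, the hypothetical function below sketches the general shape of that design-to-tier mapping. It is a deliberate simplification under stated assumptions: an actual WWC/EDGAR review also applies detailed standards (sample size and multi-site requirements, attrition thresholds, baseline equivalence) that this toy function ignores.

```python
def essa_tier(design: str, favorable_significant_finding: bool) -> str:
    """Toy mapping from study design to ESSA evidence tier (simplified)."""
    if not favorable_significant_finding:
        return "Does not meet Tiers 1-3"
    tiers = {
        "randomized controlled trial": "Tier 1 / Strong Evidence",
        "quasi-experimental": "Tier 2 / Moderate Evidence",
        "correlational with statistical controls": "Tier 3 / Promising Evidence",
    }
    return tiers.get(design, "Tier 4 / Demonstrates a Rationale")

print(essa_tier("randomized controlled trial", True))  # Tier 1 / Strong Evidence
```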
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- IES has funded two projects to study and promote knowledge utilization in education, including the Center for Research Use in Education and the National Center for Research in Policy and Practice.
- The Evidence Leadership Group has coordinated the development of revised evidence definitions and related selection criteria for competitive programs that align with ESSA to streamline and clarify provisions for grantees. These revised definitions align with ED’s suggested criteria for states’ implementation of ESSA’s four evidence levels, included in ED’s non-regulatory guidance, Using Evidence to Strengthen Education Investments. ED also developed a fact sheet to support internal and external stakeholders in understanding the revised evidence definitions. This document has been shared with internal and external stakeholders through multiple methods, including the Office of Elementary and Secondary Education ESSA technical assistance page for grantees.
- WWC Practice Guides are based on reviews of research and experience of practitioners. These guides are designed to address challenges in classrooms and schools. The WWC released two new Practice Guides in FY19: Using Technology to Support Postsecondary Student Learning and Improving Mathematical Problem Solving in Grades 4 through 8. WWC began three more guides in FY19: Assisting Students Struggling in Mathematics; Career and Technical Education Programs in Community College Settings; and Supporting Prosocial and Positive Behavior.
- IES manages the Regional Educational Laboratory (REL) program, which supports districts, states, and boards of education throughout the United States to use research and evaluation in decision making. The research priorities are determined locally, but IES approves the studies and reviews the final products.
Innovation (Score: 4)
Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY19? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with rigorous evaluation requirements)
7.1 Did the agency engage leadership and staff in its innovation efforts?
- In FY19, the Office of Elementary and Secondary Education made strategic investments in innovative educational programs and practices and administered discretionary grant programs; the Innovation and Improvement account received $1.035 billion. The Department reorganized in 2019, consolidating the Office of Innovation and Improvement into the Office of Elementary and Secondary Education.
7.2 Did the agency have policies that promote innovation?
- ED uses the Experimental Sites Initiative under section 487A(b) of the Higher Education Act of 1965, as amended, to test the effectiveness of statutory and regulatory flexibility for participating institutions disbursing Title IV student aid. ED has waived specific statutory or regulatory requirements at the postsecondary institutions, or consortia of institutions, approved to participate in the experiments. The outcomes of experiments have the potential to benefit all postsecondary institutions and the students they serve.
7.3 Did the agency have processes, structures, or programs to stimulate innovation?
- The Education Innovation and Research (EIR) program is ED’s primary innovation program for K–12 public education. EIR grants are focused on validating and scaling evidence-based practices and encouraging innovative approaches to persistent challenges. The EIR program incorporates a tiered-evidence framework that supports larger awards for projects with the strongest evidence base as well as promising earlier-stage projects that are willing to undergo rigorous evaluation. Funds may be used for: (1) early-phase grants for the development, implementation, and feasibility testing of an intervention or innovation, which prior research suggests has promise, in order to determine whether the intervention can improve student academic outcomes; (2) mid-phase grants for implementation and rigorous evaluation of interventions that have been successfully implemented under early-phase grants or have met similar criteria for documenting program effectiveness; and (3) expansion and replication of interventions or innovations that have been found to produce a sizable impact under a mid-phase grant or have met similar criteria for documenting program effectiveness.
- The IES Research Grants Program supports the development and iterative testing of new, innovative approaches to improving education outcomes. IES makes research grants with a goal structure including “Goal 2: Development and Innovation,” which supports the development of new education curricula; instructional approaches; professional development; technology; and practices, programs, and policies that are implemented at the student-, classroom-, school-, district-, state-, or federal-level to improve student education outcomes. The Small Business Innovation Research Program provides funds for rapid prototype development and evaluation as well as for product development and evaluation. The related ED Games Expo promotes emerging education technology by allowing for demonstration of game-based learning.
7.4. Did the agency evaluate its innovation efforts, including using rigorous methods?
- ED is currently implementing the Experimental Sites Initiative to assess the effects of statutory and regulatory flexibility for participating institutions disbursing Title IV student aid. ED collects performance data from all participating institutions, and IES is currently conducting rigorous evaluations of selected Experimental Sites, including two related to short-term Pell grants.
- The Education Innovation and Research (EIR) program, ED’s primary innovation program for K–12 public education, incorporates a tiered-evidence framework that supports larger awards for projects with the strongest evidence base as well as promising earlier-stage projects that are willing to undergo rigorous evaluation.
Use of Evidence in 5 Largest Competitive Grant Programs (Score: 13)
Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY19? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring for evidence; Pay for Success provisions)
8.1 What were the agency’s 5 largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
- ED’s five largest competitive grant programs in FY19 were: (1) TRIO ($1.06 billion; eligible grantees: institutions of higher education, public and private organizations); (2) Charter Schools Program ($440 million; eligible grantees: local charter schools); (3) GEAR UP ($360 million; eligible grantees: state agencies; partnerships that include IHEs and LEAs); (4) Teacher and School Leader Incentive Program (TSL) ($200 million; eligible grantees: local education agencies, partnerships between state and local education agencies); and (5) Comprehensive Literacy Development Grants ($190 million; eligible grantees: state education agencies).
8.2 Did the agency use evidence of effectiveness to allocate funds in 5 largest competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
- ED uses evidence of effectiveness when making awards in its largest competitive grant programs.
- The vast majority of TRIO funding in FY19 supported continuation awards to grantees that succeeded in prior competitions, which awarded competitive preference priority points to projects proposing strategies supported by moderate evidence of effectiveness.
- Under the Charter Schools Program, ED generally requires or encourages applicants to support their projects through logic models; however, applicants are not expected to develop their applications based on rigorous evidence. Within the CSP program, the Grants to Charter School Management Organizations for the Replication and Expansion of High-Quality Charter Schools (CMO Grants) supports charter schools with a previous track record of success.
- For the 2019 competition for GEAR UP State awards, ED used a competitive preference priority for projects implementing activities that are supported by promising evidence of effectiveness.
- The TSL statute requires applicants to provide a description of the rationale for their project and describe how the proposed activities are evidence-based, and grantees are held to these standards in the implementation of the program.
- The Comprehensive Literacy Development (CLD) statute requires that grantees provide subgrants to local educational agencies that conduct evidence-based literacy interventions. ESSA requires ED to give priority to applicants demonstrating strong, moderate, or promising levels of evidence.
8.3 Did the agency use its 5 largest competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
- The Evidence Leadership Group (ELG) advises program offices on ways to incorporate evidence in grant programs through encouraging or requiring applicants to propose projects that are based on research and by encouraging applicants to design evaluations for their proposed projects that would build new evidence.
- ED’s grant programs require some form of annual evaluation report to build evidence, demonstrate performance improvement, and account for the use of funds; see, for example, the annual performance reports explicitly required by TRIO, the Charter Schools Program, and GEAR UP. The Teacher and School Leader Incentive Program is required by ESSA to conduct a national evaluation, and the Comprehensive Literacy Development grants require evaluation reports. In addition, IES is currently conducting rigorous evaluations to identify successful practices in TRIO-Upward Bound, TRIO-Educational Opportunity Centers, and GEAR UP.
8.4 Did the agency use evidence of effectiveness to allocate funds in any competitive grant program?
- The Education Innovation and Research (EIR) program supports the creation, development, implementation, replication, and scaling up of entrepreneurial, evidence-based, field-initiated innovations designed to improve student achievement and attainment for high-need students. The program uses three evidence tiers to allocate funds based on evidence of effectiveness, with larger awards going to applicants that can demonstrate stronger levels of prior evidence and produce stronger evidence of effectiveness through a rigorous, independent evaluation. The FY19 competition included checklists and PowerPoint presentations to help applicants clearly understand the evidence requirements.
- The Department incorporates the evidence standards established in EDGAR as priorities and selection criteria in many competitive grant programs.
8.5 What are the agency’s 1-2 strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
- The Education Innovation and Research (EIR) program supports the creation, development, implementation, replication, and scaling up of evidence-based, field-initiated innovations designed to improve student achievement and attainment for high-need students. IES released The Investing in Innovation Fund: Summary of 67 Evaluations, which can be used to inform efforts to move toward more effective practices. The Department is exploring the results to determine what lessons can be applied to other programs.
8.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
- In 2016, ED released non-regulatory guidance to provide state educational agencies, local educational agencies (LEAs), schools, educators, and partner organizations with information to assist them in selecting and using “evidence-based” activities, strategies, and interventions, as defined by ESSA, including carrying out evaluations to “examine and reflect” on how interventions are working. However, the guidance does not specify that federal competitive funds can be used to conduct such evaluations. Frequently, though, programs do include a requirement to evaluate the grant during and after the project period.
Use of Evidence in 5 Largest Non-Competitive Grant Programs (Score: 7)
Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY19? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)
9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
- (1) Title I Grants to LEAs ($15.76 billion; eligible grantees: state education agencies); (2) IDEA Grants to States ($12.4 billion; eligible grantees: state education agencies); (3) Supporting Effective Instruction State Grants ($2.1 billion; eligible grantees: state education agencies); (4) Impact Aid Payments for Federally Connected Children ($1.19 billion; eligible grantees: local education agencies); and (5) 21st Century Community Learning Centers ($1.2 billion; eligible grantees: state education agencies).
9.2 Did the agency use evidence of effectiveness to allocate funds in largest 5 non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
- ED worked with Congress in FY16 to ensure that evidence played a major role in ED’s large non-competitive grant programs in the reauthorized ESEA. As a result, section 1003 of ESSA requires states to set aside at least 7% of their Title I, Part A funds for a range of activities to help school districts improve low-performing schools. School districts and individual schools are required to create action plans that include “evidence-based” interventions that demonstrate strong, moderate, or promising levels of evidence.
9.3 Did the agency use its 5 largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
- ESEA requires a National Assessment of Title I (Improving the Academic Achievement of the Disadvantaged). In addition, Title I grants require state education agencies to report on school performance, including for schools identified for comprehensive or targeted support and improvement.
- Federal law (ESEA) requires states receiving funds from 21st Century Community Learning Centers to “evaluate the effectiveness of programs and activities” that are carried out with federal funds (section 4203(a)(14)), and it requires local recipients of those funds to conduct periodic evaluations in conjunction with the state evaluation (section 4205(b)).
- The Office of Special Education Programs (OSEP), the implementing office for IDEA grants to states, has revised its accountability system to shift the balance from a system focused primarily on compliance to one that puts more emphasis on results through the use of Results Driven Accountability.
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
- Section 4108 of ESEA authorizes school districts to invest “safe and healthy students” funds in Pay for Success initiatives. Section 1424 of ESEA authorizes school districts to invest their Title I, Part D funds (Prevention and Intervention Programs for Children and Youth Who Are Neglected, Delinquent, or At-Risk) in Pay for Success initiatives, and under section 1415 of the same program a state agency may use funds for Pay for Success initiatives.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
- States and school districts are beginning to implement the requirements in Title I of the ESEA regarding using evidence-based interventions in school improvement plans. Some States are providing training or practice guides to help schools and districts identify evidence-based practices.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
- In 2016, ED released non-regulatory guidance to provide state educational agencies, local educational agencies (LEAs), schools, educators, and partner organizations with information to assist them in selecting and using “evidence-based” activities, strategies, and interventions, as defined by ESSA, including carrying out evaluations to “examine and reflect” on how interventions are working. However, the guidance does not specify that federal non-competitive funds can be used to conduct such evaluations.
Repurpose for Results (Score: 6)
In FY19, did the agency shift funds away from or within any practice, policy, or program that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; incentivizing or urging grant applicants to stop using ineffective practices in funding announcements; proposing the elimination of ineffective programs through annual budget requests; incentivizing well-designed trials to fill specific knowledge gaps; supporting low-performing grantees through mentoring, improvement plans, and other forms of assistance; using rigorous evaluation results to shift funds away from a program)
10.1 Did the agency shift funds/resources away from ineffective practices or interventions used within programs or by grantees?
- ED uses evidence in competitive programs to encourage the field to shift away from less effective practices and toward more effective ones. For example, the Education Innovation and Research (EIR) program supports the creation, development, implementation, replication, and scaling up of evidence-based, field-initiated innovations designed to improve student achievement and attainment for high-need students. IES released The Investing in Innovation Fund: Summary of 67 Evaluations, which can be used to inform efforts to move toward more effective practices, and the Department is exploring the results to determine what lessons can be applied to other programs.
10.2 Did the agency shift funds/resources away from ineffective policies used within programs or by grantees?
- In 2015, Congress included a new provision in ESEA limiting school districts that use Title II-A (Supporting Effective Instruction) funds for class-size reduction to reducing class size only to a level that is evidence-based, to the extent evidence is available. The provision drew in part on past research findings indicating that class-size reduction is most effective for low-income students in early grades.
10.3 Did the agency shift funds/resources away from ineffective grantees?
- Program officers review discretionary grantees annually to determine whether they are making substantial progress. If a grantee has not made substantial progress, the Department may reduce an award in a subsequent year or decline to continue funding the grantee.
- ED is implementing a number of Pay for Success projects which seek to shift resources away from grantees that fail to meet project goals:
- Career and Technical Education (CTE): $2 million to support the development of PFS projects to implement new or scale up existing high-quality CTE opportunities.
- Early Learning: $3 million for Preschool Pay for Success feasibility pilots to support innovative funding strategies that expand preschool and improve educational outcomes for 3- and 4-year-olds. These grants allow states, school districts, and other local government agencies to explore whether Pay for Success is a viable financing mechanism for expanding and improving preschool in their communities.
- ED seeks to shift program funds toward more effective practices by prioritizing the use of evidence as a requirement when applying for a competitive grant. For grant competitions where data about current or past grantees exist, or where new evidence has emerged independent of grantee activities, ED typically reviews such data to shape the design of future competitions.
10.4 Did the agency shift funds/resources away from ineffective programs? (e.g., eliminations or legislative language in budget requests)
- The President’s FY20 Budget eliminates, streamlines, or reduces 39 discretionary programs that duplicate other programs, are ineffective, or are supported with state, local, or private funds. Major eliminations and reductions in the FY20 Budget proposal include:
- Supporting Effective Instruction State Grants (Title II-A), a savings of $2.1 billion. The program is proposed for elimination because it lacks evidence of improving student outcomes (see pp. C-16 to C-19 of the FY20 budget request) and duplicates other ESEA program funds that may be used for professional development.
- 21st Century Community Learning Centers program, a savings of $1.2 billion. The program lacks strong evidence of meeting its objectives, such as improving student achievement. Based on program performance data from the 2014-2015 school year, more than half of program participants had no improvement in their math and English grades and nearly 60% of participants attended centers for fewer than 30 days (see pp. C-21 to C-26 in the FY20 budget request).
- In the previous administration, ED worked with Congress to eliminate 50 programs, saving more than $1.2 billion, including Even Start (see pp. A-72 to A-73; -$66.5 million in FY11) and Mentoring Grants (see p. G-31; -$47.3 million in FY10), both of which the Department recommended eliminating based on evidence concerns.
10.5 Did the agency shift funds/resources away from consistently ineffective products and services?
- As noted earlier, program officers review discretionary grantees annually to determine whether they are making substantial progress. If a grantee has not made substantial progress, the Department may reduce an award in a subsequent year or decline to continue funding the grantee.