2019 Federal Standard of Excellence


Use of Evidence in 5 Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY19? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

Score
6
Administration for Children and Families (HHS)
9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its 5 largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • The Family First Prevention Services Act (FFPSA) (Division E, Title VII of the Bipartisan Budget Act of 2018), funded under the Foster Care budget, newly enables States to use Federal funds available under parts B and E of Title IV of the Social Security Act to provide enhanced support to children and families and prevent foster care placements through the provision of evidence-based mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services. FFPSA requires an independent systematic review of evidence to designate programs and services as “promising,” “supported,” and “well-supported” practices. Only interventions designated as evidence-based will be eligible for federal funds.
9.3 Did the agency use its 5 largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • TANF Grant Program: ACF has a long-standing and ongoing research portfolio in service of building evidence for the TANF Grant Program. Since FY17, Congress has designated 0.33% of the TANF Block Grant for research, evaluation, and technical assistance related to the TANF Block Grant. ACF conducts research and evaluation projects in collaboration with TANF grantees, typically in areas where TANF grantees are facing challenges, innovating, or carrying out demonstration projects. This ongoing work includes building evidence around career pathways training programs, subsidized employment approaches, job search approaches, and employment coaching interventions. These are all program approaches used by state and county TANF grantees to meet their employment goals.
  • Child Care Development Block Grant Program: The Child Care Development Block Grant Act allows for up to one-half of one percent of CCDBG funding for a fiscal year to be reserved to conduct research and evaluation of the CCDBG grant program. ACF manages this ongoing research portfolio to build evidence for the Child Care and Development Block Grant Program (CCDBG), conducting research and evaluation projects in collaboration with CCDBG grantees, typically in areas where CCDBG grantees are facing challenges, innovating, or carrying out demonstration projects. Major projects in recent years include the National Survey of Early Care and Education; assessment of evidence on ratings in Quality Rating and Improvement Systems (QRIS); and several research partnerships between CCDF lead agencies and researchers.
  • Child Welfare Grant Programs: ACF has an ongoing research portfolio on the Title IV-E foster care grant program and related grant programs. ACF conducts research and evaluation in collaboration with child welfare grantees, typically focusing on areas in which grantees are facing challenges, innovating, or conducting demonstrations. Examples include strategies for prevention of maltreatment, meeting service needs, and improving outcomes for children who come to the attention of child welfare. Major projects include the National Survey of Child and Adolescent Well-Being (NSCAW) and a Supporting Evidence Building in Child Welfare project to increase the number of evidence-supported interventions grantees can use to serve the child welfare population.
  • Child Support Enforcement Research and Evaluation Grant Program: Section 1115 of the Social Security Act provides unique authority for research and evaluation grants to child support enforcement grantees to “improve the financial well-being of children or otherwise improve the operation of the child support program.” ACF manages the child support enforcement research portfolio and administers a variety of research/evaluation components to understand more about cost and program effectiveness. Research and evaluation within the portfolio have consisted of 1) supporting large multi-state demonstrations which include random assignment evaluations (described in criteria question 7.4), 2) funding a supplement to the Census Bureau’s Current Population survey, and 3) supporting research activities of other government programs and agencies by conducting matches of their research samples to the NDNH.
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
  • States applying for funding from ACF’s Community Based Child Abuse Prevention (CBCAP) grant program must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” The Children’s Bureau defines evidence-based and evidence-informed programs and practices along a continuum with four categories: Emerging and Evidence-Informed; Promising; Supported; and Well Supported. Programs that fall within specific program parameters are considered “evidence-informed” or “evidence-based” practices (EBP), as opposed to programs that have not been evaluated against any set criteria. ACF monitors progress on the percentage of program funds directed toward evidence-based and evidence-informed practices.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • ACF’s Office of Child Care drew on research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of dollars to improve the quality of programs, and more to inform regulations related to Child Care and Development Block Grant reauthorization.
  • ACF’s Welfare Research work has produced findings from numerous randomized controlled trials providing evidence on strategies that TANF agencies can use such as subsidized employment and job search strategies.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • The 2019 Program Instruction for the Community-Based Child Abuse Prevention (CBCAP) grant program states that CBCAP funds made available to states must be used for the financing, planning, community mobilization, collaboration, assessment, information and referral, startup, training and technical assistance, information management and reporting, and reporting and evaluation costs for establishing, operating, or expanding community-based and prevention-focused programs and activities designed to strengthen and support families and prevent child abuse and neglect, among other things.
  • The Child Care and Development Block Grant Act of 2014 requires states to spend not less than 7, 8, and 9 percent of their CCDF awards (“quality funds”) (for years 1-2, 3-4, and 5+ after 2014 CCDBG enactment, respectively – see 128 STAT. 1987) on activities to improve the quality of child care services provided in the state, including:
    • 1B: Supporting the training and professional development of the child care workforce through…incorporating the effective use of data to guide program improvement (see 128 STAT 1988)
    • 3: Developing, implementing, or enhancing a quality rating system for child care providers and services, which may support and assess the quality of child care providers in the State (A) and be designed to improve the quality of different types of child care providers (C) (see 128 STAT 1988)
    • 7: Evaluating and assessing the quality and effectiveness of child care programs and services offered in the State, including evaluating how such programs positively impact children (see 128 STAT 1990)
  • ACF requires all CCDF lead agencies to annually report on how their CCDF quality funds were expended, including the activities funded and the measures used by states and territories to evaluate progress in improving the quality of child care programs and services. ACF released a Program Instruction for state and territorial lead agencies to provide guidance on reporting the authorized activities for the use of quality funds.
Score
3
Administration for Community Living (HHS)
9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its 5 largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
9.3 Did the agency use its 5 largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • FY12 Congressional appropriations included an evidence-based requirement for the first time: OAA Title III-D funding may be used only for programs and activities demonstrated to be evidence-based. Consistent with the Administrator’s focus on identifying new ways to efficiently improve direct service programs, ACL is using its 1% Nutrition authority to fund $3.5 million for nutrition innovations and to test ways to modernize how meals are provided to a changing senior population. One promising demonstration currently being carried out by the Georgia State University Research Foundation (entitled Double Blind Randomized Control Trial on the Effect of Evidence-Based Suicide Intervention Training on the Home-Delivered and Congregate Nutrition Program through the Atlanta Regional Commission) has drawn widespread attention; it trains volunteers who deliver home-delivered meals to recognize and report indicators of suicidal intent and other mental health issues so that they can be addressed.
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
  • The Older Americans Act state plans require grantees to provide information about past performance, including “information on the extent to which the area agency on aging met the objectives” related to “providing services to older individuals with greatest economic need, older individuals with greatest social need, and older individuals at risk for institutional placement.”
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Since 2017, ACL has awarded Innovations in Nutrition grants to 11 organizations to develop and expand evidence-based approaches to enhance the quality and effectiveness of nutrition programming. ACL is currently overseeing five grantees for innovative projects that will enhance the quality, effectiveness, and outcomes of nutrition services programs provided by the national aging services network. The grants total $1,197,205 for this year with a two-year project period. Through this grant program, ACL aims to identify innovative and promising practices that can be scaled across the country and to increase the use of evidence-informed practices within nutrition programs.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • All funding opportunity announcements published by ACL include language about generating and reporting evidence of progress toward the specific goals set for the funds. Grantee manuals include information about the importance of and requirements for evaluation (see the Administration on Aging: Title VI Resource Manual). The National Ombudsman Resource Center, funded by ACL, provides self-evaluation materials for Long-Term Care Ombudsman Programs (LTCOP) funded under Title VII of the Older Americans Act.
Score
N/A
U.S. Agency for International Development
  • USAID does not administer non-competitive grant programs (score for Criterion 8 applied here).
Score
6
Corporation for National and Community Service
9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • CNCS operates one formula grant program in FY19, the AmeriCorps State formula grants program ($174,250,005; eligible grantees: states). CNCS also operates four direct grant programs in FY19: 1) AmeriCorps National Civilian Community Corps (NCCC) ($32 million; eligible grantees: nonprofit organizations); 2) AmeriCorps VISTA ($92 million; eligible grantees: nonprofit organizations; state, tribal, and local governments; institutions of higher education); 3) Senior Corps Foster Grandparents ($110 million; eligible grantees: nonprofit organizations, local governments); and 4) Senior Corps Senior Companion Program ($46 million; eligible grantees: nonprofit organizations, local governments).
9.2 Did the agency use evidence of effectiveness to allocate funds in its 5 largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • In FY18, Senior Corps Foster Grandparents and Senior Companion Program embedded evidence into their grant renewal processes by offering supplemental funding, “augmentation grants,” to grantees interested in deploying volunteers to serve in evidence-based programs. More than $3.3 million of Senior Corps program dollars were allocated, over three years, toward new evidence-based programming augmentations. Grantees will be operating with their augmentations through fiscal year 2021.
  • In a survey completed in FY 2019, Senior Corps grantees reported that more than 390 Senior Corps grants (about 38% of total grants), and more than 21,000 Senior Corps volunteers (about 11% of all Senior Corps volunteers) were engaged in evidence-based programming.
9.3 Did the agency use its 5 largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • In FY19, Senior Corps completed an evaluation with an independent firm to produce case studies and comparative analyses of select grantees that received an evidence-based programming augmentation, in order to understand successes, challenges, and other issues. CNCS anticipates that this management report will inform Senior Corps’ approach to replicating this augmentation initiative, as well as the training/technical assistance needs of grantees. Senior Corps and the Administration for Community Living have also initiated a dialogue about how to build and broaden the evidence base for various programs designed for older adults. CNCS relies on ACL’s list of evidence-based programs for its augmentation grants and is exploring how to collect the same information from its grantees and how to share information about other evidence-based programs Senior Corps grantees may be using.
  • AmeriCorps NCCC is investing in a Service Project Database with the aim of creating a qualitative database of all NCCC projects completed since 2012. The database will thematically organize projects, classify project frameworks, and categorize the outcomes of these service initiatives. NCCC is investing in an evaluation of NCCC’s impact. This research project was initiated in FY18 and is focused on evaluating member retention, studying how NCCC develops leadership skills in its members and teams, and the program’s ability to strengthen communities. Finally, NCCC will continue to invest in research grants to better understand the outcomes of its disaster response efforts.
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
  • CNCS only administers five non-competitive grant programs, as described above.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Senior Corps and the Office of Research and Evaluation completed a longitudinal evaluation of the Foster Grandparents and Senior Companion Programs in FY19 that demonstrated the positive health outcomes associated with volunteering. A 50-year retrospective review of the research conducted on Senior Corps programs will be completed by the end of FY19.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • No examples available.
Score
7
U.S. Department of Education
9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its 5 largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • ED worked with Congress in FY16 to ensure that evidence played a major role in ED’s large non-competitive grant programs in the reauthorized ESEA. As a result, section 1003 of ESSA requires states to set aside at least 7% of their Title I, Part A funds for a range of activities to help school districts improve low-performing schools. School districts and individual schools are required to create action plans that include “evidence-based” interventions that demonstrate strong, moderate, or promising levels of evidence.
9.3 Did the agency use its 5 largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • ESEA requires a National Assessment of Title I (Improving the Academic Achievement of the Disadvantaged). In addition, Title I grants require state education agencies to report on school performance, including for schools identified for comprehensive or targeted support and improvement.
  • Federal law (ESEA) requires states receiving funds from 21st Century Community Learning Centers to “evaluate the effectiveness of programs and activities” that are carried out with federal funds (section 4203(a)(14)), and it requires local recipients of those funds to conduct periodic evaluations in conjunction with the state evaluation (section 4205(b)).
  • The Office of Special Education Programs (OSEP), the implementing office for IDEA grants to states, has revised its accountability system to shift the balance from a system focused primarily on compliance to one that puts more emphasis on results through the use of Results Driven Accountability.
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
  • Section 4108 of ESEA authorizes school districts to invest “safe and healthy students” funds in Pay for Success initiatives. Section 1424 of ESEA authorizes school districts to invest their Title I, Part D funds (Prevention and Intervention Programs for Children and Youth Who are Neglected, Delinquent, or At-Risk) in Pay for Success initiatives; under section 1415 of the same program, a state agency may use funds for Pay for Success initiatives.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • States and school districts are beginning to implement the requirements in Title I of the ESEA regarding using evidence-based interventions in school improvement plans. Some States are providing training or practice guides to help schools and districts identify evidence-based practices.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • In 2016, ED released non-regulatory guidance to provide state educational agencies, local educational agencies (LEAs), schools, educators, and partner organizations with information to assist them in selecting and using “evidence-based” activities, strategies, and interventions, as defined by ESSA, including carrying out evaluations to “examine and reflect” on how interventions are working. However, the guidance does not specify that federal non-competitive funds can be used to conduct such evaluations.
Score
3
U.S. Dept. of Housing & Urban Development
9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its 5 largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • Although the funding formulas are prescribed in statute, evidence-based interventions are central to each program. HUD used evidence from a 2015 Administrative Fee study of the costs that high-performing public housing agencies (PHAs) incur in administering a Housing Choice Voucher (HCV) program to propose a new FY17 approach for funding Administrative Fees while strengthening PHA incentives to improve HCV outcomes by providing tenant mobility counseling.
  • HUD’s funding of public housing is being radically shifted through the evidence-based Rental Assistance Demonstration (RAD), which enables accessing private capital to address the $26 billion backlog of capital needs funding. Based on demonstrated success of RAD, for FY19 HUD proposed removing the cap on the number of public housing developments to be converted to Section 8 contracts. HUD is also conducting a Rent Reform demonstration and a Moving To Work (MTW) demonstration to test efficiencies of changing rent rules and effects on tenant outcomes.
9.3 Did the agency use its 5 largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • Evidence-building is central to HUD’s funding approach through the use of prospective program demonstrations. These include the Public Housing Operating Fund’s Rental Assistance Demonstration (RAD), the Public Housing Capital Grants’ Rent Reform demonstration, and the Housing Choice Voucher program’s Moving To Work (MTW) demonstration grants. As Congress moved to expand MTW flexibilities to additional public housing authorities (PHAs), HUD sought authority to randomly assign cohorts of PHAs so that it could rigorously test specific program innovations.
  • Program funds are provided to operate demonstrations through the HCV account, Tenant-Based Rental Assistance. These include the Tribal HUD-VA Supportive Housing (Tribal HUD-VASH) demonstration of providing permanent supportive housing to Native American veterans and the FSS-Family Unification Program demonstration that tests the effect of providing vouchers to at-risk young adults who are aging out of foster care.
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
  • No examples available.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • To address a severe backlog of capital needs funding for the nation’s public housing stock, the Rental Assistance Demonstration was authorized in 2011 to convert the properties to project-based Section 8 contracts to attract an infusion of private capital. The 2016 interim report on the RAD evaluation showed that conversions successfully obtained $2.2 billion of private funding, representing a 9:1 leverage ratio. Based on the successes, the limit on the number of public housing conversions was increased to 455,000 units in 2018, nearly half of the stock, and HUD proposed to eliminate the cap in FY19. Additionally, HUD extended the conversion opportunity to legacy multifamily programs through RAD 2.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Communities receiving HUD block grant funding through Community Development Block Grants, HOME block grants, and other programs are required to consult local stakeholders, conduct housing needs assessments, and develop needs-driven Consolidated Plans to guide their activities. They then provide Consolidated Annual Performance and Evaluation Reports (CAPERs) to document progress toward their Consolidated Plan goals in a way that supports continued community involvement in evaluating program efforts.
  • HUD’s Community Development Block Grant program, which provides formula grants to entitlement jurisdictions, increases local evaluation capacity. Specifically, federal regulations (24 CFR 570.200) authorize CDBG recipients (including city and state governments) to use up to 20% of their CDBG allocations for administration and planning costs, which may include evaluation capacity-building efforts and evaluations of their CDBG-funded interventions (as defined in 24 CFR 570.205 and 570.206).
Score
7
U.S. Department of Labor
9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY19, the 5 largest non-competitive grant programs at DOL are in the Employment and Training Administration: Adult Employment and Training Activities ($845,000,000; eligible grantees: city, county, and/or state governments), Youth Activities ($903,416,000; eligible grantees: city, county, and/or state governments), Dislocated Worker Employment and Training activities ($1,040,860,000; eligible grantees: city, county, and/or state governments), UI State Administration ($2,137,945,000; eligible grantees: city, county, and/or state governments), Employment Security grants to States ($663,052,000; eligible grantees: city, county, and/or state governments).
9.2 Did the agency use evidence of effectiveness to allocate funds in its 5 largest non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • A signature feature of the Workforce Innovation and Opportunity Act (WIOA) (Pub. L. 113-128) is its focus on the use of data and evidence to improve services and outcomes, particularly in provisions related to states’ role in conducting evaluations and research, as well as in requirements regarding data collection, performance standards, and state planning. Conducting evaluations is a required statewide activity, and there are additional requirements regarding coordination (with other state agencies and with federal evaluations under WIOA), dissemination, and the provision of data and other information for federal evaluations.
9.3 Did the agency use its 5 largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • Section 116(e) of WIOA describes how the state, in coordination with local workforce boards and state agencies that administer the programs, shall conduct ongoing evaluations of activities carried out in the state under these state programs. These evaluations are intended to promote, establish, implement, and utilize methods for continuously improving core program activities in order to achieve high-level programs within, and high-level outcomes from, the workforce development system.
  • Additionally, WIOA’s evidence and performance provisions: (1) increased the amount of WIOA funds states can set aside and distribute directly from 5-10% to 15% and authorized them to invest these funds in Pay for Performance initiatives; (2) authorized states to invest their own workforce development funds, as well as non-federal resources, in Pay for Performance initiatives; (3) authorized local workforce investment boards to invest up to 10% of their WIOA funds in Pay for Performance initiatives; and (4) authorized states and local workforce investment boards to award Pay for Performance contracts to intermediaries, community based organizations, and community colleges.
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
  • Reemployment Services and Eligibility Assessments (RESEA) funds must be used for interventions or service delivery strategies demonstrated to reduce the average number of weeks of unemployment insurance a participant receives by improving employment outcomes. The law provides for a phased implementation of the new program requirements over several years. In FY19, DOL awarded $130 million to states to conduct RESEA programs that met these evidence-of-effectiveness requirements. Beginning in FY23, states must also use no less than 25 percent of RESEA grant funds for interventions with a high or moderate causal evidence rating that show a demonstrated capacity to improve outcomes for participants; this percentage increases in subsequent years until after FY26, when states must use no less than 50 percent of such grant funds for such interventions.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Institutional Analysis of American Job Centers: the goal of the evaluation was to understand and systematically document the institutional characteristics of American Job Centers (AJCs), and to identify variations in service delivery, organization structure, and administration across AJCs.
  • Career Pathways Descriptive and Analytical Study: WIOA requires DOL to “conduct a multistate study to develop, implement, and build upon career advancement models and practices for low-wage healthcare providers or providers of early education and child care.” In response, DOL conducted the Career Pathways Design Study to develop evaluation design options that could address critical gaps in knowledge related to the approach, implementation, and success of career pathways strategies generally, and in early care and education specifically. The Chief Evaluation Office (CEO) has recently begun the second iteration of this study, building on the evaluation design work CEO completed in 2018 to build evidence about the implementation and effectiveness of career pathways approaches and to meet the WIOA statutory requirement to conduct a career pathways study. It will include a meta-analysis of existing impact evaluation results and will examine how workers advance through multiple, progressively higher levels of education, training, and associated jobs within a pathway over time, as well as the factors associated with their success.
  • Analysis of Employer Performance Measurement Approaches: the goal of the study was to examine the appropriateness, reliability, and validity of proposed measures of effectiveness in serving employers required under WIOA. It included knowledge development to understand and document the state of the field, an analysis and comparative assessment of measurement approaches and metrics, and the dissemination of findings through a report, as well as research and topical briefs.
9.6 Did the agency provide guidance which makes clear that city, county, and state governments, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • The Employment & Training Administration’s (ETA) RESEA grantees may use up to 10% of their grant funds for evaluations of their programs. ETA is releasing specific evaluation guidance later in FY19 to help states understand how to conduct, or cause to be conducted, evaluations of their RESEA interventions with these grant funds. The goal of the agency guidance, along with the evaluation technical assistance being provided to states and their partners, is to build states’ capacity to understand, use, and build evidence.
  • Section 116 of WIOA establishes performance accountability indicators and performance reporting requirements to assess the effectiveness of states and local areas in achieving positive outcomes for individuals served by the workforce development system’s core programs. Section 116(e) of WIOA requires states to “employ the most rigorous analytical and statistical methods that are reasonably feasible, such as the use of control groups” and requires that states evaluate the effectiveness of their WIOA programs in an annual progress report, which includes updates on (1) current or planned evaluation and related research projects, including methodologies used; (2) efforts to coordinate the development of evaluation and research projects with WIOA core programs, other state agencies, and local boards; (3) a list of completed evaluation and related reports with publicly accessible links to such reports; (4) efforts to provide data, survey responses, and timely visits for Federal evaluations; and (5) any continuous improvement strategies utilizing results from studies and evidence-based practices evaluated. States are permitted to use WIOA grant funds to perform the performance monitoring and evaluations necessary to complete this report.
Score
N/A
Millennium Challenge Corporation
  • MCC does not administer non-competitive grant programs (score for criteria #8 applied).
Score
7
Substance Abuse and Mental Health Services Administration
9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in largest 5 non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • In FY19, Congress maintained the 10 percent set-aside for evidence-based programs in SAMHSA’s Mental Health Block Grant (MHBG) to address the needs of individuals with early serious mental illness, including psychotic disorders, regardless of the age of the individual at onset (see p. 48 of the FY18-FY19 Block Grant application). In the FY20 budget request (p. 3), SAMHSA expressed its desire to continue the set-aside.
  • The FY18-FY19 Block Grant application requires states seeking Mental Health Block Grant (MHBG) and Substance Abuse Prevention and Treatment Block Grant (SABG) funds to identify specific priorities. For each priority, states must identify the relevant goals, measurable objectives, and at least one performance indicator for each objective, which must include strategies to deliver evidence-based individualized treatment plans (p. 21); evidence-based interventions for substance use or dependence (p. 21); building provider capacity to deliver evidence-based, trauma-specific interventions (p. 22); evidence-based programs, policies, and practices in prevention efforts (p. 22); and evidence-based models to prevent substance misuse (p. 23).
9.3 Did the agency use its 5 largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • The FY18-FY19 Block Grant application requires states applying for Substance Abuse Prevention and Treatment funds to create an evaluation plan, which must include at least five specified evaluation elements. Additionally, the application specifies that SAMHSA will work with the National Institute of Mental Health (NIMH) to plan for program evaluation and data collection related to demonstrating program effectiveness of the Mental Health Block Grant.
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
  • No examples available.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • No examples available.
9.6 Did the agency provide guidance which makes clear that city, county, and state governments, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • The FY18-FY19 Block Grant Application clarified that “Section 1921 of the PHS [Public Health Services] Act (42 U.S.C. § 300x-21) authorizes the States to obligate and expend SABG [Substance Abuse Prevention and Treatment Block Grant] funds to plan, carry out and evaluate activities and services designed to prevent and treat substance use disorders” (p. 16). The Application further clarifies that states “may utilize SABG funds to train personnel to conduct fidelity assessments of evidence-based practices” (p. 35).