2019 Federal Standard of Excellence


Administration for Children and Families (HHS)

Score
7
Leadership

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY19?

1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The Deputy Assistant Secretary for Planning, Research, and Evaluation serves in a role equivalent to the Chief Evaluation Officer for the Administration for Children and Families (ACF). A Senior Executive Service career official, the Deputy Assistant Secretary oversees ACF’s Office of Planning, Research, and Evaluation (OPRE) and supports evaluation and other learning activities across the agency. ACF’s Deputy Assistant Secretary for Planning, Research, and Evaluation oversees a research and evaluation budget of approximately $200 million in FY19. OPRE has 64 federal staff positions; OPRE staff are experts in research and evaluation methods and data analysis as well as ACF programs, policies, and the populations they serve.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • In 2016, ACF established a new Division of Data and Improvement (DDI) providing federal leadership and resources to improve the quality, use, and sharing of ACF data. The Director of DDI reports to the Deputy Assistant Secretary for Planning, Research, and Evaluation and oversees work to improve the quality, usefulness, interoperability, and availability of data and to address issues related to privacy, data security, and data sharing. DDI has 9 federal staff positions and an FY19 budget of approximately $4.4 million (not including salaries).
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • With the 2016 reorganization that created the Division of Data and Improvement (DDI), ACF nested the following functions within the Office of Planning, Research, and Evaluation: strategic planning; performance measurement and management; research and evaluation; statistical policy and program analysis; synthesis and dissemination of research and evaluation findings; data quality, usefulness, and sharing; and application of emerging technologies to improve the effectiveness of programs and service delivery. This reorganization consolidated those functions and gave the Deputy Assistant Secretary for Planning, Research, and Evaluation oversight of evaluation, data, statistical, and related functions.
Score
7
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence-building plan), and did it publicly release the findings of all completed program evaluations in FY19?

2.1 Did the agency have an agency-wide evaluation policy? (Example: Evidence Act 313(d))
  • ACF’s evaluation policy confirms ACF’s commitment to conducting evaluations and using evidence from evaluations to inform policy and practice. ACF seeks to promote rigor, relevance, transparency, independence, and ethics in the conduct of evaluations. ACF established the policy in 2012 and published it in the Federal Register on August 29, 2014.
2.2 Did the agency have an agency-wide evaluation plan? (Example: Evidence Act 312(b))
  • OPRE annually identifies questions relevant to the programs and policies of ACF and proposes a research and evaluation spending plan to the Assistant Secretary for Children and Families. The plan covers activities that OPRE intends to conduct during the following fiscal year.
2.3 Did the agency have a learning agenda (evidence-building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including, but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda? (Example: Evidence Act 312)
2.4 Did the agency publicly release all completed program evaluations?
  • ACF’s evaluation policy requires that “ACF will release evaluation results regardless of findings…Evaluation reports will present comprehensive findings, including favorable, unfavorable, and null findings. ACF will release evaluation results timely – usually within two months of a report’s completion.” ACF has publicly released the findings of all completed evaluations to date. In 2018, OPRE released nearly 130 research publications. OPRE publications are publicly available on the OPRE website.
2.5 What is the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts? (Example: Evidence Act 315, subchapter II (c)(3)(9))
  • Coverage: ACF conducts research in areas where Congress has given authorization and appropriations. Programs for which ACF is able to conduct research and evaluation using dedicated funding include Temporary Assistance for Needy Families, Health Profession Opportunity Grants, Head Start, Child Care, Child Welfare, Home Visiting, Healthy Marriage and Responsible Fatherhood, Personal Responsibility Education Program, Sexual Risk Avoidance Education, Teen Pregnancy Prevention, Runaway and Homeless Youth, Family Violence Prevention Services, and Human Trafficking services. These programs represent approximately 85% of overall ACF spending.
  • Quality: ACF’s Evaluation Policy states that ACF is committed to using the most rigorous methods that are appropriate to the evaluation questions and feasible within budget and other constraints, and that rigor is necessary not only for impact evaluations, but also for implementation or process evaluations, descriptive studies, outcome evaluations, and formative evaluations; and in both qualitative and quantitative approaches.
  • Methods: ACF uses a range of evaluation methods. ACF conducts impact evaluations as well as implementation and process evaluations, cost analyses and cost-benefit analyses, descriptive and exploratory studies, research syntheses, and more. ACF is committed to learning about and using the most scientifically advanced approaches to determining the effectiveness and efficiency of ACF programs; to this end, OPRE annually organizes meetings of scientists and research experts to discuss critical topics in social science research methodology and how innovative methodologies can be applied to policy-relevant questions.
  • Effectiveness: ACF’s Evaluation Policy states that ACF will conduct relevant research and disseminate findings in ways that are accessible and useful to policymakers and practitioners. OPRE engages in ongoing collaboration with ACF program office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions such as ACF regulations and funding opportunity announcements. For example, when ACF’s Office of Head Start significantly revised its Program Performance Standards—the regulations that define the standards and minimum requirements for Head Start services—the revisions drew from decades of OPRE research and the recommendations of the OPRE-led Secretary’s Advisory Committee on Head Start Research and Evaluation. Similarly, ACF’s Office of Child Care drew from research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of funds dedicated to improving the quality of programs, and other information to inform the regulations accompanying the reauthorization of the Child Care and Development Block Grant.
  • Independence: ACF’s Evaluation Policy states that independence and objectivity are core principles of evaluation and that it is important to insulate evaluation functions from undue influence and from both the appearance and the reality of bias. To promote objectivity, ACF protects independence in the design, conduct, and analysis of evaluations. To this end, ACF conducts evaluations through the competitive award of grants and contracts to external experts who are free from conflicts of interest. In addition, the Deputy Assistant Secretary for Planning, Research, and Evaluation, a career civil servant, has authority to approve the design of evaluation projects and analysis plans and to approve, release, and disseminate evaluation reports.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
Score
6
Resources

Did the agency invest at least 1% of program funds in evaluations in FY19? (Examples: Impact studies; implementation studies; rapid-cycle evaluations; evaluation technical assistance; rigorous evaluations, including random assignment studies)

3.1. ____ (Name of agency) invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY19 budget.
  • The Administration for Children and Families invested approximately $200 million in evaluations, evaluation technical assistance, and evaluation capacity-building, representing approximately 0.3% of the agency’s approximately $59 billion FY19 budget.
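  • As a quick check of the arithmetic above (a back-of-the-envelope sketch using the approximate figures reported; exact budget totals may differ):

\[
\frac{\$200\ \text{million}}{\$59\ \text{billion}} = \frac{0.2}{59} \approx 0.0034 \approx 0.34\%,
\]

    which is consistent with the roughly 0.3% figure cited and falls below the 1% benchmark named in the criterion above.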
3.2 Did the agency have a budget for evaluation and how much was it? (Were there any changes in this budget from the previous fiscal year?)
  • In FY19, ACF’s Office of Planning, Research, and Evaluation has a budget of approximately $200 million, a $35 million increase from FY18.
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
Score
6
Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY19?
(Example: Performance stat systems, frequent outcomes-focused data-informed meetings)

4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • ACF was an active participant in the development of the FY 2018-2022 HHS Strategic Plan, which includes several ACF-specific objectives. ACF regularly reports on progress associated with those objectives as part of the FY 2019 HHS Annual Performance Plan/Report, including the eight performance measures that support Goal Three objectives to “Strengthen the Economic and Social Well-Being of Americans Across the Lifespan.” ACF supports Objective 3.1 (Encourage self-sufficiency and personal responsibility, and eliminate barriers to economic opportunity), Objective 3.2 (Safeguard the public against preventable injuries and violence or their results), and Objective 3.3 (Support strong families and healthy marriage, and prepare children and youth for healthy, productive lives) by reporting annual performance measures. In total, ACF reports on nine performance measures throughout the FY 2018-2022 HHS Strategic Plan. ACF is also an active participant in the HHS Strategic Review process, an annual assessment of progress on those nine performance measures.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • Individual ACF programs regularly analyze and use performance data, administrative data, and evaluation data to improve performance. Two performance management systems worth noting are the Participant Accomplishment and Grant Evaluation System (PAGES) management information system for Health Profession Opportunity Grant (HPOG) grantees and the Information, Family Outcomes, Reporting, and Management (nForm) management information system for Healthy Marriage and Responsible Fatherhood grantees. Both are web-based management information systems that are used to track grantee progress for program management and to record grantee and participant data for research and evaluation purposes.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
Score
5
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers’ programs in FY19? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies; data-use policies)

5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • ACF’s Interoperability Action Plan was established in 2017 to formalize ACF’s vision for effective and efficient data sharing. Under this plan ACF and its program offices will develop and implement a Data Sharing First (DSF) strategy that starts with the assumption that data sharing is in the public interest. The plan states that ACF will encourage and promote data sharing broadly, constrained only when required by law or when there are strong countervailing considerations.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • In 2018, ACF produced a Compendium of ACF Administrative and Survey Data Resources. It covers all major ACF person-level administrative data sets and surveys: 11 administrative data sources and eight surveys. Each entry includes the following information: data ownership and staff experts, basic content, major publications and websites, available data sets (public, restricted use, in-house), restrictions on data sharing, capacity to link with other data sets along with history of such linking, data quality, and resources to collect, prepare, and analyze the data. The compendium is currently available for internal use at HHS; a public version is forthcoming.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • ACF has multiple efforts underway to promote and support the use of documented data for research and improvement, including making numerous administrative and survey datasets publicly available for secondary use and actively promoting the archiving of research and evaluation data for secondary use. These data are machine readable, downloadable, and de-identified as appropriate for each data set. For example, individual-level data for research is held in secure restricted use formats, while public-use data sets are made available online. To make it easier to find these resources, ACF plans to release a Compendium of ACF Administrative and Survey Data and to consolidate information on archived research and evaluation data on the OPRE website.
  • OPRE actively promotes archiving of research and evaluation data for secondary use. OPRE research contracts include a standard clause requiring contractors to make data and analyses supported through federal funds available to other researchers and to establish procedures and parameters for all aspects of data and information collection necessary to support archiving information and data collected under the contract. Many datasets from past OPRE projects are stored in archives including the ACF-funded Child Care & Early Education Research Connections site and the ICPSR data archive. OPRE has funded grants for secondary analysis of ACF/OPRE data; examples in recent years include secondary analysis of strengthening families datasets and early care and education datasets.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • ACF developed a Confidentiality Toolkit that supports state and local efforts by explaining rules governing confidentiality in ACF and certain related programs, by providing examples of how confidentiality requirements can be addressed, and by including sample memoranda of understanding and data sharing agreements. ACF is currently updating the Toolkit to reflect recent changes in statute and to provide real-world examples of how data has been shared across domains—which frequently do not have harmonized privacy requirements—while complying with all relevant privacy and confidentiality requirements (e.g., FERPA, HIPAA). These case studies will also include downloadable, real-world tools that have been successfully used in the highlighted jurisdictions.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • ACF engages in several broad-based and cross-cutting efforts to support state, local, and tribal efforts to use human services data while protecting privacy and confidentiality. Through the Interoperability Initiative, ACF supports data sharing through developing standards and tools that are reusable across the country, addressing common privacy and security requirements to mitigate risks, and providing request-based technical assistance to states, local jurisdictions, and ACF program offices. Several ACF divisions have also been instrumental in supporting cross-governmental efforts, such as the National Information Exchange Model (NIEM) that will enable human services agencies to collaborate with health, education, justice, and many other constituencies that play a role in the well-being of children and families. ACF also undertakes many program-specific efforts to support state, local, and tribal efforts to use human services data while protecting privacy and confidentiality. For example, ACF’s TANF Data Innovation Project supports innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. This work includes encouraging and strengthening state integrated data systems, promoting proper payments and program integrity, and enabling data analytics for TANF program improvement.
Score
7
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY19? (Example: What Works Clearinghouses)

6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • ACF has established a common evidence framework adapted for the human services context from the framework for education research developed by the U.S. Department of Education and the National Science Foundation. The ACF framework, which includes the six types of studies delineated in the ED/NSF framework, aims to (1) inform ACF’s investments in research and evaluation and (2) clarify expectations for potential grantees and others about different types of studies.
6.2 Did the agency have a common evidence framework for funding decisions?
  • While ACF does not have a common evidence framework across all funding decisions, certain programs do use a common evidence framework for funding decisions. For example:
    • The Family First Prevention Services Act (FFPSA) enables states to use funds for certain evidence-based services. In April 2019, ACF published the Prevention Services Clearinghouse Handbook of Standards and Procedures, which provides a detailed description of the standards used to identify and review programs and services in order to rate programs and services as promising, supported, and well-supported practices.
    • The Personal Responsibility Education Program Competitive Grants were funded to replicate effective, evidence-based program models or substantially incorporate elements of projects that have been proven to delay sexual activity, increase condom or contraceptive use for sexually active youth, and/or reduce pregnancy among youth. Through a systematic evidence review, HHS selected 44 models that grantees could use, depending on the needs and age of the target population of each funded project.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
  • ACF sponsors several user-friendly tools that disseminate and promote evidence-based interventions. Several evidence reviews of human services interventions rate the quality of evaluation studies and present results in a user-friendly, searchable format. Reviews to date have covered teen pregnancy prevention; home visiting; marriage education and responsible fatherhood; and employment and training; these reviews include both ACF-sponsored and other studies. ACF is currently developing two new websites that will disseminate information on rigorously evaluated, evidence-based solutions:
    • The Pathways to Work Evidence Clearinghouse will be a user-friendly website (expected to launch in Spring 2020) that will report on “projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects.”
    • ACF’s Title IV-E Prevention Services Clearinghouse project launched a website in June 2019 that is easily accessible and searchable and allows users to find information about mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services designated as “promising,” “supported,” and “well-supported” practices by an independent systematic review.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • OPRE’s evaluation policy states that it is important for evaluators to disseminate research findings in ways that are accessible and useful to policymakers and practitioners, and that OPRE and program offices will work in partnership to inform potential applicants, program providers, administrators, policymakers, and funders by disseminating evidence from ACF-sponsored and other good-quality evaluations. OPRE has a robust dissemination function that includes the OPRE website, an OPRE e-newsletter, and a social media presence on Facebook and Twitter. OPRE also hosts two major biennial conferences, the Research and Evaluation Conference on Self-Sufficiency and the National Research Conference on Early Childhood, to share research findings with researchers as well as with program administrators and policymakers at all levels.
Score
6
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY19? (Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with rigorous evaluation requirements) 

7.1 Did the agency engage leadership and staff in its innovation efforts?
  • HHS has embarked on a process called ReImagine HHS, which has engaged leadership and staff from around the department to identify strategic shifts to transform how HHS operates. One part of this larger initiative, called Aim for Independence (AFI), is using a human-centered design approach to rethink how ACF does its work and how that work translates into long-lasting, positive outcomes for parents and children. Engagement activities have included a leadership retreat and opportunities for staff input.
  • ACF leadership has proposed new Opportunity and Economic Mobility Demonstrations to allow states to redesign safety net service delivery by streamlining funding from multiple public assistance and workforce development programs and providing services tailored to their populations’ specific needs. The demonstrations would be subject to rigorous evaluation.
7.2 Did the agency have policies that promote innovation?
7.3 Did the agency have processes, structures, or programs to stimulate innovation?
  • ACF projects that support innovation include:
    • ACF’s Behavioral Interventions to Advance Self-Sufficiency (BIAS) project was the first major effort to apply a behavioral economics lens to programs that serve poor families in the U.S. The project conducted 15 rapid-cycle randomized tests of behavioral interventions. The Behavioral Interventions to Advance Self-Sufficiency-Next Generation (BIAS-NG) project continues ACF’s exploration of the application of behavioral science to ACF programs and target populations. Additionally, the Behavioral Interventions Scholars (BIS) grant program supports dissertation research that applies a behavioral science lens to research questions relevant to social services programs and policies and other issues facing low-income families.
    • ACF’s Human Centered Design for Human Services project is exploring the application of human-centered design across ACF service delivery programs at the federal, state, and local levels.
    • Several ACF grant programs are innovation projects, demonstration projects, or allow waivers. See details in the response to 7.4 below.
7.4. Did the agency evaluate its innovation efforts, including using rigorous methods?
Score
7
Use of Evidence in 5 Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY19? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring for evidence; Pay for Success provisions)

8.1 What were the agency’s 5 largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
8.2 Did the agency use evidence of effectiveness to allocate funds in 5 largest competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • ACF evaluates Healthy Marriage and Responsible Fatherhood grant applicants based upon their proposed program performance plan; use of an evidence-based, evidence-informed, or skill-based curriculum; a letter of intent or memorandum of understanding (LOI/MOU) with a third-party local evaluator; demonstrated experience with comparable program evaluation; and a statement about the relevance of their research to the field, among other factors.
  • The Head Start Designation Renewal System (DRS) determines whether Head Start/Early Head Start grantees are delivering high-quality comprehensive services to the children and families that they serve. These determinations are based on seven conditions, one of which looks at how Head Start classrooms within programs perform on the Classroom Assessment Scoring System (CLASS), an observation-based measure of the quality of teacher-child interactions. When the DRS deems grantees to be underperforming, grantees are denied automatic renewal of their grant and must apply for funding renewal through a standard open competition process. In the most recent Head Start FOA language, grantees who are re-competing for Head Start funds must include a description of any violations, such as deficiencies, areas of non-compliance, and/or audit findings, in their record of Past Performance (p. 26). Applicants may describe the actions they have taken to address these violations. According to Head Start policy, in competitions to replace or potentially replace a current grantee, the responsible HHS official will give priority to applicants that have demonstrated capacity in providing effective, comprehensive, and well-coordinated early childhood education and development services and programs (see section 1304.20: Selection among applicants).
  • ACF evaluates Unaccompanied Children Services, Preschool Development Grant, and Runaway and Homeless Youth grant applicants based upon: their proposed program performance evaluation plan; how their data will contribute to continuous quality improvement; and their demonstrated experience with comparable program evaluation, among other factors.
8.3 Did the agency use its 5 largest competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations) 
  • ACF’s template (see p. 14 in Attachment C) for competitive grant announcements includes standard language that funding opportunity announcement drafters may select to require grantees to either 1) collect performance management data that contributes to continuous quality improvement and is tied to the project’s logic model, or 2) conduct a rigorous evaluation for which applicants must propose an appropriate design specifying research questions, measurement, and analysis.
  • ACF has an ongoing research portfolio that is building evidence in Head Start. Research sponsored through Head Start funding over the past decade has provided valuable information not only to guide program improvement in Head Start itself, but also to guide the field of early childhood programming and early childhood development. Dozens of Head Start programs have collaborated with researchers in making significant contributions in terms of program innovation and evaluation, as well as the use of systematic data collection, analysis and interpretation in program operations. As a condition of award, Head Start grantees are required to participate fully in ACF-sponsored evaluations, if selected to do so.
  • Unaccompanied Children Services (p. 33), Preschool Development Grants (p. 30), and Runaway and Homeless Youth (p.24) grantees are required to develop a program performance evaluation plan.
  • As a condition of award, Healthy Marriage and Responsible Fatherhood grantees are required to sponsor a third-party local evaluation (descriptive or impact) of their project site. Grantees are also required to participate fully in any ACF-sponsored federal evaluation, if selected to do so. As such, ACF has an ongoing research portfolio building evidence related to Strengthening Families, Healthy Marriage, and Responsible Fatherhood, and has conducted randomized controlled trials with grantees in each funding round of these grants.
  • The 2003 Reauthorization of the Runaway and Homeless Youth Act called for a study of long-term outcomes for youth who are served through the Transitional Living Program (TLP). In response, ACF is sponsoring a study that will capture data from youth at program entry and at intermediate- and longer-term follow-up points after program exit and will assess outcomes related to housing, education, and employment. ACF is also sponsoring a process evaluation of the 2016 Transitional Living Program Special Population Demonstration Project.
8.4 Did the agency use evidence of effectiveness to allocate funds in any competitive grant program?
  • ACF’s Personal Responsibility Education Program includes three individual discretionary grant programs that fund programs exhibiting evidence of effectiveness, innovative adaptations of evidence-based programs, and promising practices that teach youth about abstinence and contraception to prevent pregnancy and sexually transmitted infections.
  • To receive funding through ACF’s Sexual Risk Avoidance Education (SRAE) program, applicants must cite evidence published in a peer-reviewed journal and/or evidence from a randomized controlled trial or quasi-experimental design to support their chosen interventions or models.
8.5 What are the agency’s 1-2 strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?  
  • As mentioned above, ACF is conducting a multi-pronged evaluation of the Health Profession Opportunity Grants (HPOG) program. Findings from the first cohort of HPOG grants influenced the funding opportunity announcement for the second round of HPOG (HPOG 2.0) funding. For example, based on the finding that many participants engaged in short-term training for low-wage, entry-level jobs, the HPOG 2.0 FOA more carefully defined the career pathways framework, described specific strategies for helping participants progress along a career pathway, and identified and defined key HPOG education and training components. Applicants were required to describe more clearly how their program would support career pathways for participants. Based on an analysis indicating limited collaboration with healthcare employers, the HPOG 2.0 FOA required applicants to demonstrate the use of labor market information, consult with local employers, and describe their plans for employer engagement. And based on the finding that many programs were screening out applicants with low levels of basic literacy and numeracy skills, the HPOG 2.0 FOA placed more emphasis on providing basic skills education and assessing barriers, to make the programs accessible to the clients most prepared to benefit.
  • ACF’s Personal Responsibility Education Innovative Strategies Program (PREIS) grantees must conduct independent evaluations of their innovative strategies for the prevention of teen pregnancy, births, and STIs, supported by ACF training and technical assistance. These rigorous evaluations are designed to meet the HHS Teen Pregnancy Prevention Evidence-Based Standards and are expected to generate lessons learned so that others can benefit from these strategies and innovative approaches.
8.6 Did the agency provide guidance which makes clear that city, county, and state governments, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • ACF’s template (see p. 14 in Attachment C) for competitive grant announcements includes standard language instructing grantees to conduct evaluation efforts. Program offices may use this template to require grantees to collect performance data or conduct a rigorous evaluation. Applicants are instructed to include third-party evaluation contracts in their proposed budget justifications.
  • ACF’s 2018 Preschool Development Grants funding announcement notes that “It is intended that States or territories will use a percentage of the total amount of their [renewal] grant award during years 2 through 4 to conduct the proposed process, cost, and outcome evaluations, and to implement a data collection system that will allow them to collect, house, and use data on the populations served, the implementation of services, the cost of providing services, and coordination across service partners.”
  • ACF’s rules (section 1351.15) allow Runaway and Homeless Youth grant awards to be used for “data collection and analysis.”
Score
6
Use of Evidence in 5 Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY19? (Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in largest 5 non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • The Family First Prevention Services Act (FFPSA) (Division E, Title VII of the Bipartisan Budget Act of 2018), funded under the Foster Care budget, newly enables states to use federal funds available under parts B and E of Title IV of the Social Security Act to provide enhanced support to children and families and prevent foster care placements through the provision of evidence-based mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services. FFPSA requires an independent systematic review of evidence to designate programs and services as “promising,” “supported,” and “well-supported” practices. Only interventions designated as evidence-based will be eligible for federal funds.
9.3 Did the agency use its 5 largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • TANF Grant Program: ACF has a long-standing and ongoing research portfolio in service of building evidence for the TANF Grant Program. Since FY17, Congress has designated 0.33% of the TANF Block Grant for research, evaluation, and technical assistance related to the TANF Block Grant. ACF conducts research and evaluation projects in collaboration with TANF grantees, typically in areas where TANF grantees are facing challenges, innovating, or carrying out demonstration projects. This ongoing work includes building evidence around career pathways training programs, subsidized employment approaches, job search approaches, and employment coaching interventions. These are all program approaches used by state and county TANF grantees to meet their employment goals.
  • Child Care and Development Block Grant Program: The Child Care and Development Block Grant Act allows up to one-half of one percent of CCDBG funding for a fiscal year to be reserved to conduct research and evaluation of the CCDBG grant program. ACF manages this ongoing research portfolio to build evidence for the Child Care and Development Block Grant (CCDBG) program, conducting research and evaluation projects in collaboration with CCDBG grantees, typically in areas where grantees are facing challenges, innovating, or carrying out demonstration projects. Major projects in recent years include the National Survey of Early Care and Education; an assessment of evidence on ratings in Quality Rating and Improvement Systems (QRIS); and several research partnerships between CCDF lead agencies and researchers.
  • Child Welfare Grant Programs: ACF has an ongoing research portfolio on the Title IV-E foster care grant program and related grant programs. ACF conducts research and evaluation in collaboration with child welfare grantees, typically focusing on areas in which grantees are facing challenges, innovating, or conducting demonstrations. Examples include strategies for prevention of maltreatment, meeting service needs, and improving outcomes for children who come to the attention of child welfare. Major projects include the National Survey of Child and Adolescent Well-Being (NSCAW) and a Supporting Evidence Building in Child Welfare project to increase the number of evidence-supported interventions grantees can use to serve the child welfare population.
  • Child Support Enforcement Research and Evaluation Grant Program: Section 1115 of the Social Security Act provides unique authority for research and evaluation grants to child support enforcement grantees to “improve the financial well-being of children or otherwise improve the operation of the child support program.” ACF manages the child support enforcement research portfolio and administers a variety of research and evaluation components to understand more about cost and program effectiveness. Research and evaluation within the portfolio have consisted of 1) supporting large multi-state demonstrations that include random assignment evaluations (described in criteria question 7.4), 2) funding a supplement to the Census Bureau’s Current Population Survey, and 3) supporting research activities of other government programs and agencies by matching their research samples to the National Directory of New Hires (NDNH).
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
  • States applying for funding from ACF’s Community Based Child Abuse Prevention (CBCAP) grant program must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” The Children’s Bureau defines evidence-based and evidence-informed programs and practices along a continuum with four categories: Emerging and Evidence-Informed; Promising; Supported; and Well Supported. Programs determined to fall within specific program parameters will be considered to be “evidence informed” or “evidence-based” practices (EBP), as opposed to programs that have not been evaluated using any set criteria. ACF monitors progress on the percentage of program funds directed towards evidence-based and evidence-informed practices.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • ACF’s Office of Child Care drew on research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of dollars to improve the quality of programs, and more to inform regulations related to Child Care and Development Block Grant reauthorization.
  • ACF’s Welfare Research work has produced findings from numerous randomized controlled trials providing evidence on strategies that TANF agencies can use such as subsidized employment and job search strategies.
9.6 Did the agency provide guidance which makes clear that city, county, and state governments, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • The 2019 Program Instruction for the Community-Based Child Abuse Prevention (CBCAP) grant program states that CBCAP funds made available to states must be used for the financing, planning, community mobilization, collaboration, assessment, information and referral, startup, training and technical assistance, information management and reporting, and reporting and evaluation costs for establishing, operating, or expanding community-based and prevention-focused programs and activities designed to strengthen and support families and prevent child abuse and neglect, among other things.
  • The Child Care and Development Block Grant Act of 2014 requires states to spend not less than 7, 8, and 9 percent of their CCDF awards (“quality funds”) in years 1-2, 3-4, and 5 and later after CCDBG enactment, respectively (see 128 STAT. 1987), on activities to improve the quality of child care services provided in the state, including:
    •  1B: Supporting the training and professional development of the child care workforce through…incorporating the effective use of data to guide program improvement (see 128 STAT 1988)
    • 3: Developing, implementing, or enhancing a quality rating system for child care providers and services, which may support and assess the quality of child care providers in the State (A) and be designed to improve the quality of different types of child care providers (C) (see 128 STAT 1988)
    • 7: Evaluating and assessing the quality and effectiveness of child care programs and services offered in the State, including evaluating how such programs positively impact children (see 128 STAT 1990)
  • ACF requires all CCDF lead agencies to annually report on how their CCDF quality funds were expended, including the activities funded and the measures used by states and territories to evaluate progress in improving the quality of child care programs and services. ACF released a Program Instruction for state and territorial lead agencies to provide guidance on reporting the authorized activities for the use of quality funds.
Score
4
Repurpose for Results

In FY19, did the agency shift funds away from or within any practice, policy, or program that consistently failed to achieve desired outcomes? (Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; incentivizing or urging grant applicants to stop using ineffective practices in funding announcements; proposing the elimination of ineffective programs through annual budget requests; incentivizing well-designed trials to fill specific knowledge gaps; supporting low-performing grantees through mentoring, improvement plans, and other forms of assistance; using rigorous evaluation results to shift funds away from a program)

10.1 Did the agency shift funds/resources away from ineffective practices or interventions used within programs or by grantees?
  • Findings from the evaluation of the first round Health Profession Opportunity Grants (HPOG) program influenced the funding opportunity announcement for the second round of HPOG funding. Namely, the scoring criteria used to select HPOG 2.0 grantees incorporated knowledge gained about challenges experienced in the HPOG 1.0 grant program. For example, based on those challenges, applicants were asked to clearly demonstrate—and verify with local employers—an unmet need in their service area for the education and training activities proposed. Applicants were also required to provide projections for the number of individuals expected to begin and complete basic skills education. Grantees must submit semi-annual and annual progress reports to ACF to show their progress in meeting these projections. If they have trouble doing so, grantees are provided with technical assistance to support improvement or are put on a corrective action plan so that ACF can more closely monitor their steps toward improvement.
10.2 Did the agency shift funds/resources away from ineffective policies used within programs or by grantees? 
  • No examples available. 
10.3 Did the agency shift funds/resources away from ineffective grantees?
  • In FY12, ACF significantly expanded its accountability provisions with the establishment of the Head Start Designation Renewal System (DRS). The DRS was designed to determine whether Head Start and Early Head Start programs are providing high quality comprehensive services to the children and families in their communities. Where they are not, grantees are denied automatic renewal of their grant and must apply for funding renewal through an open competition process. Those determinations are based on seven conditions, one of which looks at how Head Start classrooms within programs perform on the Classroom Assessment Scoring System (CLASS), an observation-based measure of the quality of teacher-child interactions. Data from ACF’s Head Start Family and Child Experiences Survey (FACES) and Quality Features, Dosage, Thresholds and Child Outcomes (Q-DOT) study were used to craft the regulations that created the DRS and informed key decisions in its implementation. This included where to set minimum thresholds for average CLASS scores, the number of classrooms within programs to be sampled to ensure stable program-level estimates on CLASS, and the number of cycles of CLASS observations to conduct. At the time the DRS notification letters were sent out to grantees in 2011, there were 1,421 non-tribal active grants, and of these, 453 (32%) were required to re-compete (p. 19).
10.4 Did the agency shift funds/resources away from ineffective programs? (e.g., eliminations or legislative language in budget requests)
  • No examples available.
10.5 Did the agency shift funds/resources away from consistently ineffective products and services?
  • No examples available.