2019 Federal Standard of Excellence


Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY19? (Example: Performance stat systems, frequent outcomes-focused data-informed meetings)

Score
6
Administration for Children and Families (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • ACF was an active participant in developing the FY 2018-2022 HHS Strategic Plan, which includes several ACF-specific objectives. ACF regularly reports on progress toward those objectives in the FY 2019 HHS Annual Performance Plan/Report, including the eight performance measures that support the objectives under Goal Three, “Strengthen the Economic and Social Well-Being of Americans Across the Lifespan.” ACF supports Objective 3.1 (Encourage self-sufficiency and personal responsibility, and eliminate barriers to economic opportunity), Objective 3.2 (Safeguard the public against preventable injuries and violence or their results), and Objective 3.3 (Support strong families and healthy marriage, and prepare children and youth for healthy, productive lives) by reporting annual performance measures. In total, ACF reports on nine performance measures across the FY 2018-2022 HHS Strategic Plan, and it participates in the annual HHS Strategic Review, which assesses progress on those nine measures.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • Individual ACF programs regularly analyze and use performance, administrative, and evaluation data to improve performance. Two systems worth noting are the Participant Accomplishment and Grant Evaluation System (PAGES) for Health Profession Opportunity Grant (HPOG) grantees and the Information, Family Outcomes, Reporting, and Management (nForm) system for Healthy Marriage and Responsible Fatherhood grantees. Both are web-based management information systems used to track grantee progress for program management and to record grantee and participant data for research and evaluation purposes.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
Score
5
Administration for Community Living (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • ACL’s strategy focuses on five pillars: supporting families and caregivers, protecting rights and preventing abuse, connecting people to resources, expanding employment opportunities, and strengthening the aging and disability networks. These pillars provide structure and focus for ACL’s work. ACL’s outcome measures, including measures of program efficiency, are available by program in its annual Congressional Budget Justification. As part of the U.S. Department of Health and Human Services Annual Performance Plan and Report, ACL reports on two Agency Priority Goals: (1) increase the success rate of the Protection and Advocacy Program’s individual and systemic advocacy, thereby advancing the right of individuals with developmental disabilities to receive appropriate community-based services, achieve community integration and independence, and have other rights enforced, retained, restored, and/or expanded; and (2) improve the dementia capability of long-term support systems to create dementia-friendly, livable communities (ACL is the lead agency).
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • ACL employs a program performance management strategy with multiple components, including coordination and collaboration with other agencies and organizations, enhanced partnerships between aging and disability networks, and senior leadership involvement in performance management. ACL has several recent and ongoing evaluation studies that examine the benefits of its programs in terms of health care cost savings (see the Report to Congress: The Centers for Medicare & Medicaid Services’ Evaluation of Community-Based Wellness and Prevention Programs Under Section 4202(b) of the Affordable Care Act, which focused on several of ACL’s programs, and the Evaluation of the Effect of the Older Americans Act Title III-C Nutrition Services Program on Participants’ Health Care Utilization).
  • ACL’s regional staff conduct annual reviews with the state units on aging to examine the states’ work under their state plans on aging, which the states develop under the Older Americans Act. Although the review forms are for internal use only, states are asked to document progress toward their approved goals, identify the performance indicators they use to measure that progress, and report on changes in program performance, targeting of priority populations, and program innovations for which they have received honors or recognition. The reviews also check how, and whether, states verify the quality of their performance data. ACL uses this information to target technical assistance aimed at improving program operations, results, and return on investment. When making decisions about continued grant funding, NIDILRR uses a risk scale to determine whether additional funding would be a good use of funds. NIDILRR’s long-range plan also commits, as part of its new employment research agenda, to continued development of return-on-investment models that Vocational Rehabilitation agencies can use to optimize the services they provide.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • As part of ACL’s performance strategy, OPE staff give annual presentations on ACL performance to ACL leadership and provide information to internal and external stakeholders about agency performance trends. OPE staff also hold annual meetings with ACL staff to report performance measure data and results, including discussing methods for incorporating performance and evaluation findings into funding and operational decision-making.
Score
9
U.S. Agency for International Development
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • Many of USAID’s innovation or co-created programs, and those done in partnership, reflect a data-driven “pay for results” model, in which milestones are agreed upon by all parties and payments are made when milestones are achieved. For some programs, this means that if a milestone is unmet, funds may be reapplied to an innovation or intervention that is achieving results. This rapid, iterative performance model lets USAID understand more quickly what isn’t working and move resources away from it and toward what is working.
  • Approaches such as prizes, Grand Challenges, and ventures can also be constructed as “pay for results only,” using instruments such as Development Impact Bonds so that USAID pays only for outcomes, not for inputs or attempts. The Agency believes this model will pave the way for much of USAID’s work to be aligned with a “pay for results” approach. USAID is also piloting the use of the impact per dollar of cash transfers as a minimum standard of cost-effectiveness for applicable program designs.
  • Additionally, USAID Missions develop Country Development Cooperation Strategies (CDCSs) with clear goals and objectives, along with a Performance Management Plan (PMP) that identifies expected results, performance indicators to measure those results, plans for data collection and analysis, and regular reviews of performance measures so that data and evidence can be used to adapt programs for improved outcomes. USAID also promotes operations performance management to ensure that the Agency achieves its development objectives and aligns resources with priorities. USAID uses its Management Operations Council to conduct an annual Strategic Review of progress toward achieving the strategic objectives in the Joint Strategic Plan (JSP).
  • To improve linkages and break down silos, USAID continues to develop the Development Information Solution (DIS)—an enterprise-wide management information system that will enable USAID to collect, manage, and visualize performance data across units, along with budget and procurement information, to more efficiently manage and execute programming.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • USAID’s Program Cycle policy (ADS 201.3.2.18) requires that Missions conduct at least one portfolio review per year that focuses on progress toward strategy-level results. Missions must also conduct a CDCS mid-course stocktaking at least once during the course of implementing their Country Development Cooperation Strategy, which typically spans five years.
  • USAID developed an approach to explicitly ensure adaptation through learning, called Collaborating, Learning, and Adapting (CLA). It is incorporated into USAID’s Program Cycle guidance (ADS 201.3.5.19), which states: “Strategic collaboration, continuous learning, and adaptive management link together all components of the Program Cycle.” Through CLA, USAID ensures its programming is coordinated with others, grounded in a strong evidence base, and iteratively adapted to remain relevant throughout implementation.
  • In addition to this focus within its programming, USAID has two senior bodies that oversee Enterprise Risk Management and meet regularly to improve the accountability and effectiveness of USAID programs and operations through holistic risk management. USAID tracks progress toward strategic goals and annual performance goals during data-driven reviews at Management Operations Council meetings.
Score
3
Corporation for National and Community Service
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • CNCS continued to implement its Transformation and Sustainability Plan in FY19. The plan aims to ensure that CNCS maximizes its resources to achieve results. The agency is developing more specific goals for the plan, though these remain internal for now. In addition, the agency will conduct a process evaluation/rapid-cycle assessment for each phase of its transition to a new portfolio-manager grant management model, with the goal of using lessons learned from Phase 1 implementation to inform Phase 2 and beyond. Data collection will begin in the first quarter of FY20.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • CNCS started the fiscal year with a series of internal budget formulation meetings that asked each office to identify in its budget proposal how evidence-based and evidence-building activities would be prioritized. All program offices are using data and evidence to improve return on investment. For example:
    • AmeriCorps VISTA used poverty mapping data to inform resource allocation decisions in FY19. VISTA also launched a dashboard that puts project and member data at staff’s fingertips, helping them identify best practices and troubleshoot problems in a timely fashion.
    • AmeriCorps NCCC is creating a qualitative database of all NCCC projects completed since 2012. The database will thematically organize projects, classify project frameworks, and categorize the outcomes of these service initiatives. Moving forward, this data will be used to invest more strategically in projects with the best results, refining project development with community sponsors based on which project types show the strongest uptake and outcomes.
    • In FY19, Senior Corps contracted with a research firm to comprehensively assess the quality of program administrative data and its potential uses for performance management.
    • CNCS analyzed how AmeriCorps alumni use their education awards to further their education and employment opportunities (a key goal of the AmeriCorps program). The resulting report highlights a number of opportunities to encourage greater use of education awards; a minimal sketch of this kind of usage-trend analysis follows this list.
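The sketch below illustrates, in Python, the kind of trend analysis described in the last bullet: grouping education-award redemptions by cohort and use type to see how alumni apply their awards. The column names and figures are invented for illustration; they are not CNCS's actual schema or data.

```python
# Hypothetical sketch of an education-award usage-trend analysis.
# Schema and values are illustrative, not CNCS's actual data.
import pandas as pd

# Illustrative records: one row per award redemption by an AmeriCorps alum.
awards = pd.DataFrame({
    "cohort_year": [2015, 2015, 2016, 2016, 2017, 2017, 2017],
    "use_type": ["tuition", "loan_repayment", "tuition", "tuition",
                 "loan_repayment", "tuition", "unused"],
    "amount_used": [5920.0, 3100.0, 5815.0, 2400.0, 5920.0, 1200.0, 0.0],
})

# Dollar share of award value going to each use, by cohort -- the kind of
# year-over-year trend a report on award usage would surface.
usage_share = (
    awards.groupby(["cohort_year", "use_type"])["amount_used"].sum()
    .groupby(level=0)
    .transform(lambda s: s / s.sum())
)
print(usage_share)
```

Grouping by cohort makes shifts in usage visible over time, which is the signal such a report would act on when designing outreach to alumni.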
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • The Office of the Chief Financial Officer meets quarterly to assess progress toward the goals of its performance plan, which includes strategic objectives, strategies for achieving them, milestones, measures, and targets. The quarterly meetings are used to compare actuals against targets, identify promising practices used to achieve targets, and pinpoint areas to optimize the delivery of budget, procurement, grants, and financial management.
Score
8
U.S. Department of Education
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • ED’s FY18-22 Strategic Plan includes two parallel objectives, one for P-12 and one for higher education (Strategic Objectives 1.4 and 2.2, respectively), that focus on supporting agencies and educational institutions in identifying and using evidence-based strategies and practices.
  • The Department’s FY 2018 Annual Performance Report and FY 2020 Annual Performance Plan contain FY18 performance results, including the evidence metrics under Strategic Objective 5.3 of the previous Strategic Plan, for which established targets were mostly met.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • The newly formed Grants Policy Office in the Office of Planning, Evaluation and Policy Development (OPEPD) works with offices across the Department to ensure alignment with the Secretary’s priorities, including evidence-based practices. The Grants Policy Office looks at where the Department and the field can continuously improve by building stronger evidence, making decisions based on a clear understanding of the available evidence, and disseminating evidence to decision makers. Specific activities include: strengthening the connection between the Secretary’s policies and grant implementation from design through evaluation; supporting a culture of evidence-based practices; providing guidance to grant-making offices on how to integrate evidence into program design; and identifying opportunities where the Department and field can improve by building, understanding, and using evidence.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
Score
9
U.S. Dept. of Housing & Urban Development
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • HUD uses data and evidence extensively to improve outcomes and return on investment. The primary means are PD&R’s investments in data collection, program demonstrations, program evaluations, and research guided by a multi-year learning agenda; HUD’s extensive use of outcome-oriented performance metrics in the Annual Performance Plan; and senior staff oversight and monitoring of key outcomes and initiatives through the Prescription for HUD, the Advancing Economic Opportunity Task Force, and the Agency-Wide Integrity Task Force, which bring together senior staff for quarterly performance management meetings. In addition, the Standards for Success pilot introduces a new standardized data collection and reporting framework for discretionary grant programs.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
Score
9
U.S. Department of Labor
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • Using a performance stat reporting and dashboard system linked to component agencies’ annual operating plans, DOL’s Performance Management Center (PMC) coordinates quarterly reviews of each agency’s program performance by the Deputy Secretary to analyze progress and identify opportunities for performance improvements. Learning agendas, updated annually by DOL agencies in collaboration with DOL’s Chief Evaluation Office (CEO), include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The annual Strategic Reviews with leadership include specific discussions about improving performance and about findings from recent evaluations that suggest opportunities for improvement.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • DOL leverages a variety of continuous learning tools, including the learning agenda approach, which conceptualizes and advances substantive learning goals for the agency, and PMC’s Continuous Process Improvement (CPI) Program, which supports agencies in gaining operational efficiencies and improving performance. The CPI Program directs customized process improvement projects throughout the department and grows the cadre of CPI practitioners through Lean Six Sigma training.
Score
5
Millennium Challenge Corporation
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • MCC is committed to using high-quality data and evidence to drive its strategic planning and program decisions. The Monitoring and Evaluation plans for all programs, and tables of key performance indicators for all projects, are available online by compact and threshold program and by sector, for use by both partner countries and the general public. Prior to investment, MCC performs a cost-benefit analysis to assess the potential impact of each project and estimates an Economic Rate of Return (ERR). MCC uses a 10% ERR hurdle to prioritize and fund the projects with the greatest opportunity for maximizing impact (a schematic illustration of this hurdle-rate test follows these bullets). MCC then recalculates ERRs at investment closeout, drawing on MCC’s monitoring data among other evidence, to test original assumptions and assess the cost-effectiveness of MCC programs.
  • In addition, MCC produces periodic reports that capture the results of MCC’s learning efforts in specific sectors and translate that learning into actionable evidence for future programming. At the start of FY18, MCC published a Principles into Practice (PiP) report on its investments in roads; the report demonstrated MCC’s learning around the implementation and evaluation of its roads projects and critically assessed how MCC was changing its practice as a result. In FY19, MCC will publish a PiP report on its technical and vocational education training activities in the education sector. MCC is also midway through research on learning in the water, sanitation, and hygiene sector, with a PiP capturing the results expected in FY20.
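MCC documents its ERR methodology separately; purely as a hedged illustration of the 10% hurdle-rate test mentioned above, the Python sketch below solves for the discount rate at which a hypothetical project's net-benefit stream has zero net present value and compares it to the threshold. All cash flows and function names here are invented.

```python
# Minimal sketch of a hurdle-rate test: the economic rate of return (ERR) is
# the discount rate at which the net present value (NPV) of a project's
# net-benefit stream is zero. Figures are hypothetical.

def npv(rate: float, flows: list[float]) -> float:
    """Net present value of flows, where flows[t] occurs in year t."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

def err(flows: list[float], lo: float = 0.0, hi: float = 1.0) -> float:
    """Solve NPV(rate) = 0 by bisection; assumes one sign change in [lo, hi]."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if npv(lo, flows) * npv(mid, flows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Year-0 investment cost, then annual net benefits (hypothetical).
project = [-100.0, 25.0, 25.0, 25.0, 25.0, 25.0, 25.0]
rate = err(project)
HURDLE = 0.10  # MCC's 10% ERR threshold
print(f"ERR = {rate:.1%}; {'passes' if rate >= HURDLE else 'fails'} the 10% hurdle")
```

The same calculation rerun at closeout with observed rather than projected benefits is what the re-estimated ERRs described above amount to.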
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • In FY18, MCC implemented a new reporting system that enhanced MCC’s credibility around results, transparency, learning, and accountability. The Star Report and its associated quarterly business process capture key information to provide a framework for results and to improve MCC’s ability to promote and disseminate learning and evidence throughout the compact and threshold program lifecycle. For each compact and threshold program, evidence is collected on performance indicators, evaluation results, partnerships, sustainability efforts, and learning, among other elements. Critically, this information is consolidated in a single report after each program ends, published roughly seven months after completion. In FY19, MCC released its first Star Reports, covering its recently completed investments in Cabo Verde and Indonesia. Four more Star Reports, covering MCC’s programs in Honduras, Zambia, Georgia, and Malawi, are in progress and expected to be published in FY20.
  • MCC also supports the creation of multidisciplinary country teams to manage the development and implementation of each compact and threshold program. Teams meet frequently to gather evidence, discuss progress, make project design decisions, and solve problems. Before moving forward with a program investment, teams are encouraged to use lessons from completed evaluations to inform their work.
  • Continual learning and improvement are key aspects of MCC’s operating model. MCC monitors progress toward compact and threshold program results quarterly, using performance indicators specified in the Monitoring and Evaluation (M&E) Plan for each country’s investments. M&E Plans specify indicators at all levels (process, output, and outcome) so that progress toward final results can be tracked. Every quarter, each partner country submits an Indicator Tracking Table showing each indicator’s actual performance relative to the baseline established before the activity began and the performance targets established in the M&E Plan (a schematic of this comparison appears below). Key performance indicators and their accompanying data by country are updated every quarter and published online. MCC management and the relevant country team review these data in a formal Quarterly Performance Review meeting to assess whether results are being achieved and integrate this information into project management and implementation decisions.
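The exact Indicator Tracking Table format is defined in each country's M&E Plan; the snippet below is only a schematic of the comparison such a table supports, computing each indicator's progress from baseline toward its end-of-compact target. Indicator names and values are hypothetical.

```python
# Schematic of the baseline/actual/target comparison an Indicator Tracking
# Table supports. Indicator names and numbers are invented for illustration.
indicators = [
    # (indicator, baseline, latest quarterly actual, end-of-compact target)
    ("Farmers trained", 0, 4200, 10000),
    ("Road km rehabilitated", 0, 38, 120),
    ("Household water connections", 1500, 2600, 5000),
]

for name, baseline, actual, target in indicators:
    progress = (actual - baseline) / (target - baseline)  # share of target met
    print(f"{name}: {progress:.0%} of target achieved")
```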
Score
6
Substance Abuse and Mental Health Services Administration (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • The SAMHSA Strategic Plan FY2019-FY2023 outlines five priority areas with goals and measurable objectives to carry out SAMHSA’s vision and mission. For each priority area, an overarching goal and a series of measurable objectives are described, followed by examples of the key performance and outcome measures SAMHSA will use to track progress.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • The Centers have historically managed internal performance review boards to periodically review grantee performance and provide corrective actions as needed.
  • According to the FY2019-FY2023 Strategic Plan (pp. 21-22), SAMHSA will modernize the Performance Accountability and Reporting System by (1) capturing real-time data for discretionary grant programs in order to monitor their progress, impact, and effectiveness, and (2) developing benchmarks and disseminating annual Performance Evaluation Reports for all SAMHSA discretionary grant programs; one illustrative way such benchmarks could work is sketched below.
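The Strategic Plan does not spell out how those benchmarks will be computed. One plausible, purely illustrative approach is to benchmark each grantee against the portfolio median on a shared outcome measure, as in the sketch below; the grantee names, measure, and values are all invented.

```python
# Illustrative benchmark check: flag grantees whose outcome measure falls
# below the portfolio median. Names, measure, and values are hypothetical.
from statistics import median

# Hypothetical grantee-level values for a shared outcome measure,
# e.g., percent of clients completing treatment.
grantees = {"Grantee A": 62.0, "Grantee B": 71.5, "Grantee C": 55.0,
            "Grantee D": 80.2, "Grantee E": 67.8}

benchmark = median(grantees.values())
for name, value in grantees.items():
    status = "meets" if value >= benchmark else "below"
    print(f"{name}: {value:.1f} ({status} the {benchmark:.1f} portfolio benchmark)")
```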
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • In 2016, SAMHSA’s Office of Financial Resources (OFR) established a Program Integrity Review Team (PIRT) staffed by representatives from each of SAMHSA’s four Centers and managed by OFR. Each quarter, staff for three SAMHSA discretionary grant portfolios (one from each of the three program Centers) conduct a self-analysis of grantee performance based on objective performance data, financial performance, and other factors. Program staff present their self-assessments to the PIRT and receive feedback on, for example, targets of concern.