2019 Federal Standard of Excellence


Substance Abuse and Mental Health Services Administration

Score
6
Leadership

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY19?

1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer (or equivalent)? (Example: Evidence Act 313)
  • The director of the Substance Abuse and Mental Health Services Administration’s (SAMHSA) Center for Behavioral Health Statistics and Quality (CBHSQ) Office of Evaluation serves as the agency’s evaluation lead, with key evaluation staff housed in this division. SAMHSA evaluations are funded from program funds that also support service grants and technical assistance; evaluations have also been funded with funds recycled from grants or other contract activities.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s Chief Data Officer (or equivalent)? (Example: Evidence Act 202(e))
  • CBHSQ, led by its Director, designs and carries out special data collection and analytic projects to examine issues for SAMHSA and other federal agencies and is the government’s lead agency for behavioral health statistics, as designated by the Office of Management and Budget.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, and other related officials in order to inform policy decisions and evaluate the agency’s major programs?
  • Evaluation authority, staff, and resources are decentralized and found throughout the agency. SAMHSA is composed of four Centers: the Center for Mental Health Services (CMHS), the Center for Substance Abuse Treatment (CSAT), the Center for Substance Abuse Prevention (CSAP), and the Center for Behavioral Health Statistics and Quality (CBHSQ). CMHS, CSAT, and CSAP oversee grantee portfolios and evaluations of those portfolios. Evaluation decisions within SAMHSA are made within each Center according to its program priorities and resources. Each of the three program Centers uses its program funds to conduct evaluations of varying types. CBHSQ, SAMHSA’s research arm, provides varying levels of oversight and guidance to the Centers for evaluation activities. CBHSQ also provides technical assistance related to data collection and analysis to assist in the development of evaluation tools and clearance packages.
Score
6
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence-building plan), and did it publicly release the findings of all completed program evaluations in FY19?

2.1 Did the agency have an agency-wide evaluation policy? (Example: Evidence Act 313(d))
  • SAMHSA’s Evaluation Policy and Procedure (P&P) provides guidance across the agency regarding all program evaluations. Specifically, the Evaluation P&P describes the demand for rigor, compliance with ethical standards, and compliance with privacy requirements for all program evaluations conducted and funded by the agency.
2.2 Did the agency have an agency-wide evaluation plan? (Example: Evidence Act 312(b))
  • The Evaluation P&P serves as the agency’s formal evaluation plan. The Evaluation P&P sets the framework for planning, monitoring, and disseminating findings from significant evaluations. The Evaluation P&P requires Centers to identify research questions and appropriately match the type of evaluation to the maturity of the program.
2.3 Did the agency have a learning agenda (evidence-building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including, but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda? (Example: Evidence Act 312)
  • According to the Evaluation P&P (p. 1): “SAMHSA is actively working to develop a learning agenda to align its evaluation goals and activities with those of the Department of Health and Human Services (HHS).” As of September 2019, no public learning agenda is available on SAMHSA’s website. However, SAMHSA has posted a National Research Agenda on Homelessness.
2.4 Did the agency publicly release all completed program evaluations?
  • Results from significant evaluations are made available on SAMHSA’s evaluation website, a step SAMHSA took with its newly approved Evaluation P&P in the fall of 2017. As of September 2019, the evaluation website had one evaluation summary: a process evaluation of the Safe Schools/Healthy Students (SS/HS) State Program. No other evaluation reports or summaries are posted, including for any ongoing evaluation studies. However, a word search of SAMHSA’s publications for the term “evaluation” yielded 38 results, of which 10 are evaluation reports.
  • The following criteria are used to determine whether an evaluation is significant: (1) whether the evaluation was mandated by Congress; (2) whether there are high priority needs in states and communities; (3) whether the evaluation is for a new or congressionally-mandated program; (4) the extent to which the program is linked to key agency initiatives; (5) the level of funding; (6) the level of interest from internal and external stakeholders; and (7) the potential to inform practice, policy, and/or budgetary decision-making.
2.5 What is the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts? (Example: Evidence Act 315, subchapter II (c)(3)(9))
  • In 2017, SAMHSA formed a new workgroup, the Cross-Center Evaluation Review Board (CCERB). According to the Evaluation P&P (p. 2), the CCERB reviews and provides oversight of significant evaluation activities for SAMHSA, from contract planning to evaluation completion and at critical milestones. It comprises representatives from each of the Centers, with the Office of Tribal Affairs and Policy (OTAP) providing cultural competency consultation as necessary. CCERB staff provide support for program-specific and administration-wide evaluations.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
  • SAMHSA does not list any completed evaluation reports on its evaluation website. Of the 10 evaluation reports found on the publications page, none appear to use experimental methods. According to the Evaluation P&P (p. 5): “evaluations should be rigorously designed to the fullest extent possible and include ‘…inferences about cause and effect [that are] well founded (internal validity), […] clarity about the populations, settings, or circumstances to which results can be generalized (external validity); and requires the use of measures that accurately capture the intended information (measurement reliability and validity).’”
Score
1
Resources

Did the agency invest at least 1% of program funds in evaluations in FY19? (Examples: Impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance; rigorous evaluations, including random assignment studies)

3.1 ____ (Name of agency) invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY19 budget.
  • Results for America (RFA) was unable to determine the amount of resources SAMHSA invested in evaluations in FY19.
3.2 Did the agency have a budget for evaluation and how much was it? (Were there any changes in this budget from the previous fiscal year?)
  • RFA was unable to determine the budget for evaluation at SAMHSA and, thus, any changes from the previous fiscal year. SAMHSA evaluations are funded from program funds that also support service grants and technical assistance, and each of the three program Centers uses its program funds to conduct evaluations of varying types. Evaluations have also been funded with funds recycled from grants or other contract activities.
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
  • SAMHSA’s Evidence-Based Practices Resource Center aims to provide communities, clinicians, policy-makers, and others in the field with the information and tools they need to incorporate evidence-based practices into their communities or clinical settings. The Center lists nine technical assistance projects, two of which appear to provide financial or other resources to help city, county, and state governments or other grantees build evaluation capacity (as of September 2019).
Score
6
Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY19?
(Example: Performance stat systems, frequent outcomes-focused data-informed meetings)

4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • The SAMHSA Strategic Plan FY2019-FY2023 outlines five priority areas with goals and measurable objectives to carry out the vision and mission of SAMHSA. For each priority area, an overarching goal and a series of measurable objectives are described, followed by examples of key performance and outcome measures SAMHSA will use to track progress.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • The Centers have historically managed internal performance review boards to periodically review grantee performance and provide corrective actions as needed.
  • According to the FY2019-FY2023 Strategic Plan (pp. 21-22), SAMHSA will modernize the Performance Accountability and Reporting System by 1) capturing real-time data for discretionary grant programs in order to monitor their progress, impact, and effectiveness, and 2) developing benchmarks and disseminating annual Performance Evaluation Reports for all SAMHSA discretionary grant programs.
4.3 Did the agency have a continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement? (Examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)
  • In 2016, SAMHSA’s Office of Financial Resources (OFR) established a Program Integrity Review Team (PIRT) staffed by representatives from each of its four Centers and managed by OFR. On a quarterly basis, three SAMHSA discretionary grant portfolios (one from each of the three program Centers) conduct a self-analysis to examine grantee performance based on objective performance data, financial performance and other factors. Program staff present their program self-assessments to the PIRT and receive feedback on, for example, targets of concern.
Score
5
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data – consistent with strong privacy protections – to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers’ programs in FY19?
(Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies; data-use policies)

5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • The SAMHSA Strategic Plan FY2019-FY2023 (pp. 20-23) outlines five priority areas to carry out the vision and mission of SAMHSA, including Priority 4: Improving Data Collection, Analysis, Dissemination, and Program and Policy Evaluation. This Priority includes three objectives: 1) Develop consistent data collection strategies to identify and track mental health and substance use needs across the nation; 2) Ensure that all SAMHSA programs are evaluated in a robust, timely, and high-quality manner; and 3) Promote access to and use of the nation’s substance use and mental health data, conduct program and policy evaluations, and use the results to advance the adoption of evidence-based policies, programs, and practices.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c)) 
  • The Center for Behavioral Health Statistics and Quality (CBHSQ) oversees data collection initiatives and provides publicly available datasets so that some data can be shared with researchers and other stakeholders while preserving client confidentiality and privacy.
  • SAMHSA’s Substance Abuse and Mental Health Data Archive (SAMHDA) contains substance use disorder and mental illness research data available for restricted and public use. SAMHDA promotes access to and use of SAMHSA’s substance abuse and mental health data by providing public-use data files, documentation for download, and online analysis tools to support a better understanding of this critical area of public health.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • SAMHSA’s Performance Accountability and Reporting System (SPARS) hosts the data entry, technical assistance request, and training system for grantees reporting performance data to SAMHSA. SPARS serves as the data repository for the Administration’s three program Centers: the Center for Substance Abuse Prevention (CSAP), the Center for Mental Health Services (CMHS), and the Center for Substance Abuse Treatment (CSAT). To safeguard confidentiality and privacy, the current data transfer agreement limits the use of grantee data to internal reports, so data collected by SAMHSA grantees will not be shared with researchers or stakeholders beyond SAMHSA, and publications based on grantee data will not be permitted.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • The Center of Excellence for Protected Health Information (CoE for PHI) is a SAMHSA-funded technical assistance project designed to develop and increase access to simple, clear, and actionable educational resources, training, and technical assistance for consumers and their families, state agencies, and communities to promote patient care while protecting confidentiality.
  • Through the Substance Abuse and Mental Health Data Archive (SAMHDA), SAMHSA has partnered with the National Center for Health Statistics (NCHS) to host restricted-use National Survey on Drug Use and Health (NSDUH) data at Federal Statistical Research Data Centers (RDCs), secure facilities that provide access to a range of restricted-use microdata for statistical purposes.
Score
4
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY19?
(Example: What Works Clearinghouses)

6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • There is great diversity across SAMHSA programming, ranging from community-level prevention activities to residential programs for pregnant and post-partum women with substance misuse issues. While this diversity allows SAMHSA to be responsive to a wide set of vulnerable populations, it limits the utility of a common evidence framework for the entire agency. Within Centers (the Center for Substance Abuse Prevention, the Center for Substance Abuse Treatment, and the Center for Mental Health Services), consistent evidence frameworks are in use and help to shape the process of grant-making (e.g., Center staff are familiar with the pertinent evidence base for their particular portfolios).
  • In 2011, based on the model of the National Quality Strategy, SAMHSA developed the National Behavioral Health Quality Framework (NBHQF). With the NBHQF, SAMHSA proposes a set of core measures to be used in a variety of settings and programs, as well as in evaluation and quality assurance efforts. The proposed measures are not intended to be a complete set of measures a payer, system, practitioner, or program may want to use to monitor the quality of its overall system or the care and activities it provides. SAMHSA encourages such entities to utilize these basic measures, as appropriate, as a consistent set of indicators of quality in behavioral health prevention, promotion, treatment, and recovery support efforts across the nation.
6.2 Did the agency have a common evidence framework for funding decisions?
  • SAMHSA has universal language about using evidence-based practices (EBPs) that is included in its Funding Opportunity Announcements (FOAs) (entitled Using Evidence-Based Practices (EBPs)). This language includes acknowledgement that, “EBPs have not been developed for all populations and/or service settings” thus encouraging applicants to “provide other forms of evidence” that a proposed practice is appropriate for the intended population. Specifically, the language states that applicants should: (1) document that the EBPs chosen are appropriate for intended outcomes; (2) explain how the practice meets SAMHSA’s goals for the grant program; (3) describe any modifications or adaptations needed for the practice to meet the goals of the project; (4) explain why the EBP was selected; (5) justify the use of multiple EBPs, if applicable; and (6) discuss training needs or plans to ensure successful implementation. Lastly, the language includes resources the applicant can use to understand EBPs. Federal grants officers work in collaboration with the SAMHSA Office of Financial Resources to ensure that grantee funding announcements clearly describe the evidence standard necessary to meet funding requirements.
  • SAMHSA developed a manual, Developing a Competitive SAMHSA Grant Application, which explains information applicants will likely need for each section of the grant application. The manual has two sections devoted to evidence-based practices (p. 8, p. 26), including: 1) A description of the EBPs applicants plan to implement; 2) Specific information about any modifications applicants plan to make to the EBPs and a justification for making them; and 3) How applicants plan to monitor the implementation of the EBPs. In addition, if applicants plan to implement services or practices that are not evidence-based, they must show that these services/practices are effective.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.), including information on what works where, for whom, and under what conditions?
  • Until 2018, SAMHSA regarded the National Registry of Evidence-based Programs and Practices (NREPP) as the primary online user-friendly tool for identifying evidence-based programs for grantee implementation. In January 2018, SAMHSA announced that it was “moving to EBP [evidence-based practice] implementation efforts through targeted technical assistance and training that makes use of local and national experts and that will assist programs with actually implementing services….” NREPP was taken offline in August 2018. In August 2019, the Pew-MacArthur Results First Initiative announced it had restored users’ access to this information, which can be found in the Results First Clearinghouse.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • To date, SAMHSA has produced 11 Evidence-Based Practice Knowledge Informing Transformation (KIT) guides to help move the latest information available on effective behavioral health practices into community-based service delivery. The KITs contain information sheets, introductory videos, practice demonstration videos, and training manuals. Each KIT outlines the essential components of the evidence-based practice and provides suggestions collected from those who have successfully implemented it.
  • In April 2018, SAMHSA launched the Evidence-Based Practices Resource Center (Resource Center) that aims to provide communities, clinicians, policy-makers and others in the field with the information and tools they need to incorporate evidence-based practices into their communities or clinical settings. The Resource Center contains a collection of science-based resources, including Treatment Improvement Protocols, toolkits, resource guides, and clinical practice guidelines, for a broad range of audiences. The Resource Center also directs users to nine issue-based or regionally-organized Technical Assistance projects, all of which promote the use of evidence-based practices in some way. For example, the purpose of the Mental Health Technology Transfer Center (MHTTC) Network is disseminating and implementing evidence-based practices for mental disorders into the field.
Score
3
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY19?
(Examples: Prizes and challenges; behavioral science trials; innovation labs/accelerators; performance partnership pilots; demonstration projects or waivers with rigorous evaluation requirements)

7.1 Did the agency engage leadership and staff in its innovation efforts?
  • SAMHSA collaborates with other HHS agencies, via the HHS IDEA Lab, to promote innovative uses of data and technology across HHS to create a more effective government and improve the health of the nation. SAMHSA has co-developed and submitted several innovative data utilization project proposals to the Ignite Accelerator of the HHS IDEA Lab, such as Rapid Opioid Alert and Response (ROAR), a project to monitor and prevent opioid overdoses by linking heroin users to resources and information.
7.2 Did the agency have policies that promote innovation?
  • Pursuant to the 21st Century Cures Act, SAMHSA established the National Mental Health and Substance Use Policy Laboratory (NMHSUPL) as an office, led by a Director, within the agency to promote evidence-based practices and service delivery models. NMHSUPL staff and programs engage in the following activities: 1) Facilitate the implementation of policy changes likely to improve mental health, mental illness, recovery supports, and the prevention and treatment of substance use disorder services; 2) Work with CBHSQ to evaluate and disseminate evidence-based practices; and 3) Carry out other activities to encourage innovation and disseminate evidence-based programs and practices.
7.3 Did the agency have processes, structures, or programs to stimulate innovation?
  • The SAMHSA Program Portal, a collection of technical assistance and training resources provided by the agency, provides behavioral health professionals with education and collaboration opportunities, and ample tools and technical assistance resources that promote innovation in practice and program improvement. Located within the Knowledge Network are groups such as the Center for Financing Reform and Innovation, which works with states and territories, local policy makers, providers, consumers, and other stakeholders to promote innovative financing and delivery system reforms.
7.4 Did the agency evaluate its innovation efforts, including using rigorous methods?
  • SAMHSA does not list any completed evaluation reports on its evaluation website. Of the 10 evaluation reports found on the publications page, none appear to use experimental methods.
Score
6
Use of Evidence in 5 Largest Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest competitive grant programs in FY19? (Examples: Tiered-evidence frameworks; evidence-based funding set-asides; priority preference points or other preference scoring for evidence; Pay for Success provisions)

8.1 What were the agency’s 5 largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • SAMHSA’s five largest competitive grant programs in FY19 are: (1) State Opioid Response Grants ($1.5 billion in FY19) (eligible grantees: states); (2) Children’s Mental Health Services ($125 million in FY19); (3) Strategic Prevention Framework ($119.5 million in FY19) (eligible grantees: public and private nonprofit entities); (4) Targeted Capacity Expansion – General ($100.2 million in FY19) (eligible grantees: domestic public and private nonprofit entities); and (5) Project AWARE ($92 million in FY19) (eligible grantees: State education agencies).
8.2 Did the agency use evidence of effectiveness to allocate funds in 5 largest competitive grant programs? (e.g., Were evidence-based interventions/practices required or suggested? Was evidence a significant requirement?)
  • The 2018 State Opioid Response Grants application (the latest available) required states to use evidence-based practices to address opioid use disorder as one of five evaluation criteria; however, the application did not allot points to that criterion (p. 19).
  • The FY19 Project AWARE State Education Agency Grants application gave applicants 25 out of 100 points for the following: “Identify the Evidence-Based Practice(s) (EBPs) that will be used in each of the three LEAs. Discuss how each EBP chosen is appropriate for your population(s) of focus and the outcomes you want to achieve. Describe any modifications that will be made to the EBP(s) and the reason the modifications are necessary” (p. 23).
8.3 Did the agency use its 5 largest competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • The FY19 Strategic Prevention Framework Grants application states that SAMHSA may negotiate additional terms and conditions with applicants prior to grant award, including “requirements relating to participation in a cross-site evaluation” (p. 51).
  • The FY19 Targeted Capacity Expansion Grants application states that SAMHSA may negotiate additional terms and conditions with applicants prior to grant award, including “requirements relating to participation in a cross-site evaluation” (p. 57).
  • The FY19 Project AWARE State Education Agency Grants application states that SAMHSA may negotiate additional terms and conditions with applicants prior to grant award, including “requirements relating to participation in a cross-site evaluation” (p. 61).
8.4 Did the agency use evidence of effectiveness to allocate funds in any competitive grant program?
  • No examples available.
8.5 What are the agency’s 1-2 strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • No examples available.
8.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • No examples available.
Score
7
Use of Evidence in 5 Largest Non-Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its 5 largest non-competitive grant programs in FY19?
(Examples: Evidence-based funding set-asides; requirements to invest funds in evidence-based activities; Pay for Success provisions)

9.1 What were the agency’s 5 largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in largest 5 non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • In FY19, Congress maintained the 10 percent set-aside for evidence-based programs in SAMHSA’s Mental Health Block Grant (MHBG) to address the needs of individuals with early serious mental illness, including psychotic disorders, regardless of the age of the individual at onset (see p. 48 of the FY18-FY19 Block Grant application). In the FY20 budget request (p. 3), SAMHSA expressed its desire to continue the set-aside.
  • The FY18-FY19 Block Grant application requires states seeking Mental Health Block Grant (MHBG) and Substance Abuse Prevention and Treatment Block Grant (SABG) funds to identify specific priorities. For each priority, states must identify the relevant goals, measurable objectives, and at least one performance indicator for each objective, which must include strategies to deliver evidence-based individualized treatment plans (p. 21); evidence-based interventions for substance use or dependence (p. 21); building provider capacity to deliver evidence-based, trauma-specific interventions (p. 22); evidence-based programs, policies, and practices in prevention efforts (p. 22); and evidence-based models to prevent substance misuse (p. 23).
9.3 Did the agency use its 5 largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • The FY18-FY19 Block Grant application requires states applying for Substance Abuse Prevention and Treatment funds to create an evaluation plan, which must include at least five specified evaluation elements. Additionally, the application specifies that SAMHSA will work with the National Institute of Mental Health (NIMH) to plan for program evaluation and data collection related to demonstrating program effectiveness of the Mental Health Block Grant.
9.4 Did the agency use evidence of effectiveness to allocate funds in any non-competitive grant program?
  • No examples available.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • No examples available.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • The FY18-FY19 Block Grant Application clarified that “Section 1921 of the PHS [Public Health Service] Act (42 U.S.C. § 300x-21) authorizes the States to obligate and expend SABG [Substance Abuse Prevention and Treatment Block Grant] funds to plan, carry out and evaluate activities and services designed to prevent and treat substance use disorders” (p. 16). The Application further clarifies that states “may utilize SABG funds to train personnel to conduct fidelity assessments of evidence-based practices” (p. 35).
Score
4
Repurpose for Results

In FY19, did the agency shift funds away from or within any practice, policy, or program that consistently failed to achieve desired outcomes?
(Examples: Requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; incentivizing or urging grant applicants to stop using ineffective practices in funding announcements; proposing the elimination of ineffective programs through annual budget requests; incentivizing well-designed trials to fill specific knowledge gaps; supporting low-performing grantees through mentoring, improvement plans, and other forms of assistance; using rigorous evaluation results to shift funds away from a program)  

10.1 Did the agency shift funds/resources away from ineffective practices or interventions used within programs or by grantees?
  • The State Opioid Response Grants program required states and subgrantees to use only evidence-based treatments, practices, and interventions. As such, SAMHSA disallowed the use of medical withdrawal (detoxification) in isolation since it “is not the standard of care for OUD, is associated with a very high relapse rate, and significantly increases an individual’s risk for opioid overdose and death if opioid use is resumed” (p. 6). SAMHSA further clarified: “SAMHSA will monitor use of these funds to assure that they are being used to support evidence-based treatment and recovery supports, and will not permit use of these funds for non-evidence-based approaches” (p. 7). Further, under Standard Funding Restrictions, SAMHSA listed “non-evidence-based treatment approaches” (p. 54).
10.2 Did the agency shift funds/resources away from ineffective policies used within programs or by grantees?
  • No examples available.
10.3 Did the agency shift funds/resources away from ineffective grantees?
  • No examples available.
10.4 Did the agency shift funds/resources away from ineffective programs? (e.g., eliminations or legislative language in budget requests)
  • No examples available.
10.5 Did the agency shift funds/resources away from consistently ineffective products and services?
  • In January 2018, SAMHSA announced it would shift resources away from the National Registry of Evidence-based Programs and Practices (NREPP) toward targeted technical assistance and training for implementing evidence-based practices, reasoning that NREPP’s flawed and skewed presentation of evidence-based interventions “did not address the spectrum of needs of those living with serious mental illness and substance use disorders.”