2019 Federal Standard of Excellence


Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data - consistent with strong privacy protections - to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers' programs in FY19? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; open data policies; data-use policies)

Score
5
Administration for Children and Families (HHS)
5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • ACF’s Interoperability Action Plan was established in 2017 to formalize ACF’s vision for effective and efficient data sharing. Under this plan ACF and its program offices will develop and implement a Data Sharing First (DSF) strategy that starts with the assumption that data sharing is in the public interest. The plan states that ACF will encourage and promote data sharing broadly, constrained only when required by law or when there are strong countervailing considerations.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • In 2018, ACF produced a Compendium of ACF Administrative and Survey Data Resources. It covers all major ACF person-level administrative data sets and surveys: 11 administrative data sources and eight surveys. Each entry includes the following information: data ownership and staff experts, basic content, major publications and websites, available data sets (public, restricted use, in-house), restrictions on data sharing, capacity to link with other data sets along with history of such linking, data quality, and resources to collect, prepare, and analyze the data. The compendium is currently available for internal use at HHS; a public version is forthcoming.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • ACF has multiple efforts underway to promote and support the use of documented data for research and improvement, including making numerous administrative and survey datasets publicly available for secondary use and actively promoting the archiving of research and evaluation data for secondary use. These data are machine readable, downloadable, and de-identified as appropriate for each data set. For example, individual-level data for research is held in secure restricted use formats, while public-use data sets are made available online. To make it easier to find these resources, ACF plans to release a Compendium of ACF Administrative and Survey Data and to consolidate information on archived research and evaluation data on the OPRE website.
  • OPRE actively promotes archiving of research and evaluation data for secondary use. OPRE research contracts include a standard clause requiring contractors to make data and analyses supported through federal funds available to other researchers and to establish procedures and parameters for all aspects of data and information collection necessary to support archiving information and data collected under the contract. Many datasets from past OPRE projects are stored in archives including the ACF-funded Child Care & Early Education Research Connections site and the ICPSR data archive. OPRE has funded grants for secondary analysis of ACF/OPRE data; examples in recent years include secondary analysis of strengthening families datasets and early care and education datasets.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • ACF developed a Confidentiality Toolkit that supports state and local efforts by explaining the rules governing confidentiality in ACF and certain related programs, by providing examples of how confidentiality requirements can be addressed, and by including sample memoranda of understanding and data-sharing agreements. ACF is currently updating the Toolkit to reflect recent statutory changes and to provide real-world examples of how data have been shared across domains—which frequently do not have harmonized privacy requirements—while complying with all relevant privacy and confidentiality requirements (e.g., FERPA, HIPAA). These case studies will also include downloadable, real-world tools that have been successfully used in the highlighted jurisdictions.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • ACF engages in several broad-based and cross-cutting efforts to support state, local, and tribal efforts to use human services data while protecting privacy and confidentiality. Through the Interoperability Initiative, ACF supports data sharing through developing standards and tools that are reusable across the country, addressing common privacy and security requirements to mitigate risks, and providing request-based technical assistance to states, local jurisdictions, and ACF program offices. Several ACF divisions have also been instrumental in supporting cross-governmental efforts, such as the National Information Exchange Model (NIEM) that will enable human services agencies to collaborate with health, education, justice, and many other constituencies that play a role in the well-being of children and families. ACF also undertakes many program-specific efforts to support state, local, and tribal efforts to use human services data while protecting privacy and confidentiality. For example, ACF’s TANF Data Innovation Project supports innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. This work includes encouraging and strengthening state integrated data systems, promoting proper payments and program integrity, and enabling data analytics for TANF program improvement.
Score
6
Administration for Community Living
5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • As an operating division of a CFO Act Agency, the U.S. Department of Health and Human Services, ACL is not required to have its own strategic data plan and instead follows HHS’s data strategy. ACL provides public access to its programmatic data through a web-based portal. In 2019, ACL created a council to improve ACL’s data governance, including the development of improved processes and standards for defining, collecting, reviewing, certifying, analyzing, and presenting the data that ACL collects through its evaluation, grant reporting, and administrative performance measures. In addition, the council will help the Office of Performance and Evaluation meet its mission to provide and promote high-quality, transparent information to support sound decision-making.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • As part of its data restructuring efforts, ACL has created an internal inventory of its administrative data sets. In addition, these data sets/files are documented through the Privacy Impact Assessment and the Systems of Records Notice processes which are required under Titles II and III of the E-Government Act of 2002. The Act specifically requires that agencies evaluate systems that collect personally identifiable information (PII) and determine whether the privacy of that PII is adequately protected. Agencies perform this evaluation through a privacy impact assessment (PIA). One result is a complete listing of all data collections.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • In 2016, ACL implemented a Public Access Plan as a mechanism for compliance with the White House Office of Science and Technology Policy’s public access policy. The plan focused on making published results of ACL/NIDILRR-funded research more readily accessible to the public; making scientific data collected through ACL/NIDILRR-funded research more readily accessible to the public; and increasing the use of research results and scientific data to further advance scientific endeavors and other tangible applications. In March 2019, ACL completed the ACL Data Restructuring (DR) Project to assess the data hosted on the Aging Integrated Database (AGID) and to develop and test a potential restructuring of the data in order to make it more useful and usable for stakeholders. In 2019, ACL awarded a follow-on contract to further integrate its datasets along the lines of conceptual linkages and to better align the measures within ACL’s data collections across the agency. This work will consist of careful data documentation, building a data repository for aging datasets (with the capability to expand to ACL’s disability datasets), aligning measures for conceptual linkages across ACL datasets, and reviewing datasets for potential topical navigation of the data. The ultimate goal is to expand ACL’s current public data portal (AGID) to allow users to examine ACL data across data sets, geographies, and years.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • As an operating division of the U.S. Department of Health and Human Services, ACL follows all departmental guidance regarding data privacy and security. This includes project-specific reviews by ACL’s Office of Information Resource Management (OIRM), which monitors all of ACL’s data collection activities to ensure the safety and security of ACL’s data assets. In FY19, ACL awarded a contract to stand up a “Data Council” to enhance the quality, security, and statistical usability of the data ACL collects through its evaluation, grant reporting, and administrative data collections, and to develop effective data governance standards. In addition, each funding opportunity announcement states that “a data and safety monitoring board (DSMB) is required for all multi-site clinical trials involving interventions” (see, for example, the FOA for Disability and Rehabilitation Research Projects (DRRP): Assistive Technology to Promote Independence and Community Living (Development), HHS-2019-ACL-NIDILRR-DPGE-0355).
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
Score
8
U.S. Agency for International Development
5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • USAID’s data-related investments and efforts are guided by its Information Technology Strategic Plan. This includes support for the Agency’s Development Data Policy, USAID’s open data policy, which provides a framework for systematically collecting Agency-funded data, structuring the data to ensure usability, and making the data public while ensuring rigorous protections for privacy and security. In addition, this policy sets requirements for how USAID data is tagged, submitted, and updated. The Development Data Library (DDL) is the Agency’s repository of USAID-funded, machine-readable data created or collected by the Agency and its implementing partners. The DDL, as a repository of structured and quantitative data, complements the Development Experience Clearinghouse (DEC), which publishes qualitative reports and information.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • Launched in November 2018 as part of the Development Information Solution (DIS), USAID’s public-facing Development Data Library (DDL) provides a comprehensive inventory of data assets available to the Agency. The DDL’s data catalog is also harvested on an ongoing basis, via the Agency’s data.json file, for further distribution on the federal Data.gov website.
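Data.gov builds its federal catalog by harvesting metadata that agencies publish in the Project Open Data v1.1 schema. The sketch below shows what a minimal catalog entry of that kind looks like; the dataset title, identifier, and URL are hypothetical placeholders, not actual DDL records.

```python
import json

# Minimal catalog in the Project Open Data v1.1 metadata schema, the format
# Data.gov harvests from agency catalogs. All dataset values below are
# hypothetical placeholders, not actual DDL records.
catalog = {
    "conformsTo": "https://project-open-data.cio.gov/v1.1/schema",
    "dataset": [
        {
            "title": "Example Household Survey (De-identified)",  # hypothetical
            "description": "Machine-readable, de-identified survey microdata.",
            "identifier": "example-ddl-0001",  # hypothetical
            "accessLevel": "public",
            "publisher": {"name": "U.S. Agency for International Development"},
            "distribution": [
                {
                    "mediaType": "text/csv",
                    "downloadURL": "https://example.gov/survey.csv",  # hypothetical
                }
            ],
        }
    ],
}

print(json.dumps(catalog, indent=2))
```

Entries like this one are what make a dataset discoverable both on an agency's own portal and through the central Data.gov search.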
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • The Data Services team, located in USAID’s Management Bureau’s Office of the Chief Information Officer (M/CIO), manages a comprehensive portfolio of data services in support of the Agency’s mission. This includes enhancing the internal and external availability and ease of use of USAID data and information via technology platforms such as the USAID Economic Analysis and Data Services platform, broadening global awareness of USAID’s data and information services, and bolstering the Agency’s capacity to use data and information via training and the provision of demand-driven analytical services.
  • The Data Services Team also manages and develops the Agency’s digital repositories, including the Development Data Library (DDL), the Agency’s central data repository. USAID and external users can search for and access datasets from completed evaluations and program monitoring by country and sector.
  • USAID staff also have access to an internal database of over 100 standard foreign assistance program performance indicators and associated baseline, target, and actual data reported globally each year. This database and reporting process, known as the Performance Plan and Report (PPR), promotes evidence building and informs internal learning and decisions related to policy, strategy, budgets, and programs.
  • The United States is a signatory to the International Aid Transparency Initiative (IATI), a voluntary, multi-stakeholder initiative that created a data standard for publishing foreign assistance spending data in machine-readable format. The standard links an activity’s financial data to its evaluations. USAID continues to improve and add to its published IATI data and is exploring ways to use these data as a best practice, including using them to populate partner country systems, fulfill transparency reporting as part of the U.S. commitment to the Grand Bargain, and inform internal decisions, in part by examining what other development actors are doing through the newly launched Development Cooperation Landscape tool.
  • The Landscape tool enables USAID staff to better understand cooperation partners’ priorities and identify potential areas of alignment. This data source contributes to more robust cooperation strategy development and decision making and helps USAID use cooperation resources more effectively and efficiently. USAID also created the Global Innovation Exchange, which shares information about development innovations with hundreds of industry partners and governments.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • USAID’s Privacy Program directs policies and practices for protecting personally identifiable information and data, while several policy references (ADS303maz and ADS302mbj) provide guidance for protecting information to ensure the health and safety of implementing partners. USAID’s Development Data Policy (ADS Chapter 579) details a data publication process that provides governance for data access and data release in ways that ensure protections for personal and confidential information. As a reference to the Development Data Policy, ADS579maa explains USAID’s foreign assistance data publications and the protection of any sensitive information prior to release. USAID applies statistical disclosure control on all public data before publication or inclusion in the DDL.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • While specific data on this is limited, USAID does invest in contracts or grants that provide support to build local organizational or governmental capacity in data collection, analysis, and use. In addition, to date, more than 245 USAID data assets are available to the public via USAID’s DDL. These assets include microdata related to USAID’s initiatives that provide partner countries and development partners with insight into emerging trends and opportunities for expanding peace and democracy, reducing food insecurity, and strengthening the capacity to deliver quality educational opportunities for children and youth around the globe. Grantees are encouraged to use the data on the DDL, which provides an extensive User Guide to aid in accessing, using, securing, and protecting data. The Data Services team conducts communication and outreach to expand awareness of its development data websites, how to access the data, and how to contact the team for support. In addition, the Data Services team has developed a series of videos showing users how to access the available data. The dataservices@usaid.gov mail account responds to requests for assistance and guidance on a range of data services from within the Agency, from implementing partners, and from the public.
Score
5
Corporation for National and Community Service
5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • CNCS has an Information Technology Data Governance Policy, which addresses open data, and an internal Data Sharing Policy that was implemented in FY18. CNCS has not historically posted these policies publicly but is moving in that direction as the CIO develops and clears policies as well as hires staff to oversee these efforts.
  • The CIO/Acting CDO and the Director of Research and Evaluation/Evaluation Officer will work together in FY20 to reconstitute and reconvene the agency’s Data Council and determine what kind of charter/agency policy may be needed to establish the role of the Council with regard to managing the agency’s data assets. In essence, the role of the Council, under the direction of the Acting CDO, will be to prioritize data asset management issues such as creating an annual Fact Sheet (so all externally facing numbers have a single authoritative source), creating a more user-friendly interface for the agency’s data warehouse/data inventory, and keeping the agency’s open data platform current.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • The agency has an Information Technology Data Governance Policy, which addresses the need to have a current and comprehensive data inventory. The agency has an open data platform.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • CNCS has a data request form and an MOU template so that anyone interested in accessing agency data may use the protocol to request data. In addition, public data sets are accessible through the agency’s open data platform. The agency’s member exit survey data was made publicly available for the first time in FY19. In addition, nationally representative civic engagement and volunteering statistics are available, through a data sharing agreement with the Census Bureau, on an interactive platform. The goal of these platforms is to make these data more accessible to all interested end-users.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • The agency has an Information Technology Data Governance Policy, which addresses data security and protecting personal/confidential information. CNCS also has a cybersecurity policy, which will likely be subsumed under the new Data Governance Policy.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • CNCS provides assistance to grantees, including governments, to help them access agency data. For example, CNCS provided assistance on using the AmeriCorps Member Exit Survey data to State Service Commissions (many of which are part of state government) and other grantees at the National Service Training Conference in May 2019.
Score
6
U.S. Department of Education
5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • ED’s FY18-22 Performance Plan outlines strategic goals and objectives for the Department, including Goal #3: “Strengthen the quality, accessibility and use of education data through better management, increased privacy protections and transparency.” This currently serves as a strategic plan for the Department’s governance, protection, and use of data while it develops the Open Data Plan required by the Evidence Act. The plan currently tracks measurable performance on a number of metrics including the public availability of machine-readable datasets and open licensing requirements for deliverables created with Department grant funds. ED will continue to expand its open data infrastructure to improve how stakeholders find, access and manage the Department’s public data. This will include establishing an enterprise open data platform that will make the Department’s public data discoverable from a single location and easily searchable by topic. As is required by the Evidence Act, the Department will be publishing its open data plan in 2020 within the agency’s Information Resource Management Strategic Plan.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • Information about Department data collected by the National Center for Education Statistics (NCES) has historically been made publicly available online. Prioritized data is further documented or featured on the Department’s data page.
  • In FY20, the Department will be launching an open data platform designed for improved public engagement and tailored to meet the requirements of the comprehensive data asset inventory described in the Evidence Act.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • In ED’s FY18-22 Performance Plan, Strategic Objective 3.3 is to “Increase access to, and use of education data to make informed decisions both at the Department and in the education community” and outlines actions taken in FY18. In 2019, IES generated analysis and reporting of a new type of data produced by the new NAEP digitally based assessments. This included providing a detailed statistical guide (via an R package), which facilitates external researchers’ computation of analytic weights. Additionally, IES added 12 longitudinal data sets to the DataLab system, which improved access to National Center for Education Statistics’ (NCES) sample survey data. In 2018, ED publicly released 126 data sets in machine readable formats. As of September 2019, the Department is on track to exceed its 2019 target for publicly released machine readable datasets.
  • Through the leadership of NCES, ED continues to invest in clearer, more defined standards for education data. The Common Education Data Standards (CEDS) have been developed over the past 10 years using an open process to engage a broad range of data stakeholders, including local and state education agencies, postsecondary institutions, and interested organizations. CEDS establishes a common vocabulary, data model, and technical tools to help education stakeholders understand and use education data.
  • ED has also made concerted efforts to improve the availability and use of its data with the release of the revised College Scorecard, which links data from NCES, the Office of Federal Student Aid, and the Internal Revenue Service. In FY19, the Department released provisional data describing student debt by field of study. ED plans to integrate additional field-of-study data into its College Scorecard consumer site and the Office of Federal Student Aid’s NextGen student tools.
  • In September 2019, the Department established an agency-level Data Governance Body (DGB), chaired by the Chief Data Officer (CDO), with participation from relevant senior-level staff in agency business units. The DGB will assist the CDO in assessing and adjudicating competing proposals aimed at achieving and measuring desirable Departmental data outcomes and priorities.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • The Disclosure Review Board, the EDFacts Governing Board, the Student Privacy Policy Office (SPPO), and SPPO’s Privacy Technical Assistance Center all help to ensure the quality and privacy of education data. In FY19, the ED Data Strategy Team also published a user resource guide for staff on disclosure avoidance considerations throughout the data lifecycle.
  • In ED’s FY18-22 Performance Plan, Strategic Objective 3.2 is to “Improve privacy protections for, and transparency of, education data both at the Department and in the education community.” The plan also outlines actions taken in FY18. ED’s Student Privacy website assists stakeholders in protecting student privacy by providing official guidance on FERPA, technical best practices, and the answers to Frequently Asked Questions. ED’s Privacy Technical Assistance Center (PTAC) responded to more than 3,200 technical assistance inquiries on student privacy issues and provided online FERPA training to more than 57,000 state and school district officials. FSA conducted a postsecondary institution breach response assessment to determine the extent of a potential breach and provide the institutions with remediation actions around their protection of FSA data and best practices associated with cybersecurity.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • InformED, ED’s primary open data initiative, works to improve the Department’s capacity to make public education data accessible and usable in innovative and effective ways for families, policy makers, researchers, developers, advocates, and other stakeholders.
  • In FY 2018, ED developed and released a series of three data stories focused on the characteristics, educational experiences and academic outcomes of English learners. ED also updated its data story on chronic absenteeism and released another new data story on career and technical education. These data stories have interactive graphics and accompanying narrative text to promote better access and use of Department data by a wider variety of stakeholders. Also in FY18, the Office of Special Education and Rehabilitative Services supported technical assistance centers to conduct several conferences to assist states in using their Individuals with Disabilities Education Act (IDEA) data to make informed decisions.
Score
6
U.S. Dept. of Housing & Urban Development
5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • As HUD implements the DATA Act and Evidence Act, new open data policies, accountability principles, and automated reporting are being developed and implemented during FY19 (see FY20 Annual Performance Plan). HUD maintains a vigorous open data program including administrative datasets on data.hud.gov, spatially enabled data on the eGIS portal, PD&R datasets for researchers and practitioners, a robust partnership with the Census Bureau, U.S. Postal Service vacancy data, and health data linkages with the National Center for Health Statistics.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • HUD has an Inventory Schedule to identify tasks and timelines for engaging program offices to identify data assets, assess whether data can be made available to the public, and create metadata and guidance for shareable data. At present, the Enterprise Data Inventory remains incomplete while the Enterprise Data Management Policy is updated during FY19.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • HUD has an updated list of open data assets; numerous PD&R-produced datasets for researchers and practitioners, including tenant public use microdata samples; and an eGIS portal providing geo-identified open data to support public analysis of housing and community development issues using GIS tools.
  • PD&R has data linkage agreements with the National Center for Health Statistics and the Census Bureau to enhance major national survey datasets by identifying HUD-assisted households; making available major program demonstration datasets in secure environments; and to produce special open-access tabulations of census data for HUD’s partners.
  • PD&R engages in cooperative agreements with research organizations, including both funded Research Partnerships and unfunded Data License Agreements, to support innovative research that leverages HUD’s data assets and informs HUD’s policies and programs. Data licensing protocols ensure that confidential information is protected.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • HUD’s Evaluation Policy specifies that HUD protects client privacy by adhering to the Rule of Eleven to prevent disclosure from tabulations with small cell sizes.
  • Data licensing protocols ensure that researchers protect confidential information when using HUD’s administrative data or program demonstration datasets.
  • HUD has an interagency agreement with the Census Bureau to link administrative data from HUD’s tenant databases and randomized control trials with the Bureau’s survey data collection and other administrative data collected under the privacy protections of its Title 13 authority. These RCT datasets are the first intervention data added to Federal Statistical RDCs by any federal agency, and strict protocols and review of all output ensure that confidential information is protected.
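The Rule of Eleven cited above is a cell-suppression threshold: tabulation cells representing fewer than 11 cases are withheld before release. A minimal sketch of such a suppression pass, with an illustrative table and a hypothetical function name, might look like this:

```python
# Sketch of small-cell suppression in the spirit of HUD's "Rule of Eleven":
# any tabulation cell counting fewer than 11 cases is withheld before the
# table is released. The threshold follows the stated policy; the data and
# the function name are illustrative.

SUPPRESSION_THRESHOLD = 11

def suppress_small_cells(table):
    """Return a copy of the tabulation with sub-threshold counts set to None."""
    return {
        cell: (count if count >= SUPPRESSION_THRESHOLD else None)
        for cell, count in table.items()
    }

tabulation = {"tract_A": 154, "tract_B": 9, "tract_C": 23}
released = suppress_small_cells(tabulation)
# tract_B's count of 9 falls below the threshold and is suppressed
```

In practice, agencies layer further protections on top of primary suppression (for example, secondary suppression so that hidden cells cannot be recovered from row and column totals), but the threshold check above is the core of the rule.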
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • HUD has an updated list of open data assets, an open data program, numerous PD&R datasets for researchers and practitioners, and an eGIS portal providing geo-identified data to support public analysis of housing and community development issues related to multiple programs and policy domains using GIS tools. These accessible data assets have privacy protections. Researchers needing detailed microdata can obtain access through data licensing agreements.
  • HUD Exchange offers numerous resources and training opportunities to help program partners use data assets more effectively. Additional technical assistance is offered through the Community Compass program, a $25 million technical assistance program to equip HUD’s customers with the knowledge, skills, tools, capacity, and systems to implement HUD programs and policies successfully and provide effective oversight of federal funding.
Score
5
U.S. Department of Labor
5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • DOL’s open government plan was last updated in 2016, and subsequent updates have been delayed in anticipation of the formal release of the Federal Data Strategy and the Evidence Act.
  • DOL also has open data assets aimed at developers and researchers who desire data-as-a-service through application programming interfaces hosted by both the Office of Public Affairs and the Bureau of Labor Statistics (BLS). Each of these has clear documentation, is consistent with the open data policy, and offers transparent, repeatable, machine-readable access to data on an as-needed basis. The Department is currently developing a new API v3, which will expand the open data offerings, extend the capabilities, and offer a suite of user-friendly tools.
  • The Department has consistently sought to make as much data regarding its activities available to the public as possible. Examples include DOL’s Public Enforcement Database, which makes available records of activity from the worker protection agencies, and the Office of Labor-Management Standards’ online public disclosure room.
  • The Department also has multiple restricted-use access systems which go beyond what would be possible with simple open-data efforts. BLS has a confidential researcher access program, offering access under appropriate conditions to sensitive data. Similarly, the Chief Evaluation Office (CEO) has stood up a centralized research hub for evaluation study partners to leverage sensitive data in a consistent manner to help make evidence generation more efficient.
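The data-as-a-service interfaces mentioned above can be used programmatically. The sketch below assembles a request for BLS’s public timeseries API; the endpoint and JSON payload shape follow BLS’s published v2 documentation, while the series ID (CPI for All Urban Consumers) is just an example.

```python
import json

# Sketch of querying the BLS Public Data API (v2), one of DOL's
# open data-as-a-service interfaces. The endpoint and payload shape
# follow BLS's v2 documentation; the series ID is illustrative.
BLS_API_URL = "https://api.bls.gov/publicAPI/v2/timeseries/data/"


def build_bls_request(series_ids, start_year, end_year):
    """Assemble the JSON body for a BLS v2 timeseries request."""
    return json.dumps({
        "seriesid": series_ids,
        "startyear": str(start_year),
        "endyear": str(end_year),
    })


body = build_bls_request(["CUUR0000SA0"], 2018, 2019)
# POST `body` to BLS_API_URL with Content-Type: application/json
# (e.g., via urllib.request) to retrieve machine-readable series data.
```

Unregistered use of the v2 API is rate-limited; BLS offers a free registration key for higher daily request volumes.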
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • The Department has conducted extensive inventories over the last ten years, in part to support common activities such as IT modernization, White House Office of Management and Budget (OMB) data calls, and the general goal of transparency through data sharing. These form the current basis of DOL’s planning and administration. Some sections of the Evidence Act have led to a different federal posture with respect to data, such as the requirement that data be open by default and considered shareable absent a legal requirement to the contrary or a risk that release would create a disclosure risk. The Department is currently re-evaluating its inventories and its public data offerings in light of this requirement and revisiting the issue across all its programs. Because a current inventory is a critical prerequisite for open data plans, as well as data governance and data strategy frameworks, the agency hopes to have a revised inventory completed by the end of FY19.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • DOL’s CEO, Employment & Training Administration (ETA), and Veterans Employment & Training Service (VETS) have worked with the U.S. Department of Health and Human Services (HHS) to develop a secure mechanism for obtaining and analyzing earnings data from the National Directory of New Hires. In the past year DOL has entered into interagency data-sharing agreements with HHS and obtained data to support 10 job training and employment program evaluations.
  • During FY19, the Department continued to expand efforts to improve the quality of and access to data for evaluation and performance analysis through the Data Analytics Unit in CEO, and through new pilots beginning in BLS to access and exchange state labor market and earnings data for statistical and evaluation purposes.
  • The Data Analytics Unit also continued to leverage its Data Exchange and Analysis Platform (DEAP), with high processing capacity and privacy provisions, to share, link, and analyze program data for recently separated veterans, public workforce outcomes, and sensitive worker protection data such as complaint filings. This work identifies trends and patterns in the data that become the foundation for future program improvements. The analysis also creates a feedback loop that can improve data quality and allows analysts to determine whether a program’s data are appropriate to support more rigorous performance and evaluation approaches.
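Platforms that link program data across sources while protecting privacy commonly join records on a keyed hash of a direct identifier rather than on the identifier itself. The sketch below is a hypothetical illustration of that pattern, not a description of DEAP’s internals; the field names, salt handling, and sample records are invented.

```python
import hashlib

# Hypothetical sketch of privacy-preserving record linkage: two
# program datasets are joined on a salted hash of a direct identifier
# so the identifier itself is never shared. Illustrative only.
SALT = b"agency-held-secret"  # held by the data steward, never shared


def pseudonymize(identifier):
    """Return a salted SHA-256 pseudonym for a direct identifier."""
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()


def link(records_a, records_b):
    """Join two datasets on the pseudonymized identifier ('pid')."""
    index = {r["pid"]: r for r in records_b}
    return [
        {**a, **index[a["pid"]]}
        for a in records_a
        if a["pid"] in index
    ]


vets = [{"pid": pseudonymize("123-45-6789"), "program": "TAP"}]
wages = [{"pid": pseudonymize("123-45-6789"), "q1_earnings": 9500}]
linked = link(vets, wages)  # one matched record with both fields
```

A simple salted hash like this deters casual re-identification but is still vulnerable to dictionary attacks on low-entropy identifiers; production systems typically add keyed HMACs, access controls, and audit trails.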
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • DOL’s ETA has agreements with 50 states, the District of Columbia, and Puerto Rico for data sharing and exchange of interstate wage data for performance accountability purposes. Currently, ETA is finalizing an updated data sharing agreement with states that will facilitate better access to quarterly wage data by states for purposes of performance accountability and research and evaluation requirements under the Workforce Innovation and Opportunity Act (WIOA). This work aims to expand access to interstate wage data for the U.S. Department of Education’s Adult Education and Family Literacy Act (AEFLA) programs and Vocational Rehabilitation programs, among others.
  • ETA continues to fund and provide technical assistance to states under the Workforce Data Quality Initiative to link earnings and workforce data with education data longitudinally to support state program administration and evaluation. As of June 2019, seven rounds of grants have been awarded to states. ETA and VETS also have modified state workforce program reporting system requirements to include data items for a larger set of grant programs, which will improve access to administrative data for evaluation and performance management purposes. An example of the expanded data reporting requirements is the Homeless Veterans Reintegration Program FY16 grants.
Score
6
Millennium Challenge Corporation
5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • As detailed on MCC’s Digital Strategy and Open Government pages, MCC promotes transparency to provide people with access to information that facilitates their understanding of MCC’s model, MCC’s decision-making processes, and the results of MCC’s investments. Transparency, and therefore open data, is a core principle for MCC because it is the basis for accountability, provides strong checks against corruption, builds public confidence, and supports informed participation of citizens.
  • As a testament to MCC’s commitment to and implementation of transparency and open data, the agency was the highest-ranked U.S. government agency in the 2018 Publish What You Fund Aid Transparency Index for the fifth consecutive year. In addition, the U.S. government is part of the Open Government Partnership, a signatory to the International Aid Transparency Initiative, and must adhere to the Foreign Aid Transparency and Accountability Act. All of these initiatives require foreign assistance agencies to make it easier to access, use, and understand data. All of these actions have created further impetus for MCC’s work in this area, as they establish specific goals and timelines for adoption of transparent business processes.
  • Additionally, MCC convened an internal Data Governance Board, an independent group consisting of representatives from departments throughout the agency, to streamline MCC’s approach to data management and advance data-driven decision-making across its investment portfolio.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • MCC makes extensive program data, including financials and results data, publicly available through its Open Data Catalog, which includes an “enterprise data inventory” of all data resources across the agency for release of data in open, machine-readable formats. The Department of Policy and Evaluation leads the MCC Disclosure Review Board process for publicly releasing the de-identified microdata that underlies the independent evaluations on the Evaluation Catalog, following MCC’s Microdata Management Guidelines to ensure appropriate balance in transparency efforts with protection of human subjects’ confidentiality.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • MCC’s Data Analytics Program (DAP) enables enterprise data-driven decision-making through the capture, storage, analysis, publishing, and governance of MCC’s core programmatic data. The DAP streamlines the agency’s data lifecycle, facilitating increased efficiency. Additionally, the program promotes agency-wide coordination, learning, and transparency. For example, MCC has developed custom software applications to capture program data, established the infrastructure for consolidated storage and analysis, and connected robust data sources to end user tools that power up-to-date, dynamic reporting and also streamline content maintenance on MCC’s public website. As a part of this effort, the Monitoring and Evaluation team has developed an Evaluation Pipeline application that provides up-to-date information on the status, risk, cost, and milestones of the full evaluation portfolio for better performance management.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • MCC’s Disclosure Review Board ensures that data collected from surveys and other research activities is made public according to relevant laws and ethical standards that protect research participants, while recognizing the potential value of the data to the public. The board is responsible for: reviewing and approving procedures for the release of data products to the public; reviewing and approving data files for disclosure; ensuring de-identification procedures adhere to legal and ethical standards for the protection of research participants; and initiating and coordinating any necessary research related to disclosure risk potential in individual, household, and enterprise-level survey microdata on MCC’s beneficiaries.
  • The Microdata Management Guidelines inform MCC staff and contractors, as well as other partners, on how to store, manage, and disseminate evaluation-related microdata. This microdata is distinct from other data MCC disseminates because it typically includes personally identifiable information and sensitive data as required for the independent evaluations. With this in mind, MCC’s Guidelines govern how to manage three competing objectives: share data for verification and replication of the independent evaluations, share data to maximize usability and learning, and protect the privacy and confidentiality of evaluation participants. These Guidelines were established in 2013 and updated in January 2017. Following these Guidelines, MCC has publicly released 76 de-identified public-use microdata files for its evaluations. MCC’s experience with developing and implementing this rigorous process for data management and dissemination while protecting human subjects throughout the evaluation life cycle is detailed in Opening Up Evaluation Microdata: Balancing Risks and Benefits of Research Transparency.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • Both MCC and its partner in-country teams produce and provide data that is continuously updated and accessed. MCC’s website is routinely updated with the most recent information, and in-country teams are required to do the same on their respective websites. As such, all MCC program data is publicly available on MCC’s website and individual MCA websites for use by MCC country partners, in addition to other stakeholder groups. As a part of each country program, MCC provides resources to ensure data and evidence are continually collected, captured, and accessed. In addition, each project’s evaluation has an Evaluation Brief that distills key learning from MCC-commissioned independent evaluations. Select Evaluation Briefs have been posted in local languages, including Mongolian, Georgian, French, and Romanian, to better facilitate use by country partners.
  • MCC also has a partnership with the President’s Emergency Plan for AIDS Relief (PEPFAR), referred to as the Data Collaboratives for Local Impact (DCLI). This partnership is improving the use of data analysis for decision-making within PEPFAR and MCC partner countries by working toward evidence-based programs to address challenges in HIV/AIDS and health, empowerment of women and youth, and sustainable economic growth. Data-driven priority setting and insights gathered through citizen-generated data and community mapping initiatives contribute to improved allocation of resources in target communities to address local priorities, such as job creation, access to services, and reduced gender-based violence. DCLI continues to inform and improve the capabilities of PEPFAR activities through projects such as the Tanzania Data Lab, which has trained nearly 700 individuals, nearly 50% of whom are women, and has hosted a one-of-a-kind “Data Festival.” Recently, the Lab announced a partnership with the University of Virginia Data Science Institute and catalyzed the launch of the first Master’s in Data Science in East Africa, in partnership with the University of Dar es Salaam.
Score
5
Substance Abuse and Mental Health Services Administration
5.1 Did the agency have a strategic data plan, including an open data policy? (Example: Evidence Act 202(c), Strategic Information Resources Plan)
  • The SAMHSA Strategic Plan FY2019-FY2023 (pp. 20-23) outlines five priority areas to carry out the vision and mission of SAMHSA, including Priority 4: Improving Data Collection, Analysis, Dissemination, and Program and Policy Evaluation. This priority includes three objectives: 1) develop consistent data collection strategies to identify and track mental health and substance use needs across the nation; 2) ensure that all SAMHSA programs are evaluated in a robust, timely, and high-quality manner; and 3) promote access to and use of the nation’s substance use and mental health data, conduct program and policy evaluations, and use the results to advance the adoption of evidence-based policies, programs, and practices.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement? (Examples: Model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c))
  • The Center for Behavioral Health Statistics and Quality (CBHSQ) oversees data collection initiatives and provides publicly available datasets so that some data can be shared with researchers and other stakeholders while preserving client confidentiality and privacy.
  • SAMHSA’s Substance Abuse and Mental Health Data Archive (SAMHDA) contains substance use disorder and mental illness research data available for restricted and public use. SAMHDA promotes access to and use of SAMHSA’s substance abuse and mental health data by providing public-use data files and documentation for download, along with online analysis tools, to support a better understanding of this critical area of public health.
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information? (Example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)
  • SAMHSA’s Performance Accountability and Reporting System (SPARS) hosts the data entry, technical assistance request, and training system for grantees to report performance data to SAMHSA. SPARS serves as the data repository for the Administration’s three centers: the Center for Substance Abuse Prevention (CSAP), the Center for Mental Health Services (CMHS), and the Center for Substance Abuse Treatment (CSAT). To safeguard confidentiality and privacy, the current data transfer agreement limits the use of grantee data to internal reports: data collected by SAMHSA grantees will not be shared with researchers or stakeholders beyond SAMHSA, and publications based on grantee data are not permitted.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • The Center of Excellence for Protected Health Information (CoE for PHI) is a SAMHSA-funded technical assistance project designed to develop and increase access to simple, clear, and actionable educational resources, training, and technical assistance for consumers and their families, state agencies, and communities to promote patient care while protecting confidentiality.
  • Through SAMHSA’s Substance Abuse and Mental Health Data Archive (SAMHDA) SAMHSA has partnered with the National Center for Health Statistics (NCHS) to host restricted-use National Survey on Drug Use and Health (NSDUH) data at their Federal Statistical Research Data Centers (RDCs). RDCs are secure facilities that provide access to a range of restricted-use microdata for statistical purposes.