Appendices

Appendix A: Capacity Assessment for Evaluation, Statistics, Research, and Analysis - FY2022

The Capacity Assessment considers the Department’s capacity for building and using evidence to support decisions about programs and policies – and, in particular, our capacity for evaluation, statistics, research, and analysis.  It is a requirement of the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act) and is a joint effort of the Department’s Evaluation Officer, Statistical Official, and Chief Data Officer.

Introduction

The Capacity Assessment is intended to:

  • Describe current capacity: We take stock of the Department’s current strengths and resources, as a basis for documenting future capacity building.
  • Identify areas for growth: We identify opportunities for expanding or enriching capacity for evidence building to support decisionmaking about programs and policies.

Some of the Department's current capacity for statistics, research, and analysis is dedicated to operational ends, such as specific investigations or cases.  This Capacity Assessment focuses instead on evidence building to support higher-level decisionmaking – for example, policymaking, program design, and strategic decisionmaking about objectives and priorities.  In some cases, the same capacity supports both types of work; in other cases, dedicated capacity has been established or might be needed for the latter type of evidence building.

In the course of this Capacity Assessment, we gathered detailed information from across the Department about current strengths and potential areas for growth.  In this report, we offer general descriptions and highlight important themes.

Approach and Methods

Because the Department comprises a large number of components that vary greatly in mission, size, organizational complexity, and evidence needs, our approach needed to balance consistency and specificity.  To strike that balance, we conducted a series of semi-structured interviews with select individuals from 15 components.  These components represented a cross-section of the Department in terms of mission type (for example, litigation, law enforcement, and grantmaking) and representation in the learning agenda. They also included the largest components, collectively accounting for 89% of the Department’s appropriations in FY 2021 and over 95% of staff as of January 2022.

Semi-structured interviews enabled us to learn in depth about pockets of evidence-building capacity and the detailed needs of different components.  They were conducted by a team representing the Department’s Evaluation Officer, Statistical Official, and Chief Data Officer, which allowed us to cover a range of evidence-building methods with appropriate technical sophistication.  The interviews were structured around three broad topics: current capacity; needs and areas for growth; and opportunities for expanding or enhancing capacity in the longer term.

In the future, the Department intends to broaden and deepen its capacity assessment by covering more components, better assessing the use of evidence in routine decisionmaking, and tracking growth using quantitative metrics.

Participating components: 

  • Bureau of Alcohol, Tobacco, Firearms, and Explosives (ATF)
  • Civil Division (CIV)
  • Civil Rights Division (CRT)
  • Community Oriented Policing Services (COPS)
  • Community Relations Service (CRS)
  • Criminal Division (CRM)
  • Drug Enforcement Administration (DEA)
  • Executive Office for Immigration Review (EOIR)
  • Federal Bureau of Investigation (FBI)
  • Federal Bureau of Prisons (BOP)
  • Office of Justice Programs (OJP)
  • Office of Legal Policy (OLP)
  • Office on Violence Against Women (OVW)
  • U.S. Attorney’s Offices (USAO)
  • U.S. Marshals Service (USMS) 

Findings: Current Capacity for Evidence Building

The Department’s components showed a wide range of current capacity for evaluation, statistics, research, and analysis. Here we offer a high-level characterization of the Department's diverse and distributed capabilities, and we highlight themes that emerged in the course of our interviews.

Data Analysis
Many components have teams with expertise and capacity for data analysis.  These teams vary in size, purpose, and capacity for specific statistical or other types of analysis. Some teams are designed to summarize data on a variety of topics for their components’ leadership, while others are designed for more intensive analysis on specific topics (for example, healthcare fraud or criminology).

Several components have also established cross-cutting initiatives, such as communities of practice, to bolster the use of data.  Such initiatives facilitate knowledge sharing and promote standardized processes.  The Office of the Chief Information Officer (OCIO) has organized similar efforts at the Department level and has helped leverage resources across components to support innovative uses of data.  The Department has also made substantial progress on open data efforts and on establishing data management building blocks (including a data strategy, a data governance board charter, and a data catalog).
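
A data catalog is easiest to picture as a set of structured records describing the Department's datasets.  The sketch below shows one hypothetical shape such a record might take; the field names and the example entry are illustrative assumptions, not the Department's actual schema.

    from dataclasses import dataclass, field

    # Hypothetical shape of a data catalog record.  Field names and the example
    # entry are illustrative assumptions, not the Department's actual schema.
    @dataclass
    class CatalogRecord:
        dataset_id: str          # unique identifier within the catalog
        title: str               # human-readable dataset name
        steward_component: str   # component responsible for the dataset
        update_frequency: str    # e.g., "annual", "monthly"
        access_level: str        # e.g., "public", "internal"
        keywords: list[str] = field(default_factory=list)

    # Example entry for a public, annually updated statistical series.
    record = CatalogRecord(
        dataset_id="example-ncvs-2021",
        title="National Crime Victimization Survey, 2021 (example entry)",
        steward_component="BJS",
        update_frequency="annual",
        access_level="public",
        keywords=["victimization", "survey", "statistics"],
    )
    print(f"{record.dataset_id}: {record.title}")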

Components generally can access the data they require from other components and other federal agencies, though securing this access can sometimes require significant time and effort.  At the same time, components often lack the staff bandwidth to explore innovative uses of data from new sources while also pursuing their primary missions.

Evaluation
Components also vary widely in their capacity for evaluation.  The National Institute of Justice (NIJ), in the Office of Justice Programs, serves as the principal research and evaluation arm of DOJ.  NIJ funds and conducts evaluations principally through grants, contracts, and cooperative agreements.  Most evaluations are of grant-funded programs administered by state, local, Tribal, and other partners, though some are of programs initiated or administered by the Department.  Other components also report capacity for evaluation, including the Bureau of Prisons, the Community Relations Service, the Drug Enforcement Administration, and the Office on Violence Against Women.

Statistics
DOJ’s principal statistical capacity resides in the Bureau of Justice Statistics (BJS), also in OJP, which collects, analyzes, publishes, and disseminates information on crime, criminal offenders, victims of crime, and the operation of justice systems at all levels of government.  BJS is one of 13 principal statistical agencies within the executive branch whose activities primarily involve the collection, compilation, processing, or analysis of information for statistical purposes.  BJS provides statistical support to a wide range of stakeholders both within and outside the Department, including by hosting analytic tools on its website; publishing statistical reports; providing data; and advising on statistical techniques, data quality, and data access.

Many other components have statistical functions or programs and have pockets of statistical expertise that vary in size and purpose.  These range from statistical teams that are positioned to provide analytical support to their component’s leadership to individual statisticians who provide support on specific issues or topics.

Other Research and Analysis
Expertise in many types of research and analysis exists across the Department, ranging from primary evidence-building activities to higher-level synthesis, interpretation, and translation of research findings into policy or program design.  For example, NIJ maintains a central resource, called CrimeSolutions, to help practitioners and policymakers understand what works in justice-related programs and practices.  Its purpose is to assist in practical decisionmaking and program implementation by gathering information on justice-related programs and practices, reviewing evaluation and meta-analysis research against standard criteria, and displaying information in accessible formats.  Many components also reported relying on contractor support for various types of research and analysis.

Findings: Challenges, Solutions, and Areas for Growth

Interviews yielded a number of valuable insights regarding challenges, solutions, and other opportunities to expand the Department’s capacity for evidence building. These fell into six broad categories:

Consolidating and sharing data across components
Though many components reported successfully accessing and using data from other components, some also saw opportunities for greater sharing and consolidation of data across the Department.  One example involved consolidating case data into a single case management system for the Department’s litigating components.  Another involved consolidating and standardizing data collected through the 94 U.S. Attorney’s Offices; such data is currently compiled through ad hoc requests, and data quality is sometimes limited by inconsistencies across offices in how the data is generated.

Leveraging data from sources outside the Department
Though many components reported success in acquiring data from sources outside the Department, some also reported limitations on such acquisition.  Data sources that were mentioned include other federal agencies, as well as state, local, and Tribal sources.

Facilitating the use of data through technology and training
Several components saw opportunities to promote and facilitate the use of data by the Department’s attorneys and other non-technical staff.  Three solutions were mentioned:

  • Interactive reporting tools can make data more accessible for non-technical users.  For example, NIJ is currently developing an interface that will allow non-technical users to generate and view reports based on the FBI’s Uniform Crime Reporting data.  Multiple components expressed an interest in developing tools of this sort for other datasets (an illustrative sketch follows this list).
  • Dashboards, or reporting tools that are designed around frequently requested statistics or summaries of data, can also make data more accessible and interpretable for non-technical users.
  • Regular trainings can promote data literacy and statistical literacy for non-technical staff.
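
To make the first two items concrete, the sketch below shows the kind of summary logic an interactive reporting tool or dashboard might wrap in a point-and-click interface: it totals a hypothetical CSV of incident counts by year and prints a plain-language report.  The file name, column names, and statistic are assumptions for illustration; this is not the NIJ interface described above.

    import csv
    from collections import defaultdict

    # Minimal sketch of a reporting tool for non-technical users: aggregate a
    # hypothetical CSV with "year,offense,count" columns (an assumption for
    # illustration) and print a plain-language summary by year.
    def summarize(path: str) -> None:
        totals: dict[str, int] = defaultdict(int)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                totals[row["year"]] += int(row["count"])
        for year in sorted(totals):
            print(f"{year}: {totals[year]:,} reported incidents")

    if __name__ == "__main__":
        summarize("incidents.csv")  # hypothetical input file

A dashboard would typically pre-compute summaries of this kind for frequently requested statistics and present them graphically.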

Promoting evaluation
Several components saw opportunities to promote and facilitate rigorous program evaluation.  These opportunities fell into three categories:

  • Several of DOJ’s grantmaking offices reported that inflexibilities in funding sometimes prevented them from conducting evaluations or focusing on the highest-priority evaluations.  Removing constraints on funding could enable more targeted evaluations to identify effective interventions in specific contexts (as opposed to broader evaluations of entire programs).  Dedicated funds or carveouts for evaluation would enable evidence building alongside program operations.
  • Several components mentioned that administrative data – and, in particular, grant performance data – could be valuable for evaluation purposes but was limited both in content and in quality.  A more flexible system for collecting such data would enable the Department to collect information more strategically for evaluation purposes.
  • In addition to tracking quantitative measures of program performance, the Department might place greater emphasis on gathering qualitative information about program implementation.  Rigorous process/implementation evaluations can lead to significant improvements in program operations.

Hiring and training for greater technical capacity
Several components expressed interest in expanding their staffs’ technical skills, especially in the area of data analysis.  This can be done through hiring new talent, though components indicated that hiring front-line mission staff often takes precedence over recruiting for technical skills.  Some components reported success in training existing staff into newly designated technical positions (for example, data scientist and data analyst positions).

Modernizing technological solutions for collection, storage, and analysis of data
Legacy information technology systems are a significant barrier to evidence building.  Several components reported that potentially useful data is trapped in unusable forms such as word processing documents.  Other components reported that outdated data systems made responding to new information requests difficult and resource intensive.  Upgraded technological solutions for data collection, data storage, and data analysis could have substantial value.
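
As one illustration of what freeing such trapped data can involve, the sketch below copies every table in a word processing document into a single CSV file using the third-party python-docx package.  The file names and table layout are hypothetical, and real legacy documents would typically require additional validation and cleanup.

    import csv
    from docx import Document  # third-party package: python-docx

    # Sketch: pull every table out of a .docx file into one CSV so the data can
    # be analyzed rather than staying trapped in the document.  The input file
    # and its table layout are hypothetical.
    def extract_tables(docx_path: str, csv_path: str) -> None:
        doc = Document(docx_path)
        with open(csv_path, "w", newline="") as out:
            writer = csv.writer(out)
            for table in doc.tables:
                for row in table.rows:
                    writer.writerow(cell.text.strip() for cell in row.cells)

    if __name__ == "__main__":
        extract_tables("cases.docx", "cases.csv")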

Conclusion

The Department has a wide range of capacities for evidence building and evidence use, which is to be expected given its size, organizational complexity, and the diverse missions represented among its many components.  In addition to the Department’s principal evidence-building units – NIJ and BJS – teams and individuals with capacity for data analysis, evaluation, statistics, and other forms of research and analysis are distributed across the Department.  In-depth interviews revealed strengths and successes – in particular, recent cross-cutting initiatives to expand capacity for data analysis – as well as many valuable insights regarding challenges, potential solutions, and other opportunities to expand the Department’s capacity for building and using evidence.  Broadly speaking, areas for growth involved:

  • Consolidating and sharing data across components
  • Leveraging data from sources outside the Department
  • Facilitating the use of data through technology and training
  • Promoting evaluation through changes in funding, improvements in data quality, and a greater emphasis on evidence building alongside program operations
  • Hiring and training for greater technical capacity, and providing additional staff or contractual support for implementing new evidence-building activities and Evidence Act requirements
  • Modernizing technological solutions for collection, storage, and analysis of data