Assessing Capacity for Using Data to Build Actionable Evidence: A Compendium of the 2022 Research Symposium

Federal agency officials, especially those in newly created leadership positions established by the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act), are engaging in activities to facilitate building and using evidence to inform decision-making. The Data Foundation’s 2022 Virtual Symposium, in partnership with The George Washington University’s Trachtenberg School of Public Policy and Public Administration, convened experts who described real-world experiences and research projects that focused on efforts to build federal agency evidence-building capacity. In the call for submissions, the Data Foundation asked experts to submit research related to four areas: 

  • Analysis of agency capacity assessments 

  • Reviews of agency evidence-building plans and learning agendas 

  • Workforce for evidence-building activities 

  • Novel uses of data for making decisions and informing policy. 

This compendium offers an overview of the 16 presentations and key takeaways.


Day 1: Understanding Capacity for Evidence-Building Across Government 

The first day focused on efforts to assess evidence-building capacity and establish evidence-building processes. The presenters from both government and non-governmental organizations covered topics including capacity assessments, data silos and data hubs, workforce management, and how agencies can work with contractors and collaborate across different levels of government to build capacity for effective evidence-based policymaking.

Informing Infrastructure Improvement Decision-Making with the Best Available Science

Presented by Ed Kearns, Chief Data Officer, First Street Foundation and former Acting Chief Data Officer at the U.S. Department of Commerce

Risk Factor is a free tool created by the nonprofit First Street Foundation to make it easy to understand risks from a changing environment. Risk Factor (available at: riskfactor.com) allows users to search their address to see the location’s flood, wildfire, or heat risk now and 30 years into the future. The tool can determine climate risk factors down to the level of a street, property, house, or 100-meter segment of roadway. It was developed using an open source hazard model driven by open government data from agencies including the Federal Emergency Management Agency, the National Oceanic and Atmospheric Administration, and the U.S. Geological Survey at the Department of the Interior; First Street Foundation further expanded the tool’s capabilities using outputs from the Intergovernmental Panel on Climate Change (IPCC) climate models as well as spatial and economic analysis models. The project demonstrates how federal government data can enable the development of innovative tools that inform public understanding and decision-making. By embracing open science and bringing together varied data sources, the public and private sectors can take a more climate-informed approach to preparing for the effects of climate change.

Key Takeaways & Recommendations: 

  • Agencies can learn from demonstration projects like First Street’s risk model and leverage the wealth of government data to develop evidence-informed approaches to complex issues such as climate change.

  • Cultivating public-private partnerships is essential. There is a critical need to improve federal government engagement with the private sector to develop information products. 

  • To facilitate greater data capacity within the federal government, officials need to identify existing obstacles to using data, including potential restrictions on agencies that may want to use outside products, and consider incentives for public-private sector collaboration.


An Iterative Approach to Developing, Conducting, and Using the Department of Homeland Security Capacity Assessment

Presented by Rebecca Kruse, Assistant Director for Evaluation, Department of Homeland Security; Brodi Kotila, Senior Political Scientist, RAND Homeland Security Operational Analysis Center (HSOAC); and Coreen Farris, Senior Behavioral Scientist, RAND HSOAC

The Evidence Act requires agencies to assess their capacity to produce and use evidence for better decision-making and policymaking. The Department of Homeland Security (DHS) has conducted two iterations of its capacity assessment since 2020 – an initial assessment in fiscal year 2020, and the official, independent assessment in fiscal year 2021. The presentation covered methodology, findings and recommendations, lessons learned, and an update on DHS evaluation activities since the assessment was published.

Key Takeaways & Recommendations: 

  • Cross-department collaboration is critical for successful capacity assessments.

  • Agencies can foster continued collaboration and learning to strengthen capacity by providing educational resources, hosting workshops and lunch-and-learn sessions, and holding office hours.


Understanding Human Capital Needs for Expanding Data and Evidence Culture Using a Federal Data and Digital Maturity Survey

Presented by Maddie Powder, Research Associate, Partnership for Public Service

The Partnership for Public Service and the Boston Consulting Group conducted a data maturity assessment at six federal agencies using the Federal Data and Digital Maturity Index (FDDMI) survey. To gauge data and digital maturity, respondents rated their agency’s current maturity on a sliding scale of 0 to 100 points, along with its target maturity in five years. Based on the survey responses, agency results were grouped into four categories: starter, literate, performer, and leader. The findings revealed that human capital – a component of the overall maturity score that reflects an agency’s workforce – ranked lower than the overall maturity score. Target maturity scores exceeded current maturity scores for every sub-category, revealing a consistent ambition to improve.
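
As a rough illustration of how a 0-to-100 maturity score can be translated into the four FDDMI categories, the sketch below bins scores into tiers. The cutoff values are hypothetical placeholders, since the survey’s actual thresholds are not described here; the example only shows the mechanics of the categorization.

```python
# Minimal sketch: map a 0-100 maturity score to one of the four FDDMI-style
# categories. The cutoffs below are hypothetical, not the survey's real ones.
def maturity_category(score: float) -> str:
    """Return a named maturity tier for a 0-100 score."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score < 25:
        return "starter"
    if score < 50:
        return "literate"
    if score < 75:
        return "performer"
    return "leader"


# Example: compare an agency's current score with its five-year target.
current_score, target_score = 42, 68  # hypothetical survey responses
print(maturity_category(current_score))  # literate
print(maturity_category(target_score))   # performer
```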

Key Takeaways & Recommendations: 

  • Federal agencies need to retain and recruit a skilled workforce to carry out evidence-building activities.

  • There are several strategies to build a skilled data workforce in the federal government, including: 

    • Utilizing creative hiring authorities;

    • Making government jobs accessible to young people and giving them the space to be creative to support retention;

    • Promoting the government’s mission;

    • Creating experiential onboarding programs; and

    • Investing in the current workforce through upskilling and reskilling programs.


An FAA Experience: Applying Intervention Research as a Change Management Approach to Implement Evidence-Based Management

Presented by Robert Young, Senior Advisor, Strategy, Risk & Engagement, Federal Aviation Administration Security & Hazardous Materials Safety Organization

The Federal Aviation Administration (FAA) Security and Hazardous Materials Safety Organization developed a three-year strategic plan to implement evidence-based management, with the objectives of delivering goods and services with greater rigor and scientific integrity, managing data more effectively, and strengthening evidence-building capabilities. To execute this strategy, the agency used intervention research as a change management approach. The research team engaged participants through a “reflexive experiential lens” to gather a range of perspectives, from the senior executive level to employees, bridging the gap between academic theory and the realities of implementing an evidence-based management system in a federal agency. 

Key Takeaways & Recommendations: 

  • Integrating evidence-based management into organizations requires ongoing, long-term effort. It is important to identify specific, practical examples to demonstrate how evidence-based management can improve an organization’s work. 

  • Trust building is a key component of intervention research. Applying concepts to day-to-day activities is more effective than formal instruction. 


Advancing Equity through Evidence-Building, Data Integration, and Research Partnerships: A Local Government’s View from “The Other Washington”

Presented by Claire Evans, Research Specialist, King County Metro Transit; Truong Hoang, Deputy Regional Administrator, Region 2, Community Services Division, Economic Services Administration, Washington State Department of Social and Health Services; Maria Jimenez-Zepeda, ORCA Reduced Fare Project Program Manager, King County Metro Transit; Christina McHugh, Housing and Adult Services Evaluation Manager, King County Department of Community and Human Services, Performance Measurement and Evaluation Unit; and David Phillips, Associate Research Professor, Wilson Sheehan Lab for Economic Opportunities, University of Notre Dame

Representatives from King County and Washington State highlighted challenges, successes, and lessons learned as their organizations work to adopt and integrate evidence-based policymaking into their programs and day-to-day operations. Panelists described how partnerships and data sharing helped center equity and improve program decisions about transportation assistance and housing for low-income populations. The panelists cited leadership buy-in, evaluation partnerships among local governments, state agencies, and nonprofits, and funding as critical for developing local evaluation capacity. 

Key Takeaways & Recommendations:  

  • Sustained local funding and non-government financial investment are central to improving business processes.  

  • Interdisciplinary partnerships with various levels of government and other external partners enable collaboration across local and state governments and make findings applicable in multiple contexts.

  • Panelists suggested three ways the federal government can help strengthen state and local capacity for data and evidence-building: 

    • Focus investment on bolstering overall local capacity rather than program-specific funding. 

    • Incorporate federal carve-outs to provide sustained overhead to build and modernize data systems. 

    • Improve access to data and dissemination of research, and provide legal clarity and guidance around when and how data should be accessible for research purposes.


Assessing the Quality of Impact Evaluations at USAID

Presented by Irene Velez, Director, Monitoring, Evaluation, Research, Learning, and Adaptation (MERLA), Panagora Group

The United States Agency for International Development’s (USAID) Bureau for Policy, Planning and Learning commissioned a study that assessed the quality of the agency’s impact evaluations from 2011 to 2019. The study was intended to identify the strengths and shortcomings of USAID’s impact evaluation reports and to inform the agency’s update of its guidance for conducting impact evaluations. The research group compared impact evaluation reports against the definition outlined in USAID’s evaluation policy and developed a review instrument to assess their quality. Quality was assessed across six domains: sample size, conceptual framing, treatment characteristics and outcome measurements, data collection and analysis, threats to validity, and reporting. The researchers found that about half of the impact evaluation reports met the quality criteria, that USAID’s existing evaluation guidance was not tailored to impact evaluations, and that the guidance was not well aligned with the organization’s broader scientific research policy.

Key Takeaways & Recommendations:  

  • Third-party reviews can identify misalignment or missed opportunities to build evaluation capacity at an organization. 

  • USAID updated its operational policy by expanding the definition of impact evaluation, specifying the elements required for evaluation reports, and explicitly calling for a cost analysis in impact evaluations – demonstrating how the organization made a decision and took action based on evidence.


Approaches to Assessing Agency Capacity for Evidence-Building

Presented by Tania Alfonso, Senior Evaluation Specialist, USAID; Danielle Berman, Senior Evidence Analyst, Office of Management and Budget (Moderator); Susan Jenkins, Evaluation Officer, Department of Health and Human Services; and Christina Yancey, Chief Evaluation Officer, Department of Labor

Panelists from the Department of Health and Human Services, the Department of Labor, and the U.S. Agency for International Development demonstrated various approaches to conducting capacity assessments by sharing the models their agencies used to complete department-wide assessments. 

Agencies tailored the capacity assessment to fit their unique organizational needs. Some utilized an evaluation council framework and others started from a more structured, centralized evaluation authority. Based on the first agency capacity assessments, the evaluation officers are planning activities to address capacity needs. Future efforts include developing a four-year plan with each year focusing on a different category of capacity, providing training and tools, focusing on knowledge management, and improving communication regarding the value of evidence.

Key Takeaways & Recommendations:  

  • Agency leaders should emphasize both evidence producers and evidence users when communicating the value of capacity assessments.

  • Frame capacity assessment conversations to emphasize continuous learning. Identify achievable short, medium, and long-term goals, and recognize agency strengths and promising practices.

  • Use maturity models to set goals and measure progress. 

  • Allow progress to reflect department-wide capacity strengths and weaknesses while also incorporating the nuances of different sub-agencies and offices. 


Opportunity for Partnership: A Budget and Program Perspective on the Learning Agenda and Evidence-Building Activities 

Presented by Courtney Timberlake, President, American Association for Budget and Program Analysis (AABPA); Ed Brigham, Executive Consultant, Federal Consulting Alliance & AABPA Board Member; Jon Stehle, Councilmember, City of Fairfax VA; and Darreisha Bates, Federal Portfolio Manager, Tyler Technologies & Former Director of Intergovernmental Relations, U.S. Government Accountability Office

The panel covered the evolution of the budget process since analysts adopted more advanced technology and discussed how to improve upon that recent progress going forward. The panel divided data use in the budgeting process into two areas: how computerization and data have made the federal budget more accurate, and where and how data can be used to support budget and policy decisions. 

Key Takeaways & Recommendations: 

  • Since the enactment of milestone data laws like the Digital Accountability and Transparency Act of 2014 (DATA Act) and the Evidence Act, the budget process has become more focused on data quality and reporting. 

  • Executive Branch leaders need to continue communicating the value of data, collaboration, and training to improve data use in the budget process and to inform programmatic decisions.

  • Federal agencies need to recruit and train data talent to improve data generation, communication, and use.


Day 2: Innovations in Evidence-Building Activities

The second day of the symposium focused on innovation. Presentations offered a glimpse into areas where government agencies and other organizations have expanded their capacity to build and use evidence – and the strategies, tools, and methods that have enabled them to do so. Panelists’ innovative approaches to evidence across multiple issue areas emphasized the importance of fostering partnerships, leveraging existing data sources, and the role of organizational leadership.


Education Research-Practice Partnerships: Innovative Structures to Build and Use Evidence

Presented by Rachel Anderson, Director, Policy and Research Strategy, Data Quality Campaign

State and local agency leaders have finite time, resources, capacity, and expertise to interpret data and conduct research. Research-practice partnerships can address these challenges by engaging education researchers and practitioners in long-term collaborations to develop research questions relevant to policy or practice, collect and analyze data, and interpret research findings. As an example, the presenter illustrated how research-practice partnerships can alert practitioners when students are at risk of not succeeding in school. 

Key Takeaways & Recommendations: 

  • Research-practice partnerships can expand internal agency capacity by supporting research that may not otherwise be conducted; improving agency capacity to utilize outside research; and building relationships between agencies, researchers, and practitioners.


Statewide Longitudinal Data Systems and Predictive Analytics: Understanding, Measuring, and Predicting K-12 Outcomes

Presented by Nancy Sharkey, Senior Program Officer, Statewide Longitudinal Data Systems Grant Program, National Center for Education Statistics; Robin Clausen, Stakeholder Liaison and Research Analyst, Statewide Longitudinal Data System, Montana Office of Public Instruction; Chris Stoddard, Professor, Montana State University; Mathew Uretsky, Professor, Portland State University; and Angie Henneberger, Research Assistant Professor, University of Maryland School of Social Work

The panel gave an overview of the Statewide Longitudinal Data Systems (SLDS) program with examples from several states. The SLDS program, authorized in 2002 by the Education Sciences Reform Act and the Educational Technical Assistance Act, supports evidence-building and research through cooperative agreements between the federal government and states, with grants administered by the Institute of Education Sciences at the Department of Education. To date, participating states and territories have received around $800 million over seven rounds of grants. Representatives from Montana and Maryland offered examples of how the SLDS program has strengthened their data collection systems and research. 

Key Takeaways & Recommendations: 

  • The SLDS program allows participating states to increase capacity to collect education data, conduct important research on student outcomes over time, and improve their data infrastructure. 

  • Through SLDS, Montana provides about 50 different reports and dashboards spanning school finance, student enrollment, dropouts, graduation, and student achievement. 

  • SLDS funding has enabled Maryland to collect and link data spanning K-12 education, higher education, workforce, occupational licensing, child welfare, and juvenile delinquency records. Maryland’s system provides a rich source of data for practitioners and researchers to trace students and young people through their life trajectories and understand the many influences on their lives. 


Using Linked Administrative Data to Connect Families to Pandemic Stimulus Payments

Presented by Aparna Ramesh, Senior Research Manager, California Policy Lab, UC Berkeley

The California Policy Lab partnered with the California Department of Social Services, the state’s tax agency, and a research center at Harvard University to identify Californians who were eligible for the Child Tax Credit or federal stimulus payments but had not yet received them. 

The research team hypothesized that combining tax filing data with Supplemental Nutrition Assistance Program enrollment data could reveal what share of low-income Californians receiving social services had not filed returns but qualified for tax credits or stimulus payments. The team identified multiple data sources containing eligibility information, including tax return data and enrollment data for low-income social programs. To address the privacy and legal concerns involved in linking this administrative data, both agencies signed separate data use agreements and the California Policy Lab served as a trusted third party to link the state tax and social services data. 

Researchers found that using tax data helped deliver stimulus assistance quickly to most low-income families. Most families receiving social services file a return and automatically receive these credits. Among adults who didn’t receive the federal stimulus, the vast majority were ineligible because they did not have dependents. And of the children who did not receive the credit, most lived with a single adult and a quarter lived in a household with no adult. 

Key Takeaways & Recommendations:

  • The California Policy Lab protected personal identifiers using an innovative process called hashing – a one-way algorithm that transforms numbers and characters into values that cannot practically be reversed – so records could be linked without exposing the underlying identifiers (a minimal sketch follows this list). 

  • This research shows how engaging various agencies and linking administrative data create a cycle of evidence use. The shared data allowed researchers to identify missing stimulus payments, test solutions, and then monitor progress and payments for all eligible Californians. 
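
A minimal illustration of keyed hashing for privacy-preserving record linkage appears below. The record sets, field names, and secret key are hypothetical, and this is a sketch of the general technique rather than the California Policy Lab’s actual implementation.

```python
import hashlib
import hmac

# Hypothetical secret key held only by the trusted third party performing the
# linkage. Keyed hashing (HMAC) keeps outside parties from re-identifying
# records by hashing guessed identifiers themselves.
LINKAGE_KEY = b"replace-with-a-long-random-secret"


def linkage_token(name: str, dob: str) -> str:
    """Derive a one-way linkage token from normalized identifying fields."""
    normalized = f"{name.strip().lower()}|{dob.strip()}"
    return hmac.new(LINKAGE_KEY, normalized.encode("utf-8"), hashlib.sha256).hexdigest()


# Hypothetical records from two separate agencies.
tax_records = [{"name": "Jane Doe", "dob": "1985-02-17", "filed_return": False}]
snap_records = [{"name": "JANE DOE", "dob": "1985-02-17", "enrolled": True}]

# Index one agency's records by token so the linker never compares raw names
# or birth dates directly.
tax_by_token = {linkage_token(r["name"], r["dob"]): r for r in tax_records}

for record in snap_records:
    match = tax_by_token.get(linkage_token(record["name"], record["dob"]))
    if match is not None and not match["filed_return"]:
        print("SNAP enrollee appears not to have filed a tax return")
```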


Using a Framework for Evidence Capacity to Strengthen Federal Program Offices

Presented by Heather Gordon, Managing Consultant, Mathematica

The Evidence Capacity Support Project is a partnership between the Office of Planning, Research, and Evaluation (OPRE) in the Department of Health and Human Services’ Administration for Children and Families (ACF) and Mathematica to deepen evidence capacity. Under this partnership, Mathematica supports staff in ACF program offices as they build and use evidence in their work. To inform the initiative, Mathematica reviewed existing research literature and ACF documentation and conducted interviews with program staff to develop a framework for evidence capacity.

Mathematica’s framework has identified and defined five dimensions of evidence capacity, each of which includes related inputs, outputs, or activities that are involved in evidence capacity-building. The framework can be used by other federal agencies and organizations interested in assessing their capacity to build evidence and apply it to their programs. 

Key Takeaways & Recommendations:

  • Evidence capacity encompasses the knowledge, skills, behaviors, and resources that support an organization’s ability to build and use evidence to make decisions.

  • Defining evidence capacity and documenting the array of activities involved in building and using evidence can help ensure that the Evidence Act is implemented consistently and with a common language. 

  • Organization-level support is critical for building evidence capacity. 


Critical Factors for Building Successful Data Science Teams

Presented by Robin Wagner, Senior Advisor, Epidemiology Branch, Division of Cardiovascular Sciences, National Heart, Lung, and Blood Institute, National Institutes of Health

The Health and Human Services Data Council established the Data-Oriented Workforce Subcommittee (DOWS) to implement the workforce priority of the 2018 HHS Data Strategy and to develop a plan to enhance the department’s data science capacity. The subcommittee conducted two research projects that identified training opportunities for existing staff, recruitment strategies and tools for new staff, and retention and succession planning strategies. The DOWS research produced several recommendations regarding skills and competencies of data science teams.

Key Takeaways & Recommendations: 

  • When structuring a data science team, it’s important to:

    • Distribute skills across team members;

    • Regularly measure and upgrade the skills within the data science teams; and

    • Ensure team members have diverse backgrounds and understand how data ethics fits into the work.

  • Data science team managers need to be data literate themselves, with strong business skills and an awareness of the broader organizational goals. The manager must also advocate for their team to upper management.

  • To sustain a data science team, agencies must provide challenging and mission-oriented work to attract and retain talented data scientists. 

  • Data-driven culture at an agency requires buy-in from executive leadership who agree to welcome new data, scientists, ideas, and technologies; increase data literacy; and facilitate communication with stakeholders. 


Advocating for and Applying COVID-19 Equity Data: The Black Equity Coalition’s (Pittsburgh, PA) Efforts to Improve Public Sector Health Agencies’ Practices

Presented by Jason Beery, Director of Applied Research, UrbanKind Institute & Member, Black Equity Coalition Data Working Group; Ashley Hill, DrPH, Assistant Professor, Department of Epidemiology, School of Public Health, University of Pittsburgh & Member, Black Equity Coalition Community Health Working Group; Ruth Howze, Community Coordinator, Black Equity Coalition; and Stacey Pharrams, Community Researcher, Healthy Start Initiative & Member, Black Equity Coalition Data Working Group

Representatives from the Black Equity Coalition (BEC) discussed their efforts in Allegheny County, Pennsylvania to use data to address the disparate impact of the COVID-19 pandemic on the Black community. BEC had three main goals: to improve 1) access to testing, 2) data quality (for both testing and case data), and 3) vaccine uptake in Black communities.

BEC accomplished its goals by engaging stakeholders, including community members, public officials, and health departments, to discuss COVID-related data quality improvements. BEC also focused on collecting demographic data to understand how COVID impacts differed by race and ethnicity. Linking COVID data to demographic data, BEC built maps and a dashboard to present local officials with evidence of disparate outcomes by race, age, and other demographic characteristics.
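
As a general illustration of this kind of linkage (not BEC’s actual pipeline), the sketch below joins hypothetical case counts with census-style demographic data by ZIP code and computes case rates per 1,000 residents so that outcomes can be compared across groups.

```python
import pandas as pd

# Hypothetical COVID-19 case counts reported by ZIP code and race.
cases = pd.DataFrame({
    "zip": ["15213", "15213", "15219", "15219"],
    "race": ["Black", "White", "Black", "White"],
    "cases": [120, 310, 95, 150],
})

# Hypothetical population denominators from census-style demographic data.
population = pd.DataFrame({
    "zip": ["15213", "15213", "15219", "15219"],
    "race": ["Black", "White", "Black", "White"],
    "population": [4500, 21000, 6100, 8200],
})

# Link the two sources on geography and race, then compute rates per 1,000
# residents so disparities show up in rates rather than raw counts.
linked = cases.merge(population, on=["zip", "race"])
linked["cases_per_1000"] = 1000 * linked["cases"] / linked["population"]
print(linked.sort_values("cases_per_1000", ascending=False))
```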

Key Takeaways & Recommendations: 

  • Engaging stakeholders can help improve data quality. 

  • Collecting, sharing, and linking COVID data with demographic data enabled BEC to develop useful evidence for policy change.


A Dynamic, Inclusive Approach to Learning Agenda Development for the Centers for Disease Control and Prevention’s Center for State, Tribal, Local, and Territorial Support: Reflections on the Participant Engagement Process

Presented by Elizabeth Douglas, Senior Manager, ICF and Jessie Rouder, Lead Research Scientist, Behavioral Health, ICF

Representatives from the Center for State, Tribal, Local, and Territorial Support (CSTLTS) shared their experience as one of the first Centers for Disease Control and Prevention (CDC) centers to develop a learning agenda, describing the process and lessons learned. CSTLTS presented the learning agenda as an opportunity to align efforts with other HHS, CDC, and center-wide strategic planning processes.

The center took a phased approach to learning agenda development, in part to ensure robust engagement. The team engaged appropriate participants, identified key learning agenda questions, and then designed evidence-building activities to answer priority questions. The presentation highlighted strategies for stakeholder engagement, including orientation meetings, virtual workshops, and consultations with leadership. 

Key Takeaways & Recommendations: 

  • CSTLTS identified four key lessons learned:

    • Consider stakeholders’ capacity to participate in the necessary engagement activities.

    • Emphasize the nature of a learning agenda question and demonstrate how such questions may differ from other program planning questions.

    • Secure both public health system leaders and subject matter experts as stakeholders. 

    • Focus on only one strategic priority at a time. 


Best Practices for Monitoring and Evaluating the ARP, IIJA, and Other Programs: Report of the Department of Commerce Data Governance Working Group

Presented by Carla Medalia, Assistant Division Chief for Business Development, Economic Reimbursable Surveys Division, U.S. Census Bureau; Ron Jarmin, Deputy Director, U.S. Census Bureau; Ryan Smith, Policy Advisor for the Office of Regional Affairs, Economic Development Administration; and Oliver Wise, Chief Data Officer, Department of Commerce

The Department of Commerce received an influx of funding from the American Rescue Plan (ARP) and the Infrastructure Investment and Jobs Act (IIJA), significantly increasing its annual appropriations and opening opportunities to demonstrate value to taxpayers. To maximize the department’s ability to leverage program data and improve program outcomes, the department formed a data governance working group composed of bureaus that manage or receive ARP or IIJA funding. The working group identified three phases of collecting and using program performance data: 

  • Developing a shared structure and quality standards to facilitate linkage across bureaus;

  • Developing common metadata standards to ensure departmental interoperability; and

  • Identifying and overcoming barriers to implementing the identified systems and standards to ensure data use. 

Data linkage with Census Bureau-collected survey and administrative data expanded the department’s capacity to understand how well recipients are using funding to carry out programs and achieve intended outcomes. 
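
A minimal sketch of what a shared record structure supporting cross-bureau interoperability might look like follows. The `ProgramAward` class and its fields are hypothetical illustrations of the idea of common structure and metadata standards, not the working group’s actual schema.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class ProgramAward:
    """Hypothetical shared record structure for ARP/IIJA program data.

    Common field names and types across bureaus make it easier to link award
    records to Census Bureau survey and administrative data later on.
    """
    bureau: str            # reporting bureau code
    program: str           # program funded under ARP or IIJA
    recipient_id: str      # recipient's unique entity identifier
    award_amount: float    # obligated dollars
    period_start: date
    period_end: date
    outcome_metric: str    # metric the bureau reports against


# Example record expressed in the shared structure (all values illustrative).
award = ProgramAward(
    bureau="EDA",
    program="Example Infrastructure Program",
    recipient_id="ABC123DEF456",
    award_amount=25_000_000.0,
    period_start=date(2022, 1, 1),
    period_end=date(2026, 12, 31),
    outcome_metric="jobs_created",
)
print(award.bureau, award.award_amount)
```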

Key Takeaways & Recommendations:

  • The Census Bureau provides existing data linkage infrastructure and secure data governance practices that can inform and support efforts to leverage data across the Commerce Department and the federal government.

  • Agencies should leverage existing survey and administrative data when developing and executing program evaluation plans to enable large-scale observations and consistent evaluations across agencies.


Capacity-Building Going Forward

Forty speakers presented 16 different programs and projects over the course of the Data Foundation’s two-day event. The presentations highlighted innovative approaches to improve government data collection, governance, and use. Offering insight into strategies for planning and carrying out agency-wide capacity assessments and implementing sustainable changes to address agency data capacity, the symposium identified four core lessons:

  • Communicating the value of data and evidence is key to carrying out successful and useful capacity assessments and ensuring engagement across the organization. 

  • Funding – whether from federal grants or state and local budgets – must enable agencies at all levels to become self-sufficient. Investing in data infrastructure and staff training builds sustainable data capabilities that agencies can bring to future program and policy decisions. 

  • Engaging a wide range of stakeholders fosters innovation, allowing different data sources, technologies, and ideas to come together and develop new frameworks and approaches to building data capacity and evidence use. 

  • Organizational collaboration and data linkage have great potential to build data infrastructure that streamlines analysis and clarifies where a program may be falling short of its goals. 

The symposium coincided with the release of many federal agencies’ self-assessments of evidence-building capabilities and activities, including capacity assessments and learning agendas – all requirements of the Evidence Act. 

The Data Foundation hopes this research will highlight the value of data and evidence for policymaking and encourage continued collaboration and learning between agencies and non-governmental organizations to support successful Evidence Act implementation.