Recap: Evaluation and Auditing – Lessons from Oversight of DATA Act Implementation

In 2014, Congress passed the bipartisan Digital Accountability and Transparency Act, or DATA Act, to improve federal reporting of spending information and to provide greater transparency to the American public about how taxpayer dollars are used. The law included provisions for the U.S. Government Accountability Office (GAO) and Inspectors General (IG) to evaluate how well agencies complied with the law’s requirements, including the extent to which agencies adhered to data standards and submitted quality data over time.

Agencies across the federal government have been implementing the DATA Act for nearly a decade, improving the reliability and quality of spending data. The information is also published on USAspending.gov to allow greater insights into how the government is spending taxpayer dollars. With the implementation efforts came oversight and evaluation from GAO and the IG community. 

Auditors and evaluators from GAO joined the Data Foundation in November 2022 as part of the Issues in Evaluation webinar series to share lessons learned while auditing and evaluating the implementation of the DATA Act. Speakers discussed collaboration with the oversight community, barriers to accurate spending data, and the importance of real-time audits of government-wide legislation.

Opening the webinar, Kathleen Drennan, Ph.D., Assistant Director of Strategic Issues at GAO, discussed how her office's audits changed over the years since the DATA Act was passed, shifting from initially relying on data sampled by the IGs to later using the more complete dataset available on USAspending.gov. Peter Del Toro, Assistant Director of Strategic Issues, specified three changes in GAO's auditing methods for the DATA Act:

  1. GAO opted for a flexible real-time auditing structure, conducting capacity assessments and readiness reviews before implementation even began.

  2. GAO actively collaborated across the oversight community on a regular basis, with frequent meetings and consistent communication.

  3. Evaluators and auditors considered the end-users of the spending data and what they would need to make the best use of the data.

Maria Belaval, Senior Auditor for Financial Management and Assurance at GAO, then highlighted the findings of GAO's most recent report summarizing the IGs' reports on spending data quality, as required by the DATA Act. The report identified discrepancies between individual award data and summary data from IGs, the need for better data linkage to track the entire spending life cycle from appropriations to award disbursement, and difficulties agencies had reporting accurate COVID-19 spending outlays.

The webinar concluded with a moderated discussion between Data Foundation President Nick Hart, Ph.D. and GAO representatives, including GAO Center for Evaluation Methods and Issues Assistant Director Terell Lasane, Ph.D. Each speaker expanded on their experiences as auditors and evaluators throughout the implementation process of the DATA Act. The sections below describe key lessons and practices GAO developed during that process.

Collaboration

Regular meetings with the inspectors general were crucial for standardizing GAO's reporting procedures and drawing valid results from DATA Act implementation oversight. "There really was a coming-together to achieve the broader goal of providing effective oversight of a complex process," Mr. Del Toro said of standardizing methods through collaboration to make government-wide oversight possible. The IG community also gave GAO access to review and provide input on draft products, reinforcing that collaborative atmosphere.

Agile Auditing

Agile auditing, according to Mr. Del Toro, is the process of evaluating government practices in real time so that agencies can adapt to new circumstances and change policies based on evidence. GAO used this technique while auditing DATA Act implementation.

Dr. Lasane explained the role of the GAO Yellow Book as a set of government auditing standards that ensure GAO ties audits and evaluations to specific goals and questions, and also stressed that GAO is always looking for newer and more effective auditing strategies. 

DATA Act in Action

To illustrate the real-world applications of data produced under the DATA Act, Ms. Belaval laid out two examples from recent years:

  1. GAO uses USAspending.gov and corroborates the data with agencies to assess agency spending patterns.

  2. The Pandemic Response Accountability Committee (PRAC)—created to oversee spending from the Coronavirus Aid, Relief, and Economic Security Act (CARES Act)—uses USAspending.gov data to monitor COVID-19 spending and ensure agencies are using funds as intended.
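The same award-level data that GAO and the PRAC rely on is available to the public through the USAspending.gov API. As a minimal sketch of how a data user might assemble such a query — assuming the publicly documented v2 `spending_by_award` endpoint, with illustrative filter values that are not drawn from the webinar itself — the payload below requests contract awards for one agency in a single fiscal year:

```python
import json

# Illustrative example only: endpoint and filter names reflect the
# publicly documented USAspending.gov v2 API (api.usaspending.gov),
# but the agency and fiscal year are hypothetical placeholders.
API_URL = "https://api.usaspending.gov/api/v2/search/spending_by_award/"

def build_award_query(agency_name: str, fiscal_year: int, limit: int = 10) -> dict:
    """Assemble a request payload for award-level spending data.

    The federal fiscal year runs October 1 through September 30,
    so FY2023 spans 2022-10-01 to 2023-09-30.
    """
    return {
        "filters": {
            "agencies": [
                {"type": "awarding", "tier": "toptier", "name": agency_name}
            ],
            "time_period": [
                {
                    "start_date": f"{fiscal_year - 1}-10-01",
                    "end_date": f"{fiscal_year}-09-30",
                }
            ],
            # Award type codes A-D cover contract awards.
            "award_type_codes": ["A", "B", "C", "D"],
        },
        "fields": ["Award ID", "Recipient Name", "Award Amount"],
        "limit": limit,
    }

if __name__ == "__main__":
    query = build_award_query("Department of Education", 2023)
    print(json.dumps(query, indent=2))
    # This payload would then be POSTed to API_URL, e.g. with
    # requests.post(API_URL, json=query).
```

Note the caveats the speakers raised apply here as well: results returned by the API inherit the validation limitations of the underlying agency submissions.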

Limitations

While spending data is increasingly available because of the DATA Act, the data has some limitations. Ms. Belaval explained that not all spending data is audited for accuracy. Dr. Drennan cautioned data users that sub-award data is updated live on USAspending.gov often without real-time validation. Likewise, Mr. Del Toro suggested that the limitations of the data are not clearly disclosed or documented on USAspending.gov. 

The speakers remarked on how the lessons learned from the DATA Act carry implications for implementing other government-wide data laws, such as the Grant Reporting Efficiency and Agreements Transparency Act (GREAT Act) and the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act). Congress included collaboration provisions in these laws, and the practices developed during DATA Act oversight offer a model for how effective collaboration on auditing and evaluation can be applied in the future.

Insights into the implementation of these laws—including the DATA Act—from GAO, IGs, and the evaluation community as a whole provide immense value to agencies and the public by sharing examples of how data is being used. Data Foundation President Nick Hart closed the webinar with remarks on the importance of the evaluation and auditing communities. “The data are never perfect,” he said. “We will always find gaps and things that need to be improved, but until we actually start using the data, we won’t see those gaps and we can’t make improvements.” 

JOE ILARDI is a project associate at the Data Foundation.