Executive Summary

This paper proposes frameworks for understanding the data standardization effort that is the focus of the Financial Data Transparency Act (FDTA). It recognizes that both the function of data standards and their computer implementation (i.e., the expression of a data standard in computer code and data-encoding software) have evolved, and that addressing this evolution requires a modern data standard to express both the data and the authoritative definition of the data standard as machine-readable semantic data.

In covering these topics, this paper aims to: 

  • Expand upon the law’s text (Title LVIII of P.L. 117-263), specifically what it means for data and data standards to be machine-readable.

  • Introduce concepts and frameworks for understanding the policy and computer implementation challenges, and propose options for addressing those challenges.

  • Highlight existing technical approaches, especially widely used, non-proprietary global standards for identifying, describing, and expressing semantic data. Semantic data refers to information that is structured and encoded with meaning, enhancing human and machine understanding and providing context for automated processing and analysis (see the brief sketch following this list).

  • Describe how these existing technical approaches support the conceptual frameworks described in this paper and drive the disclosure modernization detailed in the law. Disclosure modernization is the movement of compliance reporting from documents to machine-readable data.
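
To make the idea of semantic data concrete, the brief sketch below uses Python and the open-source rdflib library to express a few facts as RDF triples, one widely used, non-proprietary standard for semantic data. The http://example.org/reporting/ vocabulary, the filing entity, and the property names are hypothetical placeholders chosen for illustration; they are not drawn from the FDTA or any regulator's taxonomy.

```python
# A minimal sketch of semantic data: each triple (subject, predicate, object)
# pairs a data value with an explicit, machine-resolvable reference to what
# that value means. The namespace and terms below are illustrative only.
from rdflib import Graph, Literal, Namespace, RDF, XSD

EX = Namespace("http://example.org/reporting/")  # hypothetical vocabulary

g = Graph()
g.bind("ex", EX)

# A hypothetical filing, typed and described with machine-readable meaning.
g.add((EX.Filing123, RDF.type, EX.AnnualReport))
g.add((EX.Filing123, EX.totalAssets, Literal("5000000.00", datatype=XSD.decimal)))
g.add((EX.Filing123, EX.reportingCurrency, Literal("USD")))

# Serialize to Turtle, a standard text encoding that both humans and
# machines can read.
print(g.serialize(format="turtle"))
```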


Authors

Dean Ritz

Dean Ritz is a Senior Research Fellow at the Data Foundation. He previously served as the Senior Advisor to the CEO, Open Data Policies & Practices, at Workiva Inc. During his tenure at Workiva, he formulated the company's model-driven semantic data strategy and advised on integrating these innovations into its SaaS platform. As part of proving and propagating these innovations, he developed semantic models and proof-of-concept implementations. He also represented Workiva through advocacy for open data and the modernization of regulatory reporting. His interests extend to the topics of rhetoric and ethics, with scholarly work in these areas published by Routledge, Oxford University Press, and others.

Timothy Randle

Timothy Randle is a Senior Advisory Consultant specializing in semantic data modeling, with expertise in taxonomy and ontology design, financial reporting, and information technology. He was the technical architect who designed the DATA Act Information Model Schema, the taxonomy for USASpending.Gov mandated by the Digital Accountability and Transparency Act of 2014. Timothy’s focus is providing guidance on data standards creation and design expertise to organizations developing taxonomies and information reporting systems across government and commercial domains. During his tenure at Workiva, he was responsible for creating business opportunities through collaborative projects in broad reporting domains, including early initiatives building prototypes for the Annual Comprehensive Financial Report and Sustainability Accounting Standards Board reporting taxonomies.


Sponsors


Disclaimer 

This paper is a product of the Data Foundation, sponsored by Donnelley Financial Solutions (DFIN) and Workiva. The findings and conclusions do not necessarily reflect the views or opinions of the Data Foundation, its funders and sponsors, or its board of directors.