Agencies have built “momentum” around evidence-based policymaking, with the majority placing senior officials in charge of advancing data-driven decision making, according to the Evidence Team lead at the Office of Management and Budget.
Federal News Network: Data Driving the Future
On FEDtalk this week, join us for a discussion on the ways data is driving the future of work within and alongside the federal workforce. Host Jason Briefel will be joined by Shane Canfield, Chief Executive Officer of Worldwide Assurance for Employees of Public Agencies – known inside the beltway as WAEPA – as well as WAEPA’s Senior Vice President of Operations Tony Zerante and Data Foundation President Nick Hart.
The group discusses how the COVID-19 pandemic has accelerated the use of data and elevated the importance of data literacy and leadership. Canfield and Zerante describe how WAEPA has adjusted to the changing needs of its customers and embraced a data-centric future. Hart focuses on what agencies must do to prioritize the most efficient use of their data and how leaders can cultivate a culture of connecting data to action.
The show airs live on Friday, March 26th, 2021 at 11:00 am ET on Federal News Network. You can stream the show online anytime via the Federal News Network app and listen to the FEDtalk podcast on PodcastOne and Apple Podcasts.
FEDtalk is a live talk show produced by Shaw Bransford & Roth P.C., a federal employment law firm, that has brought the insider’s perspective from leaders in the federal community since 1993. FEDtalk is sponsored by the Federal Long Term Care Insurance Program (FLTCIP). The FLTCIP is sponsored by the U.S. Office of Personnel Management, insured by John Hancock Life & Health Insurance Company under a group long term care insurance policy, and administered by Long Term Care Partners, LLC (doing business as FedPoint).
Executive Order Creates Chief Science Officers at Federal Agencies
Biden Names Top Government Management Official
Survey: Feds Unprepared for Data Deluge
The report, released by data management company Splunk on Tuesday, underscores that the public sector lags private industry not only in adopting emerging technologies such as artificial intelligence, machine learning, and edge computing, but also in readiness for what the report calls the “Data Age.”
The U.S. Needs a National Data Service
Just as frail and threatened buildings can be buttressed by external supports, we need to build a set of data buttresses to support a frail and threatened infrastructure. When our data collection and analysis systems were established in the last century, no one foresaw the unimaginable amounts of information that could be used to validate and verify official enumeration.
Fed CDOs Making Progress Amid Lingering Challenges, Survey Finds
The Effective Data Governance: A Survey of Federal Chief Data Officers report – jointly conducted by the Data Foundation, Grant Thornton Public Sector LLC, and Qlik – surveyed current federal CDOs to assess the state of their roles following last year’s enactment of the Foundations for Evidence-Based Policymaking Act, which mandates modernization of federal data management.
Long-term success eludes federal CDOs, survey says
Survey: Federal Chief Data Officers Tout Importance of Data Governance
Could a national data service help leaders make better decisions?
National Data Service Should be Created Within the National Science Foundation, Data Foundation Says
A new hub for sharing data should be built within the National Science Foundation, according to a Data Foundation report released Tuesday. The report calls for the establishment of a National Secure Data Service that would coordinate and centralize the federal government’s data infrastructure.
Think Tank Calls for ‘National Secure Data Service’ to Improve Fed Data Management
All data is not created equal: The case for government-wide disclosure modernization
The D.C.-based Data Foundation recently published a policy paper titled Understanding Machine-Readability in Modern Data Policy, which I authored. This paper provides an overview of why machine-readable data matters, with theoretical and practical examples from information theory and contemporary practice in business and government.
We desperately need sound data to understand COVID impacts
As the coronavirus pandemic continues to alter our way of life, more than ever we need valid and reliable data to support decision-making at every level of society. When used responsibly, data analysis helps our country’s leaders determine what policies to implement and can even guide our individual actions.
[The Hill]
Census Bureau Receives ‘Emergency’ Approval to Conduct Pandemic Survey
Tracking COVID-19 Symptoms and Impact in Real Time: A Survey-Based System
Strong Data And Evidence For Equitable Response To COVID-19
In a recent editorial, Nick Hart of the Data Foundation and Dr. Nancy Potok, who served on the bipartisan Commission on Evidence-Based Policymaking, highlighted the need for the United States to invest in evidence building to measure the effects of the pandemic and to evaluate the success of our policy interventions.
[via Giving Compass]
Federal Newscast: NIST gives suggestions for keeping sensitive virtual meetings secure
Two former federal data officials are on the move. Former Chief Statistician of the U.S. Nancy Potok and GSA’s former chief data officer Kris Rowley have joined the Data Foundation’s Board of Directors. Potok stepped down from her previous job at the end of last year while Rowley left GSA earlier this month.
[via Federal News Network]
Advice Offered to Agencies for Coping with New Transparency Requirements
Former Office of Management and Budget controller Dave Mader joined Federal Drive with Tom Temin to share highlights of the detailed advice for agencies published by the Data Foundation.
[via Federal News Network]