Government has long been known as a great producer of data. And new initiatives at every level of government are demanding that data be used in more meaningful ways to improve outcomes – in law enforcement, the environment, and health care, to name just a few. But at the center of these initiatives is an assumption about data quality.
“There’s a lot of pressure on government agencies to start sharing data in a better way,” said Shawn McCarthy, director of research at IDC Government Insights, in an interview with CivSource. “Oftentimes this data comes in from multiple sources. And agencies are collecting and sharing data across platforms that don’t easily talk with one another.”
But according to Mr. McCarthy, data can be compromised simply through the process of sharing, unless a system is in place to assure the data’s authenticity back to the original agency that collected it. The classic example is Social Security Numbers. SSNs are a near-universal element of government forms, and every time you use your SSN, different types of data are identified with it. But who among the long string of government agencies, departments and offices knows whether your SSN is correct?
“It would be nice to have a way to look upstream – to the authority, the place that it originated, and ask, ‘Is it correct, was it written down correctly, has it changed?’”
IDC Government Insights’ report, “Methods and Practices: Introducing the Tagged Data Authority Engine – Assurance and Data Integrity for Government Agencies and Fusion Centers,” examines this very issue. IDC Government Insights has dubbed the problem “Data Authority” because, “information quality can only be assured when there is a clear understanding of where a piece of data comes from, who has authority over it (for updates, changes, and full life-cycle management), and where the ultimate authoritative copy of that data resides,” the report states.
According to the report and to Mr. McCarthy, the problem is growing as governments become more networked and continue expanding beyond their traditional in-house databases to things like cloud computing. Allowing multiple parties to change data is not the only issue – the larger problem is making sure those changes remain consistent with the “data authority.”
The proliferation of data-sharing initiatives – like fusion centers and shared-services centers – increases the need for a single standard, McCarthy argues. “Traditional databases can take care of this situation because you can’t have conflicting information – you have merge and purge type tools – so that problem has already been solved for the kinds of databases [agencies] manage internally,” McCarthy said. “But there really is no process in place right now to take those sorts of database tools and extend them out over the Internet to every place data may reside.”
The solution is as much managerial as it is technical, Mr. McCarthy says. The embedded controls exist to track changes in data across users, but that type of arrangement usually isn’t in place unless it has been very specifically hard-coded between sets of users, he said. And being able to see what happened upstream to a piece of data is different from managing files created downstream.
IDC Government Insights’ proposed solution, the Tagged Data Authority Engine (TDAE), expands file metadata – descriptive information that travels with the file – to include authority details for every data element carried within the file. If each piece of data can be tagged, and also provided with built-in metadata that establishes where on the Internet the official version of that data resides, then the data itself can be more accurately tracked and relied upon, McCarthy said.
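The report describes this element-level tagging conceptually rather than as a published schema, but a minimal sketch helps make the idea concrete. The field names below (authority_uri, steward, version, last_verified) and the example endpoint are illustrative assumptions, not part of TDAE’s specification.

```python
# Hypothetical illustration only: the IDC report describes element-level
# "authority" metadata in concept but does not publish a schema. All field
# names and the authority URL here are assumptions for the sake of example.
from dataclasses import dataclass, asdict
import json


@dataclass
class TaggedElement:
    name: str            # which data element this tag describes, e.g. "ssn"
    value: str           # the value carried in this particular file
    authority_uri: str   # where the authoritative copy is said to reside
    steward: str         # the agency with life-cycle authority over the element
    version: str         # version identifier assigned by that authority
    last_verified: str   # ISO-8601 timestamp of the last upstream check


# A file's metadata would carry one such tag per data element it contains.
record = TaggedElement(
    name="ssn",
    value="123-45-6789",
    authority_uri="https://authority.example.gov/persons/123-45-6789",
    steward="Social Security Administration",
    version="7",
    last_verified="2011-05-01T00:00:00Z",
)

print(json.dumps(asdict(record), indent=2))
```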
The concept also includes a quick version check for a file’s data elements, allowing a callback to each data source to confirm the accuracy of every tagged element and to see if updates are available. According to Mr. McCarthy, almost a dozen possible solutions exist in the marketplace today, but the bigger issue is getting governments to prioritize the problem and begin building the standards into their processes.
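Again as a hedged illustration rather than anything the report specifies, such a version check could amount to comparing a locally stored version identifier against the one held by each element’s authority. The lookup behavior and field names below are assumed; a real implementation would replace the stub with an authenticated call to the upstream source.

```python
# Illustrative sketch of the "quick version check" idea: for each tagged
# element, call back to its authority and compare version identifiers.
# The field names and the stubbed lookup are assumptions, not TDAE's API.
from typing import Callable, Dict, List


def find_stale_elements(elements: List[Dict[str, str]],
                        fetch_authority: Callable[[str], Dict[str, str]]) -> List[str]:
    """Return the names of elements whose local version no longer matches upstream."""
    stale = []
    for element in elements:
        # In practice this would be an HTTPS request to the authority endpoint.
        upstream = fetch_authority(element["authority_uri"])
        if upstream.get("version") != element.get("version"):
            stale.append(element["name"])
    return stale


# Stubbed authority lookup standing in for a real network call.
def fake_fetch(uri: str) -> Dict[str, str]:
    return {"version": "8"}


tagged = [{
    "name": "ssn",
    "value": "123-45-6789",
    "authority_uri": "https://authority.example.gov/persons/123-45-6789",
    "version": "7",
}]

print(find_stale_elements(tagged, fake_fetch))  # -> ['ssn']
```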
Mr. McCarthy urges governments to take a project-management approach and apply the strategy to cross-agency organizations. “Because without that capability of knowing where that data came from, knowing every time it’s updated, you’re essentially dealing with outdated information in a way that could become dangerous.”