No citizen should have to rely on the word of Joe Biden (or any other politician) to judge the efficacy of government programs. Being able to see for oneself with a few clicks of the mouse—to know, for example, whether there’s a Superfund site near the home one is thinking of buying—is the great promise of online, transparent government. But if half the Superfund sites aren’t listed in the data, or are in the wrong place because of transposed digits in the zip codes (a common federal data problem), one might end up owning a dream home next to a toxic sludge hole.
That means auditing government data—determining what’s collected, how it’s collected, what it’s used for, and how accurate it is—should be a priority. Government should certainly shoulder the lion’s share of this work, but the public, the press, and academics also have a crucial role to play in finding bad data.
So, while it’s unlikely to top a list of voter concerns in any poll, the quality of federal data—what it gets wrong and what it leaves out—is rapidly becoming a critical issue for the country. We can’t build data-driven decision-making on unreliable data: whether the errors are in crime rates, spending programs, or the disposition of nuclear waste, it’s awfully hard to make sound decisions based on faulty information.