Chapter 6

Standards and Convening

It all started on a Sunday morning in 1904. According to legend, someone dropped a lit cigar or cigarette through a cracked glass block in the sidewalk, a block that doubled as a skylight for the basement of the John Hurst & Company dry goods building. The resulting blaze triggered an explosion that left much of Baltimore burning.1

The city could not find the means to stop the flames. More than 1,200 firefighters answered the urgent call for assistance, arriving by train from as far as New York, eager and seemingly well equipped. Yet they were largely ineffective, because their hose couplings could not connect to the fire hydrants. In fact, the couplings weren’t even compatible from one building to the next. So, as the firefighters scrambled to use other methods, the fire raged for more than 30 hours, reducing 1,500 buildings on 70 blocks to rubble, killing five people, and leaving thousands unemployed.

It might not have gone on so long, or done so much damage, if compatibility had been a priority in the fire equipment industry. Instead, market incentives led manufacturers to design entirely proprietary systems, including different couplings for each vendor. After all, a city that purchased a particular system would be entirely dependent on that system’s manufacturer for any improvements or upgrades. A more interoperable system would give city officials more market choices all the way down to the spare parts level, meaning that a manufacturer would lose some leverage, and likely some margin.

The question became whether some good could emerge from the ashes. The winds of the Progressive Era were beginning to blow strong—the Roosevelt administration had brought a major antitrust suit two years before, the Department of Commerce and Labor had been established the previous year, and the landmark Pure Food and Drug Act would pass two years later. There was an understanding that government had a role to play in protecting Americans from the dangers of their rapidly industrializing and modernizing country. So, naturally, after the Baltimore fire, there was a call for stricter building codes and the use of more fireproof materials. But there was also awareness that those improvements might not be sufficient to prevent a similar calamity from occurring elsewhere. The industry needed to come together and drive toward greater standardization, so firefighters wouldn’t again be stymied by intentionally ill-fitting parts.

If you owned a Betamax video recorder in the 1970s, you can probably relate to the challenges of competing formats. Its manufacturer, Sony, tried to dictate a standard for the rest of the industry. Instead, JVC formed a broader coalition to commercialize its own technology, VHS, and the two formats were mutually incompatible. VHS came to dominate the market and quickly rendered Betamax irrelevant. Nearly two decades later, history could have repeated itself in the rollout of the DVD. Initially, there were two different formats, each backed by a number of prominent companies, with Sony and Philips on one side, and Toshiba and JVC on the other. But a new market force, the computer industry, took a leadership role in applying the lessons from the VHS/Betamax fiasco. After its Technical Working Group (TWG) threatened to boycott all formats other than a single, standardized one, the DVD manufacturers ultimately came together and produced a common standard.2

Consumers benefited, getting higher-quality images on a more durable disk that could hold additional material. So did Hollywood. After resisting DVD production for fears of copyright infringement, the motion picture industry felt quite differently in 2004 when its studios booked a record $15 billion on DVD sales and rentals, compared to $9.4 billion in revenues at the box office.3 And while the creators of the original DVD standard couldn’t have predicted this at the time, the lightweight nature of the product would later fuel one of the early twenty-first century’s breakout companies: Netflix, which could send feature films around the country for the price of a stamp.

That’s an example of the private sector succeeding in standards development and application without the influence of the government, and you can find plenty of those throughout the past two-plus centuries. But public sector engagement in this area is also as old as the American republic, with government not always content to wait for the private sector to solve a standards problem.

In his first annual message to Congress as President, back in 1790, George Washington spoke of the importance of “uniformity in the currency, weights, and measures of the United States,” and even directed his Secretary of State, Thomas Jefferson, to prepare a plan. From 1830 through 1901, an Office of Standard Weights and Measures, operating under the U.S. Treasury Department, oversaw much of the work, collaborating with manufacturers, scientists, educators, government officials, and the public at large on standardizing everything from length to mass to temperature to light to time. But its mandate, funding, and testing capacity were modest, if not minuscule. Congress largely adhered to the 10th Amendment, leaving decisions about scientific research investments in standardization to the states, which really meant much of that work wouldn’t get done.

By the turn of the twentieth century, the need for standardization was even more acute, partly due to American society’s increasingly mobile and sprawling, yet interconnected, nature. Previously, most commerce had been confined to the local community, because that’s where people stayed: a gallon of milk was a gallon of milk because that’s how the local dairy measured it, and that’s what the consumer, not knowing better, came to accept. But now consumers were expecting conformity wherever they traveled. And increasingly large companies, in an increasingly large country, needed to think beyond their most proximate market and be assured that their products could compete on a fair unit of measure around the country.

The introduction of electrification served as another impetus for the government to seek greater conformity in technology. For as many industries as possible to benefit, the producers and distributors of electricity needed to settle on some standardized way of measuring volts and kilowatt-hours, among other things. And for that technical work, most of which would be deemed precompetitive—more commercially relevant than typical university research but not designed to advantage any single firm—some scientists and engineers argued for a role for government. According to an official historical review provided by the National Institute of Standards and Technology (NIST), “The builders of America’s industrial complex had little interest in standards as such, but the scientists, engineers, and experimenters working for industry or independently found themselves increasingly hampered without them.”4

Further, according to NIST’s historical documents, “The burgeoning electrical industry showed that simple standards for mass, length and time were no longer sufficient.” The nation needed uniform standards for electrical units, as well as units and standards for new discoveries such as x-rays and radioactivity. This required research, mostly in physics and chemistry. And that meant “simple offices of weights and measures had to be replaced with more sophisticated institutions.”

After nearly two decades of debate on whether the government would be overstepping its role in the economy by engaging in proactive standards work, the National Bureau of Standards (NBS) was finally formed in 1901, and would retain that name until it became NIST in 1988.5 Originally directed by physicist Samuel W. Stratton and staffed by only 12 members, including a chemist, an engineer, and five technical assistants, the new agency restricted its work to that which—to paraphrase the NIST historical documents—would cooperate with university research laboratories, support private enterprise, and promote the general welfare. Following the Great Baltimore Fire, the shipping industry also raised concerns about fire hoses and couplings. In response, the Commerce Department enlisted the Bureau of Standards to collect over 600 sizes and variations of hose couplings in use across the country. One year later, the National Fire Protection Association, with the support of the NBS, established a national standard diameter and threads per inch for hose couplings and fire hydrants, while endorsing an interchangeable device for nonstandard couplings. Widespread adoption proved a greater struggle; for reasons ranging from expense to inertia, many cities took years or decades to comply. Still, the overall fire hose standardization effort left a significant legacy, as one of the first major examples of the federal government responding to a crisis by galvanizing a private sector industry behind a laudable goal—in this case, public safety—and then convening government officials and scientific experts to find solutions.

But why wait for a crisis? Not long after the establishment of the fire standards, the federal government would use similar means—initiating action without imposing mandates—to achieve economic ends. It would apply its convening authority in the aviation industry, in order to spur R&D and growth.

The government had little to do with Wilbur and Orville Wright getting off the ground at Kitty Hawk, back on December 17, 1903. Rather, their flying machine—carrying a pilot, rising by its own power, descending without damage—was a credit to their imagination, experimentation, and determination. In the decade that followed, however, America failed to fully capitalize on their creativity, undermined by ongoing, acute issues of safety and reliability. In 1908, Orville Wright himself was flying above 2,000 spectators when a propeller and rudder broke, sending his plane nose first into the ground and killing his twenty-six-year-old passenger, Lieutenant Thomas Selfridge. By 1913, America ranked 14th in government spending on aircraft development and, by 1916, had produced only 411 aircraft.

It was around that time, however, that the government identified a way to contribute. In 1915, the Woodrow Wilson administration tucked the creation of a new committee into a Naval Appropriation Bill. The National Advisory Committee for Aeronautics (NACA), while low profile and modestly funded, represented a rather significant shift in the scope of government. Its 12 unpaid members were commissioned to conduct research and development on engines and wings. NACA sought to develop a catalog of wing curvatures (or airfoils), so that the appropriate shape could be safely matched with the corresponding aircraft.

Such a catalog could not come, however, until the committee settled on a standardized way of testing each airfoil design. That breakthrough, coupled with American entry into World War I, supercharged aviation production. In just nine months spanning 1917 and 1918, the government procured more than 12,000 planes for use in that conflict.6

But, in the year that followed, in the absence of government demand, production again nosedived. The new Commerce Secretary, a millionaire mining engineer and investor named Herbert Hoover, was intent on not allowing America to cede leadership to Europe in this promising new industry. Hoover was obsessed with efficiency—he endeavored to eliminate, from his position in government, much of the waste in the postwar manufacturing economy. That required him to reconcile his guiding mission with his conservative governing philosophy: one based on individualism, industry autonomy, and an aversion to what he deemed the traditional, intrusive models of government intervention.

Hoover would thread that needle through convening rather than coercion, and his vision of an “associative state.”7 He eschewed the top-down planning approach widely espoused throughout Europe, instead using the state’s power to encourage the formation of voluntary and flexible trade associations that represented dozens of industries and saw value in cooperation, even among fierce competitors. Those associations would remain independent of the government, but would benefit from the government’s “friendly interest,” allowing access to its scientific research experts. They would work together to unlock opportunities and achieve growth, by identifying the technical barriers in a particular industry, designing standardized and simplified parts and procedures to address those issues, and attempting to validate their assertions and methods with the help of a government lab. In this sense, the government wouldn’t be getting in an industry’s way, so much as clearing a path, enabling that industry to grow and thrive, through the recognition and implementation of the best possible practices.

“We are passing from one period of extremely individual action into a period of associated activities,” Hoover told the U.S. Chamber of Commerce in Cleveland. “We are upon the threshold, if these agencies can be directed solely to constructive performance in the public’s interest.”8

This is the approach that Hoover would apply in aviation. Hoover was aware of NACA’s standards for wing designs in advance of the war, and their positive impact on the safety of military airplanes. And he was concerned about the collapse in airplane production following the armistice, from 21,000 per year to a total of 263 in 1922. After casualties associated with poor aircraft design, militarily and commercially, America needed to reduce the risk to commercial operators and potential investors, and relieve the worries of would-be passengers, which meant elevating and expanding upon the safety work that NACA had initially done. Hoover called for the organization of a trade association called the Aeronautical Chamber of Commerce, and pushed for the passage of the Air Commerce Act of 1926, to better coordinate government’s capacity to collaborate with that association, opening research laboratories for the purpose of technical breakthroughs and safety upgrades. The two leading manufacturers of that era, Douglas and Boeing, wholly adopted various NACA standards in their production of the popular DC-3 and 247 aircraft. Those standards, and their subsequent iterations, would play a role in aircraft acquisitions for World War II. Standardization also set the stage for the commercial aviation boom that continues to this day. According to the Bureau of Transportation Statistics, commercial airlines currently employ more than 500,000 Americans on a full-time or part-time basis, more than 600 million passengers board domestic flights alone in the United States each year, and Boeing projected in 2013 that the world would need 35,280 new planes, valued at $4.8 trillion, by 2032.

During two terms as Commerce Secretary that spanned the Warren Harding and Calvin Coolidge administrations, Hoover doubled the size of the Bureau of Standards and engaged nearly 100 industries in the collective process of standards creation and deployment. According to his Presidential library, he was intent on ensuring “that industries voluntarily cooperated in improving our national progress and improving Americans’ standards of living. To Hoover, no product or industry was too mundane for review and reform: flashlight cases, lumber, chinaware, mattresses, and bricks all merited primers on elimination of waste.” That’s right, even bricks. Under Hoover, standardization reduced the varieties of paving brick from 66 to five. At the conclusion of Hoover’s tenure—and prior to his election as President—the Commerce Department calculated that its standardization and simplification efforts had generated roughly $600 million in economic impact across America’s $18 billion manufacturing sector.9

Over the next several decades, Commerce Department officials in Republican and Democratic administrations largely adhered to Herbert Hoover’s model in their governmental approaches to affecting economic activity: avoiding the ideological extremes of industrial policy (picking winners and losers) and laissez-faire (letting everyone be). Many also embraced his belief in the power of collective action, especially that of interested parties in the same industry, with the guidance of and access to—though not interference from—the government.

In that sense, the Council on Competitiveness was a philosophical descendant of those trade associations that Hoover had called into action.10 Nonpartisan and nongovernmental, the Council was created in 1986 in response to concerns that America was losing economic leadership around the world, notably to Asia. It consisted of an all-star team of CEOs, university presidents, and labor leaders, who came together to assess America’s place in the global marketplace, identify obstacles and opportunities, and generate public policy solutions.

As technology advanced, the corporations represented in the Council encountered more challenges that required collective consideration and action. By the early twenty-first century, several of America’s manufacturing behemoths had invested millions in high performance computing (HPC), including modeling and simulation activities that were intended to dramatically reduce production times and costs by allowing for the optimization of design prior to the physical testing stages. In 2005, the Council undertook the High Performance Computing initiative. Three years later, a Council report confirmed earlier findings that “virtually all U.S. businesses that have adopted HPC consider this technology indispensable for their competitiveness and corporate survival.”11 This report cited some examples. Boeing used HPC to reduce its expensive “live” experimental tests from 77 to 11 for the 787 Dreamliner compared to the 777. Walmart used HPC to better manage its worldwide stores from its Arkansas headquarters, in everything from determining shelf space to turning out the lights.

Still, the report noted that only energy firms among U.S. industries had truly integrated HPC into critical business functions, while suppliers to all of those top firms had lagged behind, with many not using HPC at all. It called that situation “troublesome” in light of HPC’s potential to reduce costs and speed time to market; the gap was especially concerning because it came at the same time that international firms were “driving HPC through their supply chains more aggressively.” There was an explanation for the holdup: suppliers, mostly small- and medium-size manufacturers, typically could not afford to employ the expensive new technologies. Nor had the large manufacturers standardized a method for sharing computer-generated models across their respective supply chains. These issues had undermined progress, with the large firms often held back by their smaller, but essential, brethren.

How could the government help democratize access to this high performance modeling and simulation technology so America’s manufacturing sector could run more efficiently and build new products more rapidly? In 2011, the Commerce Department joined the Council on Competitiveness—including General Electric, John Deere, Lockheed Martin, and Procter & Gamble—in the launch of a new public-private project called the National Digital Engineering and Manufacturing Consortium (NDEMC).12 Seeded with $5 million, two-thirds of that from the private sector, NDEMC ran a pilot program in the Midwest, leveraging research universities and aimed at making modeling and simulation software and training available to small- and medium-size manufacturers. The large manufacturers, such as John Deere, invited their supply chains to participate in the program—in its case, for the purpose of more cost-effective tractor parts. Others offered to help any small- to medium-size manufacturer in the hope of validating the hypothesis that access to such technologies could strengthen American manufacturing.

Nearly 30 suppliers took advantage of the initiative within the first couple of years. One was Jeco Plastics, an Indiana-based company of 25 employees, which sought to supply plastic shipping pallets to a major German manufacturer, a task that had previously fallen to a Chinese supplier. The order was contingent on making a couple of key cosmetic changes and, while doing so, not diminishing the pallets’ ability to handle the required weight. Facing seemingly insurmountable cost and time constraints to make these irreversible alterations, Jeco CEO Craig Carson turned to NDEMC, and the access it afforded to supercomputers and staff at Purdue University. Testing its models rapidly and at no cost, Jeco upgraded the pallets as required, increased the purchase order fivefold to $2.5 million, and received enough recurring income as part of a long-term contract that it was able to expand its workforce by 60 percent. Its successful experience with modeling, simulation, and analysis (MS&A) even led to additional contracts, including one with NASA.13

NDEMC is trying to scale the program by encouraging the development of new, low-cost software products that serve the nation’s small- to medium-size manufacturers. It is also addressing the issue of standards, a necessity considering the diversity of the U.S. supply chain. For context, consider that the Department of Defense alone works with more than 30,000 manufacturing suppliers in the United States, suppliers that represent approximately 10 percent of the total number of the nation’s small- and medium-size manufacturers. Many of these suppliers also provide parts and services to other manufacturers, making it impractical for them to implement a different method for each one. As part of its broader vision of a “digital industrial commons,” NDEMC is working toward standardizing programming languages so each supplier can more easily share advanced models and simulations regardless of the manufacturing counterparty.14

Further, in May 2013, President Obama announced the launch of a program spearheaded by the Defense Department to build a Digital Manufacturing and Design Innovation (DMDI) Institute—one of three new manufacturing hubs that received $70 million in federal funding, plus an expected financial match from private sources.15 The DMDI seeks to inject the full potential of information technology into a new, “smarter” manufacturing economy, one that allows for the safe, secure sharing of product designs, quality improvement through faster feedback loops from sensors and data analysis, and faster delivery of products. And, in conjunction with the private sector, it will address a growing array of standards activities, related to data interoperability, definitions, mapping, security, and other areas.

As President Obama said on the day of the announcement: “The economy is dynamic. Technology is constantly changing. That means we’ve got to adapt as well.”16

In the Obama administration, we envisioned this approach—leading through coordination and collaboration rather than fiat—working in other sectors. Ideally, public officials would convene parties to encourage the development and deployment of standards that can spark innovation in a given sector of the economy; entrepreneurs would put those standards to work in the development of new products and services; and forward-leaning communities would serve as early test beds for those products and services.

On all of these points, we had willing partners on the legislative side, many of whom saw the value in expanding the reach of an agency that had already experienced considerable growth. Throughout the decades, the Bureau of Standards—and, in its latest incarnation, NIST—had been granted greater responsibilities, capabilities, and financial resources. For instance, from 1969 to 1993, 79 separate pieces of legislation directed the agency to conduct laboratory research and support technologies related to everything from energy conservation and recycling to the metric system to computer security.17 And, in 2007, with the passage of the America COMPETES Act, NIST would be on a 10-year trajectory to double its budget; by 2013, it had already crossed $1 billion. That legislation also created a new, more prominent position for the NIST director—Under Secretary of Commerce for Standards and Technology—while directing NIST to collaborate with the private sector on initiatives as varied as cloud computing standards and high performance building standards.18

Still, while standards activities grew along with NIST, my colleagues at the White House, including Cass Sunstein, the Director of the Office of Information and Regulatory Affairs (OIRA), and Ambassador Miriam Sapiro, the Deputy United States Trade Representative, sought to revisit the policy President Clinton established in 1998, which had directed agencies to use voluntary consensus industry standards rather than create their own. That policy essentially told us, as well as other government officials, what we could not do. We couldn’t impose our will on others. But we needed to clarify what government could do, and what role it could play in assisting the private sector to reach its own consensus on standards. And in doing so, we needed to provide some specific guidelines to federal agencies, so they clearly understood the appropriate areas and limits of intervention.

Over the course of two years, we engaged hundreds of stakeholders from the public and private sectors, and those interactions would inform the memo we created to institutionalize our approach.19 That memo started with a clear edict: all standards activities, in the U.S. policy context, must involve the private sector. Yet it added that involvement of the federal government, either in the form of active engagement or a convening role, was appropriate “where a national priority has been identified in statute, regulation, or Administration policy” and that involvement “could accelerate standards development and implementation to help spur technological advances and broaden technology adoption.

“In these instances, the Federal Government can help catalyze advances, promote market-based innovation, and encourage more competitive market outcomes,” the memo continued. “The Federal Government should clearly define its role, and then work with private sector standardization organizations in the exercise of that role.”

We cited, as an example, the role that the administration had begun to play in the energy sector, since Congress authorized its involvement with the 2007 passage of the Energy Independence and Security Act (EISA). That legislation had directed NIST and the Department of Energy to convene the private sector for the development of standards that would underpin the modernization of the electrical grid.

That work was long overdue. Following Thomas Edison’s invention of the lightbulb, America had embarked on what the National Academy of Engineering regarded as one of our great achievements, the construction of “an advanced and complex electrical system that would become a model for the world,” thanks to public and private investments.20 And yet, at an event in June 2011, Energy Secretary Steven Chu referenced Edison to illustrate the industry’s recent stagnation. What if Edison were transported in a time machine from the 1800s to the present day? He wouldn’t recognize the modern manifestations of his inventions in lighting and sound recording, such as LEDs and iPods. “On the other hand, he would feel really at home with most of today’s power-generating system,” Chu said. “That’s in the last half of the nineteenth century, and here we are at the beginning of the twenty-first century.”

As Chu argued, we need a modernized electrical grid, a twenty-first-century system for the twenty-first-century economy. We need widespread implementation and ongoing expansion of a “smart grid.” This is a grid that, as defined by the U.S. Department of Energy, uses information and communications technology to improve “the efficiency, reliability, economics, and sustainability of the production and distribution of electricity.” Such a grid uses digital versions of millions of pieces and parts, such as controls, meters, and transmission lines, upgrades that reflect the power of modern computing and wireless broadband. While many of these remain in relatively primitive stages, the expectation is that, after testing and tinkering, this technology will fully enable real-time communication between the utility and the customer, to accelerate the recording of, and responses to, electrical demand. Real-time information about the state of the grid has value in times of normalcy and distress, for both consumers and utilities. For consumers, knowing the cost of supplying energy at a specific time, such as when demand is greatest, allows them to alter their habits, related to when to do the laundry, run the dishwasher, or merely remove a plug from an outlet. For utilities, it helps to know as soon as possible that a few solar-paneled homes in a neighborhood are requesting more energy than usual. That could speak to cloud cover, and might allow those utilities to better prepare for a surge in energy requests from other homes in the area.

Improving interactions between utilities and customers, in a way that specifically targets the efficiency end of the energy equation, is consistent with President Obama’s oft-stated goal of cutting energy waste in half over 20 years, as a complement to ongoing efforts to increase and diversify energy production.21 Yet there have been holdups, and some can be traced, at least in part, to the way in which the sector is organized. Simply put, America has never had one nationalized electric utility system; instead, it has over 3,000 local and regional systems governed by local and state regulators. Each state has adopted a regulatory system with different financial incentives for the utility—from rewarding production at the lowest costs to allowing utilities to recover more money for producing costlier renewable energy. Those incentives affect both the pace of smart grid technology adoption and the effectiveness of those technologies in lowering energy usage and improving reliability, to name just two benefits.

So far, innovation has been slow. As of the aforementioned June 2011 White House event, only about half the states had adopted specific policies related to smart grid technology, and most of those policies were modest. Most utilities make more money if they sell more power, not less. That’s their incentive. So, why should a utility invest in something that reduces power consumption to the detriment of its shareholders? And for the regulators who oversee those utilities, and whose primary concern is to keep costs down for the ratepayers to whom they are accountable, why add any expense without clear benefit? The math doesn’t make sense.

A few leading states have tackled the incentive problem by implementing policies that decouple utility revenues from the volume of energy sold to homes and businesses. Others have needed a push. That was the idea driving the White House Strategy for a 21st Century Grid, which, released on that June day, explicitly called for aligning the incentives to encourage the deployment of smart grid technologies, in the name of “a clean, smart, national electricity system that will create jobs, reduce energy use, and expand renewable energy production.”22

To demonstrate how anyone could make a difference, Secretary Chu invited Shreya Indukuri and Daniela Lapidous to share the stage on the day he released the report. The high school seniors had raised money to install a smart energy system at the Harker School in San Jose, California, using off-the-shelf smart submetering devices, dedicated to individual buildings, as well as an intuitive online dashboard that allowed the school superintendent to learn exactly where energy usage was greatest on the school campus. Within a week of installation, several anomalies became apparent, especially the excess usage in the gym. Further investigation revealed that the air-conditioning had been running over the weekend, without anyone knowing or needing it. A flip of the switch saved several thousand dollars and, over one year, Harker saw a 13 percent savings on its energy bill and a 250 percent return on investment. Lapidous proudly touted the low barrier to entry for such a campus-based “smart meter project”: a cost of between $10,000 and $20,000 per school with an 18-month payback period. Then the teenager drew laughs with a clear conclusion: “Even if you’re not an environmentalist, it’s pretty hard to argue with a 250 percent ROI.”

It certainly is. And if two teenagers could accomplish so much, energy savings stories similar to theirs should become commonplace, rather than seem extraordinary. But three things need to happen. First, the incentives for the utility companies need to be aligned with those of their customers, no easy task.23 Second, those utilities need to grant access to the sort of data that even the Harker students didn’t have at their disposal. Third, all 3,000 utilities need to come together to standardize the way that information is shared, so that it can be understood and implemented by third-party developers and ultimately by consumers. If all of that occurred, it might result in a vibrant marketplace of applications, as simple and appealing as the iPhone app store, competing to help the Harker students, and those like them, identify energy waste.

So NIST and the Department of Energy, represented by George Arnold and Pat Hoffman, respectively, worked together to convene the Smart Grid Interoperability Panel. That public-private partnership was led by the existing private sector standards bodies to design and deploy the necessary standards, and aided by $10 million that President Obama had allocated in the Recovery Act in 2009. The stakeholders—utility sector entities, manufacturers, and technology firms, to name a few—recognized they needed each other. Good faith, plus good leadership, can go a long way. According to Arnold, in light of the structure of the utility sector, the government “is really the only entity that can provide that coordination leadership role.” NIST prioritized the panel’s work by emphasizing more than a dozen areas critical to jump-starting the smart grid industry. Among them: standardizing how utilities communicate with customers on energy usage information.

By February 2011, the participants endorsed what Arnold characterized as “a very robust toolkit” of standards. Now it was time for the next giant leap: deployment of those standards in a sector that, according to Hoffman, was “ripe for an information revolution” of the sort that manufacturing had already begun to experience. Months later, I addressed a leading forum for utility executives interested in grid modernization, and raised a question: “How can we safely and securely provide customers electronic access to their energy information, thereby supporting the continuing development of innovative new products and services in the energy sector?”

The answer would come through enlisting a coalition of utility companies, those willing to implement the agreed-upon standards. It was a strategy based on what a number of insurance companies and medical providers were already working toward in the health care space: standardizing and simplifying the method by which Medicare recipients and veterans could download personal health data, through the placement of a Blue Button icon on patient-facing websites. Why not create a Green Button that would have a similar role and payoff in the energy sector—allowing utility customers to download their own usage data and do with it what they wished, including sharing it with a growing array of third-party applications that competed to provide money-saving tips?

The appeal was well received, especially by those who had been quietly at work on the underlying standards and saw an opening for faster implementation. Still, the movement called for a champion, someone who understood the importance of engaging the customer to spark innovation. That champion would come from California, where policymakers had long been working to enable the utilities and their customers to benefit from more efficient usage. They had begun to do so in the 1980s through a process called “decoupling,” to separate energy sales from profits, and give utilities state-approved incentives to encourage conservation and the use of renewable energy. Then, in 2011, Karen Austin came aboard as the new CIO of Pacific Gas & Electric.24 Austin had devoted her career to recognizing, understanding, and improving customer relationships in the private sector, while establishing herself as an e-commerce retailing pioneer at Kmart and Sears, with customer-friendly programs such as Buy Online, Pick Up in Store.

I called Austin, assuming she would be receptive to the Green Button proposal. “I thought the idea was fantastic,” Austin recalled. “Of course we should give our customers this information. Let’s do it!”

Seeing it as a win-win-win, for the customer, PG&E, and the environment, she called other California utilities, including Southern California Edison and San Diego Gas & Electric, and convened a meeting within a couple of weeks: highly unusual alacrity for the typically sluggish utility sector. At the meeting, she—and I, as a government representative—would be sitting at the same table with a small group of public, private, and nonprofit leaders, not on a dais, looking down.

“A partnership,” she said.

We had a brisk breeze at our back. That July, the California Public Utilities Commission, the state’s utility regulator, had ordered that the utilities at least agree on a process for the design of a standard record format, making this stage merely about solidification and deployment. And, since the three utilities represented different parts of the state, they weren’t competing with each other for customers but were, rather, free to compete together, against the clock, toward the achievement of similar goals.

“The key was just to keep it simple,” Austin said. “A lot of the standards had already been thought through but never deployed.”

To Austin, the idea of not getting something done, even in the compressed time frame, “didn’t cross my mind. The thought of creating separate formats, I don’t know, it wouldn’t have made sense to proceed that way. It was going to happen. I was comfortable that we were going to get there.”

Before they left the room, they agreed upon a sketch of the key fields that a customer would need, as well as a standard record format for the data that would fill those fields. Then they assigned someone from every utility to deliver a common user interface. This was a relatively easy technical exercise, since the group adopted an existing standard that defined both what data would be published and how it would be transmitted between parties.25
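The format the group settled on would later be stewarded by the North American Energy Standards Board, and, as typically implemented, it packages a customer’s usage history as an XML feed of timestamped interval readings. As a rough illustration of why a single record format mattered to developers, here is a minimal Python sketch that totals daily consumption from a Green Button file; the namespace, element names, and watt-hour units are assumptions drawn from the published schema, and real exports vary by utility.

```python
# A minimal sketch of reading a Green Button usage file. It assumes the
# XML shape used by the NAESB standard, in which interval readings carry
# an epoch-second start time and a watt-hour value; element names and
# units vary by utility, so treat this as illustrative only.
import xml.etree.ElementTree as ET
from collections import defaultdict
from datetime import datetime, timezone

NS = {"espi": "http://naesb.org/espi"}  # assumed namespace

def daily_usage_kwh(path):
    """Sum interval readings in a Green Button XML file into kWh per day."""
    tree = ET.parse(path)
    totals = defaultdict(float)
    for reading in tree.iter("{http://naesb.org/espi}IntervalReading"):
        start = int(reading.find("espi:timePeriod/espi:start", NS).text)
        value_wh = float(reading.find("espi:value", NS).text)
        day = datetime.fromtimestamp(start, tz=timezone.utc).date()
        totals[day] += value_wh / 1000.0  # watt-hours to kilowatt-hours
    return dict(totals)

if __name__ == "__main__":
    for day, kwh in sorted(daily_usage_kwh("greenbutton.xml").items()):
        print(day, round(kwh, 2), "kWh")
```

Because every participating utility publishes the same fields in the same shape, a few lines like these work against any of them, which is exactly the leverage the group was after.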

Within 90 days, I visited California once again, to celebrate a number of Green Button commitments.26 Austin’s PG&E and her counterpart at San Diego Gas & Electric announced they were live for all of their customers; the North American Energy Standards Board, the technical body responsible for the data format, announced a free starter toolkit that included detailed technical documents necessary for developers to build products and services on top of Green Button data. A few startups demonstrated early prototypes of apps that personalize energy savings tips through the use of a customer’s Green Button file.27 All of this was inspiring, but incomplete. Following the presentation, an audience member asked if the utilities would be publishing their rate schedules electronically as well, so customers could make energy decisions based on specific costs at a particular moment; in California, peak-time pricing is double or triple the normal rate. Austin, who recognized the value of that data, related the reality: improving access to that information might take more than a year.

In the interim, however, an entrepreneur named Jason Riley was working to prove that there were better ways of linking energy pricing and usage data. Prior to the Green Button launch, back in 2010, Riley had founded a startup called Genability on the premise that, to deliver meaningful savings, an energy consumer would need both pricing and usage data. The pricing side hadn’t been a problem; with the help of contractors, he had begun manually entering utility tariff information for hundreds of rate plans into an online database, offering access to developers for a fee. On usage data, however, Riley had been stuck, since that information had been inaccessible to the public. At least it had been, prior to Green Button.

I met Riley a week after the Green Button launch in California, while I was serving as a guest judge at a Cleanweb Hackathon in New York, where industry and government officials outlined specific challenges related to energy, such as the exorbitant cost of solar panel installation, and—in a bit of foreshadowing—New York City’s efforts to help residents better prepare for storm surges through the visualization of climate change. The organizer’s primary purpose, however, was to invite volunteer developers to spend 72 hours building prototype applications on top of a growing number of energy information services, while promising modest prizes.

With access to Genability’s Electricity Pricing database, Green Button data for energy usage, and other open energy data sources, 15 developer teams completed prototypes by Sunday evening. One stood out because, in using both Genability and Green Button data, it educated users about the most economical rate plans for them. The Watt Quiz was an engaging customer survey that, once the customer uploaded his or her Green Button file from the utility, helped answer a simple question: “Which tariff rate plan saves you the most money?” It showed how one family could save 44 percent simply by changing the rate plan.
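Mechanically, a comparison like the Watt Quiz’s comes down to pricing the same usage history under each candidate tariff and reporting the cheapest. The sketch below is a simplified, hypothetical version of that calculation rather than the actual Watt Quiz code; the plan names, rates, and peak hours are invented for illustration.

```python
# Hypothetical rate-plan comparison in the spirit of the Watt Quiz:
# price the same hourly usage history under each candidate tariff and
# report the cheapest. Rates and plan names are invented for illustration.
from datetime import datetime

def flat_rate(cents_per_kwh):
    return lambda when, kwh: kwh * cents_per_kwh / 100.0

def time_of_use(peak_cents, offpeak_cents, peak_hours=range(16, 21)):
    def price(when, kwh):
        cents = peak_cents if when.hour in peak_hours else offpeak_cents
        return kwh * cents / 100.0
    return price

PLANS = {
    "Flat E-1": flat_rate(20.0),
    "Time-of-Use E-6": time_of_use(peak_cents=45.0, offpeak_cents=12.0),
}

def cheapest_plan(usage):
    """usage: list of (datetime, kWh) pairs, e.g. parsed from a Green Button file."""
    bills = {name: sum(price(when, kwh) for when, kwh in usage)
             for name, price in PLANS.items()}
    return min(bills.items(), key=lambda item: item[1]), bills

if __name__ == "__main__":
    # One illustrative day: heavier usage during the evening peak.
    sample = [(datetime(2012, 6, 1, h), 1.2 if 16 <= h < 21 else 0.6)
              for h in range(24)]
    (best, cost), bills = cheapest_plan(sample)
    print(bills)
    print(f"Cheapest plan for the sample day: {best} at ${cost:.2f}")
```

The hard part was never the arithmetic; it was getting trustworthy pricing and usage data into the same place, which is precisely what Genability and Green Button together made possible.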

The Cleanweb Hackathon scaled into an international movement, not just because developers like to tinker and gather, but because of the promise of a new business model—making money while helping energy consumers save it. Take, for instance, Simple Energy, based out of Boulder, Colorado, which partnered with San Diego Gas & Electric in its launch of the Green Button service. After using Simple Energy’s Customer Engagement platform to see her family’s energy usage online, Heidi Bates deputized her six-year-old son Thaddeus as the “Light Police,” to run around the house unplugging unnecessary luminescence. “He really digs it,” she said. The enthusiasm spread across age groups; a grandmother, Josephine Gonzales, saved over 20 percent on her electric bills using the Facebook-connected platform.

Meanwhile, in Northern California, in the wake of the Green Button rollout, Austin continued coming across customers who had benefited from the access to information and were eager to share their experiences. That included one couple—she refers to them as Kelly and Jim—from San Luis Obispo who had downsized to a residence half the size, expecting the energy bill to decline in kind. “When it didn’t budge at all, Kelly got onto the Green Button, and took that data and then really went around her home and looked at her usage,” Austin said. “She was able to cut her bill from $160 per month to about $50 per month, which was an annual savings of $1,300.”

Some customers even revealed that they had taken it upon themselves to spread the Green Button word around their neighborhoods, to improve the area’s overall efficiency. Understanding not only the power of the data but also the importance of presenting it in easily comprehended charts and graphs, Austin asked the board at Pacific Gas & Electric for resources to sponsor an Apps for Energy contest in partnership with the White House.28 Nearly 50 developers applied in the spring of 2012, some representing small companies, others working on their own. They produced a diverse range of concepts, with first-place honors going to Leafully—a visually appealing software tool, accessed through social media, that breaks a person’s energy footprint and environmental impact down into a corresponding number of trees saved. The second prize went to Melon.com, which combined Green Button data and the EPA’s Energy Star Portfolio Manager to give more than one million commercial buildings a “simple and affordable” benchmarking analysis, allowing managers to compare their buildings with others around the United States, comply with the law, and save money. The third-prize winner, VELObill, was a colorful, intuitive application for consumers to view utility usage, compare it to that of peers, find ways to save, and locate local contractors who could get them closer to their energy goals.

Still, Austin was eager to iterate further in collaboration with the White House, simplifying access for third parties and, thus, speeding the rate of innovation. My successor as Chief Technology Officer, Todd Park, would announce the release of Green Button Connect My Data, a program that would eliminate the need for customers to take possession of their own data via download in order to use third-party applications. Through the Connect My Data service, that data transfer would happen automatically once consumers enrolled.
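Where the original Green Button had customers download a file and hand it to an app themselves, Connect My Data lets a customer authorize an application once, after which the utility delivers new usage data to it directly. The sketch below outlines that pattern as a standard OAuth 2.0 authorization flow, which is how Connect My Data implementations are generally built; every URL, identifier, and scope string here is a placeholder rather than any utility’s real API.

```python
# A schematic sketch of the authorization pattern behind Green Button
# Connect My Data: the customer approves access once, and the application
# then pulls usage directly from the utility over OAuth 2.0. All URLs,
# client credentials, and scopes below are placeholders for illustration.
import requests

AUTHORIZE_URL = "https://utility.example/oauth/authorize"  # placeholder
TOKEN_URL = "https://utility.example/oauth/token"          # placeholder
DATA_URL = "https://utility.example/espi/usage/42"         # placeholder

CLIENT_ID = "example-app"
CLIENT_SECRET = "example-secret"
REDIRECT_URI = "https://app.example/callback"

def authorization_link(scope="usage"):
    # The customer visits this link once to grant the app access.
    return (f"{AUTHORIZE_URL}?response_type=code&client_id={CLIENT_ID}"
            f"&redirect_uri={REDIRECT_URI}&scope={scope}")

def exchange_code_for_token(code):
    # After approval, the utility redirects back with a one-time code,
    # which the app trades for a longer-lived access token.
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
    }, auth=(CLIENT_ID, CLIENT_SECRET))
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_usage(access_token):
    # With a token, the app can fetch new interval data on the customer's
    # behalf; no manual download or upload is required.
    resp = requests.get(DATA_URL,
                        headers={"Authorization": f"Bearer {access_token}"})
    resp.raise_for_status()
    return resp.text  # Green Button XML, parsed as in the earlier sketch
```

The customer still controls the relationship, since revoking the authorization cuts off the data flow, but the day-to-day friction of moving files around disappears.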

Austin helped winners of the Apps for Energy contest go into production in her market. One was the previously mentioned Leafully. Another was PEV4me, which, by accounting for a user’s driving habits, calculates how much the user would save on gas by switching to an electric car. Another was UnPlugStuff, which assists in determining the cost of phantom usage, such as leaving your toaster plugged into the wall. In the first three months after PG&E set up all three with sufficient security protections, roughly 10,000 people signed up to use their services, and that was even with limited advertising during the election season.

“So far, so good,” Austin said.

Austin viewed this early progress to be a product of “keeping it simple versus trying to boil the whole ocean, and getting everyone on the same page.”

In this scenario, “keeping it simple” meant implementing the Green Button standard rather than attempting to account for every possible request for how to access someone’s individual energy data. It also meant moving incrementally, first providing the data in downloadable form, then automating that connection.

And “everyone” meant representatives of the government and the private sector, not at odds with each other but in collaboration, sharing ideas and shaking hands, before handing off to entrepreneurs to innovate.

Utility executives haven’t always embraced government involvement in their affairs—after all, as Austin noted, government regulation can sometimes slow the pace of innovation.

“In this case,” Austin said, “government was a positive enabler for sure.”

In 2013, President Obama would use the State of the Union to reiterate his commitment to the smart grid, by calling for a Race to the Top proposal that would provide financial incentives to states that adopt energy efficiency policies. As of press time, that had yet to pass, but players in this space had moved ahead anyway, with 35 utilities and energy providers voluntarily committing within the first year to provide 36 million homes and businesses with their own energy usage information in the consensus, industry-standard Green Button format.

For all the inroads we were making in the energy sector, we knew there was a need to pour our energy, in terms of standards creation and deployment, into other industries. In the Obama administration’s calculus, no national priority ranked higher than health care. Thirteen months prior to the signing of the landmark Affordable Care Act into law, its technological underpinnings had been codified in the Health Information Technology for Economic and Clinical Health Act (HITECH Act), a provision of the Recovery Act that we mentioned earlier. Only weeks into his first term, the President pushed for a $26 billion incentive program that would encourage doctors and hospitals to adopt and use electronic health records. And on this rare issue, he didn’t encounter much pushback from either party, since many members on both sides of the aisle—from Hillary Clinton to Newt Gingrich—agreed that, no matter how we financed the care delivery system, it needed to be modernized through the application of information technology, to improve quality and reduce costs. Congress authorized the Centers for Medicare and Medicaid Services (CMS) to pay doctors up to $44,000 and hospitals millions based on their size—but only if they “meaningfully used” certified technology in their daily work treating patients.

Why the broad consensus for intervention? Because, as the rest of the economy was surging forward, experiencing sizable productivity gains powered by the workplace application of information technology, health care was stuck on a treadmill. It had underinvested in technology, relative to comparable service industries. Even when the industry did invest or innovate, its priorities tended to be misdirected, focused on the wrong set of problems. That was partly a product of the predominance of the fee-for-service structure in the $2 trillion health care system, a structure that financially rewarded quantity over quality. The more patients a physician sees, the more tests that physician orders, the more the physician can bill. There is little incentive in the system to invest in preventive care, chronic disease management, care coordination, medication management, or telemedicine services (e-mail, online chats, and videoconferencing). In fact, there is actually a perverse disincentive, as healthier patients need fewer services, thus reducing provider compensation. It should come as no surprise that health care professionals have sought IT-enabled products that improved efficiency in billing and scheduling for services that will be reimbursed, but largely ignored those that might improve individual care, let alone the health of the greater population.

Consider the problem of premature births, which we described in a Virginia context in Chapter 3. The health IT industry has the capability to better predict if a mother is likely to deliver early, which often leads to admissions into neonatal intensive care units. But, when those admissions generate revenues of more than $40,000 per week, what is the incentive for a health care system to invest in that technology? Altruism alone? And, worse, even if you identified those vulnerable mothers, you would bear the costs of any preventative services that you offered, since most come with no reimbursement.

Even while pushing the health IT movement forward, proponents understood that the payoff would be limited until the payment systems aligned incentives. This was noted as early as 2005, in a report by the RAND Corporation, the nonprofit global policy think tank, which estimated the potential of more than $81 billion in annual savings through the widespread adoption of electronic medical records systems, with that figure potentially doubling through additional health IT-enabled prevention and management of chronic disease.29 The report declared that the full benefits of health IT were “unlikely to be realized without related changes to the health care system.”

Lawmakers of all stripes referenced the sanguine predictions of the RAND study in advocating for incentives to encourage provider adoption of such systems, and that advocacy played a role in the passage of the aforementioned HITECH Act. They did not, however, provide the accompanying payment reform, at least in the short term. And when RAND reassessed its study in 2013, its findings prematurely rang alarm bells for many.30 It found that the $81 billion savings had not materialized, and health costs had actually risen, ironically, in small part due to the improved billing and documentation procedures that the health IT systems had made possible. So, rather than a refuge for consensus, health IT became more of a partisan talking point for those opposed to the President’s overall health care agenda and unwilling to consider the contributing factor of fee for service.

There remained a need for a more virtuous cycle in health care, one outlined by the Center for American Progress.31 That think tank saw potential for health IT adoption, care delivery innovation, and provider payment reform to interact and flourish, with each helping to make the others work, instead of the absence of one or more of those elements stalling overall improvement. In its reassessment, RAND didn’t give up hope: “We believe that the original promise of health IT can be met if the systems are redesigned to address these flaws by creating more-standardized systems that are easier to use, are truly interoperable, and afford patients more access to and control over their health data. Providers must do their part by re-engineering care processes to take full advantage of efficiencies offered by health IT, in the context of redesigned payment models that favor value over volume.”

Progress in those areas was already under way.32

On payment models, the Affordable Care Act had included many payment reform provisions, chief among them the creation of a new Center for Medicare & Medicaid Innovation, seeded with $10 billion to run experiments on new payment models that have the potential to improve quality and lower costs. To complement the center, Congress granted a new regulatory authority: if CMS’s nonpartisan actuary certified that a payment model achieved quality improvement and cost reduction, then the HHS Secretary could make it an option for every provider in the country.

On interoperability, the HITECH Act had tied its $26 billion in incentives for health care providers to their adoption of technologies that incorporated standards, which would be defined in three stages over the next five years. And for the standards work, we had called upon an experienced hand—my old friend Dr. John Halamka, who had engaged in the standards process across two Presidential administrations.

Under President Bush, who had pledged in 2004 that every American would have access to a personal health record within a decade, HHS would seed pilot investments for technical work to that end. As Chairman of the Healthcare Information Technology Standards Panel (HITSP), Halamka was exposed to well-intentioned actors, but also to a process primarily driven by a few senior government people and the vendor community, rather than by doctors, insurance companies, and patients. It was also a process that lacked any economic incentives for the parties to push themselves toward the best possible performance. In the market environment of the time, vendors and their hospital customers simply didn’t have much of a stake in making it easier to share information. The reimbursement system encouraged health organizations to grow through increasing market share, an objective that would be undermined if it were easier, not harder, for patients to seek care elsewhere.

The result of such a vendor-driven approach, according to Halamka, was “basically codifying the status quo.” The standards that they did create “were so cumbersome and so heavy that vendors could only successfully implement them by charging vast sums of money.”

In his new role as vice chair of the Health IT Standards Committee (HITSC) in the Obama administration, Halamka had a more formal legal foundation than during the Bush administration. Authorized by the HITECH Act, the HITSC would benefit from greater engagement and urgency among all parties, a clearer business case to scale what works, and a more constructive governance model. It was a collaborative model that owed plenty to the insights that Mitch Kapor, the Lotus Development Corporation founder, outlined and endorsed in his aforementioned speech calling for a Health Internet. Kapor called for a less complex, more open, “light federal approach” that would encourage an early critical mass of users to participate—something like a dozen or more products and services built on the standards—thus reducing costs and time to market while fueling innovation. That’s how the standards committee set about its work.

“Let’s do it in an open, transparent, multistakeholder fashion that will be bottom-up rather than top-down and will be fueled by innovation, agility and low costs,” Halamka said. “We’ll make it a ‘do-ocracy.’ That is, you will be rewarded for actually achieving results. What ended up happening is that suddenly the implementation guides, instead of thousands of pages of complicated technology specs, became ten pages of simple technology specs. You saw existing Internet standards being leveraged for health care. You saw open source. You saw intellectual property freedom. And you suddenly actually got the vendors a little bit on the run, because now they were having to open their systems and enable platforms and they could no longer charge obscene amounts of money for simple tasks.”

Among those tasks: downloading your own health data from a provider. That was the topic at a gathering of Internet thought leaders hosted by the Markle Foundation in New York in January 2010.33 Participants included Todd Park, then the HHS CTO; Dr. Peter Levin, the CTO of Veterans Affairs, and a former entrepreneur in the biotech and semiconductor sectors; former Google Health leader Adam Bosworth; and Professor Clay Shirky, an author and Internet scholar.

During brainstorming, the meeting’s participants decided to use a simple plain-text (ASCII) format that, if sensibly organized, both machines and people could read with ease. Park and Levin agreed that Veterans Affairs would serve as the test case, with Medicare to follow shortly thereafter. The VA had a head start: it had the largest consolidated network of hospitals, the largest electronic health records system, and nearly a million users on its personal health records platform. “There was certainly no technical challenge to doing this,” Levin said.
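To make that design decision concrete, here is a minimal sketch of what a sensibly organized plain-text record might look like and how a program could read it. The section names, layout, and parse_sections helper are hypothetical illustrations of the idea, not the actual Blue Button file specification.

```python
# Hypothetical layout of a plain-text (ASCII) personal health record:
# labeled sections a person can read directly and a program can split on.
SAMPLE_RECORD = """\
-------- DEMOGRAPHICS --------
Name: Jane Q. Veteran
Date of Birth: 1975-04-12
-------- MEDICATIONS --------
Lisinopril 10 mg daily
Metformin 500 mg twice daily
-------- ALLERGIES --------
Penicillin
"""

def parse_sections(text: str) -> dict[str, list[str]]:
    """Split the record into {section name: list of entry lines}."""
    sections: dict[str, list[str]] = {}
    current = None
    for line in text.splitlines():
        if line.startswith("--------"):
            # A divider line names the section that follows.
            current = line.strip("- ").title()
            sections[current] = []
        elif current and line.strip():
            sections[current].append(line.strip())
    return sections

if __name__ == "__main__":
    for name, entries in parse_sections(SAMPLE_RECORD).items():
        print(name, "->", entries)
```

The same file a veteran can open and read in any text editor can also be split into structured pieces by a few lines of code, which is the property the participants were after.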

President Obama took on some of the marketing challenge himself. With everything proceeding smoothly in production, he shared the news with the Disabled American Veterans on August 2, 2010: “Today, I can announce that for the first time ever, veterans will be able to go to the VA website, click on a simple ‘blue button,’ and download or print your personal health records so that you have them when you need them and you can share them with your doctors outside of the VA. That’s happening this fall.”

But how many of the roughly 1.5 million veterans would use it? When asked by Eric Shinseki, the VA Secretary, Levin pulled a figure out of the air: 25,000 in the first year. “But I had no idea whether they would really want this,” Levin said. “I figured the worst case scenario is, I’m out on Pennsylvania Avenue with a clipboard, getting people to sign up.”

No clipboard required. By October 2012, Blue Button would serve more than 1 million downloads across all of its federal agency partners—the VA, the Department of Defense (serving active duty military), and the Medicare program—under the basic premise that any American should be able to download the information the government keeps on them in machine-readable form. Its popularity in the public sector inspired adoption in the private sector, and Dr. Levin would fill a role there, too, moving from spearheading the development of the service to convening a volunteer network of private sector developers. A coalition of health insurance companies, led by UnitedHealthcare and Aetna, would offer Blue Button services for their own members as well. In September 2013, the Office of the National Coordinator (ONC) for Health IT touted another milestone: 500 partner organizations had pledged consumer access through Blue Button, with the goal of reaching 100 million Americans, nearly one-third of the population.34 Further, new privacy regulations, updating those originally mandated by the Health Insurance Portability and Accountability Act of 1996 (HIPAA), required any medical provider with electronic records on patients to offer those records back to the patients in electronic form for a nominal fee.35 All of this means the country is getting closer to honoring President Bush’s long-ago promise of providing personal health records to everyone.

Those measures, while important to transform the health care system, are not sufficient. So in a parallel effort, beginning in 2010, ONC would apply the “light federal approach” to how patients and caregivers could safely and securely share health information over the Internet.

That effort began in response to Dr. Floyd “Tripp” Bradd’s testimony to the HITSC about a patient who was moving to Arizona and wanted his records forwarded electronically. With the patient’s consent, Dr. Bradd was able to e-mail the records in a format that the other provider—using the same software package—could import and read. But conventional e-mail isn’t that secure: the data isn’t encrypted, and there is no way to prove the identity of the sender or receiver. Dr. Bradd challenged us to develop a safer, more secure version of e-mail.
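The Direct specification that eventually emerged layers standard signing and encryption (S/MIME) over ordinary e-mail transport. As a rough illustration of the two properties Dr. Bradd was asking for (proof of the sender’s identity, and confidentiality that only the intended recipient can undo), here is a minimal Python sketch using the cryptography library. It generates keys on the fly rather than using the certificates a real Direct deployment obtains through a trust anchor, and it is not the Direct wire format.

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# In a real deployment these keys would come from X.509 certificates
# issued through a trust anchor; here they are generated for illustration.
sender_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

record = b"Patient: J. Doe\nAllergies: penicillin\nReferral notes attached.\n"

# 1. The sender signs the record, so the recipient can verify who sent it.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = sender_key.sign(record, pss, hashes.SHA256())

# 2. The record is encrypted with a fresh symmetric key, and that key is
#    wrapped with the recipient's public key (hybrid encryption).
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(record)
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = recipient_key.public_key().encrypt(session_key, oaep)

# 3. Only the recipient can unwrap the key and decrypt; verifying the
#    signature proves the message really came from the sender.
plaintext = Fernet(recipient_key.decrypt(wrapped_key, oaep)).decrypt(ciphertext)
sender_key.public_key().verify(signature, plaintext, pss, hashes.SHA256())
assert plaintext == record  # readable only by the intended recipient
```

Ordinary e-mail provides neither step 1 nor step 2; wrapping both around the same message, and agreeing on how to do so, was the gap the Direct Project set out to fill.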

Two public servants teamed up to provide it. Doug Fridsma was already working inside the administration, leading standards work at the ONC. Arien Malec was taking a leave of absence from his executive position at RelayHealth to join the government on a temporary basis.

“If there are existing standards, we say, ‘our job is done, we don’t have to do anything else,’” Fridsma said. “If people say, ‘We’ve got a big hole; this is a problem because everybody is doing it in a whole bunch of different ways,’ then we can use our convening authority to say, ‘Success is filling this gap in the standards in such a way that it works with these other things that people care about, and meets all of our policy objectives as well, but you guys have to tell us whether it solves the problem or not.’ At the end of the day, the government’s role is to convene.”

Together, Fridsma and Malec led a new initiative called the Direct Project, aimed at developing a safe, secure, and cost-effective method to “push” health information between providers, patients, and other stakeholders. By standardizing interfaces, the duo hoped to achieve a 90 percent reduction in the cost of exchanging laboratory results, to name just one example.

They recognized that the road to interoperability required an incremental, thoughtful approach, with Fridsma noting that, just as people use more than one means of personal communication (cell phones, landlines, and social media, to name the most popular), “We shouldn’t expect that within health care, which is far more complicated, we should have a single way of communicating either.” Further, they recognized the critical need for all sorts of outside input. Rather than call upon contractors to construct a monolithic system that the community would accept, reject, or ignore, as had been done in the United Kingdom a few years earlier, it was more constructive to work with the community toward standardizing the fundamental building blocks—meaning, structure, transport, security, and services—for electronic health record communications. “We turned the whole project inside out,” Fridsma said. “We created government as a platform, if you will, a way to engage the public and let them tell us what was important and then support them in accelerating their consensus to a common solution.”

This engagement approach—consistent with what Tim O’Reilly had described as Government 2.0—also squared perfectly with lessons Malec had learned in the private sector, particularly in startup companies, “that participation drives innovation and that, in the open source software, the more you share, the more you get from the community.” Malec also believed that, since most health care is provided by the private sector and it would be writing the integrated software that pulls everything together, it needed to be part of the process. That way, the private sector would be more likely to own the responsibility for creating the solution.

“Instead of having an approach of telling people what to do, it just seemed like the right thing would be to set a challenge, set a vision, and ask people to get involved,” Malec said. “We set a vision, a view of the world as it would look past the innovation, but that outcome would describe what happens and not how it happens. Then we got people together, gave them this big inspiring vision, and said we want you to be the ones to figure out how and to own how.”

Malec had anticipated that perhaps a handful of hypermotivated organizations would participate in the Direct Project. That was, until he attended the annual Healthcare Information Management Systems Society meeting and got a chance to introduce the idea to tens of thousands of health IT stakeholders. “The response was overwhelming,” he said. Within a month, 90 organizations wanted to participate—and thanks to modern collaboration technologies, they could do so with just an Internet connection and the occasional conference call.

The Direct team set up a series of wiki sites without restrictions on participation or observation. Then it elevated some participants, those with special interest in the implementation of a solution, to the status of Committed Members. Before and throughout the process, the government promoted some basic discussion guidelines (discouraging members from conducting side e-mail conversations) and even some policy parameters (“guard rails,” in Fridsma’s parlance) that were meant to keep the community moving forward toward solutions that, at the least, sufficiently conformed with technological and security realities to have a shot. The government was vocal, but not tyrannical. “Ultimately, we have just a single vote at the table,” Fridsma said.

Still, the government’s steering role was significant. Even when conversations became contentious, Malec kept all parties at the table. As Sean Nolan of Microsoft noted, “Arien’s personality was key. He maintained the respect of a bunch of people that generally don’t respect each other.”

In Malec’s view, plenty was accomplished.

“In one year, we had progressed from nothing to ‘running code,’ real-world testing, and commitment,” Malec said, noting that 70 organizations implemented the work of the Direct Project in their products. “We had a model for every provider in the country sending secure information to every other provider in the country and every patient in the country; we had a path, a real path, to make that happen. We had enabling software that any software developer could incorporate into their products to smooth the path. And we had the process for process. We had a template for how to do this in the federal government—this new style of interaction between the federal government and private sector.”

They had accomplished all of this at a very low marginal cost, and their work led to products that were relatively inexpensive, with a growing competitive marketplace offering unlimited Direct messaging services for less than $20 a month.

Malec returned to RelayHealth in the summer of 2011, nine months later than he had planned, and not due to any disillusionment. Rather, his time in government had reinforced his belief in its unique position to see problems from coast to coast and its capability, through a handshake with—and handoff to—the private sector, to effect positive change. A little push, a lot of support. “That’s a trick that I think government can replicate,” Malec said. “The risk of doing the wrong thing, of trying too much and getting unintended consequences, is so small—and the risk of doing the right thing and creating the right outcome is so huge—that it’s a really useful thing that government can do. And the second big idea is that when the relationship is participatory, when the relationship is open, it really does foster a sense that the government isn’t a thing, it’s what we do together.”

The work went on after Malec’s departure. In August 2012, the ONC announced that the Direct standards would be required for the next stage of the health IT incentive program taking effect in 2014. In response, members of the Direct community took the ball, launching an independent, self-regulatory nonprofit called DirectTrust to ensure the agreed-upon standards are deployed in health IT. Essentially, the private sector was picking up where the government left off.

As of this writing, the open government approach utilized for Direct and other health IT standards has attracted more than 500 people representing over 300 organizations, working on one or more of 10 active programs.36 This will give a head start to new startups seeking to compete in the health IT industry.

Will the interoperability standards adopted for 2014 deliver on their promise? Will future health IT regulations result in the ultimate objective: a true “plug and play” health IT ecosystem where innovative new products and services could be distributed as easily as apps on the iPhone app store, perhaps through an API standard?

Too soon to say. But, in May 2013, milestones were met—more than 50 percent of America’s doctors and more than 80 percent of hospitals were participating in the meaningful use incentive program.37 That represented a doubling and quadrupling, respectively, from previous participation. It appears the industry has reached a tipping point that will position the care delivery system for its most difficult chapter: achieving more with less.

“We have done more the past two years than in the previous twenty,” Halamka said.

Which merely means that more can be done.

“Standards aren’t standards because we say so,” Fridsma said. “Standards are standards because people use them. The only standard you don’t have to maintain over time is the standard you never use. Standards, good ones anyway, will continue to grow and change to accommodate new use cases and new innovations that are out there.”

Sometimes, successful innovations come at the intersection of ongoing and evolving initiatives. In this case, some combination of Blue Button and the Direct Project could eliminate extra steps for patients and make them more likely to participate. Rather than being required to download their information onto a computer after every medical development and then determine which parts to share with which caregivers, what if patients could simply set preferences at various points of care (whether the emergency room or the pharmacy) that would automatically route medical information to the other places it should go? For instance, it is useful for a primary care physician to know not only that a patient was admitted to a hospital the previous night, but also every test that was run, every diagnosis that was made, and every medication that was prescribed while the patient was there. It is also useful for the physician to know whether the patient ever picked up a prescription. This sort of knowledge is especially valuable to address one of the health care system’s most troubling inefficiencies: nearly one in five Medicare patients discharged from a hospital is readmitted within one month, at a cost of nearly $17 billion to the government, often due to inadequate patient follow-through, especially when it comes to taking the proper medication.

As was the case with energy data and Green Button, all this progress in the field of health information records could not have occurred without an initial enabling piece—the development of a standardized method for health IT applications to send and receive Blue Button files on behalf of patients, to and from anyone in the health care ecosystem, with security and privacy built in. That Auto Blue Button Initiative effort, again largely driven by private sector entities, would eventually be launched as Blue Button + in February of 2013.38 It would enable consumers to do everything from printing a physical copy of their records to sharing them with a third-party application.

“All of this data more or less exists in the market,” Levin said. “But getting all of it in one place, making sure your prescriptions are in the same place as your immunizations, in the same place as your radiologist images, in the same place as the last time you took antibiotics because you had a sore throat, and with all of those things now able to flow without friction between places of trust, that’s what the Auto Blue Button Initiative is all about.”

Technical standards aren’t a sufficiently sexy topic to sneak their way into the daily senior staff briefings held in the West Wing, nor are they common in cocktail party conversation. Yet, someday, they might be. Look at how they were applied after President Obama announced the official end of combat operations in Iraq on August 5, 2011.

Service members, who had spent months or years in an unforgiving conflict, would be coming home. They would need jobs. The President, citing the shockingly high unemployment rate among young veterans, called upon the private sector to provide those opportunities. Corporate executives enthusiastically raised their hands, eager to assist. But this was about more than good will or gracious gestures. The veterans would need direction in order to navigate their way through all that might be available via the competitive and expansive online job listings industry. They needed to have some idea of which listings had been set aside for them.

There were essentially two options to assist them. We could create a single website, through government funding, where every willing employer could post listings. That would have been the default. It would also have been redundant and burdensome for employers, most of which already had preferred methods for posting jobs online. Alternatively, we could engage the job listings industry in designing and deploying a standardized method to “tag” job listings associated with a veteran hiring commitment, regardless of the website they chose for those postings.

We chose the latter and started our 90-day sprint for a Veterans Day launch by calling upon the experts at Schema.org, a voluntary collaboration among Google, Microsoft, and Yahoo that defines shared vocabularies for tagging web pages in a way that helps users find the most relevant and specific information.39 Take recipes. Schema.org includes a recipe standard that reflects a consensus among many in the online recipes industry on the sort of information that is most useful to a cook, including cuisine type, calorie count, ingredients, and instructions.40 Any website is free to add the schema to its pages, enhancing its search results on the participating search engines and inspiring customers to click through and learn more.

This sounded like something we could simply and seamlessly apply to our project. I engaged the private sector online job listings industry to help design the schema. Figuring that veterans would need to know about job descriptions, locations, and compensation, among other details, we gathered further input from a variety of stakeholders before publishing the agreed-upon standard at schema.org in October 2011. That would give us a month to encourage the job listings industry to begin tagging relevant job listings with the standard before a Presidential announcement on Veterans Day.
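For a sense of what such tagging looks like on a page, here is a hypothetical job listing expressed with the schema.org JobPosting vocabulary, built and printed as JSON-LD from Python. The listing details are invented, and the exact properties the Veterans Job Bank effort settled on may differ from those shown here.

```python
import json

# Hypothetical listing marked up with the schema.org JobPosting vocabulary.
job_posting = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Diesel Mechanic",
    "description": "Maintain and repair a regional freight fleet.",
    "hiringOrganization": {"@type": "Organization", "name": "Example Freight Co."},
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress", "postalCode": "21201"},
    },
    "baseSalary": {"@type": "MonetaryAmount", "currency": "USD", "value": 52000},
    # The property signaling that this listing carries a veteran hiring commitment.
    "specialCommitments": "VeteranCommit",
}

# A site would embed this JSON in a <script type="application/ld+json"> tag
# so participating search engines can surface it in filtered results.
print(json.dumps(job_posting, indent=2))
```

The point of the exercise was precisely this lightness: an employer keeps posting jobs wherever it already does, adds a few machine-readable tags, and any search engine that understands the shared vocabulary can pick the listing up.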

On November 7, 2011, President Obama announced the launch of the Veterans Job Bank, a Google-customized search engine—built at no cost to the taxpayer in collaboration with the Department of Defense and Veterans Affairs—that any veteran could use to find the tagged jobs that were right for them, including the ability to narrow results based on military occupation code or ZIP code.41 On that day, several job listings vendors, including Monster.com, Simply Hired, and LinkedIn, publicly pledged to adopt the schema.org standard if they had not already done so. Simply Hired, for example, had already kept track of nearly 500,000 open jobs associated with veteran hiring commitments and all of them were tagged on day one. Within a year, the job listings sites would feature more than one million veteran-tagged jobs, and shortly after that, Google would devote a day to highlighting the Veterans Job Bank on its most precious asset, the search home page.

The next step in this process was bringing the Job Bank to veterans, rather than forcing them to find it on a government website. Several technology firms volunteered to build free apps to help connect veterans with the Bank, with the reward of being honored at the Joining Forces Veterans App Showcase, featuring Dr. Jill Biden, the Vice President’s wife. Originally, Twilio was not among the participating firms. Yet, upon learning about the event just one day beforehand, the San Francisco cloud communications company thought it could contribute. Its CEO, Jeff Lawson, issued a “lightning challenge” to his more than 75,000 partner developers: build a more veteran-friendly interface for searching these tagged jobs, accessible on smartphones and using Twilio’s technology.42 He set the deadline for 8 a.m., Pacific time, the following morning, five hours before the event.

Tony Webster, a 25-year-old web developer, just happened to be on Twitter when the challenge crossed his feed. He had been interested in open government for a while and, although he had never served in the military, had close friends who had returned from Iraq and Afghanistan. He had heard horror stories about their job searches due, in part, to the reluctance of employers—for one reason or another—to hire veterans. Further, he had a good understanding of Twilio. So he headed to a café, intent on quickly creating something unique before its doors closed at midnight. “I wasn’t about to be up all night, because I actually had a real job, too,” Webster said. “I whipped up something in four hours. Then I went home, ramped it up, and got some sleep.”

His conception of HeroJobs.org wowed the Twilio judges by flipping the model. After entering their Military Occupational Specialty codes and ZIP codes, veterans received text messages every morning with the appropriate veteran-committed openings in their area. Webster had no idea whether anyone would actually use HeroJobs.org, but many veterans did, and veterans’ relatives told him they appreciated the ease of the service.
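Webster’s code isn’t public, but the mechanics he describes are simple enough to sketch. The following is a hypothetical approximation using the Twilio Python client to text a day’s matching openings to a subscriber; the find_tagged_jobs helper, the environment variable names, and the sample data are all stand-ins, not his implementation.

```python
# Illustrative sketch only: text each subscriber a morning digest of
# veteran-committed openings matching their MOS code and ZIP code.
import os
from twilio.rest import Client

def find_tagged_jobs(mos_code: str, zip_code: str) -> list[str]:
    """Stand-in for a real query against veteran-tagged job listings."""
    return [f"Diesel Mechanic near {zip_code} (matches MOS {mos_code})"]

def send_daily_digest(subscriber: dict) -> None:
    client = Client(os.environ["TWILIO_ACCOUNT_SID"],
                    os.environ["TWILIO_AUTH_TOKEN"])
    jobs = find_tagged_jobs(subscriber["mos_code"], subscriber["zip_code"])
    body = "Today's veteran-committed openings:\n" + "\n".join(jobs)
    client.messages.create(
        to=subscriber["phone"],             # the veteran's mobile number
        from_=os.environ["TWILIO_NUMBER"],  # a Twilio-provisioned number
        body=body[:1600],                   # stay within SMS length limits
    )

if __name__ == "__main__":
    send_daily_digest({"mos_code": "91B", "zip_code": "21201",
                       "phone": "+15555550123"})
```

The appeal of the design is that the veteran does nothing after signing up: the standardized tags do the matching, and the phone in their pocket does the rest.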

His application was designed just for the competition, not as a full-running site, and eventually it faded out. So, too, did many of the demonstrated apps in the showcase. But firms like Google and LinkedIn have invested further. Google launched VetNet, a portal that builds valuable services, such as networking courses and Google Hangout video discussions, on top of the Job Bank. LinkedIn would develop its own veterans page, a discount offer for those users (a free LinkedIn premium account, a $99 value), and a more direct path for employers to connect with the 1 million veterans already on the LinkedIn platform.

Common to all these examples is the notion of a more innovative state. That starts with smart government: government that identifies problems, convenes the interested parties and innovative thinkers, and then empowers them to address the issue. Together, these initiatives set a rather high standard for those that follow.