[ CHAPTER 3 ]

Adoption Chain Risk: Seeing All the Customers Before Your End Consumer

A range of intermediaries stand between you and your end customer: the distributor who needs to agree to bring your product to market, the retailer who needs to agree to showcase it, and the salesperson who needs to agree to sell it. Your success depends on each of these partners adopting your innovation and seeing the value it will create for them. If any one of these partners is not on board, you will never be able to reach your end customer. When does the best product lose? When the consumer doesn’t have a chance to choose it.

In this chapter we will examine Adoption Chain Risk, the extent to which partners will need to adopt your innovation before end consumers have a chance to assess the full value proposition. Here our question shifts from whether partners can deliver the required co-innovations to whether they will see the value proposition as beneficial not only to the end consumer but to themselves as well.

We all know that a natural tension exists between those selling a product and their target buyers. At its root is a misunderstanding over the notion of value—the balance between costs and benefits. Although both innovators and consumers use the terms “cost” and “benefit” to describe the way they think about value, they think about these terms in very different ways.

Consider Microsoft’s Office 2007 suite. When launched, the vast majority of enterprise users agreed that Office 2007 was superior to its predecessor, Office 2003. It had better code quality, was more reliable, and offered more features. According to Jeff Raikes, president of the Microsoft Business Division, “The 2007 Microsoft Office system [release] completes the most significant improvements to the products in more than a decade.” And the price difference was zero: enterprise customers pay Microsoft a fixed fee, per user, per month, to lease their Office software, independent of which version they use. Firms would not need to pay Microsoft any premium to use Office 2007 instead of Office 2003. A better product for the same price: this looks like an obvious win. But decision makers at most companies large and small chose to stick with the clearly inferior Office 2003. Why?

(Total) Cost vs. (Relative) Benefit

Innovators and customers view “benefits” and “costs” from very different perspectives. Missing this difference is a recipe for disaster. Innovators think about benefits in terms of what their product actually provides—the absolute benefit delivered to the customer. But customers think about benefits in terms of added value—the relative benefit delivered by the product compared to the available alternatives. Each group also has a different understanding of costs: while innovators tend to think of the price they will charge for their innovation as the determinant of customer cost, customers conceive of cost in terms of that price plus all the other changes they need to undertake in order to use the innovation (beyond the initial outlay, the cost of retraining, equipment upgrades, etc.). While innovators tend to focus on delivering an offer whose absolute benefits exceed the purchase price, adoption happens only if the customer sees a clear surplus: that is, the relative benefits must exceed the total cost. These differences can lead to an assessment gap. And this gap can lead to disappointment.
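The customer’s calculus can be sketched in a few lines of Python. The numbers are purely illustrative (an Office 2007–style upgrade with a zero price but nonzero switching costs), and the cost categories are hypothetical, but the structure of the decision is the one described above: adoption requires relative benefit to exceed total cost, not merely absolute benefit to exceed price.

```python
# Illustrative sketch with hypothetical numbers: the customer adopts only
# when relative benefit exceeds TOTAL cost (price plus switching costs),
# not when absolute benefit exceeds the price alone.

def customer_surplus(relative_benefit, price, switching_costs):
    """Surplus as the customer sees it: relative benefit minus total cost."""
    total_cost = price + sum(switching_costs.values())
    return relative_benefit - total_cost

# A "free" upgrade: no price premium, but real nonprice costs.
surplus = customer_surplus(
    relative_benefit=40,  # value of the new version vs. the status quo
    price=0,              # no price premium charged by the innovator
    switching_costs={"retraining": 25, "hardware": 15, "perceived_risk": 10},
)
print(surplus)  # -10: negative surplus, so the customer rationally declines
```

Note that the innovator’s view (absolute benefit versus price) would score this same offer as a clear win; the gap between the two calculations is the assessment gap.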

Office 2007 adoption lagged because, even if the price was free, the total cost was high—too high for most customers to justify. If you are a CIO considering an Office upgrade, your notion of cost goes well beyond price to include retraining, hardware upgrades, changeover time, and all the unknowns that inevitably come with making a company-wide change. While the sales rep brags that Office 2007 is 99.9 percent compatible with 2003, what keeps the customer up at night is that last 0.1 percent. Not knowing whether the incompatibility affects the ninth iteration of Helvetica font or the critical invoicing macro changes the attractiveness of the offer dramatically (perceived risk is an important contributor to nonprice cost). And even though none of these costs show up in the innovator’s revenues, they are all charged against the innovation’s benefits. Frustratingly for Microsoft, Office 2007’s total costs exceeded its relative benefits for most customers.

Figure 3.1: Innovators and consumers have different views on what constitutes “benefits” and “costs.”

In the end, the key driver of adoption for Office 2007 wasn’t a change in its relative benefit, but rather a reduction in its total cost that accrued over time. As noncompatibility issues and bugs were gradually resolved, and as companies followed the natural replacement cycles for computer hardware, it became easier to justify the 2007 upgrade . . . until the next version of Office was launched in 2010, and the entire cycle started anew.

Superabrasive Grinding Wheels

Take a look around your home. It’s likely that you have abrasives to thank for the smooth shape of your sink’s faucet, the sharp edge of your cutlery, and the tiny components that run your computer. Abrasives are the category of hard materials used to shape and polish softer objects in a variety of manufacturing processes. They constitute a mature industrial market with revenues of close to $4 billion in 2010.

Figure 3.2: A grinding wheel in use. (Photo compliments of Saint-Gobain Abrasives.)

Superabrasive grinding wheels were an innovation born when GE introduced synthetic diamonds in 1955. At the time, industry leaders agreed that superabrasives were going to transform the industry. They were much harder than their conventional brethren (aluminum oxide, silicon carbide) and therefore offered many benefits: they were more durable, required less frequent dressing (the process by which the outer layer of the wheel maintains its coarseness and its trueness), could be used at higher speeds, allowed higher production rates, and could enable better products with finer finishes and tighter tolerances. Superabrasives were more expensive, with prices up to thirty times more than those of conventional wheels; but they lasted up to a hundred times longer. With their longer lives, their higher production rates, and their decreased need for human intervention, superabrasives would offer greater overall efficiency and increased performance. Expectations were high.

Despite their grand promise, however, today superabrasives make up only approximately 15 percent of the abrasives market. This innovation has had nearly fifty years to make good on initial excitement and capture market share, yet it has reached a plateau. What went wrong?

The relative benefit of superabrasive grinding wheels was real and high. But the total costs were high as well. They were not just more expensive to buy; they imposed new costs in their use. To take advantage of higher grinding speeds, manufacturers would need to invest in higher-speed grinders (the powered machine tools used for grinding). The greater hardness of the wheel meant that the machines used to dress them needed to change too. And because higher speeds required greater precision in the way in which the wheels were dressed and balanced, it meant staff needed to be retrained and upskilled.

The balance of total cost/relative benefit depended on the application. Where tight tolerance requirements combined with long production runs, such as selected manufacturing steps within the aerospace and automotive sectors, total surplus was positive, and superabrasive grinding wheels made deep market inroads. But outside these niches, taking into consideration the total cost of the value proposition, most of the market opted to stay with traditional grinding wheels.

At 15 percent market share, were superabrasive grinding wheels a successful innovation? It depends on your viewpoint. Success is always assessed relative to expectations. For early enthusiasts, superabrasives’ niche market is a disappointment. After all, what promised to be a revolution ended up being a small but healthy advancement in the industry. However, a number of firms developed a balanced view of the relative benefits and total cost of superabrasive wheels. They designed their organizations, and allocated their resources, with the niche market as an explicit target. For them, 15 percent market penetration is a high-margin cause for celebration.

From Adoption to Adoption Chains

Many innovations rely on a chain of intermediaries that stand between them and their end customer. Consider your breakfast choices. Before you get to decide whether you like the new Strawberry Delight Cheeriflakes that the marketing folks at Z Cereals are trying to sell you, many other actors need to get on board. The plant manager needs to decide when and how to adjust production to accommodate the new recipe; the distributor needs to agree to accommodate a more complex product range; the salespeople need to promote it; the grocer needs to agree to allocate shelf space to the new offer (which means taking away space from others); and everyone needs to converge on prices and terms. Together they comprise the Cheeriflakes adoption chain. And if any one of them decides against supporting the new offer, consumers will never have the chance to make the purchase. And company Z will never have the chance to make the sale.

Once products are established, these handoffs are seamless (indeed, achieving this seamlessness is a primary element of what it means to be established). But new offers can disturb the existing routines. When adoption chains are fragile, outcomes are uncertain.

In today’s interdependent world, the successful innovator must treat each partner as a customer even if they are not in a direct business relationship.

Whereas the logic of co-innovation is one of multiplication (not averages), adoption chains follow a logic of minimums (not net surplus). Consider the two innovation proposals, A and B, illustrated in figure 3.3. Both require that your innovation pass through two intermediaries before reaching the end customer: the distributor who sells the product to a retailer, who then sells the product to the end customer. Innovation A creates high value for the innovator (it is highly profitable, with surplus of +4), high value for the distributor (high margins and low handling costs, with surplus of +3), slightly negative value for the retailer (higher up-front costs, retraining and after-sales service headaches, despite slightly higher margins, with a deficit of –1), and very high value for the end customer (surplus of +5). Innovation B creates positive, but low, surplus for everyone in the chain (surplus of +1 for each of the four actors). The net system surplus created by innovation A is 11 (4 + 3 – 1 + 5). The net system surplus created by innovation B is 4 (1 + 1 + 1 + 1). Which one should you bet on?

We have all been trained to “focus on the customer,” to “listen to the voice of the customer,” to try our utmost to “delight the customer.” But as the adoption chain makes explicit, we rarely have just one customer. Which customer in the chain is most important? All of them! Each and every intermediary that is part of the ecosystem needs to see surplus from adopting the innovation. A single instance of rejection is enough to break the entire chain.

The logic of adoption chains dictates that innovation A, despite the far higher value it creates for the end customer (+5 vs. +1), and the higher net surplus that it creates for the chain as a whole (+11 vs. +4), will fail. It will fail, not because the end customer won’t prefer it, but because the end customer will never have the chance to choose it. As long as the retailer is worse off with innovation A than with its current alternative, it will be a broken link in the adoption chain. Ironically, despite its lower value creation, innovation B will sail through the adoption chain. A moderate, but attainable, win is infinitely more valuable than a huge, but hypothetical, victory.
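The contrast between the two decision rules can be made concrete with the figure 3.3 numbers. This short sketch (illustrative only) shows why innovation A’s higher total surplus is irrelevant: the decisive statistic is the minimum surplus along the chain, and any negative link blocks adoption.

```python
# The adoption-chain logic of minimums, using the figure 3.3 numbers:
# an innovation reaches the end customer only if EVERY link in the chain
# sees positive surplus. Summing surplus across the chain is misleading.

def chain_adopts(surpluses):
    """True only if every actor along the chain has positive surplus."""
    return min(surpluses) > 0

# Order: innovator, distributor, retailer, end customer
innovation_a = [4, 3, -1, 5]
innovation_b = [1, 1, 1, 1]

print(sum(innovation_a), chain_adopts(innovation_a))  # 11 False: retailer blocks it
print(sum(innovation_b), chain_adopts(innovation_b))  # 4 True: modest but adopted
```

The sum rewards innovation A (11 versus 4); the minimum rule correctly predicts that only innovation B makes it to market.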

Figure 3.3: Surplus along the adoption chain for two innovations. Expect failure if any link is negative.

As was the case with co-innovation risk, however, the very act of recognizing the sources of adoption chain risk can suggest avenues for addressing it. Rather than abandon the effort, the wise innovator will revisit the plan for innovation A with an eye to eliminating the retailer’s deficit—by finding an alternative partner, an alternative path, or by reallocating surplus along the chain to put the retailer in positive surplus (for example, shifting the end consumer’s surplus from +5 to +3 to move the retailer from –1 to +1). These mitigation steps are themselves innovations to the structure of ecosystems. Risk can be managed proactively, but only if it is recognized in advance.

As clear as this may seem in theory, in practice it’s easy for innovators to overlook the incentives and motivations of the intermediaries to which they have no direct links. For this reason, it can be tempting to turn your gaze inward to issues you can better control. But ignoring your dependency on others will not prevent them from upending your innovation efforts.

The key is to ask the difficult questions from day one, unearthing what everyone in the adoption chain needs in order to push your innovation down the line. What are these difficult questions? They are the questions that probe for constraints and problems rather than for support and uncommitted agreement. Recall Michelin’s run-flat disaster from chapter 1. What if the company had asked the garages not, “Will you be willing to incorporate new machinery and retrain your staff to fix PAX tires?” but rather, “Will you be willing to incorporate new machinery and retrain your staff to fix PAX tires even before there are a substantial number of PAX tires on the road?” The answer would have been unpalatable. But it would have been critically useful. Michelin’s leadership would have had the opportunity to modify their strategy to figure out how to manage the garage adoption problem in advance of the PAX launch, instead of scrambling and improvising when it was already too late.

If the innovators at Michelin had truly understood the nature of their adoption chain challenges, then they might have approached the market in a very different way. For example, one option (among others) could have been a multiyear exclusivity to a single carmaker. This would increase the carmaker’s differentiation advantage compared with its rivals in exchange for a strong guarantee that their dealerships would all install repair equipment in their service garages within the first year of launch. With the equipment installed, these dealerships could service PAX tires on any make of car, such that when exclusivity ended, a service infrastructure would be in place. In chapter 8 we will explore such strategies under the heading of establishing a minimum viable ecosystem. For innovators who believe they have a great product, as Michelin did with PAX, proactively limiting the size of early success can be a challenging strategy to embrace. But the disaster of launching without having addressed adoption chain problems is far worse. The only way such an option can be debated seriously is if you begin your discussion with a wide-lens view of your ecosystem.

Adoption chain risk is a product of the tug-of-war between innovation and the status quo. The challenge is to convince critical partners that there is positive value for themselves in joining your efforts when they feel they are doing just fine as is. Sometimes this is a matter of communication. But as we will see in the following example, often it requires innovating the innovation ecosystem itself.

How Digital Cinema Turned a Minus into a Plus

In the late 1990s, Hollywood’s dream factory had a dream. Since its earliest days, stars—from Charlie Chaplin to Cary Grant to Marilyn Monroe to Julia Roberts to Russell Crowe—were all projected into our imaginations through the same technology: celluloid film. Digital technology was enabling new magic in almost every stage of moviemaking (special effects, editing, sound), but movie projection remained stuck in the industry’s analog standard for over half a century. While the click-clack sound of spinning reels may evoke nostalgic smiles among audiences, for the studios the need to print, ship, and retrieve bulky canisters of physical film was both costly and inefficient. And in the midst of a fertile technology boom, a new way to distribute, project, and enjoy movies finally emerged in the form of digital cinema.

As analog formats throughout the media world were falling away, the transition to digital cinema seemed a certainty. Building on the foundations already laid by the development of the laser disc and, more successfully, the DVD, the technology path to digital was clear. The value proposition was unambiguous: higher-resolution picture quality, better protection from piracy, flexibility of programming, the potential for 3-D screening, and—for the studios—the elimination of costly film prints. Bringing this exciting new technology to the masses required technological advances. Most crucial was the development of the digital projectors themselves. Happily for the studios, the first commercial DLP (Digital Light Processing) projectors became commercially available as early as 1996.

However, as we saw in the case of 3G in chapter 2, the mere availability of an innovation is not enough for success: often, a multitude of co-innovation challenges need to be overcome in order for the value proposition to become a reality. The co-innovations required for digital cinema’s progress included analog-to-digital film conversion and data transfer and storage capabilities. These were ready too. The 1989 development of the digital telecine scanner, which rapidly converts film into a digital format, meant movies could be shot on film yet projected digitally. This was a great boon to the studios because it allowed them to avoid the challenge of convincing directors and production staff to adopt new equipment and techniques. Storage and delivery capabilities also fell in line: hard drives at the time were capable of storing digital films if they were split over several drives, and content could be efficiently distributed over the Internet through the newly developed high-speed T3 lines. By 1999, these key elements of the digital cinema projection system were ready, and moviegoers were treated to digital cinema for the first time when Star Wars: Episode I—The Phantom Menace was screened in Los Angeles and New York.

In February 1999, the New York Times heralded “Digital Projectors Could Bring Drastic Changes to Movie Industry.” But unfortunately for the studios, despite the viability and availability of this miraculous system, and its potential to save them millions in costs and add millions in revenues, digital cinema would spend the next seven years in limbo. Yet, while digital projectors were used in less than 5 percent of U.S. cinema screens by the end of 2006, they were used in nearly 40 percent by the end of 2010. Why did it take nearly a decade for this appealing new technology to reach more than a handful of cinephiles? And what happened to finally speed its ascent? The answer lies in the adoption chain.

Who Stands to Gain?

In the digital cinema ecosystem, there were several key players who had to adopt the technology in order for it to reach the general moviegoer.

The big movie studios certainly saw value in supporting and advancing digital cinema. After all, leaving celluloid behind also meant avoiding the $1 billion spent annually on the printing and shipping costs associated with traditional film. The cost of producing one film print falls between $2,000 and $3,000 per print. So, at an average of roughly $2,500 per print, a typical U.S. nationwide release shown on about 3,000 screens costs the studio some $7.5 million in printing alone.

Digital cinema also opened up the possibility of releasing the film in what insiders call a “day-and-date” approach. Traditionally, movies were rolled out in a staggered fashion, which allowed the studios to control and minimize the costs of moving expensive, heavy prints around the globe. With digital cinema, films could be released everywhere simultaneously, offering an alternative to pirated films for (over-) eager global audiences. In addition, digital film technology allowed for encryption, which would further help reduce, though not eliminate, film piracy, to which studios attributed a cost of more than $3 billion per year.

The distribution arm of the studios, as well as independent distributors, would also benefit from greater flexibility of tailoring the movie to a specific audience. Going digital meant that subtitles and multiple versions of trailers could be easily adjusted to better target a specific locale. Theater owners could operate more efficiently, with fewer staff required to operate and maintain the projection equipment. Of course, the suppliers of the projectors and equipment upgrades and service technicians for whom digital cinema provided a large new revenue stream saw the obvious value in the digital transition. And moviegoers would be winners too, enjoying an enhanced theater experience. Better resolution meant the images were brighter, crisper, and more engaging. “What do you think people notice the first time they see film and digital cinema projected side by side?” asked Jim Korris, the executive director of the Entertainment Technology Center at the University of Southern California. “It’s that people realize they’ve been looking at films that flicker their entire lives. Digital cinema is like looking out of a window. It’s rock steady.”

Figure 3.4: The digital cinema ecosystem in 2003.

So if everyone saw the benefit of adopting digital cinema, where was the problem? The answer lies with one player for whom the benefits were high, but not high enough to offset their total cost: the movie theaters. Imagine that it is 2003 and you are the owner of either an independent movie house or a chain of theaters and have been following the development of this new technology. Digital cinema is an exciting possibility: you could easily adjust which movies would be shown on which screen and thus take advantage of demand spikes for surprise hits; you’ve heard it would be possible to stream live sporting events and concerts, offering a new way to entice audiences; and there is even talk of mainstreaming 3-D movies, with higher ticket prices, but which could be screened only with digital projectors. The relative benefit is high.

But so is the total cost. Cinemas are relatively low-margin businesses. Historically, they make most of their profits not from ticket sales but from concessions. For the vast majority, the $70,000–$100,000 per screen conversion cost necessary to buy the digital projector and supporting hardware and software for an upgraded projection room was prohibitive. Moreover, digital projectors have a life span of only ten years, compared to thirty to forty years for traditional projectors. Add a lack of clarity about competing format standards that would take years to get settled, and it is easy to understand why a theater owner considering the ongoing maintenance costs of this new, uncertain technology, and weighing it against the familiar celluloid alternative, would hold off incorporating digital cinema. “Maybe we’ll do it next year . . . or the year after that.”

Director James Cameron, long a proponent of the promise of going digital, noted, “Digital cinema and 3-D open the door for filmmakers to mine completely new creative territory. It’s up to exhibitors, now, to adopt these new technologies on the display side, so that audiences have a reason to seek out the cinema and leave their computer and flat-screen TVs.” Easy for a director to say—for him, the relative benefits far exceeded the costs. Exhibitors saw it differently. “We can’t afford to put in expensive equipment,” said Ayron Pickerill, a small theater owner in Montana. “And if we don’t have it, and everything is digital, we’ll be out of business; that’s all there is to it.” Is it any wonder their preferred tactic was to delay?

In the early years of the new millennium, this attitude was the consensus among theater owners across the country, causing a crucial rupture in the adoption chain. Unless the costs and risks of adoption for cinema owners were lowered, mass audiences were unlikely to ever enjoy the experience—and studios the cost savings—of digital cinema.

Technology Standards Are Not Enough

In 2002, seven major motion picture studios—Disney, Fox, Metro-Goldwyn-Mayer (MGM), Paramount Pictures, Sony Pictures Entertainment, Universal Studios, and Warner Bros. Studios—came together to form Digital Cinema Initiatives, DCI. “We thought that if we left it up to the exhibitor it would never happen,” noted one Warner Bros. executive, Dan Fellman. The group’s purpose was to establish a set of specifications for digital cinema with the hope that industry-wide standards would reduce uncertainty and encourage adoption. The struggle took years of tests and trials. Julian Levin, DCI chairman and executive vice president of digital cinema at 20th Century Fox, put it mildly when he observed that the work was “taking a little longer than people had expected.” In July 2005, DCI published its specifications, finally resolving the issue of standards.

But even with these technology standards in place, theater owners were unwilling to take the digital cinema plunge. Another innovation would have to be introduced into the ecosystem in order to equalize adoption risk. Interestingly, in this technology-driven transition, the missing piece of the puzzle wasn’t a technical advancement but a financial one.

How an Innovative Financial Framework Saved Digital Cinema

Digital cinema promised big net benefits not just to the movie studios but to the aggregate ecosystem as well. The problem, however, was that the benefits weren’t distributed evenly. And because theaters were a critical link in the chain—meaning that there was no way to do without them or go around them—it would be impossible for digital cinema to succeed as a win-win-lose-win solution. The challenge was to find a win-win-win-win proposition—some way to share enough surplus with theaters to make them want to come on board.

With a clear recognition of digital cinema’s advantage to their bottom line, studio heads fought to get a solution in place before confidence in digital cinema’s potential began to fade. With the press calling the digital cinema arena “the Wild West,” and exhibitors far from a sure bet for digital adoption, something needed to happen. As National Association of Theater Owners president John Fithian put it: “They’ll either have their act together this year, or it’ll fall apart.”

The Virtual Print Fee Solution

In an industry in which the links between parties were nonexclusive (each studio would send the same movie to many theaters; each theater would screen movies from many studios), what was needed was an elegant way to enable studios to contribute to the theaters’ digital conversion cost in a way that all the parties regarded as fair.

Across the sector, the idea of a financing model in which the studios would contribute to the cost of converting to digital was being raised. The studios agreed: “The proportion of financing should be relatively proportional to the benefits enjoyed by each party,” said Chuck Goldwater, CEO of DCI. The National Association of Theater Owners voiced its concern that whatever form the financing plan took, it must be industry-wide, “supported by all major motion picture studios through one financial entity or a consortium of financial entities, and that all other motion picture studios willing to participate in the plan must be given the opportunity to do so.” This would avoid the nightmare of fragmentation, where different studios had divergent financial requirements and terms for screening their films. The challenge was not just agreeing to subsidize the theater owners’ participation, but finding the right way to bring them on board (and doing so without running afoul of antitrust concerns).

Enter VPF. The VPF (virtual print fee) program is a financing innovation that allows studios to subsidize the high cost of digital cinema adoption in theaters. It does so by introducing a new actor into the ecosystem—the digital theater integrator. Under the VPF model, instead of theaters being forced to cover the costs of equipment upgrades and maintenance, this new third-party integrator pays the initial outlay for the equipment and assists with technology integration and maintenance. The integrator’s profits come from a five- to ten-year lease-to-own arrangement with the theaters, which is subsidized by the VPF. For every movie that is projected in digital rather than analog form (thereby saving the studio thousands of dollars in avoided printing, shipping, and retrieving costs), the studio shares the benefit by paying a virtual print fee of approximately $1,000 to the digital integrator on behalf of the exhibitor. Over the life of the contract, the VPF arrangement can cover approximately 80 percent of the exhibitor’s conversion costs. At the end of the contract, the exhibitor retains the equipment, ensuring that digital cinema becomes industry standard.
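A rough sketch of the VPF economics makes clear why every party could say yes. The per-print and VPF figures and the conversion-cost range come from the chapter; the midpoint averaging and the implied booking count are back-of-the-envelope assumptions, not actual contract terms.

```python
# Back-of-the-envelope VPF economics (illustrative, not actual contract terms):
# for each digital booking the studio avoids a print cost and pays a smaller
# VPF to the integrator, so the subsidy leaves both sides better off.

AVG_PRINT_COST = 2_500    # avoided print/ship/retrieve cost per booking (midpoint)
VPF_PER_BOOKING = 1_000   # virtual print fee paid to the digital integrator
CONVERSION_COST = 85_000  # per-screen upgrade cost (midpoint of $70k-$100k)

studio_saving_per_booking = AVG_PRINT_COST - VPF_PER_BOOKING  # studio keeps $1,500

# Hypothetical: VPF-subsidized bookings needed to cover ~80% of one screen's upgrade
bookings_to_cover = 0.8 * CONVERSION_COST / VPF_PER_BOOKING

print(studio_saving_per_booking, bookings_to_cover)  # 1500 68.0
```

On these assumed figures, a few dozen digital bookings per screen are enough for VPF payments to fund the bulk of the conversion, while the studio still pockets most of its print savings on every booking.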

The VPF breakthrough emerged in 2005, ultimately allowing the digital cinema dream to become a reality. The year 2006 is widely considered to be the transition year for digital cinema, as theaters across the country began to gradually embrace digital cinema’s promise. And in 2007, theater chains AMC Entertainment, Cinemark USA, and Regal Entertainment Group—who collectively controlled almost 30 percent of the total screens in the United States—formed their own integrator organization, Digital Cinema Implementation Partners (DCIP), further spurring the mass adoption of digital cinema.

Figure 3.5: Digital cinema ecosystem after the introduction of the virtual print fee (VPF).

In 2009, James Cameron’s 3-D record-breaking sci-fi megahit Avatar led moviegoers to realize that modern three-dimensional offerings were a far cry from the gimmicks of the 1950s. In the first two months after Avatar’s release, 81 percent of its $601 million U.S. box-office gross came from 3-D ticket sales. Riding high on the public’s enthusiasm for 3-D, Disney’s Toy Story 3 became 2010’s top-grossing picture. By the end of the year, 38 percent of U.S. screens and nearly 25 percent of global screens had converted to digital. The revolution had finally arrived.

Digital cinema still faces hurdles. As new generations of equipment emerge, there will likely be issues of quality control and compatibility as everyone along the adoption chain gets up to speed. Additionally, the question of long-term digital storage is problematic. The costs of digital preservation are significantly higher than for traditional film. And there are questions about the sustainability and size of ticket premiums for 3-D movies. Despite these concerns, the shift to digital cinema will be nearly impossible to reverse. As the installed base of analog projectors shrinks, as analog manufacturers abandon the market, and as studios stop printing and shipping analog film, we can expect celluloid film to go the way of the LP record album in the age of digital music and become a niche curiosity.

Figure 3.6: Digital cinema installations.

Evaluating the Success of Digital Cinema

At first glance, the progression of digital cinema can be viewed as frustratingly slow. After all, during the nearly ten-year lag before widespread adoption, everybody could have been enjoying its benefits. But, if we look at the path to full adoption, we see that the incremental steps taken—and the order in which they were taken—were necessary for digital cinema’s ultimate victory. The studios were smart to set industry standards first, preventing the possibility of high-tech chaos as different players supported divergent technologies, hindering the consensus necessary for a mass rollout. Three-dimensional content was ramping up, creating visible consumer demand for digital cinema that theater owners could see. But, offered a financing solution and VPF payments, exhibitors didn’t need to wait for 3-D hits before agreeing to take their initial steps. Digital-enabled theaters began to enjoy added revenue from new sources of content such as live broadcasts of sporting events, concerts, and opera. And having taken the initial steps, upgrading to support the slew of digital and 3-D offerings that soon arrived was more a hop than a leap. And as the benefits to theaters of adopting digital cinema have grown, studios have been able to reduce the VPF subsidy for later adopters.

We will explore the confluence of leadership and successful innovation in depth in chapter 5. But it is worth noting here that in order for mass adoption of digital cinema to take place, leadership was required from the only party with enough economic incentive and resources to matter: the movie studios. Neither the independent distributors nor the equipment suppliers were in a position to lead, owing to fragmentation, limited bargaining power over exhibitors, and a lack of financial sway. The integrators made a great leap for digital cinema by launching the VPF model, but without the participation of the studios, the financing framework would have fallen flat. It was the studios that, by initially moving away from celluloid film, setting digital standards, and later collaborating with digital theater integrators, cleared the path for the widespread adoption of digital cinema. With VPF, the studios stepped into a new role, creating a new set of links into the ecosystem. They found a brilliant solution to induce the digital changeover. And by limiting their participation to a fixed time window, they also engineered a process to get themselves back out of the way, and off the hook, once the system was entrenched.

Targeting Adoption Chain Solutions

Adoption chain problems require adoption chain solutions—not just intervention and investment, but intervention and investment targeted at the right place in the right way. The movie studios could have spent their resources trying to create excitement with audiences, producing more movies, or pushing more advanced technologies, but none of these would have addressed the core problem that held back digital cinema. Instead, by identifying the theater operators as the key bottleneck, they knew where to focus their efforts, creating a financing solution targeted specifically at turning exhibitors’ negative assessment into a positive one.

For superabrasive grinding wheels, understanding the total-cost/relative-benefit balance has meant adjusting market expectations. Recognizing that the advantages offered by this superb innovation were accompanied by a host of nonprice costs, the industry understood that the net proposition would be positive for only a subset of users, and it found ways to run profitable operations targeting the niche rather than the mainstream.

Seeing the realities of your adoption chain risk puts you in a better position to make smarter choices early on. Wise innovators will have a plan for addressing every adoption chain link with negative surplus in advance of their launch. They will modify their strategy to create surplus for the problematic adopter or find a way to reach their end customer without the help of the problematic link. And if they cannot find a way to address these challenges, they will understand that they may well be better off forgoing the opportunity and shifting resources to the next option on their list.

A wide-lens perspective will help you uncover the true nature of adoption chain challenges. With a clearer view, the outlines of the right solution will be easier to see. Developing these solutions may require effort and innovation, as was the case with VPF, but the returns from these efforts are high. You must not avert your gaze when confronted with a partner facing negative surplus. The key is not to rush to action until the fundamental dynamics are clear. It is easy to invest in satisfying end users, but it is not always effective. Remember which customer in the adoption chain is the most important: all of them.