Open Standards Complements

Clayton Christensen further observed in his research[12] that as companies begin to overdeliver functionality in their product lines faster than customers can absorb it, and therefore faster than customers are willing to pay for it, the market begins to call for standardization. Prior to the point of overdelivery, the market leader typically offers the technology in a tightly integrated fashion, and that tight integration best serves customer needs while solutions are not yet good enough. This is the time when tight integration, not standards-based components, is the path to success. Standards develop once the marketplace reaches the point where the market leader begins to overdeliver. These are the circumstances in which a market-dominant de facto technology is at a critical point and the call for de jure standardization becomes possible.

The signal to standardize a technology is seldom clear-cut; it is more likely a combination of factors:

If you are the one true implementer, and the market (i.e., partners, customers, and competitors) is calling for standardization in your core technology space, you have a problem. They are calling for the benefits of standards (an expanding market and price competition) because they want the ability to replace you. Some segment of your customers wants the choice of multiple implementations. Your competitors are happy to support the call, as it is the thin edge of the wedge to break open your value proposition to your customers, all in the name of open systems. Your partners may be happy to support the call because they want price pressure on your product as their own margins diminish, and perhaps because your share of their cost of goods sold is increasing.

It is important to note that one needs to get the view of the market "right" for this sort of discussion, and hindsight is always 20/20. It is not necessarily the dominant vendor's product that is to be standardized, but the product market space. For example, one can argue that the POSIX standards (and the C-language standards, for that matter) were not about standardizing Unix systems, but rather were an effort to standardize an OS interface for minicomputers. Digital Equipment Corp. was the dominant player in minicomputers (which became departmental servers and workstations). DEC was driving customers up the hardware upgrade cycle to support its market growth faster than customers were willing or able to absorb the change. Unix systems of the early and mid-1980s represented the best opportunity around which the market could form a minicomputer application programming standard to support customers' application portability. While the Unix systems of the day were often less scalable, less robust, and less secure than VAX/VMS systems, the Unix operating system had been ported to most vendors' hardware (including DEC VAXen), so competing vendors could see the market opportunity.

At the same time, the PC arrived on the scene. Many have argued that the PC won against Unix systems by taking over the desktop, largely because the Unix vendors could not set a desktop "standard" fast enough. The PC certainly took the desktop by storm, but it was actually competing against nonconsumption. In a Christensen view of the world, it was put together from inexpensive parts and certainly underperformed when compared to minicomputers, but it became the de facto business appliance in a document-centric world, enabling a whole new class of electronic document-centric applications. (Word processing systems companies vanished almost as fast as the minicomputer companies.) The PC competed with nonconsumption by giving business users computing resources on their desktops instead of leaving them stuck waiting for corporate IT, with its ever-growing systems development backlog, to deliver their business data processing applications. The Unix systems (driven by standards and an "open systems" message) were data processing-centric rather than document-centric, and caused DEC grief in a completely different space.

Christensen observed that as an area of technology is standardized, the value moves to adjacent spaces in the network.[14] The trick then becomes to ensure that one is building one's business efforts in the product network around the space being standardized. This suggests that the richer a vendor's network of product offerings, across software, hardware, and service components and products, the more opportunity that vendor has to move with the value, or to define new components that the older, commoditized components complement.

This core-complement product network view allows one to see very quickly how vendor politics in a standards working group play out. A vendor with a de facto product technology that is being dragged by the marketplace into a de jure standards working group is likely a little less than enthusiastic about participating in its own commoditization. Vendor alliances within the working group form among participants in the complement space. The game is one of technology diplomacy, where the goal as a vendor representative is to expand your area of economic influence while defending sovereign territory. This holds true regardless of whether one is participating in a vendor-centric organization such as Ecma International, as an "expert" on a national delegation to the ISO (on behalf of her employer), or as an individual contributor to an organization like the IEEE (again, funded by her employer to participate). Vendor consortia offer a similar view. Which vendors formed a consortium, and which vendors quickly and noisily joined shortly afterward, say a lot about who the incumbent in a product space is and who the competitors are.



[14] This was originally referred to as "the Law of Conservation of Attractive Profits," but is now referred to as "the Law of Conservation of Modularity."