Perhaps you sometimes wonder the same things we do: “How can Uber instantly connect me to a car when I'm on a random street corner some 500 miles from home and then automatically bill my credit card, send me an invoice, and capture my passenger rating in seconds?” “How can I be watching a YouTube video on a mobile device while moving at 130 mph on a train?”
These two experiences, moments of “magic” that have become almost mundane, would have been impossible even a few years ago. What's surprising is that Uber and YouTube, in spite of offering dissimilar services, both run their operations on a “machine” with essentially the same components. This new machine, what we call “a system of intelligence,” is rapidly becoming the cornerstone for companies that compete on knowledge. It sits at the center of Facebook, Instagram, Google, e*Trade, Betterment, and all the other examples of today's digital leaders.
Yet, for all its importance, the new machine is still largely misunderstood. Most of us actively consume the output of these systems of intelligence without slowing down to think about how such real-time, personalized, and curated experiences are actually created and delivered.
To that end, in this chapter we will explain what these new machines are—what the technology components are, how they fit together, what a good one looks like, what they mean to enterprise work, and how they will profoundly impact the future of your work.
We know: this overview may seem akin to when you were learning to drive as a teen and your uncle popped the hood of the car and started to explain exactly how everything worked. Some of the lesson can be dry (e.g., “this is the carburetor; these are the spark plugs”), but because we now consume these systems of intelligence continually and need to build and implement them in our own companies to seize competitive advantage, a working knowledge is important.
Let's start with a simple definition; then we'll unpack it a bit.
This may sound like “a bunch of hardware, software, and data is combined, and then a miracle occurs.” So let's touch on the three key attributes that make a system of intelligence so special:
In short, these three special features—software that learns, massive processing power, and enormous amounts of data—are combining to bring systems of intelligence to life. (As an aside, in some circles these are now being referred to as software “platforms”; for clarity and consistency, we will refer to them as systems of intelligence.) Further along in the chapter, we will outline how these pieces all fit together. Before we get there, it's useful to provide some definitions of the most controversial and misunderstood part of the machine—artificial intelligence.
The term “artificial intelligence” has become so overused that it's actually causing more confusion than clarity. There are many definitions in the market, and almost all of them emphasize a comparison to humans. Such definitions—for example, Merriam-Webster's (“the capability of a machine to imitate intelligent human behavior”)—immediately send many of us down the wrong path, for we start thinking, “What human intelligence can and will be imitated?” We think this is wrong.
Our definition is simpler:
It's that clear-cut. The anthropomorphically biased definitions of AI are wrong for two basic reasons:
Thus, AI is not about building robots that ape human form and behavior. Instead, practically applied AI represents the next generation of computer systems that, like the systems of old, are housed in air-conditioned computer rooms and accessed through networks and systems (like those apps on your smartphone) that you may not see but consume on a regular basis.
But such a definition is only the beginning. We've found it very helpful in slicing through the definitional clutter to divide AI into three subsets: 2
Narrow AI, which is also referred to as “applied AI” or “weak AI,” is our default definition for this book. It is important to note that all AI today—and for at least the next decade—is narrow (also termed “artificial narrow intelligence” or ANI). Such AI is purpose-built and business-focused on a specific task (e.g., driving a car, reviewing an X-ray, or tracking financial trades for fraud) within the “narrow” context of a product, service, or business process. It's what the FANG vendors utilize today in delivering their digital experiences. Thus, while it appears that the new machines can do everything, they actually focus on doing just one particular thing very well. As such, these ANI systems would be hopeless in any pursuits beyond those for which they were specifically designed (just ask your Waze GPS if that onion bagel with cream cheese fits your current diet). ANI is simply a tool, albeit a very powerful one, that provides the basis for all we will explore in the coming pages.
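To make that narrowness concrete, here is a deliberately tiny, hypothetical sketch of one such tool: a fraud scorer that can do exactly one thing. The feature names and data are invented, and scikit-learn is used purely for illustration; this is not any real fraud system.

```python
# A toy "narrow AI": a transaction fraud scorer that knows transactions and nothing else.
# Hypothetical features and data, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [amount_usd, miles_from_home, seconds_since_last_transaction]
X = np.array([
    [12.50,     2, 86400],   # routine purchases -> legitimate
    [40.00,     5, 43200],
    [980.00, 2600,    90],   # large, far from home, rapid-fire -> fraudulent
    [1500.00, 3100,   60],
])
y = np.array([0, 0, 1, 1])   # 1 = fraudulent

model = LogisticRegression(max_iter=1000).fit(X, y)

# The model can score a new transaction...
print(model.predict_proba([[1200.00, 2800, 45]])[0][1])  # probability of fraud

# ...but it has no representation of anything outside this task: no diets,
# no driving routes, no jokes. That narrowness is the defining trait of ANI.
```

Swap in a different narrow task and the shape of the sketch barely changes; what changes is the data the system was built to learn from.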
General AI, also referred to as “strong AI” or AGI, is what is fueling the fears of the Singularity crowd, and has been highlighted in the previously referenced films Her and Ex Machina. 3 Strong AI is the pursuit of a machine that has the same general intelligence as a human. For example, just as you, in the span of a few moments, can discuss politics, tell a joke, and then hit a golf ball 150 yards, an AGI computer would have the general intelligence to perform these activities as well.
Ben Goertzel, Chairman of the Artificial General Intelligence Society, points to the Coffee Test as a good definition for AGI. That is, “Go into an average American house and figure out how to make coffee, including identifying the coffee machine, figuring out what the buttons do, finding the coffee in the cabinet, etc.” 4 This set of tasks, seemingly easy for almost any adult to perform, is currently insanely difficult for a computer. Creating AGI is a dramatically harder task than creating ANI; by most estimates we are still more than two decades away from developing such AI capabilities, if we ever do.
That said, it's easy to scare ourselves with general AI for two reasons, one practical and the other theoretical. On the practical front, we see examples today of narrow AI that appear to be general AI. This might be your Amazon Alexa home appliance, which can appear to pass the Turing test (acting in a way that's indistinguishable from a human). It may feel as though it's moving toward AGI, but it's actually just a brilliantly elegant vocal interface to the Internet search capabilities we've known for over 15 years.
On the theoretical front, computer scientists look at the human as a machine in and of itself—one with very distinct limitations. Human IQs generally range from 80 to 150, an incredibly small scope in computing terms. If AGI becomes a possibility from a software standpoint, why would we limit a machine's “intelligence” to, say, an IQ of 150? Why not 300 or 3,000 or 30,000? None of us would even begin to understand what such an IQ would be or could do, but when it's a simple matter of stringing together more servers in the cloud to add more processing power, where would this take us?
This leads to the third definition. Super AI, in essence, is the technical genie being let out of the bottle. In such a scenario, would humans even know how to stop such a machine? It would run circles around our collective intellect (and, as we know, whenever 10 reasonably smart people are put in a room, the collective IQ is not 1,200 but actually somewhere around 95 once one accounts for the different opinions and objectives that people always bring). How could we then turn the machine off when it's always 10 (or 1,000) steps ahead of us?
This is interesting stuff, particularly for cocktail party conversations. Yet, to reiterate, based on our research, a post-Singularity future with super AI-based Terminators running amok is a mirage. Serious people—the people building these systems today—are quite modest about whether such scenarios are even possible in the next 100 years, much less in the next 5 to 10. Andrew Ng, the chief scientist at Baidu Research, put it succinctly when he said that “worrying about [general or super] AI is like worrying about overpopulation on Mars before we've even set foot on it.” 5
Thus, our focus in this book is specifically on ANI because here, in the real world, we are more concerned about effective tool use for good business outcomes in the modern enterprise. Although some will continue fretting away, worrying about such things as super AI, their competitors will put them out of business with practical applications of narrow AI. With that as a definition of AI, let's go deeper into the new machine.
Each system of intelligence can do vastly different things, but they all share a similar basic anatomy. In fact, if you are familiar with enterprise technology and the prior generation of systems of record (such as ERP and CRM systems), many of the constituent parts will seem familiar. After all, the technology “stacks” of systems of record and systems of intelligence share many common elements, such as user interfaces, application logic, process flows, databases, and infrastructure.
However, critical differences exist up and down each layer of the stack, the most important of which relate to the three distinguishing characteristics of the new machine highlighted earlier: systems that learn, massive processing power, and huge amounts of data. In Figure 4.1 we highlight several of the key differences at each layer of the software stack. Next, we'll work top-down through the various components common to every system of intelligence, as shown in Figure 4.2.
Regardless of how digitized our world becomes, this shift is primarily oriented around people; that is, us—carbon-based life forms without an on/off switch. Thus, with successful digital solutions, the more technical they are, the more human they feel. The best digital solutions don't slow us down; instead, they get out of the way, seamlessly helping us to achieve our end goals. We don't want to learn the systems, we just want the results. With the Waze app, it's how to get somewhere by the fastest possible route; with GE's Predix platform, it's the status of our windmill farm; with the Lex Machina legal analytics service, it's the track record of a particular judge. The machines may be able to do amazing things, but the trait we've seen successful systems share is that they put the human experience at the center of the design. Oftentimes, even in the most automated, seemingly AI-rich environments, there is still a considerable amount of human intervention involved.
In our personal lives, we think of Netflix, Strava, LinkedIn, and so on as “apps.” Most of us don't even call them “software” anymore. What you touch, the content you share, the information or insight you get—all of this flows through the app (or the application layer), which is the front door to the rest of the new machine. For years, you've downloaded apps onto your PC, your smartphone, or your tablet, and now they are being embedded into industrial machines (such as cars). What's essential here is that it's the app that frames our experience. The rest of the system of intelligence, if it's any good, is invisible to us.
Additionally, as we outlined in our book Code Halos, the app has to provide a beautiful experience for users. The app has to hit the FANG benchmark for elegance and ease of use. This explains the explosion of “design thinking” in corporate IT circles, given that these interfaces must fit the way in which your customers, partners, and employees wish to use them in the midst of their activities. (No owner's manuals or user training allowed!) The key to success here is that the app interface must be simple and intuitive, seamlessly fitting into the context of the user's needs.
Beyond all the hype, ANI is modern, complex, adaptive software that is the essential core of a system of intelligence. What we think of as AI really comprises three main elements:
This is a business book for a technical age. We're not going to go into the ins and outs of the specific technologies that comprise today's artificial intelligence. We could write an entire second book on machine learning, deep learning, and neural networks (these are the hot topics in leading universities around the world), but candidly that would be redundant because there are some terrific learning resources in the market already.
The inputs to the new machine are many and varied. Some will be mature, ERP-based systems; some will be real-time data pouring in from instrumentation—the Code Halos of data surrounding products, people, and places—that is continually informing this nerve center as to what is going on all around it. Over time, these inputs will evolve, sometimes both quickly and radically. All those inputs are responsible for creating that contextualized, highly valuable data. Without those new data sources, it will be difficult to build or fuel your new machines. Sensors in mobile devices, apparel, sporting equipment, cars, roads, and virtually every other physical entity will be responsible for generating the information—the code. Connecting those sensors to a system of intelligence is the Internet of Things coming to life.
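To make these inputs a little less abstract, here is a minimal, hypothetical sketch of one arriving sensor reading being normalized into an internal event before the rest of the system sees it. The message format, field names, and the `ingest` function are our own illustration, not any particular vendor's schema.

```python
# Minimal, hypothetical sketch of an IoT input feeding a system of intelligence.
import json
from datetime import datetime, timezone

def ingest(raw_message: str) -> dict:
    """Parse a raw sensor message and normalize it into an internal event."""
    payload = json.loads(raw_message)
    return {
        "device_id": payload["device_id"],
        "metric": payload["metric"],
        "value": float(payload["value"]),
        "received_at": datetime.now(timezone.utc).isoformat(),
    }

# e.g., a connected wind turbine reporting blade vibration
event = ingest('{"device_id": "turbine-017", "metric": "vibration_mm_s", "value": "4.2"}')
print(event)
```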
Data may seem like a weightless abstraction. (If you're interested, all the electrons in the Internet together apparently weigh about as much as two strawberries.) 7 So while it doesn't weigh much, data has a huge amount of value when it's deployed in the right place at the right time. Massive amounts of data must be captured, stored, maintained, analyzed, and made accessible. That's why we need large database systems that are stable, scalable, and tested (no matter how cool the shiny new tools may seem). A new generation of data platforms (e.g., Hadoop) is finding favor, but Oracle and SAP, with a combined customer base of more than 740,000 companies, are not going anywhere. Neither are IBM, Microsoft, or any of the other major enterprise software “arms dealers.” In the digital economy, we'll still need high-quality systems of record (both traditional and emerging ones) as much as we need our AC power grid.
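As a toy illustration of what “captured, stored, and made accessible” means in practice, the sketch below writes a few events like the one in the previous sketch into a database and queries them back. SQLite is used only because it ships with Python; it stands in here for the industrial-strength systems of record (Oracle, SAP, Hadoop-based platforms) discussed above, and the table and values are invented.

```python
# Toy sketch of capturing and querying event data; SQLite stands in for the
# industrial-strength data stores a real system of intelligence would use.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (device_id TEXT, metric TEXT, value REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("turbine-017", "vibration_mm_s", 4.2),
     ("turbine-017", "vibration_mm_s", 9.8),
     ("turbine-042", "vibration_mm_s", 3.1)],
)

# The analytics and AI layers sit on top of queries like this one.
for device_id, avg_vibration in conn.execute(
    "SELECT device_id, AVG(value) FROM events GROUP BY device_id"
):
    print(device_id, round(avg_vibration, 2))
```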
As with any machine on a factory shop floor, the new machine needs power and “plumbing.” Infrastructure includes all the networking, servers, power sources, and so forth that make the machine hum. For existing systems, many of the elements are already managed either directly by the IT department, by an external service provider, or—more commonly now—by a cloud provider. Mobile networking, generally running on a major carrier, is imperative. For computing power, regardless of whether your system is running on Amazon's servers, somewhere in the Googleplex, or in your own data center, systems of intelligence all need highly efficient, always-on plumbing.
The “anatomy” of the new machine can seem somewhat abstract unless the parts are linked together into actual new machines visible in the real world that we conceptually understand. Much has already been said about Netflix, but our take is different. Because we are all familiar with the media-streaming platform, it provides a great illustration of how a company that is leveraging the new machine, the new raw materials, and a business model oriented around them is upending business as usual.
As of 2016, Netflix represented about 35% of all Internet traffic in North America and had the incumbent TV networks in quite a spin. 8 By dissecting Netflix a bit, we can see the anatomy of the new machine in action. (See Figure 4.3.)
Figure 4.3 The Anatomy of the Netflix System of Intelligence
Anatomy Element | How It Works |
Users/Customers | This is all of us as consumers. Netflix now has about 75 million subscribers worldwide (and growing), and we all want what we want (and we want it now, on every device, always). 14 |
The App | Most of us experience Netflix as an app. Regardless of where we are or the hardware we use—tablet, mobile device, laptop, set-top box, or VR headset—the only part of the Netflix system of engagement we touch is the app that connects us to Orange Is the New Black (and many more). |
Process Logic | Do you remember using the print TV Guide to find out what was on television and then having to time your life around when a show was airing? Um, neither do we. In those bygone days, there was really only one process for consuming media via television or movie screen, and it wasn't great. Now, of course, whether we are watching on a mobile device on the road or next to our loved ones at home, Netflix provides content in an almost infinite set of variations. It seems so simple now. Of course we can watch House of Cards at any time on a train going 130 mph! But it feels seamless and frictionless because Netflix has thoughtfully adapted its entire system to the way we want to consume content. |
Machine Learning | Netflix's recommendation engine—basically, a set of algorithms that connect us to content we want—is the best-known element of the Netflix AI system, but the whole engine is actually much more than that. 15 The core of the Netflix experience connects us to content, distinguishes between family members, and processes billions of events a day related to movies, viewers, payments, and the like. More important, the platform improves over time. As we use it more, it learns about our tastes and serves up the best content available in a highly personalized way. It can distinguish between what we say we like and what we actually like. (Note, for example, that Adam Sandler's The Ridiculous Six is rated a middling three stars by Netflix users, but in January 2016 was the most watched Netflix program in history.) 16 The core of the Netflix system is a remarkable piece of software design and engineering, which, because it is so good, is nearly invisible. (A simplified sketch of the basic recommendation idea appears after this table.) |
Software Ecosystem | Netflix relies on connections to dozens of other systems to bring us Orphan Black and The Walking Dead (two of the most binge-watched shows). 17 The Netflix team emphasizes open-source software tools such as Java, MySQL, Hadoop, and others. Content distribution is supported by tools like Akamai, Limelight, and Level 3 Communications. It also relies heavily on Amazon's cloud systems for storage. 18 The point here is not the specific tools Netflix uses but that the company realized early on that it would need to build certain elements of its overall system and that it also could leverage other best-in-class systems to grow faster. |
Sensors/Internet of Things | Netflix engages with us—both providing content and collecting data—via our devices, but it also learns about us via the many sensors and data associated with these devices. Other sensors are starting to matter to Netflix. For example, it is exploring ways to connect to Fitbits and even socks to monitor if we've fallen asleep. 19 |
Data | The Netflix data warehouse stores about 10 petabytes of information. 20 (One petabyte is equivalent to 13.3 years of streaming HD-TV video. 21 ) With all that data flowing through the AI engine, Netflix knows you. We're sorry if that creeps you out, but it tracks the movies we watch, our searches, ratings, when we watch, where we watch, what devices we use, and more. 22 In addition to machine data, Netflix algorithms churn through massive amounts of movie data that is derived from large groups of trained movie taggers. (Netflix isn't saying, but the best guess is that it is applying more than 76,000 genres to categorize movies and TV shows. 23 ) |
Systems of Record | There's no way to manage the vast reams of Netflix data without an absolutely top-notch architecture. Netflix started with Oracle, but has moved over to an open-source database called Cassandra. It uses Hadoop for data processing and Amazon S3 for storage. 24 It also links to back-end payment systems so nothing can interfere with our next purchase. |
Infrastructure | Amazon inside! As of February 2016, all of the infrastructure services Netflix requires are provided by Amazon Web Services. 25 (Anyone still waiting for the cloud to “mature” is probably tripping over the stretchy cord on his or her phone at this point.) |
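To make the Machine Learning row above a little more concrete, here is a deliberately simplified item-to-item collaborative-filtering sketch in Python. It is not Netflix's actual recommendation system, which blends many models over billions of events; the titles and viewing histories below are invented, and the sketch only illustrates the basic idea of recommending titles that similar viewers watched.

```python
# Deliberately simplified item-based collaborative filtering; illustrative only,
# not Netflix's actual recommendation system.
import numpy as np

titles = ["House of Cards", "Orange Is the New Black", "Orphan Black", "The Ridiculous Six"]

# Rows = viewers, columns = titles; 1 means the viewer watched the title.
watched = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 1],
])

# Cosine similarity between titles, based on which viewers watched which titles.
norms = np.linalg.norm(watched, axis=0)
similarity = (watched.T @ watched) / np.outer(norms, norms)

def recommend(viewer_history, top_n=2):
    """Score unwatched titles by their similarity to what the viewer already watched."""
    scores = (similarity @ viewer_history).astype(float)
    scores[viewer_history == 1] = -np.inf   # never re-recommend something already watched
    best = np.argsort(scores)[::-1][:top_n]
    return [titles[i] for i in best if scores[i] > 0]

# A viewer who has watched only House of Cards:
print(recommend(np.array([1, 0, 0, 0])))   # -> ['Orange Is the New Black', 'Orphan Black']
```

Even at this toy scale, the pattern is visible: the more viewing history the system sees, the better its similarity estimates become, which is exactly the “improves over time” behavior described above.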
There's a big difference between merely having all the necessary ingredients of the new machines and actually getting them to perform at a high level. A system of intelligence that can help you be the Usain Bolt of whatever race you're in will have all or most of these characteristics:
Looking in more detail at systems of intelligence raises many profound, strategic, even existential questions. What is a department store in an age of Amazon? What is a hotel in an age of Airbnb? What is car insurance in an age of self-driving cars?
Given just how large these questions are, it's tempting to think what we are calling a major trend is actually just a fad or the latest consulting theory. We hear it all the time: Maybe this is tomorrow's problem. Maybe we should wait until regulation or a disruptive event forces change on our industry.
We disagree. Every day that passes gives us more evidence and strengthens our conviction that the new machines that we've examined in this chapter are the engines of a Fourth Industrial Revolution. We are well past the theoretical phase. Individuals and companies that are realizing an early advantage are not all geniuses or legendary entrepreneurs. They are people like you, people who are using new technologies to solve major problems.
Remember, building your own new machine is becoming increasingly easy, even though, as we have pointed out, “some assembly is still required.” Increasingly, you can take out your credit card and lease access to machine-learning code, infrastructure, and databases. The Google Cloud Platform gives you immediate access to a neural-net machine-learning platform. 12 Amazon Machine Learning gives you access to the same predictive analytics platform that provides recommendations to deep-pocketed consultants and enterprises. 13 Only a few years ago, this would have cost companies millions and taken many months to put in place.
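To underline how little assembly is now required, here is a minimal predictive-model sketch using freely available open-source tooling (scikit-learn). The churn data is invented for illustration; the cloud services named above expose broadly similar capabilities as hosted offerings, though their exact APIs differ and are not shown here.

```python
# Minimal sketch: a predictive model built with commodity open-source tooling.
# Invented churn data; illustrates accessibility, not any vendor's actual API.
from sklearn.ensemble import RandomForestClassifier

# Each row: [months_as_customer, support_tickets_last_90_days, monthly_spend_usd]
X = [
    [36, 0, 120.0],
    [48, 1, 210.0],
    [3,  6,  40.0],
    [5,  4,  35.0],
]
y = [0, 0, 1, 1]  # 1 = customer churned

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(model.predict_proba([[4, 5, 38.0]]))  # estimated probability this new customer churns
```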
The actual builders of the new machines all say the same thing: Making narrow AI the core of a system of intelligence is not a theoretical exercise; it's possible, and it's happening today. And what's really supercharging this explosion of activity is that the fuel for the new machines is all around us, if we can see it, grab it, and use it; that is the focus of the next chapter.