Now that we’ve covered AIPB and how to create an AI vision in detail, let’s create an AIPB AI vision using a hypothetical example.
Our AI vision example was inspired by Grant Achatz, a highly renowned chef and restaurateur famous for his Chicago-based restaurant, Alinea. Alinea was ranked the second-best restaurant in the United States and seventh in the world on Restaurant magazine’s list of the world’s 50 best restaurants, and had been awarded three Michelin stars for eight consecutive years as of this book’s publication!
Unfortunately, Achatz was diagnosed with stage 4b tongue cancer at the young age of 33. Most oncology experts told him that the only treatment option was to remove 75% of his tongue, effectively destroying his sense of taste permanently. A University of Chicago Medicine oncologist named Everett Vokes, however, proposed an innovative alternative: targeted chemotherapy and radiation aimed at saving Achatz’s tongue and taste buds. The treatment proved successful, although it did cause a temporary loss of his sense of taste. Today, Achatz is cancer free.
Achatz’s sense of taste eventually returned, and his story presents a compelling AI opportunity. Let’s build an AIPB vision based on this example. What if technology-based mouth sensors could be combined with AI algorithms to map sensory inputs to what we call tastes? In other words, let’s use AI to create a digital mapping analogous to the chemical-based sensory mechanism that makes our sense of taste work, in order to help people like Achatz who have lost the ability to taste.
Humans perceive tastes such as bitter, sour, salty, sweet, savory, and metallic. This happens when chemical substances make contact with certain nerve cells in our mouths, which then activate other nerve cells. The brain ultimately receives this information and perceives “tastes,” or what we also call flavor.1
Jeff Hawkins, in his book On Intelligence, states: “You hear sound, see light, and feel pressure, but inside your brain, there isn’t any fundamental difference between these types of information…. All your brain knows is patterns. Your perceptions and knowledge about the world are built from these patterns…. All the information that enters your mind comes in as spatial and temporal patterns on the axons.”
We humans often take our ability to sense and perceive for granted and might think that our brains somehow see, hear, taste, smell, and touch directly. In reality, the brain exists in silence and darkness, sensing only indirectly by receiving patterns of signals from sensory organs, signals that pass through huge numbers of neurons and synapses on their journey deep into the brain (for more information, refer to Appendix A). By the time the brain receives them, these spatial and temporal patterns of signals are nothing more than neuron activations. The brain interprets those activations as, say, an image that we see, even though the activations themselves bear no resemblance to what we perceive. It’s like the streaming symbols in the movie The Matrix: the symbols stand for something else entirely; in this case, what’s going on in the matrix.
All of this is closely analogous to how artificial neural networks and deep learning work. Inputs go in (in this case, sensed chemical substances that trigger nerve activation signals), and the network produces a result such as a prediction or classification. In the case of human sense inputs, the output could be a specific sound, smell, feeling, visual image, or (you know where this is going) flavor.
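To make the analogy concrete, here is a minimal sketch in Python (using PyTorch) of what such a mapping could look like: a small feed-forward network that takes a vector of chemical sensor readings and outputs a predicted taste. Everything here, the feature count, the architecture, and the choice of PyTorch itself, is an illustrative assumption on my part, not an actual design for the solution we discuss next.

import torch
import torch.nn as nn

# Taste categories from the discussion above.
TASTES = ["bitter", "sour", "salty", "sweet", "savory", "metallic"]

# Hypothetical: each mouth-sensor reading is a vector of chemical
# measurements (e.g., ion concentrations); the size is made up.
NUM_SENSOR_FEATURES = 32

class TasteClassifier(nn.Module):
    """A small feed-forward network mapping sensor readings to tastes."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_SENSOR_FEATURES, 64),
            nn.ReLU(),
            nn.Linear(64, len(TASTES)),
        )

    def forward(self, x):
        return self.net(x)  # raw scores (logits), one per taste

model = TasteClassifier()
reading = torch.randn(1, NUM_SENSOR_FEATURES)  # stand-in for a real sensor sample
scores = model(reading)
print("Predicted taste:", TASTES[scores.argmax(dim=1).item()])

In a real system, the network would of course need to be trained on many labeled sensor readings before its predictions meant anything; the point here is simply that the “inputs in, perception out” pattern maps directly onto standard deep learning tooling.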
We are interested in creating the world’s first technology-based, AI-powered taste-sensing mechanism to help those who are no longer able to taste. Assume that nothing close to this has ever been done before. Moving forward, we refer to this solution as the Tasterizer, and the hypothetical company is Tasty Co. Notice that we have an initial high-level vision without having performed any assessments or developed a plan. We don’t need to do that yet.
Recall our AIPB Framework, as shown in Figure 10-1.
We begin with the North Star: better human experiences and business success. So how will this vision address both? Luckily, the answer is relatively easy. Restoring the sense of taste clearly results in better human experiences, so that part is a done deal.
A business that invents this hypothetical solution would be able to achieve business success in many ways, not just in sales. A business built on a product like this, along with its employees, can take real pride in how its products help those in need; it’s a win-win situation. They can also feel good knowing that they’re able to truly innovate, create new products and business models, and capture new markets as a result. And of course there are the obvious business benefits such as sales and revenue (assume for this example that the market is big enough and that you’re a first mover).
We’ve now identified the benefits to both people and business at a high level and can move on to developing the vision statement. Remember that all relevant, AIPB-recommended experts should collaborate to develop the intended output of each AIPB Methodology Component phase. For the vision phase, this includes managers (business folks and domain experts) and scientists (AI practitioners). Highest-paid person’s opinion (HiPPO) methods and design-by-committee are not allowed, and concepts like the flipped classroom are highly recommended. Run all sessions as highly productive, effective, actual working collaboration sessions, as opposed to teaching-and-learning sessions. Let everyone know in advance how to get up to speed, preferably in a way that requires minimal time and effort, to accommodate busy schedules.
Also recall that for the AIPB methodology vision phase, and from the AIPB process categories introduced in Chapter 2, I recommend the ideation and vision development category. This includes using recommended methods such as design thinking, brainstorming, and the five whys. Note that this is meant to be a high-level and overly simplified example. Developing an AI vision using AIPB in the real world should involve everything vision related that we’ve discussed so far in this book.
With everything established, and to wrap up our example, let’s create our vision statement.
For those who have lost the ability to taste, we at Tasty Co. are helping to restore people’s sense of taste so that they can taste food and beverages once again (why for people). This has enabled Tasty Co. to generate new revenues by capturing an entirely new market while expanding our portfolio of human wellness-benefiting products (why for business). We’re able to do that with our patented Taste-Fusion technology, which combines state-of-the-art microsensing hardware with the latest AI techniques in order to map the chemical substances of food and drinks to tastes perceived by the human brain (how). The result is what we call the Tasterizer (what)!
We can modify this statement to be only people focused or only business focused as needed for a specific audience, but the exercise of creating a vision statement that identifies the why for both people and business is a must and is something that many companies don’t do. As Simon Sinek famously said, “People don’t buy what you do; they buy why you do it.” Make sure you clearly understand the benefits to the people who use your offering, and then build a great product around them.