
7

BY DESIGN

People ignore design that ignores people.

–FRANK CHIMERO

Sometimes being fearless and crazy isn’t about the decisions you make or the paths you take but about the things you create. It’s about genius and ingenuity, insight and inspiration. Some amazing people among us (and I hope you’re one) have that God-given something, that special gift that lets them see, feel, and create works of art—products that make our lives better and our spirits more fulfilled. They are the people who lift us—at least for a few precious moments—from our grinding earthbound existence to the heights of the angels.

His Own Devices

Consider the Apple iPad 2, introduced in March 2011. It’s not that much different from its amazing predecessor, the iPad, the most buzzed-about e-device since Apple’s iPhone (which, of course, was the most buzzed-about device since the iPod).

The first iPad sold more than 15 million units in nine months, at a price of $500 or more, depending on how much power and how many bells and whistles it had. Its amazing success has led to a flood of other tablets, all of which are threatening to make the laptop computer as obsolete as the typewriter.

When you compare the iPad 2 to its iPad parent, you notice the obvious: it’s one-third thinner, 15 percent lighter, and twice as fast. It also has front and back cameras, one of which is an HD video cam. All of that’s important, but there’s more: an optional screen cover that protects the screen and doubles as a stand to prop the iPad 2 upright—available in five different colors of polyurethane or five different colors of leather. When you open the screen cover, the iPad 2 powers up instantaneously, not within a few seconds like even the fastest PCs and the original iPad. There’s a gyroscope too, which is needed for some advanced games. And of course, there are all those apps—65,000 and counting, plus another 290,000 iPhone apps that run at lower resolution on an iPad screen.

But those are just the facts. The iPad 2—and everything else that Apple has created or will create—is flat-out fun to see, fun to hold, and fun to use. There’s something subjective, something visceral, about Apple’s products that even Apple’s detractors cannot deny. Apple’s products are different; they’re unique. They’re well designed.

Consider the man behind those designs, the man who is the face and spirit of Apple: Steve Jobs. At the March 2011 unveiling of the iPad 2 in San Francisco, he suddenly walked out onto the stage, and the crowd gasped and then quickly cheered. The man himself, dressed in his trademark black turtleneck and jeans, was back. He got a standing ovation.

In January of 2011, he had announced his third medical leave of absence from Apple’s helm, the result of his 2004 pancreatic cancer diagnosis. His health problems had raised questions about Apple’s future without him. But here he was again—thin as ever—doing what he does so well: promoting his newest product, zinging the competition, exuding excitement and enthusiasm, and working the crowd. By the time he left the stage, Apple shares had jumped $3. Vintage Jobs, vintage Apple. Well timed and well executed. The entire event, just like Apple’s products, was well designed.

In a fearless, crazy, and consistent way, Jobs and his company were like no others in the tech industry. Love ’em or hate ’em, you had to shake your head and admire them. Apple Inc. is the darling of Main Street as well as Wall Street. Its market capitalization has surpassed that of Microsoft, Google, and IBM. Many predict its share price will continue to climb from over $300 to $500 and beyond.

How did all this happen? Well, it’s a familiar story and a crazy one.

Born in San Francisco in 1955, Steve Jobs set off on his tech path while still in high school, frequenting after-school lectures at the Hewlett-Packard Company in Palo Alto, California, and working there over a summer with his future business partner Steve Wozniak.

After high school, Jobs enrolled in Reed College in Portland, Oregon. He dropped out after one semester but later pointed out that if he hadn’t taken a class in calligraphy while there, “the Mac would have never had multiple typefaces or proportionally spaced fonts.”76

In the fall of 1974, Jobs returned to California and took a job at video game producer Atari. He saved enough money to travel to India, where, like so many before him and since, he sought spiritual enlightenment. He took along a friend from Reed College, Daniel Kottke, who would later become one of Apple’s first employees.

While in India, Jobs apparently did find the enlightenment he sought, even shaving his head and wearing traditional Indian clothing. He hadn’t abandoned his old life for something new, however. Far from it. The Indian trip was merely preparation for the road ahead—in business in America.

Back home, Jobs returned to his old job at Atari. One of his first assignments was to create a circuit board for a game called Breakout. According to Atari founder Nolan Bushnell, the company had offered its game developers a $100 bonus for each chip they could eliminate from the Breakout board. The idea was to streamline the game without losing power or functionality. Jobs took up the challenge, even though he didn’t know how to eliminate the chips. But he knew someone who did: Steve Wozniak. He enlisted his old summer-job buddy to help him; the two would split the bonus money. To the amazement of everybody at Atari, Wozniak reduced the chip count by 50, creating a design so tight that it was impossible to reproduce on an assembly line.77

A Jobs-Wozniak partnership was born. By 1976, the two of them and friend Ronald Wayne set about building Apple, a company that would assemble and sell computers. The three fledgling entrepreneurs were lucky to get some funding from the then-semiretired former Intel product marketing manager and engineer A. C. “Mike” Markkula. They were on their way.

In 1978, to help manage its expansion, Apple recruited Mike Scott, an experienced manager from National Semiconductor, as CEO. Scott’s tenure spanned several turbulent years; in 1983, the top job went to John Sculley of Pepsi-Cola. Apparently the key to Jobs’s recruiting pitch was a question he asked Sculley: “Do you want to sell sugar water for the rest of your life, or do you want to come with me and change the world?” Sculley opted for the latter.78

In 1984, Apple cemented its reputation for edginess with its now-famous “1984” Super Bowl television commercial. In the commercial, Apple introduced to the world the Macintosh computer, the first mass-market personal computer with a graphical user interface. Jobs himself had worked extensively on its development. The machine’s simplicity and design were remarkable. The nickname “Mac” eventually became a household name. The term “user-friendly” was often used in the same sentence as “Mac.”

As the face of Apple, Jobs could be inspirational and charismatic, but to many of his employees and coworkers, he was also erratic, arrogant, and aggressive. When sales slumped in late 1984 and early 1985, he was ousted by the Apple board led by John Sculley, the very man Jobs had convinced to leave the sugar water business.

Jobs was out but not down for long. He started another computer company, NeXT, whose hardware and software were technologically advanced and aimed primarily at the scientific and academic markets. Jobs went so far as to describe one NeXT product, the NeXTcube, as an “interpersonal” computer. In his view, it went beyond what “personal” computers already offered: it was the next step, laying the groundwork for people to communicate and collaborate through their machines and thereby overcoming the limits of “personal” computing. Indeed, at a time when most people were entirely unaware of email (and those who did use it sent plain text), Jobs was busy demonstrating a new email system, NeXTMail, built into the NeXT platform. He used it to show what “interpersonal” meant—bridging the communication gap between machines and people—and it was one of the first systems to let people send audio and graphics within their emails.

But there was the bottom line—there’s always the bottom line. The company sold only about 50,000 computers. Jobs ran the company with such an obsessive focus on perfection that nothing went smoothly; everyone and everything was strained. The company was broken into hardware and software divisions, and the hardware division was sold in 1993. The emphasis on NeXT software development continued for another three years until the rest of the company was put up for sale.

Enter Apple. Jobs’s old company bought his new one, NeXT, for $429 million and, in the process, brought Jobs back to Apple. Within a relatively short time, he was installed as Apple’s interim CEO after the directors lost confidence in then-CEO Gil Amelio. The boardroom shake-up was unexpected, but Jobs was well positioned to take advantage of the company’s sagging fortunes.

By March 1998, Jobs had developed a plan to return Apple to the black. He started out by terminating a number of projects that weren’t working out. Although many Apple employees were subsequently terrified of losing their jobs, only a few actually were let go as a result of his overhaul of the company. Apple and Jobs had established a reputation for being unafraid to ditch any initiative that wasn’t panning out—in other words, to cut their losses and move on to something better. One of those better things was the iMac, noted not only for its ease of use but for its sleek, elegant design. It looked as good as it worked. To Jobs, of course, performance and design went hand in hand.

And not just for computers. Back in 1986, Jobs had purchased the Graphics Group from Lucasfilm for $10 million. The company was intended to be a high-end graphics hardware developer. It called its flagship equipment the Pixar Image Computer, but sales were modest at best. Then Jobs struck a deal with Disney to produce computer-animated feature films. Pixar, as the company was now called, would make the movies; Disney would cofinance and distribute them.

Toy Story, the first film produced by the studio, was released in 1995 to critical acclaim and stellar box office receipts. Overnight, Pixar was a sensation, and the animated hits just kept coming: A Bug’s Life in 1998, Toy Story 2 in 1999, Monsters, Inc. in 2001, Finding Nemo in 2003, The Incredibles in 2004, Cars in 2006, Ratatouille in 2007, WALL-E in 2008, Up in 2009, Toy Story 3 in 2010, and Cars 2 in 2011.

In January 2006, Disney struck a deal to buy its partner Pixar in an all-stock transaction worth $7.4 billion. Jobs became the largest single shareholder in the Walt Disney Company as a result of this deal, with ownership of approximately 7 percent of the company’s stock. Jobs’s Disney holdings would exceed even those of former CEO Michael Eisner, who owned about 1.7 percent, and Disney family member Roy E. Disney, who held about 1 percent at the time the Pixar deal was struck. Jobs also gained a seat on the Disney board of directors.

Back at Apple, Jobs steadily brought the company back to the fore, with aggressive marketing and—most important, of course—jaw-dropping, innovative digital products.

Consider the iPod. It revolutionized the way people listen to music, not only serving as the dominant portable music player (sayonara, Sony Walkman) but also making digital music accessible just about anywhere through the iTunes store. Apple’s development of the iPod was part of a broader plan for the company to branch into music distribution as well as consumer electronics.

When Apple released the iPhone in 2007, there was a similar plan in place. Apple burst into another key industry of the modern world: mobile communications. The multi-touch-display phone built on the features of the iPod and went on to revolutionize mobile browsing. It’s not for nothing that they call them “smart” phones.

And then came the iPad and iPad 2. We’ll have to see what Apple comes up with next. All of Apple’s growth and success have not been without controversy, however. Jobs insisted that product users and software developers play his way or hit the highway. He didn’t believe in open systems, allowing free, unfettered access to products and services—the way Google, Microsoft, and many others manage their e-worlds. Instead, Apple and Jobs wanted to make a buck whenever and wherever they could. Magazine and newspaper publishers, for example, have been disappointed at best and angry at worst over the iPad’s rules for content and subscription purchases. Apple has censored some apps and has insisted on a 30 percent commission for any subscriptions sold. Said Forrester Research analyst James McQuivey in a BBC interview, “Apple envisions a world in which people don’t consume any kind of digital media without its help.”79

Not everyone shares that vision. Apple is involved in a host of lawsuits, as both a plaintiff and a defendant, ranging from disputes over patents to antitrust allegations. Apple and Jobs have also taken heat for their outsourcing of many iPhone and iPod assembly jobs overseas, particularly to a Foxconn plant in Shenzhen, China, where harsh working conditions and low pay have led to a rash of worker suicides.

But the man and his company always seemed to weather every storm that came their way. Steve Jobs was aggressive, persuasive, effective, and fearless. An egomaniac too? You bet. Just take a look at his black-turtleneck performances. But he was a visionary. And it’s safe to say our lives would be quite different without him and his company.

What is it about Apple products that makes them resonate so well with users throughout the world? Why can’t we put them down? Why do we so enjoy working and playing with them? In other words, what is so special about Apple products’ design?

First, simplicity. No clutter, no confusion. The designs never try to do too much at once or show too much in too little space. Then there’s the attention to detail. Whether it’s icons on a screen or logos on a carrying case, Apple does sweat the small stuff. Furthermore, the designs are consistent. The screens across Apple’s products share the same look and feel and the same basic grid, so navigation is intuitive. Finally, Apple products possess a quality that is hard to measure but apparent nonetheless: class. The designs are not sterile, not industrial, and not techie, but classy. You feel good about holding Apple products in your hands, putting them up to your ear, or opening them up on the tray table in front of you.

But most important is what the man himself, Steve Jobs, said about it: “Design is not just what it looks like and feels like. Design is how it works.”80

For all his amazing intelligence, his business acumen, and his relentless aggressiveness, Jobs always was fearless and crazy enough to give design its due—to make sure his company’s products worked well and looked good. That’s the only way he’d have it. It was by design.

And now that Apple aggressiveness, that dedication to quality and design, will have to go on without the man himself. In yet another surprise announcement, Jobs, 56, revealed in late August 2011 that he was stepping down as the company’s chief executive. Clearly his health dictated the decision. “I have always said that if there ever came a day when I could no longer meet my duties and expectations as Apple’s CEO, I would be the first to let you know,” Jobs wrote in a letter released by the company. “Unfortunately, that day has come.”81

Just weeks later, on October 5, 2011, Jobs’s final day came. He died peacefully, with his family at his side. He’s gone, but his company and its legacy live on. It’s Apple, after all. Solid, creative, and well designed.

Brick and Mortar

Design is what he does, what he lives for. He’s one of the greatest architects the world has ever known. The projects he has worked on form an amazing list that would fill many pages of this book. Among them are L’Enfant Plaza Hotel in Washington, D.C., the Green Building at MIT, the National Center for Atmospheric Research in Colorado, the John F. Kennedy Library in Boston, Dallas City Hall, the East Building of the National Gallery of Art in Washington, D.C., the Fragrant Hill Hotel in China, the Bank of China building in Hong Kong, the glass and steel pyramid for the Louvre museum in Paris, the Morton H. Meyerson Symphony Center in Dallas, the Rock and Roll Hall of Fame in Cleveland, the Miho Museum in Japan, and the Museum of Islamic Art in Qatar. And that’s just some of the buildings.

Also, he has received virtually every prize, award, and honor the architecture profession can bestow, including the AIA Gold Medal, Praemium Imperiale for Architecture, the Lifetime Achievement Award from the Cooper-Hewitt National Design Museum, and the Pritzker Prize, often called architecture’s Nobel Prize.

I. M. Pei is in his nineties now and not slowing down. The spark, talent, and creative eye are still there, just like they were all those decades ago when the 18-year-old made a fearless and crazy decision to leave his native China for America. Why the life change? Because he liked Bing Crosby movies.

Pei knew he wanted to attend a first-rate architecture school in the United States—that was his serious, career-minded side. But he was fascinated by what he had seen of college life in Bing Crosby’s early 1930s movies. Hollywood made higher education seem like an amazing adventure.

“College life in the U.S. seemed to me to be mostly fun and games,” he remembered decades later. “Since I was too young to be serious, I wanted to be part of it. You could get a feeling for it in Bing Crosby’s movies. College life in America seemed very exciting to me. It’s not real, we know that. Nevertheless, at that time it was very attractive to me. I decided that was the country for me.”82

So Pei made his way by ship and train to Philadelphia to attend the University of Pennsylvania. But he was soon disappointed—in the academic offerings, not the social life. The school’s architecture professors seemed stuck in the classic traditions of the Greeks and Romans. Pei had his sights on modern architecture—something new, something different. He transferred to the Massachusetts Institute of Technology.

But at MIT, the architecture school was more of the same, teaching and promoting the Beaux-Arts style with its emphasis on the classical. Pei stuck it out and did well academically, even though he was more intrigued by the new International Style and the Prairie School work of Frank Lloyd Wright. He received his bachelor of architecture degree in 1940 and planned to return to China to begin his design career. But the Japanese had invaded his homeland, and he stayed in America.

He soon met Eileen Loo, the woman whom he would marry and with whom he would raise a family and share a life. Loo enrolled in the landscape architecture program at Harvard University; Pei joined Harvard’s Graduate School of Design. It was here that Pei worked with and got to know famed architects Walter Gropius and Marcel Breuer, both of whom were proponents of modern architecture. Pei’s career took off like a rocket.

He received his master’s from Harvard in 1946 and taught there for the next two years. He then went to work for New York real estate developer Webb and Knapp, designing apartment buildings, housing towers, a corporate building for Gulf Oil in Atlanta, the Roosevelt Field Shopping Mall, the Mile High Center in Denver, and Denver’s Court House Square.

In 1955, he started his own firm, I. M. Pei and Associates, but continued to work with Webb and Knapp. Among the projects Pei and his firm designed over the next 35 years are Kips Bay Towers in Manhattan, Society Hill Towers in Philadelphia, and the National Center for Atmospheric Research (NCAR) complex, mentioned previously. Pei considered the NCAR project his breakout design; it won him recognition and praise.

The NCAR was followed by the S. I. Newhouse School of Public Communications at Syracuse University and the John F. Kennedy Presidential Library and Museum, his signature creation. Jacqueline Kennedy herself chose Pei for the job for both professional and personal reasons. She liked that Pei didn’t fit any particular mold; he used techniques and ideas from just about anyone or anywhere. “He didn’t seem to have just one way to solve a problem,” said Mrs. Kennedy. “He seemed to approach each commission thinking only of it and then develop a way of making something beautiful.”83

She also felt a special connection with Pei; he reminded her in many ways of her late husband. “Pei was so full of promise, like Jack,” she said. “They were born in the same year. I decided to take a great leap with him.”84

The Kennedy project was incredibly challenging for Pei, with stops and starts and rethinking and reimagining along the way. It was dedicated in October 1979. Pei calls it the most important commission of his life.

The projects continued and haven’t stopped yet. The entire world is his drawing board for more grand designs, traditional structures, and breathtaking advances. Jacqueline Kennedy had it right when she said he didn’t have just one way to solve a problem. I. M. Pei’s style cannot be described by any one adjective. He doesn’t like the idea of architectural trends and “schools.” There is no Pei School. To him, everything is fair game. “An individual building, the style in which it is going to be designed and built, is not that important,” he said. “The important thing, really, is the community. How does it affect life?” And later he said, “I believe that architecture is a pragmatic art. To become art it must be built on a foundation of necessity.”85

The citation for Pei’s 1983 Pritzker Prize sums up the man and his work best: “Ieoh Ming Pei has given this century some of its most beautiful interior spaces and exterior forms…. His versatility and skill in the use of materials approach the level of poetry.”86

And perhaps we have an affable crooner to thank for it all. If the fearless and crazy I. M. Pei hadn’t been so intrigued by Bing Crosby and the college life his movies portrayed, the young student might never have gotten on that boat and sailed off to America and into architectural history.

Multidimensional Man

His mother was a painter, and his father was an engineer. So it’s not that surprising that this Ontario-born-and-bred risk taker is a remarkable scientist/artist combination. In fact, to create the greatest work of his amazing career, the artist in him had to wait several years for the science—his and others’—to catch up and make his vision a reality.

This dual-purpose genius is none other than James Cameron, the most successful filmmaker in Hollywood history. And the story that had to wait for the technology before it could become digital reality is, of course, Cameron’s Avatar, the biggest box-office smash ever.

As a young man, Cameron followed his natural bent, and it landed him on the tech side of the filmmaking business. He drove a truck on weekdays but hunkered down in a University of Southern California library on weekends, photocopying scholarly works and dissertations about optical printing and film stocks. “This is not bull,” he said later in an interview with Popular Mechanics. “I gave myself a great course on film FX for the cost of the copying.”87

His first gig in the biz was with legendary low-budget producer/director Roger Corman as part of the effects crew on the forgettable 1980 flick Battle Beyond the Stars. Then it was on to meatier fare such as 1981’s Piranha II: The Spawning, which Cameron wrote and directed. After that, the hits came in rapid succession: The Terminator (1984), Aliens (1986), The Abyss (1989), Terminator 2: Judgment Day (1991), True Lies (1994), Titanic (1997), and Dark Angel (2000–2002). Titanic, of course, was like nothing before. It earned 11 Academy Awards, including Best Picture and Best Director for Cameron. It was the highest-grossing movie ever, at $1.8 billion. But as we all know, there’s a new box-office champ: Cameron’s 2009 megahit, Avatar, at $2.7 billion.

The movies reflect their creator. They are the art-science combination that is at the core of the man himself. More often than not, Cameron writes the screenplays. Among his screen-writing credits are The Terminator, Aliens, The Abyss, Titanic, and of course, Avatar.

On the tech side, The Abyss featured ahead-of-the-curve underwater lighting effects, developed by Hollywood inventor Vince Pace. And it was that kind of cutting-edge design—primarily on the video-optical side—that captured Cameron’s imagination and got his creative juices flowing. He helped found the special effects company Digital Domain in the early 1990s. Those underwater effects for The Abyss were the basis for his amazing “liquid metal” effect in 1991’s Terminator 2: Judgment Day. Cameron and Pace designed a unique high-definition 3-D camera for Cameron’s 2003 documentary, Ghosts of the Abyss, which explored the wreck of the Titanic.

Every Cameron project tests the limits of what is possible in the science of moviemaking. If it hasn’t been invented yet, he designs it himself and tries it out. Bruce Davis, former executive director of the Academy of Motion Picture Arts and Sciences, admires Cameron’s “willingness to incorporate new technologies in his films without waiting for them to be perfected. They call this ‘building the parachute on the way down.’”88

They also call it fearless and crazy.

But Cameron did have to wait for science to catch up before he could shoot Avatar. In 1995, he wrote an 80-page treatment of the story, with working title Project 880. It wasn’t until 10 years later that the 3-D technology was ready for him to move forward. But even then, it was slow going, with the film’s May 2009 release pushed back to December (which, incidentally, gave 3,000 additional theaters more time to install the expensive 3-D projectors).

Avatar is composed almost entirely of computer-generated animation. Cameron used a modified version of the Fusion camera for the live-action sequences. The new 3-D camera creates an augmented-reality view for Cameron as he shoots, sensing its position on a motion-capture stage and then integrating the live actors into computer-generated environments on the viewfinder. Confused? Me too. The point is that Cameron’s specially designed equipment and painstaking technique make the 3-D picture immersive; it swallows you up to bring you deep into the exotic world of Pandora.

Cameron’s Avatar tale and the amazing way he brought it to the screen were clearly worth the wait. The man who can design a story and put it on the page and then design a camera to bring that story to life is the most daring risk taker in Hollywood.

Coming attractions? Here’s the lineup:

• 2013: Fantastic Voyage (executive producer)

• 2014: Avatar 2 (director, writer, producer, editor)

• 2015: Avatar 3 (director, writer, producer, editor)

You’d better get your 3-D glasses ready. On second thought, hold off on those glasses. They’re likely to be obsolete because there’s no telling what James Cameron, the amazing artist-scientist of Hollywood, will have come up with by then.


MY TAKE

One of my favorite definitions of design comes from Sir George Cox, writing in his Cox Review: “Design is what links creativity and innovation. It shapes ideas to become practical and attractive propositions for users or customers. Design may be described as creativity deployed to a specific end.”89

Another fitting definition says that “design is all around you; everything man-made has been designed, whether consciously or not.”90

There has never not been design; design is involved in everything from the earliest drawings on a cave wall to the latest smart phone app. It’s the aesthetic—the look and feel of something. But more important, it’s the practicality and the functionality. Design looks at how things work, how they do their jobs, and how they could do their jobs better.

There’s a simple way to tell if your neighborhood park has been well designed. As you walk around and through it, notice whether there are paths worn into the grass where your neighbors have left the designated, constructed sidewalk or trail and made their way by a different route, often shorter and sometimes more scenic. In your park, people “vote” with their feet. They decide which route is best. If people have to forge their own trails, the designers got it wrong.

It’s that way in business too. You can get a headache thinking of all the product flops over the years—even Apple has had a few (remember the Apple Newton MessagePad?). Quite often those flops were because the design was wrong, impractical, or too confusing—design run amok, as I like to say.

Design takes ideas—the intangibles, the abstract—and makes them concrete, usable, applicable, and appropriate. Design solves problems. If it doesn’t, the product is soon abandoned and forgotten.

Sometimes a designer gets a little too artsy and forgets that practical part. If you want to witness an example of that, just show up on a construction project site and ask the contractors what they think of the architect’s design details. You’re likely to get an earful.

Because design is so important, it’s seldom “discovered” in a vacuum, seldom come upon by some lonely nerd in a eureka moment. Instead, design is a collaborative process. It requires brainstorming and teamwork. It means hit and miss, trial and error. That’s why I love it. In my new business, making movies, collaboration is essential. Without it there is nothing, and certainly not a movie.

Steve Jobs had hundreds if not thousands of product designers at Apple. I. M. Pei has his design team and all the contractors that work with him. James Cameron hires the best and brightest technology talent in Hollywood. And for those three and you, design happens from the beginning of your life-wealth plan. It starts with your vision and goals; affects every component of your plans to make those visions and goals realities; and is always there when you execute your plans, when you network and market, and when you react to the unknowns. Design is like the spine of your life-wealth plan. In fact, creating (and living) a life-wealth plan is in itself an exercise in design.

So how do you go about designing? Is there a method? A process? Later in this book, I give you some simple exercises to help you get started designing your life-wealth plan. In the meantime, here are several ways to approach more general design challenges, as practiced by those who get paid to do so:

• Use the KISS principle (keep it simple, stupid). Eliminate the unnecessary.

• Remember that there is more than one way to do it (TIMTOWTDI). Allow for many ways of doing the same thing.

• Incorporate use-centered design, in which the primary emphasis is on how the item will be used.

• Think about user-centered design, in which the needs, wants, and limitations of the item’s end user come first.

• Consider critical design—design that makes a statement or offers a critique of the values and practices in a culture.91

Look at those approaches again, thinking of Jobs, Pei, and Cameron. Their works have these methods written all over them.

But let’s not get too academic. You know good design when you see it, and more important, you know good design when you use it. As Procter & Gamble CEO A. G. Lafley says, “Your products run for election every day, and good design is critical to winning the campaign.”92

People—consumers, clients, investors—vote for good design, with their feet and hands, with their brains and hearts. As noted design writer Brenda Laurel says, “A design isn’t finished until someone is using it.”93

____________________

76 “You’ve Got to Do What You Love, Jobs Says,” Stanford University News, June 14, 2005, http://news.stanford.edu/news/2005/june15/jobs-061505.html.

77 Steven Kent, The Ultimate History of Video Games (New York: Three Rivers Press, 2001), 71-73.

78 Andrew Leonard, “Do Penguins Eat Apples?” Salon.com, September 28, 1999, http://www.salon.com/technology/feature/1999/09/28/mac_linux/index.html.

79 Patrick May, “Apple Makes Bid to Become Gatekeeper for Newspapers and Magazines,” mediaIdeas, April 3, 2011, http://blog.mediaideas.net/2011/04/04/apple-makes-bid-to-become-gatekeeper-for-newspapers-and-magazines/.

80 Rob Walker, “The Guts of a New Machine,” New York Times, November 30, 2003, http://www.nytimes.com/2003/11/30/magazine/30IPOD.html?ex=1386133200&en=750c9021e58923d5&ei=5007&partner=USERLAND.

81 David Streitfeld, “Jobs Steps Down at Apple, Saying He Can’t Meet Duties,” New York Times, August 24, 2011, http://www.nytimes.com/2011/08/25/technology/jobs-stepping-down-as-chief-of-apple.html?_r=1&hp.

82 Gero von Boehm, Conversations with I. M. Pei: Light Is the Key (New York: Prestel, 2000), 34.

83 Carter Wiseman, I. M. Pei: A Profile in American Architecture (New York: H.N. Abrams, 2001), 98.

84 Ibid., 99.

85 Barbaralee Diamonstein, American Architecture Now (New York: Rizzoli, 1980), 145.

86 “I. M. Pei, 1983 Laureate: Jury Citation,” Pritzker Architecture Prize, http://www.pritzkerprize.com/laureates/1983/jury.html.

87 Anne Thompson, “How James Cameron’s Innovative New 3D Tech Created Avatar,” Popular Mechanics, January 1, 2010, http://www.popularmechanics.com/technology/digital/visual-effects/4339455.

88 Ibid.

89 Sir George Cox, Cox Review of Creativity in Business: Building on the UK’s Strengths (London: Her Majesty’s Treasury, 2005).

90 See Mat Hunter, “What Design Is and Why It Matters,” Design Council, http://www.designcouncil.org.uk/about-design/What-design-is-and-why-it-matters/.

91 “Approaches to Design,” Wikipedia, March 21, 2011, http://en.wikipedia.org/wiki/Design.

92 Jennifer Reingold, “What P&G Knows About the Power of Design,” Fast Company, June 1, 2005, http://www.fastcompany.com/magazine/95/design-qa.html.

93 Sander Baumann, “Beautiful and Inspiring Designers Quotes,” designworkplan.com, December 8, 2008, http://www.designworkplan.com/design/inspiring-quotes.htm.