CHAPTER 1

The New Need to Learn—and Our Mixed Response

My client was clearly not happy.

“I can’t get my folks to think beyond what they’re doing now!” he said, exasperated. At my inquiring look, he continued, “I asked everyone on my senior team to come up with one big new idea—something they thought could really move the business forward. And most of them just repeated back to me some variation of their existing goals. I don’t know what to do to get them thinking differently.”

A year into his new job as CEO of a media company just coming out of bankruptcy, he had already made sweeping changes: completely revamping the senior team, spinning off an underperforming division, and investing in a small part of the business where he saw future potential. His big frustration? Figuring out how to get his folks to follow him into the unknown, to be willing to experiment, to try new things.

“I imagine they’re worried about being bad—and having you see them as bad,” I replied.

He seemed puzzled.

“These people are good at their jobs,” I went on, “and they believe you’ve hired them or kept them on because they’re good at their jobs. They feel highly competent, even expert, and that makes them confident and comfortable—especially in this organization that’s undergoing so much change.” I leaned forward. “And now you’re asking them to learn brand-new ways of operating, to suggest things they’ve never tried, to, perhaps, fail publicly. You’re asking them—these middle-aged experts—to risk going back to being novices… to look dumb, to make mistakes, to not know how to do things. And it’s enormously uncomfortable for them. They’re resisting it by focusing on what they know they’re good at and what they feel comfortable doing.”

His face cleared. This particular executive, unlike most people, is okay with the discomfort of trying new things in public. Making mistakes and having to ask I-don’t-know-what-that-means kinds of questions usually doesn’t bother him much.

“So I need to encourage them to get comfortable with being uncomfortable,” he said, starting to smile.

“Exactly,” I said. “You need to let them know that not having everything perfectly mapped out is okay, and that you expect that some of their new ideas won’t work. Most importantly, they have to know that you truly see trial and error as an inevitable part of breaking new ground. It’s your responsibility to support them in becoming as comfortable as possible with the terribly awkward reality of exploring and understanding new ideas and new skills. You have to know—and let them know that you know—that to get good at anything that’s new to you, you have to be willing to be bad at it first.”

And thus the title of this book. You may get tired of me saying this over and over again, but it’s the core of what we’ll be talking and thinking about throughout this book, and it’s a surprisingly unrecognized aspect of modern life. Because each one of us today is faced, moment to moment, with an overwhelming flood of information and possibilities that are brand new to us, we have to learn to be okay with being continuously uncomfortable in a way that no one in previous generations has had to do. As my client so wisely put it, we have to learn to be “comfortable with being uncomfortable.”

Before You Decide This Is Another Book About Failure

I feel compelled to take a pause and do a bit of mind reading here. I believe you may, right at this moment, be thinking, Oh, yeah—failing forward, failing fast. That’s what she’s talking about; I know about that. I’ve read books about failure…

I believe you may be thinking something like that because it’s how a pretty high percentage of people respond when we start talking to them about being bad first. The concept of “failing forward”—the title of John Maxwell’s enormously popular book about how to respond well to failure—is an important one, and has helped a great many people accept their failures and mistakes rather than being overwhelmed or defeated by them.

So just to clarify: this is not a book about forging ahead through failure, or about how it takes a hundred bad ideas (failures) to come up with a good one, or about “grit” or resilience in the face of failure. What I’ll be doing with you here is supporting you in building a few key habits of mind and action—mental skills that will allow you to acquire new capabilities quickly and continuously. This is an essential ability in today’s world.

One of those mental skills—often the most difficult to develop—is the ability to accept the discomfort and disequilibrium that is an inevitable part of learning something new. Sometimes that involves accepting failure, but more often it simply means learning to be okay with slowness, awkwardness, not being clear about things, having to ask embarrassing questions—that is, learning to be okay with being bad first on the way to getting good. Interestingly, learning to be bad first (along with the other three mental skills I’ll share with you throughout this book) can actually make it less likely that you’ll fail massively during the learning process. But more about that later (in chapter 8). The important thing for our purposes here is that the ability to be bad first, along with the other mental skills you’ll learn, is necessary to being a world-class learner, which is key to our success here in the twenty-first century.

Why Being Bad First Is So Essential Now

Let’s talk a bit more about why this is so. Unless you’re living somewhere deep in the equatorial rain forest or on top of a mountain, you know that we’re living in an era of unprecedented change, driven largely by the enormous daily proliferation of new knowledge. One fascinating way to conceptualize the viral growth of human knowledge was posited in the early 1980s by futurist Buckminster Fuller. In his book Critical Path, he introduced a concept he called the “knowledge-doubling curve.” He started by assuming that all of humankind’s collective knowledge in the year AD 1 was equal to one “knowledge unit.” Fuller estimated that it then took until about AD 1500 for that cumulative human knowledge to double—for us to have discovered and understood, as a species, twice as much: at that point we had two “knowledge units.”

He went on to propose that the next doubling—from two to four units—happened by 1750. He suggested that this second doubling was significantly accelerated by the invention of the printing press and the building of ocean-going ships that could travel fairly reliably from region to region, both of these innovations serving as powerful knowledge-spreading mechanisms.

He further estimated that our human knowledge had doubled again by around 1900. So, according to Fuller’s theory, at this point—nineteen hundred years from the beginning of his knowledge-doubling curve—we humans had acquired or created eight “knowledge units.” Pretty good work on the part of humanity… but we were just getting started. At this point, the curve really started to accelerate.

Fuller proposed that the next doubling occurred around 1950, the next in the mid-1970s, and the next in the late 1980s. And the acceleration continues: today, researchers estimate that human knowledge is doubling every twelve months—and some project that within the decade, it could be doubling every twelve hours.

Let’s take a minute and put that into individual human terms. My mom’s dad was born in 1887, right around the third doubling, when Fuller’s model assumes human knowledge had increased to 800 percent of what it had been in the year AD 1. But by the time my children—his great-grandchildren—were born in the 1980s… human knowledge had already increased again by 800 percent from when he was born. You’re reading that correctly: our knowledge had multiplied eightfold during that one-hundred-year period, just as it had during the preceding nineteen hundred years.
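If you’d like to see the arithmetic behind that comparison laid out, here is a minimal sketch in Python. The doubling dates are Fuller’s estimates as recounted above; the specific years standing in for the “mid-1970s” and “late 1980s” doublings are my own approximations.

```python
# A quick tabulation of Fuller's knowledge-doubling milestones as described above.
# The dates are Fuller's estimates; 1975 and 1988 are approximate stand-ins for
# the "mid-1970s" and "late 1980s" doublings.

doubling_years = [1500, 1750, 1900, 1950, 1975, 1988]

units = {1: 1}  # AD 1: one "knowledge unit"
for i, year in enumerate(doubling_years, start=1):
    units[year] = 2 ** i  # each milestone doubles the running total

for year, total in units.items():
    print(f"AD {year:>4}: {total:>2} knowledge unit(s)")

# The comparison in the text: by 1900 knowledge stood at 8 units (800 percent of
# the AD 1 baseline), and by the late 1980s it had multiplied by the same factor
# of 8 again -- this time in roughly one century instead of nineteen.
print(units[1988] // units[1900])  # -> 8
```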

I imagine you now have an image in your head of a geometric curve: a line that starts out almost horizontal, gradually moves upward, starts to curve, and then heads suddenly north, almost exploding off the top of the chart. In terms of our knowledge as a species, we’re now in that rocketing-upward part of the curve. But what does that mean for us, day to day? More than anything else, it means we have more options: this ramping up of knowledge has brought a concurrent ramping up of choices. Because we know so much more and understand so much more about ourselves and about the world around us, we all have many, many more choices to make at every moment than did our grandparents and great-grandparents—choices about what to focus on, what to learn, and how to use what we learn.

Here’s an example. My mom, who was born in 1922, used to tell us how she and her dad constructed a crystal radio set so they could listen to the account of Charles Lindbergh’s solo flight from New York to Paris in May of 1927—the first time anyone had crossed the Atlantic alone in an airplane. She remembered her dad telling her that Lindbergh was the first person in history to be in New York one day and in Paris the next. Nearly everyone in the world who had access to information about Lindbergh’s flight was focused on that event, from the enormous crowd that was there when he landed at an airfield outside Paris, to the thousands who were able to listen in, thanks to the modern miracle of radio, or who were informed through the telegrams and phone calls that sped around the world, to the millions who read about it over the following days in their local newspapers. For weeks after it happened, it was likely the most important event on most people’s minds. It probably took a number of months for the news of his success to get to everyone in the world who was interested in hearing it.

Contrast that with today: any event of any importance (and lots that we’d argue aren’t that important at all) zips from one side of the world to the other in mere moments. From presidential elections to the events of the Arab Spring, from government scandals to celebrity babies, from groundbreaking medical discoveries to scary pandemics: we have access to any and all information in the blink of an eye, and we have to decide, moment to moment, how to think about it, how to respond to it, and whether it affects our lives.

And this explosion of information (with its resulting explosion of choices about where to put our attention) has also created secondary explosions—in our scientific and technological knowledge, for instance, and in our beliefs and expectations about society and individual rights. Anyone from the early twentieth century magically transported into the twenty-first would be overwhelmed by the differences in daily life since his time, on both a technological and a cultural level. Cars are a part of every family’s daily life; pocket devices make phone calls, take pictures, and hold all our information; airplanes, television, antibiotics, and computers are facts of everyday life; the status of women and people of color has changed dramatically; and a much wider variety of lifestyles, religions, and philosophies are accepted as legitimate and commonplace.

But what does this have to do with my original point, that here in the twenty-first century it’s essential for us to learn how to “be bad first”? This explosion of knowledge, and the technological, scientific, and cultural advances that have resulted, have also dramatically changed how we learn and how we work—and what it takes to succeed at work and in our lives.

For someone growing up in the early part of the twentieth century, the expectations around learning were fairly clear: you would go to school to learn the basics, then land a job and learn what you needed in order to do that job reasonably well. You would go on to work in some version of that job until you retired. This was true whether you were a doctor or a pipe fitter: the vast majority of people learned a trade or profession, and practiced it throughout their working lives. Of course, some people aspired to be excellent at their chosen profession, and those people might take pride in learning the new techniques and approaches that occasionally arose, or in figuring out a better way to do some essential part of the job. But for the most part, someone who started working in 1925 would most likely be engaged in very much the same kind of work when he or she (usually he) retired with a gold watch in 1965.

In that environment, the ability to learn new things quickly and continuously wasn’t that important. People assumed, for the most part, that the bulk of learning would happen early in their lives, and that by the time they got to be adults, they could relax into being competent (or even excellent), and just make the effort necessary to maintain their existing skills and knowledge. Changing jobs—let alone changing careers—was seen for the most part as somewhat suspect, a sign of low commitment, poor work habits, or an inability to get along with people. Those old movies from the ’30s and ’40s, where someone says, “He’s a reporter” or “She’s a teacher”—that was an accurate reflection of the way people thought of themselves and one another: you had a job, and you were identified with that job until you retired from it, and, in fact, probably until you died.

Fast-forward to today, when most people coming into the workforce expect that they will have a variety of jobs and work at a number of companies, perhaps with a stint or two of working freelance mixed in—or even spend part of their career creating and working in their own company. The geometric increase of human knowledge has made it inevitable that nearly every job has changed and will change dramatically over the course of any person’s work life. And our new relationship with choice—the fact that daily, hourly choice has become the norm, and that we will be making different choices about the same topic as new possibilities emerge—has made it acceptable for us to choose different work at different times in our lives.

Three Generations: As Change Accelerates

This sea change in our careers and what we expect of them, driven by the geometric increase in knowledge over the past three generations, has affected most of us directly: we see it written in the family histories of almost everyone we know. Your grandparents approached work differently than did your parents; their approach differs from yours, and yours from that of your children.

My own family is no exception. My dad was born in Valley, Nebraska, in 1921, and grew up wanting to be a lawyer. After Pearl Harbor, he left college to join the Coast Guard, and fought in the Pacific theater of World War II for three years. After the war, he came home to marry my mom, his college sweetheart, and attend law school at the University of Nebraska on the GI Bill. After he graduated and passed the bar, they settled in Omaha, where he joined my grandfather’s law practice. They raised four kids and were active in the community. My dad practiced labor law in Omaha his whole life, until his death in 1988. He never expected to change careers, or to leave Omaha, and would have been puzzled at the idea. Why leave a career he loved and was good at, in a town where he had settled, where his kids had grown up, and where he had been successful both as an involved citizen and in his chosen profession? He didn’t have to think about doing anything differently: the reasonable pace of change in those decades allowed him to live his life in much the same way throughout that forty-year period.

Though he was an intelligent and thoughtful guy, I can’t imagine him taking time to seriously question the idea that his path was pretty much what a career was supposed to be: you found something you liked and were good at, and you worked at it to support your family for the rest of your working life. End of story.

I remember my mom and dad talking in worried voices about my uncle, a salesman who had had to change companies and move to new cities a few times in his career. It wasn’t that he was in danger of not being able to support his family; it was simply a cause for concern that he hadn’t been able to stay with one company throughout his career.

My siblings and I, born in the next generation, took far more divergent paths—which we also thought perfectly acceptable, given the accelerating pace of change in our young adult years. Of the four of us, only my older sister has had a single career—she’s a college professor—and even she has lived in three different cities since college, has taught at two different universities, and did her graduate work at yet a third. My two brothers and I all chose a kind of career path that must have seemed exceedingly strange to our Depression-era parents, but was very common among baby boomers: we spent our twenties and early thirties creating careers for ourselves, that is, inventing new work rather than slipping into existing jobs. It was our way of taking best advantage of the possibilities we saw before us—responding to the explosion of new knowledge and new technology around us in the ’70s and ’80s. My older brother created a business renovating and brokering high-end pianos and is a nationally respected piano technician and teacher of other technicians; my younger brother is a well-known journalist, social commentator, and best-selling novelist; I’m an author and the founding partner of Proteus, a consulting company that focuses on leader readiness. However, we didn’t entirely abandon the find-something-and-stick-with-it credo of our parents’ generation. We’ve each been doing some version of those careers we invented for decades now: we baby boomers, for the most part, still believed that the smart thing to do was find a good career and commit to it.

My children and my siblings’ children, however, have moved even further away from the “one career” credo, as the knowledge curve has really started to shoot up. Most of them are millennials, and each has had a number of jobs. Some of them have already explored more than one profession. For example, my older daughter graduated from college with a degree in marketing communications and worked in PR and customer service for a few years. She realized it wasn’t as satisfying and interesting to her as she’d hoped, so she went back to school and got a master’s degree in early childhood education, and is now teaching in a progressive independent school. Though she loves teaching, and is glad she made the change, she doesn’t take for granted that she’ll be a teacher for the rest of her career. She’s learning and adjusting as she goes, assuming that the main skill she’ll need is the ability to adapt to changing circumstances as she looks to craft career paths that she’ll enjoy and that will allow her to grow with and support her husband and children.

She and her generational counterparts rarely define themselves by their careers; they see career as something that will change as they change, in response to changing needs and circumstances. One study shows that these millennials expect to have fifteen to twenty different jobs by the time they retire. Not just promotions into a slightly bigger or more senior version of their current job, mind you—fifteen to twenty entirely different jobs. In fact, most people now in their twenties and thirties don’t assume that the job they have today will even exist forty years from now, let alone that they’ll have anything resembling that job by the time they retire. If they retire at all: 50 percent of millennials don’t expect to have access to Social Security or corporate retirement plans when they hit retirement age, and almost that many say they want to find work that’s so meaningful to them that they may not ever want to retire completely.

This same shift—from stability to fluidity—has happened on an organizational level. In the early twentieth century, the business landscape was dominated by big companies we all assumed would last forever. (Remember TWA? General Foods? Arthur Andersen?) Now, many of the companies that immediately come to mind when we think of successful enterprises are those that have arisen out of new technologies spawned by new knowledge (Amazon, Google, Apple, Samsung). And even the largest companies seem vulnerable these days to the sweeping changes in technology and consumer behavior driven by this ramping up of knowledge. For example, on a Wikipedia list of the world’s largest companies by revenue, almost one-third are in the oil and gas industry. What will happen to these mega-companies as our knowledge of alternative fuels and their profitable application increases exponentially?

Just like the rest of us, the folks who run these companies will have to learn and change—or fail. In the words of Arie de Geus, a wonderfully agile thinker and for many years the head of strategy for Royal Dutch Shell (interestingly, number two on that list of the world’s largest companies by revenue), “The ability to learn faster than your competitors may be the only sustainable competitive advantage.”1

Let’s focus on the one element of our world that is making that statement more and more true every day.

More Knowledge = More Communication = More Knowledge

We’ve been talking about the proliferation of knowledge and options over the past century, and the technological explosion spawned by that new knowledge. But we haven’t yet talked directly about the Internet, the twenty-first century’s most powerful knowledge distribution mechanism, even more potent than the printing press and seagoing vessels in terms of its ability to accelerate the advance of human knowledge. The advent of the Internet as a means of communicating and expanding our knowledge base makes “learning faster than your competitors” both more possible and more necessary.

If it’s true that human knowledge is now doubling every twelve months, we have the Internet to thank. For example, every scientific or medical breakthrough that happens today is immediately available online all over the world in complete detail, so that researchers can access that new knowledge instantly and begin to build upon it. The same is true of business innovation: in the old days (say, 1980), when people started a truly new kind of business, they could assume that they would have a couple of years to get their feet under them before they’d have much competition. No more. New ventures can succeed almost overnight, because social media can be harnessed to spread the word about their services or products with minimal investment. But the same virality of communication that allows for that success also makes that success completely and thoroughly visible to potential competitors, who can start replicating those approaches and products the next day.

Given all this, it seems clear that those who succeed in today’s world will be those who can acquire and apply new knowledge and new skills quickly and continuously. That’s really the premise of this book: that at this point in history, where knowledge is increasing exponentially, where work is changing daily, where advances in every discipline nearly outpace our ability to communicate them, the ability to learn well and quickly is the most important skill we can have.

And that requires, among a number of things that we’ll talk about throughout this book, being willing to be a novice over and over again—being willing to be bad at things on the way to getting good at them. There’s no way around it, I’m afraid: in order to thrive in this new world, you have to let go—on a daily basis—of the idea that to be an adult means to be an expert. You have to stop thinking that your primary goal at work and in life is to tick all the boxes correctly and not make any mistakes. That belief will be reassuring in the moment, but over time it will land you in the dustbin of history.

But Not Knowing Things Feels So… Bad

We may realize intellectually that being successful these days requires being open to continuous, disruptive learning, but that doesn’t mean we like it… or that we’re very good at it. The Financial Times noted in a recent article, for instance, that corporate learning tends to be largely ineffective. “Companies’ spending on training and development accounts for hundreds of billions of pounds globally each year. But every year, according to successive empirical studies, only 5 to 20 per cent of what is learnt finds its way back into the workplace.”2

Somehow, our attempts to teach and learn new and needed skills and understanding at work aren’t working very well.

This may be at least partly because our efforts to support people in acquiring new skills and knowledge don’t take into account that we so often resist learning new things—especially when those things are different from what we already know, or when we have to take on new ways of behaving or thinking in order to learn them. A study completed in 2010 at Cornell University by researchers Jennifer Mueller, Shimul Melwani, and Jack Goncalo focused on our conflicted relationship with new ideas.3

They used a test to determine people’s implicit attitudes toward new ideas. The test offered a list of words that described new or untested things (for instance, novel, creative, inventive, original) and other words that described standard or known things (such as practical, functional, constructive, and useful), and asked the study participants to categorize each word as “good” or “bad.” The study found that although people say that they like and want creativity and newness, when it came to categorizing the words that describe familiar things and unfamiliar things, they saw the “familiar” words as more positive and the “new” words as less positive, time and time again.

No matter what we assert about wanting to be creative and being open to new ideas, we tend to have a harder time accepting and feeling positive toward new ideas than toward those already proven and understood. One finding from the study was that people may believe they are open to insights and innovations but are generally only receptive to new ideas that fit with existing practices and maintain predictability.

In other words, we like new ideas or new skills as long as they reinforce our existing beliefs and experience. As I said early on in this chapter, we really like being good at things and feeling as though we’re competent. So if new knowledge comfortably supports and expands our sense of ourselves as expert, we’re fans of it. However, if the new knowledge makes us question what we know, or feel like we’re not as expert as we thought (and have told everybody that we were), we resist it.

In the late nineties, a British scholar named James Atherton demonstrated very compellingly that, though we say we want to learn and take in new ideas, when those new ideas start to poke holes in what we know, or send us into realms where we feel like novices—i.e., clumsy and inexpert—we often close down and resist that learning. Atherton notes, “Resistance to learning is a phenomenon well-known to most tutors and trainers of adults, but has received remarkably little attention in the literature.”4

By way of exploring this resistance, Atherton set up an interview-based study of social services professionals taking in-service training programs. This wasn’t some wacky adventure in free-form learning: it was knowledge the participants needed in order to succeed in their careers, and for the most part, the participants were not overtly negative about either the requirement to attend the training or the need to acquire this new knowledge—they saw why the training was necessary and were reasonably open to taking it.

However, when Atherton and his colleagues began investigating the participants’ actual responses to the training, they found something very different. When confronted with new facts and knowledge, especially knowledge that seemed to contradict what they already knew, the participants often simply shut down. They became confused, unable to concentrate, even angry when asked to learn things that would cause them to operate in new ways or rethink current practices. A number of participants “reported… a largely inexplicable inability to listen to or to understand ideas which they themselves felt they should have been able to manage intellectually with no difficulty. They used phrases like ‘I just couldn’t get my head round it.’”5

I’ve seen this for years in the executives we coach. If we’re working to help someone improve in an area where that person already has some competence—is okay at delegating, for instance, but needs to get better—that’s usually relatively easy. Especially if the delegation model we’re teaching aligns pretty well with what she has already been doing. But if an executive is really poor at delegating (and particularly if that person sees himself as an “expert” leader), it can be very tough. We often get the same kind of confusion, irritation, and inability to grasp relatively straightforward ideas or change fairly simple behaviors that Atherton encountered in his study. It can take a good deal of time, care, and attention to help executives adopt the mindset that will allow them to learn new skills necessary to their success.

In short, we don’t like being thrown into that “be bad first” position; as adults, we simply don’t like to do what feels like going backward, to being novices all over again. We secretly hope that we’ve left behind forever that feeling of incompetence that we have when we’re confronted with new knowledge or skills. We want to be fully competent adults who have it all together. As adults, having to start from scratch to learn something brand new can be confusing, demoralizing, even downright scary.

Like the executives in the example I gave at the beginning of the chapter—who may have verbally supported their boss’s focus on innovation, and may even have intellectually understood how important it was for them to open up to new ideas and new ways of doing things—Atherton’s study participants shut down when it came to the point of actually needing to step out of their comfort zones and do it. It felt too uncomfortable, too professionally and personally vulnerable, to venture into that vast territory of “I don’t know this.” They retreated to the known, turning their backs on the possibility of learning those new ways of understanding and operating that might carry them and their company forward into a more successful future.

So, here in this moment, when knowledge and the possibilities that arise from that knowledge are expanding exponentially—even as we speak—a key question for each of us is: How can we overcome our hesitation and our resistance to new learning, in order to become those “masters of mastery” who will best succeed in the twenty-first century?

The good news is, we’ve all got something inside of us that will help. We may hate to be bad at things—but we love getting good at things.