Unless you’ve missed the last couple decades, you know what information overload feels like.
Every year, there are six hundred thousand to a million new books published in English alone. Not to mention the millions of books published in other languages.
And that’s just books. More and more, as a society, we are consuming our information from an ever-growing flood of newer media. These range from the traditional media, like magazines, television, and newspapers, to the more modern blog posts, podcasts, audiobooks, and videos. In short, we are producing (and consuming) more information than ever before.
Not long ago, books were a precious commodity. People were lucky to own one or two books, and they read those books over and over again, savoring each page. In 1731, when Benjamin Franklin established the first subscription library, he pulled all kinds of strings to amass just forty-five books.1 Today, nearly three hundred years later, the Library of Congress holds over thirty-nine million books. And again, let me remind you: that’s just books. (Every weekday, the library receives about fifteen thousand items, adding about twelve thousand of them to the archives.)
For the most part, this is a very good thing. Throughout human history, progress has been loosely correlated with how easy it is for the average person to create—and access—knowledge. In this light, we might look at a few key events throughout history as major “turning points” in our development. The foundation of our modern world, then, was laid with the invention of writing, around five thousand years ago. Sure, we take it for granted today, but writing is what allowed us to asynchronously record and deliver information and knowledge from one person to another. No longer did we have to transmit information from person to person orally. More importantly, we no longer had to rely on our imperfect memories to store that information. This might not sound like a big deal, but it is. After all, every great empire is built on technology. For the British, that technology was ships. For the Romans, it was roads and metallurgy. But thousands of years before that, it was writing and accounting that helped the Sumerians build the first massive kingdoms.
Of course, even then, new information technology was not without its critics. Socrates, a proponent of memorization and oral education, often spoke against the use of writing, claiming it “weakens the memory and softens the mind.”2 Imagine that. I guess every generation has their own version of “that thing is turning your brain to mush!”
Controversial or not, the creation of writing was a massive technological breakthrough. It empowered us to disseminate important texts—mostly religious ones, mind you—to millions and millions of people. This enabled mass education and mass collaboration on a scale never before seen in human history. Pretty great, if you stop and think about it.
In the 1440s, Gutenberg’s commercial printing press took this a step further. While printing presses had existed in Asia for hundreds of years, none of them were as practical or as scalable. Gutenberg’s design, once perfected, enabled printers to easily reproduce and distribute many copies of books. This, in turn, made it much faster and easier to spread thoughts and ideas using the printed word.
For centuries after this revolutionary invention, the rate of information produced and consumed climbed steadily—and for good reason. Information—and therefore education—became cheaper and more readily available. This meant a more educated public, which, in turn, meant that more people were able to contribute to the growing body of knowledge. By the time he wrote Part III of his autobiography in 1788, Benjamin Franklin proudly proclaimed that his public library system had “made the common tradesmen and farmers as intelligent as most gentlemen from other countries.”3 In fact, Franklin himself was a perfect example of the effect a well-educated public can have on the body of human knowledge. Though his formal schooling ended when he was just ten years old, his life of autodidactic learning served him very well. In his lifetime, Franklin made significant contributions to the fields of politics, literature, science, governance, and more. And, in his roles as both postmaster for the US colonies and one of the most prominent printers and newspaper editors in the New World, he personally presided over this information explosion.
The next waves of innovation in information technology were, without a doubt, revolutionary. But they all shared one flaw with traditional print publishing: be it radio, broadcast television, or satellites, each still had gatekeepers. Besides the occasional community radio or TV show, it was just about impossible for your average person to spread information on a massive scale. This meant that the information shared was, for the most part, carefully curated.
All this changed with the advent of the internet. Sure, in the early days, you needed to know a bit about computers and HTML to produce something people could actually read on the internet. But no longer. Today, technological literacy is a given, and all it takes to share information is a couple of clicks. You don’t even need to be able to write intelligibly today…you can vlog!
This, once again, has resulted in an absolute explosion in the amount of information we as a society produce. I need not scare you with the statistics of how many millions of posts, tweets, videos, and podcasts are shared every day on the internet because chances are you’ve felt it. And sure, most of it is, for lack of a better word, noise. But a great deal of it is not.
Consider this for a moment: Of all the information you’ve consumed over the last week, how much of it came from “traditional” media outlets? You know, names like CNN, The New York Times, NPR, or Random House? Just twenty years ago, that number would have been 100 percent. But today, in the era of The Huffington Post, Medium blogs, independent podcasters, and self-made YouTube stars, it’s probably less than half. In fact, of the three million podcast downloads, tens of thousands of books, and over two hundred thousand online course enrollments I’ve delivered over the last five years, not one of them came through “traditional” media outlets. And guess what? More and more of your favorite authors, podcasters, and bloggers are bypassing the gatekeepers and publishing their work directly to you.
So, like I said…explosion.
This doesn’t just have repercussions for “casual” information like self-help books or interesting business podcasts. This democratization of knowledge creation has played out in more and more fields. At first, the explosion of information started in only the most technical of fields like science and medicine. Think about it: at the turn of the nineteenth century, a doctor was a doctor (and a veterinarian too, if the situation called for it!). But then, as the amount we knew about our bodies began to increase, it was no longer possible for one person to maintain a working knowledge of it all. Over time, the medical profession fragmented: pediatrics, internal medicine, oncology, orthopedics, radiology, psychiatry, and so on. Today, it’s even more fragmented. If your child has a tummy ache that just won’t go away, today, you’ll likely be referred to a pediatric gastroenterologist. Need a nose job? You’ll likely see a craniofacial plastic surgeon who does nothing but noses. And heaven forbid you should develop a rare form of bone cancer. You’ll need to find a good musculoskeletal oncologist.
But this explosion of knowledge hasn’t stopped there. Indeed, it has expanded outward. From the sciences and computer programming to history and law and even to “soft” skills like sales and marketing. Today, the floodgates are wide open, and every profession is experiencing the benefits—and the detriments!
Furthermore, because there are exponentially more of us creating this new knowledge, the pace at which we do so is not linear, but geometric.
For those of you who struggled with math as much as I did, here’s a helpful picture illustrating what I mean.
Just as Moore’s law famously predicted that the number of transistors on a chip would double every two years (driving computing power up and costs down), knowledge grows in a similarly exponential way. Twenty years ago, computer science students at Stanford could rest easy knowing that their four years of training would prepare them for their careers. Today, much of what they’ll learn is obsolete even before they graduate.
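If you want to see just how fast a “doubles every two years” curve runs away from you, here’s a quick back-of-the-envelope sketch (illustrative only; the function name is my own):

```python
def growth_factor(years, doubling_period=2):
    """How many times larger something becomes after `years`,
    if it doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# A career's worth of doubling, two years at a time
for y in (2, 10, 20):
    print(f"After {y:2d} years: {growth_factor(y):,.0f}x")
```

After just two years you’re at 2x, but after a twenty-year career you’re looking at 1,024x—which is why no four-year degree, however good, can front-load everything you’ll need to know.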
It’s clear why. After all, with the insane amounts of knowledge and technology at our fingertips, anyone, anywhere, can innovate. A few decades ago, early programming languages like C were developed over the course of years at massive institutions like Bell Labs. Today, someone like David Heinemeier Hansson can build a framework like Ruby on Rails (which powers many of your favorite websites) in his spare time—in about six months!
One afternoon, during the writing of this book, I broke for lunch on the rooftop patio of my office building in the bustling high-tech hub of Tel Aviv. Once there, I couldn’t help but overhear a loud, heated debate between the founders of a cybersecurity startup. From the sound of it, they had discovered the perfect job candidate. She was friendly. She was ambitious. Heck, she was almost overqualified for the job. There was only one problem: she wanted to travel for six months before starting.
The conversation that ensued reads eerily like a marketing campaign for this book:
“It’s a long time, but she’s really talented.”
“You’re right. She’s amazing. But six months? Our industry moves at light speed. Six months is an eternity. Even if she were Steve Jobs, after six months of traveling, her skills would be completely irrelevant!”
“You’re being a little unreasonable, aren’t you?”
“Am I? Did you see that article I sent you about the innovation happening in micropayments right now? In just three months, those guys have created a whole new freaking industry! And she wants six?”
Something tells me she didn’t get the job.
Whereas it used to be only doctors and programmers who struggled to keep up with the pace of their field, today, it’s almost everybody. Marketing managers who aren’t caught up on all the latest consumer psychology research. Sales professionals who haven’t learned the latest features of their software of choice. Professionals in every industry who want to take their career to the next level but are struggling to keep up with the work they already have—much less make time for “leisure” learning like foreign languages, musical instruments, new skills, or pleasure reading.
Perhaps you’ve already felt this overwhelm. Perhaps you’re in one of the few professions that hasn’t felt it—yet. One way or another, let me assure you: it’s coming. And until this rapid progress brings us the technology to “download” information directly into our gray matter, the overwhelm is only going to get worse.
Fortunately, there’s a better way. A way to not only choose the right things to learn, but to absorb them with relative ease—and actually remember them! Fortunately, you can become a SuperLearner.