1. Glass

Roughly 26 million years ago, something happened over the sands of the Libyan Desert, the bleak, impossibly dry landscape that marks the eastern edge of the Sahara. We don’t know exactly what it was, but we do know that it was hot. Grains of silica melted and fused under an intense heat that must have been at least a thousand degrees. The silicon dioxide they formed has a number of curious chemical traits. Like H2O, it forms crystals in its solid state, and melts into a liquid when heated. But silicon dioxide has a much higher melting point than water: instead of 32 degrees Fahrenheit, you need temperatures above a thousand degrees. The truly peculiar thing about silicon dioxide, though, is what happens when it cools. Liquid water will happily re-form the crystals of ice if the temperature drops back down again. But silicon dioxide is somehow incapable of rearranging itself back into the orderly structure of a crystal. Instead, it forms a new substance that exists in a strange limbo between solid and liquid, a substance human beings have been obsessed with since the dawn of civilization. When those superheated grains of sand cooled down below their melting point, a vast stretch of the Libyan Desert was coated with a layer of what we now call glass.

About ten thousand years ago, give or take a few millennia, someone traveling through the desert stumbled across a large fragment of this glass. We don’t know anything more about that fragment, only that it must have impressed just about everyone who came into contact with it, because it circulated through the markets and social networks of early civilization, until it ended up as a centerpiece of a brooch, carved into the shape of a scarab beetle. It sat there undisturbed for four thousand years, until archeologists unearthed it in 1922 while exploring the tomb of an Egyptian ruler. Against all odds, that small sliver of silicon dioxide had found its way from the Libyan Desert into the burial chamber of Tutankhamun.

Glass first made the transition from ornament to advanced technology during the height of the Roman Empire, when glassmakers figured out ways to make the material sturdier and less cloudy than naturally forming glass like that of King Tut’s scarab. Glass windows were built during this period for the first time, laying the groundwork for the shimmering glass towers that now populate city skylines around the world. The visual aesthetics of drinking wine emerged as people consumed it in semitransparent glass vessels and stored it in glass bottles. But, in a way, the early history of glass is relatively predictable: craftsmen figured out how to melt the silica into drinking vessels or windowpanes, exactly the sort of typical uses we instinctively associate with glass today. It wasn’t until the next millennium, and the fall of another great empire, that glass became what it is today: one of the most versatile and transformative materials in all of human culture.

Pectoral in gold cloisonné with semiprecious stones and glass paste, with winged scarab, symbol of resurrection, in center, from the tomb of Pharaoh Tutankhamun

THE SACKING of Constantinople in 1204 was one of those historical quakes that send tremors of influence rippling across the globe. Dynasties fall, armies surge and retreat, the map of the world is redrawn. But the fall of Constantinople also triggered a seemingly minor event, lost in the midst of that vast reorganization of religious and geopolitical dominance and ignored by most historians of the time. A small community of glassmakers from Turkey sailed westward across the Mediterranean and settled in Venice, where they began practicing their trade in the prosperous new city growing out of the marshes on the shores of the Adriatic Sea.

Circa 1900: Roman civilization, first–second century AD glass containers for ointments

It was one of a thousand migrations set in motion by Constantinople’s fall, but looking back over the centuries, it turned out to be one of the most significant. As they settled into the canals and crooked streets of Venice, at that point arguably the most important hub of commercial trade in the world, their skills at blowing glass quickly created a new luxury good for the merchants of the city to sell around the globe. But lucrative as it was, glassmaking was not without its liabilities. The melting point of silicon dioxide required furnaces burning at temperatures near 1,000 degrees, and Venice was a city built almost entirely out of wooden structures. (The classic stone Venetian palaces would not be built for another few centuries.) The glassmakers had brought a new source of wealth to Venice, but they had also brought the less appealing habit of burning down the neighborhood.

In 1291, in an effort to both retain the skills of the glassmakers and protect public safety, the city government sent the glassmakers into exile once again, only this time their journey was a short one—a mile across the Venetian Lagoon to the island of Murano. Unwittingly, the Venetian doges had created an innovation hub: by concentrating the glassmakers on a single island the size of a small city neighborhood, they triggered a surge of creativity, giving birth to an environment that possessed what economists call “information spillover.” The density of Murano meant that new ideas were quick to flow through the entire population. The glassmakers were in part competitors, but their family lineages were heavily intertwined. There were individual masters in the group that had more talent or expertise than the others, but in general the genius of Murano was a collective affair: something created by sharing as much as by competitive pressures.

A section of a fifteenth-century map of Venice, showing the island of Murano

By the first years of the next century, Murano had become known as the Isle of Glass, and its ornate vases and other exquisite glassware became status symbols throughout Western Europe. (The glassmakers continue to work their trade today, many of them direct descendants of the original families that emigrated from Turkey.) It was not exactly a model that could be directly replicated in modern times: mayors looking to bring the creative class to their cities probably shouldn’t consider forced exile and borders armed with the death penalty. But somehow it worked. After years of trial and error, experimenting with different chemical compositions, the Murano glassmaker Angelo Barovier took seaweed rich in potassium oxide and manganese, burned it to create ash, and then added these ingredients to molten glass. When the mixture cooled, it created an extraordinarily clear type of glass. Struck by its resemblance to the clearest rock crystals of quartz, Barovier called it cristallo. This was the birth of modern glass.

WHILE GLASSMAKERS such as Barovier were brilliant at making glass transparent, we didn’t understand scientifically why glass is transparent until the twentieth century. Most materials absorb the energy of light. On a subatomic level, electrons orbiting the atoms that make up the material effectively “swallow” the energy of the incoming photon of light, causing those electrons to gain energy. But electrons can gain or lose energy only in discrete steps, known as “quanta,” and the size of those steps varies from material to material. Silicon dioxide happens to have very large steps, which means that the energy from a single photon of visible light is not sufficient to bump its electrons up to the higher level of energy. Instead, the light passes through the material. (Most ultraviolet light, however, does have enough energy to be absorbed, which is why you can’t get a suntan through a glass window.) But light doesn’t simply pass through glass; it can also be bent and distorted, or even broken up into its component wavelengths. Glass could be used to change the look of the world, by bending light in precise ways. This turned out to be even more revolutionary than simple transparency.
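For readers who want to see the arithmetic, here is a minimal sketch of that energy-step logic in Python. The absorption threshold below is an assumed round number for ordinary window glass, chosen purely for illustration; nothing in the snippet is a measured property quoted in this chapter:

```python
# A minimal sketch of the "energy steps" argument for transparency.
# The absorption threshold is an assumed value for ordinary window
# glass (roughly a 320 nm cutoff), chosen for illustration only.

PLANCK_EV_NM = 1239.84  # hc in eV·nm, so E(eV) = 1239.84 / wavelength(nm)
ABSORPTION_THRESHOLD_EV = 3.9  # assumed minimum energy the glass can absorb

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy carried by a single photon at the given wavelength."""
    return PLANCK_EV_NM / wavelength_nm

for label, wavelength_nm in [("green light", 550.0), ("UVB sunlight", 300.0)]:
    energy = photon_energy_ev(wavelength_nm)
    fate = "absorbed" if energy >= ABSORPTION_THRESHOLD_EV else "passes through"
    print(f"{label} ({wavelength_nm:.0f} nm): {energy:.2f} eV -> {fate}")

# green light (550 nm): 2.25 eV -> passes through
# UVB sunlight (300 nm): 4.13 eV -> absorbed
```

The point of the toy model: a visible photon cannot lift the glass’s electrons over the assumed energy step, so it sails through, while the more energetic ultraviolet photon can, so it is swallowed.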

In the monasteries of the twelfth and thirteenth centuries, monks laboring over religious manuscripts in candlelit rooms used curved chunks of glass as a reading aid. They would run what were effectively bulky magnifiers over the page, enlarging the Latin inscriptions. No one is sure exactly when or where it happened, but somewhere around this time in Northern Italy, glassmakers came up with an innovation that would change the way we see the world, or at least clarify it: shaping glass into small disks that bulge in the center, placing each one in a frame, and joining the frames together at the top, creating the world’s first spectacles.
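For the curious, the optics of those bulging disks can be captured in a few lines. This is a sketch of the standard thin-lens lensmaker’s equation, with a hypothetical refractive index and radii of curvature; it is not a reconstruction of any historical lens:

```python
# Lensmaker's equation for a thin lens in air: 1/f = (n - 1) * (1/R1 - 1/R2).
# All numbers below are illustrative assumptions, not historical measurements.

GLASS_INDEX = 1.52   # assumed refractive index of ordinary glass
R_FRONT_M = 0.20     # front surface radius of curvature, in meters (convex)
R_BACK_M = -0.20     # back surface radius; negative means convex on the back too

power_diopters = (GLASS_INDEX - 1) * (1 / R_FRONT_M - 1 / R_BACK_M)
focal_length_m = 1 / power_diopters

print(f"power: {power_diopters:.1f} diopters, focal length: {focal_length_m * 100:.0f} cm")
# -> power: 5.2 diopters, focal length: 19 cm
```

A disk like this, held near the page, acts as roughly a +5 diopter magnifier: strong enough to enlarge a Latin inscription, which is precisely the job the monks’ reading aids performed.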

Those early spectacles were called roidi da ogli, meaning “disks for the eyes.” Thanks to their resemblance to lentil beans—lentes in Latin—the disks themselves came to be called “lenses.” For several generations, these ingenious new devices were almost exclusively the province of monastic scholars. The condition of “hyperopia”—farsightedness—was widely distributed through the population, but most people didn’t notice that they suffered from it, because they didn’t read. For a monk, straining to translate Lucretius by the flickering light of a candle, the need for spectacles was all too apparent. But the general population—the vast majority of them illiterate—had almost no occasion to discern tiny shapes like letterforms as part of their daily routine. People were farsighted; they just didn’t have any real reason to notice that they were farsighted. And so spectacles remained rare and expensive objects.

The earliest image of a monk with glasses, 1342

What changed all of that, of course, was Gutenberg’s invention of the printing press in the 1440s. You could fill a small library with the amount of historical scholarship that has been published documenting the impact of the printing press, the creation of what Marshall McLuhan famously called “the Gutenberg galaxy.” Literacy rates rose dramatically; subversive scientific and religious theories routed around the official channels of orthodox belief; popular amusements like the novel and printed pornography became commonplace. But Gutenberg’s great breakthrough had another, less celebrated effect: it made a massive number of people aware for the first time that they were farsighted. And that revelation created a surge in demand for spectacles.

What followed was one of the most extraordinary cases of the hummingbird effect in modern history. Gutenberg made printed books relatively cheap and portable, which triggered a rise in literacy, which exposed a flaw in the visual acuity of a sizable part of the population, which then created a new market for the manufacture of spectacles. Within a hundred years of Gutenberg’s invention, thousands of spectacle makers around Europe were thriving, and glasses became the first piece of advanced technology—since the invention of clothing in Neolithic times—that ordinary people would regularly wear on their bodies.

But the coevolutionary dance did not stop there. Just as the nectar of flowering plants encouraged a new kind of flight in the hummingbird, the economic incentive created by the surging market for spectacles engendered a new pool of expertise. Europe was not just awash in lenses, but also in ideas about lenses. Thanks to the printing press, the Continent was suddenly populated by people who were experts at manipulating light through slightly convex pieces of glass. These were the hackers of the first optical revolution. Their experiments would inaugurate a whole new chapter in the history of vision.

Fifteenth-century glasses

In 1590 in the small town of Middelburg in the Netherlands, father-and-son spectacle makers Hans and Zacharias Janssen experimented with lining up two lenses, not side by side like spectacles, but in line with each other, magnifying the objects they observed, thereby inventing the microscope. Within seventy years, the British scientist Robert Hooke had published his groundbreaking illustrated volume Micrographia, with gorgeous hand-drawn images re-creating what Hooke had seen through his microscope. Hooke analyzed fleas, wood, leaves, even his own frozen urine. But his most influential discovery came by carving off a thin sheet of cork and viewing it through the microscope lens. “I could exceeding plainly perceive it to be all perforated and porous, much like a Honey-comb,” Hooke wrote, “but that the pores of it were not regular; yet it was not unlike a Honey-comb in these particulars . . . these pores, or cells, were not very deep, but consisted of a great many little Boxes.” With that sentence, Hooke gave a name to one of life’s fundamental building blocks—the cell—leading the way to a revolution in science and medicine. Before long the microscope would reveal the invisible colonies of bacteria and viruses that both sustain and threaten human life, which in turn led to modern vaccines and antibiotics.

The Flea (engraving from Robert Hooke’s Micrographia, London)

The microscope took nearly three generations to produce truly transformative science, but for some reason the telescope generated its revolutions more quickly. Twenty years after the invention of the microscope, a cluster of Dutch lensmakers, including Zacharias Janssen, more or less simultaneously invented the telescope. (Legend has it that one of them, Hans Lippershey, stumbled upon the idea while watching his children playing with his lenses.) Lippershey was the first to apply for a patent, describing a device “for seeing things far away as if they were nearby.” Within a year, Galileo got word of this miraculous new device, and modified the Lippershey design to reach a magnification of ten times normal vision. In January of 1610, just two years after Lippershey had filed for his patent, Galileo used the telescope to observe that moons were orbiting Jupiter, the first real challenge to the Aristotelian paradigm that assumed all heavenly bodies circled the Earth.
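As a rough illustration of what “ten times normal vision” means in optical terms: in a two-lens telescope of this kind, the angular magnification is the ratio of the objective’s focal length to the eyepiece’s. The focal lengths below are hypothetical round numbers, not measurements of Galileo’s instrument:

```python
# Angular magnification of a simple two-lens (Galilean) telescope.
# Both focal lengths are assumed values chosen to illustrate a 10x result.

OBJECTIVE_FOCAL_MM = 980.0  # long-focus convex objective lens (assumed)
EYEPIECE_FOCAL_MM = 98.0    # short-focus eyepiece (assumed magnitude)

magnification = OBJECTIVE_FOCAL_MM / EYEPIECE_FOCAL_MM
print(f"angular magnification: {magnification:.0f}x")  # -> 10x
```

The design lesson is the same one the Dutch lensmakers stumbled on: magnifying power comes not from one heroic lens but from the ratio between two of them.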

This is the strange parallel history of Gutenberg’s invention. It has long been associated with the scientific revolution, for several reasons. Pamphlets and treatises from alleged heretics like Galileo could circulate ideas outside the censorious limits of the Church, ultimately undermining its authority; at the same time, the system of citation and reference that evolved in the decades after Gutenberg’s Bible became an essential tool in applying the scientific method. But Gutenberg’s creation advanced the march of science in another, less familiar way: it expanded the possibilities of lens design, and of glass itself. For the first time, the peculiar physical properties of silicon dioxide were not just being harnessed to let us see things that we could already see with our own eyes; we could now see things that transcended the natural limits of human vision.

The lens would go on to play a pivotal role in nineteenth- and twentieth-century media. It was first utilized by photographers to focus beams of light on specially treated paper that captured images, then by filmmakers to both record and subsequently project moving images for the first time. Starting in the 1940s, we began coating glass with phosphor and firing electrons at it, creating the hypnotic images of television. Within a few years, sociologists and media theorists were declaring that we had become a “society of the image,” the literate Gutenberg galaxy giving way to the blue glow of the TV screen and the Hollywood glamour shot. Those transformations emerged out of a wide range of innovations and materials, but all of them, in one way or another, depended on the unique ability of glass to transmit and manipulate light.

An early microscope designed by Robert Hooke, 1665

To be sure, the story of the modern lens and its impact on media is not terribly surprising. There’s an intuitive line that you can follow from the lenses of the first spectacles, to the lens of a microscope, to the lens of a camera. Yet glass would turn out to have another bizarre physical property, one that even the master glassblowers of Murano had failed to exploit.

AS PROFESSORS GO, the physicist Charles Vernon Boys was apparently a lousy one. H. G. Wells, who was briefly one of Boys’s students at London’s Royal College of Science, later described him as “one of the worst teachers who has ever turned his back on a restive audience. . . . [He] messed about with the blackboard, galloped through an hour of talk, and bolted back to the apparatus in his private room.”

But what Boys lacked in teaching ability he made up for in his gift for experimental physics, designing and building scientific instruments. In 1887, as part of his physics experiments, Boys wanted to create a very fine shard of glass to measure the effects of delicate physical forces on objects. He had an idea that he could use a thin fiber of glass as a balance arm. But first he had to make one.

Hummingbird effects sometimes happen when an innovation in one field exposes a flaw in some other technology (or in the case of the printed book, in our own anatomy) that can be corrected only by another discipline altogether. But sometimes the effect arrives thanks to a different kind of breakthrough: a dramatic increase in our ability to measure something, and an improvement in the tools we build for measuring. New ways of measuring almost always imply new ways of making. Such was the case with Boys’s balance arm. But what made Boys such an unusual figure in the annals of innovation is the decidedly unorthodox tool he used in pursuit of this new measuring device. To create his thin string of glass, Boys built a special crossbow in his laboratory, and created lightweight arrows (or bolts) for it. To one bolt he attached the end of a glass rod with sealing wax. Then he heated the glass until it softened, and he fired the bolt. As the bolt hurtled toward its target, it pulled a tail of fiber from the molten glass clinging to the crossbow. In one of his shots, Boys produced a thread of glass that stretched almost ninety feet long.

Charles Vernon Boys standing in a laboratory, 1917

“If I had been promised by a good fairy anything I desired, I would have asked for one thing with so many valuable properties as these fibres,” Boys would later write. Most astonishing, though, was how strong the fiber was: as durable as, if not more durable than, an equivalently sized strand of steel. For thousands of years, humans had employed glass for its beauty and transparency, and tolerated its chronic fragility. But Boys’s crossbow experiment suggested that there was one more twist in the story of this amazingly versatile material: using glass for its strength.

By the middle of the next century, glass fibers, now wound together in a miraculous new material called fiberglass, were everywhere: in home insulation, clothes, surfboards, megayachts, helmets, and the circuit boards that connected the chips of a modern computer. The fuselage of Airbus’s flagship jet, the A380—the largest commercial aircraft in the skies—is built with a composite of aluminum and fiberglass, making it much more resistant to fatigue and damage than traditional aluminum shells. Ironically, most of these applications ignored silicon dioxide’s strange capacity to transmit light waves: most objects made of fiberglass do not look to the untutored eye to be made of glass at all. During the first decades of innovation with glass fibers, this emphasis on nontransparency made sense. It was useful to allow light to pass through a windowpane or a lens, but why would you need to pass light through a fiber not much bigger than a human hair?

The transparency of glass fibers became an asset only once we began thinking of light as a way to encode digital information. In 1970, researchers at Corning Glass Works—the Murano of modern times—developed a type of glass that was so extraordinarily clear that if you created a block of it the length of a bus, it would be just as transparent as looking through a normal windowpane. (Today, after further refinements, the block could be a half-mile long with the same clarity.) Scientists at Bell Labs then took fibers of this super-clear glass and shot laser beams down the length of them, sending fluctuating optical signals that corresponded to the zeroes and ones of binary code. This hybrid of two seemingly unrelated inventions—the concentrated, orderly light of lasers, and the hyper-clear glass fibers—came to be known as fiber optics. Using fiber-optic cables was vastly more efficient than sending electrical signals over copper cables, particularly for long distances: light allows much more bandwidth and is far less susceptible to noise and interference than is electrical energy. Today, the backbone of the global Internet is built out of fiber-optic cables. Roughly ten distinct cables traverse the Atlantic Ocean, carrying almost all the voice and data communications between the continents. Each of those cables contains a collection of separate fibers, surrounded by layers of steel and insulation to keep them watertight and protected from fishing trawlers, anchors, and even sharks. Each individual fiber is thinner than a piece of straw. It seems impossible, but the fact is that you can hold the entire collection of all the voice and data traffic traveling between North America and Europe in the palm of one hand. A thousand innovations came together to make that miracle possible: we had to invent the idea of digital data itself, and laser beams, and computers at both ends that could transmit and receive those beams of information—not to mention the ships that lay and repair the cables. But those strange bonds of silicon dioxide, once again, turn out to be central to the story. The World Wide Web is woven together out of threads of glass.
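The phrase “zeroes and ones of binary code” can be made concrete with a toy simulation. The sketch below uses on-off keying, the simplest scheme for putting bits on a light beam: laser on for a 1, laser off for a 0. The attenuation figure is an assumption (around 0.2 dB per kilometer is typical of modern fiber), and the link length and detection threshold are hypothetical:

```python
# Toy on-off keying (OOK) over a lossy fiber: pulses of light for 1s,
# darkness for 0s. Numbers are illustrative assumptions, not specifications.

ATTENUATION_DB_PER_KM = 0.2  # assumed loss of modern single-mode fiber
LINK_LENGTH_KM = 100.0       # hypothetical span between repeaters

def encode(message: str) -> list[int]:
    """Turn text into a stream of 1s (laser on) and 0s (laser off)."""
    return [int(bit) for byte in message.encode("ascii")
            for bit in format(byte, "08b")]

def transmit(bits: list[int]) -> list[float]:
    """Attenuate each pulse: received power = sent power * 10^(-dB/10)."""
    loss_db = ATTENUATION_DB_PER_KM * LINK_LENGTH_KM
    surviving_fraction = 10 ** (-loss_db / 10)  # 20 dB loss -> 1% survives
    return [bit * surviving_fraction for bit in bits]

def decode(pulses: list[float], threshold: float) -> str:
    """Anything brighter than the threshold counts as a 1."""
    bits = "".join("1" if p > threshold else "0" for p in pulses)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("ascii")

received = transmit(encode("glass"))
print(decode(received, threshold=1e-4))  # -> glass
```

Even after the light has faded to a hundredth of its launch power, the receiver only has to tell bright from dark, which is part of why a fiber thinner than a straw can carry so much of a continent’s conversation.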

Think of that iconic, early-twenty-first-century act: snapping a selfie on your phone as you stand in some exotic spot on vacation, and then uploading the image to Instagram or Twitter, where it circulates to other people’s phones and computers all around the world. We’re accustomed to celebrating the innovations that have made this act almost second nature to us now: the miniaturization of digital computers into handheld devices, the creation of the Internet and the Web, the interfaces of social-networking software. What we rarely do is recognize the way glass supports this entire network: we take pictures through glass lenses, store and manipulate them on circuit boards made of fiberglass, transmit them around the world via glass cables, and enjoy them on screens made of glass. It’s silicon dioxide all the way down the chain.

IT’S EASY TO MAKE FUN of our penchant for taking selfies, but in fact there is a long and storied tradition behind that form of self-expression. Some of the most revered works of art from the Renaissance and early modernism are self-portraits; from Dürer to Leonardo, to Rembrandt, all the way to van Gogh with his bandaged ear, painters have been obsessed with capturing detailed and varied images of themselves on the canvas. Rembrandt, for instance, painted around forty self-portraits over the course of his life. But the interesting thing about self-portraiture is that it effectively doesn’t exist as an artistic convention in Europe before 1400. People painted landscapes and royalty and religious scenes and a thousand other subjects. But they didn’t paint themselves.

The explosion of interest in self-portraiture was the direct result of yet another technological breakthrough in our ability to manipulate glass. Back in Murano, the glassmakers had figured out a way to combine their crystal-clear glass with a new innovation in metallurgy, coating the back of the glass with an amalgam of tin and mercury to create a shiny and highly reflective surface. For the first time, mirrors became part of the fabric of everyday life. This was a revelation on the most intimate of levels: before mirrors came along, the average person went through life without ever seeing a truly accurate representation of his or her face, just fragmentary, distorted glances in pools of water or polished metals.

Mirrors appeared so magical that they were quickly integrated into somewhat bizarre sacred rituals: During holy pilgrimages, it became common practice for well-off pilgrims to take a mirror with them. When visiting sacred relics, they would position themselves so that they could catch sight of the bones in the mirror’s reflection. Back home, they would then show off these mirrors to friends and relatives, boasting that they had brought back physical evidence of the relic by capturing the reflection of the sacred scene. Before turning to the printing press, Gutenberg had the start-up idea of manufacturing and selling small mirrors for departing pilgrims.

Las Meninas by Diego Rodríguez de Silva y Velázquez

But the mirror’s most significant impact would be secular, not sacred. Filippo Brunelleschi employed a mirror to invent linear perspective in painting, by drawing a reflection of the Florence Baptistry instead of his direct perception of it. The art of the late Renaissance is heavily populated by mirrors lurking inside paintings, most famously in Diego Velázquez’s inverted masterpiece, Las Meninas, which shows the artist (and the extended royal family) in the middle of painting King Philip IV and Queen Mariana of Spain. The entire image is captured from the point of view of two royal subjects sitting for their portrait; it is, in a very literal sense, a painting about the act of painting. The king and queen are visible only in one small fragment of the canvas, just to the right of Velázquez himself: two small, blurry images reflected back in a mirror.

As a tool, the mirror became an invaluable asset to painters who could now capture the world around them in a far more realistic fashion, including the detailed features of their own faces. Leonardo da Vinci observed the following in his notebooks (using mirrors, naturally, to write in his legendary backward script):

When you wish to see whether the general effect of your picture corresponds with that of the object represented after nature, take a mirror and set it so that it reflects the actual thing, and then compare the reflection with your picture, and consider carefully whether the subject of the two images is in conformity with both, studying especially the mirror. The mirror ought to be taken as a guide.

The historian Alan MacFarlane writes of the role of glass in shaping artistic vision, “It is as if all humans had some kind of systematic myopia, but one which made it impossible to see, and particularly to represent, the natural world with precision and clarity. Humans normally saw nature symbolically, as a set of signs. . . . What glass ironically did was to take away or compensate for the dark glass of human sight and the distortions of the mind, and hence to let in more light.”

At the exact moment that the glass lens was allowing us to extend our vision to the stars or microscopic cells, glass mirrors were allowing us to see ourselves for the first time. It set in motion a reorientation of society that was more subtle, but no less transformative, than the reorientation of our place in the universe that the telescope engendered. “The most powerful prince in the world created a vast hall of mirrors, and the mirror spread from one room to another in the bourgeois household,” Lewis Mumford writes in his Technics and Civilization. “Self-consciousness, introspection, mirror-conversation developed with the new object itself.” Social conventions as well as property rights and other legal customs began to revolve around the individual rather than the older, more collective units: the family, the tribe, the city, the kingdom. People began writing about their interior lives with far more scrutiny. Hamlet ruminated onstage; the novel emerged as a dominant form of storytelling, probing the inner mental lives of its characters with an unrivaled depth. Entering a novel, particularly a first-person narrative, was a kind of conceptual parlor trick: it let you swim through the consciousness, the thoughts and emotions, of other people more effectively than any aesthetic form yet invented. The psychological novel, in a sense, is the kind of story you start wanting to hear once you begin spending meaningful hours of your life staring at yourself in the mirror.

How much does this transformation owe to glass? Two things are undeniable: the mirror played a direct role in allowing artists to paint themselves and invent perspective as a formal device; and shortly thereafter a fundamental shift occurred in the consciousness of Europeans that oriented them around the self in a new way, a shift that would ripple across the world (and that is still rippling). No doubt many forces converged to make this shift possible: the self-centered world played well with the early forms of modern capitalism that were thriving in places like Venice and Holland (home to those masters of painterly introspection, Dürer and Rembrandt). Likely, these various forces complemented each other: glass mirrors were among the first high-tech furnishings for the home, and once we began gazing into those mirrors, we began to see ourselves differently, in ways that encouraged the market systems that would then happily sell us more mirrors. It’s not that the mirror made the Renaissance, exactly, but that it got caught up in a positive feedback loop with other social forces, and its unusual capacity to reflect light strengthened those forces. This is what the robot historian’s perspective allows us to see: the technology is not a single cause of a cultural transformation like the Renaissance, but it is, in many ways, just as important to the story as the human visionaries that we conventionally celebrate.

MacFarlane has an artful way of describing this kind of causal relationship. The mirror doesn’t “force” the Renaissance to happen; it “allows” it to happen. The elaborate reproductive strategy of the flowering plants didn’t force the hummingbird to evolve its spectacular aerodynamics; it created the conditions that allowed the hummingbird to take advantage of the flowers’ free sugars by evolving such a distinctive trait. The fact that the hummingbird is so unique in the avian kingdom suggests that, had the flowers not evolved their symbiotic dance with the insects, the hummingbird’s hovering skills would have never come into being. It’s easy to imagine a world with flowers but without hummingbirds. But it’s much harder to imagine a world without flowers but with hummingbirds.

The same holds true for technological advances like the mirror. Without a technology that enabled humans to see a clear reflection of reality, including their own faces, the particular constellation of ideas in art and philosophy and politics that we call the Renaissance would have had a much more difficult time coming into being. (Japanese culture had highly prized steel mirrors during roughly the same period, but never adopted them for the same introspective use that flourished in Europe—perhaps in part because steel reflected much less light than glass mirrors, and added unnatural coloring to the image.) Yet the mirror was not exclusively dictating the terms of the European revolution in the sense of self. A different culture, inventing the fine glass mirror at a different point in its historical development, might not have experienced the same intellectual revolution, because the rest of its social order differed from that of the fifteenth-century Italian city-states. The Renaissance also benefited from a patronage system that enabled its artists and scientists to spend their days playing with mirrors instead of, say, foraging for nuts and berries. A Renaissance without the Medici—not the individual family, of course, but the economic class they represent—is as hard to imagine as the Renaissance without the mirror.

It should probably be said that the virtues of the society of the self are entirely debatable. Orienting laws around individuals led directly to an entire tradition of human rights and the prominence of individual liberty in legal codes. That has to count as progress. But reasonable people disagree about whether we have now tipped the scales too far in the direction of individualism, away from those collective organizations: the union, the community, the state. Resolving those disagreements requires a different set of arguments—and values—than the ones we need to explain where those disagreements came from. The mirror helped invent the modern self, in some real but unquantifiable way. That much we should agree on. Whether that was a good thing in the end is a separate question, one that may never be settled conclusively.

THE DORMANT VOLCANO of Mauna Kea on Hawaii’s Big Island rises almost fourteen thousand feet above sea level, though the mountain extends almost another twenty thousand feet down to the ocean floor below, making it significantly larger than Mount Everest in terms of base-to-peak height. It is one of the few places in the world where you can drive from sea level to fourteen thousand feet in a matter of hours. At the summit, the landscape is barren, almost Martian, in its rocky, lifeless expanse. An inversion layer generally keeps clouds several thousand feet below the volcano’s peak; the air is as dry as it is thin. Standing on the summit, you are as far from the continents of earth as you can be while standing on land, which means the atmosphere around Hawaii—undisturbed by the turbulence of the sun’s energy bouncing off or being absorbed by large, varied landmasses—is as stable as just about anywhere on the planet. All of these properties make the peak of Mauna Kea one of the most otherworldly places you can visit. Appropriately enough, they also make it a sublime location for stargazing.

Today, the summit of Mauna Kea is crowned by thirteen distinct observatories, massive white domes scattered across the red rocks like some gleaming outpost on a distant planet. Included in this group are the twin telescopes of the W. M. Keck Observatory, the most powerful optical telescopes on earth. The Keck telescopes would seem to be direct descendants of Hans Lippershey’s creation, only they do not rely on lenses to do their magic. To capture light from distant corners of the universe, you would need lenses the size of a pickup truck; at that size, glass becomes difficult to physically support and introduces inevitable distortions into the image. And so the scientists and engineers behind Keck employed another technique to capture extremely faint traces of light: the mirror.

Keck Observatory

Each telescope has thirty-six hexagonal mirrors that together form a reflective canvas more than thirty feet wide. The light they gather is reflected up to a second mirror and then down to a set of instruments, where the images can be processed and visualized on a computer screen. (There is no vantage point at Keck where one can gaze directly through the telescope the way Galileo and countless astronomers since have done.) But even in the thin, ultra-stable atmosphere above Mauna Kea, small disturbances can blur the images captured by Keck. And so the observatories employ an ingenious system called “adaptive optics” to correct the vision of the telescopes. Lasers are beamed into the night sky above Keck, effectively creating an artificial star in the heavens. That false star becomes a kind of reference point: because the scientists know exactly what the laser should look like were there no atmospheric distortion, they can measure the existing distortion by comparing the “ideal” laser image with what the telescopes actually register. Guided by that map of atmospheric noise, computers instruct the mirrors of the telescope to flex slightly, based on the exact distortions in the skies above Mauna Kea that night. The effect is almost exactly like putting spectacles on a nearsighted person: distant objects suddenly become significantly clearer.
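A drastically simplified sketch of that correction loop, in Python: measure the difference between the known “ideal” guide star and what the sensor actually sees, then flex each mirror segment a fraction of the way toward canceling it. The segment count matches Keck’s thirty-six hexagons, but the flat ideal wavefront, the Gaussian “atmosphere,” and the proportional feedback gain are all illustrative assumptions, nothing like the real control system:

```python
# A toy adaptive-optics loop: compare the measured wavefront to the ideal,
# then command the mirror segments to cancel the residual error.
# All values are illustrative; this is not Keck's actual controller.

import random

N_SEGMENTS = 36  # Keck's primary mirror is built from 36 hexagonal segments
GAIN = 0.5       # assumed proportional feedback gain

ideal = [0.0] * N_SEGMENTS  # a perfectly flat wavefront: no distortion
atmosphere = [random.gauss(0.0, 1.0) for _ in range(N_SEGMENTS)]  # turbulence
correction = [0.0] * N_SEGMENTS  # how far each segment is currently flexed

for _ in range(20):
    # The sensor sees atmospheric distortion plus our current correction.
    measured = [a + c for a, c in zip(atmosphere, correction)]
    residual = [m - i for m, i in zip(measured, ideal)]
    # Flex each segment a fraction of the way toward canceling the error.
    correction = [c - GAIN * r for c, r in zip(correction, residual)]

rms = (sum(r * r for r in residual) / N_SEGMENTS) ** 0.5
print(f"residual wavefront error after 20 iterations: {rms:.6f}")
```

With this gain, each pass through the loop halves the remaining error, which is the essence of the spectacles analogy: the mirror’s shape is continuously updated to be the inverse of whatever the sky is doing that night.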

Of course, with the Keck telescopes, those distant objects are galaxies and supernovas that are, in some cases, billions of light-years away. When we look through the mirrors of Keck, we are looking into the distant past. Once again, glass has extended our vision: not just down to the invisible world of cells and microbes, or the global connectivity of the cameraphone, but all the way back to the early days of the universe. Glass started out as trinkets and empty vessels. A few thousand years later, perched above the clouds at the top of Mauna Kea, it has become a time machine.

THE STORY OF GLASS reminds us how our ingenuity is both confined and empowered by the physical properties of the elements around us. When we think of the entities that made the modern world, we usually talk about the great visionaries of science and politics, or breakthrough inventions, or large collective movements. But there is a material element to our history as well: not the dialectical materialism of Marxist thought, where “material” meant the class struggle and the ultimate primacy of economic explanations, but material history in the sense of history shaped by the basic building blocks of matter, which are then connected to things like social movements or economic systems. Imagine you could rewrite the Big Bang (or play God, depending on your metaphor) and create a universe that was exactly like ours, with only one tiny change: the electrons on the silicon atom don’t behave quite the same way. In this alternate universe, the electrons absorb light like those of most materials, instead of letting the photons pass through them. Such a small adjustment might well have made no difference at all for the entire evolution of Homo sapiens until a few thousand years ago. But then, amazingly, everything changed. Humans began exploiting the quantum behavior of those silicon electrons in countless different ways. On some fundamental level, it is impossible to imagine the last millennium without transparent glass. We can now manipulate carbon (in the form of that defining twentieth-century compound, plastic) into durable transparent materials that can do the job of glass, but that expertise is less than a century old. Tweak those silicon electrons, and you rob the last thousand years of windows, spectacles, lenses, test tubes, lightbulbs. (High-quality mirrors might have been independently invented using other reflective materials, though it would likely have taken a few centuries longer.) A world without glass would not just transform the edifices of civilization, by removing all the stained-glass windows of the great cathedrals and the sleek, reflective surfaces of the modern cityscape. A world without glass would strike at the foundation of modern progress: the extended life spans that come from understanding the cell, the virus, and the bacterium; the genetic knowledge of what makes us human; the astronomer’s knowledge of our place in the universe. No material on Earth mattered more to those conceptual breakthroughs than glass.

In a letter to a friend about the book of natural history that he never got around to writing, René Descartes described how he had wanted to tell the story of glass: “How from these ashes, by the mere intensity of [heat’s] action, it formed glass: for as this transmutation of ashes into glass appeared to me as wonderful as any other in nature, I took a special pleasure in describing it.” Descartes was close enough to the original glass revolution to perceive its magnitude. Today, we are too many steps away from the material’s original influence to appreciate just how important it was, and continues to be, to everyday existence.

This is one of those places where the long-zoom approach illuminates, allowing us to see things that we would have otherwise missed had we focused on the usual suspects of historical storytelling. Invoking the physical elements in discussing historical change is not unheard of, of course. Most of us accept the idea that carbon has played an essential role in human activity since the industrial revolution. But in a way, this is not really news: carbon has been essential to just about everything living organisms have done since the primordial soup. But humans didn’t have much use for silicon dioxide until the glassmakers began to tinker with its curious properties a thousand years ago. Today, if you look around the room you’re currently occupying, there might easily be a hundred objects within reach that depend on silicon dioxide for their existence, and even more that rely on the element silicon itself: the panes of glass in your windows or skylights, the lens in your cameraphone, the screen of your computer, everything with a microchip or a digital clock. If you were casting starring roles for the chemistry of daily life ten thousand years ago, the top billing would be the same as it is today: we’re heavy users of carbon, hydrogen, oxygen. But silicon wouldn’t likely have even received a credit. While silicon is abundant on Earth—it makes up more than a quarter of the crust by mass, and silicate minerals account for roughly 90 percent of it—it plays almost no role in the natural metabolisms of life-forms on the planet. Our bodies are dependent on carbon, and many of our technologies (fossil fuels and plastics) display the same dependence. But the need for silicon is a modern craving.

The question is: Why did it take so long? Why were the extraordinary properties of this substance effectively ignored by nature, and why did those properties suddenly become essential to human society starting roughly a thousand years ago? In trying to address these questions, of course, we can only speculate. But surely one answer has to do with another technology: the furnace. One reason that evolution didn’t find much use for silicon dioxide is that most of the really interesting things about the substance don’t appear until you get over 1,000 degrees Fahrenheit. Liquid water and carbon do wonderfully inventive things at the earth’s atmospheric temperature, but it’s hard to see the promise of silicon dioxide until you can melt it, and the earth’s environment—at least on the surface of the planet—simply doesn’t get that hot. This was the hummingbird effect that the furnace unleashed: by learning how to generate extreme heat in a controlled environment, we unlocked the molecular potential of silicon dioxide, which soon transformed the way we see the world, and ourselves.

In a strange way, glass was trying to extend our vision of the universe from the very beginning, way before we were smart enough to notice. Those glass fragments from the Libyan Desert that made it into King Tut’s tomb had puzzled archeologists, geologists, and astrophysicists alike for decades. The structure of the silicon dioxide suggested that it had formed at temperatures that could only have been created by a direct meteor strike, and yet there was no evidence of an impact crater anywhere in the vicinity. So where had those extraordinary temperatures come from? Lightning can strike a small patch of silica with glassmaking heat, but it can’t strike acres of sand in a single blast. And so scientists began to explore the idea that the Libyan glass arose from a comet colliding with the earth’s atmosphere and exploding over the desert sands. In 2013, a South African geochemist named Jan Kramers analyzed a mysterious pebble from the site and determined that it had originated in the nucleus of a comet, the first such object to be discovered on Earth. Scientists and space agencies have spent billions of dollars searching for particles of comets because they offer such profound insight into the formation of solar systems. The pebble from the Libyan Desert now gives them direct access to the geochemistry of comets. And all the while, glass was pointing the way.