Bit rot sounds, at first hearing, like a paradoxical proposition: how can digital information decay, let alone rot (a fine Old English word, originally rotian, “to putrefy”)?
The idea behind bit rot is that the individual units of electrical charge constituting digital bits can disperse over time. More generally, though, it has become a term used to describe the degeneration of physical media on which data is stored: from CDs and DVDs to older formats, such as magnetic tape and disks.
Physical damage isn’t the only challenge facing those who wish to preserve digital information, or even the most severe. Far more problematic is the basic fact that any digital information is useless if you don’t have access to the software and hardware designed to read it, and to convert that raw data back into words, images, code, or whatever else it embodied in the first place.
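To see why, consider that the very same bytes can be read as text, as a number, or as something else entirely, depending on what the reading program assumes. A minimal Python sketch (the four bytes here are arbitrary, chosen purely for illustration):

```python
import struct

raw = b"Wow!"  # four arbitrary bytes, as they might sit on a disk

# The bytes themselves carry no meaning; each of the interpretations
# below lives entirely in the program doing the reading.
print(raw.decode("ascii"))          # read as ASCII text: Wow!
print(struct.unpack("<I", raw)[0])  # read as a little-endian unsigned integer
print(struct.unpack("<f", raw)[0])  # read as a 32-bit floating-point number
```

Lose the program that knows which interpretation was intended, and the data, however perfectly preserved, says nothing.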
Already, some of the world’s earliest computer data, generated by institutions like NASA in the 1950s and 1960s, has become almost impossible to access thanks to the destruction of the physical machinery needed to read it—and that’s before the physical decay of the storage medium itself comes into play.
Besides bit rot, a slightly more enigmatic phenomenon known as “software rot” can plague older computers and their programs. Software rot, sometimes also known as software entropy, is not a physical process; rather, it describes the tendency of software to work less and less effectively as it is transferred between computer systems, and as the machines on which it runs grow more distant in time from the era in which the software itself was created.
There are, for example, many programs from the early days of computing that are simply impossible to run on any modern computer, their instructions no longer fully comprehensible to, or compatible with, today’s hardware and programming languages.
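Even a comparatively young language shows the effect. The following line was a complete, valid program in Python 2 (released in 2000), but any modern Python 3 interpreter rejects it outright:

```python
# Valid Python 2, rejected by Python 3 with a SyntaxError:
#
#     print "Hello, world"
#
# The old instruction is no longer comprehensible to the language's
# current implementation; it must now be written as:
print("Hello, world")
```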
Software that has ceased to be usable on modern machines is sometimes referred to as “legacy” software. It may have to be “ported” (rewritten for a new platform) if it is to operate on a modern system, or else be run through a specially designed “emulator” program, which effectively re-creates a virtual version of an older system within a modern computer.
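What an emulator actually does can be sketched in a few lines. The toy Python example below re-creates the fetch-decode-execute cycle of an imaginary one-register machine; the instruction set is invented purely for illustration, and real emulators perform the same job at vastly greater fidelity.

```python
def emulate(program):
    """Run a 'legacy' program, given as (opcode, operand) pairs,
    on a virtual machine with a single accumulator register."""
    accumulator = 0  # the imaginary machine's only register
    pc = 0           # program counter: index of the next instruction
    while pc < len(program):
        opcode, operand = program[pc]  # fetch and decode
        if opcode == "LOAD":           # execute
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand
        elif opcode == "PRINT":
            print(accumulator)
        elif opcode == "HALT":
            break
        pc += 1

# A "legacy program" for the imaginary machine: compute and print 2 + 3.
emulate([("LOAD", 2), ("ADD", 3), ("PRINT", None), ("HALT", None)])
```

Run it, and the old instructions execute faithfully inside a machine that never physically existed, which is precisely the emulator’s trick.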
Arcane though they may sound, issues of emulation and data preservation are among the central problems facing anyone wishing to preserve the young history of computing in a usable form. Ultimately, digital data and programs are all a kind of language; and unless systems remain that can read and speak all the many languages of their past, much of modern history risks becoming literally incomprehensible.