From the start of the twentieth century until the late 1950s, the word bionic—an adjective formed from the Greek word bios, “life”—existed only as a technical term within the study of fossils. To describe the “bionic value” of different fossils was to quantify how well the characteristics of particular organisms endured over huge stretches of time.
Come the late 1950s, however, a very different sense of the word arrived, describing not the characteristics of living organisms but the design of machines based on the study of biological systems, and the potential integration of machine and organic components. Its coiner was the medical doctor and United States Air Force colonel Jack E. Steele, known today as the “father of bionics” for his work from 1953 at the 6570th Aerospace Medical Research Lab at Wright-Patterson Air Force Base in Ohio, who reportedly first used bionics in 1958 to describe the copying of biological organs in the design of artificial prostheses and robots.49
Steele seems to have been unaware of the word’s previous existence, and probably conceived it as a contraction of biological and electronics. The word itself remained an obscure item of technical vocabulary until the fateful intervention of mass media, in the form of the 1974 television series The Six Million Dollar Man. The series starred actor Lee Majors as military hero Steve Austin, injured while testing an experimental aircraft and subsequently reconstructed as “the world’s first bionic man”: a part-man, part-machine super soldier. The series was a hit, and bred a spin-off series—1976’s The Bionic Woman—which helped cement the term in popular consciousness.
Interestingly, The Six Million Dollar Man was based on a novel that had already brought another young word into public consciousness: cyborg, meaning an organism that is part human and part machine. This term, too, was a recent coinage, created as a contraction of the words cybernetic and organism in a 1960 article in the journal Astronautics, discussing the benefits of “altering man’s bodily functions to meet the requirements of extraterrestrial environments.”50 It was, however, Martin Caidin’s 1972 novel Cyborg that boosted the term to the level of popular culture—a novel directly inspired by Jack Steele’s work at Wright-Patterson Air Force Base.
Today, the increasing integration of electronic technologies into every aspect of human life has made bionics and cyborgs into iconic aspects of both science fiction and science fact—complete with ever-expanding linguistic subsets for each.
For example, a brain–computer interface (BCI), often called a mind–machine interface (MMI), is a direct communication pathway between the brain and an external device—and something that in recent years has achieved remarkable early results, including some of the first instances of rudimentary sight being restored to blind individuals via electronic retinal implants: tiny microchips that substitute for the functioning of damaged retinal cells.
As technology progresses, this newly literal integration of human experience with cutting-edge electronics is set to continue—bringing with it an increasing demand for new terms to describe the artificial, the enhanced, and the hybrid zone that lies between them.