24.

The Scunthorpe Problem

Much like automatic spelling correction, the idea of filtering digital content to remove sources of offense automatically is almost as old as computing itself. Long before the world wide web came along, internet forums and bulletin board systems provided a breeding ground for everything from pornographic images to extremely frank exchanges of views. Hence the need for some protection for those of delicate sensibilities—and hence the birth of the marvelously named “Scunthorpe problem,” whereby entirely innocent words and phrases fall victim to machine filth-filters thanks to unfortunate sequences of letters within them.

In Scunthorpe’s case, it’s the second to fifth letters that create the problem—a phenomenon first spotted in 1996, when America Online’s internet service temporarily prevented anyone living in Scunthorpe (an English town in the north of Lincolnshire) from creating user accounts.

Scunthorpe itself innocently draws its name from the Old Norse name Escumetorp, meaning “the homestead belonging to Skuma,” coined during Danish rule of northeastern England between the ninth and eleventh centuries. This venerable history did not, however, stop Google from echoing AOL’s anti-Scunthorpe stance, with its SafeSearch filter restricting search results for businesses with Scunthorpe in their names as recently as 2004.30

It’s not just Scunthorpe that has suffered from eponymous problems. Pity the residents of, for example, Penistone in South Yorkshire; or indeed any innocent member of the public with a name like Cockburn, whose first four letters some email providers may still deem too obscene to allow in a new account.
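The failure is easy to reproduce. What follows is a minimal sketch, in Python, of the kind of naive substring blocklist assumed to be at work in these cases; the blocklist, function name, and test names are purely illustrative, not the code of AOL, Google, or any real provider.

```python
# A naive "filth filter" of the kind that produces the Scunthorpe problem:
# it rejects any name containing a blocklisted string, paying no attention
# to word boundaries. Entirely illustrative; not any real provider's code.

BLOCKLIST = {
    "Scunthorpe"[1:5].lower(),  # the second to fifth letters of the town's name
    "Penistone"[:5].lower(),    # the first five letters
    "Cockburn"[:4].lower(),     # the first four letters
}

def is_blocked(name: str) -> bool:
    """Return True if any blocklisted string appears anywhere in the name."""
    lowered = name.lower()
    return any(term in lowered for term in BLOCKLIST)

for name in ("Scunthorpe", "Penistone", "Cockburn", "Lincoln"):
    print(f"{name}: {'blocked' if is_blocked(name) else 'allowed'}")
# Scunthorpe: blocked
# Penistone: blocked
# Cockburn: blocked
# Lincoln: allowed
```

Checking whole words rather than raw substrings would spare these particular names, though at the cost of missing deliberate obfuscations.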

More elaborate than mere blocking, however, is the automated substitution that can sometimes take place online when words or phrases are identified as “offensive.”

Consider the fate of American sprinter Tyson Gay, whose name was automatically converted to “Tyson homosexual” throughout an article appearing on the website of the American Family Association.31

As the last example suggests, perhaps the most interesting thing about automated prudishness is what it reveals about the intentions of the people designing a system of censorship in the first place, and how it can expose a filtering process that might otherwise have remained invisible.

This last phenomenon is sometimes referred to as “the clbuttic effect,” named in honor of the mangling of the word classic by overzealous obscenity filters. The mistake occurs when the letters ass are automatically transformed into the (presumably less offensive) word butt wherever they appear, leaving you with a text full of clbuttics—not to mention the potential for mbuttive (massive) quantities, someone pbutting (passing) a football, and so on.
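The substitution is just as easy to sketch. The one-function example below, with an invented name and sample sentence, reproduces the effect by replacing the offending letters wherever they occur, even in the middle of innocent words.

```python
# A blanket substitution filter of the kind behind the "clbuttic effect".
# The function name and sample text are invented for illustration.

def prudish_filter(text: str) -> str:
    """Replace the offending letters everywhere, even inside other words."""
    return text.replace("ass", "butt")

print(prudish_filter("a classic pass of massive proportions"))
# -> a clbuttic pbutt of mbuttive proportions
```

The Tyson Gay example above is the same mechanism at the level of whole words, with one term swapped for its supposed synonym.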

Beyond clbuttics and Scunthorpe lies something a little more serious, however: the almost-invisible editing of what is sayable, readable, and discoverable online. To use another fine term, coined by Eli Pariser in his 2011 book of the same name, we spend more and more of our digital lives in “filter bubbles”—personalized bubbles of content selected by an algorithmic definition of relevance. The cleverer and more seamless they get, the more we may find ourselves missing the good old days of clunking textual error.