DUXBURY: RACHEL CARSON
The mosquitoes in Duxbury, Massachusetts, in 1957 seemed like a biblical plague. Science, however, came to the rescue. With military precision, the ponds and marshlands around the town were doused with DDT. The mosquitoes disappeared, but so too did the butterflies, birds, and mice.
A frightened Olga Huckins, who ran a bird sanctuary, wrote to her friend Rachel Carson, beseeching her to use her influence to get someone to investigate. Carson, a zoologist by training and a writer by vocation, had become a household name because of her lyrical sea trilogy (1941–1955), which included the bestseller The Sea Around Us. Her skill lay in her ability to communicate complicated biology in beautiful prose. The problem of pesticides had already begun to worry her; they were, she felt, a gross example of mankind’s egotistical attempt to impose its will upon nature. “We still talk in terms of conquest,” she argued.
We still haven’t become mature enough to think of ourselves as only a tiny part of a vast and incredible universe. Man’s attitude toward nature is today critically important simply because we have now acquired a fateful power to alter and destroy nature.
But man is a part of nature, and his war against nature is inevitably a war against himself. . . . Now I truly believe that we in this generation must come to terms with nature, and I think we’re challenged as mankind has never been challenged before to prove our maturity and our mastery, not of nature, but of ourselves.
Carson decided to embark upon her most ambitious project. The shy, soft-spoken lover of nature went to battle against the might of the American chemical industry.1
Carson took four years to write Silent Spring. She realized that, given the determination and resources of her adversaries in the pesticide business, the science had to be impeccable. Fortunately, she was able to draw help from biologists, pathologists, chemists, and entomologists who shared her concern. She also realized that if the book were to exert maximum impact, it had to read like a novel. She therefore condensed dire scientific forebodings into simple, dramatic homily:
There was once a town in the heart of America where all life seemed to live in harmony with its surroundings. . . . Then a strange blight crept over the area and everything began to change. . . . There was a strange stillness. . . . The few birds seen anywhere were moribund; they trembled violently and could not fly. It was a spring without voices. On the mornings that had once throbbed with the dawn chorus of scores of bird voices there was now no sound; only silence lay over the fields and woods and marsh.
Serialization began in the New Yorker in June 1962, four months before the book’s official publication. An industrial toxicologist immediately dismissed Carson’s work as “crass commercialism” packaged as “idealistic flag-waving,” while the president of the Montrose Chemical Corporation, one of the nation’s largest producers of DDT, complained that Carson had written “not as a scientist but rather as a fanatic defender of the cult of the balance of nature.” He argued that if nature were allowed to determine its own balance, mankind would be ravaged by starvation and disease.2
PR executives worked overtime for the embattled industry. At first the response was temperate; spokesmen welcomed responsible inquiry. “What we have done, we have not done carelessly or without consideration,” a Dow Chemical representative pleaded. “The industry is not made up of money grubbers.” As Carson’s popularity grew, however, the avuncular approach was abandoned in favor of character assassination. Carson was accused of being a Communist and, as a woman, prone to hysteria. “The major claims of Miss Rachel Carson’s book,” industry spokesman Robert White-Stevens argued, “are gross distortions of the actual facts, completely unsupported by scientific, experimental evidence and general practical experience in the field.” Her credentials as a scientist came under attack by critics who did not know the difference between a beaker and a pipette.3
The soft-spoken Carson absorbed the attacks with immense dignity—all the more impressive as she was dying of breast cancer. In response to claims that she advocated a “return to the Dark Ages,” she quietly explained: “I do not favor turning nature over to insects. I favor the sparing, selective, and intelligent use of chemicals. It is the indiscriminate, blanket spraying that I oppose.” The campaign to discredit her backfired badly, providing huge publicity for her book. Politicians found that they could not completely ignore the ecological bandwagon. On June 4, 1963, Carson addressed a Senate subcommittee investigating pesticides. She used the occasion to widen her warning:
Contamination of various kinds has now invaded all of the physical environment that supports us—water, soil, air, and vegetation. It has even penetrated that internal environment that lies within the bodies of animals and of men. It comes from many sources: radioactive wastes from reactors, laboratories, and hospitals; fallout from nuclear explosions; domestic wastes from cities and towns; chemical wastes from factories; detergents from homes and industries.
John Kennedy was not particularly interested in environmental issues, but he could not ignore the furor Carson had inspired. His Science Advisory Committee, headed by Jerome Wiesner, reported on May 15, 1963, that the uncontrolled use of pesticides was “potentially a much greater hazard” than that of radioactive fallout. Since Kennedy had already expressed concern about that threat, he had to act on this one. A process was set in motion which eventually led to the banning of DDT. More important, Kennedy set a precedent for federal intervention on pollution issues. There followed a steady stream of environmental legislation: the Clean Air Act (1963), the Wilderness Act (1964), the Water Quality Act (1965), and the Endangered Species Preservation Act (1966). This last was inspired in part by the fact that the bald eagle, America’s cherished symbol, perched on the edge of extinction.4
Membership in environmental groups increased tenfold from 1960 to 1972, exceeding one million by the new decade. Carson helped this movement coalesce, but her cause was helped enormously by a steady stream of disasters like the Santa Barbara oil spill and the Cuyahoga River fire. Each pollution problem seemed to reveal a similar source, namely a powerful, wealthy industrial polluter more concerned with amassing profit than with public health, and able to buy support from corrupt politicians.
The environmental movement was, in the main, a middle-class pressure group in which women, as mothers, had huge impact. Mothers provided the perfect voice for future generations—the only special interests they seemed to represent were their own children. Since pollutants had the greatest impact on the young, the old, and the infirm, the concerns voiced were impossible to ignore. This was especially the case after revelations that radioactive Strontium-90 from nuclear tests had been found in breast milk. The environmental cause was, however, weakened by the fact that it seemed a luxury to those on the margins of society. The black activist Vernon Jordan complained that “some people have been too cavalier in proposing policies to preserve the physical environment for themselves while other poor people pay the costs.” Labor unions distributed a bumper sticker that read: “IF YOU’RE HUNGRY AND OUT OF WORK, EAT AN ENVIRONMENTALIST.”5
The movement harmonized perfectly with the hippie counterculture, and received a fillip from it. Flower power was nine-tenths fantasy, but the idea nevertheless focused attention on the beauty and fragility of the natural world. Central to the simple life idealized by hippies was the concept of harmony with the planet and the rejection of artificial, plastic products. While the hippie ideal quickly disintegrated, one lasting by-product was a nascent ecological consciousness. “A lot of us had been [thinking] . . . about the planet,” the actor and environmentalist Peter Coyote recalled, “about what the eternal reality was under this thin sheet of asphalt. And we were sensitive to the growing ecological crisis and the fact that a culture that was pissing in its life-support system couldn’t continue. And so a lot of people moved out of the city to try to get in touch with the planet and those processes that seemed more long-term and durable and renewable.” Macrobiotic restaurants sold “whole food” free of pesticides, synthetic flavors, and preservatives. The word “organic” came to mean something which did not damage the environment and would not endanger health. By the end of the decade, the term “earth” was applied to anything deemed ecologically pure and environmentally innocent. Boutiques sold earth clothes, clubs played earth music, Birkenstock made “earth shoes,” and people ate “earth food.”6
Another tremendous boost was provided, rather ironically, by NASA. The Apollo 8 mission on Christmas Eve 1968 brought back spectacular photographs of Earth, looking like a dappled blue-and-white pearl in a sea of black nothingness. The photos underlined the natural beauty of the planet, but also its vulnerability. Never before had Earth seemed so finite and fragile. Environmentalists loved the photos, even if they condemned the wasteful industry that had produced them. The photos became icons, subsequently used on countless posters encouraging people to recycle waste, clean up rivers, and use public transport. Ecologists spoke of “Spaceship Earth.”
Environmentalism was the perfect populist cause; it provided a chance for every would-be David to go into battle against a mean, corporate Goliath. The Earth was something everyone had in common, and its defense was, therefore, a seemingly apolitical act. Even though not everyone could muster the energy to be environmentally responsible, taking a stand on this issue did not annoy one’s friends and neighbors the way other political problems inevitably did. It was also a cause which provided something meaningful for everyone to do, be it recycling an aluminum can, cycling to work, or picking up litter. Occasional victories suggested progress was possible.
Progress was, however, illusory. Big industry quickly discovered how easy it was to make small gestures. Soft-drink manufacturers, for instance, made much of their decision to switch to pop tops that did not come loose from the can when the drink was opened. Previously pop tops had been thrown on the ground, where they stayed as indestructible reminders of environmental contempt. Integrated tops cost a bit more, but reaped dividends in positive publicity. They also distracted attention from the fact that the mining of aluminum was raping the environment, and that the massive amounts of sugar in soft drinks distorted agricultural production and contributed to health problems like obesity and diabetes. “Progress,” in other words, relied on smoke and mirrors. While the federal government responded to the public demand for environmental controls, those controls were often confidence tricks designed to quiet the outcry while providing renewed opportunity for profit.
Rachel Carson died in 1964. While her books brought her enormous fame, she never acted like a celebrity. Public adulation was never the point, as she explained to a friend: “The beauty of the living world . . . has always been uppermost in my mind—that, and anger at the senseless, brutish things that were being done. I have felt bound by a solemn obligation to do what I could—if I didn’t at least try, I could never be happy again in nature. But now I can believe that I have at least helped a little.” “We have met the enemy and he is us,” said the possum Pogo in a comic strip published on Earth Day, 1971. That was essentially the message Carson tried to convey, but it was not the message most people heard. The destruction of the environment has long been seen as an evil wrought by rapacious mega-corporations. Few stop to consider that those corporations make products demanded by us. Toward the end of the decade, the Whole Earth Catalog proclaimed: “We are as gods and might as well get good at it.” However wise that proclamation might have been, it was never likely to be heeded.7
HARLEM: MALCOLM X
The Harlem ghetto made Malcolm X. He saw its terror, its degradation, but also its vibrancy and its strength. “The ghetto people knew that I never left the ghetto in spirit,” he claimed, “and I never left it physically more than I had to. I had a ghetto instinct.” The ghetto was an instrument of oppression, a place where blacks were kept passive, demoralized, and drugged. For Malcolm, it was all that—and a platform for revolution.8
Malcolm Little was born on May 19, 1925, in Omaha, Nebraska. His parents were followers of Marcus Garvey, the Pan-Africanist who dreamed of a separate nation for American blacks. In 1929, the family home in Lansing, Michigan, was firebombed by white supremacists. Two years later, Malcolm’s father was run over by a streetcar; police judged it an accident, but the black community decided otherwise. When his mother was institutionalized after a nervous breakdown in 1936, an otherwise stable family was cast to the wind. Though an intelligent boy and a promising student, Malcolm left school and drifted into a life of crime. By the age of twenty-one, he was a professional criminal, with his own burglary ring. His future seemed depressingly predictable, but a spell in prison changed everything. He converted to Islam, turned his back on crime, and emerged in 1952 as Malcolm X.9
He joined the Nation of Islam, first established in the 1930s. It offered a spiritual escape from ghetto life; according to its teachings the Muslim faith gave individuals the strength to resist alcohol, tobacco, drugs, sex, and crime. Converts were taught to live within their means, thus avoiding the depredations of the loan shark and the numbers runner. Men were required to dress conservatively in suits and white shirts; women, to keep their hemlines below the knee and to cover their hair. While the ascetic practices of the Nation of Islam restricted its appeal, it nevertheless had nearly seventy congregations in thirty cities by 1960. Converts were drawn almost exclusively from the ghetto. Anyone who doubted the power of the faith had only to look at Malcolm, a walking example of redemption.
Elijah Muhammad, the church’s leader, argued that Christianity was a white faith used to delude blacks, a gospel of subservience which taught acceptance of degradation by promising paradise in the afterlife. Whites, those “blue-eyed devils,” could never be trusted. In stark contrast to Martin Luther King, who preached racial harmony, Muhammad and Malcolm told their followers that blacks could never live in peace with whites, who were unworthy of their love. “There is nothing in our book, the Koran, that teaches us to suffer peacefully,” Malcolm argued in November 1963. “Our religion teaches us to be intelligent. Be peaceful, be courteous, obey the law, respect everyone; but if someone puts his hand on you, send him to the cemetery. That’s a good religion.”10
From his pulpit in Harlem’s Mosque Seven, Malcolm urged blacks to separate from white society and establish “a land of our own, where we can . . . lift up our moral standards.” Integration, he argued, was simply a sop to keep the black man quiescent. “It’s just like when you’ve got some coffee that’s too black, which means it’s too strong. What do you do? You integrate it with cream, you make it weak. But if you pour too much cream in it, you won’t even know you ever had coffee. It used to be hot, it becomes cool. It used to be strong, it becomes weak. It used to wake you up, now it puts you to sleep.” Instead of working for integration, “the American black man should be focusing his every effort toward building his own businesses, and decent homes for himself.” Malcolm accepted that the Black Nation he envisaged could not be achieved peacefully, but would instead have to be wrested violently from the white man’s grasp. Violence was neither avoidable nor regrettable. The spilling of blood would lead to the overthrow of white control and would purge black souls. “When a person places the proper value on freedom,” he argued, “there is nothing under the sun that he will not do to acquire that freedom. Whenever you hear a man saying he wants freedom, but in the next breath he is going to tell you what he won’t do to get it, or what he doesn’t believe in doing, . . . he doesn’t believe in freedom. A man who believes in freedom will do anything under the sun to acquire . . . his freedom.”11
“The Negro revolution is controlled by foxy white liberals, by the Government itself,” Malcolm argued, with obvious reference to King. “But the Black Revolution is controlled only by God.” King’s aim was to transform America, so that blacks could live safe and fulfilling lives. Malcolm’s aim was to show blacks how to survive America; he wanted to equip them with the moral strength and physical fortitude necessary to live in an irretrievably racist nation. He derided nonviolence and personally questioned the “manhood” of civil rights leaders who stood aside while their followers were beaten and killed. “It is criminal,” he argued, “to teach a man not to defend himself when he is the constant victim of brutal attacks.”12
“Malcolm . . . was simply electric,” the writer Alex Haley recalled. “Almost everything he did was dramatic—and it wasn’t that he was trying to be, it was just the nature of him.” The activist Marian Wright Edelman first encountered Malcolm delivering a speech at Yale Law School: “He was absolutely mesmerizing. He was brilliant. He was funny. He expressed the rage that all of us continued to feel about the slow pace of change in the country, but he did it in the cleverest and funniest way you could imagine. I just remember laughing uncontrollably at some of the ways Malcolm would answer questions and put down whites who were trying to trick him.” “The reason why initially we cut off the televisions [when he was on] is that we were scared,” claims Sonia Sanchez. “What he did was he said, ‘I will now’—in a very calm fashion—‘wipe out fear for you.’ He expelled fear for African-Americans.” She contends that “he made us feel whole. He made us feel loved. And he made us feel that we were worth something, finally, on this planet Earth.” He had a particularly profound effect upon women, who previously had felt inferior because of their race and their sex. “He made women feel like they were queens of the universe. It was a queen not that set on a throne and did nothing. It was a queen that worked. A queen that talked. A queen that led. . . . So, yeah, you said, ‘Hey, I am pretty. Look at here, look at these big lips. Aren’t they full? When you been kissed by these lips, you know you been kissed.’”13
Malcolm was far too good a preacher and far too charismatic a man to be tolerated for long by Elijah. Finding his popularity threatened, Elijah imposed a gag order. Malcolm respected the order, but grew increasingly resentful: “When a high-powered rifle slug tore through the back of NAACP Field Secretary Medgar Evers in Mississippi, I wanted to say the blunt truths that needed to be said. When a bomb was exploded in a Negro Christian church in Birmingham, Alabama, snuffing out the lives of those four beautiful black girls, I made comments—but not what should have been said about the climate of hate the American white man was generating and nourishing.” Silencing Malcolm was like trying to keep a dog from barking. After the assassination of Kennedy, he openly questioned why blacks should mourn the president’s death. Kennedy, he reminded listeners, had pursued a course of violence in Asia, Africa, and Latin America; he had supported assassination in the Congo and Vietnam. His murder was a case of “chickens coming home to roost. . . . Being an old farm boy, chickens coming home to roost never did make me sad. They’ve always made me glad.” While the statement was not far removed from the separatist policy of the Nation of Islam, Elijah, who insisted on playing the white man’s politics even though he rejected the white man’s world, found it unacceptable. Malcolm, recognizing an inevitability, broke with the Nation of Islam.14
A break was probably inevitable even if Malcolm had been able to stay silent. He was by this stage developing a more secular philosophy than that espoused by Elijah, one which accepted the possibility of racial coexistence. He regretted having been such an effective promoter of Elijah. “I shall never rest until I have undone the harm I did to so many well-meaning, innocent Negroes who through my own evangelistic zeal now believe in him even more fanatically and more blindly than I did.” After a pilgrimage to Mecca and a tour of Africa, his faith became noticeably more cosmopolitan. “I remember him coming back saying he . . . had seen not just black, not just white or yellow or red, but had seen all people and in the eyes of Allah they were one,” the singer Harry Belafonte recalled. “It was a major shift . . . and Malcolm knew . . . that a new alliance would have to be forged, that more than ever there was a need for the movements to come together.” “Where true brotherhood existed among all colors,” Malcolm observed, “where no one felt segregated, where there was no ‘superiority’ complex, no ‘inferiority’ complex—then voluntarily, naturally, people . . . felt drawn together by that which they had in common.” He wanted this same amalgamation to be achieved in America. This meant modifying his attitude to white people. The actor Ossie Davis recalled Malcolm saying to a white audience: “You know, I do not think all white folks are evil now, but some of you are, and I’m going to keep on at you until you, whoever you are, grant us the respect that we’re due as fellow human beings.” He also toned down his criticism of civil rights campaigners. He still believed their faith in nonviolence was deluded, but respected their courage.15
On March 27, 1964, Malcolm traveled to Washington to hear the Senate debate the Civil Rights Bill. Somewhat coincidentally, King was also present, and the two met for the first and only time in their lives. “I’m here to remind the white man of the alternative to Dr. King,” Malcolm announced. A short time later, he founded the Organization of Afro-American Unity, a body reflecting his new cosmopolitanism. He had in mind a worldwide pressure group campaigning for improvements in the lives of blacks everywhere. Included in his list of demands was a call for the US government to establish a “Pentagon-sized department” to address every aspect of black people’s plight. At one of his last press conferences, in February 1965, he argued: “It is incorrect to classify the revolt of the Negro as simply a racial conflict of black against white, or as a purely American problem. Rather, we are today seeing a global rebellion of the oppressed against the oppressor, the exploited against the exploiter.”16
“It is a time for martyrs now,” Malcolm remarked on February 19, 1965. “If I am to be one, it will be for the cause of brotherhood. That’s the only thing that can save this country.” Two days later, he was gunned down at the Audubon Ballroom in Harlem, in front of his OAAU supporters. Elijah called it an “act of divine chastisement,” though he denied involvement. A week later, Elijah told a crowd gathered for the Savior’s Day rally: “We didn’t want to kill Malcolm and didn’t try to kill Malcolm. They know I loved him. His foolish teaching brought him to his own end.”17
Uncertainty about where Malcolm was headed has led to wide variation in the way he has been remembered. Admirers choose the interpretation that pleases them the most. For some, Malcolm was a martyr to black nationalism; to others, he was a champion of world revolution. Some revered him as a separatist, others as a born-again integrationist. While it is tempting to speculate about what he might have become, there is no doubting that he remained just as influential dead as he had been alive.
Malcolm persuaded very few blacks to convert to Islam. But that is not an accurate measure of his impact. Through his example, blacks learned to take pride in their race. Long before the race riots of 1964–1968, he focused attention on how the ghetto destroyed lives. “Malcolm kept snatching our lies away,” Ossie Davis said at the funeral. “He kept shouting the painful truths we whites and blacks did not want to hear. . . . And he wouldn’t stop for love or money.” Davis thought that “Malcolm spoke directly to the emasculation of the black male in particular. He wanted to teach us how, in spite of that, to be men again.”18
“I felt,” Davis reflected, “a determination to say something about who the man was, because at that time the headlines were so full of so many awful things. He was being described as a mad dog whose violence had killed him.” Davis, a gentle man, recalled how white liberals later questioned why he had agreed to speak at Malcolm’s funeral. “No Negro has yet asked me that question,” he invariably replied.19
HAVANA: CHE
In November 1967, a photo was released of a Bolivian soldier standing over a half-naked corpse. The Bolivian government was proud of that photo, since the corpse in question was that of Ernesto “Che” Guevara, killed while trying to foment revolution among local peasants.
Two months later came a much more pleasing picture of Che, unveiled as a sixty-foot mural in the Plaza de la Revolución in Havana, in celebration of the ninth anniversary of the Cuban Revolution. The image, captured by Alberto Díaz Korda, showed an achingly beautiful, almost saintly man wearing a black beret and looking upward, as if seeking the truth. Korda’s photo would eventually become one of the most widely reproduced images in the history of photography, a modern icon replicated on posters, T-shirts, postcards, matchboxes, table napkins, playing cards, condoms, and tattoos. In the 1990s, Nike replaced the star on his beret with a swoosh in order to advertise athletic shoes made in Third World sweatshops. Billions of dollars have been made peddling the image of a socialist.
Che’s appeal can be explained by the fact that, like John Kennedy, Patrice Lumumba, and Jim Morrison, he will always remain young, handsome, and dead. He was killed in the fullness of youth, before age could shatter the ideals and tarnish the image. Though a Marxist, he was a man unencumbered by convoluted theory. The simple, practical nature of his revolutionary style explains his enduring appeal to the young. “Che’s writings instilled in me the belief that a revolutionary is motivated by the strongest feelings of love for humanity,” one former radical recalled. He seemed genuinely concerned for the suffering of the downtrodden and led by example, sharing the sacrifices he expected of his followers. A man of action more than ideology, he taught peasants to rise up and seize what was theirs.20
Thanks in part to the work of the romantic socialist Régis Debray, an idealized account of Che’s revolution became the gospel for a generation of radicals in the industrialized West. Debray’s book Revolution in the Revolution, a title evoking the dynamism craved by the young, was first published in English in 1967. It became an essential accessory for those wishing to establish revolutionary credentials. Love for Cuba became as trendy as a tie-dyed shirt. Castro and Che seemed proof that the revolution could work, and that socialism could deliver justice and equality to the people. While socialists were constantly splintering into tiny factions, they generally managed to agree on the merits of Che, a powerful antidote to disillusionment with Soviet Russia.
Adoration has obscured the questionable aspects of Che’s career. His particular contribution to revolutionary theory was his belief that the preconditions necessary for revolution could be created by a small, enlightened cadre which he called a foco. In other words, the peasantry could be transformed into a revolutionary mass through inspired leadership. Since Che maintained that leaders would have to come from the educated class, this explains his appeal to middle-class students in the developed world. He implied, to the delight of radicals everywhere, that a few committed activists could change the world. Fired by this thought, the Columbia University firebrand Mark Rudd confessed to a “macho desire to be a great revolutionary hero (and martyr) like Che Guevara.”21
Che is best judged by his own standards. His experience in the Congo in 1965 demonstrates that enlightened leadership cannot by itself turn a reluctant peasantry into a revolutionary force. “I shall go to the Congo,” he announced, “because it is the hottest spot in the world now. . . . I think we can hurt the imperialists at the core of their interests.” The plan was launched against the advice of the Egyptian president, Gamal Abdul Nasser, who warned that Africa did not need “another Tarzan, a white man among black men, leading them and protecting them.” The idea that Africa was an African problem did not accord with Guevara’s internationalism. “I tried to make them understand that it was not just a struggle limited by frontiers, but a war against a common enemy . . . but no one took the point.” After nine months of trying to light a fire, Che was forced to conclude that the Congolese made poor revolutionaries. “We cannot pretend that the situation looks good,” he wrote. “Organizational work is almost nonexistent, since the middle-rank cadres do not work, do not know how to work, and inspire no confidence in anyone. . . . Indiscipline and the lack of any spirit of sacrifice are the principal characteristics of all these fighters. To win a war with such troops is out of the question.”22
Similar lessons were driven home in Bolivia, where Che went in early 1967. After nine months, he was forced to admit that “we have not succeeded in developing peasant support. . . . On the contrary, they are becoming informers for the army.” His foco dwindled from an original forty-five men to just sixteen at the time of his death. Yet this failure occurred in a country specifically chosen because it seemed to offer the best conditions for success. “I feel like a human scrap,” he confessed toward the end, “and on some occasions I have lost self-control.”23
Che’s failings were conveniently ignored by those who desperately craved a hero. In February 1968, SDS organized a field trip to Havana, an act of adoration disguised as education. Leftists had been making this pilgrimage for some time; it was an essential item on any radical’s curriculum vitae. Cutting cane was one small way they could compensate for the ill-treatment of Cuba by the American government. “I saw in Cuba what I wanted to see,” Rudd later admitted. “Factories, farms, and institutions that were owned by the state, socialized. I wanted to see a different way to organize society. But I didn’t see the obvious: that you can’t have a one-party state, that you have to have elections.”24
Radicals in the developed world were skilled practitioners of moral relativism. Horrible crimes were perfectly excusable if committed on behalf of a revolution in an oppressed Third World country. Those who reacted with outrage at the execution of the brutal rapist Caryl Chessman in California in 1960 turned a blind eye when Castro executed hundreds of his fellow Cubans for the crime of political infidelity. Susan Sontag, who went to Cuba in 1960, found it “astonishingly free of repression,” even though she admitted that it did not have a free press. The fact that 15,000 “counterrevolutionaries” were being held in Cuban prisons by the mid-1960s apparently did not qualify as repression. There is no doubt that American propaganda exaggerated the Cuban threat, but there is also no doubt that student radicals compensated for this propaganda by manufacturing a fairy tale starring Castro and Che.25
It is easy to admire a peasant revolution from the safe refuge of an ivory tower. Support comes cheap—merely the cost of a full-sized poster of Che and some Joan Baez albums. There’s no need to think about what Che actually accomplished, or the ruthless methods he used. Symbols are a substitute for analysis. Long after Debray’s book was forgotten, the power of Che’s portrait remained potent. The hero is inseparable from the image: a cult of Che could not have materialized without that photo. He symbolizes an era when images smothered words. As the Russian poet Yevgeny Yevtushenko complained:
Comandante, your precious name
they wish to sell so cheaply.
With your name industry wants to buy
new customers.
Comandante: in Paris I saw your
portrait on little pants called
“hot.”
Your pictures, Che,
are printed on shirts
You plunged into the fire.
They want to turn you into
smoke.
But you fell,
riddled by bullets,
by poisonous smiles
not to become later
merchandise for the consumer
society.
The image provided rebellion without pain, a fantasy inspired entirely by a face. It was not necessary to examine the grim realities of Castro’s revolution, or the real character of Che himself, because that beautiful face said everything. Eventually, the image completely overwhelmed what it was once supposed to represent. Students today still put Che on their walls, but are more likely to eat Cherry Guevara ice cream. Some of them even know that Che once had something to do with Cuba.26
MIAMI: THE GREATEST
They called him the Louisville Lip because his voice was directly wired to his ego. Detractors called him the Mouth, for much the same reason. Cassius Clay’s talent for bluster was brutally evident on the eve of his first title fight, when he mocked his opponent, Sonny Liston: “The man can’t talk. The man can’t fight. The man needs talking lessons. The man needs boxing lessons. And since he’s gonna fight me, he needs falling lessons. . . . I’m gonna give him to the local zoo after I whup him. . . . He’s too ugly to be the world champ. The world champ should be pretty, like me.” Vulgar boasts were tolerable because Clay usually delivered what he promised. On February 25, 1964, he combined exquisite grace with vicious brutality, overwhelming the lumbering Liston. “I want everyone to bear witness,” Clay shouted. “I am the greatest! . . . I’m the king of the world!” A new hero had arrived, a man everyone talked about but few even remotely understood. “What a strange and uncommon man,” Gilbert Rogin remarked in Sports Illustrated. “Who can fathom him? We can only watch in wonder as he performs and ponder whether, despite his truly affecting ways, he doesn’t scorn us and the world he is champion of.”27
Clay refused to conform to the image admirers demanded. This was driven home immediately after the Liston fight. Instead of attending the customary parties, complete with champagne and beautiful women, he spent a quiet evening with friends, among them Malcolm X and the singer Sam Cooke. The next morning came an even more unpalatable revelation. Clay announced his conversion to Islam: “I believe in Allah and I believe in peace. . . . I’m not a Christian any more. I know where I’m going, and I know the truth, and I don’t have to be what you want me to be.”28
This declaration of independence was, by itself, shocking. Heroism implies ownership. Sports stars belong to spectators; they sacrifice their private self for public adulation. All this explains why the nation felt betrayed when Clay said, “I don’t have to be what you want me to be.” The betrayal was compounded when he underlined his independence by adopting a “foreign” religion, essentially saying that America felt foreign to him. This interpretation was confirmed when Clay emphasized that his conversion was not just religious:
I ain’t no Christian. I can’t be when I see all the colored people fighting for forced integration get blowed up. They get hit by stones and chewed by dogs and they blow up a Negro church and don’t find the killer. . . . I’m the heavyweight champion, but right now there are some neighborhoods I can’t move into. . . . I’m no troublemaker. . . . I’m a good boy. I never have done anything wrong. I have never been to jail. I have never been in court. I don’t join any integration marches. . . . A rooster crows only when it sees the light. Put him in the dark and he’ll never crow. I have seen the light and I’m crowing.
At the time, innocence still reigned in American sports. Fans could not stomach a sporting hero who used his prestige to criticize America. Clay’s decision to bring personal grievances into the boxing arena seemed blasphemous. No wonder, then, that he was cast from the temple. Columbia Records, producers of his album I Am the Greatest, pulled it from shops. Advertising agencies terminated his product endorsements. Appearances on television were abruptly canceled. “You don’t use the heavyweight championship of the world to spout religious diatribe,” maintained Harry Markson, the director of Madison Square Garden. “We’ve made so much progress in eliminating color barriers that it’s a pity we’re now facing such a problem.” The former champion Joe Louis, who knew discrimination, chimed in: “Clay will earn the public’s hatred because of his connections with Black Muslims. The things they preach are the opposite of what we believe.”29
Clay’s affiliation with the Nation of Islam was no mere whim. He had essentially been a member since 1962, but had kept his beliefs quiet in order to protect his career. Since Malcolm X and Elijah Muhammad were deeply reviled figures, his association with them was bound to cause disquiet. After Clay announced his conversion, commentators customarily referred to him as a “Black Muslim,” as if that was something altogether different from a genuine Muslim. “‘Black Muslim’ is a press word,” he protested. “The real name is ‘Islam.’ That means ‘peace.’” For most white Americans, the “Black Muslim” label fused two already frightening identities. Clay became, overnight, something altogether more threatening and foreign than an ordinary black man. Ed Lassman, president of the World Boxing Association, argued that Clay’s behavior was “detrimental to boxing.” The conversion was dismissed as a fad, what one reporter called the “Allah routine.”30
White America’s sense of betrayal was exacerbated by the peculiar dynamic of 1960s race relations. The black sports star was given a magic key into the white world. Fans ignored an athlete’s color if he was talented. Thus, an inveterate Chicago racist might adore Jim Brown because there was no better running back in football. Acceptance, however, implied duty. In exchange for the keys to the kingdom, the black star was supposed to be grateful, obedient. He was not supposed to present himself as a spokesman for black people. He could not, it was argued, be a spokesman, since his huge income and the admiration he enjoyed separated him from his race. It seemed hypocritical for such a man to complain about racism when he no longer felt its pain.
Clay, however, could not keep quiet. He used the podium of his heavyweight title to lecture on race. “The government should . . . get down on their hands and knees every morning and thank God that 22 million black people have not become anti-American,” he argued in January 1965. “You’ve given us every right to. The whole world would side with us if we became anti-American.” Clay’s diatribes delighted Malcolm X, who understood just how useful the new champion could be. He wanted a hero exclusively for blacks. “Clay . . . is the finest Negro athlete I have ever known,” he remarked, “the man who will mean more to his people than Jackie Robinson, because Robinson is the white man’s hero.” Malcolm calculated that the annoyance Clay would cause whites would be inversely proportional to the pride he would inspire among blacks.31
“This fight is the truth,” Malcolm told Clay before the Liston fight. “It’s the Cross and the Crescent fighting in a prize ring—for the first time. It’s a modern crusades . . . with television to beam it off Telstar for the whole world to see what happens!” This willful divisiveness annoyed white liberal supporters of civil rights. Annoyance increased when Clay told the Louisville Courier-Journal: “I’m not going to get killed trying to force myself on people who don’t want me. I like my life. Integration is wrong. The white people don’t want integration. I don’t believe in forcing it, and the Muslims don’t believe in it.”32
Clay’s emergence came at an opportune moment, since it allowed Elijah to be much more decisive with Malcolm. The conversion coincided perfectly with the Nation of Islam’s annual Savior Day. For the first time in a decade, Malcolm was absent, having been banished by Elijah, who used the occasion to showcase Clay. “I’m so glad that Cassius Clay was brave enough to say that he was a Muslim,” Elijah told the crowd. “He was able, by confessing that Allah was his god and by following Muhammad, to whip a much tougher man. They wanted him to get his face torn up, but Allah and myself said ‘No!’ . . . Clay has confidence in Allah, and in me as his only messenger.”33
One icon quickly replaced another. Elijah was willing to use Clay as an advertisement for Islam despite the fact that he hated boxing and never attended matches. On March 5, after announcing that Malcolm’s suspension would be indefinite, Elijah revealed that Clay would be given the name Muhammad Ali. This came as a surprise, since converts usually had to wait more than ten years for such an honor. While Clay was jubilant, the banishment of his close friend Malcolm at first troubled him. Faith, however, took precedence over friendship. He accepted that he would have to break with Malcolm. “You just don’t buck Mr. Muhammad and get away with it,” he explained. “I don’t want to talk about [Malcolm] anymore.”34
To whites, the name change was a bitter statement of rejection. As Cassius Clay, he was familiar, containable. As Muhammad Ali, he was foreign, threatening. Those who objected to the change persisted in calling him Clay, as if, by so doing, they could control the man, assert ownership of him, deny him what he wanted to be. The Los Angeles Times sportswriter Jim Murray mocked the boxer by calling him “Abdullah Bull Bull” or “Abou Ben Hernia.” To confirm that he was entirely serious, Ali made a point of walking out of Madison Square Garden when the announcer introduced him as Cassius Clay. “Clay means dirt,” he objected. “My white blood came from slave masters, from raping.” “People are always telling me what a good example I could be if I just wasn’t a Muslim,” he added. “I’ve heard it over and over, how come I couldn’t be like Joe Louis and Sugar Ray. Well, they’re gone now, and the black man’s condition is just the same, ain’t it? We’re still catching hell.” 35
The day after Ali announced his conversion, both the FBI and the Defense Department opened inquiries into his draft status. Given the close sequence, it is easy to conclude that Ali was being punished. The issue, however, is much more complicated than Ali worshipers seem to understand. The draft matter predated his title fight and the announcement of his conversion. He was eligible because he was the right age, was unmarried, and was not a full-time student. Fame was not supposed to be a deferment, a point the government occasionally tried to make. While the entire selective service system was undoubtedly iniquitous, it would surely have compounded its iniquities if Ali had escaped the call-up because of his fame.
On January 24, 1964, Ali had failed the military qualification examination, scoring in the sixteenth percentile, well below the threshold for induction. “I tried my hardest,” he insisted, but the Selective Service Board suspected otherwise and ordered him to retake the test. When the issue went public in early March, Ali maintained that he would not request exemption as a conscientious objector on grounds of religious faith. “I don’t like that name,” he said. “It sounds ugly—like I wouldn’t want to be called.” He instead took the test again on March 13, and failed again. This time, the board accepted the result and classified him as ineligible for service.36
Ali’s detractors, deeply disappointed that he had escaped the draft, now hoped that another boxer would silence him in the ring. The obvious contender was Floyd Patterson, who had announced that “as a Catholic” he felt obliged to “reclaim the title for America.” One suspects he meant “as a Christian,” but everyone understood, and most cheered. “Cassius Clay is disgracing himself and the Negro race,” Patterson told Sports Illustrated. “The image of a Black Muslim as the world heavyweight champion disgraces the sport and the nation. Cassius Clay must be beaten and the Black Muslim scourge removed from boxing.” Ironically, Patterson, the great defender of American values, had only recently been forced to sell his $140,000 home at a considerable loss because of incessant racial abuse from white neighbors.37
Ali responded with typical bluster. “Patterson says he’s gonna bring the title back to America,” he spat. “If you don’t believe the title already is in America, just see who I pay my taxes to. I’m American. But he’s a deaf dumb so-called Negro who needs a spanking. I plan to punish him for the things he said; cause him pain. . . . The little pork-chop eater don’t have a chance.” He then shifted to rhyme:
I’m gonna put him flat on his back,
So that he will start acting black,
Because when he was champ he didn’t do as he should,
He tried to force himself into an all-white neighborhood.
Ali did not predict a short fight, because he did not want one. On November 22, 1965, he made Patterson suffer, pummeling him with devastating punches, then backing off to allow him to recover, only to resume the punishment. Punches provided percussion to a continuous refrain of abuse. “Come on America!” Ali taunted. “Come on white America!” Fans and critics alike were appalled at his cruelty.38
Sublime talent could not mask the stench of hypocrisy. Ali had blamed Patterson for being the victim of racism—it was his fault for moving into a white neighborhood. He subjected his opponent to verbal abuse which, if uttered by a white man, would have been condemned as racist. “No black athlete had ever spoken so disparagingly of another black athlete,” the tennis star Arthur Ashe complained. Patterson, not Ali, seemed to recognize the brutal irony of two black men beating each other up for the entertainment of a mainly white audience. “There is so much hate among people,” he complained, “so much contempt inside people who’d like to think they’re moral, that they have to hire prizefighters to do their hating for them. And we do. We get into a ring and act out other people’s hates. We are happy to do it. How else can Negroes like Clay and myself, born in the South, and with little education, make so much money?” On that conundrum, the Mouth was silent.39
After the Patterson fight, the draft issue resurfaced. The war was using up America’s eligible men at a voracious pace. Desperate to widen the recruiting pool, the Selective Service Board in early 1966 lowered the passing score from 30 percent to 15 percent. Ali suddenly found himself eligible. He was mystified: “For two years the Army told everyone I was a nut and I was ashamed. And now they decide I am a wise man. Now, without ever testing me to see if I am wiser or worser than before, they decide I can get into the Army.” “Why me?” he kept asking. “Man, I ain’t got no quarrel with them Vietcong.”40
In truth, not many Americans did. Ali had by this stage been struggling with the draft issue for two years, yet he still had not developed a coherent position on the war, or on where his Islamic faith placed him. He saw himself as different from the thousands of other young black men in his draft cohort, at the same time that he aspired to be their representative. He suggested that he was more valuable to the war effort as a professional boxer than as a GI. “I buy a lot of bullets, at least three jet bombers a year, and pay the salary of fifty thousand fighting men with the money they take from me after my fights.” That line of argument was abandoned when he realized how selfish it sounded. He then claimed that his ring career was immaterial to the question of service. “Boxing is nothing, just satisfying some bloodthirsty people. I’m no longer Cassius Clay, a Negro from Kentucky. I belong to the world, the black world. I’ll always have a home in Pakistan, in Algeria, in Ethiopia. This is more than money.”41
Ali claimed he was being victimized, punished for his outspokenness and his conversion to Islam. “I can’t understand it. Out of all the baseball players, all of the football players, all of the basketball players—why seek out me, the world’s only heavyweight champion?” That argument had some justification. Draft boards were known to call up “troublemakers,” like the civil rights leaders John Lewis and Bob Moses, apparently to silence them. Yet there is still something slightly sordid about Ali’s line of argument, based as it is on an assumption of exclusivity. His arguments have been accepted with little critical examination because, today, he is universally loved and the Vietnam War seems unjust. From that logic flows the argument that the US government “stole” the career of a gifted boxer. If this is true, however, it must be pointed out that the government also stole the prime years of some nine million other Americans. The issue of Ali’s “unfair” treatment is based on the assumption that he was too important to go to war, that those less gifted should have gone in his place. That seems grossly elitist.42
Ali’s detractors labeled him a coward. He wasn’t a coward, but he was an opportunist. Despite at first arguing that it was not right to claim conscientious-objector status, he eventually settled on that as the most promising line of escape. On February 22, he filed for CO status, explaining: “It would be no trouble for me to go into the armed services, boxing exhibitions in Vietnam and traveling the country at the expense of the government or living the easy life, and not having to get out into the mud and fight and shoot. If it wasn’t against my conscience to do it, I would easily do it. I wouldn’t raise all this court stuff, and I wouldn’t go through all this and lose the millions that I gave up and my image with the American public.” Rather surprisingly, Judge Lawrence Grauman, on reviewing the case, bought that argument, ruling that Ali was “sincere in his objection on religious grounds to participation in war in any form.” This, however, was not the ruling that the government wanted, since it seemed to suggest that the Muslim faith provided legitimate escape from service. As the prosecutor Morton Susman warned, “If [Ali] gets by, all black people who want to be Muslims will get out for the same reasons.” The Justice Department, insisting that Ali’s refusal was based on racial and political—not religious—grounds, called upon Kentucky’s selective service appeal board to ignore the Grauman ruling. They dutifully obliged. The case was referred to the national appeal board, which delivered the decision the government wanted.43
In truth, Ali’s objections were based as much on politics as on religion. In voicing his opposition to the war, he could seldom resist the opportunity to criticize the government. While touring Louisville, he argued: “Why should they ask me to put on a uniform and go ten thousand miles from home and drop bombs and bullets on brown people in Vietnam while so-called Negro people in Louisville are treated like dogs and denied simple human rights? No, I am not going ten thousand miles from home to help murder and burn another poor nation simply to continue the domination of white slavemasters of the darker people the world over. This is the day when such evils must come to an end.” Heartfelt as that objection might have been, it contains a massive flaw. Reversing the logic produces the following possibility: Ali would have gone ten thousand miles to drop bombs and bullets if the conditions for black people at home had been better. He confirmed that possibility later in the statement: “The real enemy of my people is right here. . . . If I thought the war was going to bring freedom and equality to twenty-two million of my people, they wouldn’t have to draft me—I’d join tomorrow. But I either have to obey the laws of the land or the laws of Allah. I have nothing to lose by standing up for my beliefs. We’ve been in jail for four hundred years.” Only one besotted by Ali can ignore the contradiction in his statement. He’d claimed he had a religious objection to war. Yet that should have applied to all wars, as Judge Grauman suggested it did. Pacifism, as every draft board knows, is supposed to be absolute. His claim that he would join up in a minute if he was convinced that the war was a bona fide freedom struggle merely demonstrates that his objection pertained specifically to this war and was based on politics. 
There is nothing dishonorable in such an objection, but there is something rather shifty in attempting to pass it off as genuine conscientious objection based on religious opposition to war.44
By April 1967, Ali’s options for appeal were exhausted. On the 28th he refused to take part in an induction ceremony in Houston. The boxing establishment immediately began their censure of him, suspending his license and stripping him of his title. Then, in June 1967, Ali went on trial before an all-white jury. He was quickly found guilty and sentenced to five years in prison and a $10,000 fine. The severity of the judgment shocked even the prosecutor, given that the usual sentence was eighteen months. Ali, however, remained defiant. “Standing up for my religion made me happy; it wasn’t a sacrifice,” he later insisted. “When people got drafted and sent to Vietnam and didn’t understand what the killing was about and came home with one leg and couldn’t get jobs, that was a sacrifice. But I believed in what I was doing, so no matter what the government did to me, it wasn’t a loss.”45
For his crime, Ali spent three years exiled from boxing. That was a significant sacrifice but, as he intimated, it is risky to compare sacrifices when there are nearly 60,000 names on a black-marble memorial in Washington. The system was harsh, but its harshness was felt by thousands of others besides Ali. To single him out as uniquely mistreated denigrates the experiences of those who suffered more. Likewise, to dismiss the system as corrupt demeans those who managed to convince the tribunals that their conscientious objection was genuine, for the simple reason that it was.
CHELSEA: MARY QUANT
In November 1955, a young art student named Mary Quant, along with her boyfriend, the suitably cool jazz trumpeter Alexander Plunket Greene, opened a shop called Bazaar on the King’s Road in London’s Chelsea district. Like Bob Dylan, Quant anticipated an age; her clothes were fresh, breezy, and bright, at a time when Britain was still gray, boring, and fusty. “Suddenly the King’s Road was full with new and unlikely people, all of the magical age in which they were adult but not old,” Alexandra Pringle recalled. “The women . . . wore big floppy hats, skinny ribbed sweaters, keyhole dresses, wide hipster belts and, I believed, paper knickers. They had white lipsticked lips and thick black eyeliner, hair cut at alarming angles, op-art earrings and ankle-length white boots. They wore citron-colored trouser suits and skirts that seemed daily shorter. They rode on miniature motorbikes. They had confidence and, it seemed, no parents.” Quant loved the adjectives “kooky” and “kinky,” words most people still thought pejorative, but which would soon define a decade. “I hated the clothes the way they were,” she later explained. “I wanted clothes that were much more for life, much more for real people, much more for being young and alive in.” Like the brilliant businesswoman she was, Quant provided a product before people quite realized they wanted it. By the end of her first week of business, it was clear that Bazaar would not only be a huge success, it would also transform fashion around the world.46
During World War II, British fashion had been carefully controlled through clothing coupons and utility designs. The latter quashed style by stipulating precisely how many pleats a skirt might have and how long it should be. Dyes were in short supply, since the military had first priority. Fabrics had to be produced at home as much as possible, and this meant that virtually everything was made of wool. The overall effect was that everyone looked the same—something the government encouraged because it strengthened the impression of a people’s war.
In 1947 came Christian Dior’s New Look, an intentional rebuke to wartime drabness. While undoubtedly modern, it was also assertively nostalgic, a throwback to an imaginary age of plenty. Long dresses and flouncy skirts were conspicuously wasteful of resources; frill attacked asceticism. Most Britons, however, had to admire the New Look from afar, since money was still tight, resources controlled, and austerity still policy. It was not until the mid-1950s that the British could again think of clothes as fashion.
Those who were parents in the 1950s were children of empire, of war, and of economic depression. To them, Britain was an old country and a country which drew strength from its oldness. To their children, however, Britain was postimperial, even though no one used that term. They had heard stories of hard times, but had not felt the pain. No one quite knew what Britain would become in the second half of the twentieth century. What was clear, however, was that it would be something different—something new—and that newness would be defined by the young.
Fashion heralded the triumph of the new. Teds morphed into Mods, a style which defined both clothing and behavior. Those who did not live the Mod life nevertheless coveted the Mod style: sharp lines, simple designs, basic colors—the very antithesis of the New Look. “At first we thought it was just the art-student type that wanted to look like us and buy our clothes,” one former Mod reflected. “What we didn’t realise at the time . . . was the fact that we were interpreting the mood of a whole generation, not just smart art students. The whole thing caught on in a much bigger way than we expected.” That was partly due to Quant, who took an improvised street fashion and turned it into a billion-pound industry. “Suddenly someone had invented a style of dressing which we realised we had been wanting for ages,” wrote Brigid Keenan, who covered fashion for the Daily Express. “Comfortable, simple, no waists, good colours and simple fabrics. It gave anyone wearing them a sense of identity with youth, and adventure, and brightness.”47
In November 1965, the model Jean Shrimpton, while visiting Australia, shocked observers at the Melbourne Race Track when she arrived wearing a shift which ended a good four inches above her knees. The event, which was still provoking comment in newspapers a week later, is seen as the official unveiling of the miniskirt, even though hers was not technically a skirt. The revolutionary fashion turned every woman into an exhibitionist. At the same time, the men at the Exchequer tore their hair in frustration. The tax on clothing stipulated that a woman’s skirt had to be at least twenty-four inches from waist to hem; anything shorter was a child’s skirt, and children’s clothing was not taxed. Thanks in part to Shrimpton, the rules of fashion stipulated that a miniskirt was authentic if it measured around twenty inches or less, while the really daring wore microskirts a mere twelve inches in length.
Soon, the controversial style had circled the globe. For Yasmin Alibhai, who grew up in Uganda, the Sixties began when a new girl arrived at her school: “Nadya walked into the school party shimmering in a noisy silver paper foil dress, and hair which reached down three inches below the hemline, much of which she was air-propelling by the deft movement of her head from side to side with extraordinary viciousness. She was also showing a terrifying amount of leg. She cut through the crowd which was paralysed with horrific admiration and put on a Joan Baez tape, belting out some protest song. That too was a first.”48 In the United States, the Wall Street Journal discovered that 52 percent of businesses found miniskirts acceptable for office wear, while long hair (on men) was approved by only 5 percent. “It’s a functional thing,” quipped New York’s mayor, John Lindsay. “It enables young ladies to run faster—and because of it they may have to.” In the California legislature, however, women were banned from wearing miniskirts. The author of the ruling, Eugene Chappie, explained that he “was getting sick of turning my head when one of them was at a drinking fountain.” In San Diego, the Padres baseball team desperately tried to boost attendance by hosting Miniskirt Days, with free admission to women so attired. Management reasoned that if baseball did not bring men to the stadium, short skirts would. Around the same time, Pacific Southwest Airlines enjoyed a brief boost in profits thanks to its reputation for having the stewardesses with the shortest skirts.49
What Mary Quant gave to women, John Stephen gave to men. He came from Glasgow in the mid-Fifties to work in London as a tailor’s clerk, but before long opened a Carnaby Street boutique called His Clothes. Stephen was not afraid to let street style dictate what was in his shops, instead of the other way around. He would watch what hip Mods were wearing and would put those same clothes on the racks a short time later. So great was his mastery of popular taste that within a few years he had more than fifty shops in Britain, Europe, and the United States.
While Quant and Stephen saw themselves as trendsetters—the instigators of a fashion revolution—they were also astute businesspeople who saw a niche in the market and cleverly filled it. Though most Britons hardly felt affluent, they sensed that better times had arrived. Unemployment was low and so were prices. Young people with a modicum of disposable income spent it on conspicuous finery, in a way that made their ascetic parents cringe. Working-class girls thought nothing of spending half their weekly wage on clothing, cosmetics, and jewelry.
The clothes themselves, like the new music, were designed to be disposable. Because fashion changed so quickly, quality and durability were unimportant. Cathy McGowan, the young presenter of the pop music program Ready Steady Go! and one of the foremost exemplars of the “dolly bird” image, was briefly employed as a “teenage consultant” by three large clothing chains. She advised that “kids want clothes to look terrific—and they don’t wear them for long, so it doesn’t matter if they fall to bits.”50
Quant, Stephen, and their imitators understood that new fashions required new marketing techniques. Boutiques were not simply clothing stores; they were places where kids went to have fun, to meet people, and to indulge in fantasy. Customers were encouraged to linger. The interior decoration harmonized with the clothing styles. Music was loud and ubiquitous, carefully calculated to raise spirits. Out went the older, professional shop assistant, who made customers feel unwelcome. The new clerks were the same age as their clients, wore the same clothes, and usually managed to give the impression that they, too, were simply having fun. Shops pretended to be democratic, in line with the illusion of a classless society. “Haute couture doesn’t have the same meaning anymore, because money, status, and power no longer have the same meaning,” the fashion designer Rudi Gernreich proclaimed. “Now fashion starts in the streets. . . . That’s why I watch kids.”51 Quant claimed that “snobbery has gone out of fashion, and in our shops you will find duchesses jostling with typists to buy the same dresses.”52 This was rank marketing hype, since her clothes did not come cheap. Fashion perhaps started in the streets, but the retail industry devised a way to sell back to kids (at considerable profit) what they themselves had invented.
With a new look went a new body type: straight lines and simple shifts did not work very well on women shaped like Marilyn Monroe. According to Barbara Hulanicki, the immensely successful founder of the boutique Biba, the perfect look was “square shouldered and quite flat-chested [with] . . . an upturned nose, rose cheeks, . . . a skinny body with long asparagus legs and tiny feet.” Big breasts, Quant felt, suggested motherhood, and mothers could never be groovy. Whereas girls had once dressed to look like women, now women were desperate to look like girls.53
“The Look” was not confined to clothes. In 1963, Quant asked the young Vidal Sassoon to provide a hairstyle to go with her latest fashion line. He had already been experimenting with “the Shape,” a simple bob made revolutionary by the fact that it was long at the sides and short at the back. The line was so clean and the curve so perfect that it looked like it had been cut with precision tools. It had something of the space age about it—the kind of hairstyle an alien might sport. This was no coincidence, given that the outer-space theme was used extensively in fashion to suggest ultramodernity. Manmade fabrics furthered the image; designers took to using PVC and Mylar, or at the very least shiny polyester. Photo shoots were often staged with a backdrop made to look like Mission Control. Designers displayed a fondness for silver, bright white, and fluorescent colors, while geometric patterns and simple stripes suggested functionality. The models themselves, with their flat chests and straight hips, looked and acted like an army of androids.
Fashion is the easiest way to adopt an image: wear the clothes and become the person—at least on the outside. Deconstructing that image is like walking through a minefield. In striving for the youthful look, Quant intentionally mimicked children’s clothing, in particular school uniforms. She readily admitted that she wanted women to appear under the age of consent. While that seems a style specifically designed to satisfy warped male fantasies, Quant insisted that her fashions implied strength: “The way girls model clothes, the way they sit, sprawl or stand, is all doing the same thing. It’s not ‘come hither,’ but it’s very provocative. She’s standing there defiantly with her legs apart saying, ‘I’m very sexy. I enjoy sex, I feel provocative, but you’re going to have a job to get me. You’ve got to excite me and you’ve got to be jolly marvellous to attract me. I can’t be bought, but if I want you, I’ll have you.’” The novelist Sheila MacLeod disagrees. She wonders in retrospect “how [I] could have been deceived into imagining that the fashions of the Sixties spelled liberation for women.” Instead, “their purpose was to imprison women more securely as objects of male attention, male ribaldry, male lust.” Germaine Greer, in The Female Eunuch, objected vehemently to the way women had been turned into sex objects, as if that were something new. Most miniskirted women, however, saw Greer as a loudmouthed spoilsport. The era witnessed a new version of an old argument: Do women dress for themselves, or merely to please men? The answer is elusive because it differs from woman to woman, depending upon who they were pre-mini and who they became. One woman’s flag of surrender was another’s weapon of conquest.54
Before long, the arbiters of style decided that the only acceptable standard in fashion was unconventionality. At David Bailey’s wedding to Catherine Deneuve on August 18, 1965, the bride wore a blue sweater and light-green corduroy trousers and arrived smoking a cigarette. The best man, Mick Jagger, wore blue denim and no tie. In truth, the most unconventional thing a guest could have done would have been to wear a tuxedo. Sartorial nonconformity sent the Look into decline after 1966; its orderliness was out of place in an age devoted to chaos. Quant gave way to the flower children, for whom freedom was paramount. The new trend was to dress as if style were unimportant—to wear whatever one wanted, the more bizarre the better. Hulanicki anticipated this when she opened Biba in September 1964. Out went the neat lines and precise colors pioneered by Quant; Biba was full of off-tones, flowing fabrics, colors rich instead of bright. As Alexandra Pringle recalled: “A certain lasciviousness, a sort of voluptuousness crept in. . . . I tried on clothes for the sinful and louche: slithery gowns in glowing satins, hats with black veils, shoes stacked for sirens. There were shoulder pads huge as American footballers’, evening dresses to make Betty Grable sigh, makeup—chocolate and black—for vamps and vampires.” While Bazaar catered to the rich, almost anyone could afford Biba. Hulanicki concentrated on volume, packing her stores with cheap clothes guaranteed to fall apart or go out of fashion in a few weeks. By the end of 1967, her main store was reputed to have the highest turnover per square foot of any shop in the world. On any given day, she lost more to shoplifters than most boutique owners made in profit.55
“So where does all the Swinging London stuff, pop music, hemlines, where did it all fit in?” the novelist Angela Carter asks. “I’d like to be able to dismiss it all as superficial and irrelevant to what was really going on, people arguing about Hegel and so on, but I’m forced to admit that there was a yeastiness in the air that was due to a great deal of unrestrained and irreverent frivolity. . . . There’s no denying that towards the end of the decade everyday life . . . took on the air of a continuous improvisation. . . . Carpe diem. Pleasure. It didn’t have to cost much, either.” The success of boutique-style retail astounded even those at its cutting edge. By 1966, Quant was making £6 million a year and had brokered a deal with J. C. Penney to market her designs in the United States. Stephen, the working-class Glaswegian, had a mansion in the country and a fleet of fancy cars. London, with some 2,000 boutiques, had become the fashion capital of the world. The boutiques, the music, the sheer effervescence of life in the capital had combined to make London the Swinging City. “The city is alive with birds (girls) and Beatles, buzzing with minicars and telly stars, pulsing with half a dozen separate veins of excitement,” Time magazine proclaimed.56
“Britain has lost an empire and has not yet found a role,” Dean Acheson remarked in 1962. By 1965, Britannia’s role was clear. Whereas once she had prided herself on giving substance to the world, now she gave it style.