Movers, Shakers, and Boomers
In 1970 the Bayonne High School class of 1960 gathered for their reunion. Journalist Steven Roberts told their story as a participant observer, interviewing his old classmates and comparing notes with them, in a feature article in the Sunday New York Times. One common theme emerged: the class of 1960 had “just missed out” on the great changes of the upcoming decade. As one alumnus commented, “The last five years have really been the turning point.” What had changed? Practically everything.
Between 1965 and 1970 the “police action” in Vietnam had escalated to a war, the civil rights movement had blossomed into Black Power and Nixon’s “Southern Strategy,” Reefer Madness (1936) had become a cult laughingstock on the college film circuit, and Playboy had discovered pubic hair. The women at the reunion discussed their marriages and children through the new lens of second-wave feminism. “We had been shaped,” Roberts concluded, “in the dying years of a world that no longer exists.” The basic assumptions instilled in them in the 1950s—“respect authority . . . sex is dirty”—had been swept away.1
Or had they? While many younger Americans were embracing the sexual revolution, the civil rights movement, and the celebration of personal freedom, many others were not. Today’s silver-haired conservatives did not spring from thin air during the Reagan administration. The story of Mitt Romney and a few friends forcibly cutting a classmate’s long hair may have shocked voters during the 2012 presidential campaign, but there were dozens of similar incidents reported across the country in the 1960s, and probably many more that were unreported.2 Contrary to popular media images, not everyone in the 1960s and 1970s was white, middle-class, and straight. Nor did we all become hippies and protesters in college. One of my most vivid memories of the Syracuse University campus is of the sunny afternoon in May 1970 when I attended a vigil for the students who had just died at Kent State. One end of the Quad was a mass of students singing antiwar songs; at the other end some of our classmates were sunbathing and throwing Frisbees. Between us, students headed to their classes along the walkways that crisscrossed the lawn. Two of the students who died at Kent State had been passers-by like them, not protesters.
No generation is a monolith, no matter how society’s institutions treat them. Baby boomers, as defined by Madison Avenue, did not exist in real life but were as much a construct as any other demographic or marketing segment. Contrary to popular stereotypes, there were—and are—black, Latino, queer, straight, celibate, disabled, and working-class baby boomers, with a diversity of opinions about politics and morality.
Nor was the older generation uniformly opposed to the transformations taking place in American culture. The doctor who raised so many of us—Benjamin Spock, then in his sixties—was a familiar figure at major antiwar rallies, and many other liberal heroes and heroines were contemporaries of our parents and grandparents. It may be tempting to frame the divide that emerged as a “generation gap”—a term popularized during the early 1960s—but it is more useful to see it as the opening wedge in the culture wars that have engulfed the United States for the past fifty years.
Like huge tectonic plates colliding to reshape continents, three simultaneous forces began to interact during this time period. The first was the postwar baby boom, which in 1960 began pumping millions of teenagers a year into the consumer marketplace. The second was the sexual revolution, which had its roots in the sexology studies of Masters and Johnson, Hugh Hefner’s dream of sexual freedom, and the uncoupling of sex and procreation. Finally, the civil rights movement focused national attention on individual rights, beginning with African Americans but soon expanding to include youth and women of all races and, to a lesser extent, gays and lesbians. The civil rights movement and the sexual revolution were well under way when baby boomers were still watching Howdy Doody (1947–1960) and would have been major influences on American culture with or without them. The adolescence and young adult years of the baby boom accelerated the conflagration, and our diverse experiences during those formative years are reflected in the conflicts that have dogged my generation ever since.
Why look at the tensions and controversies of this era through clothing trends? It’s common to think of fashion as superficial, bearing little relationship to the serious issues of its time. This is wrong on two counts. First, fashion changes have clearly expressed deeply held convictions in moments of upheaval. The best example is the abandonment of knee breeches (associated with the aristocracy) in favor of trousers in revolutionary France, a shift that foreshadowed the triumph of commercial culture over hereditary power in the nineteenth century. (A more cynical explanation, but equally valid in some cases, is that the sudden taste for proletarian pants reflected an acute desire for survival by the French aristocracy.)
The other reason to look past the apparent triviality of fashion is that it is an important way that individuals connect themselves to others in modern consumer culture. We dress to express ourselves—age, gender, race, religion, as well as personality—and to place ourselves in context: place, time, occupation, kinship, and communities. Theater critic Eric Bentley, observing the clashes over clothing and hair, wrote in 1970, “If hair-dos and clothing are hardly, in themselves, worth a fight to the death, in the nineteen sixties they did become symbols of more than just a lifestyle; they became symbols of another life, and this the essential life of human beings, the life of their deep affections and their cherished thoughts.”3
This juxtaposition of “lifestyle” and “life” brings to mind the rhetoric of modern opponents to gay rights. To label the way someone lives a “lifestyle” is to reduce their existence to a spread in this month’s issue of Esquire or Vogue—a whim, subject to change with season or mood. The fashion controversies of the 1960s and 1970s—for example, whether women should wear pants to work, or whether boys’ long hair or girls’ miniskirts disrupted education—were not about lifestyle. They were, in the words of the era, about “doing your own thing.” To be your own person and express yourself fully was, and always will be, a serious and complicated process, and the efforts of people struggling to make lives for themselves through the upheavals of that era are still influencing our culture. That doesn’t mean the baby boomers’ struggles were more important; it’s their (our!) sheer numbers that have made that generation so influential. In fact I take care in this book to consider both the experiences of people who were not teens or young adults and those who were baby boomers but were outside of “mainstream” boomer culture by choice or exclusion.
I grew up knowing that my brother and I were part of a “baby boom” that happened when World War II ended and couples settled down to start long-delayed families. We weren’t “baby boomers” until 1970, when the label first appeared in a Washington Post article, according to the Oxford English Dictionary. As “leading edge” boomers (born in 1947 and 1949), we had a front-row seat for the cultural changes of the 1950s—television, the growth of suburbs, the Cold War. Those seats always seemed pretty crowded; in the early 1960s we attended a school so overrun with kids that we were on half session: seventh and eighth graders attended in the afternoon, and ninth graders and up attended from 7:00 AM to noon. Frankly, being part of a baby boom seemed more of an embarrassment and an inconvenience than anything else—that is, until Madison Avenue discovered the youth market. The first national brand to target baby boomers was Pepsi, with its 1963 ads that shouted, “Come Alive! You’re in the Pepsi Generation!”4 Vogue editor Diana Vreeland coined the term “Youthquake” in 1965 to describe the sweeping influence of young people in seemingly every facet of life: music, fashion, and politics. Suddenly we were leaders!
Although baby boomers made up nearly 50 percent of the U.S. population in 1965, we weren’t alone on the cultural scene. Our older siblings and cousins, born between 1925 and the end of the war, dubbed the “Silent Generation,” were just coming into their own in the mid-1960s, with their own lives and desires. They were often forced to choose sides between the seasoned survivors of the “Greatest Generation” and the defiant baby boomers rather than blaze their own trail. The Silent Generation has not produced a president of the United States, the nation having gone in 1992 from our last World War II–era leader (George H. W. Bush) to our first boomer, Bill Clinton, and staying with that cohort long enough to block them permanently. But the Silent Generation did provide the Youthquake with its sound track: the Beatles, Jimi Hendrix, Joan Baez, Brian Wilson of the Beach Boys, Johnny Cash, Aretha Franklin, Barry Manilow, and Bob Dylan were all born between 1925 and 1946. So were fashion designers Mary Quant, Ralph Lauren, Yves Saint Laurent, and Betsey Johnson, as well as iconic hair stylist Vidal Sassoon. (The other major names of the era—Courrèges, Cardin, and Gernreich—were born just before 1925.)
Born between the early 1960s and 1981 (demographers differ on the date for the end of the postwar baby boom), Generation X was emerging, though blessedly unlabeled until 1991, when Douglas Coupland’s novel Generation X: Tales for an Accelerated Culture christened them as such.5 These were the beneficiaries—or victims, depending on your point of view—of the social and cultural transformations of the 1960s. They never knew Jim Crow laws or “Help Wanted” ads divided by sex or race, and they were legal adults when they turned eighteen, while first-wave boomers had to wait until they were twenty-one to take advantage of adult privileges. For the most part they missed the free love and high times of the boomers’ youth, thanks to PCP, crack cocaine, the War on Drugs, the resurgence of STDs, and the discovery of HIV/AIDS. Still, they play an important role in this story, because they were the guinea pigs for parents and educators attempting to prepare the next generation for the Age of Aquarius, the Apocalypse, or whatever else they thought was on the horizon. Of course there were also our elders: men and women in their prime or in their twilight years, who had lived through so much and now found themselves irrelevant to marketers and challenged, baffled, or infuriated by their children and grandchildren.
In every age group there were atheists and believers, political views spanning the spectrum from Marxists to John Birchers, prudes and libertines. The usefulness of generational categories stems from their adoption by manufacturers, retailers, media, and advertisers as a means of targeting customers. Since we are examining consumer culture, these niches tell us something about how groups of Americans were perceived by the commercial world. Hardly any of us has escaped feeling pointedly targeted, or pointedly ignored, by advertisers.
If you were born after 1981, don’t worry. The party that started in the 1960s is still going strong, and you’re invited—like it or not. As I reveal in the rest of the book, the styles of the ’60s and ’70s were just the visible signs of the questions on everyone’s mind—questions we are still struggling to answer. Many of them deal with the most essential aspects of our beings: sex and gender.
Baby boomers were sometimes accused of behaving as if we had invented sex; in fact we would have been the dimmest generation in human history if we hadn’t responded to the national fascination with sex that coincided with our own adolescence. And we would not have been normal teenagers if we hadn’t responded to that environment with hyper-hormonal enthusiasm. Like most revolutions, this one had been decades in the making. Unbeknownst to us, our grandparents had already witnessed a first sexual revolution in the 1920s among writers, artists, and other bohemians inspired by Freudian psychological theory, which introduced the concept of a human unconscious driven by sexual desires and fantasies. The music, clothes, and literature of the Roaring Twenties celebrated a hedonistic, sensual youth culture that arose from the horror and destruction of World War I, only to be submerged again in the Great Depression. The academic study of sex continued in biology and psychology departments, building up a body of work that began to attract wider public attention with the 1948 publication of Alfred Kinsey’s Sexual Behavior in the Human Male, followed in 1953 by Sexual Behavior in the Human Female. Hugh Hefner, as a graduate student in journalism at Northwestern University, wrote his master’s thesis on Kinsey’s work before launching Playboy. The obscenity cases over Lady Chatterley’s Lover (1959), Tropic of Cancer (1964), and Fanny Hill (1966) opened up a market for racy novels that became more and more explicit. By the mid-1960s curious teenagers could find just about any kind of information they might desire about sex, though probably not in any public library. Personally, I learned a great deal just browsing the books and magazines in the homes where I babysat.
Explicit straight extramarital sex in books and movies was just the beginning. Homosexuality, once hidden and persecuted, became, if not completely open and still far from accepted, a titillating subject of conversation and art. More common was bisexuality, which several cultural observers identified as the latest cool thing in the early 1970s. Love triangles have been a time-honored plot device, but in the early 1960s group marriage and other forms of polyamory caught the imagination of the many fans of Robert Heinlein’s Stranger in a Strange Land (1961). A steady stream of popular works on multiple relationships followed, including Robert Rimmer’s novels, particularly The Harrad Experiment (1967); the film Bob & Carol & Ted & Alice (1969); and Nena and George O’Neill’s book Open Marriage (1972), which sold 1.5 million copies. Of course much of this sexual freedom was facilitated by the availability of the Pill (approved in 1960), which made possible the separation of intercourse and reproduction and also the uncoupling of “love and marriage” (which, we had learned from Frank Sinatra in 1955, “go together like a horse and carriage”). Not surprisingly, baby boomers are more likely to admit to smoking dope than to any form of sexual experimentation beyond “shacking up” before marriage.
This upheaval in intimate relationships is usually characterized as the “sexual revolution,” but I suspect that had it happened a decade later we would be calling it the “gender revolution” instead. The concept of “gender identity”—the acquired cultural traits that proceed from biological sex—was quite new, having been introduced to the scientific literature only in 1955 by sexologist John Money (more about his troubling career later in the chapter “Nature and Nurture”). Betty Friedan does not use the word “gender” once in The Feminine Mystique (published in 1963); at that time “sex role” was the more common term, signifying the close relationship between biology and our lives as social beings. The distinction between sex and gender has never been easy to grasp or even generally accepted. No matter how scholars have tried to explain the distinction between nature and nurture, popular media and consumer culture reflect the general uncertainty as to which traits, tastes, and behaviors are cultural and which are innate. After all, we’ve known for hundreds of years that the earth circles the sun, yet we still speak about the sun setting, because that’s how it feels. In the case of sex and gender, the jury is still out on how separate they really are. While the sexologists, evolutionary psychologists, anthropologists, and neurobiologists sort it out, the rest of us will continue to mingle and confuse them.
Before John Money introduced the notion of a cultural dimension called “gender,” the variations in human sexual activity and expression could be labeled as natural or unnatural, normal or abnormal, legal or illegal. What was natural, normal, and legal was good; the unnatural, abnormal, and illegal required treatment, correction, or punishment. Adding cultural influence to the mix was a brilliant insight, and clearly true: anthropologists and historians could provide ample evidence of the mutability of cultural patterns over time and geography. But it also raised some very thorny questions. If an individual’s gender expression did not match their biological sex, was that necessarily the result of biological or psychological abnormality, a character flaw, or incorrigible criminality? Could culture be the problem in such a “mismatch”? Were cultural norms automatically right? After all, they were subject to change and variation. Without using the word “gender,” Betty Friedan argued that suburban life was an alien and toxic culture and that the scientific arguments used to justify consigning women to lives of nurturing and consuming were false. Treating biological sex as a defining, existential characteristic denied individuality and human agency. To achieve her highest potential a woman must be as free as a man to pursue her interests and use her talents, and it was culture—not biology—that was standing in her way.
The Pill is often credited with launching the sexual revolution, and reliable, hormonal birth control was certainly a biological solution to what appears to be a biological problem. But a closer look reveals the problem with this perception. First, as my mother, a registered nurse, was fond of telling me, not only had my generation not invented sex; we had not discovered birth control either. Remember that one of the reasons the postwar baby boom was so dramatic was the “birth dearth” that preceded it. People did not stop having sex when the economy crashed in 1929; they stopped having children or had fewer of them. They used condoms and diaphragms (which worked pretty well), withdrawal and rhythm (with less success), and when those methods failed they sought abortions. One of my professors in college told the story of her mother, who had five children during the Great Depression—and four abortions, one between each live birth. My own mother, who had been the third oldest in a family of eight children, had a tubal ligation in her late twenties after producing my brother and me.
The convenience and certainty afforded by oral contraceptives would not have been possible without cultural change driven by a desire among young women and men for different lives from those of their parents. The alternative visions included a life with fewer children, or children later in life, but, more important, it included a sexual life without marriage, monogamy, or even commitment. When social commentators raised the alarm about the sexual revolution, it wasn’t the birthrate that concerned them; it was women’s sexual freedom, the severing of the connections between sex and love, the decline of premarital chastity. From the perspective of young, sexually active single women, oral contraceptives were a powerful weapon against the old double standard and a means of escaping the pattern of early marriage and motherhood that had become the standard during the 1950s. This was not about sex and reproduction, it was about gender: about life, not lifestyle, about the cultural expectations of women.
The gender revolution was not just about femininity; it was also about masculinity and about homosexuality. There was no male equivalent to The Feminine Mystique on the best-seller list, but men were subject to as many restrictions as women, just different restrictions—ones that resulted in, and reinforced, power and privilege for some. Those advantages came at a cost, as studies were beginning to show in the late 1950s. Men’s lives were shorter, they were at much greater risk for heart disease and stroke, and they began to regret their absence from their children’s lives in the mom-dominated suburbs. A men’s movement and scholarly interest in masculinity emerged, led by psychologists Joseph Pleck and Jack Sawyer, who organized a “Male Liberation Festival” at Harvard University in 1971. Their groundbreaking anthology, Men and Masculinity (1974), inspired even more academic interest in male sex roles, though the subject has never enjoyed the visibility or influence of women’s studies.6
The gender revolution touched homosexual men and women as well and in even more complex ways. In 1960 homosexuality was still considered a mental illness by the American Psychiatric Association, and to be a sexually active homosexual man or woman was to be an outlaw, thanks to sodomy and public indecency laws. The erotic possibilities of bisexuality appealed to many young people, straight or otherwise, who made it the “in lifestyle” in the early 1970s. For gays and lesbians this popularity meant that bisexuality could work as a culturally acceptable location between the closet and complete coming out.
In the scientific community the idea that sex and gender could ever be completely separate was controversial, especially as women began to demand full legal equality with men. In 1972 Time published an article about the work of John Money and his contention that gender identity could be shaped independently from biological sex. (He based his argument on his work with intersex children who were surgically assigned as females and treated with hormones and behavioral therapy to produce happy, well-adjusted girls—or not, as the case turned out.) An assembled panel of experts, including Money, discussed the possibility and desirability of a “unisex society.” They considered the supposed differences between men and women—verbal ability, creativity, temperament, and so on—and came to the conclusion that culture played a greater role in all of them than did biological difference. For most men and women, claimed psychologist Jerome Kagan, “the biological differences are totally irrelevant.” Psychiatrist Donald Lunde agreed: “There is no evidence that men are any more or less qualified by biological sex differences alone to perform the tasks generally reserved for them in today’s societies.”7
When asked whether a truly egalitarian “unisex” society would ever exist, however, the experts were unanimous in saying it was not only unlikely but also undesirable, though their reasons fell into two camps. For the first camp, modification of gender norms was impossible because those norms were ultimately connected to physical reality. Anatomy was still destiny (or, as Therese Benedek put it, “biology precedes personality”). According to psychologist Joseph Adelson, efforts to alter cultural norms were misguided and doomed to fail, “as though the will, in pursuit of total human possibility, can amplify itself to overcome the given.”8
Others considered cultural change possible but stopped short of an endorsement. “Perhaps the known biological differences can be totally overcome, and society can approach a state in which a person’s sex is of no consequence for any significant activity except child-bearing,” suggested Jerome Kagan. “But we must ask if such a society will be satisfying to its members.” Psychoanalyst Martin Symonds agreed: “The basic reason why unisex must fail is that in the sexual act itself, the man has to be assertive, if tenderly, and the woman has to be receptive. What gives trouble is when men see assertiveness as aggression and women see receptiveness as submission.” Besides, a family where Mom and Dad were too similar would be “a frictionless environment in which nobody would be able to grow up,” because children need roles to identify with and rebel against.9 Symonds was not alone in this opinion; he was echoing critics of women’s liberation dating back nearly a century, who had warned of a dire future of manly women and effeminate men. Two years earlier an opinion piece by Barbara Wyden in the St. Petersburg [Florida] Times had suggested that unisex parents (shorthaired working mom in pants; longhaired, bead-wearing dad doing housework) were a sign that the family was in trouble.10
As Betty Friedan had pointed out, many of these scientists were failing to take into account the powerful influence of their own culture. They were like the proverbial fish unable to comprehend the water in which they swam. I would take the metaphor a step further and suggest that the water was not really the problem: science can help very smart fish understand how they move through water and why they breathe in it but suffocate in air. What the scientists had not taken into consideration was the fishbowl, the container that, like culture, determines the size and shape of their environment.
Reformers, advocates, and activists working to expand civil rights were essentially trying to change the dimensions of the fishbowl. The Declaration of Independence and U.S. Constitution offer definitions of human rights that initially promised more than they delivered to many people living within our borders. The civil rights movements in our history have been efforts to include people who had been excluded from the promise of “life, liberty, and the pursuit of happiness” offered in 1776 and the guarantee of “equal protection under the law” added in 1868. This may seem heady, serious stuff for a book on fashion, but it was the civil rights movements that made clothing and hair into national, contentious issues. Much of the controversy centered on issues of gender expression and gender equality, which raised different questions for women and men and for adults and children.
Many of the initial questions were seemingly trivial: Why can’t girls wear slacks to school? Why must men always wear ties, which seem to serve no practical purpose? Why do so many dresses button or zip up the back? Why can’t a boy wear his hair long just like the Beatles? Why do I have to wear white gloves and a hat just to go shopping downtown? Why is it cute to be a tomboy but not a sissy? If these sound like children’s questions, maybe it’s because at first they were. I remember puzzling over these and many other rules when I was growing up. The answers were even more confusing—and annoying! “That’s just the way it is.” “Because I said so.” The cultural authority of grown-ups, which we had accepted as small children, lost its credibility as we reached our teen years. In the 1960s the baby boom generation started to question more and push back harder, along with some allies in older generations. They were aided and abetted by a consumer culture that was more interested in their buying power than in cultural and political change.
Along with the push for progress came resistance. For some the changes were dangerous and threatening; for others perhaps they just came too fast. Evolutionary biologists have a useful concept called “punctuated equilibrium,” which can be applied to cultural change as well. Instead of Darwin’s model of smooth, steady evolution, punctuated equilibrium posits long periods of stasis punctuated by events of sudden change, after which the new ecological system settles in. These eras of little change may be a period of adjustment or a time when populations are migrating to a more hospitable environment. A biologist friend explained it this way: the internal or external events (mutations, crises) that result in significant change can be stressful. Like a rubber band that is stretched too far, a species can either snap (extinction) or retreat to something like its original size and shape, just slightly altered. She made this suggestion when I was struggling to explain the apparently sudden change in the U.S. cultural climate after 1972, which includes the gradual decline in enthusiasm for the Equal Rights Amendment as well as a revival of “classic” elements and styles, from Diane von Furstenberg’s dresses to the power suit and Preppymania. Perhaps, she said, the changes of the 1960s had been “too much, too soon” for enough people that we had relaxed into a period of stasis.11 That will be one of the ideas worth testing as we survey the evolution of fashion between 1960 and 1980.
The most obvious fashion-related flashpoints in the gender revolution are pants for women; long hair and colorful, flamboyant dress for men; and unisex for just about everybody. But there is both more and less than meets the eye in each of these trends. Women and girls had been wearing pants in some form for some time prior to the 1960s. Rompers and overalls for little girls were unexceptional, as were slacks, capris, and even shorts and jeans for women, at the right time and place. These were casual or leisure styles; in a culture where dressing up still mattered, even the nicest slacks were unacceptable for work or school. Backyard cookouts, yes; shopping or church, no. When my family moved to a small New England town in 1961, my Midwestern mother was scandalized by the housewives in slacks she saw at the grocery store. (Hollywood stars like Katharine Hepburn and Marlene Dietrich were exempt from these rules, because they were not like the rest of us.) The objection to women in trousers was based on gender but not on the rules of the Victorian era (pants are masculine; skirts are feminine). It was part of the particular construction of being “ladylike,” the image that allowed women special status and protection, layered onto rules about informal as opposed to formal behaviors. One school dress code specifically banned “play clothes” for girls, listing slacks, jeans, shorts, and pedal pushers as examples. When the formality of the early 1960s began to relax, trousers were often permitted if they were part of a pantsuit and if they had a side or back zipper and the jacket or top covered the wearer’s rear end. The decision to permit pants if they were part of a suit is related to the shifting boundary between formal and informal dress, but the other limitations are about the erotic associations of pants, especially tight-fitting trousers that draw attention to the hips and buttocks, or even—with a center front fly—the crotch.
The exuberant splendor of menswear, especially between 1967 and 1973, and the battles over long hair, mustaches, and beards reveal the very different gender rules as they applied to men. The public response to each of these trends also gives us an indication of the differences between the cultural expectations of men and women. Femininity and masculinity are not simple opposites; they are more like two sports with a few commonalities but with totally different sets of rules. Consider, for example, figure skating and ice hockey. Athletes in both must be able to skate, but they wear different costumes, use specialized equipment, and are judged by vastly different standards. Is figure skating the “opposite” of ice hockey?
The rules of femininity value different attributes and behaviors than do the rules of masculinity, and they are not always opposites. Even though both men and women were governed by the formal/informal standards of dress and grooming, the demands of gender resulted in noticeably different effects. A man in formal dress was dressed identically to every other man in the room and completely covered from neck to toe. A formal gown for a woman was revealing (more or less, depending on her age and marital status) and, as popular humor reminds us, ideally unique. For another woman to have the same dress was cause for mortification. Women and men could both wear shorts and bathing suits, but only women needed to shave their legs and underarms when they did so.
The rules for young women dictated careful management of an image that oscillated between ladylike and seductive. There was a proper time and place for each, and part of a girl’s education—whether at home, in home economics class, or at cotillion—was learning the boundaries and nuances of feminine performance. The rules had a different meaning for women of color and working-class women, for whom a genteel appearance and quality clothing signified access to respect and privilege. For men there was very little space or place for sexual display, or even individual expression; instead, boys and men were trained to operate within a very limited visual range. Again, these restrictions played out differently for men of color. African American and Latino men’s dress includes a tradition of flamboyance despite oppression or subservient status—or, as Monica Miller has argued, as a response to it.12 Appropriation of minority or subcultural masculine style by straight white men was an important feature of the 1960s and 1970s, as if dressing like the supermasculine movie hero John Shaft would inoculate them from accusations of effeminacy.
When young men and women began to break out of their respective limits, the disparity in the public response was remarkable. Girls exchanged “ladylike” rules and trappings for sexier, more revealing clothing to much fanfare and little protest. Miniskirts rose inch by inch, and school dress codes followed them, resignedly. Only a handful of legal cases involving dress codes dealt with too-short skirts or girls wearing pants, compared with the dozens of boys who went to court to argue for their right to wear their hair as long as they wished. (This issue is discussed in much greater detail in chapter 5, “Litigating the Revolution.”)
The most obvious manifestation of the gender revolution is unisex. The term “unisex,” referring to styles intentionally designed to blur or cross gender lines, dates to the mid-1960s. The trend peaked during the 1970s and affected men, women, boys, and girls, all in different ways. On one level, for many people it was a fad, an amusing flash in the pan. For others it was a movement generated by serious, existential questions about the very nature of sex and gender, what constituted appropriate social roles for men and women, and how to raise children. Unisex includes many different ways of challenging gender rules. Some styles are best described as “androgynous,” combining elements of masculine and feminine styling (a longhaired girl in a miniskirt, button-down shirt, and tie). The opposite approach to androgynous design is a neutral style, devoid of masculine or feminine elements (a turtleneck sweater, a jogging suit). The third approach to unisex dressing is best termed “cross-dressing,” although I use that term in a broader sense than its popular meaning. The rules of masculinity rarely permit cross-dressing, and even in that defiant time doing so was limited to details, not entire outfits. Women, on the other hand, could not only wear “man-tailored” clothing and “boyfriend sweaters” but could also wear actual men’s clothing, as was the case with young women who bought their jeans in the boys’ department.
When teenagers and adults wore unisex clothing, the resulting confusion might be the desired effect, a poke in the establishment’s eye. Complaining “you can’t tell the boys from the girls these days” was a sure way to mark yourself as an old fogey. But to younger children, gender mix-ups could be irritating or embarrassing. Baby boomers and Generation Xers tend to have very different memories of the unisex era, which is significant because it is the younger generation that has helped drive fashion change from the late 1970s on.
To someone in the fashion industry, unisex was a fad that came and went in a single year: 1968. For that brief moment the fashion press hailed gender blending as the wave of the future, and department stores created special sections for unisex fashions. Most of these boutiques had closed by 1969. However, in the more mainstream realm of Sears, Roebuck catalogs and major sewing pattern lines, “his ’n’ hers” clothing—mostly casual shirts, sweaters, and outerwear—persisted through the late 1970s. The difference between avant-garde unisex and the later version is the distinction between boundary-defying designs, often modeled by androgynous-looking models, and a less threatening variation, worn by attractive heterosexual couples.
The work and thought of designer Rudi Gernreich, the visionary master of unisex fashion and its most famous proponent, shows how complex the trend actually was. Born in Austria, Gernreich and his mother had fled to the United States when he was a teen in order to escape Nazi persecution of the Jews. A leading American designer, he was also an early gay rights activist, providing financial support to the Mattachine Society through cofounder Harry Hay, who was his lover in the early 1950s.13 Most of his work for homosexual rights was unknown to the wider public until after his death in 1985. As a very public figure, who had once been arrested and prosecuted in a police entrapment operation, Gernreich understood that his political and personal life could threaten his livelihood. Nevertheless, it is clear that sexuality and gender identity played important roles in his fashion vision. He thought naked bodies—female and male alike—were beautiful, an attitude variously attributed to his reaction against prudish Nazi edicts against nudity, his training as a dancer, and a night job washing bodies in a hospital morgue. Designs such as the topless bathing suit and the soft, transparent “no-bra bra” were intended to emancipate women’s bodies from the artifice of boning, uplift, and elastic. But he wanted to give equal attention to men, asking in 1969, “Why should the male not also be a sexual object?”14 He also designed futuristic styles such as the costumes for the TV series Space: 1999: jumpsuits, turtlenecks, and tunics.
These seemingly opposite creations represented two important strands in the gender revolution: a focus on “natural” bodies (not the girdled, air-brushed versions of the 1950s) and a futuristic vision of an egalitarian world. Ironically, the latter minimized sex differences, with fabrics and silhouettes that erased curves and even facial features, while Gernreich’s other designs left nothing to the imagination, exposing breasts and even pubic hair. These fashions perfectly encapsulate the central conflict in fashions from this era, between displaying and celebrating the human body and minimizing or even erasing differences that result in gender inequality. Thus we have contradictions such as his 1974 thong bathing suit, “a unisex garment which nonetheless enhanced the difference between the sexes.”15
Liberation is a fine thing, but it does not come with instructions. Baby boomers who came of age in the 1960s wanted freedom to “do their own thing,” which meant many different things to different people. For racial and sexual minorities, the goals were probably clearer than they were for straight white folks. We already had the right to vote, the right to public displays of affection and marriage, and the all-important right to pursue happiness, unhassled and unmolested—as long as we followed the rules. But many of the rules chafed; they didn’t make sense to us as they clearly did to most of our parents and grandparents.
It is striking to me how many of the sympathetic commentators on the youth movement seemed to think we knew what we were doing. Prominent liberal intellectual Alfred Kazin practically proposed American young people for sainthood, calling us “the visible conscience of society,” leading the nation toward an egalitarian and unmaterialistic future.16 According to socialite/designer Gloria Vanderbilt, young women had “more choice of what they want to do and be.”17 But having more choice didn’t automatically make the choices any clearer; we have been fretting and arguing and judging over those choices ever since. Young men, freed from our culture’s “sick preoccupation with virility,” could instead “dress in terms of how they feel about themselves,”18 but that assumed they understood those feelings and were prepared for negative as well as positive reactions. The fashions of the 1960s and 1970s articulated many questions about sex and gender but in the end provided no final answers.
One reason for this lack of resolution was that the experts from all the various “-ologies”—sexology, psychology, theology, and the rest—were still experimenting, still theorizing, and still arguing. In his history of sex research, Vern Bullough identifies two major strands in the field during the 1970s.19 The first is the interactionist model of gender identification, which credited genetic, physiological, and social forces with the creation and maintenance of our feminine or masculine selves. Interactionism suggested that these influences have different impacts at different stages of human development. Within interactionism there was a tug-of-war among the various disciplines to determine which academic specialty was best equipped to unlock the formula. There were also still disagreements over which was more important, nature or nurture. John Money, of course, argued that behavioral therapy and hormones could override physiology; Milton Diamond engaged in a long-running public feud with Money over these claims and eventually succeeded in exposing the flaws in Money’s research.
The second area of controversy was even more fundamental. In order to study a behavior, psychologists must develop tests and instruments to measure it. The existing measures were based on a bipolar model of gender, with femininity and masculinity as opposites. “Normal” men and women would have scores at the appropriate end of the spectrum. People who scored too high on the opposite scale or who ended up in the middle were believed to have problems ranging from gender identity disorder to homosexuality. Alternative models began to appear in the 1970s, most importantly those of psychologists Anne Constantinople and Sandra Bem, who rejected the linear model of femininity and masculinity as polar opposites.20 Bem developed the BSRI (Bem Sex Role Inventory), which measured masculinity and femininity on independent axes. An individual’s score would place that person in one of four quadrants: low masculine/high feminine (“feminine”), high masculine/low feminine (“masculine”), low masculine/low feminine (“undifferentiated”), and high masculine/high feminine (“androgynous”). Bem argued not only that this instrument was a more reliable measure of femininity and masculinity but also that research using the BSRI supported her theory that androgynous individuals were psychologically healthier than those in the other categories.
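For readers who want the quadrant logic made concrete, the BSRI’s two-axis scheme can be sketched as a simple classification rule. This is only an illustration: the 1-to-7 scale and the cutoff values below are hypothetical placeholders, not Bem’s published scoring norms (which used sample medians).

```python
# Illustrative sketch of the BSRI's four-quadrant classification.
# The 1-to-7 scale and the cutoffs below are hypothetical placeholders,
# not Bem's published norms; her procedure used sample medians.

def bsri_category(masculinity, femininity, masc_cutoff=4.9, fem_cutoff=4.9):
    """Place a respondent in one of the four BSRI quadrants.

    Masculinity and femininity are scored on independent axes;
    a score above the cutoff counts as "high" on that axis.
    """
    high_masc = masculinity > masc_cutoff
    high_fem = femininity > fem_cutoff
    if high_masc and high_fem:
        return "androgynous"
    if high_masc:
        return "masculine"
    if high_fem:
        return "feminine"
    return "undifferentiated"

# One example from each quadrant:
print(bsri_category(6.0, 6.0))  # androgynous
print(bsri_category(6.0, 3.0))  # masculine
print(bsri_category(3.0, 6.0))  # feminine
print(bsri_category(3.0, 3.0))  # undifferentiated
```

The point of the sketch is structural: because the two axes are independent, a high score on one says nothing about the other, which is exactly what the older bipolar model could not represent.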
Against this backdrop the unending disagreements and divisions over everything having anything to do with sex and gender seem inevitable. Even where there is agreement in the scholarly community—the declassification of homosexuality as a mental disorder by the American Psychiatric Association in 1973, for example—common knowledge and public opinion have lagged far behind. This is the setting for our next four chapters: the foggy, uneven landscape of gendered and unisex fashion at the dawn of the culture wars.