14

MEET THE NEW BOSS

JAKARTA: A PERFECT LITTLE COUP

On a lonely hill in the Krawang region of Java, in an area surrounded by rice farms, one finds an irregular hump of ground overgrown with wild mustard. Just beneath the surface lie the bodies of perhaps twelve victims—a tiny fraction of what the CIA called “one of the worst mass murders of the twentieth century.” Perhaps 500,000 were killed, or maybe a million—no one bothered to count. Relatives would like to give those twelve a proper grave, but they are still too frightened.1

The trouble began on September 30, 1965, when six of Indonesia’s senior generals, including the army commander General Ahmad Yani, were murdered and thrown down a well. A seventh, Major General A. H. Nasution, somehow survived the plot, but his daughter and an aide were murdered. The fact that the seven generals were all outspoken anti-Communists led immediately to assumptions that the murders were the work of the Indonesian Communist Party (Partai Komunis Indonesia, or PKI).

Allegations against the PKI have survived to this day, while the real perpetrators have escaped blame. In fact, the six dead generals, while definitely rightists, were all members of the Yani faction, a group loyal to President Achmed Sukarno. The most likely scenario is that a separate group of right-wingers who wanted to get rid of Sukarno decided that they first had to liquidate his most senior supporters. Their leader was General Suharto, who eventually ousted Sukarno and ruled Indonesia until 1998.

The coup was fostered by the intelligence services of the United States, Great Britain, and Australia, all of whom wanted to oust the troublesome Sukarno. The president had made a nuisance of himself in 1955 when he had hosted the Bandung Conference, from which the Nonaligned Movement was born. Sukarno promoted himself as a leader in the resistance against neocolonialism, behavior which did not endear him to those nations eager to extend Western capitalist interests. He had angered the British by complaining in 1963 that the Malaysian Federation, consisting of Malaya and Singapore, was a “neocolonial plot” designed to weaken the Indonesian economy and promote British commercial interests in the area. His reputation plummeted further when he told the US ambassador, “Go to hell with your aid,” after learning of the strings attached.2

Heru Atmojo, an air force officer imprisoned for fifteen years because of his loyalty to Sukarno, later reflected on the impossible position in which his country found itself in the early 1960s:

The pressure on Indonesia to do what the Americans wanted was intense. Sukarno wanted good relations with them, but he didn’t want their economic system. With America, that is never possible. So he became an enemy. All of us who wanted an independent country, free to make our own mistakes, were made the enemy. They didn’t call it globalization then; but it was the same thing. If you accepted it, you were America’s friend. If you chose another way, you were given warnings. And if you didn’t comply, hell was visited on you.

The West might have been able to stomach Sukarno’s neutralism, but his growing friendliness toward the Communists was unacceptable. Sukarno argued that this tolerance was motivated by self-preservation, since the PKI, with three million members, was the largest Communist party outside China and the USSR. That argument did not, however, convince his Western critics, who considered the Communist problem to be of his own making. Nor did it matter that the PKI’s popularity within Indonesia arose not because of its status as a revolutionary party, but because it was widely seen as a trusted friend of the poor. Nuance did not move the hard-bitten men at the US State Department or the British Foreign Office.3

As early as 1953, a US National Security Council resolution recognized the need for “appropriate action, in collaboration with other friendly countries, to prevent permanent Communist control in Indonesia.” Millions were spent arming and training insurgent groups willing to challenge Sukarno. According to a CIA memorandum, by 1962 Harold Macmillan and John F. Kennedy had agreed on the need to “liquidate President Sukarno, depending on the situation and available opportunities.”4

Kennedy’s wish was father to Lyndon Johnson’s action. While JFK’s policies in Indonesia were ad hoc and opportunistic, those of his successor were programmed and purposeful. America’s actions in Indonesia, however, contrasted sharply with its behavior in Vietnam, probably for the simple reason that the Johnson administration could not afford two wars in East Asia. While some advisers urged an assertive approach, the administration instead settled upon a “low posture,” or what William Bundy likened to playing an “eight-high hand.” The administration understood that active US intervention might encourage nationalist reaction in Indonesia and prove counterproductive to American interests. Johnson’s advisers concluded that the situation could be resolved by a showdown between the PKI and forces on the right, so the sensible policy was to maintain a low profile and let nature take its course. As the ambassador to Indonesia, Ellsworth Bunker, advised, “US visibility should be reduced so that those opposed to the Communists and extremists may be free to handle a confrontation, which they believe will come, without the incubus of being attacked as defenders of the neocolonialists and imperialists.” The United States simply made sure that, when the confrontation came, the right held all the cards.5

To this end, unbridled American speculation contributed to a collapse in the value of Indonesian currency and a general destabilization of the economy. The price of rice quadrupled in the four months preceding the coup. At the same time, all official aid ceased, while unofficial funding was channeled to anti-Sukarno elements. Military support included 200 Aero-Commanders, light aircraft ideally suited to counterinsurgency. Senior Indonesian air force personnel were trained in the use of the planes at bases in the United States. Meanwhile, the CIA routed hefty payoffs to soldiers known to be sympathetic to Suharto. In other words, the United States had decided which side to back long before the coup was actually launched.

Suharto used the September 30 murders as an excuse to move against the PKI. The men who might have stopped him had, of course, already been dropped down a well. Since Sukarno had no loyal forces capable of countering Suharto’s army, the latter had free rein. Suharto’s campaign was not just an effort to defeat the Communists but a comprehensive program of liquidation. Lending help were his CIA friends, who kept the Indonesian people well fed with black propaganda. “Media fabrications played a key role in stirring up popular resentment against the PKI,” Ralph McGehee, a CIA senior operations officer, later admitted. “Photographs of the bodies of the dead generals—badly decomposed—were featured in all the newspapers and on television. Stories accompanying the pictures falsely claimed that the generals had been castrated and their eyes gouged out by Communist women. This cynically manufactured campaign was designed to . . . set the stage for a massacre.”6

The efficiency and brutality of Suharto’s purge defies description. In less than a year, between 500,000 and one million Indonesians were murdered. The comprehensiveness of the slaughter is testimony to how well American and British intelligence agencies had supplied and trained dissident groups opposed to Sukarno. The United States, by its own admission, had focused on “religious and cultural organizations, youth groups, veterans, trade unions, peasant organizations, political parties, and groups at regional and local levels.” Granted, the PKI and Sukarno were not entirely innocent victims. On occasion, they responded in kind. The intensity of the crisis meant that, for nearly a year, Indonesia was torn by civil war, with killings sparking reprisals and counterreprisals.7

The mere suspicion of Communist sympathy was sufficient to warrant a death sentence, usually carried out on the spot. Survivors of the slaughter describe rivers “jammed with bodies like logs.” One popular tactic among pro-Suharto vigilante groups was to enter a village, round up the young men, brutally murder them, and leave a row of severed penises behind as a reminder to the rest. One survivor recalls seeing the headmaster of his school dragged into the playground and beaten to death in front of a crowd of terrified children. “He was a wonderful man: gentle and kind. . . . I can hear his screams now, but for a long time, years in fact, all I could remember was running from the classroom, and running and running through the streets, not stopping. When they found me that evening, I was dumbstruck. For a whole year I couldn’t speak.”8

On December 1, 1965, a delegation from Suharto’s headquarters visited the US embassy and told the new ambassador, Marshall Green, that “the right horse was now winning and the US should bet heavily on it.” The Americans did precisely that. They kept close tabs on the progress of Suharto’s operation, the aim being to “shape developments to our advantage.” This meant supplying Suharto with a state-of-the-art field communications system in order to speed the liquidation of Communists. One tactic was to encourage Islamic vigilante groups to attack the PKI, so that subsequent atrocities could be blamed on sectarianism, allowing the army to distance itself from the bloodshed. At the height of the purge, Green, known within the State Department as the “coupmaster,” assured Suharto: “The US is generally sympathetic with and admiring of what the army is doing.”9

The CIA, having studied Communist activity in Indonesia, provided Suharto with a hit list of some 5,000 prominent names. These were crossed off as they were liquidated. “We were getting a good account in Jakarta of who was being picked up,” Joseph Lazarsky, deputy CIA station chief in Jakarta, recalled. “The army had a ‘shooting list’ of about 4,000 or 5,000 people. . . . The infrastructure [of the PKI] was zapped almost immediately. We knew what they were doing. . . . Suharto and his advisers said, if you keep them alive you have to feed them.” Howard Federspiel, the Indonesia expert at the State Department’s Bureau of Intelligence and Research, reflected: “No one cared, as long as they were Communists, that they were being butchered. No one was getting very worked up about it.”10

News of the massacre was carefully controlled by Western intelligence agencies. Journalists, prevented from entering the country, relied unquestioningly on official word emanating from embassies. The British Foreign Office’s Information Research Department, headed by the professional deceiver Norman Reddaway, made sure the media were given the kind of story that would suit the furtherance of Britain’s interests. Not long after the shooting started, the British embassy in Jakarta advised intelligence headquarters in Singapore on how the news should be presented: “Suitable propaganda themes might be: PKI brutality in murdering Generals and Nasution’s daughter . . . PKI subverting Indonesia as agents of foreign Communists. . . . But treatment will need to be subtle, e.g. (a) all activities should be strictly unattributable, (b) British participation or cooperation should be carefully concealed.” “It was a triumph for Western propaganda,” Roland Challis, the BBC’s reporter in the area, later reflected. While journalists were undoubtedly deceived, it has to be said that they colluded in their own deception. Too often, the official story was relayed without analysis. In a confidential note, Reddaway praised a Fleet Street journalist who had promised to give the government’s “angle on events”—“i.e., that this was a kid glove coup without butchery.” While journalists sipped champagne in the British Embassy, bodies were washing up on the riverfront outside.11

On March 11, 1966, Suharto felt strong enough to push Sukarno aside. Sukarno spent the next four years in virtual house arrest, and died in 1970. British and American papers welcomed the transfer of power. “The West’s best news for years in Asia,” remarked Time, praising a “scrupulously constitutional” new regime supposedly “based on law, not on mere power.” Echoing the praise, US News and World Report saw reasons for “hope . . . where there was once none.” The highly respected journalist James Reston called it “a gleam of light in Asia,” while skipping over the butchery. Leading the cheers was Rupert Murdoch and his paper the Australian. As recently as 1998, its editor was still insisting that Suharto’s cruelties were a figment of the left’s imagination, while eyewitness accounts of atrocities were dismissed as Communist propaganda.12

On August 8, 1967, Walt Rostow, the new national security adviser, told President Johnson that “Suharto . . . is making a hard try at making something of Indonesia which could be very good for us and the world.” Johnson concurred: “Here is a country which has rejected communism and is pulling itself up by its bootstraps. . . . We can make it a showcase for all the world.” In Britain, foreign secretary Michael Stewart praised the new dictator’s “laudable economic policies” and rejected any suggestion that his was an aggressive regime. In November 1967 Time-Life sponsored a conference on the future of Indonesia. The prime mover was the Ford Foundation, working in cahoots with the CIA and a gaggle of sympathetic economists from America’s most prestigious universities. James Linen, president of Time, Inc., opened the conference by outlining its aims: “We are trying to create a new climate in which private enterprise and developing countries work together . . . for the greater profit of the free world. This world of international enterprise is more than government. . . . It is the seamless web of enterprise, which has been shaping the global environment at revolutionary speed.” In truth, the conference was little more than a deliberate effort by Western interests to feast on the choicest dishes from the Indonesian rijsttafel. A glittering array of corporations attended, including General Motors, Siemens, ICI, British Leyland, British-American Tobacco, American Express, Goodyear, and US Steel. They negotiated with Suharto’s economic team to dole out concessions. Afterward, LBJ congratulated Linen on “a magnificent story of opportunity seen and promise awakened.” The Copley Corporation was equally effusive: “It is [in Indonesia] that the deep-rooted American concepts of free enterprise and Yankee ingenuity are finding new forms of expression. Moreover, the profit potential fairly staggers the imagination.”13

Everything had worked out brilliantly. William Bundy later remarked that the secret to success lay “in the cooking, not in the recipe.” Indonesia was now ruled by a pro-Western dictator, its economy had been made safe for capitalist exploitation, and at least a half million Communists were dead. In stark contrast to Vietnam, the Communists had been defeated without American soldiers dying. The operation had proceeded so smoothly that the CIA was able to construct a credible alibi. The official line, widely believed, was that of the head of the agency’s Far Eastern Division, William Colby, who later claimed that “Indonesia exploded, with a bid for power by the largest Communist Party in the world outside the Curtain, which killed the leadership of the army with Sukarno’s tacit approval and then was decimated in reprisal. [The] CIA . . . did not have any role in the course of events themselves.” In contrast, Ralph McGehee admitted privately that “the Agency was extremely proud of its success . . . and recommended it as a model for future operations.” Also impressed was Richard Nixon, who, according to Marshall Green, “was very interested in that whole experience as pointing to the way we should handle our relationships on a wider basis in Southeast Asia generally, and maybe in the world.” McGehee admitted that Indonesia was a “model” for later operations, like the one that ousted Salvador Allende in Chile. “You can trace back all the major, bloody events run from Washington to the way Suharto came to power. The success of that meant that it would be repeated, again and again.”14

Indonesia also became a model for globalization, though that term was hardly used in the 1960s. Suharto made himself a billionaire by selling his people to General Motors, Goodyear, Siemens, and ICI, and later to Gap, Nike, and Starbucks. “We are the people, the nation, that the world forgot,” lamented Adon Sutrisna, a political prisoner during the Suharto years. “If you know the truth about what happened in Indonesia, you can understand clearly where the world is being led today.”15

HOLLYWOOD: TAKIN’ CARE OF BUSINESS

On April 10, 1957, at the age of sixteen, Ricky Nelson performed the Fats Domino song “I’m Walkin’” on national television during an episode of Ozzie and Harriet. He sounded awful, but that was hardly his fault, since he was not a singer. Theoretically, that did not matter, since Ozzie was a situation comedy, not a musical variety show. Television, however, has a way of making people famous even when they hardly deserve it. As a result of singing one song badly on national TV, Ricky was suddenly a star. In 1958, he was the top-selling pop artist in the country.

Hollywood invented Ricky Nelson. His success reveals how show business had taken over rock music, turning it into an industry. Ricky was the perfect commodity: an artist who did not upset milquetoast America. He was Elvis without grease, without swinging hips. His nice, sanitized West Coast accent was perfectly middle class—devoid of ethnicity. Though time would eventually reveal that he actually had talent, this was never the reason for his popularity. He sold records because he was marketable; industry executives understood the profit potential of wholesome rock.

One of Ricky’s fans was the young Bob Dylan. “He was different than the rest of the teen idols,” Dylan later reflected. “Ricky had a smooth touch, the way he crooned in fast rhythm, the tonation of his voice. . . . He didn’t sing desperately, do a lot of damage, and you’d never mistake him for a shaman. . . . He sang his songs calm and steady like he was in the middle of a storm, men hurling past him. His voice was sort of mysterious and made you fall into a certain mood.” Dylan and Nelson are seldom juxtaposed when Sixties music is discussed. One was a media creation, the other an artist. Yet both are important to understanding the decade and its music. Nelson is symptomatic of the industry’s ability to create marketable stars with planned obsolescence—rather like washing machines. After a brief run in the charts, they are quickly forgotten, and immediately replaced by equally anodyne assembly-line singers designed solely to make money. Dylan, on the other hand, provided the soundtrack for a generation—his songs evoke the events of those days and will never be forgotten. If importance is measured by meaningfulness, then Dylan is deservedly remembered. But if sales are a measure of significance, then Nelson also deserves attention. He was a harbinger of a brand-new force: a billion-dollar industry which turned music into a commodity.16

When Dylan was starting out, in the early Sixties, all the folk labels he approached rejected him on the grounds that his music would never sell. “What I was playing at the time were hard-lipped folk songs with fire and brimstone servings, and you didn’t need to take polls to know that they didn’t match up with anything on the radio, didn’t lend themselves to commercialism.” He was, however, lucky enough to find a producer with an ear for meaningful music. At Columbia Records, he stumbled upon John Hammond, “a music man through and through.” Dylan was charmed. “He talked the same language as me. . . . Money didn’t make much of an impression on him. . . . [He] didn’t give a damn about record trends or musical currents.” That might be true, but Hammond was also astute enough to hear the tunes behind Dylan’s sandpaper voice. He knew that even if Dylan didn’t sell, his songs would. To date, “Blowin’ in the Wind” has been covered around 375 times, each time with a nice check to the copyright holder.17

Folk provided Dylan shelter from the storm—the rapid and bewildering developments in music. “Practically speaking, the ’50s culture was like a judge in his last days on the bench,” he recalled. “It was about to go. . . . With folk songs embedded in my mind like a religion, it wouldn’t matter. Folk songs transcended the immediate culture.” This transcendence gave Dylan’s songs an importance far beyond their position in the charts. He did not have to produce number-one hits to be important.18

That transcendence can also confuse. Politics played in stereo is easily absorbed. Music fuels myth. Play a few bars of “Blowin’ in the Wind” and suddenly the most apathetic listeners imagine that they, too, manned the barricades in 1968. The “meaningful” music of the Sixties smothers rational assessment of that confusing decade. It is easy to forget that while Dylan, Joan Baez, and Phil Ochs alerted listeners to injustice, militarism, and hypocrisy, the Ohio Express made a great deal of money singing “Yummy, yummy, yummy, I got love in my tummy.” The political songs of the Sixties may be widely remembered, but they were seldom among the “greatest hits.” In the United States, the top ten artists over the course of the decade, measured in sales of singles, were: the Beatles, Elvis, the Supremes, Brenda Lee, the Beach Boys, the Four Seasons, the Temptations, Connie Francis, Ray Charles, and Marvin Gaye. The biggest-selling single of 1969, “Sugar, Sugar,” was sung by a band called the Archies, which did not actually exist. A group of studio musicians was pulled together and given some songs to sing; sound engineers then enhanced the tapes. Perhaps appropriately, the lead vocalist, Ron Dante, later made a fortune recording “You Deserve a Break Today” for McDonald’s.19

The counterculture thought it could use music to defeat the conservative mentality of the Fifties, typified by saccharine love songs. A battle raged for the soul of the consumer, or so it seemed to those in the trenches. In fact, baby boomers could not be bothered; they simply wanted good tunes. As one perceptive student observed, “You don’t have to be radical to love rock music. . . . Most of the record-buying kids of today are about as aware as eggplants.” When the Beatles occasionally sang about politics, their songs still sold—for the simple reason that the music sounded nice, and because they were the Beatles. The one constant of this period was that young people had money to spend on music. They bought for enjoyment, not because they sought political enlightenment. “I don’t find [Vietnam] a thing to sing songs about,” Mick Jagger once confessed. “It’s music for us and it’s supposed to be fun. We want you to get up and dance, not sit back and be worried about what you’re supposed to do.” Even those who sang about politics often had paper-thin commitment. The Byrds seemed politically aware, but Roger McGuinn, their lead singer, confessed: “I was political in that I didn’t like inhumanity to man. But I really didn’t get involved. I didn’t vote either. I didn’t do anything. I was tied up in my own little world. . . . And staying stoned a lot. That was my life. I mean, we were hedonists.”20

Fun sold better than politics. The great irony of the 1960s was that radicals—New Left, student protesters, hippies—thought they were conducting a social revolution. In fact, the most profound revolution that occurred was the emergence of a consumer society. Ordinary people spending money were the shock troops of an economic revolution. “The rock revolution failed because it was corrupted,” Germaine Greer argued in 1969. “It was incorporated in the capitalist system, which has power to absorb and exploit all tendencies, including the tendencies towards its own overthrow.” While the young attacked the materialism of their parents, they spent billions on conspicuous consumption. “We’ve bought the values of our parents’ generation more than we thought we would,” Steve McConnell admitted. Baby boomers had their cake while gorging themselves on Hostess Twinkies.21

The greatest success of the music business during this period lay in its ability to take a packaged product and sell it as something new. The industry created an illusion of daring, but in truth few risks were taken. Rock and roll, according to Jeff Nuttall, one-time happening artist, “is the most unchanging, conservative popular music that there has ever been, and continues to be so, under the banner of perpetual revolution.” He was briefly under the impression that the counterculture could determine its own tastes, but eventually discovered that clever music industry executives were pulling strings like puppeteers. “I thought we had to invade the media, but what we misjudged was the power and complexity of the media. The media dismantled the whole thing. It bought it up.”22

In the 1960s the profit motive allowed only a tiny bit of room for songs challenging the political status quo. Granted, a lot of Sixties music was socially rebellious—it flouted conventions, especially those pertaining to sex and drugs. It provided the theme music for a generation bent on defying parental authority. But it did so because that’s what young consumers wanted. It wasn’t shaping trends; it was mirroring them. In any case, the charts frequently found room for blatantly counterrevolutionary songs like Guy Drake’s “Welfare Cadillac” and Merle Haggard’s “Okie from Muskogee.”

As for politics, before 1968 only one protest song—Barry McGuire’s “Eve of Destruction”—was sufficiently successful to be termed a hit in the United States. At that time, Vietnam and student unrest were minority issues, and music was produced for a mass market. Since fewer than 29 percent of those younger than twenty-nine opposed the war in 1966, a large market for antiwar music did not exist. In that year, Sergeant Barry Sadler made a small fortune with his rousingly patriotic “Ballad of the Green Berets,” which ended up the most popular war-related song ever produced, selling seven million copies. Country artists produced more prowar songs, with greater success, than rock artists produced antiwar songs. No issue was too controversial for patriotic songwriters, as demonstrated by the success of “Battle Hymn of William Calley,” which celebrated the massacre at My Lai and instantly became a gold record.

After the Tet Offensive, support for the war dipped below 50 percent for the first time. The shift in public opinion was reflected in the success of John Lennon’s “Give Peace a Chance,” which rose to number 11 on the American charts. From 1969 to 1974, the industry produced thirty-four identifiably antiwar songs, of which twenty-six made it onto the charts. Even Motown, usually preoccupied with love and romance, got into the act, most notably with Edwin Starr’s “War” and Marvin Gaye’s “What’s Going On?” “The music business is a whore,” one BMI executive admitted. “It will make and market anything that it thinks will sell.” Protest songs had become commodities manufactured for profit. Reacting to the sudden popularity of his “I-Feel-Like-I’m-Fixin’-to-Die Rag,” Country Joe McDonald admitted that he felt like he was “selling peace . . . for $3.98.”23

Protest songs did not inspire opposition to the war; they simply reflected that opposition. In any case, the protest message was often lost on listeners. The lyrics of McGuire’s “Eve of Destruction” should leave little room for confusion, since they cover racism, militarism, hatred, and greed. Yet a poll of American students in 1965 revealed that only 14 percent correctly understood the song. A greater proportion actually misinterpreted it, while by far the majority had no idea what it was about. Around 70 percent confessed to being attracted by the beat rather than the message.24

Record producers may have considered themselves hip, but sideburns and shades could not hide a capitalist mind. “The rock business was full of the same sharks as it always has been full of, except they were wearing kaftans and smoking dope instead of sinking pints of beer and wearing suits,” Mark Williams, the first music editor at IT, discovered. When asked why the Beatles never produced an identifiably anti–Vietnam War song, John Lennon claimed that their manager, Brian Epstein, would not let them. (Paul McCartney gave perhaps a more honest answer when he explained: “We’re not the preaching sort. . . . We leave it to others to deliver messages of that kind.”) No producer wanted to annoy a significant proportion of the record-buying public.25

At the beginning of the rock revolution, Ricky Nelson picked up a guitar and became a pop star overnight, simply because of the power of television. Eight years later, a bunch of ordinary young men were transformed into a pop sensation, again simply because of television. The Monkees were an industry invention, a cynical attempt to mate two media: pop music and the situation comedy. Instead of fab, they were prefab. The aim was to produce a moneymaking show which would emulate, on a weekly basis, the success of the Beatles’ film A Hard Day’s Night. An ad was placed in Variety asking for applications from “folk and rock” musicians who wanted to appear in a TV series. A flood of replies and a long selection process produced two musicians who wanted to act (Mike Nesmith and Peter Tork) and two actors who wanted to sing (Micky Dolenz and Davy Jones). Stephen Stills (later of Buffalo Springfield and Crosby, Stills, Nash, and Young) made the short list, but the producers were put off by his stringy hair and bad teeth. They asked him if he knew anyone who looked just like him, but with better teeth and hair. Stills recommended his friend Tork, until that time a respectable member of the Greenwich Village folk scene.

“The Monkees wasn’t a pop group—it was a TV show,” Dolenz admits. “We weren’t family like the Beach Boys, or childhood friends like the Beatles. The producers picked us and then introduced us on a stage set. . . . We were members of a cast.” They were four separate elements combined like chemicals in a laboratory beaker. The amalgam that emerged was hugely successful. The songs were provided by a stable of talented artists which included Tommy Boyce, Bobby Hart, and Neil Diamond. The experiment underlined just how powerful the industry was, how easily executives could manipulate the medium. In 1967, the Monkees outsold the Beatles and the Rolling Stones combined. The top-selling song that year was “I’m a Believer”—a delightful irony, since they believed in nothing. In perhaps the best example of what Greer wrote about the power of the industry to absorb, one of the most memorable Monkees episodes was “The Frodis Caper,” in which the group saves the world from an alien force bent on controlling earthlings’ minds through television.26

The creation lasted only two seasons and began to unravel when the Monkees started behaving like rock stars. “It was a unique phenomenon, to be a member of a group that wasn’t really a group and yet was a group,” Tork feels. “If we’d been a group, we would have fought to be a group or we would have broken up as a group. But we were a project, a TV show, a record-making machine.” Eventually, they developed pretensions, started arguing with each other and, even more significantly, demanded creative control. That, however, hardly mattered to the producers, who had already moved on to the next big idea.27

The great names of Sixties pop music—Dylan, Lennon, McCartney, Jagger, Joplin, Diana Ross—were loved by the music-buying public, but were nightmares for the industry. True artists had an unfortunate habit of thinking for themselves. They decided what kind of music they wanted to make and had the clout to get their way. They haggled over contracts and sued over copyright. Much easier were the hundreds of poor players who strutted their hour on stage and then were heard no more. The music industry depended upon the fact that bands, like clothes, went out of fashion.

As Ricky Nelson and the Monkees demonstrated, stars need not be born, since they could so easily be made. Creativity was a quality more important in the producer than in the musician. A pop star needed a nice face and a bit of charisma, perhaps some sex appeal. He or she didn’t need to be able to write music, since the tunes could be supplied by people like Gerry Goffin and Carole King who turned out songs like toys from a factory. Most Sixties music has been long forgotten, but that is exactly what the industry intended. When music became an industry, songs became commodities—something to be enjoyed for a short period and then thrown away.

We remember the songs of the Beatles and Dylan because they seem to evoke the mood of an age and because they ooze talent. But if we look at the industry today, and understand the importance of synthesizers, drum machines, sequencers, managers, makeup artists, stylists, publicists, marketers, and the like, we begin to appreciate the implications of music as a business. The Monkees were a band carefully made for television, the creation of a marketing mind. We are plagued by such bands today. Dylan might have written great music, but the Monkees were the real harbinger of where we are now.

LOS ANGELES: A GODDAMNED ELECTABLE PERSON

Shortly after Lyndon Johnson’s landslide victory in the 1964 election, Vernon Christina, linchpin of Barry Goldwater’s electoral team in California, ran into two other campaign staffers, Walter Knott and John Gromala, in a Los Angeles parking garage. “We were talking about, ‘What shall we do now?’” Christina recalled. They all agreed on the need to continue. “So we lost an election, we lost a candidate, but we didn’t think our cause was that bad.”28

Quite by chance, along came Ronald Reagan, who had delivered the best speech of Goldwater’s campaign. “A Time for Choosing,” broadcast on national television, had summarized perfectly every aspect of the Republican platform, but in a manner that did not frighten voters. The central issue in the election, Reagan had argued, was “whether we believe in our capacity for self-government or whether we abandon the American Revolution and confess that a little intellectual elite in a far-distant capital can plan our lives for us better than we can plan them ourselves.”29

That speech was a bright spot in an otherwise dismal campaign. “The idea hit each of us about the same time,” Christina recalled. “Why doesn’t he . . . run . . . for governor? He’s got all the ingredients to make a goddamned electable person. I don’t give a damn how smart you are—if you’re not electable, forget it. You know. You’ve got to have that charisma to get elected.”30

The Republican nomination was supposed to have gone to George Christopher, the urbane mayor of San Francisco and darling of party patricians in the north of the state. He was widely favored to defeat the incumbent Democrat, Edmund G. (Pat) Brown, who, after two terms, had worn out his welcome. Republicans from Southern California, however, had other ideas. They espoused a new conservatism: a brash, populist, chauvinistic sort attractive to blue-collar workers in the new industries of San Diego and Los Angeles. All that was missing was an inspiring personality who could humanize Goldwater conservatism and take it out of its Southern enclave.

The selection of Reagan laid the foundation for a political movement that would dominate American politics for the next forty years. Republicans tapped into a rich conservative vein among white workers of low to moderate income and low educational attainment. This group had once loyally supported the Democrats but had been alienated by Great Society programs, which, to them, meant tax increases and blacks moving in next door. In order to persuade workers to shift allegiance, Republicans had to address their alienation. Television provided the perfect vehicle for doing so. Working-class disaffection was exacerbated by nightly images of inner-city race riots, student unrest, and countercultural excess. Television also provided salvation in the form of Reagan, a candidate with an instinctive gift for communicating with the common man.

Reagan’s candidacy at first provoked ridicule. The San Francisco Chronicle called it “a flagrant example of miscasting” and predicted that, after an initial wave of bemused interest, his campaign would “bottom out.” “If Reagan wins the Republican nomination in June,” another analyst warned, “the Democrats, in all likelihood, will smash the GOP in November—a loss that could set back the Republican Party for many years.” The Brown team shared these preconceptions. “A Republican can’t win in November unless he gets the votes of Democrats,” one staff member predicted, “and they won’t go for Reagan.” Richard Kline, who served on Brown’s staff, later admitted that “we were conventional thinkers: the people want to elect to high office those politicians who are qualified for high office. . . . Ronald Reagan was a movie actor with no public experience, and a right-winger to boot.”31

While Reagan often seemed an embarrassingly simple man, his simplicity lay in his paper-thin ideology and his “aw shucks” image. His presentation, on the other hand, was highly sophisticated, requiring skills for which he was seldom credited—namely, shrewdness, sensitivity, perfect timing, and an extraordinary memory for detail. He was also cleverly marketed by a thoroughly modern campaign team. Critics who claim that he was merely a media creation ignore the fact that in modern politics every candidate is.

The Reagan team was one of the first to recognize and exploit this inevitability. After selecting their candidate, Vernon Christina and his friends took the unusual step of hiring the advertising and public-relations firm Spencer-Roberts to manage the campaign. They turned Reagan’s lack of experience to advantage by playing upon the public’s disenchantment with politics. Thus, when announcing his candidacy on January 4, 1966, Reagan stressed: “I am not a politician in the sense of ever having held public office, but I think I can lay claim to being a ‘citizen politician.’” According to Stuart Spencer, the “citizen-politician” label arose “somewhere out of our long brainstorming sessions. . . . We felt it was a strength to have somebody that had been out of the system. . . . We felt that that’s what people wanted, for a change.” At the same time, Reagan repeatedly emphasized that Brown was a politician, one who had lost touch with the people. He was, Reagan claimed, “aided by his well-oiled and heavily financed machine”—a somewhat ironic charge. The Brown camp had not prepared for a campaign of this sort. “We were surprised,” Richard Kline admitted. “We didn’t understand that the public wanted non-politicians.”32

“This is my campaign,” Reagan insisted. “Goldwater is not running it. I want the voters to listen to what I have to say and how I stand on the issues and then make up their own minds what is the Reagan image.” The word “image” was ironically apropos. Reagan did not change his beliefs; he merely changed his makeup: he did his best to appear different from Goldwater. As the Sacramento Bee reluctantly admitted on the eve of the election, packaging had become all-important: “Ronald Reagan . . . resembles a carefully designed, elaborately ‘customerized’ supermarket package, complete with the glossiest wrapping and the slickest sort of eye appeal.”33

The makeover was successful in part because Reagan’s conservatism was entirely natural—unfettered by dogma. More a populist than a conservative, he was able to tune into voters’ feelings, in a way the inflexible Goldwater never could. Reagan’s warmth and optimism also made his conservatism less frightening. After his primary victory over George Christopher, the San Diego Union concluded: “Ronald Reagan proved that . . . the people have not rejected conservatism, when it is constructive and there is an opportunity to properly present it and make it understandable.”34

Those who disparage Reagan’s B-movie background ignore the fact that modern politics is a form of acting. Reagan knew precisely how to command attention, charm a crowd, and feign deep emotion. His timing was impeccable. “Ron was a man trained . . . in the movie business, where you have a director and a producer and everybody carries their load,” Spencer maintained. “And because of that . . . background, he was a very easy candidate to work with.” He was, however, woefully ignorant on issues. Spencer therefore hired the Behavioral Science Corporation (BASICO), a firm of public-opinion consultants run by Stanley Plog and Kenneth Holden, both professors of psychology. Experts in voter profiling, Plog and Holden sought to discover what really mattered to California voters in the summer of 1966. “These vile-sounding fellows,” Pat Brown complained, “are digging into the minds of people and finding out how to exploit their anxieties.” That was precisely what they did, and what virtually every campaign team has done since.35

The Reagan team quickly discovered that student unrest was an effective anxiety trigger. “We jumped on it,” Spencer admitted. “I think Reagan escalated it into an issue and it started showing up in the polls.” The campaign was supposed to be about big government, welfare, and high taxation, but, as Reagan recalled: “After several weeks of the campaign I had to come back and say, ‘Look, I don’t care if I’m in the mountains, the desert, the biggest cities of the state, the first question is: “What are you going to do about Berkeley?”’ And each time the question itself would get applause.”36

Student unrest brilliantly highlighted the populist themes of Reagan’s campaign: morality, law and order, strong leadership, traditional values, and, it must be said, anti-intellectualism. California higher education had, he argued, failed the heavily burdened taxpayer who financed the system and the parents who entrusted their children to it. By adroit manipulation of this issue, Reagan won comfortably in 1966. Time and again he told the voters that “a small minority of beatniks, radicals, and filthy-speech advocates have brought shame to . . . a great University. . . . This has been allowed to go on in the name of academic freedom. What in heaven’s name does ‘academic freedom’ have to do with rioting, with anarchy, with attempts to destroy the primary purpose of the University, which is to educate our young people?” Reagan’s supporters cared little about the nuances of higher-education policy; what they wanted was a governor who could bring order to the universities, by force if necessary. Brown’s mistake was that he cast himself as the benevolent uncle of the university. Reagan convinced the voters that an authoritarian father was needed.37

At the beginning of the campaign, Reagan tended to overanswer questions, and often got himself into trouble as a result. Under the tutelage of Plog and Holden, he learned the art of the soundbite. Issues were distilled into epigrammatic statements, which were then typed onto small cards and collected into little notebooks. These were then memorized by Reagan, rather like a script. The technique was entirely familiar to those who worked in sales and advertising, but was relatively new to politics. Brown ridiculed Reagan as a man who knew nothing of California problems other than that which could be crammed into his notebooks. Hostile journalists described how his carefully crafted one-liners sounded like “a broom sweeping up broken glass.” “Evasive, photogenic, given to generalities and angelic-sounding platitudes,” one journalist remarked, “Reagan has left his public almost completely in the dark as to how he will accomplish the things he says he will.” What critics failed to understand was that that was precisely his intent.38

Reagan’s soundbites annoyed seasoned veterans of the political circuit, but they delighted ordinary voters, as a correspondent from the New Republic found: “‘One of the great problems of economics,’ he explained, ‘is unemployment.’ The crowd cheered. ‘For every ounce of federal help we get, we surrender an ounce of personal freedom.’ ‘There’s no such thing as left or right any more; it’s up or down.’ Steady cheers right along. . . . We knew that pretty soon he was going to say, ‘I am not a politician’ (he did), and add, ‘Ordinary citizens created this country of ours’ (Flourish, Cheers).” Reagan’s simple conservatism fit perfectly with the superficial coverage television provided. He carried into the campaign some basic beliefs about the relationship between citizen and state. These were culled from a mythical past when politics was supposedly simple. Since the myths in question were close to American hearts, Reagan’s platform had enormous appeal. For example, every American cherished the Bill of Rights—in particular, freedom of speech. But while most people worshiped freedom in the abstract, few understood its implications. Those who revered the image of Patrick Henry asserting his rights in Virginia in 1775 had no time for Jerry Rubin asserting his in Sproul Plaza. Reagan capitalized on this dichotomy: “Freedom of speech stops short of vulgarity and obscenity forced upon those that don’t want to hear it, and certainly freedom of speech, when some Americans are fighting and dying for their country, must stop short of lending comfort and aid to the enemy.” “Mr. Reagan spoke down-to-earth American language, about down-to-earth United States of America constitutional Americanism,” a correspondent to the San Diego Union wrote. “To me, it is the fresh breeze of realism that has been absent from the political scene for thirty-five years.”39

Since registered Democrats in California outnumbered Republicans by approximately 1.3 million, in order to win by even a whisker Reagan had to attract, according to one estimate, 90 percent of the traditional Republicans and 25 percent of the traditional Democrats. Since, in fact, he defeated Brown by over a million votes, he obviously managed to impress an enormous number of Democrats. “We lost votes with the conservative Democrats,” Donald Bradley, a Brown campaign manager, confirmed. “I’d say the backbone of the party, the blue-collar majority worker. That’s who we lost.” According to Caspar Weinberger, then a San Francisco lawyer and former state Republican chairman, Reagan “got something in excess of 40 percent of the union labor, and he did that because he was talking about things they were very interested in. . . . When he talked about low taxes and keeping government small and less intrusive . . . he was striking a very strong chord.”40

In 1970, Reagan easily won a second term. During his eight years in Sacramento, he was one of the most popular governors in the history of the state. He was able to maintain his support not by delivering practical policies which improved the lives of his constituents, but rather by keeping their anger and fear stoked. In this regard, unruly students helped enormously. During his first term, Reagan’s approval rating on higher education was always comfortably over 50 percent—no other aspect received a score as high. His popularity was at its lowest in spring 1968, when campuses were quiet. It surged again in 1969, after prolonged confrontations with militant students at San Francisco State and Berkeley. As governor, Reagan’s reaction to campus unrest was identical to that revealed during his campaign. He stressed the same populist messages, promised the same dynamic action. Reagan, like his supporters, did not really understand the disorder, but he did understand voters’ fears. Supposedly off-the-cuff remarks allowed him to communicate directly with the people by echoing their chauvinism. On one occasion he described a bunch of protesters who “were carrying signs that said ‘Make Love Not War.’ The only trouble was they didn’t look like they were capable of doing either. His hair was cut like Tarzan, and he acted like Jane, and he smelled like Cheetah.” He also promised that he would protect the rights of those “who are legitimately trying to get an education, . . . at the point of a bayonet if necessary.” The conflict with campus militants was, according to him, a war. He referred to “moment[s] of confrontation” and the need “to make a stand.” The universities were presented as a battlefront in the Cold War—in Vietnam and in Berkeley, the dominoes would not be allowed to fall. “This is guerrilla warfare,” he argued, adding: “The only thing that can win in campus guerrilla warfare is . . . you eliminate them by firing the faculty members and expelling the students.”41

Firebrands like Reagan who defied dissidents with ersatz bravado were like the war heroes of a previous generation: they benefited from the public’s desire for strong leadership. As an actor, Reagan was ideally poised to exploit the voters’ taste for heroes: he knew how to look courageous. His warnings, though often paper-thin, sounded ominous. The voters believed that he meant business. Those most impressed were the “middle Americans” who resented the activities of the privileged elites on campus. Once Reagan attracted this group to his side, they never left him. Their unswerving support allowed him to be as conservative as he wished. As one disgruntled moderate California Republican remarked: “For the first time, the Republicans don’t see the rainbow ending in the middle of the spectrum. In the past, Republican conservatives, to survive, had to move toward the middle. But the law-and-order issue and the Reagan phenomenon have created a view that it is dangerous to occupy the middle.”42

Reagan had clearly discovered a potent formula for success. After his first campaign tour, his press secretary, Lyn Nofziger, remarked to Stuart Spencer’s partner William Roberts, “Hey, this guy could be president someday!” Outside the campaign team, observers took only slightly longer to come to the same conclusion. The Daily Oklahoman was sufficiently impressed by his defeat of George Christopher in the primary to consider Reagan a firm contender in the 1968 presidential election. As early as June 16 (in other words, before Reagan had defeated Brown), seventy-four Republican county chairmen across the nation, polled by Gallup, supported Reagan for the Republican presidential nomination. New York Times correspondent David Broder, reporting on Reagan’s visit to Washington in August, remarked on what seemed, “in the eyes of many of those present, the Washington debut of a potential presidential candidate.”43

The most successful political revolution of the 1960s was not conducted by students, nor was it left-wing. It was instead a populist revolution from the right, which had Ronald Reagan as its standard bearer. Some analysts, frightened by the implications of the Reagan phenomenon, insisted that California had taken leave of its senses. They assumed that if the people supported Ronald Reagan, something must be wrong with the people. That refrain would reverberate through subsequent decades as liberals struggled to come to terms with the Reagan revolution. “Perhaps only the capricious California electorate could stage such a political jest,” Emmet John Hughes remarked in Newsweek. Reagan’s victory, he decided, “dramatizes the virtual bankruptcy, politically and intellectually, of a national party. . . . The political point should not be more sharp. Some men learn from history. Some men run from it. And the GOP has chosen feckless flight.” To an extent, Hughes was right. The GOP, with Reagan in front, was running from history. What many failed to realize, however, was that the voters were running in the same direction. Unique among the revolutionaries of the Sixties, Reagan did not need a weatherman to know which way the wind was blowing.44

ST. LOUIS: CURT FLOOD VERSUS BASEBALL

In 1969, the sporting world was stunned by the success of the New York Mets, the team that, prior to that year, had deservedly been dubbed the worst team ever to play major-league baseball. The underdog image was carefully nurtured by coaches and management, while, behind the scenes, the Mets underwent a careful process of development. This suddenly bore fruit in 1969 and provided the team with some of the most gifted ballplayers of their generation. In sports, however, fairy tales make the action on the field much more meaningful, and so fans are only too willing to believe them. The sportswriter George Vecsey wrote:

On October 17, the miracle happened. The Mets had stormed through an unbelievable season to capture the World Series, and the people were piling into the street to celebrate. . . . People were dancing in the streets, holding hands with strangers, reaching back to their childhood for ways to express their joy. . . . It was like the closing minutes of some old Bing Crosby movie, the kind of ending that had become impossible in our world of war and ghettoes. . . . The Mets reminded people of love and hope, sentiments that were not very stylish nor realistic in 1969. If the Mets could win the World Series, people said, anything was possible.

To people around the country, the Mets seemed to embody the best about sports: the idea that a mere game can embody profound moral truths. Most of all, the victory seemed to suggest the importance of the Team—the principle that a theoretically weak group of players can emerge victorious if they believe in one another.45

Forty years later, American sports fans—many of whom had not yet been born in 1969—nostalgically recall the heroics of Tom Seaver and his fellow Mets. Few remember another athlete from that remarkable year, Curt Flood—yet he had a much greater impact on the future of professional sports than any of the Mets. Flood was the star centerfielder for the St. Louis Cardinals, a gifted player who had been an All-Star three times and had won the Gold Glove seven times for his defensive prowess. At the end of the 1969 season, the Cardinals decided to trade him to the Philadelphia Phillies. Since they owned the rights to Flood, they were fully entitled to send him where they wanted. Under the terms of baseball’s “reserve clause”—written into every player’s contract—a player was bound in perpetuity to the club owning his contract.

Philadelphia was America’s “northernmost Southern city,” a hotbed of racism. For that reason, Flood, a black man, had no desire to join the Phillies. His reluctance was not based exclusively on an aversion to the City of Brotherly Love. He also objected to the fact that he had had absolutely no say in the decision. “After twelve years in the Major Leagues,” he wrote in a letter to baseball commissioner Bowie Kuhn, “I do not feel I am a piece of property to be bought and sold irrespective of my wishes. I believe that any system which produces that result violates my basic rights as a citizen.” Aside from professional sportsmen, no other workers, he argued, operated in such a restricted market. To baseball, however, the idea of challenging the reserve clause was as preposterous as questioning the fact that the game was played on a diamond. The sanctity of the clause was validated by no less an authority than the US Supreme Court, which, a half-century before, had ruled that since baseball was not a form of commerce, usual practices did not apply.46

Flood felt that the issue went beyond mere matters of baseball and law. He saw it as a question of freedom. The reserve clause, he felt, rendered him chattel—a slave—albeit a highly paid one. He considered his challenge of the reserve clause as significant as Jackie Robinson’s breaking of the color barrier in baseball in 1947. A fight for freedom harmonized well with the mood of the Sixties, as he explained years later:

I’m a child of the sixties, I’m a man of the sixties. During that period of time, this country was coming apart at the seams. We were in Southeast Asia. Good men were dying for America and for the Constitution. In the southern part of the United States we were marching for civil rights and Dr. King had been assassinated, and we lost the Kennedys. And to think that merely because I was a professional baseball player, I could ignore what was going on outside the walls of Busch Stadium was truly hypocrisy. . . . All of those rights that these great Americans were dying for, I didn’t have in my own profession.

Flood knew all about racism. He had worked his way up through the minor leagues in the Deep South, everywhere encountering the sharp edge of segregation. While his white teammates ate at restaurants, he was handed food through the back door. When the team bus stopped at a public toilet, he peed in the bushes. But the freedom he fought for in 1969 was not racially determined. He struggled for the right of all players—black, white, Hispanic, or Asian—to control their own destiny. Nevertheless, his crusade was made all the more powerful by the fact that it was fought by a black man.47

Flood decided to challenge the trade, even though he knew that this would almost certainly mean the end of his career. The Players Association backed his challenge and former US Supreme Court justice Arthur Goldberg provided legal representation. From January 1970 to June 1972, Flood meandered his way through the American legal system, encountering failure at every stage. Eventually, the Supreme Court rejected his argument, maintaining that the integrity of baseball depended upon exemption from antitrust statutes.

Flood had lost, but so too had baseball. The protracted legal battle had revealed that the reserve clause, though technically legal, was untenable. Realizing that a new system was required, in 1975 the sport brought in the Messersmith-McNally rulings, which instituted the system of free agency. Under the new regime, a player was bound to a team only during the term of his contract, rather like any other contracted employee. After the contract expired, he became a free agent and could sell his services to the highest bidder.

Flood died in 1997. At his memorial service in Los Angeles, he was acclaimed not just as a hero but also as a martyr, a man who had sacrificed his career in order to right an injustice. Among the mourners were dozens of former ballplayers, most of them millionaires, who had benefited enormously from his crusade. One mourner compared Flood’s effect to that of Rosa Parks, whose challenge to segregation on public buses in Montgomery, Alabama, ushered in the modern civil rights movement. The conservative columnist George Will, a baseball fanatic, called Flood the Dred Scott of baseball.

Flood felt that his cause harmonized perfectly with the spirit of the Sixties. He was right in a way he probably never realized. The Sixties was the selfish decade, a time of fragmentation when social harmony was abandoned in favor of factionalized goals. Nothing was sacred in the pursuit of self-interest, not even America’s national pastime. Free agency became the norm not just in baseball but in other professional sports in the United States, and around the world. Sportsmen, even rather average ones, became enormously wealthy, and teams were built on the strength of an owner’s checkbook. Players became free agents not just in terms of negotiating their contracts but in a much wider sense. Free agents belong to no one. They represent only themselves. The individual had won, but the Team was never the same.

Today, it is difficult to decide whether Flood’s crusade was a victory for justice or for greed. The reserve clause was undoubtedly iniquitous, but the law is notoriously unsubtle, unable to take account of the nuances of the sporting world. At the time of his trade to the Phillies, Flood was earning $100,000 a year, making him one of the highest-paid employees in the United States. A few decades later, Alex Rodriguez signed a $252 million contract with the Texas Rangers, a figure that made baseball fans cringe. Flood, in other words, had opened the floodgates. When teams become mere collections of selfish stars, and players earn more per game than the average person’s yearly salary, it is difficult to find real heroes in baseball. The reason few people remember Flood is that, to the ordinary fan, his victory was not very glorious.

Or so it seems. Back before the Civil War, one of the first organized baseball games was played at Elysian Fields in Hoboken, New Jersey. Baseball fans, lovers of romantic myth, drool over that historical tidbit. The place sounds heavenly, but in fact the infield produced vicious bad hops, and the outfield was riddled with gopher holes. The past, in other words, is never as lovely as we like to imagine. Critics argued that Flood’s victory would mean the end of competitive baseball, that a handful of rich teams would share the titles. Yet in the immediate aftermath of the demise of the reserve clause, from 1978 to 1987, ten different teams won the World Series. Compare that to the period 1936–1964, when the New York Yankees won twenty-two American League pennants. Furthermore, though ticket prices rose steadily after 1976, so too did attendance figures. Ordinary people somehow found money to watch baseball.

The romance of baseball is nine-tenths illusion—a myth made durable by the fans’ eagerness to believe. The Team was in large part an artificial creation, a group of self-centered individuals held together by an invidious law which restricted their freedom. If the Team had been holy, it would have survived the death of the reserve clause. Curt Flood’s campaign was simply an unpleasant dose of reality which demonstrated that all sportsmen, contrary to what we want to believe, are simply mercenaries who fight for their paymaster.