8

A Lady in the Boardroom

My first glimpse of the boardroom of the Manufacturers Hanover Bank, a world no woman had ever penetrated before, was dazzling. Although the scene was gracious rather than forbidding, I had again the sense of entering the halls of power that had come over me when I passed through the guarded gate of the Old EOB. The bank's board of directors met around a long, highly polished mahogany table, lit by crystal chandeliers, in a room high above New York's Park Avenue. This elegant, rarefied environment gave no hint of the upheavals that would reshape the organization several times during the years I was associated with it. All the appurtenances were in a matching formal style, and each director's name was permanently embossed on a brass nameplate in front of his seat. As I went around the room to shake hands with my new colleagues, most of whom were the chief executives of leading companies, I could see that some of them were not entirely at ease with this strange creature in their midst. We were only a few minutes into the meeting when short, portly Richards Reynolds, heir to the tobacco fortune, followed up an emphatic “Damn sure” with a hasty “Pardon me, ma'am; we're not used to having a lady in here” in his Virginia drawl.

I had joined this exalted group as the result of a luncheon invitation a couple of months earlier, just as I was about to leave the CEA, from Gabriel Hauge, chairman of Manufacturers Hanover. Mr. Hauge turned out to be a courtly silver-haired gentleman with a piratical black patch over his left eye (he had lost the eye, I learned later, to the cancer that would eventually kill him). I was vaguely aware that Manny Hanny, as it was invariably called, was one of a handful of large so-called money center banks, with headquarters in New York but a presence all over the world. That's about all I knew when, somewhere between the salad and the coffee, Mr. Hauge asked me if I would consider joining the bank's board of directors.

I was so taken by surprise that I didn't have the wit to ask what the duties and responsibilities of an “outside” or “independent” board member (one who is not part of the company's management) were, even though I was totally ignorant about what I might be getting into. Like most newspaper-reading Americans, I had read mainly about the chief executives; their boards of directors were generally shadowy figures in the background. It was only many years later, when the Enron scandal of 2001 and the financial crisis of 2008–9 clobbered the US economy and wiped out many families' financial security, that the American public became aware that the failure of many boards of directors to discharge their responsibilities effectively was a key to these disasters. Nor did it occur to me to ask about the compensation that came with such directorships, although I soon discovered that it was large enough to have a significant impact on our lifestyle, and eventually to raise serious questions from the same American public about whether the performance of company directors really made them worth their pay.

I was excited by the opportunity to penetrate a sanctum few academic economists had ever entered, even though for-profit companies were major players in the issues and events we studied, analyzed, and taught. I had no illusions, though, about why this opportunity had opened up. Part of it was that my training as an economist had made me comfortable with the mysteries of profit and loss and the bottom line. And my stints in the administration, with both the Price Commission and the CEA, had given me an insider's view of the US regulatory environment, as well as a priceless network of acquaintances in government. But these qualifications paled in importance before the fact that I wore a skirt. By the mid-1970s, companies were feeling strong social pressures to elect women and minorities to their white male boards, and a few of the more forward thinking were beginning to search for and recruit viable candidates.

How much influence women would gain by joining corporate boards was a question that hung in the air. In a 1974 article entitled “New Voices in Business: Ladies of the Boardroom,”1 most of the other women quoted agreed with my comment that the CEOs who had approached me about directorships—there had been several—had made it clear that the fact that I was a woman was relevant. They weren't playing games. In less than a year, I said, I had discovered that 95 percent of the time we directors are rubber stamps; women will get significant leverage in the economy when they accede to responsible positions inside corporations, rather than serving only as outside directors. Significantly, none of the women directors interviewed, except for a couple who headed family-owned companies, was the CEO of a major corporation, the usual route to a directorship. Their leadership experience, like mine, was in government, universities, or nonprofit organizations.

Despite such doubts, I accepted Gabe Hauge's invitation with the comment “I realize I'm a token, but please don't expect me to be just a token.” Manny Hanny's board already had its token minority member, Jerome “Brud” Holland, who had won fame as the first black football player at Cornell. A sociologist and former president of two historically black colleges, he had been appointed by Richard Nixon as the US ambassador to Sweden, the first African American to attain such a high diplomatic position. During the two years he spent at that post, he had endured having eggs and tomatoes hurled at him as the anti-Americanism engendered by the Vietnam War reached its height.

Eventually, either my fellow directors' discomfiture at having a woman in their fraternity dissipated or they learned to hide it better. But it wasn't long before the gender issue came up explicitly during the planning of one of the board's trips to the bank's facilities in other countries. The established pattern was that while the directors were receiving all-day briefings on the economic and political environment, as well as the bank's own operations wherever they were visiting, their ladies were entertained with fashion shows or luncheons with the wives of government leaders.

These outings held no appeal for Bob, who would have much preferred to sit in on the briefings. When I passed his request on to Gabe Hauge, the chairman responded with an immediate invitation for Bob to attend. But when Laura Holland, Brud's wife, expressed the same preference, she was at first refused, which embarrassed and infuriated Bob and me. Eventually, a choice between attending the briefings and participating in the programs planned for them was extended to all the spouses and, though almost all of them continued to choose the ladies' outings, I felt gratified that we had won one more small victory over male chauvinism.

At my first Manny Hanny board meeting, I had tried to stay alert in a haze of cigar smoke, listening to a series of boilerplate reports required by banking laws and wondering how long it would take me to understand, let alone make intelligent judgments about, the performance of the bank's loan portfolio and the economic and competitive conditions that affected its financial results. How in the world, I asked myself, did new board members acquire the knowledge needed to do their job? I realized gradually that new members were expected to acquire expertise about the company by osmosis, keeping quiet for the first year or so until they felt enough on top of the situation to ask a pertinent question or make an intelligent comment. Naturally impatient and congenitally incapable of remaining silent for so long, I was determined to accelerate the process. Besides, I felt pressure to come up to speed as fast as possible—wasn't my performance a test of whether women were up to holding such important positions?

After a couple of meetings did little to reduce my befuddlement, I asked the corporate secretary to set up private sessions with the heads of the bank's business units and major staff functions, so that I could learn more about what a money center bank does and what distinguishes a profitable operation from an unprofitable one. My request was met with an immediate offer to set up such sessions before or after each board meeting. But it was also greeted with surprise, as if no one had ever asked before.

Even after I had been given these tutorials, I was frustrated by the fact that neither the board meetings nor those of board committees seemed well designed to elicit useful questions or comments from the outside directors. In meetings of the Loan Committee, for example, the time was spent reviewing sample credit analyses for a cross section of borrowers, many of them in the “rag trade,” the insiders' name for the New York garment industry. I couldn't for the life of me figure out what value I or my fellow directors could add in these discussions; surely we couldn't outguess the professionals' judgments about the creditworthiness of individual borrowers.

Looking back, I realize that the purpose of these carefully packaged presentations may have been to increase the directors' confidence in the bank's lending decisions and so discourage penetrating questions that might make the management uncomfortable. Even so, because there was no follow-up to connect the specific cases we saw to their ultimate outcomes, there wasn't any opportunity for us to learn by doing.

Learning to speak up, to ask challenging questions in board meetings, didn't come easily, even to someone as naturally outspoken as I am. One number that Manny Hanny's directors were expected to keep a watchful eye on was the bank's capital-to-asset ratio, an important measure of the institution's safety and soundness, its cushion against disaster. The management assured us that our bank's ratio was comfortably in the middle compared to those of its competitors. But what, I wondered silently, if all those banks' ratios are too low to protect against a sudden increase in bad loans; what if each of the huge edifices that money center banks had become was balanced on the head of a pin? Since the other directors seemed satisfied, I kept my worries to myself.

As bank failures increased during the 1980s in the wake of worldwide recession, US regulators decided that banks' capital ratios, which had been falling for many years, were indeed too low, and they established higher minimum requirements. My gut reaction was vindicated, but I felt like an idiot for not having followed my instincts and spoken up at that earlier board meeting; I had certainly muffed one chance to try to make a difference. The new, higher capital requirements in turn proved badly inadequate when large banks' headlong increase in risk taking propelled them into the center of the worldwide financial crisis of 2008–9. So why had banks' boards of directors been so easily reassured? Why hadn't we all learned to ask harder questions?

My education in banking, and in the responsibilities of directors, took another leap forward in the 1980s when Manufacturers Hanover had a near-death experience caused by a severe debt crisis and spreading loan defaults in several Latin American countries where the bank was a major lender. Things were shaky enough to bring on quarterly visits to board meetings by the president of the New York Federal Reserve Bank, our main regulatory supervisor. Despite that gentleman's low-key style, these visits were an ominous signal, a sharp reminder of the directors' responsibility for overseeing and guiding improvement in the bank's condition. We got the message that our job was not to be simply rubber stamps for management, and, under our polite but persistent prodding, the painful but necessary changes were made. Most significantly, we prevailed on the CEO to fire the executive in charge of the bank's international lending, sending a sharp message about personal accountability.

Over the years that I spent on the board, the bank I had joined as Manufacturers Hanover provided a crash course in mergers and acquisitions. The bank, itself the creation of a major merger in 1961 and numerous smaller ones since, merged with the Chemical Bank and adopted the latter's name in 1992; the process was repeated when Chemical became Chase Manhattan in 1996, which in turn became JPMorgan Chase in 2000. The days and weeks leading up to these decisions involved difficult meetings, intense discussions, and the knowledge that any slip of the tongue could put a director at risk for violating strict regulations against trading on inside information, with the possibility of a serious fine or even jail time.

This exposure was brought home to me when I was suddenly called to a meeting in New York in connection with one of the mergers and had to make apologies to the hosts of a dinner party we had promised to attend, saying simply that I “had to go out of town.” The host, an active and knowledgeable investor, asked casually, “Oh, by the way, are you still a director of Chemical Bank?” I realized immediately that he had guessed the reason for my trip, and my heart sank as I contemplated the potential fallout if he took advantage of his knowledge to trade in the stock. When I returned, just after the merger between Chemical and Chase had been announced, he called to tell me that he had indeed guessed that the merger was about to occur, but, he added, “I didn't trade.”

Our vote authorizing a merger was just the beginning. Actually merging previously distinct executive ranks, workforces, branch systems, information technology systems, and, above all, cultures was a complex and often painful process, as employees from the top to the bottom of the organization were squeezed out in order to avoid redundancy and achieve the cost savings from consolidation that were the whole point of the merger. Many of the surviving employees also felt extreme stress, as their job descriptions changed or, at the least, they had to adapt to new ways of doing things. The one-on-one competitions to be the survivor in a particular job slot were fierce, and persuading old Manny Hanny and old Chemical survivors to regard themselves as part of one team often seemed like a Sisyphean task. My heart ached for Manny Hanny's CEO, John McGillicuddy, a warm and public-spirited man, as well as an outstanding banker, as he was gradually but inevitably marginalized during his brief time as head of the merged entity by the former CEO of Chemical, who, by mutual agreement, had been designated as McGillicuddy's successor.

When I joined the board of Manufacturers Hanover in 1973, banks, even sophisticated money center banks, generated earnings primarily by taking in deposits, using those funds to make loans, and deriving profits by charging higher interest rates on the loans they made than they paid out on their deposits. By the time I retired from the board of JPMorgan Chase in 2002, the activities that produced the earnings of money center banks had changed dramatically. Of the loans originally made by the bank, less than 25 percent remained on its own books. The rest were either sold outright or “securitized”—that is, packaged into groups of loans with differing characteristics and sold to a variety of investors.2 The bank's profits now came mainly from the fees it charged for these and many other financial services, and from trading in currencies or securities for its own account.

Understanding the risks these varied activities carried was a complex business. As a member of the bank's Risk Policy Committee, I was actively involved in discussions about the sophisticated statistical techniques used to estimate various types of risk to which the bank was exposed. Yet no one from management ever mentioned to us that the structured transactions it had entered into with Enron before the latter's collapse in 2001 might entail financial risk, as well as risk to its reputation. The bank's leadership apparently believed that it was fully hedged against financial losses, although these transactions eventually cost the bank several hundred million dollars. As the world of banking changed, we directors struggled to keep up, but we weren't fast enough up the learning curve and neither, it turned out, were the managers. Nor, it seems, did either group learn from experience: the same sort of failure by banks, on a vastly larger scale, to understand or estimate accurately the risks they were taking culminated in the financial crisis of 2008–9. By then I had retired from all corporate directorships and could only join my fellow citizens in shocked disbelief as I watched the financial sector's house of cards collapse into a global disaster.

With one notable exception, every one of the large, successful firms whose boards of directors I joined soon found itself threatened, as Manny Hanny was, by challenges it hadn't prepared for, challenges that forced on it wrenching changes in form or function or, often, both. One of those companies was Westinghouse Electric, long established and highly respected as one of the premier firms headquartered in our hometown of Pittsburgh, but with operations in many countries. Its major business, nuclear power, was one of special interest to me—after all, my father had been a major figure in the Manhattan Project; my mother had been a founding employee of Brookhaven Laboratory, one of the national labs created to explore the peacetime uses of atomic energy; and I had argued with Edward Teller over the role of nuclear energy as a member of Nelson Rockefeller's Commission on Critical Choices for Americans. As the clincher, my brother George had spoken admiringly of its CEO, Don Burnham, whom he had gotten to know while he, George, was director of the National Productivity Commission.

About the time Bob Kirby succeeded Burnham as chairman and CEO of Westinghouse, soon after I joined its board, the price of uranium began to shoot up, reaching forty dollars per pound by 1975. This escalation meant that the company would have gone broke trying to fulfill contracts it had signed when the price was five to six dollars per pound, promising to deliver the uranium needed by the owners of the nuclear power plants it had built. Instead, it reneged on the contracts. The twenty-seven utility customers promptly sued, exposing Westinghouse to a potential two billion dollars in liabilities and setting off a round of suits, countersuits, and associated suits that would occupy the firm for the next five years, tie up many of the nation's major law firms, and set the company on a path of diversification that would ultimately end in its transformation into an entertainment company, CBS.

The situation created a great deal of tension for the directors personally. At one point, we were advised that the board as a whole should hire its own lawyer to protect itself against the numerous lawsuits in which we were named as defendants, quite separate from those who were defending the company itself. Our choice was John McCloy, the elderly but still canny establishment lawyer, banker, and adviser to presidents who had been US high commissioner for Germany just after World War II, the president of the World Bank, chairman of Chase Manhattan Bank and the Ford Foundation, and president of the Council on Foreign Relations. Despite his awe-inspiring pedigree, McCloy's advice to the members of the board was down to earth and practical, always cautioning us to keep our heads and not panic, however much our personal assets and reputations might appear to be threatened. “The worst thing you can do,” he told us, “is let your opponents see that you feel threatened by their accusations.” We swallowed hard and tried to remain calm in the face of the huge sums for which the other side tried—ultimately unsuccessfully—to hold us personally accountable.

Although my six years on the board were dominated by those lawsuits and the tensions they created, both inside and outside the company, I actually felt more comfortable on that board than I did during my early years at Manufacturers Hanover. Even though I was, once again, the first and only woman, I didn't stand out as an oddball nearly as much in Pittsburgh as I did in New York. Perhaps it was because we were all too busy concentrating on the company's problems; there's nothing like a crisis to create team spirit.

Neither the Westinghouse management nor any of my fellow directors was directly responsible for the gender-related restriction that separated me from the rest of the pack. All of Westinghouse's senior executives, along with the top executives of every major corporation headquartered in Pittsburgh, belonged to the Duquesne Club. The companies generally paid their executives' membership fees and deducted them from taxes as a business expense. It was illegal, though, for any club that served a business purpose to discriminate in its membership policies, and, during the late 1970s, feminist and minority activists were beginning to bring lawsuits against these firms, arguing that they could not deduct the dining club and country club fees they paid and at the same time insist that the clubs were purely social.

I had been freed from the humiliation of being relegated to the Duquesne Club's ladies' entrance when the club tightened its security, by closing that door, after it was briefly stormed by a group that included some of my more radical faculty colleagues. Now, I figured, I ought to strike my own blow for equality by joining any such lawsuit, if the opportunity came up. This would have been highly embarrassing to Westinghouse, and when I told Bob Kirby of my intention, he replied, “I hope there will be enough time for me to get the club to shape up before that happens.” Kirby was as good as his word, and the Duquesne Club soon took in its first minority member, the African American dean of the Duquesne University Law School. He and I had made a bet as to which of us would be invited first; he was, so I won the bet. It wasn't long before the first woman joined as well, but it wasn't me; by that time I no longer lived in Pittsburgh.

McCloy's wisdom stood me in good stead on a very different issue from the one he had been engaged for. While I was still on the Westinghouse board, Ben Stein, Herb Stein's son, coauthored with his father a novel about the chaos created by runaway global inflation and an attempt by the Chinese to secretly acquire the world's supply of gold. It was a suspense story and also a roman à clef, featuring thinly disguised individuals who had served in the Nixon administration along with Herb. My fictional counterpart was the heroine, who rescued civilization by figuring out where the gold was disappearing to and getting it back, enabling the United States to go back to a gold standard, stopping the worldwide inflation in its tracks.

No one could object to being cast as the savior of Western civilization. But the authors also enmeshed my character in a torrid love affair with one Peter Hanrahan, who would be immediately recognizable by any journalist or Washington insider as Peter Flanigan, who had succeeded Pete Peterson as director of the Council on International Economic Policy while I was at the CEA. I was amused but also ticked off by this linkage, not least because Flanigan was on my personal blacklist. I had been annoyed and embarrassed at a dinner party given by him and his wife, when I was sent off with “the ladies” after dinner. “How pretentious can you get,” I had thought to myself, “emulating a custom still practiced only by the stuffiest of embassies?”

When Mark Perlman, my rigidly moralistic friend and chairman of the Pitt economics department, learned about this story line shortly before the book's publication, he insisted that I should be prepared to sue the authors to protect my good name. “Come on, Mark,” I responded, “how Victorian can you get?” But I decided to ride Westinghouse's coattails by seeking some free advice from McCloy. After he had skimmed a prepublication copy of the book, he advised me that my best chance of winning a suit would be to sue not for libel, which is very hard to win under US law, but for calumny. “What on earth,” I asked, “is calumny?” “The false imputation of unchastity,” he replied with a straight face.

“Are you suggesting that I do that?” I asked. “Well,” McCloy said, “let me put it this way: it's a perfectly dreadful novel, and I'm sure it will sink like a stone. The one thing that might save it is the publicity that would result if you sued.” I took his advice, and the novel did sink, although that didn't prevent Ben Stein from later gaining fame as an actor, columnist, and television personality.

I really began to flex my muscles as a director at Marcor, a Chicago company that, among other things, owned the catalog retailer Montgomery Ward. Just after I joined that board in 1974, Mobil Oil Company announced its intention to buy 51 percent of the company's stock, in addition to the 4.5 percent it already owned. Despite objections from Marcor's management and the Department of Justice, Mobil persevered and won.

Mobil's representative in discussions with Marcor's board about the price that Mobil would offer for the remaining shares of Marcor stock was its president, William Tavoulareas. A tough-talking lawyer and accountant, Tavoulareas was already famous as the canniest of all the Western oil companies' negotiators with Middle Eastern governments. He used every one of his negotiating tricks to keep the price offered for those Marcor shares as low as possible. He would challenge us with statements such as “What do you mean I can't be part of the discussion about the price that's paid for the remaining shares? We now own a controlling interest in Marcor, dammit.” We reminded him that it was the Marcor board's fiduciary duty to represent the interests of the remaining minority shareholders by getting the best possible deal for them, and he wasn't yet a member of the board.

The more Tavoulareas tried to manipulate us, the angrier I got. My pent-up frustration boiled over during the board's discussion of Mobil's “absolutely final offer,” and I found the courage to pipe up. “That doesn't seem fair,” I objected. “Let's tell him no and see what happens.” I was the youngest, newest, and most inexperienced director, and my more seasoned colleagues were dubious—after all, Mobil held all the cards—but they agreed to try. Tavoulareas, caught off guard by our unexpected stubbornness, raised his offer. So the last act of Marcor's board of directors before it was dissolved was to get a slightly better deal for the company's remaining minority shareholders. Among all the boards I sat on, my tenure on Marcor's was the briefest, but it was also the one on which my input most immediately affected the outcome, and I felt a flush of satisfaction as we shook hands at the end of our farewell dinner.

With the Marcor board dissolved, I was free of any potential conflict of interest in joining the board of Procter and Gamble (P&G), one of the world's largest consumer products companies. That firm had won its way into my heart even before I attended my first board meeting. Just after my appointment was announced, two young women who were on the first rung of its famed process for grooming future executives came to visit me in my office. They wore blue suits and matching pumps, then the regulation uniform for aspiring females in the business world, but their manner was open and friendly. They said they had come simply to introduce themselves and welcome me to the company. I asked them what it was like to be female pioneers in P&G's highly sought-after and competitive program. “Lonely,” was their answer.

I had assured the CEO, and I meant it, that I had no intention of using my board seat to be an advocate for women as a special-interest group. But I did bring a different and therefore useful perspective on some issues important to the company. Soon after I joined the board, I met and talked informally with a large group of women employees; quite a few of them told me later how much such interaction with a director of the company had meant to them. When I was shown some of the advertising department's favorite ads as part of my introductory training program, I commented that several of them, featuring a male authority figure and a smiling housewife, struck me as sexist. Although the department's initial reaction was open-mouthed astonishment, gradually P&G's ads came around to recognizing that women are not obedient automatons but capable decision makers, able to evaluate detergents and diapers without male guidance.

I had to wait a long time, though, before my hope of no longer being the only woman on every board I sat on was fulfilled. When Lynn Martin joined the P&G board in 1993, I greeted her by saying, “Welcome, Lynn; I've waited seventeen years for you!” A former Republican congresswoman from Illinois, Lynn had also been the secretary of labor who coined the term “glass ceiling” in her efforts to reduce or eliminate the barriers that confronted women in the workplace. Once she had joined, Lynn distinguished herself by being the only member of the board who tried out every new P&G household product herself and gave her opinion at the next meeting. “The Swiffer did a really terrific job on my floors,” she reported when the innovative sweeper was introduced, but she didn't see much use for Fit, a rinse tailored for fruits and vegetables, which never did catch on with the American public.

Of the four firms whose boards I joined in the 1970s, only P&G was still an independent company, with the same name, when I stepped down. There was continuity in management there as well; every CEO had served as president, or in an equivalent role, before he succeeded to the top job, consistent with the company's commitment to promotion from within. At first, coming from an academic environment where the most effective way to get a promotion was to brandish outside job offers, I had been appalled by such insularity. As I saw the results over time, though, I had to admit that the powerful culture and unwavering loyalty to the company and its principles that this process produced were a major strength. This emerged not only in the consistency—with one notable exception—of P&G's financial results and successful global expansion but in its impressively high level of social responsibility.

This last was attested to by the awards, prizes, and public recognition it garnered every year for its achievements as a “most admired company” that offered a welcoming workplace to women, working mothers, minorities, and people with disabilities. It also received accolades for its environmental progress, its use of advanced technologies to improve consumers' quality of life worldwide, and its work on finding alternatives to animal testing—this last in the same year that its annual meeting was picketed by animal-rights activists for not having eliminated such testing entirely.

Cautiously and gradually, P&G's strong culture became less insular, opening up to the outside world. In its research and development, this process took off in the late 1990s. For decades, the company had a closed innovation process, centered around its own secretive research and development operations. Then, in less than a decade, P&G increased the proportion of new product ideas originating from outside the firm from less than a fifth to around half.3 Nor did their conservatism and insularity prevent P&G executives from taking forward-looking positions on local issues. When a group tried to close down an exhibit of Robert Mapplethorpe's controversial photographs by bringing obscenity charges, several of these executives said publicly that this misguided effort would only subject the city to national ridicule, a stance the board roundly applauded.

The chief executives at P&G may all have been “proctoids,” as they were sometimes derisively dubbed, but that's not to say that their personalities didn't vary widely, requiring the directors to adjust to a new style with each change of leadership. A particularly sharp style change occurred when quiet, courteous, consensus-building John Pepper succeeded autocratic, sharp-tongued CEO Edwin Artzt, whose nickname both inside the company and in the press was the “Prince of Darkness.”

The one time I saw Artzt act with ruthless decisiveness was when a senior executive failed to alert him to a potential crisis, a wrangle over the allegedly deceptive labeling of P&G's pasteurized orange juice as Fresh Choice, which led the Food and Drug Administration to order the product to be immediately pulled from supermarket shelves. Artzt, who had not been told about the situation, was blindsided and, as he told the board, “mad as hell.” The executive vice president responsible, who had been regarded as a possible heir apparent to the top job, suddenly resigned.

In another embarrassing situation, though, Artzt showed that he did not hold himself excused from accountability. He turned down a hundred thousand dollars of his annual bonus in 1994 in the wake of losses on complicated derivative securities transactions the company's treasury department had entered into with Bankers Trust, without his knowledge and against guidelines authorized by the board of directors only a month or so before. Even though P&G eventually recouped almost its entire loss in the settlement of a lawsuit against Bankers Trust, heads rolled again, including that of the company's treasurer, because these executives had explicitly violated the board's guidelines on using derivatives. By turning down his bonus, Artzt signaled that, as the captain of the ship, he, too, had to bear some responsibility.

Artzt's biggest public relations stumble occurred when he discovered that somebody on the inside was leaking proprietary information—company secrets—to the Wall Street Journal. In his eagerness to locate the culprit, he asked the Cincinnati police department to comb through hundreds of thousands of phone calls to the Journal reporter who wrote the stories. When he told this to the board, my heart sank as I thought to myself, “Don't do it, Ed,” remembering the pithy advice Jack McNulty, the vice president of public relations at General Motors (GM), had given me: “Never get into a fight with someone who buys ink by the barrel.” I said as much, and others chimed in, but it was too late; the search of telephone records was already under way.

Whether or not the inside leaker was located and punished, it wasn't worth the widespread negative publicity; the furious journalist published a book about the company that was as negative as she could possibly make it. Artzt had to admit publicly to “an error in judgment,” and the whole episode at least temporarily tarnished the company's image in the eyes of the public. But the board had spoken its displeasure out loud, and Artzt took the lesson to heart, behaving much more circumspectly after that.

One of the primary responsibilities of any board of directors of a publicly held company is to hire and, if necessary, fire the company's chief executive. All of us on the P&G board knew that, of course, but we never thought it would happen to us, or anticipated how painful it would be.

When John Pepper announced his intention to retire as P&G's chairman and CEO, there was no doubt as to who the top executives had agreed his successor should be: Durk Jager, the second in command. In fact, Jager had been Artzt's choice as his own immediate successor, but the board had persuaded him that Pepper should become CEO and Jager president. As Jager's mentor, we had argued, Pepper could smooth off some of his rough edges. Jager was a total product of the P&G system; he had joined the company straight from university and worked his way up through successively more responsible positions. He was a large, blond Dutchman with steel-rimmed glasses and a stern visage. This appearance, along with his clipped speech, made him seem cold and distant. But he was admired for his well-honed analytical mind and his hands-on approach to his job. Even as president, he never missed an opportunity to visit grocery stores to see for himself whether P&G products were properly displayed and how they were selling.

The directors were less certain than the top executives about Jager's suitability; we had a lively discussion chewing over the pros and cons. “He's absolutely brilliant, totally customer oriented, and has a fabulous track record,” argued his supporters. “But he's got lousy people skills and is quick to blame others for problems,” countered those who had their doubts. In the end, though, we agreed that he should succeed Pepper in the top job. The first signal that the doubters might have had the better argument came quickly. Jager's first board meeting as chairman and CEO had been preceded by a dinner the night before in honor of a recently retired P&G executive vice president who was also a member of the board. Jager, who made no secret of his dislike for this man, remarked loudly at the luncheon following the board meeting that he had skipped the dinner in favor of staying home to watch TV with his wife. With that pointed comment, he shattered the sense of team spirit so important among top executives and boards of directors alike. I knew right then that we had made a mistake in appointing Jager. So, from the looks on their faces, did my fellow directors. But it was too late for us to do anything about it.

Less than eighteen months later, after three successive negative earnings “surprises” and a 50 percent drop in the price of P&G stock, Jager was gone. Despite his obvious strengths, his inability to set appropriate goals or exercise effective leadership had proven too costly to the company and its shareholders, including nearly all of its employees, who had chosen to put their retirement nest eggs entirely into P&G's profit-sharing plan and now saw their value cut in half. The episode was most painful for Jager, who, I believe, never did understand where he had gone wrong. But it was also painful for the board members, who had to admit to a serious error in judgment; we should have been more alert to the warning signs that had come up in our discussions of the CEO succession.

Given that Jager had spent his entire career at P&G and been responsible for many of its successes during his climb to the top, the board regarded three years' compensation as a reasonable severance payment. But when the amount he received, which totaled about nine million dollars, became public, we were heavily criticized for giving an outrageous “reward for failure.” The American public was beginning to be resentful of the sums that were bestowed as parting gifts on chief executives as they were being shown the door by disappointed boards of directors. In the first few years of the current century such severance payments, often made to CEOs with very brief tenures at their companies who had built guarantees of such compensation into their employment contracts, became truly outrageous, often amounting to more than ten or even twenty times what we had decided on as Jager's payment.

By then I had joined one or two outspoken directors of other large firms in insisting, both privately in meetings of board committees and in public speeches, that the total compensation of many top executives was exceeding the bounds of reason and decency, and that self-policing by companies' compensation committees was urgent. If we don't fix it, I warned, others will, and you executives won't like their fix one bit. My friend Ann McLaughlin, who was also a member of the board at several leading companies—including GM—put the warning even more tersely: adapt or Congress will adopt. I had no trouble getting other directors, many of whom were CEOs themselves, to agree with me in compensation committee meetings, but none of them was willing to step forward and say so publicly, each insisting that he would be verbally lynched by the community of fellow executives if he did that.

With the financial industry meltdown of 2008, the American public's building anger against executives who grew unimaginably rich while the activities that ballooned their paychecks created economic disaster for many ordinary Americans exploded in a demand for government action. Congress and the administration have responded by making Ann McLaughlin's and my warnings a reality. Firms that have received government assistance are subject to a variety of restrictions on executive compensation, and, at one point, legislation was proposed to claw back bonuses, through retroactive taxation at confiscatory rates, from executives who have already received them. Board compensation committees are beginning to be held more strictly to account by both shareholders and the public, and at least some of them show signs of acting more independently of management.

Gradually but continuously over the more than three decades (1973–2005) I spent as an outside director on corporate boards, they were evolving from the rubber stamps of management I found when I first entered these august boardrooms to monitors who tried to look out for the interests of both the shareholders and the organization itself. In Manny Hanny's case, the initial impetus was primarily external, as when the president of the New York Fed put the board on notice that it was responsible for making sure that the bank's fragile condition improved with all deliberate speed. At P&G the embarrassing fallout from Durk Jager's failures and subsequent removal played a role.

Above and beyond developments at individual companies, though, those years had been ones of upheaval and change in the governance of all public corporations that swept their directors along. We had been subject to increasing pressure from several landmark lawsuits and a dramatic shift in the ownership of corporate shares from individuals to activist institutional investors, mainly pension funds and mutual funds. This new class of owners was capable of turning the glare of unfavorable publicity on firms whose governance didn't meet its requirements. In response, the standards by which a board's performance was judged rose dramatically, affecting the makeup and processes of every board on which I sat.

Boards became better suited to fulfilling their monitoring role as they both shrank in size and became more diverse, and the number of directors who were also members of management fell, often to the CEO alone. Directors spent more time studying their homework in advance of meetings, and the training sessions for new directors that had been a novelty when I joined the board of Manufacturers Hanover have become routine at most large public firms, supported by a cottage industry of training programs for directors at law and business schools eagerly embracing a new cash cow. As someone who had often questioned just how much I was contributing to a company's performance by sitting on its board, I welcomed this more intense engagement, even though it increased both the hours I spent preparing for meetings and the personal exposure, both financial and reputational, that I risked.

At P&G, these developments were accelerated by the recruitment of an increasingly diverse, sophisticated, cosmopolitan, and strong-minded group of outside directors. These included not only younger CEOs from companies in industries newly relevant to P&G's success, such as software and Web services, but also outstanding people from outside the business world, like Joshua Lederberg, who won the Nobel Prize for Medicine at the age of thirty-three, and Ernesto Zedillo, the former president of Mexico, who brought an international perspective. As a group, we exerted polite but constant pressure to cut down on the carefully scripted presentations by management, allowing the meetings to become more informal and better focused, with more opportunity for spontaneous give-and-take.

Meetings of the outside directors, without management present, which used to occur only at times of crisis, became regularly scheduled events, and we developed processes for annually evaluating the performance of the CEO, as well as the effectiveness of the board's own functioning. A growing minority of US firms has instituted the separation of the roles of chairman and CEO, which is usual in many European countries; in those that haven't, the role of lead director has developed as a partial substitute. This nonmanagement director works with the chairman to set the agenda for board meetings and presides over meetings of the outside directors without management.

At P&G, towering, deep-voiced Norman Augustine was chosen for this role by the universal acclaim of his board colleagues. Augustine, the CEO of Lockheed Martin, had held several important positions in our government's defense establishment and gained fame as the author of Augustine's Laws, a book about government and business bureaucracies as wise as it is hilarious. He applied this wisdom by putting his stamp on both the structure and the content of P&G board meetings.

In the wake of the Enron and other corporate scandals, most of these changes were codified into requirements by means of legislation and regulation. I was closest to these developments at P&G, where I chaired the Governance and Nominating Committee for several years. In that role, I worked closely with CEO John Pepper to make sure the board's membership and procedures met the highest standards of corporate governance, a steadily moving target. The result was that, when the Sarbanes-Oxley legislation and its implementing regulations were passed in 2002, I was proud to discover that P&G already met almost all its requirements relating to boards of directors.

The one exception was that certain committee assignments had to be changed for two directors the board had categorized as independent but who did not meet the tightened criteria for director independence mandated by the new legislation. Ironically, one of the two was Lynn Martin, far and away the most candidly critical and outspoken of all the directors. The issue arose because she was associated with the consulting arm of Deloitte and Touche, P&G's main accounting firm; her role was to advise companies on how to eliminate practices that could be regarded as sexual harassment. “Legal requirements have trumped common sense, Lynn,” I grumbled when P&G's lawyers told us that she could no longer be a member of the Governance and Nominating Committee that I chaired.

One change in governance that the P&G board made on its own initiative was to establish term limits for directors. As successful younger people, some in their early forties, were elected to the board, the possibility that tenures of thirty years or more would make it harder to bring new faces and ideas on board led to the decision to limit directors to four three-year terms. As the chair of the committee that proposed this change and the longest-serving director, with twenty-seven years on the board, I immediately told my colleagues that I wouldn't stand for reelection at the next shareholders' meeting.

I had no doubt that this was the right move for the board and P&G, but it gave me a sharp pang of loss. I still miss the P&G board meetings, the thrill of being involved with a superbly managed and successful company, and the interactions with my outstanding colleagues there. Despite its reputation for conformity and its commitment to promotion from within, P&G has risen to the competitive challenges of a globalizing world through a process of continuous change, without the wrenching distortions that have made most of the other companies I've been associated with either disappear as independent entities or alter so drastically as to be virtually unrecognizable.

Years later, with the experience of thirteen years as a senior executive at one of the country's largest multinational companies, General Motors, under my belt, I had a much better understanding of what makes big companies tick when I joined the boards of Browning Ferris Industries (BFI), a Texas-based waste management company, and Unocal, the old Union Oil Company of California, in the 1990s. I had worked with and come to admire BFI's chief executive, Bill Ruckelshaus, during the Nixon administration. As deputy attorney general in 1973, Bill had resigned rather than follow Nixon's orders to fire the Watergate special prosecutor, Archibald Cox, in what came to be known as the Saturday Night Massacre.

It's hard to imagine what could make a garbage company exciting. But there was plenty of excitement when Bill told the astounded board that he and the company's general counsel had been working secretly for months with the Manhattan district attorney to end the Mafia's stranglehold on commercial waste disposal in New York City. The Mafia had fought back with tactics we had all watched goggle-eyed in The Godfather: the wife of BFI's New York district manager, greeting some women guests, had found a severed dog's head on her doorstep. But BFI had the last laugh when the Mafia refuse collectors were rounded up, tried, and convicted. In his Texas drawl, with an unlit cigar clamped in the corner of his mouth, our lead lawyer explained: “I took the papers that would complete BFI's purchase of his company to the jail where one of the Mafia owners was locked up and watched while he signed them, cussing all the way. I had to bob and weave to duck his spit.”

Despite its success in this bit of heroic derring-do, BFI was struggling in an industry where the accounting practices of at least one of its largest competitors skirted the edge of legality. In 1997 the board decided that selling the firm to another company (not the one with the dubious accounting) was in the best interest of the shareholders. We didn't come to the decision easily; when BFI's young president first broached the idea, several of the directors said to Ruckelshaus, “You should fire him for disloyalty.” The evidence was ultimately persuasive, though, and we voted the company, along with ourselves as a board, out of existence.

Given the roughneck nature of the petroleum industry, it's not surprising that, when I joined the board of Unocal, I was nonplussed by the “cowboy culture” I found there. At one of my first meetings, the chairman reported on a leak that had allowed a toxic substance capable of causing skin irritation and flulike symptoms to escape from the firm's San Francisco refinery. The refinery's managers, he told us, had known about the leak but decided to do nothing about it until the time came for a scheduled overhaul of the plant. “And how,” we asked, “had those managers been punished for an irresponsible decision that ultimately cost the company some hundred million dollars in fines and penalties?” “Oh, they were reprimanded and temporarily suspended,” came the bland reply. “You mean they weren't fired or at least transferred?” I sputtered.

Speechless with indignation, I couldn't manage even a sputter when Unocal's president, John Imle, reported that he had entertained several members of the Taliban at his home for dinner to discuss the possibility of Unocal getting involved in business in Taliban-ruled Afghanistan. Unocal soon recognized the impossibility of working with the Taliban, and Imle was pushed out of the presidency not long afterward, but other elements of the company's traditional culture took longer to uproot.

Soon after I joined the board, the outside directors, acting through the various committees of the board, started putting steady, persistent pressure on Unocal's top management to change the firm's behavior from top to bottom, which in some cases involved ousting or reassigning some of its senior managers. Many of these initiatives originated with the Corporate Responsibility Committee, which I chaired during much of my time on the board. The directors themselves wrote a charter for each board committee and conducted an audit every year to check whether its commitments had been met and whether any revision or updating was required.

Beginning in 1994, Unocal started to issue an annual report to stockholders, separate from the required one focused on financial performance, in which it discussed candidly its problems in the areas of corporate social responsibility—health, safety, and the environment—and what it was doing to correct them. And it adopted as its motto “To improve the lives of people wherever we work.” The process was a gradual one, but, over time and with the directors pushing and prodding every inch of the way, Unocal took steps to match its actions to its words. It became more forthcoming in admitting to and aggressive in cleaning up underground leaks that had persisted for many decades, and in compensating the communities that had suffered as a result. It strengthened the language in its code of conduct for both employees and directors, which was then cited by several activists as one of the most progressive in the industry. And its Operations Management System, introduced in 1999 to identify, evaluate, and mitigate the various safety risks in its operations, was so cutting edge that Unocal received requests from other companies for help in implementing such a system in their own operations.

The most inflammatory issue Unocal's directors had to confront was the company's participation in building a gas pipeline through Myanmar (Burma), a country then ruled by one of the most thuggish regimes on the planet. The company was under constant, highly emotional pressure to get out of the country by selling its share in operations there. How, our angry critics demanded, could we partner with the state oil and gas company of such a reprehensible regime? My fellow directors in corporate jobs could shield themselves from hostile calls, but, as a member of a university faculty whose telephone and office door were open to anyone who called or knocked, I was confronted face-to-face by groups of students who told me bluntly that doing business in such a country was immoral.

We argued intensely over the relative merits of selling our interest in the project, what I dubbed the Pontius Pilate choice—washing our hands of responsibility for a situation by placing it in the hands of others—versus “constructive engagement.” Neither side persuaded the other, of course; some of the students prayed for my soul, while others burned me in effigy on the Diag, the center of the University of Michigan campus, where I had become a professor.

The Unocal directors chewed over the Myanmar issue frequently and at length. We quizzed the top management intensively on the nature of operations in that country, sending the CEO there in person to see the situation for himself. When he returned, we demanded and got from him personal assurances that, contrary to widespread allegations, the actual operator of the facilities in which Unocal held part ownership (a French firm called Total) had never cooperated with the Myanmar government, either in using forced labor or in relocating villages to make room for the petroleum pipeline. On the contrary, he described to us in detail Unocal's active program of providing schools, clinics, and training (as, for example, in fish farming) to the people in villages along the pipeline route.

Our CEO's replies to our probing were corroborated by four field reports, covering the period 2002–5, based on extensive interviews with a broad range of stakeholders inside Myanmar, including villagers in most of the communities along the pipeline. These interviews were conducted by a small American nonprofit focused on working with companies to help them ensure that they have positive rather than negative impacts on the communities where they operate. The final report concluded, “[T]he overwhelming majority of [those interviewed] argue that Total [and its partners] should neither leave the country nor limit its interaction with the military regime in Myanmar/Burma.”4

We recognized that when revenues began to accrue from the pipeline, some would go into the coffers of the despised and cruel autocracy that held—and continues to hold—the country in an iron grip. Weighing all these considerations, we concluded that the benefits we could bring to at least a small part of Myanmar's population by staying in the consortium there were preferable to a forced sale to another company, probably Chinese, that would almost certainly not continue investing in socioeconomic projects that benefited the local population.

Because of this decision to stay, Unocal was sued in 1996 by activist groups under a centuries-old law originally directed at curbing the operations of pirates on the high seas. The case dragged on inconclusively for nearly a decade. Meanwhile, the directors had gradually come to the conclusion that Unocal was too small to reap full economies of scale, implying that it would be in the shareholders' best interest to sell the company to a larger firm. This decision meant disposing of the lawsuit that was hanging over its head, and the case was settled out of court in 2005. Chevron, the company that ultimately bought Unocal, has continued to support economic and social initiatives in Myanmar and has itself come under pressure for disinvestment.

Most of my years on the Unocal board were characterized by slow, steady progress in the effectiveness of board oversight; the final year, in contrast, was one of high drama. After the company had been in play, or up for sale, for several months, it entered into negotiations with the only bidder that had met the announced deadline, Chevron, America's second-largest oil company. Terms had been agreed to and the transaction appeared well on its way to a shareholder vote when the China National Offshore Oil Corporation (CNOOC), a firm 70 percent owned by the Chinese government, tendered an all-cash bid with a significantly higher value than Chevron's combined stock and cash offer.

With two contenders now in the game, the Unocal board, whose fiduciary duty was to get the highest possible price for Unocal's shareholders, successfully elicited a higher offer from Chevron. But in the meantime, all hell was breaking loose in Washington. Several legislators, egged on by Chevron's lobbyists, were raising objections to a sale to a state-owned Chinese firm on grounds of national security. They were threatening, at the very least, to complicate and stretch out the required approval process, to the point that CNOOC withdrew its bid. Chevron's was accepted, and Unocal was merged into Chevron.

How much the buildup of both congressional and public hostility to the CNOOC bid was actually based on national security concerns, in the military or strategic sense, and how much on a belief that a company owned and possibly subsidized by the Chinese government would provide unfair competition to privately owned American firms is impossible to tell. In any case, CNOOC's withdrawal rendered moot what would have been an interesting but difficult discussion by Unocal's board, centered on two questions. First, how should we have weighed our fiduciary obligations to the shareholders against our obligations as citizens to our country's best interests? Second, if we had concluded that the latter should dominate, would we have decided that the United States would be better off if CNOOC were allowed to buy Unocal or if it were prevented from doing so?

That conversation never took place in the Unocal boardroom, but I have played it over in my own mind every time a proposed investment in the United States by a foreign entity has attracted controversy. No general rule can cover all cases, but my own belief is that, if the United States is to continue to be regarded as a hospitable host to foreign investment, such transactions should be prohibited only when national security, in the conventional meaning of the term, is at issue. Given that CNOOC had undertaken to sell all of Unocal's US assets—its interest was in the ones in Southeast Asia—it's hard to believe that our national security would have been threatened if the Chinese company had been the winning bidder.

Reflecting on the more than three decades I spent as an independent director of multinational companies, I ask myself how much value I added to changes in corporate governance, beyond my symbolic role as a pioneering woman. In most cases, I'm confident that I did have some impact on the board's deliberations and the company's behavior. The one situation about which I feel no such reassurance is my performance as a director of Alcoa, the worldwide aluminum company. I had been asked to join that board by its chairman and CEO, Paul O'Neill, another acquaintance from my days on the CEA, when he was a rising young deputy director of the Office of Management and Budget (OMB). Paul liked to anchor Alcoa's strategic decisions firmly in the current global political and economic picture. As part of that approach, he relied on Ken Dam—who had been Paul's colleague at OMB and, at later points in his career, became deputy secretary of both the state and treasury departments—to give periodic reports on political developments and trends around the world, not only to the board but to business unit managers as well; Paul relied on me to do the same on the economic side.

For very different reasons, both Paul and I struck out after he resigned, in 2000, to become secretary of the treasury and took Ken Dam with him as his deputy. Paul's successor at Alcoa, Alain Belda, had a very different management style, which did not include our global briefings. Apparently that difference made me superfluous in Alain's eyes; one day he invited me into his office a few minutes before a meeting of Alcoa's nominating committee and asked me to resign to make room for a new director. I was hurt and angry at the grade of F he had implicitly given me and thunderstruck by the brusque way in which it was delivered, allowing me almost no time to make up my mind. But, privately, I had to admit to myself that I hadn't found a way to have much impact on the board's deliberations or decisions once Paul O'Neill had departed. Paul's dismissal from his cabinet post was far more public. It came from President George W. Bush, after Paul disagreed with the President and his other economic advisers on their proposed tax cuts and insisted repeatedly that there was no evidence of weapons of mass destruction in Iraq.

Whatever progress “my” boards made in corporate governance, it wasn't enough. Of the seven firms, only two—P&G and Alcoa—preserved their identities in the face of the upheavals that were reshaping American business during the last quarter of the twentieth century. Marcor, BFI, and Unocal were acquired by and merged into larger firms; Manny Hanny had been involved in three major mergers and name changes; and Westinghouse, known as a leader in the nuclear power industry, had sold off its birthright and transformed itself into the entertainment company CBS. And the financial scandals, crimes, and disasters that marked the first decade of the twenty-first century revealed how far corporate boards of directors still have to go to fulfill their monitoring role effectively. Women are still in the minority on corporate boards today, but very few of them are feeling the isolation of being the first woman, as I did. Silently, I say to them, “Go girl; be a pushy broad and put some spine into whatever board you're on.”

The governing bodies of leading universities were grappling with some of the same social issues as the boards of for-profit corporations, I discovered when I served on two of them. One was Harvard, where I had spent my undergraduate years; the other was Princeton, which had played such an important role at various stages of my life, even though it had refused to admit me as a graduate student. Both institutions were also caught up in questions of effective governance. And, although universities are grounded in a dedication to the principles of openness and the free exchange of ideas, both conducted much of their decision making in sessions closed to outside eyes.

Harvard and Princeton had their own version of the “withdrawal versus constructive engagement” controversy several decades before it erupted at Unocal. There the question was whether to continue to hold stock in companies that did business in South Africa, then under the yoke of a regime firmly committed to apartheid. As was true of many college campuses during the 1970s, Harvard was feeling strong pressures, intensified by student protests in 1972 and again in 1977, to disinvest entirely from such companies.

Like many of its sister institutions, Harvard at first chose a path of compromise, actually selling stocks only in cases where companies failed to adopt the Sullivan Principles. These principles required firms operating in South Africa to treat all employees equally regardless of race, to promote the advancement of blacks and other nonwhites in the workplace, and to take measures to improve the quality of life for these groups outside the workplace as well. This question, which was roiling the Harvard campus when I left its Board of Overseers in 1978, was doing the same at Princeton when I joined its Board of Trustees in 1980. Later, as a director of Unocal, I was to confront the issue once again. In each of these situations, I would have found it more comfortable to come to a decision based on some simple universal principle. But the moral choice is never unambiguous, the empirical evidence is mixed, and the question is not likely to be resolved definitively in my lifetime, if ever.

Like the Harvard and Princeton governing boards at the time, I believed, and still do, that in most situations constructive engagement—maintaining limited political and business links with a country despite its inexcusable policies, while continuing to press for political or social reform—is likely to be more effective than total withdrawal. In hindsight, though, I have become convinced that, under the particular circumstances of apartheid South Africa, disinvestment was the more effective course. By the end of the 1980s, Harvard had almost entirely withdrawn from investment in South Africa; Princeton eventually did the same. And the Reverend Sullivan, a director of General Motors and the author of the principles that bore his name, had himself abandoned his original principles as ineffective and become a champion of disinvestment.

A very public issue, the changing status of women in the corporate world, had its counterpart in the private world of Harvard. Although many of the restrictions on female students' full participation in Harvard life had been eliminated by the time I joined its board, women undergraduates, who were more and more often seeing themselves as Harvard rather than Radcliffe students, had become increasingly vocal about their dissatisfaction with the distinctions that remained. They were particularly unhappy that the men's and women's housing were separated by a mile or more, and that the Harvard Houses offered a range of house-centered intellectual and cultural activities that the Radcliffe dorms lacked.

This gap in the lives of female undergraduates was filled by the creation of a unified house system for undergraduates. Women would be allowed to live in the Harvard freshman dorms and upper-class houses, and men could choose to live in the Radcliffe dormitory quadrangle, which gradually acquired many of the ancillary benefits enjoyed by the Harvard Houses. As a Radcliffe student twenty-five years earlier, I had been denied access to multiple Harvard facilities, including its undergraduate library and its MBA program. Now, as a member of Harvard's governing board, I felt a special thrill of satisfaction in helping to knock the last of these barriers down.

The dramatic changes in the status of Radcliffe undergraduates were formalized in a 1977 agreement between the President and Fellows of Harvard College and the president of Radcliffe, which stated, “Undergraduates admitted to and subsequently enrolled in Radcliffe will thereby be enrolled…in Harvard College with all the rights and privileges accorded Harvard College enrollment.”5 No longer would women graduates have to be awarded Harvard degrees retroactively, as my generation was, in order to be able to vote in alumni elections and to serve on either of Harvard's two governing boards.

The status of women faculty at Harvard has not been so easily resolved. The question of why there were so few female faculty members, especially at the senior level, came up again and again in meetings of the board and its committees. Harvard's president at the time, Derek Bok, repeatedly stated his commitment to the recruitment and advancement of women faculty, and I believe he was sincere. At the same time, he clearly believed that women's sense of obligation to family often prevented them from pursuing an academic career with the same single-minded intensity as men.

Bok may have been influenced in this judgment by the experience of his wife, the influential philosopher and ethicist Sissela Bok, who, despite her growing eminence, declined to fight her way up the tenure-track ladder. As someone who had only recently completed that climb while raising children, I felt that the president's view was rather patronizing and told him so, but I doubt that I had much effect on his outlook.

Why, I wondered when I was elected to Harvard's Board of Overseers, are we called overseers rather than trustees? Whereas other universities have a single governing board, Harvard has two. The Harvard Corporation, known formally as the President and Fellows of Harvard College, is the university's executive board. This self-perpetuating body meets every other week and effectively runs the institution; in the words of the Harvard Guide, “[T]he seven-member board is responsible for the day-to-day management of the University's finances and business affairs.”6 The second governing board, the Board of Overseers, is a larger body elected by the alumni for six-year terms; its role is to “advise and consent.”

It didn't take me long after I was elected to discover two things about the Board of Overseers, whose archaic title, the Reverend and Honorable Board of Overseers, goes back to the charter granted to Harvard by the Commonwealth of Massachusetts in 1650. One was that the presence of women was as much of a novelty there as it was on corporate boards. The first one, Helen Gilbert, also chair of the separate Radcliffe Board of Trustees, had become an overseer only two years before Adele Simmons and I were elected. Because only Harvard alumni could vote for overseers, Radcliffe graduates had been awarded Harvard degrees retroactively so they could vote in these elections or serve as overseers.

The male overseers had welcomed Helen Gilbert, a middle-aged Boston grande dame with strong aristocratic features, gray hair befitting her age, and impeccable social credentials, without difficulty; she actually headed the board during my first year there. Adele Simmons was another matter. A history professor at Tufts University and dean of its women's college at the time, she later became a Princeton professor and dean, president of Hampshire College, and president of the MacArthur Foundation. Adele was also beautiful, an outspoken blonde barely into her thirties. I shall never forget the faces of our male colleagues sitting in the basement bar of the Harvard Faculty Club as they watched Adele, earnestly discussing some fine point of Harvard governance as she simultaneously nursed her new baby and drank an old-fashioned through a straw.

The other salient truth that struck me almost from my first moment was that Harvard's bicameral governance structure was a perpetual source of angst and soul-searching to the overseers. Since the Harvard Corporation made or approved all the important decisions, what was our function? As one member put it, the Board of Overseers was an impressive club to belong to, but she didn't see that it had any real responsibilities or influence over Harvard policies. The rationale for the bicameral structure, the functions of its two bodies, and the relationship between them, were the object of a major review as my term as an overseer was coming to an end. Some incremental changes were made in the interest of better defining their respective roles and increasing the interaction between them, but the bicameral structure itself—and the overseers' angst—remained. Not until 2010 was a more drastic overhaul of the composition, structure, and practices of the Harvard Corporation announced;7 it is too early to tell what effect that will have on the practice of governance of the university.

The makeup of the Board of Overseers itself was also a cause for hand-wringing. The 1970s was an era of social ferment and change, when antiestablishment views gained currency and the military-industrial establishment was widely regarded as the axis of evil. The Harvard alumni tended to reflect these views, and they expressed them in their votes for overseers. A candidate who was a woman, a minority, an environmentalist, or a public servant was virtually assured of election, while it was almost impossible for a businessman or banker to be among the chosen. But since one of the responsibilities of the board has traditionally been fund-raising for Harvard, the discovery that the current membership, however worthy, collectively had very shallow pockets and lacked a wide circle of deep-pocketed acquaintances was unsettling. As I quipped, “If you turned all the overseers upside down and shook them, a few nickels would roll out.”

The problem was resolved by Andrew Heiskell, himself a successful and wealthy alumnus, when he became the board's president. A man with an overbearing personality to match his towering physical presence, he prevailed upon the Harvard Alumni Association, which was responsible for selecting the slate of nominees, to designate a slate so heavily loaded with CEOs and bankers that some of them would have to be elected.

Because of their restricted role in making policy for the university as a whole, the overseers' most hands-on involvement occurred through their leadership of the visiting committees to each of the university's many schools and departments. The charge of these committees was both to evaluate the effectiveness of the university's schools and institutions and to provide them with encouragement and advice.8 I chaired two of them, and they stand out in my mind as much for what they were unable to accomplish as for what they did.

The graduate students in the department of economics poured out a litany of complaints to its visiting committee: graduate courses were too large, often they were not well taught, and the senior faculty paid little attention to the students, their concerns, and their progress. Two years later their view was that “Substantial steps have been made towards increasing faculty-student interaction…But much more needs to be done.”9 And student complaints about the aloofness of the Harvard faculty, and its members' frequent absences from the campus, have persisted down through the years.

The main recommendation of the visiting committee to the statistics department was to urge greater communication and coordination among statisticians throughout Harvard, many of whom were outside that department and had little or no contact with it. Our committee reported substantial improvement in the two years between reports, but we still expressed frustration at the bureaucratic obstacles to giving statisticians joint appointments in more than one school or department.

This inability to coordinate expertise scattered throughout the university is just one example of the limitations created by Harvard's tradition of departmental autonomy. “Every tub on its own bottom” is not just a motto but a key operating principle, and more than one commentator has observed wryly that the president of the university has less effective decision-making power than the dean of arts and sciences. Twenty years after I served on the Board of Overseers, I was a member of the committee visiting the Kennedy School of Government. At one of our meetings the dean, Joe Nye, commented that both the Kennedy School and the business school were in the process of establishing programs in public management. “Why on earth couldn't the two schools join forces and establish a strong joint program in this new area?” I asked. Joe's only response was to shake his head and say, “Marina, you know Harvard better than that.” And I did.

My term as a Harvard overseer had barely ended when Bill Bowen, my late-night library companion in graduate student days, asked me to become a trustee of Princeton, where he was now president. I knew I would like working with Bill, whose crinkly-eyed smile and midwestern twang camouflaged the most intense workaholic I've ever known—I used to tease him that he wrote books faster than I could read them, all the while heading Princeton and, later, the Mellon Foundation. Along with all the positive reasons for saying yes, I felt a touch of sweet revenge at the chance to be at the top of the power structure of an institution that had once refused to admit me.

Some of Princeton's trustees were elected and some appointed by the board itself, which avoided the difficulties Harvard had faced in getting on its board people who had reached the pinnacle of business or financial success and were therefore a promising potential source of gifts to the university. Princeton had also come up with an innovative response to the pressures that had built during the activist 1970s to add students to the board. Each year a graduating senior was elected by the votes of juniors, seniors, and the two most recent classes of alumni to a four-year term on the board. These young alumni trustees could bring the perspective of their age group to bear on the deliberations of a body whose college days were long behind them, without being subject to political pressures from their on-campus peers.

It took only a couple of meetings of the trustees for me to see how sharply decision making at Princeton differed from what I had become used to at Harvard. Whereas Harvard was a decentralized collection of feudal fiefdoms, held together loosely by a lord who depended heavily on persuasion and negotiation to make his limited powers effective, Princeton was a benevolent monarchy, with important decisions centralized in Nassau Hall.

Two of the major changes that marked my term as a Princeton trustee held a special importance for me because of events in the lives of our own two children. When our son, Malcolm, was finishing high school in the late 1970s, Bob and I took him on the obligatory tour of potential colleges. At my urging, he somewhat reluctantly included Princeton. After his visit, I asked him what he thought. He replied that, because he was already committed to a career in the biological sciences, he couldn't possibly consider Princeton, whose biology department wasn't good enough to prepare him for a first-rate PhD program.

Bill Bowen clearly agreed with Malcolm's evaluation, and he was determined to change it. Soon after I joined the board, he began a decade-long discussion with the trustees about making Princeton one of the nation's leading universities in the biological sciences. Starting basically from zero, but with our strong support, he set about raising money for a building with the most modern laboratories and equipment and assembling a world-class faculty. By 1986, the Lewis Thomas Laboratory building was dedicated and two of the country's leading molecular biologists had been recruited to form the core of the faculty that populated it. By the time I left the board in 1990, Princeton was becoming recognized as one of the nation's top-tier institutions in the field. If Malcolm were choosing a college today, Princeton would have to be high on his list.

Another major change that occurred during Bill Bowen's presidency and my time on the board was the creation of residential colleges for freshmen and sophomores. The upperclassmen had their eating clubs to provide a social framework for their lives, but the underclassmen, perhaps the most in need of some community smaller than the university as a whole, had only the beds and desks in their dormitory rooms to call home.

Our daughter Laura had found a similar situation less than a month after she started at Duke as a freshman. The absence of any kind of community living structure became critical for her when her freshman roommate, rendered half unconscious by her first encounter with alcohol, was gang-raped by the pledge class of a fraternity. Because there were no adults to turn to for help, Laura found herself coping alone with the fallout from this tragedy: escorting her roommate to the hospital and the police and searching all over the campus for her when she left notes in their room hinting at suicide.

The first residential colleges at Princeton were simply groupings of existing dormitories, but as they acquired faculty associates who often dined with the residents and participated in house-centered cultural and social activities, a sense of community developed. Later, generous gifts from two billionaire alumni enabled the university to purchase the sprawling Princeton Inn on the edge of campus and convert it into Forbes College, and then to have a building specially designed and built as Whitman College. (Meg Whitman, a Princeton alumna and the former CEO of eBay, is a distant relative of my husband.) Princeton's underclassmen would now have the kind of supportive housing environment that Laura and her roommate had so painfully lacked.

These two transformative changes at Princeton might not have occurred without the conjunction of Bill Bowen's strong personality, his ability to articulate and raise funds for a compelling vision, and a centralized, unicameral governance structure that enabled him to work closely with a supportive Board of Trustees. Such advances had been harder to come by during my term as an overseer at Harvard, where the leadership was less persuasive, responsibility for governance was divided between two bodies, and decision making on academic issues was famously decentralized.

When Bowen told the trustees that he intended to step down from his fifteen-year term as president, I was appointed to the search committee whose task it was to identify the man best suited to replace him. (The notion that a woman might be eligible to become Princeton's president had to wait until 2001, when the university's own renowned molecular biologist Shirley Tilghman was appointed.) The unanimous choice of my search committee, enthusiastically supported by an advisory committee of faculty, staff, and students and approved by the full board, was Harold T. Shapiro, then president of the University of Michigan. I was living in Ann Arbor at the time, and the community was volubly upset when they learned that I was party to luring away their wildly popular president to a much smaller school “out East.” “I wonder,” I mused to Bob, “whether I'll need a bodyguard to protect me from all of Shapiro's angry admirers.”

Shapiro, a noted economist, was one of identical twin sons born to a couple in Montreal who had never graduated from high school but owned and ran the largest kosher Chinese restaurant in Canada. When their father died suddenly during their senior year in college, the twins managed the restaurant successfully for five years before resuming their education. Each went on to earn a PhD and, eventually, to head a major research university, Michigan and then Princeton in Harold's case, McGill in his brother Bernard's.

Shapiro's popularity at Michigan did not immediately carry over to Princeton. Bill Bowen and his provost and close confidant, Neil Rudenstine (who later became the president of Harvard), had run the university like a mom-and-pop shop, involving a lot of personal interaction with the campus community, particularly the faculty. Theirs was an extremely effective partnership, one that shepherded the important advances I've described, along with many others.

When it was time to search for a new president, the trustees recognized that, partly as a result of those successes, Princeton had grown large and complex enough to require a different management style, and we discussed this requirement with the candidates for the presidency. But neither the search committee nor the campus advisory committee that worked with us conveyed the need for such a change to the faculty. The names of candidates had been a carefully guarded secret to avoid embarrassing the ones who weren't chosen. But a more open process would have had the advantage of raising the issue for discussion and understanding by this crucially important constituency.

Harold Shapiro, coming from a university many times the size of Princeton, brought a style of leadership that involved a more complex administrative structure and more delegation of authority than the faculty was used to. Resenting the absence of the hands-on relationship they had had with Bowen and Rudenstine, the professors gave Shapiro a very tough first year. But they gradually recognized that he had his own ways of paying attention and showing respect. Whenever a member of the faculty sent a copy of his or her newly published book to the president, a thank-you note from Bill Bowen had been immediately forthcoming. A response from Harold Shapiro involved a delay of weeks or even months, but when it came, it included detailed comments showing that he had actually read the book, even if it was in a field totally unfamiliar to him.

The faculty eventually came to appreciate Shapiro's more formal, scholarly style, and with their strong support, his was an extremely successful and innovative presidency. Among other achievements, he raised more endowment money than any president before him. Nonetheless, I have always felt a bit guilty that Harold's difficult introduction to Princeton might have been avoided if we trustees had been more effective in conveying our vision for Princeton's future leadership, and the reasons for it, to the campus community. The experience taught me that when decisions are made in secret to ensure timeliness and effectiveness, the decision makers have a special responsibility to explain honestly and persuasively the reasons why they decided as they did.

My fiduciary role at all these institutions, both corporate and academic, taught me that the answers to the important questions confronting many organizations are often neither black nor white but shades of gray. Learning not only to tolerate but to embrace complexity and ambiguity, as uncomfortable as it is, has been important in my own progress toward maturity, although it has often made me the butt of jokes about the economists who so infuriated Harry Truman by saying, “On the one hand…but on the other hand.” When I tried to bring this conviction to life in my public television series, though, I learned the hard way that it's hard to persuade other people to see the world through the same lens.