3

When Our Leaders Don’t Notice

In 2005 Jamie Dimon, the influential and accomplished CEO of JPMorgan Chase, hired Ina Drew as chief investment officer, responsible for overseeing the bank’s risk exposure. In 2011, facing demands from shareholders to jump-start the bank’s profits in the aftermath of the financial crisis, Drew dropped Morgan’s requirement to sell investments when losses exceeded $20 million. Dimon, who had paid less and less attention to the CIO after watching a stream of large profits roll in, was unaware of the change. Drew had provided excellent returns for the bank, and Dimon encouraged her to boost profits further through greater risk taking. In February 2011, speaking to three hundred JPMorgan Chase senior executives, Dimon said that although times were tough, it was the job of the bank’s leadership to step up and be bold. He singled out Drew for praise on this count, saying, “Ina is bold.”1 Dimon, a busy leader focused on profits rather than oversight, saw Drew’s strengths and missed both her weaknesses and the bank’s.

Things began to unravel on April 4, 2012. That is when Dimon read an article in the Wall Street Journal about a JPMorgan trader in London, Bruno Iksil, who was making huge bets that exposed the bank to high levels of risk. On April 8 Drew assured Dimon and the operating committee of JPMorgan that the large trades would work out and were being properly managed. She argued that the Wall Street Journal story was “blown out of proportion.”2 Dimon accepted her answer and referred publicly to the trades as a “complete tempest in a teapot.”3

Yet large losses escalated as a result of Iksil’s trades, and inside the company Dimon blamed himself “for failing to detect the group’s exposure,” according to the Journal.4 On April 30 Dimon finally demanded that Drew show him the specific trading positions. His review made it clear that a huge problem existed, one that he undoubtedly would have caught earlier had he been paying attention to what Iksil was doing. In the second week of May the Journal reported that Dimon publicly admitted, “The last thing I told the market—that it was a tempest in a teapot—was dead wrong.”5 Dimon formally revealed the extent of the losses in a conference call on May 10, and he requested and accepted Drew’s resignation soon afterward.

How did this disaster occur? To start with, Iksil was part of the London unit reporting to Drew in New York, and the two offices did not get along. Iksil and others in London were “quants” who used sophisticated quantitative analyses to inform their investment decisions. Iksil’s job included constructing complex derivative bets on the direction in which the market was likely to move. It is not clear that Drew fully understood the quants’ methods, and it is even less clear that Dimon was monitoring Drew’s supervision of London. When Drew questioned Iksil’s superior in the London office about his positions, she often received ambiguous, incomplete answers. The London office dodged her questions, she did not press hard enough for clearer ones, and as a result she never understood the magnitude of the risk being taken.

Drew did know that she had been publicly praised by Dimon for her boldness. Unfortunately, the line between the kind of boldness Dimon admired in Drew and recklessness can be blurry. All indications are that both Drew and Dimon were unaware of the magnitude of the risk that Iksil was taking. The incentives to fully grasp the risk simply weren’t there.

Dimon told the U.S. Senate Banking Committee, “It morphed into something I can’t justify.”6 He also reportedly confided to his wife that he had “missed something bad.” In a later interview with the Wall Street Journal he put the same idea more artfully: “The big lesson I learned: Don’t get complacent despite a successful track record.”7 We can distill Dimon’s lesson even further: successful leadership is defined by vigilance. The lack of vigilance at JPMorgan Chase was amazingly costly. By September 2013 the estimated trading losses from this episode were $6.2 billion; Iksil was cooperating with prosecutors; his boss and one of his subordinates had been indicted; JPMorgan Chase had agreed to pay $920 million in fines to the Office of the Comptroller of the Currency, the U.K. Financial Conduct Authority, the Federal Reserve, and the Securities and Exchange Commission; JPMorgan Chase’s reputation was badly scarred; and the story was far from over. JPMorgan senior management had also been accused of hiding its losses from its own board’s audit committee.8

Leaders often fail to notice when they are preoccupied with other issues, when they are motivated not to notice, and when other people in their environment work hard to keep them from noticing. This chapter is about how leaders can overcome these threats to noticing.

JPMorgan can take little comfort in the fact that it is hardly unique. In 2008 the French bank Société Générale finally noticed that one of its traders had lost over $7 billion through a series of fraudulent trades. What is underappreciated in both cases is that the executives involved did not make the classic decision-making errors so well documented in behavioral decision research, behavioral economics, and behavioral finance; they were not undone by framing, anchoring, or the other familiar biases. Rather, they failed to notice that they lacked the oversight needed to identify when employees’ behavior had fallen dramatically outside the range of what their banks should tolerate. They failed to implement appropriate monitoring systems and failed to ask the right questions, thus neglecting a key aspect of effective leadership: possessing the knowledge needed to have confidence in the actions of one’s subordinates. These errors arose from a fundamental failure to notice that the most relevant information wasn’t in front of them, and from a failure to ask the questions that would have uncovered it.

It is the responsibility of leaders to notice when things are going seriously wrong in their organizations. Consider steroid use in professional sports, which has tarnished the reputation of nearly every sport to some degree. Major League Baseball is a telling case in point. By the late 1990s allegations of steroid use among players were widespread. Barry Bonds, Sammy Sosa, Roger Clemens, and Alex Rodriguez are just a few of the famous players implicated in the steroid era, which peaked between 1998 and 2001 but continued in the years that followed. Multiple players were indicted. One famous episode involved Barry Bonds, one of the greatest hitters in the history of the game and the son of a former all-star. When he was being deposed in conjunction with allegations that he had used steroids, Bonds was asked whether his trainer had ever injected him with a syringe. Bonds answered:

I’ve only had one doctor touch me. And that’s my only [sic] personal doctor. Greg, like I said, we don’t get into each other’s personal lives. We’re friends, but I don’t—we don’t sit around and talk baseball, because he knows I don’t want—don’t come to my house talking baseball. If you want to come to my house and talk about fishing, some other stuff, we’ll be good friends, you come around talking about baseball, you go on. I don’t talk about his business. You know what I mean? That’s what keeps our friendship. You know, I am sorry, but that—you know, that—I was a celebrity child, not just in baseball by my own instincts. I became a celebrity child with a famous father. I just don’t get into other people’s business because of my father’s situation, you see.9

This rambling, disjointed answer led to Bonds being convicted on charges of obstruction of justice.

Reporters have lined up to damn, and a few to defend, athletes like Bonds. What about the leaders who created the environment that led athletes to see steroid use as a rational course of action? The leaders who created the incentives for players to use steroids were never criminally charged, but they were responsible causal agents. During all the time they were benefiting from their players’ home runs, they failed to notice the obvious. The evidence of steroid use was abundantly clear in the changed physiques of players. Looking at more rigorous evidence, consider that from 1991 to 1994 (before the steroid era), the leading MLB home run hitter averaged forty-four home runs.10 By contrast, during the height of the steroid era, the number of players who matched or beat that average was ten in 1998, eight in 1999, six in 2000, and nine in 2001.11 These very basic data provided evidence that was clear to sports journalists and baseball fans. Yet team owners and MLB commissioner Bud Selig failed to notice it. Their motivated blindness was a failure of leadership.

Leadership comes with responsibilities. A critical one is noticing outlying evidence. If you see an anomalous trend, investigate until you get a clear answer. Daniel Kahneman notes that people too often act as if “what you see is all there is.” But it is the job of leaders to identify what information is needed and how to obtain it, rather than acting only on the information already in the room.

BOARD OVERSIGHT

The unique responsibility of corporate boards—the final authority on corporate governance for publicly traded firms—highlights the urgency of leadership-driven noticing. Far too many boards, of both for-profit and nonprofit organizations, fail to notice even the facts before them, let alone the information that executives may be hiding from them.

David B. Duncan, the head of the Arthur Andersen team that audited Enron, the American energy company, was far from an innocent character in the debacle of 2001 that ended with the dissolution of both companies. But to his credit, he did inform Enron’s audit committee, a subcommittee of Enron’s board of directors, in 1999 that Enron’s accounting was “pushing limits” and “at the edge” of acceptable accounting practices.12 During this time former Stanford University dean and accounting professor Robert K. Jaedicke was a member of Enron’s board of directors and the chair of the board’s audit committee. But neither he nor any other member of the committee requested more information about Duncan’s Enron audit or recommended a more prudent approach, according to the U.S. Senate Permanent Subcommittee on Investigations.

Enron’s audit committee met to receive updates on the firm’s audits only once or twice annually from 1999 through 2001. Despite his expertise and long tenure as chairman of the audit committee, Jaedicke rarely if ever had any contact with Andersen outside of official committee or board meetings, even though governance experts recommend such contact. After revelations of gross accounting irregularities came to light, ultimately resulting in Enron’s bankruptcy and criminal charges against several of its leading executives, the Senate committee concluded that multiple board members could have and should have prevented many of the fraudulent practices that led to Enron’s implosion. They had failed to ask Duncan and other auditors some simple questions. To take one example, in 2001 the board was told about a letter from an Enron whistleblower (later identified as Sherron S. Watkins), but none of the directors asked for her name or for a copy of the letter.

“By failing to provide sufficient oversight and restraint to top management excess, the Enron board contributed to the company’s collapse and bears a share of the responsibility for it,” the Senate committee concluded in its report. An attorney representing Enron’s outside directors (members of the board of directors who were not Enron employees), W. Neil Eggleston, called the committee’s report unfair, insisting, “This board was continually lied to and misled by [Enron] management.” It is quite likely that Enron executives did indeed lie to the board. But this explanation seems insufficient. Didn’t the board have a responsibility to follow up on Duncan’s warning and Watkins’s letter? Surely leadership requires us to question unusual patterns of data and demand the necessary information to reach accurate conclusions.

Assuming a board’s purpose goes beyond prestige, compensation, and rubber-stamping, the answer is obvious. The journalist John A. Byrne argued, “Shareholders have a right to expect directors, who at Enron were paid as much as $350,000 a year in cash, stock options, and phantom stock, to be engaged and active. They should be assured that directors will place investors’ interests above those of executives. And certainly, shareholders should expect that the board will follow up when outside experts appear before them to warn of potentially explosive danger. Yet Enron’s directors ignored warnings and heaped riches on executives time and time again.” Byrne went on to argue that Enron’s board was “recklessly negligent” and should be held “at least partly accountable and personally liable” for the company’s downfall, given the enormous losses faced by Enron investors.

The Enron board’s “see no evil, hear no evil” defense on Capitol Hill was an unacceptable dereliction of duty and a real-world example of perverted leadership. Unfortunately it is all too common for boards of directors to take a passive approach to corporate oversight. I know a CEO of a large nonprofit organization who has proudly stated that its board works for her; she doesn’t work for the board. Not only is this statement legally incorrect, but if her description of the organization’s leadership is accurate, then the board is failing to meet its fiduciary responsibilities.

More important, such board members are probably not overseeing the organization in a manner that allows them to notice information that is core to their legal obligations. Skilled, highly educated, and experienced boards charged with oversight responsibilities all too often fail to meet them. Board members often fail to realize that the CEO reports to them rather than vice versa and that they have a financial and moral obligation to provide oversight of the organization’s activities. Typically the president or CEO personally invites professionals to sit on the organization’s board (including boards of directors and advisory boards). Often the board meets irregularly and the CEO runs the meeting, determining what information he or she will share. Partially due to this structure, too many board members fail to fulfill their responsibility to notice and act on leadership failures in their organizations. Too often boards institutionalize patterns of behavior that create blinders, and these blinders lead them to miss critical information.

The Indian corporation Satyam, founded in 1987, was a phenomenal success, eventually supplying IT solutions to more than 35 percent of the largest five hundred companies in the world. At its peak Satyam employed nearly 50,000 people and operated in sixty-seven countries. As markets around the world collapsed in 2008, and India’s benchmark Sensex index fell from a high of over 21,000 to below 8,000, Satyam continued to report positive results during most of the year.13

The first hint of a problem came in October 2008, when the World Bank fired Satyam as a service provider and issued an eight-year ban on hiring the company. The World Bank claimed that Satyam installed spy software on its computers and stole some of its assets. Also in October, during a public conference call that Satyam held with stock analysts, one analyst drew attention to the large cash balances that Satyam’s owners were holding in non-interest-bearing bank accounts. The owners offered no explanation. Why would the owners allow large amounts of cash to sit passively in accounts that were not accruing interest? Moreover, why did the owners fail to explain this behavior when asked about it publicly?

A third hint that there were problems at Satyam came in December 2008, when its board of directors unanimously approved the purchase of Maytas Properties and Maytas Infrastructure, two companies that were unrelated to Satyam’s core business. Investors were outraged. As it turned out, Satyam CEO B. Ramalinga Raju’s family held a larger stake in Maytas Properties and Maytas Infrastructure than it did in Satyam. Some observers suspected that the transactions were an attempt to siphon money out of Satyam and into the hands of the Raju family. As a result of the public outrage, the owners aborted the transactions. A fourth hint was the fact that Raju’s own holdings in Satyam fell from 15.67 percent in 2005–6 to just 2.3 percent in 2009.14 Yet Satyam’s board appeared to notice none of these strong indicators of trouble at the company.

After the Maytas acquisition incident, analysts put sell recommendations on Satyam’s stock. Its shares dropped nearly 10 percent, four of the company’s five independent directors resigned, and on December 30, 2008, Forrester Research advised its clients to stop giving IT business to Satyam because of growing suspicions of widespread fraud. Satyam hired Merrill Lynch for advice on how to stop the free fall of its stock price. Eight days later Merrill Lynch sent a letter to the stock exchange stating that it was withdrawing from the engagement because it had uncovered material accounting irregularities.

On January 7, 2009, Raju confessed to Satyam’s board that he had been manipulating the company’s books for years. He eventually confessed to overstating assets on Satyam’s balance sheet by $1.47 billion. Indeed the company had overstated income virtually every quarter for several years. According to Raju, the manipulation had started out small but grew larger over the years. “It was like riding a tiger, not knowing how to get off without being eaten,” he said.

While Raju was the primary individual responsible for the fraud, Satyam’s auditors and board also bear responsibility for failing to see the obvious signs of wrongdoing, and they have been sued by investors for their lack of adequate oversight. Big Four accounting firm PricewaterhouseCoopers (PwC) audited Satyam’s books from June 2000 until the fraud admission—about nine years. Interestingly, Satyam paid PwC about twice the industry’s normal audit fees.15 If one of the four major audit firms overlooked such extreme fraud for nine years—fraud that Merrill Lynch detected in less than ten days of due diligence—it seems reasonable to question the effectiveness of the auditing industry. In the next chapter we will discuss audit failures as a form of not noticing, but it is also important to recognize that Satyam’s board didn’t detect the fraud either.

Most tests of board responsibility are more ambiguous than the outrageous episodes at Enron and Satyam. I have served on a number of nonprofit boards and have experienced the constraints and gray areas of detecting wrongdoing. Here is one personal anecdote that showcases the opportunities and risks associated with noticing. The president and founder of a nonprofit organization asked me to serve as a member of its board. The president was a good person who genuinely wanted to make the world a better place. He did good work, and he had a highly optimistic view of the impact that the organization could have on the world. This optimism, however, spilled into a tendency to make financial overstatements that would help the organization look stronger to outside constituencies. From my perspective, these claims seemed, at the very least, technically false. I spoke up at board meetings, calling some of the president’s actions unacceptable. In response the organization backed off from the specific statements but then engaged in another, similarly questionable practice—always with the goal of doing good. Eventually I could no longer accept what was occurring, and I resigned from the board, losing a friend in the process.

This was not a happy episode for me, but I felt comfortable with my decision. If I had continued to expect the organization to reform itself, I would have been guilty of many of the accusations that I have made against other boards in this chapter. At the same time, I am not holding myself out as exemplary for anything other than noticing and acting on what I noticed. Indeed, if leadership entails encouraging systemic change, my need to resign highlights my inability to get the organization to reform. If it had changed, I would have had no need to resign.

Before we turn to solutions, one more example shows how systemic leadership failure can arise when the failure to notice becomes routine throughout an industry.

REGULATORY OVERSIGHT

In June 2012 Barclays became the first bank to admit to fraudulently manipulating Libor, the London Interbank Offered Rate. Barclays paid $450 million in fines for its actions. For many people, even those with business sophistication, this news story lacked pizzazz. But the Libor scandal was a big deal, worthy of more attention than it received. The goal of Libor is to provide a fair, efficient benchmark for the short-term rates banks charge one another. Numerous other rates are then pegged to Libor, including the rates that banks charge customers for car loans, student loans, mortgages, and so on. Libor’s influence extends far beyond the United Kingdom, significantly affecting, for instance, rates on adjustable-rate mortgages in the United States.

The way Libor is calculated is straightforward but fundamentally flawed. Just after 11 A.M. each trading day, traders at the leading banks across the globe report the interest rates at which they claim their bank could borrow money. This is not a measure of the rate at which they actually borrowed money or the rates they charged other banks; rather it is each bank’s subjective, self-reported estimate. The highest and lowest quarter of estimates are discarded, and the rates from the middle half are averaged. This calculation is made for ten currencies and fifteen different loan durations; thus, 150 Libors are calculated each day. These rates are then applied to roughly $360 trillion in assets. This means that if all Libors were lowered by 1/10 of 1 percent on average for a year, annual interest payments on those assets would fall by about $360 billion. The actual cost to U.S. states, counties, and local governments alone from the manipulation in the Libor scandal is estimated to be $6 billion.16
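To make the mechanics concrete, here is a minimal Python sketch of the trimmed-mean calculation described above. The submissions, the number of contributing banks, and the function name are illustrative assumptions rather than actual bank data or the official fixing procedure.

def libor_fixing(submissions):
    """Average the middle half of submitted rates, discarding the
    highest and lowest quartiles (the pre-reform Libor methodology)."""
    rates = sorted(submissions)
    trim = len(rates) // 4                     # size of each discarded tail
    middle = rates[trim:len(rates) - trim]     # keep the middle half
    return sum(middle) / len(middle)

# Sixteen hypothetical submissions, in percent.
submissions = [2.10, 2.11, 2.11, 2.12, 2.12, 2.13, 2.13, 2.14,
               2.14, 2.15, 2.15, 2.16, 2.17, 2.18, 2.20, 2.25]
print(round(libor_fixing(submissions), 4))     # 2.14

# The scale described in the text: a 0.1 percentage point shift applied
# to roughly $360 trillion in linked assets changes annual interest
# payments by about $360 billion.
print(360e12 * 0.001)                          # 3.6e11, i.e., $360 billion

Because only the highest and lowest quartiles are discarded, a bank whose shaded estimate lands in the middle half moves the published rate directly, and a group of colluding banks can move it much further; that is precisely the vulnerability the scandal exposed.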

Barclays placed bets on the direction Libor would move and then reported its own rates in a biased manner to help those bets. At other times Barclays intentionally lowered its estimates in order to create predictability and stability for its own internal rates. Barclays was not the only financial organization involved in the manipulation, and ample evidence exists of collusion across banks to manipulate Libor. Academic research suggests that Citibank’s underreporting was 50 percent greater than that of Barclays and that the Royal Bank of Canada was the most extreme manipulator.17 In December 2012 UBS agreed to fines of $1.5 billion over similar allegations, and criminal charges were filed against individuals at several banks.

Royal Bank of Scotland (RBS) trader Tan Chi Min admitted that RBS knew about and was involved in a cartel formed to manipulate Libors. In this transcript of instant messages among traders, Jezri Mohideen, the head of yen products for RBS in Singapore, asks to have the Libor fixed:

Mohideen: What’s the call on the Libor?

Trader 2: Where would you like it, Libor that is?

Trader 3: Mixed feelings, but mostly I’d like it all lower so the world starts to make a little sense.

Trader 4: The whole HF [hedge fund] world will be kissing you instead of calling me if Libor move lower.

Trader 2: OK, I will move the curve down 1 basis point, maybe more if I can.

Then there is this exchange between Tan and Deutsche Bank’s Mark Wong, which highlights the negative impact of Libor manipulation on outside parties:

Tan: It’s just amazing how Libor fixing can make you that much money or lose if opposite. It’s a cartel now in London.

Wong: Must be damn difficult to trade man, especially [if] you [are] not in the loop.18

From a regulatory standpoint, it is incredible that an international system so obviously ripe for corruption was allowed to operate. The very banks that stood to benefit from rate manipulation were in control of setting the rates. How could regulators fail to see how easy it would be for the banks to manipulate rates for their own benefit and at the expense of society? And how could key regulators stay quiet about systemic fraud at the heart of global finance?

In the aftermath of the scandal, there have been logical proposals for reform, including the requirement that bank submissions to Libor be based on actual interbank deposit market transactions rather than subjective reports.19 There have also been recommendations for criminal sanctions for manipulation of benchmark interest rates such as the Libor. But this leaves unanswered the most glaring question: Why was a disaster required for regulators to recognize the need for such commonsense changes?

Before Timothy Geithner became the U.S. Treasury secretary, he was president of the Federal Reserve Bank of New York. While holding that position, he wrote to the head of the Bank of England as early as 2008 to make recommendations for Libor reform. But his recommendations were vague and failed to convey the seriousness of the problem. In the Guardian, journalist Naomi Wolf wrote of Geithner, “It is very hard, looking at the elaborate edifices of fraud that are emerging across the financial system, to ignore the possibility that this kind of silence—‘the willingness to not rock the boat’—is simply rewarded by promotion to ever higher positions, ever greater authority. If you learn that rate-rigging and regulatory failures are systemic, but stay quiet, well, perhaps you have shown that you are genuinely reliable and deserve membership in the club.”20

This argument sounds quite cynical. Unfortunately it parallels the evidence that emerged in the Libor case. The banks’ failure was a moral one: they intentionally distorted the rates for their own benefit. The failure of regulators worldwide was a failure to notice that the system was corrupt and in need of reform. Clearly, allowing banks to self-report subjective rates biased toward their own interests is no way to establish an honest benchmark.

This story documents the need for better regulation of financial markets—not more regulation, but wiser regulation. Regulators and policymakers need to consider how actors in the market can be expected to play their roles, given their self-interest—that is, how their actions can distort markets or otherwise take advantage of those who do not have a voice.

ORDINARY LEADERSHIP

It isn’t difficult to reflect on noticing failures that have been well documented in the press; after all, the data are easily available. It is much harder when we are providing oversight ourselves—whether of our children, our employees, or our peers—and things just don’t seem right. Typically we ignore the accumulating data or simply decide that we lack sufficient evidence to badger other people for the information that could reveal the truth. Through our silence and complacency we accept and promote corruption.

The press frequently reports on massive cheating scandals in colleges and universities, including my own. This reporting focuses on the final act: the actions of the students. Those actions are truly unfortunate, but the press underreports the role of the leaders—teachers and administrators—who have overlooked the conditions, norms, and incentives that create an environment in which cheating can take place. I believe it is the job of leaders to notice these conditions before a scandal occurs and to reform the organization, rather than focusing only on the students engaged in dishonest behavior.

In the next chapter I will continue to explore corruption in academia (among other topics). My focus will be on the faculty in an effort to better understand how easily a corrupt system can develop when leaders don’t notice potential pitfalls.