CHAPTER 8
Provide Organizational Supports for Teamwork

If you doubt that the organizational context is critical for team behavior and performance, take a brief break from your own intelligence team and join me in the cockpit of a commercial airliner for a trip from Washington to Chicago.1 Even before the captain and first officer meet, the organizational context has exercised hidden influence on team dynamics—for example, through the Crew Resource Management (CRM) training they both had received and through the airline’s crew composition system.

The captain has not previously flown with either the first officer or the lead flight attendant. Before heading to the aircraft, she checks in at the operations desk and picks up the paperwork for the flight. She glances at the weather (good, except that some nasty-looking thunderstorms are developing in western Pennsylvania) and examines the fuel load (a little low, she thinks, since the thunderstorms may require holding or a diversion, so she takes the unusual step of ordering another 4,000 pounds).

Then it is off to the aircraft, where she will brief the other crew members. They are already there, along with a surprise guest—a Federal Aviation Administration (FAA) inspector who will be in the cockpit jump seat for the flight. Once the briefing is completed (about which more in the next chapter), the crew begins to get the aircraft set up for departure. While the first officer is entering departure data into the flight management computer, a call comes from the redshirt, a ground staff member charged with expediting the departure. “It will be just another five or six minutes,” he says, “while they load some late-arriving bags.” The captain notes that it actually takes ten minutes but no matter, there still should be no problem arriving in Chicago on time. Checks completed, the captain listens to the Automatic Terminal Information Service (ATIS), a continuously updated report on the latest local weather and runway information, and then checks the en route weather one last time. Nothing unexpected from ATIS, but the Pennsylvania thunderstorms appear to be intensifying. She is glad she ordered the extra fuel.

When all is ready, the captain radios ground control for permission to push. Once the tug has moved the aircraft back from the gate, the captain acknowledges the salute from the marshal on the ground signaling that all is clear, and calls for engine start. As the captain navigates the plane through the airport’s maze of taxiways, the first officer, who actually will fly this leg of the trip, runs the final pre-flight checklist. When the tower issues takeoff clearance, the captain turns the aircraft onto the departure runway and says, “It’s your airplane.” The first officer takes control, advances the throttles, and the plane begins its roll.

As soon as climb is established, the first officer announces “positive rate, gear up” and the captain moves the landing gear lever to the “up” position. Three lights on the cockpit panel—one each for nose gear, left main gear, and right main gear—turn yellow and then, one by one, green, signifying that the gear has been successfully raised and stowed. The flight is passed off from controller to controller and eventually settles into cruise at 35,000 feet. Only a small deviation from the flight plan is needed to avoid a cell in the Pennsylvania thunderstorms, which the en route controller immediately approves. All is normal, all is routine.

The excitement comes on final approach. When the first officer calls for gear down, there is an audible thump, the plane briefly yaws—and the indicator light for the right main gear remains yellow. “You fly the airplane; I’ll work the problem,” the captain instructs as she tries, unsuccessfully, to cycle the gear. She then calls approach control to report the problem and terminate the approach. The controller issues a vector that takes the plane out over Lake Michigan where it will be clear of other traffic.

The crew first brings up the gear problem checklist on the aircraft computer and goes through each step specified. Although resetting the circuit breaker for the indicators does not extinguish the offending light, visual inspection through a viewport in the cockpit floor shows that the gear is down—but there is no way to know if it is locked. The captain asks the FAA inspector if he has any questions or suggestions, but he does not. She then radios the airline’s dispatch desk to report the problem, followed by a call to company maintenance for consultation about next steps. After some discussion, a maintenance supervisor calls a technical specialist with the aircraft manufacturer and patches him into the conversation, but the additional checks and procedures he suggests do not help. Eventually, all agree that everything that can be done has been done and that the crew should attempt a landing.

The captain tells the lead flight attendant to prepare the passengers for an emergency landing and calls approach control to request a straight-in approach to the airport’s longest runway, with fire equipment standing by just in case. Then she takes the airplane and flies the approach. The gear holds, the landing is routine, and the crew hears sustained applause from the cabin as the aircraft rolls out.2 After the passengers have departed, the FAA inspector turns to the crew and says, “Nice flying, guys, very professional.” The captain nods and responds, “All in a day’s work.” Pilots with the right stuff don’t need compliments, especially not from outsiders. Or at least they act as if they don’t need them.

There is no way to understand this crew absent knowledge about the context within which it was operating. Just look at the four aspects of a team’s context that research has shown to be especially consequential for team behavior and performance.3

1. Access to the information a team needs to accomplish its work. Many different kinds of information affected how this crew operated. Its very composition was shaped by a personnel database that specified who could be rostered to fly together. And then, once the crew began its work, members had access to a rich array of data for use in monitoring the situation and making decisions about how to proceed. There was up-to-the-minute information about the weather, about the status of airport operations, and more. There was the expediter who kept the crew informed of departure preparations and the marshal who made sure the plane was not going to hit anything during pushback. And there were data from various electronic and mechanical systems on the aircraft itself, including the troublesome landing gear indicator lights.

2. The availability of educational and technical resources to supplement members’ own knowledge and skill. There was the CRM course members had taken that helped them hone their teamwork skills. There was advice from company dispatch and maintenance staff, and then from the aircraft manufacturer’s technical specialist. There was the computerized manual aboard the aircraft that guided the team in diagnosing its landing gear problem. And there were the many checklists, each one carefully developed and refined through experience, that lessened the chances of operational oversight.

3. Ample material resources for use in carrying out the team’s work, ranging from the availability of extra fuel on demand to the presence of fire equipment at the arrival airport.

4. External recognition and reinforcement of excellent team performance. In real time, there was just the appreciation of the passengers and the compliment of the FAA inspector. We do not know if airline management subsequently recognized the crew’s exemplary performance, but it is likely that the chief pilot, at least, would have commended the crew.

To remove the four contextual supports just listed is to invite trouble. Even a team whose members are highly motivated cannot succeed if it is unable to get the information, the tools, the resources, and the support it needs to perform well. That is just as true for teams in the intelligence community as it is for crews that fly airplanes. The best leaders of intelligence teams, therefore, give close attention to those often-hidden contextual features that most powerfully affect team behavior and performance. These leaders do whatever they can to secure the resources and support their teams need, and they use their influence to remove contextual roadblocks. As will be seen next, that can be a considerable challenge in the intelligence community, but it assuredly is one well worth taking on.

Information

The perversity is that the very business of intelligence is information, but informational supports for teams that do intelligence work can be hard to come by. Sometimes the information a team most needs to proceed smartly with its work cannot be obtained at all. At other times, a team may be flooded with such vast quantities of undifferentiated information that members are overwhelmed. At still other times, seemingly relevant information may be available but at a time or in a format that makes it all but unusable.

We saw in Chapters 3 and 7 that having a task-appropriate way of proceeding with the work—that is, a good task-performance strategy—is key to team effectiveness. And coming up with a strategy that works depends not just on having norms that support the search for one but also on access to concrete information about the performance situation. Consider, for example, the futility of trying to develop an endgame strategy in a basketball game without knowing the score or the time remaining, or the frustration of a cockpit crew that must decide whether to fly through or around a cumulus buildup without access to current weather data.

The same is true for intelligence teams. If an analytic team does not know who will be using the assessments it generates and what the user will do with them, members will be flying just as blind as that cockpit crew. For an operations team, the less information members have about the obstacles they will have to overcome the more they will have to improvise in real time rather than execute a well-considered plan—always a risky proposition. Similarly, a science and technology team that is developing a new collection device risks making poor design decisions if members do not have trustworthy data about the environment where the device will be deployed. And a team designing a training program for new intelligence officers may generate a curriculum that fails to meet trainees’ educational needs absent information about their existing competences and previous experience. No matter what a team’s task, devising or selecting a task-appropriate performance strategy requires information about contextual requirements, constraints, and opportunities.

Why do intelligence teams so often have trouble getting the information members need to develop an appropriate performance strategy? For starters, there are some things that no one can know. In such circumstances, there is no alternative but to make the best possible estimate and then to develop a strategy that keeps open many different options for proceeding. At other times, the information a team most needs is potentially available—but it has not been collected. And at still other times, the information that is most readily available to the team turns out to be disinformation planted by adversaries. Distinguishing between information that can be trusted and that which is designed to mislead can require high-level tradecraft.

The most frustrating and, unfortunately, most common informational problem intelligence teams face occurs when the needed information is available within the community but the team cannot get its hands on it. This difficulty often appears to stem from incompatible information technologies, but its roots actually go deeper than that. In fact, the technologies and safeguards needed to support what is known as trusted information sharing across systems have been available for some time. The problem, then, is less that the information systems of different intelligence organizations cannot talk to one another than that people in those organizations choose not to do so. As one senior intelligence officer explained, community members make a sharp distinction between “us” and “not us” in deciding what information can be shared. And if “not us” becomes “them,” as is the case for the relationships between some intelligence organizations, then the protective wall around “our” information can become opaque and impermeable.

This problem is exacerbated when the main intelligence work to be done has been chopped up into small pieces and assigned to different individuals or teams, with the overall product to be assembled from those separately produced components (see Chapters 4 and 5). Consider, for example, the difference between a watch team that does nothing beyond monitoring and a unit such as the CIA’s Counterterrorism Center, which, as of this writing, has a larger and more integrated responsibility that extends all the way from collection through analysis to operations. When a team does but a thin slice of the work, members necessarily are dependent on others for the information they need to plan and execute their work. Moreover, the ability of the next team down the line to do its work depends on how promptly and competently the earlier team passes on whatever it has learned.

It gets worse. The widespread tendency in the community to protect information by classifying it or placing it in a restricted compartment often makes it nearly impossible for a team to get information that it needs for its work. To be sure, it often is possible for a team to develop and implement an appropriate performance strategy based solely on readily accessible information. One of the quirks of intelligence work is that information obtained at great risk or expense, or that is labeled as secret and kept in a hard-to-access compartment, is viewed as far more valuable than even highly trustworthy data from open sources. Still, community practices about work design (split the task up into small pieces) and information protection (do not give teams access to sensitive data not directly relevant to their specific part of the work) have the unintentional consequence of occasionally requiring teams to work in the fog, unable to obtain what they need to devise and implement the best possible strategy for accomplishing their overall mission.

At a meeting of an intelligence community advisory panel, an external advisor expressed dismay about what he viewed as the rampant over-classification of materials. Might it be possible, he asked, for most materials to be kept in the open and therefore readily available to other community members who need them for their work, but also to protect more vigorously than ever the smaller set of materials that really do need to be kept secret? During a break, a contractor told the advisor that his proposal showed just how uninformed he was about the realities of intelligence work. Interestingly, a senior intelligence officer who also was present noted that he once had made an almost identical proposal—which, as near as he could tell, had had no impact whatever. The culture of secrecy that pervades the community is strong, self-perpetuating, and occasionally counterproductive.

Although it is true that the organizational context can put informational roadblocks in intelligence teams’ way, teams themselves share responsibility for not having the information they need to plan and execute their work. Members need to take the initiative to figure out whom to ask for the needed information, how to request it in a way that increases the chances of a favorable response, and how to frame questions so they can be answered. Nothing constructive is accomplished when team members merely complain among themselves that nobody has collected what they most need. Nor is much to be gained from a request to “Give us whatever you’ve got on that,” since the response can be a dump of undifferentiated data that can overwhelm or misdirect the team. And when a team does find that it has access to lots of potentially relevant information, as was the case for the blue team in the PLG simulation described in Chapter 1, members must resist the temptation to scoop up everything they can get their hands on. What actually will be most helpful in planning and executing its work, in many cases, is something other than that to which the team has the easiest access.

The best intelligence teams know, or take the initiative to find out, where they can get the information that they most need—including from sources outside their own organizations.4 They know how to frame questions to increase the chances of obtaining high-value information rather than whatever is easiest for the provider to supply. They regularly activate what is colloquially known as the sneaker net, using personal contacts to access information because they prefer the richness of direct human exchange even when technical means are available to obtain the same thing. At times they even may mount a little op to get others in the community to provide data they need. All of these strategies, and more, are employed by teams whose members know from experience how critical it is to base their performance strategies on trustworthy information about the task and situation.

Technical and Educational Resources

Even well-composed teams rarely consist of members who collectively have the full extent of knowledge, skill, and experience needed to carry out the team’s work. Among the supports that intelligence organizations can provide to teams, therefore, are technical and educational resources that offer assistance with any aspects of the work that exceed members’ existing capabilities—including, if necessary, the honing of members’ skills in working collaboratively on collective tasks.

TECHNICAL TOOLS.

Intelligence teams use a wide variety of technical devices and computer software in their work, ranging from sophisticated collection technologies to devices for real-time communication in field operations to spiffy visualization programs.5 Other tools, useful for helping teams tap into databases of existing knowledge or for seeking assistance from other intelligence professionals, include Intellipedia (modeled on Wikipedia) and A-space and C-space (collaborative workspaces for analysts and collectors, respectively, modeled on MySpace). As additional tools of these kinds come online, it will become increasingly common for teams to participate in communities of practice that extend across the full range of intelligence agencies.

The degree to which teams actually use the tools available to them, however, depends considerably on how those tools are made available. There is a world of difference between “We’ve put some great new software on your desktops that you and your teammates can use to coordinate your activities—give it a try, you’ll really like it” and “Can we talk to your team about how you work together, see what’s getting in your way or slowing you down? Maybe there are some tools out there that you’d find helpful.” Those who develop some new device or software understandably believe in its value and want to see it used. But those who are doing front-line intelligence work want mainly to get on with it and therefore may be quick to dismiss anything that seems unlikely to provide immediate help. This difference reflects the commonly observed tension between “push” and “pull” in the development and deployment of technological, consultative, and educational resources. How, providers ask, can we make our offerings so attractive that people will break down doors to get them? But why, users respond, can’t we get the one thing we actually need to get over this particular hump?

The tension between providers and users becomes especially pronounced when emerging technological capabilities are at the cutting edge. In such cases, providers can become so enchanted with what they are creating that they lose sight of users’ actual needs and wind up producing something that is more elegant than practical. And, for their part, users can become so entrenched in the way they are already working that, even when asked, they are unable to envision a tool that might help them work together better. At this writing, Google Wave, a browser application that combines multiple forms of communication into a unified user interface, has just been announced. That application appears to have considerable potential for facilitating coordination among members of dispersed intelligence teams. But how likely is it that one of those teams would have come up with the idea for the Wave and asked community developers to generate such an interface? Not very.

The can-do attitude that pervades the intelligence community, as admirable as it is, does lessen the likelihood that front-line teams will seek assistance even to solve problems for which team members’ own capabilities are limited. Rather than ask for help, which may involve entering a bureaucratic labyrinth that yields nothing useful, people just soldier on, developing their own workarounds as needed. It usually falls to the team leader, therefore, to facilitate the relationship between those who provide technical or consultative assistance and the teams that could be helped by what they have to offer. Good team leaders get providers to realize that it is just as important to be responsive to teams’ immediate needs as it is to deploy aids that incorporate the most recent or most elegant innovations. They also help users realize that an investment in learning something new and unfamiliar actually can pay off handsomely over the longer term.

EDUCATIONAL SUPPORT.

The importance of educational support for work teams is vividly illustrated by what transpired at a semiconductor manufacturing plant that changed from an individual to a team production model. Prior to the introduction of teams, each individual worked at a separate station, performed one small part of the overall task, and followed detailed work procedures that others had designed.

That all changed when production teams were formed and given primary responsibility for productivity and quality. Previously, company engineers could stop the line at any time to adjust the technology or smooth out production processes. Now, because teams were in charge of their own production, engineers had to work directly with them to identify non-disruptive times to make technical or process modifications. And, in the course of those interventions, they occasionally took the time to explain aspects of the production process to team members. Relationships with maintenance staff also improved. Previously, production workers had to call the maintenance office when equipment broke down and then wait for a technician to show up and make the repair or adjustment. Now, each production team included an adjunct member from the maintenance department, who often showed members how they could make routine adjustments themselves. That resulted not only in speedier fixes of malfunctioning equipment but also in improved relations between the two groups.6

Because team development was a long-term project at the semiconductor plant, there was plenty of time available for these educational activities. But sometimes there is no time to seek outside assistance or consultation—for example, when a flight-deck crew encounters windshear on final approach, or when a patient with a life-threatening trauma arrives at a hospital emergency room, or when an intelligence team is tracking a rapidly unfolding terrorism situation. In such cases, team capabilities cannot be developed gradually through members’ interactions with specialists; they must already be available when the crisis hits.

In aviation and emergency medicine, highly realistic simulations are used to help team members develop the skills needed for competent real-time responses under pressure. The training intelligence professionals receive, however, is mainly (and in some areas, exclusively) focused on individuals. Initial training for analysts, for example, helps individuals develop their analytic skills, such as the proper use of various structured analytic techniques. Similarly, the training of clandestine officers emphasizes individuals’ mastery of the tools and techniques of their trade, such as how to plan and execute a bump meeting or how to get off the “X” when things go bad.

Here’s the problem. Just because a team is composed of individual members who have finely honed skills does not mean that the team will operate smoothly or well. Indeed, all members of the PLG blue teams described in Chapter 1 were highly skilled, but nonetheless they had great difficulty working together. We saw the same thing in our experimental laboratory: Teams that included members with very high task-relevant expertise were far less likely than others to figure out what some would-be terrorists were planning unless they also received an intervention that helped them use that expertise well.

Managers at the semiconductor plant understood that technical expertise is not enough, so they launched a training program specifically designed to help team members develop their skills in collaborating, sharing leadership, and managing relationships with other groups. This kind of teamwork training, if competently designed and executed, can significantly enhance members’ ability to work together to achieve collective purposes.7 The CRM training received by the pilots in the example that opened this chapter included high-fidelity simulations of line flights to help pilots learn and practice effective strategies for working together. That approach now has diffused to healthcare organizations, especially operating room and trauma teams.8

As of this writing, relatively little simulation-based training in teamwork skills is being carried out in the intelligence community. Yet when high-fidelity simulations are used to help participants explore team dynamics that affect their performance, as is done in PLG simulations and in some field training problems for paramilitary special operations teams, intelligence professionals learn a great deal. If training to help participants develop and practice positive teamwork skills were more widely available, it surely would generate at least modest improvements in the quality of teamwork throughout the community.

Material Resources

Imagine an intelligence team that has just the right number and mix of members to accomplish a highly consequential purpose. That team, moreover, is well structured and supported—it has constructive norms of conduct, access to the information members need to plan an appropriate performance strategy, and the ready availability of any consultative or educational assistance that members may need to cover gaps in their own capabilities. This team, then, has in place all the conditions discussed so far in this book. Members are eager to plunge into the work and they expect to turn in a fine performance.

Now reflect on what would happen if the team could not obtain the mundane material resources needed to actually execute the work—the money, people, space, transport, equipment, or whatever else is required. It would be like getting all dressed up to go to a long-anticipated concert and having the car break down as you pull out of the driveway. Team failures that result solely from scarcity of material resources are among the most distressing that one observes in organizational life.

When teams are poorly resourced, members generally try to do the best they can with whatever is available. Some teams may take a more proactive stance and try to secure the needed resources on their own, either through normal bureaucratic channels or by what sometimes are referred to as “other means.” If resource insufficiency is chronic, however, the frequency of such initiatives tends to lessen and more serious consequences appear: cynicism and, eventually, motivational disengagement.

Resource munificence can be nearly as problematic as resource scarcity. For tasks that managers view as highly important, teams may be provided with resources that far exceed what they are likely to need, especially if they will do their work far from headquarters. “You can have whatever you require, just say the word,” the team is told. “And, just to make sure you are not held up in your work, here are all the funds and helpers you possibly could need. Go to it, and just send back anything you wind up not using.” As lovely as that little speech may sound to team members, it has a hidden downside. Precisely because the team has more resources than it actually needs, members are not forced to consider the trade-offs among alternative strategies for carrying out the work. So there is little chance that they will invent a creative and cost-effective way of proceeding of the kind sometimes seen in teams that operate in resource-poor contexts.

To competently resource a task-performing team is to walk a fairly narrow balance beam—making sure the team has ready access to those resources that really are essential for its work but not providing such munificence that members are tempted to mindlessly adopt what may be a suboptimal strategy. The best team managers develop a finely honed sense of how much is enough, and how much would be too much. If available resources are insufficient, they do whatever they can to secure more—including exercising political influence if that is what it takes to get a team what it needs. But if a team already has more than enough, they may actually limit access to some resources in hopes of prompting members to think creatively about how they might do more and better with less.

Recognition and Reinforcement

The team has finished its work—the estimate has been written, or the operation is complete, or the data have been retrieved, or the device has been tested. The project was a success. Did anyone notice?

Intelligence professionals, like the pilots of the flight with the landing gear problem, tend to deflect proffered praise (“It’s all in a day’s work”). In fact, we all appreciate being acknowledged and recognized for our accomplishments. No matter how much we may claim otherwise, that is how we are wired. Even a small acknowledgment from the client who received the work, or from the official who commissioned it, can make a large difference to a team and its members. And if that does not happen, one of the more constructive things the team’s manager can do is prompt the client to let the team know that its work made a difference. It is the recognition that counts, and for teams of professionals positive words count just as much as, or more than, tangible tokens such as certificates or cash. Indeed, managerial practices that make money a salient feature of the work context foster both feelings of personal self-sufficiency and a wish to be free of dependence on others—not a state of affairs conducive to teamwork.9

There are three ways to go wrong in providing recognition and reinforcement for task-performing teams. The first is to ignore successes (“That’s what we expect from our teams so further comment is unnecessary”) but to call out failures. That strategy, of course, encourages risk aversion, a stance inconsistent with what is needed to accomplish many intelligence tasks. Indeed, one of the most well-established principles of human psychology is that positive reinforcement is a powerful tool for shaping behavior, whereas punishment fosters either withdrawal or variation in behavior as people try to head off aversive outcomes.

A second way to go wrong is to identify the person whom one views as mainly responsible for the team’s success and single out that individual for special recognition. The most extreme version of this error occasionally is seen in business organizations when team members are put in direct competition with one another for financial rewards from a fixed pool. Such practices divert members’ attention from the team’s work, refocus it on monitoring who gets what when rewards are distributed, and undermine relationships among members who are supposed to be working together to accomplish collective purposes. Because intelligence organizations rarely rely on financial compensation to foster work motivation, they are protected against the worst of these problems. Still, individual team members sometimes are singled out to receive recognition for what actually was a team accomplishment, and even these small events can send a large signal about the actual importance of teamwork in the organization. If a team did the work, then the team should get the recognition.

The third way to go wrong may seem inconsistent with the whole idea of recognizing and reinforcing team accomplishments. It is this: Provide specific, concrete objectives for the team’s work, reinforced by strong incentives contingent upon achieving them. Without question, that state of affairs will foster strong, outcome-focused motivation. But it also risks inviting some unintended and unwanted secondary outcomes, such as tempting performers to compromise their normal ethical standards.10 During the Vietnam conflict in the early 1970s, for example, military performance was assessed in part by body counts. Field commanders were required to regularly report the number of enemy dead, and it was made clear to them that big numbers were what was wanted. So big numbers were what got reported back to Washington, where policymakers based strategy decisions on data that at best were unreliable and occasionally were entirely imaginary.11 The temptation to do whatever has to be done to achieve specific, heavily incentivized targets, even if that involves overlooking troublesome data or violating expected standards of behavior, can be hard to resist.

These three caveats suggest that providing performance-contingent rewards for team accomplishments must be done thoughtfully, taking full account of the potential for unintended secondary consequences. It remains true, nonetheless, that team-focused recognition does sustain collective motivation and, at the same time, encourages members to think of “us” rather than “me.” The key is to make sure that the primary source of motivation for teamwork is the importance of the team’s purpose, not the prospect of obtaining some tangible reward, and certainly not the hope of winning a competition with another team for the ear of a policymaker (see Chapter 11). Ultimately, members of the team itself know better than anyone else who was especially helpful in accomplishing the work, and spontaneous recognition from one’s professional peers usually matters more to those who receive it than anything the broader organization could provide.

Conclusion: What Is a Leader to Do?

Transactions between teams and the contexts within which they operate are an integral part of everyday group behavior.12 Although both team members and managers tend to view the context as “just the way things are in this organization,” contextual features actually are highly consequential for team behavior and performance. We all have seen how smoothly things unfold when just the right resources and support become available to a team at just the right time (“the operation ran like clockwork!”). We also have seen how problems accumulate when contextual supports are unavailable, when they take too much time or effort to obtain, or when they arrive too late to be of any use. A key responsibility of team leaders and managers, then, is to do whatever they can to provide a context that supports rather than frustrates their teams.

In practice, leaders and managers vary greatly in how they deal with their teams’ contexts. Some simply muddle through, addressing context-driven problems as best they can only after they appear. They do nothing in advance to minimize the frequency or severity of those problems or to proactively create contextual supports for their teams. Others focus their attention mainly on buffering their teams from the worst of the external roadblocks that get in the way, calling in favors from managerial colleagues whenever possible to get those roadblocks removed.

The best team leaders and managers go further. They make careful assessments of what their teams may need and then use their own authority, persuasion of managerial colleagues, and even political action to create a context for their teams that is as supportive as they can make it. That takes managerial and interpersonal skill, to be sure. But it also requires rejection of the cynical view that there is nothing anyone can do to change how the organization works. In fact, there always are some things one can do to strengthen a team’s organizational context, and the most effective managers marshal whatever resources they can command to do it.