Almost everyone hates meetings, and yet the idea of doing away with them is seen as revolutionary, or ridiculous. Jim Buckmaster, chief executive of the hugely successful website craigslist.org, has a simple policy – ‘No meetings, ever’ – but if you’re a manager, you’re probably already thinking of reasons why you couldn’t do the same. An important recent book, Why Work Sucks and How to Fix It, proposes a total shift in how we think about office life, but one part is considered so startling, it’s singled out on the cover: ‘No meetings.’ It has been reported that senior executives find at least half of all meetings unproductive.61 Yet still they happen. ‘Meetings,’ writes the humorist Dave Barry, ‘are an addictive, highly self-indulgent activity that corporations and other large organisations habitually engage in only because they cannot actually masturbate.’
Why Work Sucks and How to Fix It reports on an experiment undertaken at the US electronics chain Best Buy: a ‘results-only work environment’, in which staff could work where and when they liked, so long as their jobs got done. The first casualty was meetings. ‘Why do we spend so much of our business life talking about the business we need to take care of?’ the authors write.
There are several reasons why meetings don’t work. They move, in the words of the career coach Dale Dauten, ‘at the pace of the slowest mind in the room’, so that ‘all but one participant will be bored, all but one mind underused’. A key purpose of meetings is information transfer, but they’re based on the assumption that people absorb information best by hearing it, rather than reading it or discussing it over email, whereas in fact it’s been estimated that only a minority of us are ‘auditory learners’.62 PowerPoint presentations may be worse. The investigation into the 2003 Columbia space shuttle disaster – caused by a chunk of insulating foam breaking off the external fuel tank and damaging the shuttle’s wing – suggested that NASA engineers might have grasped the seriousness of the problem sooner had the crucial analysis not been presented on PowerPoint slides, which forced the information into hierarchical lists of bullet points, ill-suited to how most brains work.
The key question for distinguishing a worthwhile meeting from a worthless one seems to be this: is it a ‘status-report’ meeting, designed for employees to tell each other things? If so, it’s probably better handled by email or on paper. That leaves a minority of ‘good’ meetings, whose value lies in the meeting of minds itself – for example, a well-run brainstorming session.
Countless books advise managers on how to motivate staff. But motivation isn’t the problem. Generally, people want to work; they gripe when things like meetings stop them doing so. Indeed, a 2006 study showed there’s only one group of people who say meetings enhance their wellbeing – those who also score low on ‘accomplishment striving’.63 In other words: people who enjoy meetings are those who don’t like getting things done.
Books from the 1970s on time management always make two key suggestions for how to stop people interrupting you when you’re trying to work: close the door to your office, and get your secretary to screen your phone calls. This would be brilliant advice, except that the entire time I was working full-time in an office, I never had a door; I could have got my secretary to go and buy me one, but I never had a secretary, either. As a result, I was always extremely busy, and regrettably had no time to take what would have been the most cathartic action – i.e., hunting down and killing the person who first suggested that open-plan offices might be a boon to productivity.
Besides, trying to eradicate interruptions doesn’t always make sense. More recent books, too, counsel shutting yourself off from sources of distraction – Never Check E-mail in the Morning, by Julie Morgenstern, is one very readable example. But many of us these days work in jobs that behavioural scientists describe as ‘interrupt-driven’, where responding quickly to messages and requests is part of what we’re paid to do: try telling a call-centre worker (or an investment banker, or a journalist) to take the phone off the hook for an hour. What we need are guerrilla tactics for managing office interruptions, and the personal-development gurus, of course, are happy to oblige:
This preposterously simple idea, from the productivity guru David Allen, really might change your life: deal immediately with all interruptions that you think can be dispatched in two minutes. (Write down the others, and process them later.) Crucially, two minutes is short enough not to lose a feel for the work you were engaged in. This is important, given the findings of the psychologist Mary Czerwinski, an ‘interruption scientist’ (no kidding): 40 per cent of the time, she notes, office workers who get distracted from a task don’t return to it when the interruption ends.64
A potted plant or a stack of books on your desk needn’t literally block your colleagues’ path; people respond surprisingly sensitively to symbolic cues. I spent many days in the office wearing an outsized pair of headphones; amusingly, they can be very obviously plugged in to nothing at all, and you’ll still be left alone.
If surfing the web is a major source of distraction, adjust your browser’s preferences so your homepage is ‘Get Back to Work’, by the blogger Mark Taw (marktaw.com/getbacktowork.htm), a page that a) tells you in very large letters to get back to work, and b) includes a clever system for clarifying and monitoring the work you’re meant to be doing. By sharpening your awareness of what you’re doing, self-monitoring of this kind can seriously reduce the time spent in the depressing limbo that is neither focused work nor real relaxation.
There are other techniques for stopping people bothering you. But they’re not for the weak-hearted, and may have negative consequences for your social acceptability. Roquefort cheese. At your desk. That’s all I’m saying for now.
The English writer C. Northcote Parkinson is best remembered for his maxim that ‘work expands to fill the time available’, but that wasn’t his only biting observation about the irrationality and ridiculousness of business life. (Though, actually, if it really had taken him his whole career to come up with Parkinson’s Law, that would have been an amusing demonstration of Parkinson’s Law.) A less well-known but equally spot-on dictum, also outlined in his book Parkinson’s Law, is the Law of Triviality, which he illustrates with an imaginary tale in which a firm’s executives meet to discuss two new projects: an atomic reactor and a company bike shed. The reactor is complex and bewilderingly expensive, and non-experts risk embarrassment if they speak up, so it gets approved in two and a half minutes. But everyone knows about bikes and bike sheds, and everyone has an opinion. The bike shed, Parkinson writes, ‘will be debated for an hour and a quarter, then deferred for decision to the next meeting, pending the gathering of more information’.
This has come to be known as the Colour of the Bike Shed Phenomenon: the time spent on any item will be in inverse proportion to its cost and importance. Relentlessly, the trivial squeezes out the non-trivial. The reactor may suffer a meltdown due to some overlooked technical matter, but never mind: check out the awesome letterhead stationery we spent so long getting right!
Parkinson’s point – which also applies to politics and the media, where the focus, frustratingly, is often on the least important things – isn’t simply that smaller matters are less intimidating to deal with. It’s that when the members of any group are driven partly by personal egotism – as all of us are – their interests conspire, without them realising it, to keep the focus on the inconsequential. Each wants to demonstrate, to the boss or to themselves, that they are taking part, paying attention, making a difference, ‘adding value’. But with complex subjects about which they’re ignorant, they can’t: they risk humiliation. They may also not want to dwell on their specialist subjects, preferring not to have the non-experts pry too closely. (In Parkinson’s story, the nuclear expert keeps quiet: ‘He would have to begin by explaining what a reactor is, and no one there would admit that he did not already know. Better to say nothing.’)
So what gets discussed is precisely what doesn’t matter. ‘In Denmark we call it “setting your fingerprint”,’ notes Poul-Henning Kamp, a programmer who has helped popularise the conundrum, at the website bikeshed.com. ‘It is about personal pride and prestige. It is about being able to point somewhere and say, “There! I did that … ” Just think about footsteps in wet cement.’
Similar effects – where small stuff preoccupies us precisely because it’s small – course through our lives. The Law of Triviality also calls to mind the caustic comment, usually attributed to Henry Kissinger, that ‘academic politics are so vicious because the stakes are so small’, which is surely a fair take on most office politics, too. I fear that something similar is going on whenever I get that delusional feeling of achievement from having powered through multiple unimportant items on my to-do list, leaving untouched the few tasks that really matter. Taken together, Parkinson’s two laws amount to a wry but certainly not trivial warning: the work we do expands to fill the time available – and, half the time, it’s not even the most important work.
In the perfect society, wrote Karl Marx, nobody would be a specialist. It would be ‘possible for me to do one thing today and another tomorrow – to hunt in the morning, fish in the afternoon, rear cattle in the evening, criticise after dinner, just as I have a mind, without ever becoming hunter, fisherman, herdsman or critic’. (The quote comes from The German Ideology, written in 1845–6 and shortly to be republished, in a new translation by Paul McKenna, as I Can Transform Your Capitalist Relations of Production in Seven Days.) There’s nothing particularly Marxist about this idea – you find it, also, in ancient Greece, and in the Renaissance – but where you don’t find it is in the business section of your local chain bookshop. There, by contrast, the focus is on focus: defining your ‘purpose’, relentlessly pursuing your ‘number one priority’, and developing your ‘personal brand’. Specialism rules.
The problem is that plenty of people don’t have one number-one priority, even within the world of work. From an early age, we’re taught to feel that this is bad, a mark of indecisiveness – a belief exacerbated not just by reading stress-inducing business books but by the very notion of a single, unitary ‘career path’. ‘The conventional wisdom [seems] indisputable,’ writes Barbara Sher, whose excellent book Refuse to Choose! has made her the chief consoler of self-pitying generalists everywhere. The prevailing belief is that ‘if you’re a jack-of-all-trades, you’ll always be a master of none. You’ll become a dilettante, a dabbler, a superficial person – and you’ll never have a decent career.’
Yet Sher identifies a specific personality type she calls ‘scanners’ and offers them a plethora of tips for flourishing in a non-scanner world. Another book, One Person/Multiple Careers by Marci Alboher, gathers examples of people who have made a success of what the author calls ‘slash’ careers. The unifying theme is how much damage is done by the mere belief, among generalists, that specialism is best. ‘Almost every case of low self-esteem, shame, frustration … simply disappeared the moment they understood they were scanners, and stopped trying to be someone else,’ Sher reports.
Apart from anything else, the ‘one focus’ belief serves to inhibit action: if you believe you have to give up your job as a lawyer in order to become a screenwriter – based on an underlying belief that people have to have one job – you’ll probably never become a screenwriter. If you spend one hour actually screenwriting, you already are one.
Most of us, of course, have jobs to do, and a life outside our jobs to attend to; there’s a limit to how much more we can cram in. But we can start by letting go of the idea that specialism is inherently superior. (In some fields, it’s still better-paid, but Alboher marshals evidence that this is changing, too.) How strange that we should have persuaded ourselves that doing only a few of the things we can do is better than doing lots of them. As for that ‘jack-of-all-trades’ thing: the earliest published reference to Shakespeare – hardly an underachiever – as an actor and playwright describes him using the Elizabethan equivalent of that term, ‘Johannes factotum’.
Another famous dilettante: Leonardo da Vinci. I’m just saying.
Towards the end of his life, the great organisational thinker Peter Drucker sardonically observed that although he’d spent half a century speaking about the mysterious, seductive, much sought-after quality called ‘leadership’, he wasn’t sure that there was much to be said about it at all. ‘The only definition of a leader,’ he remarked, ‘is that a leader is someone who has followers.’ This circularity hasn’t stopped numerous gurus presenting themselves as purveyors of the ‘laws’ or ‘secrets’ of leadership. ‘The true measure of leadership,’ writes John Maxwell, who has published nearly 60 books on the topic, served as adviser to the US military and now runs a worldwide leadership development programme, ‘is influence … If you don’t have influence, you will never be able to lead others.’ Um, yes, thanks. Observations about the excretory habits of bears in arboreal settings may spring to mind, but that probably just shows you’re not cut out to be a leader.
The more closely it’s examined, the more ‘leadership’, good or bad, starts to look like a mirage. In his interesting book Obliquity, the economist John Kay cites John Sculley, chieftain of the Apple empire in the 1980s. (Another secret of leadership: lots of the people involved are called John.) Sculley’s reign was a huge success until, suddenly, it wasn’t: with profits in freefall, he was forced out. Much ink was spilled over this. Was he a great leader who lost his touch, or a mediocre one who’d briefly risen above himself? Kay suggests a third option: neither. Maybe Sculley just sat at the top while other forces – technological, economic, cultural – determined Apple’s fortunes.
Kay quotes the philosopher Alasdair MacIntyre: ‘One key reason why the presidents of large corporations do not, as some radical critics believe, control the United States is that they do not even succeed in controlling their own corporations … when imputed organisational skill and power are deployed and the desired effect follows, all that we have witnessed is the same kind of sequence as when a clergyman is fortunate enough to pray for rain just before the unpredicted end of a drought.’ That’s extreme (and hard to square, say, with Steve Jobs’s much more successful, and very personality-driven, reign at Apple). But it highlights the tautologies and rationalisations that abound. Even assuming that there are skilled and unskilled leaders, ‘leadership’ serves too often merely to re-label the mystery. See also ‘charisma’: it’s all very well to describe Hitler, or George Clooney for that matter, as charismatic. But what have you really said?
Psychological studies support a related and similarly circular conclusion: the people we follow as leaders are the ones who decide they’ve got what it takes to lead. We chronically mistake bossiness for leaderly talent. Make the most suggestions in a group context, one research team found, and you’re likely to be seen as the most competent, even if the suggestions are among the worst.65 Voice an opinion three times over, another study suggests, and fellow group members are almost as likely to conclude it’s the group’s prevailing view as if three different people had voiced it.66 (‘Quantity,’ Stalin supposedly said, ‘has a quality all its own.’)
None of which means there’s no such thing as an effective leader. But trying to identify whatever it is that he or she does as some kind of essence called ‘leadership’ may raise more questions than it answers. What makes a great leader? A certain je ne sais quoi, of course.
It’s entirely possible that you’ve never heard of strategic incompetence and yet that you are, at the same time, a lifelong expert at it. If you aren’t, you’ll know someone who is. Strategic incompetence is the art of avoiding undesirable tasks by pretending to be unable to do them, and though the phrase was apparently coined only recently, in a Wall Street Journal article, the concept is surely as old as humanity. Modern-day exemplars include the office colleague who responds to the photocopier message ‘clear paper jam’ by freezing in melodramatic pseudo-panic until someone else steps forward to help; you’re equally guilty, though, if you’ve ever evaded a household task or DIY project by claiming you might screw things up. (‘I’d do the laundry – I’m just worried I’ll damage your clothes.’) The Journal interviewed one executive who’d managed to avoid organising the office picnic for several years running. ‘You’d be amazed,’ he noted, ‘at how much I don’t know about picnics.’
What swiftly happens, the masters of strategic incompetence learn, is that people stop expecting you to undertake certain tasks; they no longer ask you to do them, and they adjust how they rate you: your failure to perform the activity stops counting against you. If all this sounds overly Machiavellian, it’s worth noting that it’s only a personalised version of what corporate types refer to as ‘expectations management’, which is a key component of any company’s customer-relations strategy. If you want satisfied customers, it’s certainly wise to act in ways that will satisfy them. But it’s also wise to pay attention to (and, if possible, influence) their criteria for feeling satisfied.
Most of us are bad at this, because deep down we want to please people, whether in hope of personal gain or out of what the self-help writer Elizabeth Hilts calls ‘toxic niceness’: the chronic urge to please resulting from the fear of confrontation. I realise I’m not exactly part of the target market for her pop-psychology book Getting In Touch With Your Inner Bitch – she identifies toxic niceness as an overwhelmingly female phenomenon – but she’s on to something, I think, that crosses gender boundaries. Seen from this perspective, expectations management isn’t just for lazy people who want to avoid boring tasks. Training our bosses, partners or children not to expect a ‘yes’ in response to every single request might be crucial for preserving sanity.
If you start small, it’s surprisingly easy to begin adjusting others’ expectations. It’s like strength training: gradually, you build up tolerance. If you think you shoulder an unfair burden of chores at home, pick one, don’t do it, and monitor what happens. If you’re driven crazy at work by ceaseless emails demanding instant responses, try always waiting a few hours to respond, even when you’ve no reason to wait. Far better to have a reputation as someone who reliably replies within 24 hours than someone who replies within seconds – because in the latter case, as soon as you fail to respond instantly, you’ll be seen as underperforming. Thus do the people who try hardest to please end up annoying people more than those who don’t try so hard. No, it’s not fair. Well spotted.
It’s probably true that we need a new word to describe the way that work, these days, seeps more and more into our free time, giving rise to an unfocused, dissatisfying twilight zone that’s neither work nor leisure. Still, that doesn’t excuse the American sociologist Dalton Conley, who has coined ‘weisure’ as a name for the phenomenon – a portmanteau of ‘work’ and ‘leisure’ that may be the most eye-searingly ugly neologism since ‘vlogging’ or ‘Brangelina’. But ‘weisure’ is better than ‘lork’, I suppose. And unlike most other monstrous recent neologisms, it doesn’t involve the words ‘Twitter’ or ‘tweet’. So we should probably be thankful for small mercies.
As Conley notes, weisure isn’t just a matter of mobile phones and BlackBerries enabling bosses to pester staff at all hours. It’s also a subtler intermingling of worlds previously kept separate. We’re more likely to make close friends through work than a generation ago, and less likely to work for monolithic organisations, which helped impose hard edges between downtime and time at work.67 And judging by the explosion of books on the topic, we’re doing far more networking – a concept that couldn’t exist without a blurring of friendships and working relationships.
Self-help’s prescriptions for combating the energy-sapping effects of weisure tend to focus on shoring up the dyke against the rising waters of work: switching off your mobile, say, or sticking with discipline to a strict going-home time. That’s fine as far as it goes. But it ignores a less obvious dimension to the problem, in which the culprit isn’t work, but leisure.
In its modern form, dating from Victorian times, leisure is a negative concept: it’s defined in contrast to work, as non-work – the time we gain, as a result of earning money, that we don’t need to spend earning money. (It sounds strange to refer to an unemployed person’s free days as leisure time.) And so it’s all too easy to think of it as ‘empty’ time – time, in other words, just asking to be colonised by work.
Many of us welcome in the invader. ‘Most people reflexively say they prefer being at home to being at work,’ writes Winifred Gallagher in Rapt, an absorbing book (appropriately enough) on the psychology of attention. But research into ‘flow’ – the state of mind when time falls away, and people feel ‘in the zone’ – suggests otherwise. ‘On the job, they’re much likelier to focus on activities that demand their attention, challenge their abilities, have a clear objective and elicit timely feedback – conditions that favour optimal experience,’ Gallagher notes. At home, on the other hand, they watch TV, an activity that, according to one study, induces flow only 13 per cent of the time.68 We crave leisure and disdain work even though it may be work, not leisure, that fulfils us more.
That’s not an argument for workaholism. It’s an argument, as Gallagher says, for ‘pay[ing] as much attention to scheduling a productive evening or weekend as you do to your workday’. This feels wrong: we imagine that when leisure time finally arrives, we’ll enjoy being spontaneous. Planning how to relax seems like a contradiction in terms. But then the moment arrives, and what we spontaneously decide is to watch TV, entering a half-focused, barely enjoyable state of passivity. Or, as I shall henceforth be calling it, ‘peisure’.
It’s a reliable rule of life that any email marked ‘urgent’ – with a red exclamation mark, or a ‘please read’, or similar – can be safely ignored for days, and possibly for ever. A few of the people who send them are, presumably, self-important and do it all the time. But mostly it’s a sign of insecurity: the sender knows only too well that their message is one you’d otherwise have every reason to neglect. That exclamation mark is a declaration of war. It says: I know better than you how you should apportion your attention to get your work done.
But ‘urgent’ emails are only the most obvious manifestation of an endemic phenomenon. The battle to decide what merits your attention at any moment is a constant, low-level war of all against all. You might believe it’s always you who chooses what to focus on, or your boss. Yet in the average workplace, countless voices – superiors, underlings, clients, random emailers – compete to control your concentration. We complain of having too many things to do. But how much of that overworked feeling is really resentment that it wasn’t you who got to decide those things were important?
This, the management guru Peter Drucker argued, is a distinctive problem of modern ‘knowledge work’. When you’re ploughing a field or shoeing a horse, the answer to the question ‘What’s the most important thing for me to be doing right now?’ is usually obvious: it can’t be fought over. Not so in the blurry world of ideas, hence Drucker’s maxim that if you’re a knowledge worker, defining your work – staying aware of what genuinely deserves your attention – is the most crucial work you’ll do. This is why ‘information overload’ is a questionable complaint: if we couldn’t handle vast amounts of information, we’d have a breakdown each time we stepped into nature, or a busy street. The real trouble is that we have defined too many things as worthy of having the power to distract us.
The best ‘time-management’ strategies are about reclaiming this power. Spending the first hour of the day (or more) on a major project before you check email is one example: that way, you start the morning by putting yourself, not the incoming flow of attention-demands, in the driving seat. Alternatively, make it harder for others to seize your focus: the website awayfind.com, for example, offers ingenious ways to make it slightly more laborious to email you on holiday, so people won’t do so lightly.
The deeper truth is that it’s always you who’s choosing what you’re doing, though not in the straightforward way you might think. You’re 100 per cent free to disobey a boss, refuse a task, quit a job; you have only to live with the consequences. It’s always a choice. That’s cold comfort, of course, if your choice is between doing appalling work and starving to death. But too often we live as if that’s the case, when really, on closer inspection, it isn’t.
In 1969, a crotchety ex-schoolteacher named Laurence Peter published The Peter Principle, answering a question that millions, surely, had asked: why does the world contain so many people who are so strikingly useless at their jobs? Droll curmudgeon though he was, Peter identified a real problem with his now-famous principle: in hierarchical organisations, people tend to rise to their ‘level of incompetence’. Being good at your job gets you promoted, and so on, ever upwards, until your performance isn’t good enough to warrant further advancement. Seen this way, it’s no accident that companies and governments are filled with bunglers – they’re giant machines for sorting people into precisely the jobs they can’t do. The cartoonist Scott Adams offers the only marginally less depressing Dilbert Principle: modern corporations systematically promote their least talented staff to the ranks of management where, since managers don’t really do anything, they’re mainly harmless.
It’s easy to see why you might accept a job despite knowing that it exceeded your capacities: better pay, prestige, unwillingness to admit your limits. But the problem of incompetence goes deeper. One of the hallmarks of being terrible at something, it turns out, is not realising how terrible you are.
The key study here is called ‘Unskilled and Unaware of It’, and assuming that the Cornell psychologists who conducted it weren’t themselves unsuspectingly incompetent, its conclusions are unsettling. ‘The trouble with the world is that the stupid are cocksure while the intelligent are full of doubt,’ Bertrand Russell wrote. The Cornell study broadly concurs: those who scored worst in various tests requiring ‘knowledge, wisdom or savvy’ were those who most overestimated their performance; top achievers tended to underestimate.69 A special version of this problem preoccupies the business scholar Michael Gerber, whose book The E-Myth Revisited (‘e’ stands for ‘entrepreneur’) makes a devastatingly simple case for why most small businesses fail. People assume that having a skill – baking, say – means they’ll be skilled at running a business doing that thing, so they open a bakery. But there’s no necessary connection; indeed, a keen baker is likely to find tasks such as book-keeping so aggravating, because they get in the way of baking, that he’ll do them especially badly.
Even if you’re fortunate enough to recognise your weaknesses, you may not respond wisely. According to Gallup research compiled by the marketing expert Marcus Buckingham, most people try to ‘plug’ their weaknesses, while the really successful focus on exploiting strengths.70 The weakness-plugger is the employee who goes on courses to become less awful at public speaking, when she’d be better off in a job that calls on her written skills. But you’ll rarely improve a weakness beyond mediocrity, argues Buckingham, not least because it’s hard to invest sustained energy in something you don’t enjoy. If you truly know what you’re bad at, you’re already ahead of the pack. Don’t throw that away by wasting your time getting slightly less bad.
61 Robert Nelson and Peter Economy, Better Business Meetings (Burr Ridge, Illinois: Irwin Professional Publishing): 5.
62 Zoltán Dörnyei, The Psychology of the Language Learner (Mahwah, New Jersey: Lawrence Erlbaum Associates, 2005): 158.
63 Steven Rogelberg et al, ‘“Not another meeting!”: are meeting time demands related to employee wellbeing?’, Journal of Applied Psychology 91 (2006): 86–96.
64 Discussed in Mary Czerwinski et al, ‘A diary study of task switching and interruptions’, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (2004): 175–182.
65 Cameron Anderson and Gavin Kilduff, ‘Why do dominant personalities attain influence in face-to-face groups? The competence-signaling effects of trait dominance’, Journal of Personality and Social Psychology 96 (2009): 491–503.
66 K. Weaver et al, ‘Inferring the popularity of an opinion from its familiarity: a repetitive voice can sound like a chorus’, Journal of Personality and Social Psychology 92 (2007): 821–833.
67 This research is collected in Dalton Conley, Elsewhere, USA: How We Got from the Company Man, Family Dinners, and the Affluent Society to the Home Office, BlackBerry Moms, and Economic Anxiety (New York: Pantheon, 2009).
68 Winifred Gallagher, Rapt: Attention and the Focused Life (New York: Penguin Press, 2009): 109, referring to research by Mihaly Csikszentmihalyi.
69 Justin Kruger and David Dunning, ‘Unskilled and unaware of it: how difficulties of recognizing one’s own incompetence lead to inflated self-assessments’, Journal of Personality and Social Psychology 77 (1999): 1121–1134.
70 In Marcus Buckingham and Donald Clifton, Now, Discover Your Strengths (New York: Simon and Schuster, 2001).