CHAPTER TWO

The Explosion

Flexibility and willingness to work under pressure in a chaotic news environment with ever-changing responsibilities and deadlines a MUST.

Must be available at least 80 hours a week and able to work most weekends and all major holidays.

This is a non-paying internship.

Great opportunity to get your foot in the door and gain news experience.

A minimum of ten years’ experience.

—McSweeney’s Internet Tendency, “A Great Job Opportunity”

What is an intern anyway? A burger-flipper at Disney World or a rising star shaping policy in the White House? Does the word mean anything at all if it can include undergraduates and retirees, the unpaid and the well compensated, rigorous training programs and “virtual” positions without any oversight or mentoring?

The very significance of the word intern lies in its ambiguity. It represents a broader concept, and sends out a more targeted social signal, than “temp” or “freelancer” ever has. Of more importance than any definition is the rhetoric that flavors the internship discourse: “a foot in the door” for young people and a way of “paying your dues,” internships are also “a great way to get experience,” “build your résumé,” and “make contacts.” A “win-win” for employers and “go-getter” interns alike, “you get out what you put in.” We understand from these hopeful, endlessly echoed sentiments that the burden of creating something meaningful falls squarely on the shoulders of the intern, new to the workforce and desperate to squeeze in, tasked with making an impression at any cost and learning on the fly. The image of the cheerful, obsequious intern is not a hollow caricature. While freelancers may often be characterized by a certain proud independence, temps and part-timers by detachment and even alienation, and volunteers by a supposed selflessness, interns perform as affective labor what others would call menial. They work, after all, for their own good name, so that someone, some day, will vouch for their fitness to do “real” work.

“You can get more out of the person because they’re your intern,” a congressional candidate (and former serial intern) said to me, explaining his “understanding of the internship culture.” “What I did in my campaign is I advertised a bunch of internship positions—like ‘You can be the New Media intern, you can be Communications intern’—so I got these people to come more on a regular basis.” Other employers described switching from advertising “summer jobs” to offering “internships” as a way to boost interest among young people; many confessed that “internship” was simply a buzzword they latched on to, looking for free, temporary help around the office. The fact that no definition of internship is in common use is particularly convenient in such scenarios—some people assume that they are unpaid by definition, while others think the exact opposite. Those closer to university campuses (and especially in career services offices) tend to claim an educational component, or assume academic involvement, despite the vast number of internships with no educational content whatsoever.

“For my work, I say an internship is ‘any experience of the world of work from which a student can learn about a career,’” says Natalie Lundsteen, an Oxford doctoral student who is undertaking some of the first academic research on the topic. Previously, she worked as an internship advisor at Stanford for five years. “The example I usually give is: simply making coffee as a barista is a summer job, but having the opportunity to meet management, learn about logistics, marketing, advertising, sales, etc. makes it an internship!” The association between internships and university students learning management skills remains strong—but the reality is that nowadays there are barista internships too.

Michael True, another internship professional from an academic career advising background, has recently proposed a paragraph-long definition of academic internships, heavily referencing experiential education and singling out “structured and deliberate reflection contained within learning agendas or objectives”—yet the definition remains agnostic as to pay, hours, or any actual work conditions.1 On the other hand, employers are most likely to recognize internships as flexible, temporary work arrangements for young people, where there is at least some vague idea of getting ahead, although interns are commonly understood to be at the bottom of the office hierarchy. Whether or not there is a dedicated learning or training component depends on perceived legal requirements, the initiative and status of the intern herself, and any requirements set by her school. As Lundsteen says, “Employers don’t care about the development of their interns—why should they? The bottom line for employers is usually productivity, not personal development.”

Interns themselves seem to grasp the inadequacy of any particular definition—an internship is understood more in terms of its cultural and professional function than in terms of actual responsibilities: a box that has to be checked, a rite of passage, a prerequisite for future ambitions. One former intern told me that even before she entered high school she “understood that an internship was a thing you did to make yourself look better”—that was all. Despite the Disney College Program, an internship is still usually perceived as being different from flipping burgers or waiting tables. It is “relevant” to one’s career, a step up the ladder, however tentative and lowly: analogous to the inevitable but temporary shame of being a college freshman or rushing a fraternity. The sixteen definitions of intern in the collective unconscious of Urban Dictionary invoke this discomfort, but with stark, overdramatized images of exploitation: interns are “company bitches” who provide “slave labor,” and objects of sexual attention for the regular staff. Yet it is also the most transient of work identities—only as prestigious as the name of the employer you intern for—and a status hopefully vaulted over as rapidly as possible. Only poor Monica Lewinsky remains ever and always an intern.

Unlike apprenticeships, which continue to have a fairly concrete meaning in the workplace (as we shall see in the following chapter), what defines an internship depends largely on who’s doing the defining. At the same time, internships are part of a nebulous cluster of job titles that have characterized the surge in nonstandard or “contingent” labor over the past forty years. “We’ve just bastardized all the language across the landscape,” says Phil Gardner of the College Employment Research Institute. “Nobody knows what anything means anymore.” It’s true that temps, part-timers, freelancers, permalancers, permatemps, externs, trainees, and certain volunteers all share a family resemblance; operationally and in terms of the worker demographics and industries involved, however, they tend to be quite distinct. Even just in the realm of different experiments at bridging the school-to-work gap, internships jostle with “job shadowing,” formalized “mentoring,” cooperative education, “school-sponsored enterprise,” and “tech prep.” Yet of all the different terms and titles, internship covers the broadest range of industries and work arrangements and resonates with the most prestige and possibility.

Solid statistics about the internship phenomenon have remained as elusive as a proper definition. A notable exception is the recent series of studies produced by the research and consulting firm Intern Bridge, working in conjunction with Gardner. “There hasn’t been anybody that’s really monitored it,” says Gardner of the internship explosion. “Right now the research capacity in this area is dismal. Most schools don’t even know how many of their students actually have internships, period.” Only now are reports such as The Debate Over Unpaid College Internships, published by Intern Bridge, finally beginning to shed light on the shape and scale of the internship explosion. Based on his survey work, Gardner estimates that 70 to 75 percent of students at four-year schools undertake at least one internship. This is in the same neighborhood as the impressionistic figures provided by individual schools and by companies such as Vault and Quintessential Careers, two career information websites. Gardner sees this figure as being at least double what it was in the early 1980s, and studies by the National Society for Experiential Education and the National Association of Colleges and Employers (NACE), undertaken over the same time period, have similarly registered exponential gains in the percentage of college students who graduate with an internship.

According to The Debate Over Unpaid College Internships, nonprofits, government offices, and small for-profit companies emerge as the worst offenders, but nearly 20 percent of large for-profit companies (over 5,000 employees) also have unpaid internships. “Women were significantly more likely to be engaged in an unpaid internship (77 percent),” states the report. This explosive new finding alone demands further study, and it dovetails closely with my own anecdotal research: internship injustice is closely linked to gender issues, both because of the fields that women gravitate toward and possibly also because female students have been more accepting of unpaid, unjust situations. Class differences in the internship world are no less stark, as we’ll see in Chapter 9: according to The Debate Over Unpaid College Internships, most unpaid interns are from low- and middle-income families, except in the “glamor” fields of finance, entertainment, and the arts, while “[h]igh income students through their preferences, social networks, and status, enjoy more opportunities at the largest companies, are more likely to be paid, and have access to a limited number of opportunities in organizations their peers compete fiercely to enter.” A student’s academic major is another key factor: as many as 87 percent of engineering and computer science majors reported having paid positions, while 70 percent of business students and approximately two-thirds of science and agriculture students had them. On the other end of the spectrum are students majoring in education (34 percent paid), social sciences (35 percent paid), health sciences (39 percent paid), communications (41 percent paid), and the arts and humanities (43 percent paid).

The conservative estimate of 1 to 2 million internships annually in the United States, mentioned earlier, does not begin to account for internships taken by community college students, graduate students, recent graduates, and others—all major growth areas. Nearly a million registered members of LinkedIn, a professional social networking site, list themselves as former or current interns. The National Longitudinal Survey of Youth 1997 (NLSY97), a major representative study of 9,000 young people, found that approximately 4 percent of high school students take an internship or “apprenticeship”—also unaccounted for in my estimate. A list of the largest intern employers compiled by CollegeGrad.com is indicative, although it misses the massive Disney program, among others: the top ten firms were planning to hire a combined 26,000-plus interns in 2009; the industries represented include general merchandising (Walgreen’s, primarily in the pharmacies), insurance (New York Life), consulting (Deloitte & Touche), aerospace and defense (Lockheed Martin), and automotive rental (Enterprise). That’s 26,000 interns for ten employers—and internships have a very, very long tail.

The best current estimate is that as many as 50 percent of all internships in the U.S. may be unpaid or paid below minimum wage. (Recent research in the U.K., undertaken by a major professional association for Human Resources, found that 37 percent of all internships were unpaid; in Germany, a survey found 51 percent.) In The Debate Over Unpaid College Internships, based on a broad sample of student responses, Intern Bridge found that 57 percent of nonprofit internships, 47 percent of government internships, and 34 percent of for-profit internships were unpaid—not to mention those that were underpaid. And the ranks of the unpaid are swelling, while paid positions disappear. Studies across the board, including a recent one conducted by Internships.com, have found that unpaid internships continue to grow at the expense of paid opportunities.2

Vernon Stone, a professor of journalism at the University of Missouri, documented early on how “[u]npaid internships, once rolling, tended to crowd the paid ones off the road.” Stone found that, in 1976, 57 percent of TV and 81 percent of radio interns received some pay; by 1991, those numbers had sunk to 21 percent and 32 percent respectively. “Seven times as many [unpaid interns] were in TV newsrooms in 1991 as in 1976,” wrote Stone. “Radio’s increase was fivefold.” A similar pattern has been noted for a variety of industries—and according to Intern Bridge, compensated internships in some industries are in danger of disappearing altogether: only 11 percent of interns in the field of game design are now paid, and under 16 percent of interns in law enforcement and security.3

Unsurprisingly, organizations and businesses in the internship space boast that employers regularly list internship experience as a top hiring criterion—50 percent of new college graduate hires came out of internship programs at the same firm, according to a Michigan State University study, while an additional 40 percent had interned at other firms.4 Among employers surveyed by NACE, who are more likely to care about internships and university recruiting in the first place, 76.3 percent reported that relevant work experience was the critical factor in making hires.5 There is no denying this crucial signal that internships send in the postcollege job market.

At the same time, there is a largely unexplored distinction between employers who use internships as an efficient and far-sighted recruiting tool, where there is a real possibility of being hired, and those who use internships as a simple money-saving device. At many of the latter firms, there may be no hires at all from a pool of dozens of interns, and there are never positions reserved even for the best-performing interns. Sociologist Mark Granovetter notes that, even when converting an internship to a full-time position is unlikely, an unrealistic sense of hopefulness can persist: “There may be just enough cases around that people know about to give people encouragement, but not enough to really make it likely that that’s going to happen for any particular person.”

No rigorous studies have yet analyzed the importance of internships over the course of a career or their contribution to lifetime earnings, or even the details of how much interns really learn from their positions. Crude measures of intern satisfaction do exist for individual internship programs and segments of a few industries, showing predictable variation, but in general the experiences of interns remain little understood. In two telling findings from Intern Bridge, some 65 percent of former interns said that the programs they participated in needed improvement; and, when asked whether their work had benefited their employers, interns rated their contributions at an average of over 4.2 on a 5-point scale. The statistics that exist don’t yet paint a full picture, or even include the experiences of noncollegiate interns, but they do illuminate the overall contours of the internship explosion. If many key questions remain unanswered, and economists, sociologists, and policy-makers continue to look the other way, this is at least in part because the Bureau of Labor Statistics, the Census Bureau, and other large-scale research initiatives have left internships in the shadows.

As with all efforts to integrate young people into the workplace, we should ask who bears the burden and who stands to gain: internships in this regard are a dream solution for employers, allowing them to “test-drive” young workers for little or no cost. Just as the federal government has increasingly shifted the financial burden of a college education onto students and their families, companies have effected a similar stratagem by transitioning from training programs and entry-level jobs to internships. In the breathless rhetoric of the age, the intern is an “entrepreneur” and a “free agent” from Day One (read: an at-will employee, often without pay or protections), “learning on the job” (since savvy firms do not invest in dedicated training—and an ever-changing, intangible economy would supposedly make it irrelevant anyway) and building up a “personal brand.” With internships, this message now penetrates almost every institution of higher education in the developed world.

The first interns were medical students. Until World War II, the term “internship” was associated exclusively with hospitals, where aspiring doctors were interned (in the sense of confined) within an institution’s four walls, enduring a year or two of purgatory before entering the profession. The practice of apprenticing boys to physicians and apothecaries stretched back centuries, but only in the final decades of the nineteenth century—with the establishment of major medical schools and an upsurge in the number of hospitals—did formal internships come into being. One of the earliest mentions appears in the second annual report to the trustees of the Boston City Hospital in 1865. American hospitals of this period, according to historian Rosemary Stevens, began to use the term “interne,” apparently a borrowing from French, to describe house surgeons, house physicians, and assistant house physicians—junior medical men busy anesthetizing, bloodletting, vaccinating and the like, not yet qualified to be resident physicians.6

In 1904, in the midst of momentous changes in the medical profession, the American Medical Association (AMA) created a Council on Medical Education to reform and standardize a wildly uneven landscape of practices. A year later, the Council recommended internships as a postgraduate year for medical students, who would have just completed four years of study. A decade later, the AMA set standards for the approval of internships. In theory, writes Stevens, “as part of medical education, the internship should have been under the guidance of medical schools; but it was not.” The hospitals were in charge. The internship was soon pervasive, virtually mandatory: by 1914, a U.S. commissioner of education estimated that 75–80 percent of medical graduates were taking an internship; in 1932, the Commission on Medical Education put the figure near 95 percent.

Much of this supposed training was “virtually unsupervised,” according to Stevens; critics were soon accusing hospitals (as many still do today) of squeezing exhausting, cheap labor from young medical graduates. The alliance of hospitals and medical schools would “make it more difficult for outsiders, including minority groups, to enter the system at any level,” and internships were frequently secured through personal connections. William Mayo, one of the founders of the famous Mayo Clinic, wrote that medical interns “seem to spend their days in permanent yessir-ing, in being flunkies for the permanent staff.” His answer was to establish a three-year training program for “fellows.” In a similar vein, the medical school at Johns Hopkins, drawing on the German practice of “assistantship,” introduced the term “resident” in the 1880s to mean an advanced trainee level beyond the internship. In the 1920s and ’30s, the notion of residency began to take hold—and “internship” has since come to denote only the first year of clinical training and practice undertaken after medical school. In 1975, just as the internship boom was getting started in other fields, ironically enough, the Accreditation Council for Graduate Medical Education officially dropped the term altogether.

Today, there are approximately 200,000 medical residents in the U.S.; a significant fraction of them are still known as interns while in their first year. The exclusive association between internships and medicine has long since disappeared. The disconnect seems all the greater given that medical interns tend to be significantly older and more experienced than office interns; they are paid; and they are highly likely to become medical professionals. They have a fifty-year-old union, the Committee of Interns and Residents, which boasts 13,000 members from coast to coast and affiliation with the mighty Service Employees International Union. On the other hand, medical interns work famously long hours, performing difficult and serious hospital tasks for little pay and an often empty promise of supervision and training—the internship condition taken to the nth degree. The Committee of Interns and Residents has had to fight tooth and nail just to win an eighty-hour work week.

There is evidence from as early as the 1920s that other professions were interested in following medicine in the establishment and formalization of internships. Dr. A. I. Gates of the Teachers College at Columbia University delivered a speech advocating “a more gradual transition from school to work” and the completion of a “social internship” emphasizing civic responsibilities. A professional journal for accountants ran an item in 1928, airing the idea of “a probationary period comparable to the internship in medicine.” In 1937, the Journal of Marketing wrote, in the context of business school education, that “a sort of internship is highly desirable so that students may perform activities involving various applications of the principles of retail store management.” Although none of these proposals was directly acted upon, as far as I know, their tenor is clear: internships were conceived as part of “the progressive rationalization of management,” linked to the professionalization of training and the growing standardization of higher education.7

The first real application of the internship concept beyond medicine came in the political realm. The model established here would in many ways set the tone for the coming internship explosion. Some of the earliest efforts were at the state and municipal levels, including programs launched in the 1930s by city governments in Los Angeles, Detroit, and New York City, and by the state of California. Drawing on and enhancing these efforts, the long-forgotten National Institute of Public Affairs—once a prominent nonpartisan organization dedicated to promoting public service—developed an early and influential model that put internships on a national stage. Otis Theodore Wingo, executive secretary of the National Institute, described the program’s launch in February 1935 as a “hopeful experiment,” bringing thirty “carefully chosen” recent college graduates to Washington D.C. for broad training in public administration.8

What distinguished the National Institute’s program from other forms of civil service training was that it aimed to provide young people of promise, not currently government employees, with general training in public administration. The interns worked without pay for an entire year, but the range of training compares favorably even with some of the best-organized programs today. Fifty years later, Herbert Kaufman, an alumnus who became a professor of public administration, remembered that the program “opened the way to placements in the federal administrative establishment to which I could not otherwise have aspired realistically even if I had known about them.” After a few weeks of orientation on issues like budgeting, congressional appropriations, government accounting and auditing, and so on, Kaufman found himself in the Administrative Management Division of the Bureau of the Budget, where he did “a variety of simple clerical chores in return for the privilege of sitting in on the conferences.” When the work “began to pall after a while,” Kaufman was able to intern for a time on the research staff of President Truman’s Committee on Civil Rights. Wingo emphasized that, although “interns in the current program of the Institute are uniformly performing work of considerable responsibility,” they “[i]n no event … replace paid government personnel, and their assignments are to work that would not be carried on by regular personnel in the due course of government business.”

Even after the National Institute itself had disappeared from the scene, the internship program was carried on under the auspices of the Civil Service Commission—it remained a high-profile model for decades. The Legislative Reorganization Act of 1946 equipped lawmakers and congressional committees with large staffs of experts for the first time, vastly expanding the world of Capitol Hill, where interns would soon be pervasive. The massive growth of Washington D.C. itself from the 1940s and 1950s onward—enabled to some extent by the spread of air conditioning, as Nelson Polsby has argued more broadly for the American South—was both a cause and effect of the internship boom.

Municipal- and state-level internships proliferated as well. A survey of public administration internship programs, conducted by the National Civil Service League in 1956, turned up forty-two different internship programs across the country, a testament to steady post-war growth. Fifteen cities, thirteen states, and four county or county-city systems had embraced the model, from Philadelphia to Phoenix, from the Massachusetts Department of Health to the Minnesota highway authorities, often working in concert with local schools. Many of the positions were paid, but some were not: for example, over 100 interns employed by California’s Department of Mental Hygiene were receiving only “maintenance” as opposed to a regular salary. The report lauded such programs for giving undergraduate and graduate students “a front-seat view of government in operation” and noted significant variations in terms of how long internships lasted, the degree of collaboration with schools, and the legal status of the interns, who were considered “exempt” in Florida and Massachusetts; “provisional” in Wyoming; “probationers” in Illinois and New York; and so on. In one jurisdiction, the report tellingly prefigured a host of future problems, bemoaning a “tendency of departments to use interns as temporary employees rather than trainees, insufficient length of the internship period, insufficient coordination between academic instruction and work assignments, and difficulty of fitting interns into the permanent civil service.” Legislative internships in both Congress and various state assemblies were also launched during the 1950s.

In the 1960s, a new impetus developed for the growth of public-spirited internships, this time from schools and students caught up in the social and political ferment of the time. According to Eugene Alpert, a past president of the National Society for Experiential Education, “Many schools felt that they wanted to have their curriculum become more relevant and more socially-oriented—and more students were interested in donating their time, but they wanted credit for it for the first time.” Internships in teaching, social work, psychology, criminal justice, and journalism began to proliferate. In Washington, a further upsurge came with the Congressional Reorganization Act of 1974, which created the current sprawl of sub-committees. “That meant there were multiple access points,” says Alpert, “and that brought in all the nonprofits and the lobbyists to town,” who in turn actively sought young, free labor. In this period, internships became an entrenched feature of the Washington scene in something like their current form, although they were still little known elsewhere. “Back in the 1970s I was a Washington intern,” political commentator Jonathan Alter has written. “In those days, identifying oneself as such in your home state sometimes met with a puzzled stare, as if you were a medical student.”

The adoption of internships in other industries followed fast and furious, picking up steam decisively in the 1980s and 1990s, the period when they became standard operating procedure, as some firms formalized and expanded what had been smaller-scale, informal arrangements based on connections. The genealogies of influence are almost impossible to trace, but it appears that a wide range of industries began experimenting around the same time, drawing on the models from medicine and public service. Management training programs at large firms gave way to summer internship programs.

A few highlights may suffice to illuminate the broader picture. In some cases, earlier models have come to be seen and described as internships, such as the legendary “guest editor” contests at the fashion magazine Mademoiselle, which continued for years. In June of 1953, the poet Sylvia Plath was one of twenty female college students selected from an enormously competitive field for a brief stint at the magazine. Paid only a modest $150 stipend, the guest editors were wined and dined around New York: the high-society social functions attended by the young women made great fodder for Mademoiselle, and one of Plath’s biographers has described the program as “another marketing scheme on the part of the magazine to increase sales and advertising.” Other fashion magazines launched similar efforts—in 1951, then a senior at George Washington University, Jackie Bouvier (later Kennedy) won a year-long editorial internship (half in New York, half in Paris) in the Prix de Paris contest hosted by Vogue.

In 1967, less glamorously, the Northwestern Mutual Life Insurance Company of Milwaukee began a program that has become one of the nation’s largest, ballooning to include 2,500 interns each year—interns work on commission selling insurance as “fully licensed financial professionals.” By 1979, a national survey of undergraduate sociology departments found that more than half supported internship programs, described as “field-work placements … under the auspices of cooperating agencies, such as a city planning department.” Many banking and financial internships, among the best-paid and most prestigious of all internships today, began over twenty years ago, according to Natalie Lundsteen. In a 2000 survey, 92 percent of business schools and programs reported having internship programs, and one-third claimed that more than half of their students take on internships.

It is particularly telling that new companies, industries, and academic degree programs, especially those emerging in the 1990s and 2000s, have unquestioningly given internships pride of place almost from the very beginning. A 2001 report on degree programs in sports management, for instance, found that 76 percent of responding programs required students to complete an internship. With each new wave of tech startups, Silicon Valley becomes more and more awash in interns—the latest batch are “Twinterns,” interns tasked with promoting their employers via Twitter (even Pizza Hut has one). “I saw first-hand how interest in internships began growing in the late nineties,” says Lundsteen, who set up the first corporate headquarters internship for Gap, Inc. during this period, when internships made a perfect fit with the go-go rhetoric of the dotcom bubble and the New Economy. Programs continue to sprout and take on massive proportions virtually overnight. The retailer Target, for example, went from having no internship program at all before 2006 to taking on more than 1,000 interns just a few years later.

The internship explosion, like any major shift in how people work and shape their careers, could not have occurred in isolation. Changing attitudes among lawmakers, educators, parents, and young people have all played a critical role, as we’ll see in the following chapters. For the moment, though, there are two other shifts in the workforce that need examining, both of which have propelled the rise of internships—and help to explain it.

Post-industrial, networked capitalism has provided the ideal petri dish for the growth of internships, which are only one of many forms of nonstandard or contingent labor that have mushroomed since the 1970s. The term “contingent labor” was first used in 1985 by economist Audrey Freedman, in reference to “conditional and transitory employment relationships as initiated by a need for labor—usually, because a company has an increased demand for a particular service or product or technology, at a particular place, at a specific time.” Included are part-time, temporary, seasonal, casual, contract, on-call, and leased employees, among others—what Andrew Ross calls “an explosion of atypical work arrangements far removed from the world of social welfare systems, union contracts, and long-term tenure with a single employer.”9 The Bureau of Labor Statistics estimated in 2005, using what it called “the broadest measure of contingency,” that about 4 percent of the total workforce in the U.S. fits the description, approximately 5.7 million workers. These contingent millions were twice as likely as noncontingent workers to be under twenty-five years old, and a clear majority said they would prefer full-time employment. If you include regular part-time workers and the self-employed, who often work under contingent circumstances, figures from a Government Accountability Office (GAO) report in 2000 paint a much starker picture: nearly 30 percent of the American workforce, comprising almost 40 million individuals, is involved in contingent, nonstandard employment arrangements.

Ross and others have pointed out that the Keynesian decades of “standard” employment may in fact have been “a brief exception to the more general, historically enduring rule of contingency.” Nonetheless, as law professor Mark Grunewald writes, “traditional, full-time, long-term employment” rapidly assumed a wider importance in the mid-twentieth century: “It has been the core for the development of collective bargaining. It has been the institutional base for the assurance of health care. It has bankrolled the private pension system. In short, it has marked most of the important differences between staked members of our society and the economically vulnerable and insecure.”

As many authors have documented—including Luc Boltanski and Eve Chiapello, for the French case—the push for flexibility emerged at least in part from a “revolt against work” in the 1970s, initiated by workers themselves. Another driving force behind contingency, as economist Greg Kaplan points out, “has been an increase in female labor force participation—the big thing in the last fifty years.” Many workers, interns among them, have undoubtedly welcomed the more positive aspects of their contingency—the ability to plan their own schedules, work from home, or spend more time with family, for instance—and the youth labor market tends to center more on short-term, flexible contracts anyway. There’s also evidence that what Gina Neff calls an “increasing spirit of entrepreneurial behavior” has been effectively hammered into many young people: “It’s a move from the company man to ‘I am the CEO of me’ … to Daniel Pink’s notion of the Free Agent Nation,” says Neff. According to Andrew Ross, “the flexibility [which free agency] delivers is a response to an authentic demand for a life not dictated by the cruel grind of excessively managed work,” despite the uncertainties it brings.

If the revolt against work began on assembly lines and in the alienating offices of large bureaucratic firms, management soon learned to reap the benefits and actively promote the “casualization” of the workforce. Ross writes of “the steady advance of contingency into the lower and middle levels of the professional and high-wage service industries” after companies began to realize “lavish returns from low-end casualization—subcontracting, outsourcing, and other modes of flexploitation.” According to the GAO report, “regular part-time” work increased from 14.5 percent to over 18 percent of the total workforce over the course of the 1970s and 1980s, with the largest increase in the late 1970s and early 1980s, due to changes in business cycles—such as the recession in 1983.

Most prominent and explosive of all has been the growth of the temporary help industry (from 0.5 percent of the total workforce in 1982 to over 2 percent in 1998) and of independent contracting. Indeed, much of this supposed boom in independent contracting—with its positive associations of being one’s own boss—has been revealed as a massive tax dodge by corporations. A recent federal study found that employers were illegally passing off 3.4 million regular employees as independent contractors, and the Labor Department has identified up to 30 percent of companies as engaging in such misclassification. In other words, many workers are still performing the same work they always have, only under increasingly precarious, contingent conditions, at the behest of employers. Microsoft’s “permatemps”—thousands of employees deliberately misclassified as independent contractors, though they worked for years alongside full-timers, often performing identical work—achieved a rare victory in 2000 in reclaiming their status as regular employees. Seen in this light, many interns could be considered workers who have been purposely misclassified as students and trainees.

The independent contractor tax dodge may be one indication that a shift in employer practices, and a lack of legislation and enforcement, are more responsible for the rise of contingent labor than sweeping changes in the economy. In the GAO report, employers cited the need “to accommodate workload fluctuations, fill temporary absences, meet employees’ requests for part-time hours, screen workers for permanent positions, and save on wage and benefit costs”—worker advocacy groups responded that contingent hiring also allows companies to “avoid paying benefits, reduce their workers’ compensation costs, prevent workers’ attempts to unionize, or allow them to lay off workers more easily.” All these same motives, those reported by employers and by advocacy organizations, have played into the internship explosion. In one way or another, interns have fallen into every single category of contingent labor.

“The general attitude in the labor movement about contingent workers across the board is they’re more of a pain in the ass to organize than it’s worth,” says Jim Grossfeld, who has written on the disconnect between contingent youth workers and organized labor. He adds that unions have failed to understand the appeal of flexible working conditions and the dynamics of organizing contingent workers. Small wonder, then, that interns, working substantially though not entirely in nonunion office environments, have been submerged in the larger rise of contingent work and have attracted little attention from organized labor. “People think of internships as part of their strategy for becoming autonomous,” says Grossfeld—and go-it-alone autonomy is pitched as the way to survive a brutal economy. As Andrew Ross writes, “self-direction morphs into self-exploitation, and voluntary mobility is a fast path to disposability.”

A second massive shift in the post–World War II workplace, contributing in equal measure to the internship boom, has been the rising field of Human Resources. At least at the mid-size and larger organizations that have led the way, internship programs are typically the responsibility of HR departments, which continue to have a strong interest in both initiating and institutionalizing them. “They’ve realized it comes to bite them in the butt when they don’t,” one HR executive told me, explaining that the original impetus to bring in interns may come from executives or other employees, but that HR professionals can ultimately be held accountable for the safety and legal issues that sometimes result.

Companies are usually looking for immediate benefits from their interns, and the creation of such roles is often a response to the most local of concerns, like returning a favor or handling overflow work. Given such informal beginnings, many internship arrangements fly under the radar for years, even at larger firms. It’s still common for “Hey, let’s get an intern” to be the impetus, leading to a casual post on Craigslist or the company website with little thought given to setting up a professional, sensible program. “Economic rationality would say that of course they would have all this worked out,” says sociologist Mark Granovetter of organizations, “and that they would only invest up to the point where the marginal return equaled the marginal cost, but it’s so hard to gauge that. And you have to figure out what the likelihood is of the employee staying with you, and what the training is worth, and no one really knows those things.” Not to mention that sometimes the motive is as simple as wanting to copy other firms in one’s industry. “One of the things that sociologists find about a lot of HR practices is that there’s a high level of what is called mimetic isomorphism,” adds Granovetter, “which basically means that whatever everyone is doing you do, because otherwise you look like you’re not a modern firm.”

Indeed, even the establishment of HR departments, beginning particularly in the 1920s and accelerating during and immediately after World War II, fits the copycat pattern. In 1946, the number of Americans employed in labor relations and what was then called “personnel” was under 30,000, increasing markedly to 53,000 in 1950 and 93,000 in 1960—a growth pattern far in excess of other professions and the broader workforce during the same period. From a relative rarity before World War II, personnel departments rapidly became a virtual requirement for modern firms, represented in 63 percent of surveyed companies by 1946 and 79 percent by 1953.

If many of these departments were first established to thwart or parley with labor unions, they have taken on a raft of much broader functions over the years. Finding a cost-effective method of recruiting new employees and replacing departing ones came to be considered one of their core responsibilities. To the extent that it drew on corporate initiative and interest, the later rise of internships was predicated on and shaped by the human resources profession. By now, almost all sizable employers have more or less formal internship programs of one kind or another—in many cases, an outgrowth of college recruitment efforts for which the company has dedicated HR personnel. “ ‘Efficiency imperatives’ became less imperative as modern personnel administration became standard operating procedure,” write Baron, Dobbin, and Jennings in their analysis of the rise of HR. Pamela Tolbert and Lynne Zucker, looking at the spread of “best practices” touted by and shared between many HR departments, echo this view: “As an increasing number of organizations adopt a program or policy, it becomes progressively institutionalized, or widely understood to be a necessary component of rationalized organizational structure.”10

“Back then, Human Resources was—not entirely, but almost entirely—transactional,” says Howard Curtis, a long-time human resources executive, of the profession’s early decades. “Certainly leaders understood that there was a cost to hiring, a cost to turnover, a cost to leaving a job vacant” and a cost to strikes, but not much more than that. As late as the 1970s and 80s, the profession was dominated by perspectives from psychology, counseling, and labor relations—the traditional subjects in which many HR managers received their degrees; more recently a marked business orientation set in. The constant push, evident in countless books and conferences, has been to promote HR as a “strategic function” with “a seat at the table” on big decisions. At bigger firms, internships can be seen as part of the HR repertoire, one tool among others for advancing and justifying the HR profession on a wider stage. Internships have become part of an unstoppable, unimpeachable standard operating procedure.