2

Colonial and Pioneer Medicine Set the Stage for the Patent-Medicine Industry


Medicine-show people were able to make a living because they were competitive, scientifically and economically, with their contemporaries in the medical field. Given the state of the medical arts in the nineteenth century, self-dosing with a proprietary medicine was as reasonable a treatment as any other. To paint an accurate picture of the medicine show phenomenon, one must consider both the development of the American medical profession and the growth of the patent-medicine industry; this chapter and the next take up those two topics.

We take for granted much of today’s medical technology—anesthesia, antibiotics, scanning devices, and so on. In the history of medicine, however, these advances are very recent; the twentieth century has seen unprecedented scientific development. Until the first few decades of the nineteenth century, the body of medical knowledge was slim indeed. Various diseases had been identified, but their causes were unknown. Most medical practitioners subscribed to one of three theories: disease was an imbalance of the four humors—blood, phlegm, yellow bile, and black bile; it was due to a loss of body heat; or it was due to either too much tension or too much relaxation.1 Germ theory was unknown, as was knowledge of cancer cells, cholesterol, or most other causes of illness.


Colonial and Pioneer America, or Home (Sick) on the Range

In Colonial times, professional medical treatment was not only primitive, it was largely unavailable. In 1775, there were only 400 doctors with university degrees in the colonies, and they lived in cities of the Eastern Seaboard.2 Even if there had been enough doctors to serve the population, they wouldn’t have done much good. So often did doctors kill their patients that they were said to be God’s nutcrackers—they opened the corporeal shell to let the soul escape.

The English practice of separating the professions of pharmacist, doctor, and surgeon had to be abandoned in the colonies. Specialization works only where there is sufficient population to support it, and this was not the case in the New World. There were no guilds or professional associations to standardize and regulate the profession, so men who had formerly been apothecaries or apprentices in England were free to call themselves physicians in America. In the seventeenth century, several attempts were made to license physicians, but such licenses served as recommendations rather than permits. Licensing was meant to encourage excellence among those who undertook to heal the sick, not to exclude those who lacked knowledge or skill. Those without a license, often called domestic practitioners, were free to operate as they chose. If a cure or two were effected, a person was often made a doctor by approbation. One such person

was a fellow with a worsted cap and great black fists. They stiled him doctor. Flat [the innkeeper] told me he had been a shoemaker in town and was a notable fellow in his trade, but happening two years agoe to cure an old woman of a pestilential mortal disease, he thereby acquired the character of a physitian, was applied to from all quarters, and finding the practice of physick a more profitable business than cobling, he laid aside his awls and leather, got himself some gallipots, and instead of cobling soals, fell to cobling human bodies.3

Anyone with a flair for healing was given their due. In the seventeenth century, before the formation of medical schools and societies, women doctors were not subject to gender bias. A Sarah Alcock of Roxbury, Massachusetts, was “very skilled in physick and chirurgery,” and was praised “within the gates.” In the same state in 1663, the good people of Rehoboth lured Mrs. Bridget Fuller away from the neighboring town of Plymouth so she could serve the community as midwife.4

After 1760, there was a shift in medical practice. Men traveled to Edinburgh, Scotland, or to the College of Philadelphia for a degree. Graduates of these august institutions began to form a medical elite. Once again, a trend toward regulation arose, and it lasted for a decade or so. Various colonies passed regulations for medical practice, but all such regulations were eventually abolished. It was not yet possible or practical to separate scientific knowledge from mere empiricism.5 It would be many decades before the medical profession was organized and licensed.

After the American Revolutionary War and the War of 1812, Americans saw medical regulation as a loathsome holdover from England. Americans had not fought and won two wars against the British in order to emulate them. Doctors themselves, like their fellow Americans, didn’t want a hidebound medical profession any more than they wanted primogeniture, a monarchy, or any other hierarchical British form of social organization.6 Furthermore, “the physical immensity of the country and the abundance of free land frustrated every attempt to graft European institutions onto American society.”7 Anyone with a medical problem was free to seek treatment from any available source. Accepted protocols were a mix of superstition, apocrypha, case histories, and anything that seemed like a good idea at the time.

The Colonial doctor built his practice by inspiring confidence, but few doctors could claim a truly confidence-inspiring record. Some physicians published testimonials or encouraged favorable word of mouth. The most honest thing that could be said of a doctor was that he had a knack. Hardly anyone claimed to produce consistent results, even those who had been to college.

The most respected physician in the Colonial era was Dr. Benjamin Rush, a signer of the Declaration of Independence. Rush went to Edinburgh for his degree and was a professor at the College of Philadelphia. He was a rationalist whose thinking was consonant with the eighteenth-century Enlightenment. He believed that superstition had no place in medicine and that the secrets of the universe could be unlocked with applied study. While that may seem like a commendably modern approach, Rush’s medical theories were unfortunately both archaic and lethal. Rush believed in the one-cause-fits-all model of disease, namely “morbid excitement caused by capillary tension.”8 In other words, the patient must be bled. He wielded his lancet with fervor, and even as his patients turned into corpses, he was not dissuaded from his theory. When a yellow fever epidemic killed 4,000 Philadelphians (10 percent of the population) in 1793, Rush’s patients died by the score. Rush bled his patients so thoroughly that the sticky stream often overflowed the containers placed to catch it. Yellow fever is spread by mosquitoes, and as blood dripped to the floor, mosquitoes swarmed. In all probability, many of Rush’s patients would have died anyway, but some might have had a chance had they kept their bloodstream intact. It was plain to Rush’s detractors that bleeding did nothing to stanch the epidemic, but Rush held his ground and his lancet.

Rush’s post-epidemic reputation was somewhat diminished, but he continued to represent the best the profession had to offer. Among the doctor’s victims over the course of his long career were his own sister, three apprentices, and George Washington. As the father of our country lay dying with a nonspecific infection, Rush relieved him of thirty-two fluid ounces of blood, more or less ensuring Washington’s demise. When Rush wasn’t draining his patients (he never really knew for certain how much blood the human body contained), he was either blistering or purging them. Blistering succeeds only in causing unnecessary pain, and purging causes dehydration.9

Another medical torture inflicted by Rush and his contemporaries was calomel—a tasteless white powder of mercurous chloride that produces at least as much suffering as the ailment for which it is prescribed; and in the Colonial era, it was prescribed frequently. “The doctor comes with good free will, but ne’er forgets his calomel,” went a popular rhyme.10 Calomel causes a heavy flow of saliva, bleeding gums, mouth sores, tooth loss, and an unfettered, bloody evacuation of the bowels. Patients hated it, and those who prescribed it hardened themselves for the messy business of doctoring. “We went into it with our sleeves rolled up,” said one physician.11 Early medical treatment was not for the faint of heart, neither patient nor doctor. It was assaultive, invasive, full of revolting effluvia, and often useless.

The few hospitals that existed in Colonial times had little to offer. Conditions were far from antiseptic, and one could easily be bled or purged at home. Patients deemed morally unfit were not admitted. The designation generally applied to unwed mothers, those with venereal disease, and vagrants.12

Doctors were not entirely to blame for the morbidity of the population. Ignorance of the causes of disease and workings of the body made for some appalling conditions. There was no sanitation because there was no perceived need for it. Sewage was dumped into the streets, sometimes right onto the heads of unfortunate passersby. What didn’t flow into the gutters was consumed by swine, which were in turn consumed by people. Salt pork was eaten in enormous quantities, and vegetables were eaten hardly at all. The water supply was impure; in fact, there were no municipal water systems until the mid–1800s. On farms, outhouses were located near wells, and clothing was caked with manure and worn until it fell apart. Even the beds of the wealthy were infested with vermin. Bathing was considered an eccentricity. Night air was thought to be poisoned.

Almost everyone, even children, drank oceans of liquor. Madeira, rum, cider, applejack, sherry, rye, and whiskey were popular beverages. Alcohol was consumed at every meal, in between meals, at polling booths, barn raisings, weddings, and funerals. It was partial pay for soldiers, de rigueur as a token of hospitality, the pick-me-up of choice on work breaks, and an ingredient in folk remedies. It wasn’t until late in the nineteenth century, when the temperance movement gained momentum, that alcohol consumption slackened. In the first half of the nineteenth century, Americans drank twice the amount we do today, and further courted illness with nonexistent sanitation and poor nutrition.13

Epidemics raged through the country in the first half of the nineteenth century, as well. Yellow fever–bearing mosquitoes took the lives of many who could have been saved by proper sanitation, fresh air, and fluid replacement. The disease was terrifying, and vulnerable citizens fought it with the best information at hand—superstition. Cannons were fired to scare it, amulets of garlic and pieces of tar were carried, as were handkerchiefs soaked in vinegar; but swamps were not drained and rain barrels were not covered.14

Another terrible disease, cholera, decimated the population in 1832, 1849, and 1866. The relentless worldwide 1832 epidemic of Asiatic cholera made its way from India into every population center in the United States. It swept through city slums, bolstering the widely held notion that illness was the just reward of an immoral life—immorality being equated with poverty in Calvinist philosophy. Death from cholera ensued quickly, followed by panic and despair among the living. In Chicago, everyone who could leave did; in Detroit, funeral bells that had been ringing ceaselessly were banned for their demoralizing effect; in New Orleans, the disease claimed 300 victims a day:

It was not so much the number of cases and high fatality of the disease, but the mysteriousness and suddenness with which it struck that filled people with a dread and fear that often reached panic. Persons in excellent health were suddenly stricken with a feeling of uneasiness and shortly were consumed with inward burnings and a craving for cold drinks; then came vomiting, intestinal spasms almost as severe as in tetanus cases, and finally, general debility, slow circulation, sunken eyes, cold lifeless skin, and collapse. The fate of many victims was decided within a few hours.15

Cholera is a bacterial infection that usually spreads through contaminated water or milk. During the nineteenth century, however, its cause was unknown. Many erroneous theories were considered, such as comets, poisoned air, and fumes from the ground. One rumor averred that licking postage stamps caused the disease. No new remedy was considered too offbeat or misguided to try; sulfur pills were thought to prevent the disease, and one Alabama doctor administered tobacco-smoke enemas as a sure cure. A doctor named Drake almost deduced the real cause of cholera: He believed in the presence of animalcules, creatures too tiny to see—an idea that came close to germ theory. Although the disease still occurs, cholera is relatively easy to cure if the patient can get antibiotics and fluid replacement.16 Unfortunately for cholera victims in earlier times (and for some third-world populations today), those life-saving treatments were not available.

The pioneers of the midwestern states did not fare much better with health than their city cousins. Until the twentieth century, 90 percent of Americans lived in the country, and few doctors dwelt among them. If a pioneer wanted shelter, he had to build it; if he wanted food, he had to grow it; and if he needed clothing, he (or she) had to make it. Women worked as hard as men, and children worked as hard as they could. Every aspect of daily life required effort. Short-handed, poorly trained prairie doctors fought an uphill battle to control illness in an exhausted population. When sickness descended, and it did so with depressing regularity, nothing got done. Winters were harsh, summers were stifling, and to make things worse, the pioneer diet abetted the onset of illness. Midwestern settlers ate mostly salted meat and fat. Even teething babies were given bacon rinds to chew. Tobacco was chewed and expectorated with no regard for sanitation, much less aesthetics. As in the cities, milk and water supplies on the farm were often contaminated. Alcohol flowed. It was reported in the Medical Repository early in the century:

The inhabitants are almost constantly in a state of repletion, by stuffing and cramming, and by the use of stimulating drink. The consumption of animal food is probably much greater in the Fredonian [United] states, than in any other civilized nation; and it ought likewise to be observed, that the quantity of ardent spirits drank by our people, exceeds everything of the kind, that the world can produce; the appetite for inebriating drink seems to be increasing and insatiable.17

The middle of the continent, long before it was paved over with housing developments and shopping malls, was a mosquito-infested swamp. The land around the Great Lakes was known for its sickliness, as was the land adjoining the Erie Canal. The postcolonial settlers of Kentucky and Ohio complained of epidemic fevers and influenzas. In 1807, for example, malaria raged through the Ohio Valley. Settlers identified a chain of cause and effect between stagnant, putrid water and malaria, but their analysis was off the mark—it was mosquitoes, not swamp gas, that carried the disease. Michigan residents, Native Americans included, battled malaria every summer. There, after the land was plowed in the spring, it was observed that

the malarial gases set free, that country became very sickly. Crops went back into the ground, animals suffered for food, and if the people had not been too sick to need much to eat they, too, must have gone hungry. The pale, sallow, bloated faces of that period were the rule; there were no healthy faces except of persons just arrived.18

Ohio, Illinois, and Indiana were plagued with yellow and spotted fevers, scarlatina, whooping cough, and diphtheria. A nonspecific infection killed one-eighth of the population in Indianapolis in 1821. A wave of typhoid and pneumonia hit in 1838 and wiped out half of Elkhart County, Indiana. “Bilious” fevers were rampant. Ohio journalist and lawmaker James Kilbourne said, “Respecting the healthfulness of this country, I have to repeat that it is in fact sickly in a considerable degree.”19

The diseases of the East Coast traveled west with no impediment: Measles, smallpox, mumps, and scarlet fever attacked frontier settlements. Influenza and erysipelas, also known as black tongue, were epidemic. Skin rashes affected whole villages at a time; rheumatism was common, perhaps due to the habit of allowing wet clothing to dry on the body. Children often came down with croup. Bad dental hygiene permitted various infections to take hold.20 On the other hand, heart disease and degenerative diseases of the liver and kidneys were largely unknown—because few people lived long enough to develop them.21

Of all the ailments that plagued the region, the most common was a malaria-like infection called ague. It struck so frequently that it was barely worth a mention: “He’s only got the ager,” it would be said of a sufferer. Some types of ague caused chills and shaking so severe it was impossible to hold a tool. It was reported in 1836 that an Illinois family had to stop shingling their cabin roof for fear their shaking would cause them to fall off. Other types of ague were feverish; some caused alternating fevers and chills. Typical symptoms were

yawning and stretching, a feeling of lassitude, blueness of the fingernails, then little cold sensations which increased until the victim’s teeth chattered in his jaws and he “felt like a harp with a thousand strings.” As the chills increased, the victim shivered and shook “like a miniature earthquake.” After an hour or so, warmth returned, then gradually merged into raging heat with racking head pains and an aching back. The spell ended with copious sweating and a return to normal.22

Someone who knew the symptoms all too well wrote:

You felt as though you had gone through some sort of collision, thrashing-machine or jarring machine, and came out not killed, but the next thing to it. You felt weak, as though you had run too far after something, and then didn’t catch it. You felt languid, stupid and sore, and was down in the mouth and heel and partially raveled out. Your back was out of fix, your head ached and your appetite crazy. Your eyes had too much white in them, your ears, especially after taking quinine, had too much roar in them, and your whole body and soul were entirely woebegone, disconsolate, sad, poor, and good for nothing. You didn’t think too much of yourself and didn’t think other people did, either; and you didn’t care. You didn’t quite make up your mind to commit suicide, but sometimes wished some accident would happen to knock either the malady or yourself out of existence. You imagined that even the dogs looked at you with a kind of self-complacency. You thought the sun had kind of a sickly shine about it. About this time you came to the conclusion that you would not accept the whole state of Indiana as a gift; and if you had the strength and the means, you picked up Hannah and the baby and your traps, and went back “yander” to “Old Virginny,” the “Jerseys,” Maryland or “Pennsylvanny.” 23

Sickness was thought to be unavoidable. Even when vaccinations were available, many farmers rejected them as contrary to the will of God. Persuading a patient to submit to treatment was no easy task. One country doctor remarked, “Among the most disagreeable things attending the practice of medicine, are the prejudices the physician must constantly meet with, either in the mind of the patient, or in those of his friends. It is easier to cure the body complaint of a hundred persons than to eradicate the prejudices from the mind of one.”24 A widely held prejudice was that one’s health was determined by providence alone. The Indian Doctor’s Dispensary, published in Cincinnati in 1813, says, “If the Lord will, I shall get well by this means or some other.”25


Folk Remedies

Pioneer women were responsible for running the home, and that duty included medical care. It was assumed that a wife knew her family better than any doctor could.26 A doctor, even if one could be located, was usually a last resort and was only called for in serious cases. Before a doctor was summoned, it was likely that a pioneer wife had first tried a folk remedy. She might have used her own recipe, or one suggested by another woman, who was, perhaps, the local midwife. If there were no knowledgeable neighbor, then a Native American medicine man or local botanist might have been relied on to offer a cure. The local materia medica was a combination of plants, animals, and superstition.27 A remedy was given credence only if it tasted like hell—“one bad devil to drive out another.”28 Some recipes were, by today’s standard, absurdly complicated, time consuming, and definitely not for the squeamish. The following recipe is for a massage oil to ease the pain of rheumatism:

Take a young fat dog and kill him, scald and clean him as you would a pig, then extract his guts through a hole previously made in his side, and substitute in the place thereof, two handfuls of nettles, two ounces of brimstone, one dozen hen eggs, four ounces of turpentine, a handful of tanzy, a pint of red fishing worms, and about three-fourths of a pound of tobacco, cut up fine; mix all those ingredients well together before deposited in the dog’s belly, and then sew up the whole [sic], then roast him well before a hot fire, save the oil, anoint the joints and weak parts before the fire as hot as you can bear it.29

Whether the recipe was good for the patient is not known—it most certainly was not good for the dog.

Dozens of home remedies were jotted down in family albums, passed along in conversation, or published in newspapers. For tapeworm, drink pumpkin-seed tea; treat rheumatism with bear or rattlesnake oil; draw out measles with saffron; cure an itch with soft soap applied with a corn cob, followed by sulfur and lard. Goose grease was prescribed for just about everything.30 Pioneer women spent a good deal of their time identifying, cultivating, and gathering ingredients for home cures. For some, it was an avocation, for others a necessity. Recipes were handed down and self-help medical almanacs were consulted.

The first European settlers of the New World brought their herbal recipe books from England. Bancke’s Herbal, first published in 1525, was popular, as was Culpeper’s Complete Herbal and English Physician, published in 1652. Because the plant species that grew in the New World differed from those of England, many recipes were rendered useless. Herbs had to be shipped from the Old Country, which was expensive and impractical once a disease had already taken hold. Some ingredients were grown here; Paul Revere’s garden in the North End of Boston was a re-creation of an English kitchen garden. Colonists tended to ignore the profusion of indigenous healing plants that grew right under their noses.

Native Americans, to those who cared to acknowledge the obvious, were onto something regarding health and hygiene. A Dutch settler observed, “It is somewhat strange that among these barbarous people, there are few or none who are cross-eyed, blind, crippled, lame, hunch-backed, or limping men; all are well-fashioned people, strong and sound of body, well-formed, without blemish.” Even though Indian remedies tended to be simpler and more effective than those of the white settlers, it took a long time for them to gain acceptance. The Continental Army had a list of forty-eight plant remedies, and only three were made from native plants. Little by little, however, white Americans overcame their Eurocentrism. Native American cures became very popular, especially in the midwest, where English botanicals were unavailable. Indigenous medicine, as discussed later, eventually had a big influence on the American patent-medicine industry.31


Who Is a Doctor?

The formation of the American medical profession was a chaotic affair. For decades, everybody pointed a finger at everybody else and screamed, “Quack!” In America, as in the mother country, “Quacks were other people. Everybody felt happy in execrating the quack, because, everybody could agree, the quack was someone else.”32 Only those who worked with their hands were worthy of respect in a culture where everything was homemade. A pioneer doctor had to supplement his uncertain income with farming, blacksmithing, or another trade. Doctors were as low on the social scale as their leeches. They were regarded as parasites who lived off the labor of real men. On the rare occasions when a doctor’s services were called for, he had to ride long distances on muddy tracks and create his own pharmacy on site.33

There were a few medical schools in the nineteenth century, but they were not adequate to serve a rapidly expanding nation. Americans were too busy subduing the continent to spend time on such frivolities as a protracted education. The only schooling deemed necessary was that which could be applied to an immediate, practical result—say, setting broken bones. Doctors were not often called for the treatment of infectious diseases, even after germ theory had been accepted in the late nineteenth century. As the country expanded, and distances between settlements grew, educational standards dropped. By the middle of the nineteenth century, there were several medical schools, but knowledgeable consumers rightly regarded them as degree mills. Medical schools were run with an eye to profitability, not excellence.34 A typical course of study lasted four or five months. Schools had no books, no exams, and no labs. Instruction was given in the form of lectures, dry discourses handed down to the entire class with no opportunity for questions or discussion.35 Neither were there opportunities for clinical practice.

Many pioneer doctors were illiterate. The dean of a medical college in 1870 declined to give written exams because so few students could read.36 Illiteracy did not further damage a doctor’s low standing, because book learning was highly suspect. For many pioneer doctors, an apprenticeship was the best way to enter the profession. An aspirant would live in a doctor’s home, roll pills, mix powders, take care of the horse, and if he could, he would “read medicine.” After a few years of keeping a discreet distance on house calls, the young doctor might venture an opinion or tie a splint. When he was deemed fit to practice by his mentor, the young man went his own way.37 Medical training came into the modern era in 1893, with the establishment of the Johns Hopkins School of Medicine. The school required a college degree for entrance and was the first to have a well-equipped lab and a teaching hospital. It was decades before this prototype of medical training was widely copied.38

In the meantime, scientifically dubious practitioners multiplied in number and type. There were hydropaths, homeopaths, botanico-medical, physio-medical, uroscopian, electric, eclectic, hygeo-therapeutic, faith, and many other “healers.” The midwest harbored quacks as manure does flies. Indiana was called “a sink-hole in medical practice,” and Ohio “a paradise for the incompetent.”39 It was a freewheeling medical environment in which the medicine showman with his tall silk hat and folksy manner could operate unrestrained.


The Thomsonians

One of those old, discredited methods of healing was especially influential in preparing the way for the nostrum seller in America. The Thomsonians, believers in the teachings of Samuel Thomson, promoted herbal self-medication and rejected the “riglar” doctor with the religiosity of the converted. Thomson was an unlettered, self-taught herbalist. His system of diagnosis and cures sold for twenty dollars and conferred on the purchaser the right to practice it. His method was so simple that he believed the medical profession would soon be obsolete. Thomson’s theory of disease was as simple as it was wrong-headed: According to him, all illness is caused by a deficiency of body heat. Restore the body’s capacity to warm itself, Thomson asserted, and disease will vanish. To that end, Thomson relied heavily on steam baths and the ingestion of Lobelia inflata, commonly known as puke weed. Thomson regularly steamed his patients to the point of fainting, and then administered Lobelia or some other herb guaranteed to flush the system with alarming speed. He claimed to cure “dropsy, cancer, humors, mortifications, ‘felons,’ dysentery, consumption, rheumatism, ‘scalt’ head, venereal diseases and fits.”40 Thomson was once arrested for murder in Salisbury, Massachusetts, after one of his patients died, allegedly from a Lobelia overdose. Thomson was acquitted. In 1813, he went to Washington, D.C., where he obtained a patent on his system. (Patents were issued at that time, although not for originality or usefulness.)

Thomsonians themselves formed societies for moral support and the exchange of information. Groups sprang up like puke weeds. Their motto was “To make every man his own physician.”41 Thomsonian “irregulars” waged a propaganda and legislative war against the “regular” doctors. Regulations that had hindered the Thomsonians were repealed, and the medical establishment was humiliated. Thomsonians were convinced they were fighting for true medical principles, and the abolition of a monopoly on health care. Their unscientific approach horrified the regulars, who saw in Thomsonianism a threat to both public health and their personal finances.

Thomson gathered a wide following. Between 1830 and 1840, an estimated one-sixth of the American population practiced his method.42 By 1840, 100,000 patent rights had been sold.43 Thomsonians made the process of quack identification easy: The enemies of health were the doctors who spoke Latin. They “are not of the people, but arrayed against the people, and bent on killing them with rat’s bane…. They bleed us to fainting, blister us to wincing, stupefy us with opium…. Go among their patients and labor to overthrow a long established confidence! Denounce them as mercenary! Break down the aristocracy of learning and science!”44 This sort of rant was in accordance with the prevailing philosophy of Jacksonian democracy: No one was above anyone else. A call to the healing profession was more important than a degree.45 After 1840, the Thomsonian movement splintered, creating more choices for consumers—and more confusion as well. The Thomsonians were in disarray, but they had inculcated an ethic of self-care in American culture.

The midwestern pioneer was a plum to be picked by anyone with a homespun approach. Midwesterners had a strong regional self-identity. Foreign ways, such as speaking Latin, were suspect. Even the customs of the more civilized east-coast states were considered effete. “At all times, the pioneer reserved the sovereign right to try to make the science of medicine conform to his concept of democracy, to criticize, complain, refuse to regulate, do his own doctoring, or none at all.”46 The proud unlearned pioneer, in his effort to resist a con, fashioned himself into the most compliant sucker imaginable. The medicine show doc, with his populist appeal, couldn’t have had a more receptive audience.

In self-defense, regular doctors formed medical societies. The various groups struggled to survive, but many disbanded. There was constant internecine warfare among the schools, licensing agencies, and societies. This was consistent with a general rejection of British institutions. Societies came and went until the formation of the American Medical Association in 1847. Even for decades after that medical watershed, you were a doctor if you simply said you were.47 The operative principle in American medicine was laissez-faire. Thus, the stage was set for the unprecedented growth of the patent-medicine industry.