The organization of health care services is an important factor in determining which forms of care are made available to members of a community. Within a given country, there typically exist multiple levels of care and multiple forms of medical practice. In general, the more sophisticated the health care system, the more expensive it is for a country to maintain. Thus, variations in health care costs between communities and countries often reflect differences in organization and sophistication. Costs associated with health care may be subsidized by the government, by private groups, by the public, or by some combination thereof. In all cases, however, costs can influence the efficiency with which services are administered and distributed, and the extent of care made available must be balanced against costs in order to ensure that quality services reach the public in an affordable fashion. Variations in the quality and level of services provided within different societies are intimately associated with the organization of medical practice itself.
It is generally the goal of most countries to have their health services organized in such a way as to ensure that individuals, families, and communities obtain the maximum benefit from current knowledge and technology available for the promotion, maintenance, and restoration of health. In order to play their part in this process, governments and other agencies are faced with numerous tasks, including the following: (1) They must obtain as much information as is possible on the size, extent, and urgency of their needs; without accurate information, planning can be misdirected. (2) These needs must then be reviewed against the resources likely to be available in terms of money, manpower, and materials; developing countries may well require external aid to supplement their own resources. (3) Based on their assessments, countries then need to determine realistic objectives and draw up plans. (4) Finally, a process of evaluation needs to be built into the program; the lack of reliable information and accurate assessment can lead to confusion, waste, and inefficiency.
Health services of any nature reflect a number of interrelated characteristics, among which the most obvious, but not necessarily the most important from a national point of view, is the curative function; that is to say, caring for those already ill. Others include special services that deal with particular groups (such as children or pregnant women) and with specific needs such as nutrition or immunization; preventive services, the protection of the health both of individuals and of communities; health education; and the collection and analysis of information.
In the curative domain there are various forms of medical practice. They may be thought of generally as forming a pyramidal structure, with three tiers representing increasing degrees of specialization and technical sophistication but catering to diminishing numbers of patients as they are filtered out of the system at a lower level. Only those patients who require special attention either for diagnosis or treatment should reach the second (advisory) or third (specialized treatment) tiers where the cost per item of service becomes increasingly higher. The first level represents primary health care, or first contact care, at which patients have their initial contact with the health-care system.
Primary health care is an integral part of a country’s health maintenance system, of which it forms the largest and most important part. In 1978, at the International Conference on Primary Health Care in Alma-Ata, USSR (now Almaty, Kazakhstan), the Declaration of Alma-Ata was adopted. According to the declaration, primary health care should be “based on practical, scientifically sound and socially acceptable methods and technology made universally accessible to individuals and families in the community through their full participation and at a cost that the community and country can afford to maintain at every stage of their development.” Primary health care in developed countries is usually the province of a medically qualified physician; in developing countries, first contact care is often provided by nonmedically qualified personnel.
The vast majority of patients can be fully dealt with at the primary level. Those who cannot are referred to the second tier (secondary health care, or the referral services) for the opinion of a consultant with specialized knowledge or for X-ray examinations and special tests. Secondary health care often requires the technology offered by a local or regional hospital. Increasingly, however, the radiological and laboratory services provided by hospitals are available directly to the family doctor, thus improving his or her service to patients and increasing its range. The third tier of health care, employing specialist services, is offered by institutions such as teaching hospitals and units devoted to the care of particular groups—women, children, patients with mental disorders, and so on. The dramatic differences in the cost of treatment at the various levels are a matter of particular importance in developing countries, where the cost of treatment for patients at the primary health care level is usually only a small fraction of that at the third level. Medical costs at any level in such countries, however, are usually borne by the government.
Ideally, provision of health care at all levels will be available to all patients. Such administration of health care may be described as universal. The affluent, both in relatively wealthy industrialized countries and in the poorer developing world, may be able to get medical attention from sources they prefer and can pay for in the private sector. The vast majority of people in most countries, however, are dependent in various ways upon health services provided by the state, to which they may contribute comparatively little or, in the case of poor countries, nothing at all.
The costs to national economies of providing health care are considerable and have grown at a rapid rate, especially in countries such as the United States, Germany, and Sweden. This trend has been the cause of major concern in both developed and developing countries. Some of this concern is based upon the lack of any consistent evidence to show that more spending on health care produces better health. There is a movement in developing countries to replace the type of organization of health care services that evolved during European colonial times with a less expensive, and for them more appropriate, health care system.
In the industrialized world the increased costs of health services have caused both private and public health-care delivery systems to question existing policies and to seek more economical methods of achieving their goals. Despite expenditures, health services are not always used effectively by those who need them, and results can vary widely from community to community. In the United Kingdom, for example, between 1951 and 1971 the death rate fell by 24 percent in the wealthier sections of the population, but by only half that in the most underprivileged sections of society. The achievement of good health is reliant upon more than just the quality of health care. Health entails multiple factors, including good education, safe working conditions, a favourable environment, amenities in the home, well-integrated social services, and reasonable standards of living.
Developing countries differ from one another culturally, socially, and economically, but what they have in common is a low average income per person, with large percentages of their populations living at or below the poverty level. Although most have a small elite class, living mainly in the cities, the largest part of their populations live in rural areas. Urban regions in developing and some developed countries have developed pockets of slums, which are growing because of an influx of rural peoples. For lack of even the simplest measures, vast numbers of urban and rural poor die each year of preventable and curable diseases, often associated with poor hygiene and sanitation, impure water supplies, malnutrition, vitamin deficiencies, and chronic preventable infections. The effect of these and other deprivations is reflected in disparities in life expectancy: men and women in developing countries typically live shorter lives than their counterparts in industrialized countries. The extension of primary health care services is therefore a high priority in the developing countries.
Developing countries, lacking the proper resources, have often been unable to generate or implement the plans necessary to provide required services at the village or urban poor level. It has, however, become clear that the system of health care that is appropriate for one country is often unsuitable for another. Research has established that effective health care is related to the special circumstances of the individual country, its people, culture, ideology, and economic and natural resources.
The rising costs of providing health care have influenced a trend, especially among developing countries, to promote services that employ less highly trained primary health care personnel who can be distributed more widely in order to reach the largest possible proportion of the community. The identification and treatment of the principal medical problems in these countries is, for the most part, relatively straightforward, with success relying primarily on the depth of resources and number of personnel available. Furthermore, preventive measures are usually simple and cheap, and treatment and prevention do not necessarily require extensive professional training: in most cases these elements of care can be addressed adequately by the “primary health worker,” a term that includes all nonprofessional health personnel.
Those concerned with providing health care in the developed countries face a different set of problems. The diseases so prevalent in the Third World have, for the most part, been eliminated or are readily treatable. Many of the adverse environmental conditions and public health hazards have been conquered. Social services of varying degrees of adequacy have been provided. Public funds can be called upon to support the cost of medical care, and there are a variety of private insurance plans available to the consumer. Nevertheless, the funds that a government can devote to health care are limited and the cost of modern medicine continues to increase, thus putting adequate medical services beyond the reach of many. Adding to the expense of modern medical practices is the increasing demand for greater funding of health education and preventive measures specifically directed toward the poor.
Health insurance is a system for the financing of medical expenses by means of contributions or taxes paid into a common fund to pay for all or part of health services specified in an insurance policy or law. The key elements common to most health insurance plans are advance payment of premiums or taxes, pooling of funds, and eligibility for benefits on the basis of contributions or employment. Although health insurance traditionally has been a component of care limited almost exclusively to those workers living above the poverty line in industrialized societies, a number of countries have implemented or are working toward the implementation of compulsory insurance systems.
Health insurance may apply to a limited or comprehensive range of medical services and may provide for full or partial payment of the costs of specific services. Benefits may consist of the right to certain medical services or reimbursement to the insured for specified medical costs. Some types of health insurance may also include income benefits for working time lost because of sickness (i.e., disability leave) or parental leave.
A health insurance system that is organized and administered by an insurance company or other private agency, with the provisions specified in a contract, is known as private, or voluntary, health insurance. Private health insurance is usually financed on a group basis, but most plans also provide for individual policies. Private group plans are usually financed by groups of employees whose payments may be subsidized by their employer, with the money going into a special fund. Insurance of hospital costs is the most prevalent form of private health insurance coverage. Another type is major medical expense protection, which provides protection against large medical costs but avoids the financial and administrative burdens involved in insuring small costs.
Any system that is financed by legally mandated compulsory contributions or by taxes and whose provisions are specified by legal statute is known as government insurance, or social insurance. This type of medical insurance plan dates from 1883, when the government of Germany initiated a plan based on contributions by employers and employees in particular industries. In the United States, Medicare and Medicaid—medical insurance for the elderly and the poor, respectively—are government insurance programs. The distinction between public and private programs is not always clear, because some governments subsidize private insurance programs.
Quite different, however, are government medical care programs (which are sometimes characterized as “socialized medicine” in the United States). In these systems, which are usually financed from general tax revenues, doctors are employed, directly or indirectly, by a government agency, and hospitals and other health facilities are owned or operated by the government. The National Health Service in the United Kingdom and the Veterans Health Administration program operated by the U.S. Department of Veterans Affairs are examples of such systems.
In the United States, health maintenance organizations (HMOs) became popular in the late 20th century as a way to control medical costs through the use of prenegotiated fees for medical services and prescription medicines. An alternative to the HMO is the preferred provider organization (PPO), also known as a participating provider option, which offers features of traditional fee-for-service insurance plans, such as the ability of patients to choose their own health care providers, but also follows the lower-cost strategies of HMOs. For example, those enrolled in a PPO can see any medical provider at any time, without a referral from a primary care physician. However, if the insured uses one of the insurance company’s “preferred providers,” the company generally pays a higher percentage of the cost. In both HMOs and PPOs, the insured is usually responsible for a certain portion of the cost of the medical services, with a co-payment fee (paid by the insured at the time of an office visit) being one of the most common charges.
An HMO, either public or private, provides comprehensive medical care to a group of voluntary subscribers, on the basis of a prepaid contract. HMOs bring together in a single organization a broad range of health services and deliver those services for a fixed, prenegotiated fee.
There are two main types of HMOs, the prepaid group practice model and the medical care foundation (MCF), also called individual practice association. The prepaid group practice type of health care plan was pioneered by the Ross-Loos Medical Group in California, U.S., in 1929. In this model, physicians are organized into a group practice, and there is one insuring agency. The Kaiser Foundation Health Plan in California, the Health Insurance Plan of Greater New York, and the Group Health Cooperative of Puget Sound are generally regarded as innovators of this type of HMO. The MCF usually involves a number of insurance companies. The organization is a loose network of individual physicians, practicing individually and paid on a fee-for-service basis. The medical care foundation reimburses the physicians from the prepaid fees of subscribers.
The U.S. government, which began to promote the HMO concept in the 1970s, viewed HMOs as a means of controlling health care costs (by discouraging physicians from performing unnecessary, costly procedures), meeting the public’s increased demand for health services, and providing health care where it had previously been inadequate. Advocates of prepaid medical plans feel that the HMO, by the nature of its contract, guarantees the availability of health care to those enrolled. They also believe that HMOs foster preventive medicine, encouraging the patient-subscriber to seek treatment early, rather than postponing it out of financial considerations. Thus a potentially serious condition may be diagnosed and treated at an earlier stage and usually at lower overall cost. Opponents of HMOs question this reasoning, arguing that prepayment encourages unnecessary visits to doctors and could, by virtue of the expenses involved, render physicians unable to perform the most thorough testing procedures.
Preventive medicine is directed toward the prevention of disease, either in the community as a whole—an important part of public health—or in the individual. In the 5th century BCE, Hippocrates is believed to have classified causes of disease into those concerned with seasons, climates, and external conditions and those concerned with individual behaviours, such as irregular intake of food, exercise, and other personal habits. Through the Middle Ages the principles of preventive medicine were ignored, in spite of the scourges of leprosy and plague. With the Renaissance came the new learning that revolutionized the whole content of medicine. Practitioners again observed the relation of the seasons, environmental conditions, and personal contact to the incidence of disease.
Concurrent with the growth of medical knowledge there was an empirical movement of practical prevention. For example, in 1388 there was passed the first sanitary act in England, directed to the removal of nuisances. In 1443 came the first plague order recommending quarantine and cleansing, and in 1518 the first rough attempts at notification of epidemic disease and isolation of the patient were made. The study of mortality statistics was initiated in England in the 17th century, and the basis of epidemiology was laid in the mid-17th century. In 1700 a treatise on occupational disorders was published in Italy. An English practitioner in the first half of the 18th century wrote on poisons, on plague and methods of its prevention, and on smallpox, measles, and scurvy. Vaccination was introduced in 1798. The early and middle years of the 19th century were notable for discoveries in the transmission of contagious diseases such as typhus, cholera, typhoid fever, and childbed (puerperal) fever. In the same period increasing attention was given to problems of hygiene and nutrition.
The modern era in preventive medicine opened in the mid-19th century with French chemist Louis Pasteur’s discovery of the role of living microbes as the cause of infections. Toward the close of the century the principle of insect-borne transmission of disease was established. Serological tests were developed, such as the Widal reaction for typhoid fever (1896) and the Wassermann test for syphilis (1906). An understanding of the principles of immunity led to the development of active immunization to specific diseases. Parallel advances in treatment opened other doors for prevention—in diphtheria by antitoxin and in syphilis by arsphenamine. The introduction of the sulfonamide drugs in 1932, and later of the antibiotics, including penicillin, streptomycin, chlortetracycline, and chloramphenicol, afforded new opportunities for the prevention and cure of bacterial diseases.
After 1900 there were many advances in preventive medicine other than those related to infectious diseases. The use of X-rays and radioactive substances in the diagnosis and treatment of disease (e.g., tuberculosis and cancer) as well as in fundamental physiological research opened new possibilities. A greater understanding of endocrine functions, with the production of prepared hormone extracts such as insulin, led to preventive measures in certain metabolic diseases. The role of nutrition in health and disease and the isolation of many essential food factors illustrated the importance to health of adequate diet. In the 21st century, further advances in preventive medicine included a wider recognition of psychological factors in relation to total health, new surgical techniques, new methods of anesthesia, and genetics and epigenetics research.
Medicare and Medicaid are two U.S. government programs that guarantee health insurance for the elderly and the poor, respectively. They were formally enacted in 1965 as amendments (Titles XVIII and XIX, respectively) to the Social Security Act (1935) and went into effect in 1966.
The Medicare program covers most persons aged 65 or older and consists of four related health insurance plans: a hospital insurance plan (called Part A), a supplementary medical insurance plan (Part B), and two privately run plans, Medicare Advantage (Part C) and prescription drug coverage (Part D). The hospital plan, which is financed through Social Security payroll taxes, helps pay the cost of inpatient hospital care, skilled nursing home care, and certain home health services. The plan meets most of the cost of hospital bills for up to 90 days for each episode of illness. (An episode of illness is termed a “benefit period” and lasts from a patient’s admittance to a hospital or nursing facility until he or she has been out of such facilities for 60 consecutive days.) The patient must pay a one-time fee called a deductible for hospital care for the first 60 days in a benefit period, and an additional, daily fee called a co-payment for hospital care for the following 30 days. Medicare covers the rest of the expenses.
The hospital plan also pays for skilled care in a nursing care facility for 100 days if such care begins within 30 days of a period of hospitalization. This nursing care is free for the first 20 days after hospitalization, with the patient required to make a co-payment for any of the next 80 days. A person is thus eligible for 90 days of hospitalization and 100 days of nursing care in any benefit period. In addition, home health visits by nurses or medical technicians are covered by Medicare, as is hospice care for the terminally ill.
A patient becomes eligible for Medicare benefits again anytime he or she has gone for 60 consecutive days without receiving skilled care in a hospital or nursing facility. Reentry into such a facility marks the start of a new benefit period. In addition, each person has a “lifetime reserve” of 60 more hospital days that can be used at any time (including times when the 90 days covered in a benefit period have been exhausted), though a sizable co-payment is required.
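The Part A cost-sharing rules described above follow a simple day-count structure. The sketch below illustrates that structure only; the dollar amounts are placeholders, not actual Medicare figures, and lifetime-reserve days are ignored:

```python
def hospital_cost_share(days, deductible=1600.0, daily_copay=400.0):
    """Rough sketch of Part A cost-sharing within one benefit period.

    Days 1-60: the patient pays only the one-time deductible.
    Days 61-90: the patient also pays a daily co-payment.
    Beyond 90 days: not modeled here (lifetime-reserve days, which
    carry a larger co-payment, are omitted for simplicity).
    The dollar amounts are illustrative placeholders only.
    """
    if days <= 0:
        return 0.0
    patient = deductible                      # one-time fee covering days 1-60
    copay_days = max(0, min(days, 90) - 60)   # days 61-90 carry a co-payment
    patient += copay_days * daily_copay
    return patient

# A 10-day stay costs the patient only the deductible;
# a 75-day stay adds 15 days of co-payments.
print(hospital_cost_share(10))   # 1600.0
print(hospital_cost_share(75))   # 7600.0
```

Note how the structure mirrors the text: the deductible is charged once per benefit period regardless of length of stay, while co-payments accrue only for days 61 through 90.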
[Image caption: A nurse holds the hand of a terminally ill patient at the Hospice of Saint John in Lakewood, Colorado, in 2009. The nonprofit hospice accepts patients regardless of their ability to pay, although most are covered by Medicare or Medicaid.]
Medicare’s supplementary medical insurance plan supplements the benefits provided by the hospital plan and is available to most persons 65 years or older. Persons who enroll in the plan pay a regular monthly premium and a small annual deductible; once the deductible has been met, Medicare pays 80 percent of any bills incurred for physicians’ and surgeons’ services, diagnostic and laboratory tests, and other covered services. Almost all people entitled to the hospital plan also enroll in the supplementary medical plan, which is financed by general tax revenues and members’ payments.
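The 80/20 split above a deductible, as described for the supplementary plan, can be sketched as a minimal calculation. The figures below are hypothetical, and premiums are ignored:

```python
def part_b_payment(covered_charges, deductible_remaining):
    """Split one bill between the insurer and the patient.

    The insurer pays 80 percent of covered charges above the annual
    deductible; the patient pays the deductible portion plus the
    remaining 20 percent (coinsurance). Premiums are ignored.
    All figures are illustrative, not actual Medicare amounts.
    """
    applied_to_deductible = min(covered_charges, deductible_remaining)
    above_deductible = covered_charges - applied_to_deductible
    insurer_pays = 0.80 * above_deductible
    patient_pays = applied_to_deductible + 0.20 * above_deductible
    return insurer_pays, patient_pays

# A $500 bill with $200 of the annual deductible still owed:
insurer, patient = part_b_payment(500.0, 200.0)  # roughly 240 and 260
```

The patient and insurer shares always sum to the covered charges, which is a useful sanity check for this kind of cost-sharing arithmetic.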
Medicare Advantage plans (Part C) are run by private insurance companies approved and subsidized by Medicare. They must cover all services that the original Medicare covers (with the exception of hospice care) but can offer extra coverage, such as vision, hearing, and dental, at additional costs and can have some different rules as to how recipients receive services. Part D is also run by Medicare-approved companies, and an individual must have Parts A and B to enroll. Coverage and costs vary for each plan, but all must provide at least a standard level of coverage set by Medicare. Most drug plans charge monthly premiums as well as deductibles and co-payments, and they commonly have a coverage gap known as the “doughnut hole.” Once a participant and the insurer have paid a certain amount for covered drugs, the individual is responsible for all costs up to a yearly limit, at which point catastrophic coverage kicks in and out-of-pocket costs drop sharply.
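The tiered drug coverage described above (deductible, initial coverage, coverage gap, then catastrophic coverage) amounts to a piecewise calculation over total drug spending. The thresholds and percentages below are illustrative placeholders, not actual Part D figures:

```python
def part_d_patient_share(total_drug_costs,
                         deductible=500.0,
                         gap_start=4000.0,
                         catastrophic_start=7000.0):
    """Sketch of the four Part D phases; all thresholds are placeholders.

    Phase 1: the patient pays 100% up to the deductible.
    Phase 2: the patient pays 25% until total costs reach the gap.
    Phase 3 (the "doughnut hole"): the patient pays 100% until the
             catastrophic threshold is reached.
    Phase 4: the patient pays 5% of everything beyond that.
    """
    patient = 0.0
    # Phase 1: deductible
    patient += min(total_drug_costs, deductible)
    # Phase 2: initial coverage (25% coinsurance)
    phase2 = max(0.0, min(total_drug_costs, gap_start) - deductible)
    patient += 0.25 * phase2
    # Phase 3: coverage gap (patient pays in full)
    phase3 = max(0.0, min(total_drug_costs, catastrophic_start) - gap_start)
    patient += phase3
    # Phase 4: catastrophic coverage (5% coinsurance)
    phase4 = max(0.0, total_drug_costs - catastrophic_start)
    patient += 0.05 * phase4
    return patient
```

With these placeholder parameters, the patient's marginal cost jumps from 25 percent back to 100 percent inside the gap, then drops sharply once catastrophic coverage begins, which is exactly the "doughnut" shape the text describes.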
The legislation enacting Medicare was passed in 1965 under the administration of President Lyndon B. Johnson and represented the culmination of a 20-year legislative debate over a program originally sponsored by President Harry S. Truman. Amendments to the program passed in 1972 extended coverage to long-term disabled persons and those suffering from chronic kidney disease. The program’s rapid and unanticipated growth spurred the federal government to legislate various cost-containment measures beginning in the 1970s, notably one in 1983 that set standard payments for the care of patients with a particular diagnosis. Parts C and D were enacted in 2003 and went into effect in 2006.
Medicaid is a health insurance program established for low-income persons under age 65 and persons over that age who have exhausted their Medicare benefits. The program is jointly funded by the federal government and the states. To participate in the plan, states are required to offer Medicaid to all persons on public assistance. Aside from this, and within broad federal guidelines, the individual states determine the eligibility guidelines for enrollment in their own programs, with Medicaid generally offered to persons whose incomes and assets fall below a certain level. The federal government pays the states from 50 to about 80 percent of their Medicaid costs. Hospital care, physicians’ services, skilled nursing facility care, home health services, family planning, and diagnostic screening are covered by the plan.
Like Medicare, Medicaid quickly grew larger than originally expected, and in 1972 the federal government instituted the first of several sets of cost-containment measures in an effort to reduce the program’s expenditures. From the early 1980s, increasing numbers of physicians refused to treat Medicaid patients because of the low reimbursement levels involved.
The comprehensive U.S. health reform enacted in 2010 significantly affected both Medicare and Medicaid. Under the reform, Medicare’s drug coverage “doughnut hole” was slated to gradually shrink, with complete elimination in 2019. Federal subsidies to Medicare Advantage plans were scheduled to be cut, and Medicare payroll taxes for high-income earners were scheduled to increase starting in 2013.
As a result of the reform, about half of the 32 million people added to health insurance rolls were directed into Medicaid, which was given expanded eligibility in order to cover anyone with an income below 133 percent of the poverty level. Medicaid reimbursements were to be increased to match those of Medicare. The federal government initially was to pay all costs for newly eligible recipients, although its contributions were set to decline gradually.
The approaches to medical practice vary within and between communities and countries. For example, whereas the number of practicing physicians is growing in some countries, it is in decline or is stagnant in others, with the result that there are too few doctors available to fulfill health care demands. The differences in medical practice on national and international levels are revealed by comparisons of the systems utilized in developed countries, such as the United Kingdom and the United States, as well as by comparisons of the systems utilized in developing countries.
Before 1948, general practitioners in the United Kingdom settled where they could make a living. Patients fell into two main groups: weekly wage earners, who were compulsorily insured, were on a doctor’s “panel” and received free medical attention (for which the doctor was paid quarterly by the government); most of the remainder paid the doctor a fee for service at the time of the illness. In 1948 the National Health Service (NHS) began operation. Under its provisions, everyone is entitled to free medical attention from a general practitioner with whom he or she is registered. Though general practitioners in the NHS are not debarred from also having private patients, these must be people who are not registered with them under the NHS. Any physician is free to work as a general practitioner entirely independent of the NHS, though few do so. Almost the entire population is registered with an NHS general practitioner, and the vast majority automatically sees this physician, or one of his or her partners, when they require medical attention. A few people, mostly wealthy, while registered with an NHS general practitioner, regularly see another physician privately. A few may occasionally seek a private consultation because they are dissatisfied with their NHS physician.
A general practitioner under the NHS remains an independent contractor, paid by a capitation fee (i.e., paid according to the number of people registered with the physician). The physician may work entirely from his or her own office and provides and pays for a receptionist, secretary, and other ancillary staff. Most general practitioners have one or more partners and increasingly work in premises built for the purpose. Some of these structures are erected by the physicians themselves, but many are provided by the local authority, the physicians paying rent for using them. Health centres, in which groups of general practitioners work, have become common.
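Capitation payment, as described above, differs structurally from the fee-for-service payment mentioned earlier: income depends on list size rather than on consultations delivered. A toy comparison, with all rates hypothetical:

```python
# Toy comparison of the two payment models discussed in the text.
# All rates and volumes are hypothetical, for illustration only.

def capitation_income(registered_patients, annual_fee_per_patient):
    """Under capitation, income depends only on how many people
    are registered with the physician, not on visits delivered."""
    return registered_patients * annual_fee_per_patient

def fee_for_service_income(consultations, fee_per_consultation):
    """Under fee-for-service, income depends on the number of
    consultations actually provided."""
    return consultations * fee_per_consultation

# A GP with 2,500 registered patients (roughly the average NHS list)
# earns the same under capitation whether patients visit often or rarely.
print(capitation_income(2500, 80.0))        # 200000.0
print(fee_for_service_income(6000, 30.0))   # 180000.0
```

The contrast clarifies the incentive difference: capitation rewards keeping a large, healthy list, while fee-for-service rewards delivering more consultations.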
In the United Kingdom only a small minority of general practitioners can admit patients to a hospital and look after them personally. Most of this minority are in country districts, where, before the days of the NHS, there were cottage hospitals run by general practitioners; many of these hospitals continued to function in a similar manner. All general practitioners use such hospital facilities as X-ray departments and laboratories, and many general practitioners work in hospitals in emergency rooms (casualty departments) or as clinical assistants to consultants, or specialists.
General practitioners are spread more evenly over the country than formerly, when there were many in the richer areas and few in the industrial towns. The maximum allowed list of NHS patients per doctor is 3,500. The average, however, is about 2,500. Patients have free choice of the physician with whom they register, with the proviso that they cannot be accepted by one who already has a full list and that a physician can refuse to accept them (though such refusals are rare). In remote rural places there may be only one physician within a reasonable distance.
Until the mid-20th century it was not unusual for British doctors to visit patients in their own homes. A general practitioner might make 15 or 20 such house calls in a day, as well as seeing patients in his or her office or “surgery,” often in the evenings. This enabled the physician to become a family doctor in fact as well as in name. In modern practice, however, a home visit is quite exceptional and is paid only to the severely disabled or seriously ill when other recourses are ruled out. All patients are normally required to go to the doctor.
It has also become unusual for a personal doctor to be available during weekends or holidays. His or her place may be taken by a partner in a group practice, a provision that is reasonably satisfactory. Many general practitioners, however, now use one of several commercial deputizing services that employ young doctors to be on call. Although some of these young doctors may be quite experienced, patients do not generally appreciate this kind of arrangement.
Whereas in the United Kingdom the doctor of first contact is regularly a general practitioner, in the United States the nature of first-contact care is less consistent. General practice in the United States was in a state of decline in the second half of the 20th century and early part of the 21st century, especially in metropolitan areas. The general practitioner, however, has been replaced to some degree by the growing field of family practice. In 1969 family practice was recognized as a medical specialty after the American Academy of General Practice (now the American Academy of Family Physicians) and the American Medical Association (AMA) created the American Board of General (now Family) Practice. Since that time the field has become one of the larger medical specialties in the United States. Family physicians were the first group of medical specialists in the United States for whom recertification was required.
There is no national health service, as such, in the United States. Most physicians in the country have traditionally been in some form of private practice, whether seeing patients in their own offices, clinics, medical centres, or another type of facility and regardless of the patients’ income. Doctors are usually compensated by state and federally supported programs such as Medicare and Medicaid. Not all doctors, however, accept poor patients. There are also some state-supported clinics and hospitals where the poor and elderly may receive free or low-cost treatment, and some doctors devote a small percentage of their time to treatment of the indigent. Veterans may receive free treatment at Veterans Administration hospitals, and the federal government through its Indian Health Service provides medical services to American Indians and Alaska Natives, sometimes using trained auxiliaries for first-contact care.
In the rural United States first-contact care is likely to come from a generalist. The middle- and upper-income groups living in urban areas, however, have access to a larger number of primary medical care options. Children are often taken to pediatricians, who may oversee their health needs until adulthood. Adults frequently make their initial contact with an internist, whose field is mainly that of medical (as opposed to surgical) illnesses. The internist often becomes the family physician. Other adults choose to go directly to physicians with narrower specialties, including dermatologists, allergists, gynecologists, orthopedists, and ophthalmologists. Patients in the United States may also choose to be treated by doctors of osteopathy. These doctors are fully qualified, but they make up only a small percentage of the country’s physicians. They may also branch off into specialties, but general practice is much more common in their group than among M.D.’s.
It used to be more common in the United States for physicians providing primary care to work independently, providing their own equipment and paying their own ancillary staff. In smaller cities they mostly had full hospital privileges, but in larger cities these privileges were more likely to be restricted. Physicians, often sharing the same specialties, are increasingly entering into group associations, where the expenses of office space, staff, and equipment may be shared. Such associations may work out of suites of offices, clinics, or medical centres. The increasing competition and risks of private practice have caused many physicians to join health maintenance organizations (HMOs). The cost savings to patients are considerable, but they must use only the HMO doctors and facilities. HMOs stress preventive medicine and outpatient treatment as opposed to hospitalization as a means of reducing costs, a policy that has caused an increased number of empty hospital beds in the United States.
While the number of doctors per 100,000 population in the United States has been steadily increasing, there has been a trend among physicians toward the use of trained medical personnel to handle some of the basic services normally performed by the doctor. So-called physician extender services are commonly divided into nurse practitioners and physician’s assistants, both of whom provide similar ancillary services for the general practitioner or specialist. Such personnel do not replace the doctor. Almost all American physicians have systems for taking each other’s calls when they are unavailable. House calls in the United States, as in the United Kingdom, have become exceedingly rare.
In the late 20th century a main goal of the World Health Organization (WHO), as expressed in the Alma-Ata Declaration of 1978, was to provide to all the citizens of the world a level of health allowing them to lead socially and economically productive lives by the year 2000. By the late 1980s, however, vast disparities in health care still existed between the rich and poor countries of the world. In developing countries such as Ethiopia, Guinea, Mali, and Mozambique, for instance, governments in the late 1980s spent less than $5 per person per year on public health, whereas in most western European countries several hundred dollars per year was spent on each person. The disproportion of the number of physicians available between developing and developed countries was similarly wide.
Along with the shortage of physicians, there was a shortage of everything else needed to provide medical care: equipment, drugs, and suitable buildings, as well as nurses, technicians, and all other grades of staff whose presence is taken for granted in affluent societies. Many of these problems still exist in developing countries in the 21st century. Furthermore, within these countries there are few people who can afford to pay for medical care, and in a free market system physicians tend to go where they can make the best living. This situation causes the doctor-patient ratio to be much higher in the towns than in country districts. A physician in Bombay or in Rio de Janeiro, for example, may have equipment as lavish as that of a physician in the United States and can earn an excellent income. The poor, however, both in the cities and in the country, can get medical attention only if it is paid for by the state, by some supranational body, or by a mission or other charitable organization. Moreover, the quality of the care they receive is often poor, and in remote regions it may be lacking altogether. In practice, hospitals run by a mission may cooperate closely with state-run health centres.
Because physicians are scarce, their skills must be used to best advantage, and much of the work normally done by physicians in prosperous countries has to be delegated to auxiliaries or nurses. These workers diagnose common conditions, give treatment, take blood samples, help with operations, distribute simple posters containing health advice, and carry out other tasks. In such places the doctor has time only to perform major operations and deal with the more difficult medical problems. People are treated as far as possible on an outpatient basis from health centres housed in simple buildings. Few patients, however, can travel except on foot, and if they are more than a few miles from a health centre, they tend not to go there. Health centres also may be used for health education.
Although primary health care services differ from country to country, largely rural developing countries have devised rural health centres. These centres, with their related dispensaries, are intended to provide comprehensive health services for the community. The staff may be headed by an assistant medical officer and a medical assistant. The assistant medical officer, who typically has at least four years of experience, followed by further training, serves to bridge the gap between medical assistant and physician. The medical assistant usually has at least three years of general medical education. The work of the rural health centres and dispensaries mainly involves diagnosis and treatment, maternal and child health, and environmental health. The main categories of primary health workers also include medical aides, maternal and child health aides, and health auxiliaries. Nurses and midwives may form another category of worker in the rural centres. In small, remote villages there may be village health posts staffed by local medical helpers working under supervision from the rural health centre.
Among some traditional segments of the societies of developing countries, and of some developed countries, there exists the belief that illness comes from the displeasure of ancestral gods and evil spirits, from the malign influence of evilly disposed persons, or from natural phenomena that can be neither forecast nor controlled. To deal with such causes there are many varieties of indigenous healers who practice elaborate rituals on behalf of both the physically ill and the mentally afflicted. Although such methods may sometimes be harmful, they may often be effective, especially where the cause is psychosomatic. Other patients, however, may suffer from a disease for which there is a cure in conventional, or Western, medicine.
In order to improve the coverage of primary health care services and to spread more widely some of the benefits of Western medicine, attempts have sometimes been made to find a means of cooperation, or even integration, between traditional and modern medicine. In Africa, for example, some such attempts are officially sponsored by ministries of health, state governments, universities, and the like, and they have the approval of WHO, which often takes the lead in this activity. In view, however, of the historical relationships between these two systems of medicine, their different basic concepts, and the fact that their methods cannot readily be combined, successful merging has been limited.