Chapter 8
Exposure Science, Industrial Hygiene, and Exposure Assessment

Michael G. Yost and P. Barry Ryan

Dr. Yost and Dr. Ryan report no conflicts of interest related to the authorship of this chapter.

This chapter introduces concepts and activities that are at the core of environmental health: recognizing, measuring, and ultimately controlling human exposures to harmful agents. Our account begins with industrial hygiene, a technical field that historically evolved in industrial workplaces. It then moves beyond industrial hygiene to describe exposure science, a modern field focused on exposure assessment in both the workplace and the general environment.

Industrial hygiene and exposure science share a common task: quantifying exposures. This task is relevant both to public health practice and to research. In public health practice, quantifying exposures helps us to assess potential problems, direct preventive efforts and monitor their success, and check compliance with regulations. Quantifying exposures is also essential for research, because it allows investigators to measure the association between exposures and health outcomes. For example, knowing that carbon monoxide is an asphyxiant is only partially useful. Knowing how much carbon monoxide exposure is dangerous, and knowing how to measure the exposures where and when they occur, enables us to understand the biological effects more completely, identify acceptable levels and set standards accordingly, and monitor environments to be sure they are safe.

Industrial hygiene has moved beyond its traditional focus on measuring exposures to also controlling them. Typically, an industrial hygienist in a factory is called upon to, say, monitor air levels of a hazardous agent such as a chemical solvent. If the exposures are excessive in a particular part of the factory, the hygienist will implement controls: for example, by substituting a safer solvent, upgrading the ventilation system, or providing personal protective equipment for affected workers.

In contrast, exposure scientists usually focus only on measuring and quantifying exposures in the general population (often in a research setting); the results of the exposure assessment then become the inputs for risk assessment and decisions by public health policymakers. Responsibility for controlling the excessive exposures rests with these other professionals.

Anticipation, Recognition, Evaluation, and Control

Industrial hygiene (referred to as occupational hygiene outside the United States) has been defined as “the discipline of anticipating, recognizing, evaluating and controlling health hazards in the working environment with the objective of protecting workers' health and well-being, and safeguarding the community at large” (International Occupational Hygiene Association, 2009). Industrial hygienists are professionals trained to manage workplace risks, in collaboration with allied professionals such as occupational physicians and nurses who treat work-related illness. Industrial hygiene has been practiced in the United States for about one hundred years. Industrial hygienists work to predict and then recognize workplace hazards, quantify the exposures, and implement appropriate control strategies.

Koren and Bisesi (2002, pp. 563–565) have developed concise definitions of each part of this paradigm. They define anticipation of hazards as “proactive estimation of health and safety concerns that are commonly, or at least potentially, associated with a given occupational or environmental setting.” Recognition of occupational hazards is the “identification of potential and actual hazards in a workplace through direct inspection,” a definition that emphasizes that empirical observation is at the heart of industrial hygiene. Evaluation includes measuring exposures through “visual or instrumental monitoring of a site.” Finally, control is the “reduction of risk to health and safety through administrative or engineering measures.” Industrial hygienists spend much of their time in real workplaces, observing, measuring, and problem solving to improve worker health and safety.

Anticipation

Anticipation is the first step before conducting a field assessment. The hygienist typically obtains information such as the site history, a diagram of the manufacturing processes, worker job titles, and material safety data sheets for the chemicals in use. The hygienist uses this information and practical knowledge to develop a preliminary list of potential workplace hazards, including safety hazards that pose a risk of injury and health hazards that pose a risk of disease. This anticipation process may also uncover environmental hazards that can impact nearby communities, rivers, woodlands, or other sensitive environments.

Examples of safety hazards include insufficient emergency egress; moving machinery or vehicles, such as forklifts; slippery, elevated, or uneven surfaces contributing to trips and falls; and inappropriate chemical storage posing a risk of fires or explosions. Although these concerns are the domain of a related profession, safety engineering, many industrial hygienists handle safety concerns as part of their job, especially at smaller facilities.

Examples of health hazards in the workplace include physical hazards, or agents, such as high noise levels, elevated temperatures and humidity, and radiation. What physical agents have in common is the delivery of high levels of energy or force to parts of the body. Physical hazards also include repetitive motion such as occurs in typing or hand tool use, which can increase the risk of musculoskeletal injuries such as back pain or carpal tunnel syndrome. Exposure to chemical hazards, or agents, can occur in many workplace processes and may present an acute or chronic hazard. Acute, high-level exposures to certain highly toxic chemicals, such as chlorine gas, may result in both acute and chronic health effects, disability, and even death. Such events must be clearly anticipated and controlled. More commonly, long-term exposures can lead to chronic effects, such as neurological damage from solvent exposures. For example, long-term exposure to benzene increases the risk of bone marrow dysfunction and aplastic anemia, a blood disease; inhaling asbestos fibers can lead to lung disease and cancers; inhaling crystalline silica contributes to silicosis in foundry workers; and radon increases the risk of lung cancer in uranium miners.

Some workplaces can also present biological hazards. For example, in health care settings workers may be exposed to blood-borne pathogens, such as hepatitis B and C viruses or HIV, that can be contracted due to mishandling of needles or other instruments that come into contact with body fluids. In these workplaces strict disinfection protocols, personal protective equipment such as gloves and masks, and secure disposal of biohazard wastes are used to prevent the spread of infections.

Industrial hygienists often are called upon to anticipate environmental hazards as well as workplace hazards. Environmental hazards may endanger public safety (as when a chlorine tank ruptures and neighbors are exposed to toxic gas), health (as when organic solvents from improper storage or disposal contaminate groundwater or drinking water), or welfare (as when smokestack emissions damage nearby trees or homes). Environmental effects also include ecological damage (such as killing fish by reducing the oxygen-carrying ability of lakes or streams) and economic damage (such as contaminating nearby land with heavy metals, industrial solvents, or pesticides that diminish the land's value for residential or recreational purposes). The industrial hygienist should anticipate such possibilities and review the site history. For example, a review of records or employee interviews may suggest that solvents have seeped into the ground and migrated off-site, contaminating groundwater. A hygienist who suspects such widespread contamination may consult with an environmental specialist to assess nearby groundwater.

Text Box 8.1 presents an example of an industrial hygiene evaluation, emphasizing hazard anticipation. The example shows that even with minimal information, the hygienist can anticipate hazards prior to visiting a facility. This assessment strategy depends on examining a range of information before visiting the site: the industrial process description, the job titles of workers, the chemicals in use at the facility (often found on the material safety data sheets required by law), and the history of the site. Using this information the hygienist develops a list of potential health and safety hazards, perhaps in checklist form, to enhance observations during the walk-through visit.

Recognition

The initial recognition phase is usually accomplished during a site visit or walk-through, where the hygienist conducts a visual inspection of the facility to gather both qualitative and quantitative information about hazards. The hygienist inspects processes and procedures at the facility, observes workers in various job categories, and reviews any health and safety programs in place at the plant. During a walk-through the hygienist looks for a wide range of issues: physical, chemical, and biological exposures; ergonomic, mechanical, and psychological factors; safety hazards such as exposed machinery or slippery surfaces; high noise levels; and the presence of chemicals. A similar on-site review may look for environmental hazards, with an emphasis on off-site emissions.

Another important purpose of the walk-through is recognition of special subpopulations in the facility that are at elevated risk. Some workers may perform lifting or repetitive movements in their jobs, exposing them to ergonomic hazards. Work in a high-temperature area may subject other workers to heat stress. During a walk-through the hygienist notes these subpopulations and may evaluate hazards differently for different groups. The hygienist concludes the recognition phase with a detailed picture of the manufacturing processes, a list of the associated hazards, and a written hazard evaluation plan. From this plan a detailed protocol is developed for the next phase, the evaluation of the hazards.

Evaluation

After the walk-through the hygienist will have a list of potential hazards but no quantitative information about worker exposures. The presence of a hazardous agent does not, by itself, establish how much exposure workers actually receive. For example, if a metalworking facility uses toxic degreasing solvents, the risk of exposure may be minimal with proper storage, personal protective gear, and appropriate ventilation. The evaluation phase actually begins during the walk-through, and there is a smooth transition from the recognition of hazards to their evaluation.

Evaluation focuses on quantifying the degree of exposure. Exposures can be assessed in several ways: area sampling collects measurements in a room in the vicinity of some workers; personal sampling collects contaminants in the breathing zone of individuals, using small portable samplers; biological sampling collects body fluids or breath samples to measure contaminants or specific metabolites.

Population Sampling for Exposure Assessment

The starting point when conducting an exposure study is the identification of the study population to be sampled. This involves enumerating all the people who could be sampled. In a workplace the focus is usually on certain workers with specific job titles—those identified during the anticipation phase as having a potential for high exposures. Other selection criteria may also apply, based on concerns raised by workers or their union, consultants, or regulatory authorities.

The next choice is the sampling strategy to be applied to the population. In small workplaces (e.g., a dozen or fewer employees) a simple census of all individuals may be taken. This ensures that all exposures are monitored. In larger facilities this approach can be too costly, so a statistically representative sample is needed. A common approach here is stratified sampling.

In stratified sampling the population is divided into subgroups (strata), and each individual monitored represents a known number of individuals in the subgroup. For example, if a trucking company has 5,000 drivers serving a state, it may be impractical to monitor all of them for exposure to diesel exhaust during trips. The hygienist may subdivide the drivers into groups according to the type of route (say, long-haul or delivery) and destination city, creating, say, fifty groups of roughly 100 individuals each. The hygienist then selects some members from each group (usually at random) to create a statistically representative sample of the full 5,000. Although stratified sampling is subject to error because not all the exposed people are monitored, it can be an efficient way to characterize an entire population when differences between groups have an effect on exposures. Techniques are available to estimate the size of the error. Statistical colleagues can help the hygienist to determine the number of samples needed from each group to characterize exposures for the entire population.
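
To make the strategy concrete, the following is a minimal sketch in Python of how such a stratified sample might be drawn; the roster, strata, and sample size per stratum are all invented for illustration, and in practice a statistician would choose the per-stratum sample size from variance and precision targets.

```python
import random

# Hypothetical roster: 5,000 drivers, each tagged with a route type and destination city.
drivers = [
    {"id": i,
     "route": random.choice(["long-haul", "delivery"]),
     "city": random.choice([f"city_{c}" for c in range(25)])}
    for i in range(5000)
]

# Form strata by (route type, destination city), as in the trucking example above.
strata = {}
for d in drivers:
    strata.setdefault((d["route"], d["city"]), []).append(d)

# Randomly select a fixed number of drivers from each stratum for diesel-exhaust monitoring.
n_per_stratum = 3  # illustrative; a statistician would set this from error estimates
monitored = []
for members in strata.values():
    monitored.extend(random.sample(members, min(n_per_stratum, len(members))))

print(f"{len(strata)} strata; {len(monitored)} drivers selected for monitoring")
```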

A third strategy is the so-called convenience sample. Often such a strategy consists of monitoring volunteers or individuals with a particular complaint. Convenience sampling can be subject to bias; those who volunteer or have complaints are not likely to represent all members of the group. This sampling strategy should be avoided in favor of randomly selecting individuals. However, a related sampling strategy may have a role and is used in regulatory settings. The hygienist may choose worst-case sampling—selecting workers at highest risk of exposure or measuring at times when exposures are highest. The assumption in worst-case sampling is that if these workers' exposures are acceptable, then all the remaining workers are also unlikely to be overexposed.

Exposure Evaluation Instruments

Two general types of devices are used for measuring environmental exposures: direct reading devices and sample collection devices. Direct reading devices provide near real-time measurements of the exposure of interest, and sample collection devices store or trap samples for later analysis.

Direct reading instruments are available for measuring many physical hazards, such as temperature, noise, and radiation. These instruments typically have a digital readout and the ability to store data over a period of time for later downloading. Common examples are digital thermometer-hygrometers to measure temperature and humidity, noise monitors, and ultraviolet radiation monitors. Other direct reading instruments can measure various pollutants, including gases, vapors, and airborne particles. For example, fine particles can be measured with a device called an optical particle counter. These instruments are usually portable, battery operated, lightweight, and enclosed in a rugged case for field surveys.

Sample collection instruments are often used when multiple airborne pollutants are present or further laboratory analysis is needed. The collection device draws in a known volume of air, including whatever contaminants are in it, and traps the contaminants on an absorbing medium. The absorbing medium is used to stabilize and store the contaminant so the sample can be taken later to a laboratory where the mass of the contaminant stored in the medium is determined. The air concentration of the contaminant in the sample can then be quantified in units of mass per volume, by dividing the mass of contaminant collected on the absorbing medium by the volume of air that was sampled. These air concentrations typically have units of micrograms per cubic meter (µg/m3). By increasing the volume of air sampled, more contaminant mass can be collected, thereby increasing the sensitivity for detecting low levels of contamination.

Sample collection instruments can be either active or passive. Active sampling devices draw air through the absorbing medium using an electric pump. The pump's airflow rate can be varied, and the total volume of air sampled is calculated by multiplying the flow rate by the duration of sampling. The sampling time period can sometimes be shortened by increasing the pump flow rate, thereby delivering the same volume of air in less time. This can be useful when exposures occur over short time periods or are highly variable. Conversely, the flow rate can be decreased if longer sample times are needed.
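
The arithmetic behind an active sample is simple enough to show as a short worked sketch; the flow rate, sampling duration, and laboratory result below are invented values used only to illustrate the volume and concentration calculations described above.

```python
# Active sampling: volume of air sampled = pump flow rate x sampling duration.
flow_rate_l_per_min = 2.0        # pump flow rate in liters per minute (illustrative)
duration_min = 480               # an 8-hour shift
volume_m3 = flow_rate_l_per_min * duration_min / 1000.0   # 1 m3 = 1,000 L

# Laboratory result: mass of contaminant trapped on the absorbing medium.
collected_mass_ug = 48.0         # micrograms (illustrative)

# Air concentration = mass collected / volume of air sampled.
concentration_ug_m3 = collected_mass_ug / volume_m3
print(f"Sampled {volume_m3:.2f} m3 of air; concentration = {concentration_ug_m3:.0f} ug/m3")
```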

Active sampling is highly versatile, sensitive, and specific for the contaminant of interest because sophisticated laboratory analysis methods (such as mass spectrometry) can be used to analyze the samples. However, running the pump requires a battery or electricity and the pump may be bulky or noisy and have limited run time. These drawbacks make such devices unsuitable for some kinds of personal sampling, although area sampling is more feasible. In Figure 8.1, both the sampling devices (for ozone and particulate matter) and the pump are located inside the box at the bottom of the apparatus. The vertical pipe with the metal cone on top collects the fine particle sizes that can be inhaled deeply into the lungs. Ozone is sampled off the same airstream.


Figure 8.1 An Air Pollution Monitoring Station for Ozone and Particulate Matter, in Atlanta

Passive sampling devices use diffusion, rather than a pump, to collect the air sample. This method requires an absorbing medium that removes the compound of interest from the air by reaction or absorption at the surface of the medium. The concentration gradient between the air to be sampled and the surface of the absorbing medium causes the contaminant of interest to diffuse from the air to the surface where it is trapped. The mass collected in the medium is then analyzed in a laboratory, as in active sampling analysis. The flow rate of the air delivered to the surface during sampling is computed using Fick's law of diffusion; the volume sampled is this sampling rate multiplied by the sampling time. The concentration is then calculated in the same way as for an active sample, by dividing the mass collected by the volume of air sampled.
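
A sketch of the passive-sampler calculation follows, assuming the common badge geometry in which Fick's law gives an effective sampling rate of D x A / L (diffusion coefficient times cross-sectional area divided by diffusion path length); all numerical values are illustrative, not taken from any particular device.

```python
# Effective sampling rate of a diffusion badge from Fick's first law: rate = D * A / L.
D_cm2_per_s = 0.10   # diffusion coefficient of the gas in air, cm^2/s (illustrative)
A_cm2 = 8.0          # cross-sectional area of the diffusion path, cm^2 (illustrative)
L_cm = 1.0           # diffusion path length, cm (illustrative)

uptake_cm3_per_s = D_cm2_per_s * A_cm2 / L_cm      # volume of air "sampled" per second
sampling_time_s = 8 * 3600                         # 8-hour sample
volume_m3 = uptake_cm3_per_s * sampling_time_s / 1e6   # 1 m3 = 1,000,000 cm3

mass_ug = 20.0       # mass found on the sorbent by the laboratory (illustrative)
concentration_ug_m3 = mass_ug / volume_m3          # same mass/volume step as active sampling
print(f"Effective volume {volume_m3:.3f} m3; concentration {concentration_ug_m3:.0f} ug/m3")
```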

Although passive devices do not require a pump, their diffusion method generates low sampling rates; often these flow rates are 1,000-fold slower than rates in active sampling devices. Thus the contaminant mass sampled in a given time is correspondingly lower. However, in occupational settings concentrations are often sufficiently high that passive devices can still achieve excellent results. Further, laboratory analysis has substantially improved, reducing the amount of material needed for accurate quantification. When available and of sufficient precision and accuracy, passive sampling devices can be the method of choice. Passive devices for particulate matter are not yet of sufficient precision and accuracy to merit their use in typical occupational settings.

Biomonitoring (also called biological monitoring) involves the collection of body fluids or tissue such as saliva, blood, hair, or urine. These are analyzed for either the contaminant or a metabolite of that contaminant. Biomonitoring is discussed later in this chapter.

Control

Control of workplace hazards is an important element of industrial hygiene practice that corresponds, in public health terms, to primary prevention (as discussed in Chapter 26). Several approaches are used to modify the workplace environment: substitution, isolation, ventilation, administrative changes, and personal protection. Substitution involves replacing a hazardous material or process with a less hazardous one. For example, benzene (a bone marrow toxin) might be replaced by less toxic toluene. Isolation involves containing or limiting human access to the hazardous materials, usually through engineering controls. For example, a metal casing may be used to enclose a solvent washer. For certain hazards, most notably chemical and heat-related hazards, ventilation provides a viable control strategy. For example, the introduction of fresh air or use of a local exhaust hood may significantly reduce exposure to these hazards. Administrative controls consist of policies and procedures that reduce risks. For example, maintenance workers place a lock and signed tag on machinery controls to prevent unintended operation during repairs, a standard injury prevention strategy called lockout/tagout. Rotating workers to limit the time spent by any individual in a high-exposure location may have a role as well (an approach used, for example, with radiation workers).

Protective devices are often used to control safety hazards. For example, a cutting machine may be designed so that the worker needs to push two buttons, one with each hand, to initiate a cut; this guarantees that the worker's hands cannot be in the cutting zone during operation.

Personal protective equipment (PPE), such as respirators, gloves, safety glasses, hardhats, safety harnesses, and steel-toed boots may be recommended, although this approach is less preferable than the environmental changes described previously. Figure 8.2 shows an example of personal protective equipment in use. Working at a degreasing tank, a worker may inhale vapors or absorb solvent that splashes on bare skin. This worker wears personal protective equipment consisting of gloves and a face shield to protect the hands and face from splashed solvent and a respirator to prevent vapors from being inhaled.


Figure 8.2 Personal Protective Equipment

Source: Courtesy of Phillip L. Williams, University of Georgia College of Public Health.

This worker is positioned over a solvent bath. Note the sampling apparatus on the worker's belt, and the hose running from the breathing zone.

Exposure Science

Exposure science is broadly defined as the study of human contact with chemical, physical, or biological agents occurring in the environment. It focuses on the mechanisms and dynamics of events either causing or preventing adverse health outcomes (National Research Council, 2012). Exposure scientists quantify exposures in both occupational settings (as do industrial hygienists) and in community settings where people may encounter hazards as they go about their daily activities. Exposure science often focuses on evaluating exposure determinants, which are the factors and conditions that influence these exposures. Exposure assessment, one aspect of exposure science, aims to quantify exposures in both occupational and environmental settings. These assessments focus on key concepts such as concentration, exposure, and dose (as discussed in Text Box 8.2).

Frequency, Intensity, and Duration of Exposure

An important aspect of exposure is its time course, sometimes referred to as the exposure profile, which can be graphed as the concentration present in a person's breathing zone (or, less typically, in other media such as in drinking water or during dermal exposure) over a period of time. The term total exposure is sometimes used to describe the area under this exposure-time curve. Different exposure profiles can yield similar total exposures. For example, one worker may weld for 15 minutes in an enclosed space at a metal fume concentration of 40 mg/m3, receiving a total exposure of (40 mg/m3)(0.25 hr) = 10 mg/m3 × hr. After finishing his task, he experiences no further exposure to welding fumes. A coworker, working in the same area but not exposed directly to the fumes, remains for the entire 8-hour shift. Over the course of the day, the coworker experiences a concentration of 1.25 mg/m3. The coworker receives an identical total exposure [(1.25 mg/m3)(8 hr) = 10 mg/m3 × hr], but the exposure profile is different. In some circumstances, different exposure profiles may have different health effects, even with equivalent total exposures.
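
Because total exposure is the area under the concentration-time curve, a piecewise-constant profile reduces to a sum of concentration x duration terms. A minimal sketch using the two profiles just described:

```python
# Each profile is a list of (concentration in mg/m3, duration in hours) segments.
welder   = [(40.0, 0.25), (0.0, 7.75)]   # 15 minutes at 40 mg/m3, then no further exposure
coworker = [(1.25, 8.0)]                 # 1.25 mg/m3 for the entire 8-hour shift

def total_exposure(profile):
    """Area under a piecewise-constant concentration-time curve, in (mg/m3) x hr."""
    return sum(concentration * hours for concentration, hours in profile)

print(total_exposure(welder))    # 10.0
print(total_exposure(coworker))  # 10.0 -- same total exposure, different exposure profile
```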

The shape of the exposure profile matters because some contaminants are easily cleared at low exposure levels but toxic at higher levels. In this case, the dose rate may affect the health outcome. Exposure assessors focus on the intensity of exposure, frequency of exposure, and duration of exposure, asking questions such as: What is the peak concentration in the monitoring period? How much variability occurs from minute to minute or hour to hour? Do exposures recur regularly or episodically? Is the duration of exposure short followed by no exposure, or does exposure occur at moderate levels for a long period? Such information can prove invaluable in addressing potential effects and control strategies.

Three basic scenarios, defined by the U.S. Environmental Protection Agency (U.S. EPA), are used by scientists to distinguish exposures across different time periods (U.S. Environmental Protection Agency, Office of Research and Development, National Center for Environmental Assessment, 2015). Acute exposure occurs by ingestion, skin absorption, or breathing for twenty-four hours or less. Chronic exposure consists of repeated episodes that occur by the same routes for more than approximately 10% of the life span in humans. Subchronic exposure is repeated exposure by one or more of these routes for more than thirty days, up to approximately 10% of the life span in humans. When acute exposures occur at high levels, poisoning or other immediate responses may follow. Chronic exposures at lower levels may be linked to health outcomes such as cancer, chronic lung damage, or similar effects. Subchronic exposures are between these two and also may be episodic or recurring.

Routes and Pathways of Exposure

There are three principal routes of exposure for people: inhalation exposure, ingestion exposure, and dermal exposure. These routes of exposure are different from the exposure pathway, or the path by which the contaminant moves from a source to a human receptor. For example, pesticide exposures in children may come from several pathways. Children may ingest pesticides from residues present on food (a dietary pathway); they may get pesticides on their skin from their parents' contaminated clothing if the parents work on a farm (a take-home pathway); if spraying takes place close to their home, they may inhale pesticide particles or vapors (a drift pathway). These pathways differ substantially and each requires entirely different assessment and control strategies to reduce exposure.

Exposure Assessment Methods

Ideally, an exposure assessment method quantifies the mass of contaminant reaching the target organ in each exposed person. Of course this is generally not feasible, but four broad assessment method categories approximate this ideal to increasing degrees: imputing or modeling exposures, measuring environmental exposures, measuring personal exposures, and measuring biomarkers. In general these methods become increasingly expensive, and increasingly accurate, as one moves up this continuum.

Imputing or Modeling Exposures

Exposure scientists use indirect exposure assessment methods to impute exposures when they lack direct measurements or only have partial data. Indirect approaches are usually substantially simpler and less costly than direct measurements. Additionally, for retrospective studies, in which it is impossible to take measurements, indirect approaches are the only methods available.

Air pollution studies provide one example. In a study of inhalation exposure to air pollution, researchers might use a time-location study to identify various microenvironments (home, work, travel, etc.) in which people spend a significant portion of their day. The researchers then measure pollutant concentrations in representative microenvironments, and have subjects record or estimate the amount of time spent in the microenvironments. From these data, the scientists could multiply the concentrations by the amount of time spent in each microenvironment and sum the results for an estimate of each person's exposure. A similar approach can be used, for example, for ingestion. Concentrations of contaminants can be measured in many different foods; people can record the types and amounts of foods they eat, using a food diary; and dietary exposures can be estimated by summing over all the foods eaten.
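
A minimal sketch of this time-weighted summation, with invented microenvironment concentrations and time-diary entries (neither comes from an actual study):

```python
# Pollutant concentrations measured in representative microenvironments, ug/m3 (illustrative).
concentrations = {"home": 12.0, "work": 35.0, "travel": 60.0, "outdoors": 20.0}

# Hours per day one subject reported in each microenvironment (from a time-location diary).
time_diary = {"home": 14.0, "work": 8.0, "travel": 1.5, "outdoors": 0.5}

# Estimated daily exposure: sum over microenvironments of concentration x time.
exposure = sum(concentrations[m] * hours for m, hours in time_diary.items())

# Dividing by 24 hours gives the corresponding time-weighted average concentration.
twa = exposure / 24.0
print(f"Daily exposure {exposure:.0f} (ug/m3) x hr; time-weighted average {twa:.1f} ug/m3")
```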

A related strategy that uses exposure scenarios constructs estimates without direct measurement by assuming activity patterns for typical individuals (children, adult men, adult women, etc.). Available monitoring data for each activity and location can then be combined to model estimates of individual exposures. This approach is inexpensive because no individuals are measured or activities recorded. Exposure scenarios are used extensively in risk assessment.

A special case of indirect exposure assessment is the job-exposure matrix (JEM) (see Chapter 4). Suppose an exposure assessment is needed for a retrospective study of silica exposures in a worker cohort. Consulting old employment records, the exposure scientist identifies ten job categories, each with characteristic tasks, and fifteen work zones, each with silica concentrations derived from historical industrial hygiene monitoring or by using estimates from a panel of experts. The exposure scientist then constructs a JEM, assigning an exposure level to each worker based on his or her job assignment and work zone location. If the workplace changes over time, as is typical, then a JEM for each time period is created to account for the differences. A JEM is often the only way to assess exposures in retrospective studies. However, JEM creation can be time consuming, and accurate records may not be available to complete the assessment.
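
The following schematic shows how a JEM assignment might be coded; the work zones, time periods, silica levels, and work history are all hypothetical and serve only to illustrate the lookup-and-accumulate logic.

```python
# Hypothetical JEM: assigned silica exposure (mg/m3) by (work zone, time period).
jem = {
    ("foundry_floor", "1970-1984"): 0.30,
    ("foundry_floor", "1985-1999"): 0.10,
    ("finishing_room", "1970-1984"): 0.08,
    ("finishing_room", "1985-1999"): 0.03,
}

# One worker's history reconstructed from employment records: (zone, period, years worked).
work_history = [
    ("foundry_floor", "1970-1984", 10),
    ("finishing_room", "1985-1999", 12),
]

# Cumulative exposure estimate: sum of assigned exposure level x years in each assignment.
cumulative = sum(jem[(zone, period)] * years for zone, period, years in work_history)
print(f"Estimated cumulative silica exposure: {cumulative:.2f} (mg/m3)-years")
```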

Measuring Environmental Exposures

Direct exposure measurements are typically done at fixed locations (area sampling) or on individual subjects (personal sampling). Fixed-site sampling has been used for many years to monitor air pollution in most major cities. Daily and hourly measurements of O3 (ozone), NOx (nitrogen oxides), SOx (sulfur oxides), and PM (particulate matter) track compliance with regulations. These measurements also provide information used to warn the public of dangerous peaks and to support health research.

Measuring Personal Exposures

Personal exposure monitoring generally involves placing a small, portable sampling device on a person to collect a sample within his or her breathing zone during daily activities. Personal monitoring started in the workplace in the early 1960s and has become routine practice in industrial hygiene assessments. Personal sampling is considered a reference method for assessment because it accounts for time, location, and the person's behavior; all of these can have a profound effect on exposures.

The breathing zone air sample can be analyzed for the contaminant of interest, either in real time or with a time-integrated sample collection device. With direct reading, real-time methods, a person's exposure profile can be observed and synchronized with a video recording. The video is overlaid with a bar graph or other indicator from the real-time monitor, demonstrating how specific activities contribute to exposures. This technique, known as video exposure monitoring (VEM), is a powerful training tool and aid for developing control strategies. Similarly, real-time monitors can be wirelessly paired with smartphones or GPS devices to evaluate time-activity contributions to exposure.

While the focus here is on inhalation exposures, generalization to other personal exposures is possible. For example, duplicate diet sampling, in which identical meals are collected from each individual in a study, is a method of quantifying ingestion exposures from food.

Aggregate and Cumulative Exposure Assessment

The Food Quality Protection Act of 1996 expanded the single contaminant approach to exposure assessment and introduced new concepts of aggregate exposure and cumulative exposure (also see Chapter 27). Aggregate exposure refers to assessing all routes and exposure pathways simultaneously for a single compound. An agricultural worker may have exposure to a single pesticide through both inhalation and skin absorption while picking crops. In addition, he may ingest food containing residues of the same pesticide. The clothing he wore at work and brought home may also carry residual contamination. Counting only one route and pathway, such as inhalation during spraying, underestimates his total exposure, perhaps substantially. Assessing aggregate exposure over all routes and pathways is necessary in order to quantify the hazard accurately.

The concept of cumulative exposure extends this approach to multiple compounds that have similar biological mechanisms. Cumulative exposure is defined as aggregate exposure to a series of compounds (or nonchemical exposures) that affect health through similar mechanisms. A common example of cumulative exposure focuses on organophosphate (OP) pesticides such as chlorpyrifos, malathion, and diazinon (see Tox Box 18.1, in Chapter 18). These compounds share a common biological mechanism of toxicity: inhibition of the enzyme acetylcholinesterase, which is necessary for normal transmission of nerve signals. OP pesticides interfere with this process, resulting in continued firing of the neuron. A cumulative exposure assessment for all OP pesticides is needed to understand the impact of exposure not just to a single OP but rather to all pesticides operating through acetylcholinesterase inhibition. This requires measurement either of all the compounds simultaneously or of some biological effect, such as acetylcholinesterase inhibition in exposed people, that integrates over all exposures. Naturally such assessments are complex, especially when extended to nonchemical exposures, such as stress or malnutrition, that may compound the effects of chemical exposures.
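
One way such measurements of multiple compounds are sometimes combined, though not described in detail in this chapter, is to express each compound in terms of a single index chemical using relative potency weights. The sketch below is illustrative only; the exposure values and potency factors are invented, and an integrating biological measure such as acetylcholinesterase inhibition is an alternative to this calculation.

```python
# Hypothetical daily exposures to several OP pesticides (ug per kg body weight per day).
op_exposures = {"chlorpyrifos": 0.20, "malathion": 1.50, "diazinon": 0.40}

# Illustrative relative potency weights expressing each compound relative to an index chemical.
# (Invented for this sketch; real assessments derive such weights from toxicological data.)
relative_potency = {"chlorpyrifos": 1.0, "malathion": 0.05, "diazinon": 0.3}

# Cumulative exposure in index-chemical equivalents: sum of exposure x relative potency.
cumulative = sum(op_exposures[c] * relative_potency[c] for c in op_exposures)
print(f"Cumulative OP exposure: {cumulative:.3f} ug/kg-day (index-chemical equivalents)")
```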

Measuring Biomarkers

So far we have focused on sampling environmental media from places where people are likely to contact a contaminant. However, it is possible to measure contaminant levels in humans themselves, and thereby verify that exposures occurred. Biological markers (sometimes referred to as biomarkers) of exposure are used for this purpose. Biomarkers are collected by sampling body fluids or tissues such as exhaled breath, urine, blood, feces, or hair. These biosamples are analyzed for the contaminant of interest (called the parent compound), a related compound (usually a metabolite), or a biological response known to reflect exposure. For example, cotinine (a metabolite of nicotine) can be measured in urine to quantify exposure to tobacco smoke, and carboxyhemoglobin levels in blood are measured to quantify exposure to carbon monoxide. Pesticides provide another example. Blood samples can be analyzed for OP parent compounds, urine samples can be analyzed for OP metabolites such as dialkyl phosphates, or acetylcholinesterase enzyme activity in serum can provide a measure of the biological effect.

Biomarkers of exposure have important advantages and disadvantages. Detection of an exposure biomarker proves that absorption of the compound has occurred. Other environmental measurements cannot confirm this conclusion. Furthermore, biomarkers account for bioavailability, which describes the ability of a compound to pass across the contact boundary into the body through, for example, ingestion. Biomarkers also integrate over all routes of exposure and therefore are useful for aggregate assessments. For these reasons, biomonitoring has been called the “gold standard” for exposure assessment (Sexton, Needham, & Pirkle, 2004).

Biomonitoring has evolved rapidly to consider biological indicators not only of exposure but also of biological response to the exposures. Many of these approaches are called omics because they draw on a suite of molecular biology techniques with that suffix: genomics, proteomics, and metabolomics (as explored in Chapter 7). These techniques employ large-scale array technology to screen for hundreds or thousands of genes, proteins, and metabolites that are associated with particular exposures or in some cases correlate with disease risk.

Patterns or features associated with clusters of these proteins and biological molecules can be derived with bioinformatics and multivariate statistical analysis to provide insights into how an organism dynamically responds to environmental exposures. These responses may persist over various time scales: days, months, or years. Together, these biomolecular features, reflecting gene expression, cellular functions, and metabolism, become part of a person's exposome (the totality of exposure events that affect the person) (Wild, 2005). For example, exposures to OP pesticides can cause a nonspecific, short-term depression in enzyme activity that may last a few days; however, this exposure can also give rise to pesticide-protein adducts that persist in the blood for more than a month and that can identify the specific pesticides (Marsillach, Costa, & Furlong, 2013). Some exposures may also induce epigenetic changes that persist for years or become permanent (Kyrtopoulos, 2013). For example, smoking and exposure to tobacco smoke both produce some epigenetic changes that revert to normal over time and others that can persist for decades (Bossé et al., 2012).

However, interpreting measurements of biomarkers in relation to the timing of the actual exposure events can be complex. Some compounds produce long-lived biomarkers that reflect months or years of exposure; other compounds produce short-lived biomarkers that may correspond only to exposures that occurred a few minutes earlier. In addition, there are usually multiple metabolic pathways that influence biomarkers, and these may behave similarly for several related chemicals, or become saturated if multiple exposures use the same pathway. This may make it difficult to know which specific chemical caused the exposure. To understand and apply biomarkers properly, the exposure scientist should be well versed in the pharmacokinetics of the compound: that is, how it is processed in the body (see Chapter 6). Collaboration with a toxicologist can be helpful for understanding such problems.

The ability of biomarkers to integrate exposure over all routes and pathways, a major strength, also can be a major shortcoming. For example, once a molecule such as a pesticide enters the body and is metabolized, the source is no longer identifiable. The exposure may have come from inhalation, dermal contact, or ingestion of residues in the food supply. A related problem occurs when individuals have different abilities to metabolize a contaminant, due to a genetic polymorphism (see Chapter 7). This can mean that identical exposures can generate different observed metabolite levels in individuals and also convey different risks of health effects. Failure to recognize these genetic differences may lead to misclassification of exposures and of the risk of harm to the individual.

Given the valuable insights from such assessments, the use of biomarkers of exposure and effect is likely to increase in the future. New, more accurate biomarkers of exposure are being developed and are appearing in the literature. Current research suggests that panels of biomarkers, measuring multiple markers at once, may be able to overcome some of the shortcomings listed here while giving new and powerful insights into mechanisms of toxicity and control strategies for exposure (Ryan et al., 2007; Cohen Hubal et al., 2010; Cochran & Driver, 2012).

Ingestion and Skin Absorption: Challenges for Exposure Assessment

Ingestion and skin absorption are important routes of exposure in many circumstances. These two routes also pose special challenges for exposure assessment. A duplicate diet study is a direct approach to assessing ingestion exposure. Duplicate portions of the food eaten by test subjects are collected and analyzed for contaminant levels. Typically all of the food eaten is weighed and homogenized to create a single bulk sample. An aliquot of the sample is analyzed for contaminant concentration (mass/mass). The exposure is computed by multiplying this concentration by the amount eaten.
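
In code the duplicate diet estimate is a single multiplication; the concentration and food mass below are purely illustrative.

```python
# Duplicate diet: analyze an aliquot of the homogenized food and scale to the mass eaten.
aliquot_concentration_ug_per_kg = 4.5   # contaminant concentration in the bulk sample (illustrative)
food_mass_kg = 1.8                      # total mass of the duplicate portions collected that day

dietary_exposure_ug = aliquot_concentration_ug_per_kg * food_mass_kg
print(f"Estimated dietary intake: {dietary_exposure_ug:.1f} ug for the day")
```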

Dietary diaries offer another, indirect approach for assessing ingestion exposures. Each subject keeps a food diary listing foods eaten and portion sizes. The researcher purchases these foods at local grocery stores for later analysis. A data set is compiled listing each type of food and the contaminant concentrations. The food diary data from each participant can be combined with the concentration data to estimate the amount of contaminant ingested.
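
A sketch of the diary-based estimate, combining a hypothetical per-food concentration table with one subject's reported portions (all values invented):

```python
# Contaminant concentrations in store-bought versions of the diary foods, ug/kg (illustrative).
food_concentrations = {"apple": 2.0, "bread": 0.5, "chicken": 1.2, "rice": 0.8}

# One day's food diary for a subject: (food item, portion size in kilograms).
diary = [("apple", 0.15), ("bread", 0.10), ("chicken", 0.20), ("rice", 0.25)]

# Estimated intake: sum over diary entries of concentration x portion mass.
intake_ug = sum(food_concentrations[item] * kg for item, kg in diary)
print(f"Estimated dietary intake: {intake_ug:.2f} ug for the day")
```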

Food diaries are much easier to administer than duplicate diet studies and so can be implemented on a large scale. Fewer food samples have to be analyzed because once all the individual food items have been assessed, no further analysis is needed. However, because the foods actually consumed are never analyzed directly, their contaminant levels may differ from those of the purchased samples. This variability in concentrations among food items introduces error into the exposure estimates.

Dermal exposures can present unique challenges. Patch sampling is one direct assessment technique; an adsorbent material patch is placed on the skin or outer garments. The subject then carries out normal activities, creating exposure to the contaminant. The patches intercept the contaminant of interest before it reaches the skin. Following exposure the patches are removed and analyzed for contaminant mass. Knowing the area of the patch relative to the total exposed skin surface area, one can estimate overall skin exposure to the arms, legs, torso, and so forth. The total exposure is estimated by summing over all exposed areas of the body. Tape stripping is another technique. A special adhesive tape is applied to a known area of the skin, and then stripped away to remove a single layer of exposed skin cells. The tape samples are analyzed like the patch samples described previously, and used to compute total exposure over the skin area. Tape stripping has the advantage that it measures what gets on the skin; repeated stripping in the same area can measure the depth of contaminant penetration.
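
A sketch of the patch-to-body extrapolation described above, using hypothetical patch loadings and body-region surface areas:

```python
# Patch results: contaminant mass on each patch (ug) and the common patch area (cm^2).
patch_area_cm2 = 100.0
patch_mass_ug = {"forearm": 12.0, "upper_arm": 8.0, "thigh": 20.0, "chest": 15.0}  # illustrative

# Approximate exposed skin area of the body region each patch represents, cm^2 (illustrative).
region_area_cm2 = {"forearm": 1200.0, "upper_arm": 1400.0, "thigh": 3500.0, "chest": 3400.0}

# Extrapolate: patch loading per cm^2 x area of the region it represents, summed over regions.
total_dermal_ug = sum(
    (patch_mass_ug[region] / patch_area_cm2) * region_area_cm2[region]
    for region in patch_mass_ug
)
print(f"Estimated total dermal exposure: {total_dermal_ug:.0f} ug")
```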

A limitation of both the patch and tape stripping methods is that they sample only a small portion of the overall exposed skin area. This makes it possible, indeed likely, that some exposed areas will be missed. This can lead to underestimation of exposure. The fluorescent tracer method used in pesticide sampling is an alternative that samples all skin areas. A nontoxic fluorescent tracer is added to the pesticide spray mix, and later the worker is video-imaged under ultraviolet light, revealing areas where the tracer deposited on the skin. The tracer technique can be very useful for both field studies and training simulations; it also has been used to assess potential biological contamination and hand-washing effectiveness.

Summary

The paradigm of industrial hygiene, based on the anticipation, recognition, evaluation, and control of workplace hazards, provides a framework for more general exposure assessment. Many tools of traditional industrial hygiene are transferable to exposure science, which spans both occupational and community settings. But exposure science requires some new tools as well. Sampling strategies, compliance with monitoring protocols, and field implementation are often more difficult in community exposure assessment studies. Community studies also call for statistical sampling techniques much like those used in epidemiological studies. Exposure science is a rapidly growing area, ripe for contributions from professionals in many areas of environmental health.

Key Terms

absorbed dose
The amount of a substance penetrating across the absorption barriers (the exchange boundaries) of an organism, via either physical or biological processes.
absorption factor
The ratio of the mass of the material crossing the absorption barrier to the mass of the material applied to the barrier.
active sampling
Using a mechanical pump, fan, syringe, or other device to draw an environmental sample into a collection medium or vessel for the purpose of capturing an agent.
acute exposure
A contact event between an agent and a target occurring over a short time, generally less than a day. (Other terms, such as short-term exposure and single dose, are also used.)
administrative controls
Methods used to modify or control exposures based on changing work practices or procedures, such as hours worked, location of work, and rest-work rotation schedules.
aggregate exposure
The simultaneous assessment of all routes and exposure pathways into an organism for a single compound (cf. cumulative exposure).
anticipation
The ability to expect the presence of a hazard based on common work practices and knowledge of similar exposure scenarios.
area sampling
Collecting environmental samples at fixed locations, rather than near moving individuals.
bioavailability
The ability or tendency, rate, and extent to which an agent can cross an exposure barrier, be absorbed by an organism, and be available for metabolism or interaction with biologically significant receptors. Bioavailability involves both release from a medium (if present) and absorption by an organism.
biological hazard
A biological substance (such as a bacterium or virus) capable of causing harm to an organism, or a material (such as medical waste) or process (such as cleaning health care facilities) that may entail exposure to such a substance.
biological sampling
Collecting biological specimens.
biomarkers
Indicators of changes or events in biological systems; also called biological markers.
biomarkers of exposure
Cellular, biochemical, analytical, or molecular measures obtained from biological media such as tissues, cells, or fluids, indicative of exposure to an agent of interest.
biomonitoring
Collecting biological specimens, such as blood, urine, breath, saliva, or other materials, to detect the presence and amount of exposure to a potentially harmful agent.
chemical hazard
A chemical agent capable of causing harm to an organism.
chronic exposure
A continuous or intermittent long-term contact between an agent and a target. (Other terms, such as long-term exposure, are also used.)
concentration
The amount of contaminant present in an environmental medium, generally expressed in mass per volume (µg/m3) or as fractional dilution ratio (ppmv or ppbv).
control
The ability to modify or limit exposures to harmful agents through intentional modification of exposure pathways.
cumulative exposure
The simultaneous assessment of aggregate exposures to multiple compounds (or nonchemical exposures) that affect organism health through similar mechanisms (cf. cumulative impacts, in Chapter 11).
dermal exposure
Exposure to a contaminant through contact with the skin.
direct reading instruments
Exposure monitoring devices capable of measuring an agent in a sampled medium and providing a direct indication of the agent concentration to the operator through a display or other output.
dose
The amount of agent that enters a target after crossing an exposure surface.
dose rate
Dose per unit of time.
duplicate diet study
A method of sampling ingestion exposures that collects meals identical to those eaten by subjects over a period of time in order to measure the presence of an agent in their food.
duration of exposure
The length of time over which continuous or intermittent contacts occur between an agent and a target. For example, if an individual is in contact with an agent for 10 minutes per day for 300 days over a 1-year time period, the exposure duration is 1 year.
environmental hazard
An agent present in the environment capable of causing harm to an organism.
evaluation
The collection of data in the form of records, interviews, photographs, samples, or other empirical indicators of exposure, and the synthesis of this data into a consistent representation of the health hazard presented by an agent.
exposome
The measure of all a person's exposures, beginning in utero and extending over a lifetime, and how those exposures relate to health.
exposure
Contact between an agent and a target. Contact takes place at an exposure surface over an exposure period.
exposure assessment
The process of estimating or measuring the magnitude, frequency, and duration of exposure to an agent, along with the number and characteristics of the population exposed. Ideally, it describes the sources, pathways, routes, and uncertainties in the assessment.
exposure pathway
The course an agent takes from its source to the target receptor.
exposure scenarios
Combinations of facts, assumptions, and inferences that define a discrete situation where potential exposures may occur. These scenarios may include the source, the exposed population, the time frame of exposure, microenvironment(s), and activities. They often are created to aid in estimating exposure.
exposure science
The application of scientific methods to study human contact with chemical, physical, or biological agents occurring in the environment and to determine the mechanisms and dynamics of events either causing or preventing adverse health outcomes.
frequency of exposure
The number of exposure events in an exposure duration.
health hazard
A potential adverse change in health status.
indirect exposure assessment
Assessment that relies on estimated values (or self-reported values) for the frequency, intensity, and duration of exposure events, rather than on directly observed or measured quantities.
industrial hygiene
The science and professional practice of anticipation, recognition, evaluation, and control of workplace and environmental hazards.
ingestion exposure
Exposure to an agent through eating or swallowing contaminated media.
inhalation exposure
Exposure to an agent through breathing in contaminated air or gases.
intensity of exposure
Generally refers to the magnitude or amount (how much) of contact between the agent and the barrier; can be expressed in quantitative terms (e.g., a concentration) or qualitative terms (high, medium, or low).
isolation
The containing of or limiting of access to hazardous materials (e.g., by placing a physical barrier such as a container between a hazardous agent and the worker).
job-exposure matrix (JEM)
A cross-classification of jobs and workplace exposure levels across different agents or time, which assigns typical exposures according to common job classifications and work practices. Used for imputing past workplace exposures (known as exposure reconstruction).
modeling exposures
Creating a physical or a conceptual mathematical representation of the exposure process, including events and outcomes.
passive sampling
Sampling an agent without using a mechanical pump, fan, syringe, or other device to draw an environmental sample into a collection medium or vessel. Typically passive sampling collects samples by diffusion or gravitational sedimentation.
peak concentration
The maximum concentration experienced during an exposure event.
personal protective equipment
Clothing, eyewear, respiratory protection devices, or any other items worn by a person and designed to prevent injury or harm from an agent present in the environment.
personal sampling
Collecting a sample with a portable sampler affixed in the immediate breathing zone of a mobile individual.
pharmacokinetics
A branch of science dedicated to determining the internal fate and distribution of substances administered to a living organism.
physical hazard
An agent that presents a hazard due to its ability to deposit excessive energy (e.g., mechanical, acoustic, thermal, electromagnetic, or nuclear) in an organism.
protective devices
Barriers, guards, or other equipment placed between a person and a hazard, designed to prevent injury or harm (also see personal protective equipment).
recognition
Appropriate identification and classification of hazardous agents in the workplace or environment.
safety engineering
A discipline of applied science that seeks to minimize potential health hazards through the application of process design and testing standards that reduce the likelihood of injury or adverse outcomes.
safety hazard
A set of circumstances or agents that can increase the likelihood of injury or adverse outcomes for a person.
sample collection instruments
Devices designed to probe environmental media and capture or record agents for analysis.
subchronic exposure
A contact between an agent and a target of intermediate duration between acute and chronic. (Other terms, such as less-than-lifetime exposure, are also used.)
substitution
Replacing a hazardous agent with another, less hazardous alternative.
target organ dose
The amount or fraction of a harmful agent that reaches an organ or tissue in the body that is the site where the adverse health outcome originates.
ventilation
Movement of air, through the use of fans or other means, so as to provide sufficient clean or uncontaminated air to maintain health.
walk-through
A site survey technique used by hygienists and exposure scientists to observe potential hazards; generally it consists of a planned traverse of the (work) site, accompanied by people familiar with the work processes, along with forms, photographic equipment, or other methods of documenting the conditions.

Discussion Questions

  1. What are the three primary routes of exposure for people?
  2. Which routes of exposure are most important for workplaces? Which routes are more important in the community setting than they are in the workplace? Which are less important?
  3. How are the routes of exposure different from exposure pathways? Which one is more important for the purpose of controlling exposures? Provide an example of an exposure pathway and describe how it applies to exposure control.
  4. Name three key advantages of exposure biomonitoring compared to other types of exposure assessment. Are there any disadvantages to using biomonitoring?
  5. Provide an example of a direct exposure assessment method and an indirect exposure assessment method for dietary exposures. What are the strengths and limitations of each? Which type of assessment applies to a duplicate diet study?
  6. What are the differences between cumulative assessments and aggregate exposure assessments?

References

  1. Bossé, Y., Postma, D. S., Sin, D. D., Lamontagne, M., Couture, C., Gaudreault, N.,…Laviolette, M. (2012). Molecular signature of smoking in human lung tissues. Cancer Research, 72, 3753–3763.
  2. Cochran, R. C., & Driver, J. H. (2012). Estimating human exposure: Improving accuracy with chemical markers. Progress in Molecular Biology and Translation Science, 112, 11–29.
  3. Cohen Hubal, E. A., Richard, A. M., Shah, I., Gallagher, J., Kavlock, R., Blancato, J., & Edwards, S. W., (2010). Exposure science and the U.S. EPA National Center for Computational Toxicology. Journal of Exposure Science & Environmental Epidemiology, 20(3), 231–236.
  4. International Occupational Hygiene Association. (2009). What is occupational hygiene? Retrieved from http://ioha.net/objectives.html
  5. Koren, H., & Bisesi, M. (2002). Handbook of environmental health and safety: Principles and practices (2 vols., 4th ed.). Boca Raton, FL: CRC Press.
  6. Kyrtopoulos, S. A. (2013). Making sense of OMICS data in population-based environmental health studies. Environmental and Molecular Mutagenesis, 54(7), 468–479.
  7. Marsillach, J., Costa, L. G., & Furlong, C. E. (2013). Protein adducts as biomarkers of exposure to organophosphorus compounds. Toxicology, 307, 46–54.
  8. National Research Council. (2012). Exposure science in the 21st century: A vision and a strategy. Washington, DC: National Academies Press.
  9. Ryan, P. B., Burke, T. A., Cohen Hubal, E. A., Cura, J. J., & McKone, T. E. (2007). Using biomarkers to inform cumulative risk assessment. Environmental Health Perspectives, 115(5), 833–840.
  10. Sexton, K., Needham, L. L., & Pirkle, J. L. (2004). Human biomonitoring of environmental chemicals: Measuring chemicals in human tissues is the “gold standard” for assessing people's exposure to pollution. American Scientist, 94(1), 38–45.
  11. U.S. Environmental Protection Agency, Office of Research and Development, National Center for Environmental Assessment. (2015). Integrated Risk Information System (IRIS): IRIS glossary. Retrieved from http://www.epa.gov/ncea/iris/index.html
  12. Wild, C. P. (2005). Complementing the genome with an “exposome”: The outstanding challenge of environmental exposure measurement in molecular epidemiology. Cancer Epidemiology, Biomarkers & Prevention, 14(8), 1847–1850.

For Further Information

Books and Articles

Standard References in Industrial Hygiene

  1. Plog, B. A., & Quinlan, P. J. (Eds.). (2012). Fundamentals of industrial hygiene (6th ed.). Itasca, IL: National Safety Council.
  2. Ramachandran, G. (2005). Occupational exposure assessment for air contaminants. Boca Raton, FL: CRC Press.
  3. Rose, V. E., & Cohrssen, B. (Eds.). (2011). Patty's industrial hygiene (4 vols., 6th ed.). Hoboken, NJ: Wiley.

Overviews of Exposure Assessment

  1. Butt, T. E., Clark, M., Coulon, F., & Oduyemi, K. O. (2009). A review of literature and computer models on exposure assessment. Environmental Technology, 30(14), 1487–1501.
  2. Cohen Hubal, E. A., Sheldon, L. S., Burke, J. M., McCurdy, T. R., Berry, M. R., Rigas, M. L.,…Freeman, N. C. (2000). Children's exposure assessment: A review of factors influencing children's exposure, and the data available to characterize and assess that exposure. Environmental Health Perspectives, 108(6), 475–486.
  3. Cordioli, M., Ranzi, A., De Leo, G. A., & Lauriola, P. (2013). A review of exposure assessment methods in epidemiological studies on incinerators. Journal of Environmental and Public Health, 129470.
  4. Poole, A., van Herwijnen, P., Weideli, H., Thomas, M. C., Ransbotyn, G., & Vance, C. (2004). Review of the toxicology, human exposure and safety assessment for bisphenol A diglycidylether (BADGE). Food Additives and Contaminants, 21(9), 905–919.
  5. Rezagholi, M., & Mathiassen, S. E. (2010). Cost-efficient design of occupational exposure assessment strategies—a review. Annals of Occupational Hygiene, 54(8), 858–868.

Reviews of Exposure Biomarkers

  1. Corradi, M., Goldoni, M., & Mutti, A. (2015). A review on airway biomarkers: Exposure, effect and susceptibility. Expert Review of Respiratory Medicine, 9(2), 1–16.
  2. DeMarini, D. M. (2013). Genotoxicity biomarkers associated with exposure to traffic and near-road atmospheres: A review. Mutagenesis, 28(5), 485–505.
  3. Liu, K. S., Hao, J. H., Zeng, Y., Dai, F. C., & Gu, P. Q. (2013). Neurotoxicity and biomarkers of lead exposure: A review. Chinese Medical Sciences Journal, 28(3), 178–188.

In addition, the Centers for Disease Control and Prevention maintains a Web site with useful information on biomonitoring: http://www.cdc.gov/biomonitoring

The Exposome

In addition to Kyrtopoulos (2013), listed in the References, see the following:

  1. Bonvallot, N., Tremblay-Franco, M., Chevrier, C., Canlet, C., Debrauwer, L., Cravedi, J. P., & Cordier, S. (2014). Potential input from metabolomics for exploring and understanding the links between environment and health. Journal of Toxicology and Environmental Health, Part B: Critical Reviews, 17(1), 21–44.
  2. Verma, M. (2012). Epigenetic biomarkers in cancer epidemiology. Methods in Molecular Biology, 863, 467–480.
  3. Wild, C.P., Scalbert, A., & Herceg, Z. (2013). Measuring the exposome: A powerful basis for evaluating environmental exposures and cancer risk. Environmental and Molecular Mutagenesis, 54(7), 480–499.

Reviews of Job-Exposure Matrices

  1. Burstyn, I. (2011). The ghost of methods past: Exposure assessment versus job-exposure matrix studies. Occupational and Environmental Medicine, 68(1), 2–3.
  2. Lavoue, J., Labreche, F., Richardson, L., Goldberg, M., Parent, M. E., & Siemiatycki, J. (2014). 0382 CANJEM: A general population job exposure matrix based on past expert assessments of exposure to over 250 agents. Occupational and Environmental Medicine, 71(Suppl. 1), A48.
  3. Peters, S., Kromhout, H., Portengen, L., Olsson, A., Kendzia, B., Vincent, R.,…Vermeulen, R. (2013). Sensitivity analyses of exposure estimates from a quantitative job-exposure matrix (SYN-JEM) for use in community-based studies. Annals of Occupational Hygiene, 57(1), 98–106.

Organizations

Information on Industrial Hygiene

  1. American Conference of Governmental Industrial Hygienists (ACGIH): http://www.acgih.org
  2. American Industrial Hygiene Association (AIHA): http://www.aiha.org

Information on Exposure Assessment

  1. International Society of Exposure Science (ISES): http://www.iseaweb.org
  2. U.S. Environmental Protection Agency, Exposure assessment tools and models (2014): http://epa.gov/opptintr/exposure