The principles of green chemistry have been closely related to the field of toxicology since the concepts were first articulated in the early 1990s. At least three of the 12 principles of green chemistry involve either human health issues or environmental toxicity and risk identification [1].
Chemicals that exist in or that are entering the environment fall into two toxicological categories: data rich and data poor, with 80–90% of all chemicals falling into the data-poor category. Over the last several years, new technologies have emerged that attempt to resolve the data-poor category to allow clear hazard-based decisions and regulatory actions to take place. These technologies have included multiple genomics technologies, high-throughput screening against known toxicity related targets, high-content screening where targets, time frames, and chemicals are evaluated in multiplexed formats, and more relevant cell-based assays that presumably replicate human systems [3]. More recently, the fields of systems biology and computational toxicology have evolved into primary tools used to identify data gaps and suggest prioritized screening programs.
In many respects, these newer approaches have changed the focus of toxicological inquiry from standard testing schemes in animals to a more integrated approach that identifies critical events leading to toxicity and indicators of hazards in a variety of nonanimal systems. The majority of publicly available open access databases of toxicological findings were derived from older, standardized testing procedures in which newer concepts were not incorporated. The newer concepts include attempts to define mode of action (MoA) [4], adverse outcome pathways (AOPs) [5], and the threshold of toxicological concern (TTC) [6].
This chapter highlights both older and newer methodologies and discusses the challenges of using animal-based, cellular, or subcellular assays to predict human toxicological outcomes. A critical question will be raised in Section 11.7: How much toxicological science must a chemist know and understand to function in a high-level green chemistry mode?
Several textbooks and articles have been written on various aspects of the field of toxicology, most notably Casarett & Doull’s Toxicology: The Basic Science of Poisons [7], which offers clear, concise descriptions of key concepts of toxicology. However, a useful starting point for a nontoxicologist is an open access toxicology tutorial at http://sis.nlm.nih.gov/enviro/toxtutor/Tox1. The following sections summarize key points in the online tutorial, which can also be studied in more detail in Casarett & Doull’s textbook.
The discipline of toxicology draws upon a combination of chemistry and biology. More specifically, toxicology is the study of the adverse effects of chemicals and physical agents on living organisms. A toxicologist investigates the inherent properties of toxicants, their interactions with biomolecules, and the subsequent effects on exposed organisms and systems.
Many types of chemicals may cause detrimental effects in living individuals. A toxic agent is simply anything that produces an adverse biological effect, including physical agents such as radiation or temperature. More narrowly, a toxicant is a substance that produces an adverse effect. This definition excludes physical and biological (living) agents. A toxin is a specific harmful protein produced by a living organism. Poison is a general term for a toxicant that causes immediate illness or death in relatively small concentrations. Note that biological agents such as bacteria and viruses that invade and multiply are not considered toxicants, though they possess the potential to harm their hosts. Toxicology typically addresses nonliving agents such as elements and molecules. Toxicology, therefore, is intricately linked with the understanding and advances of chemistry.
Many contributing factors determine the effect of a given toxicant in a particular situation. The age, species, and sex of an exposed organism each influence the toxicant’s action. Additionally, the chemical form, dosage, and route of exposure (dermal, gastrointestinal tract, etc.) of the toxicant are critical factors. Together, these variables govern the amount of the substance that enters the body and thereby its ultimate effect. Toxicokinetics will be discussed more fully in subsequent sections.
Adverse effects can also be classified as chronic, subchronic, subacute, or acute. Chronic toxicity refers to cumulative damage after months or years of exposure to a toxicant. Subchronic usually describes an incidence of exposure that lasts several weeks or months. Subacute indicates an exposure event that is limited, but repeated more than once. Acute toxicity is the term for an immediate and often severe effect that is apparent after a single dose. A single compound may exert different effects at different exposure levels. For example, one acute effect of benzene is central nervous system depression, while chronic benzene exposure may cause bone marrow toxicity.
Toxic effects may be categorized further by the tissue, organ, or system that is targeted. Some examples to demonstrate the variety of toxic pathways are (1) reproductive toxins that harm the gonads, reproductive organs, or fetus, (2) hepatotoxins that specifically damage liver cells, or (3) genotoxins that alter DNA or chromosome structure or number. In this way, toxicants may act on whole organisms, specific cell types, or a single biological molecule. The diversity of toxicants is reflected in the myriad effects they produce in living systems.
The experimental process is essential to the discipline of toxicology. Scientific studies allow toxicologists to identify toxicants and understand their mechanisms. This information is imperative to medical and regulatory professionals who work to preserve human health and environmental sustainability. Toxicological findings and expressions of risk differ depending on whether the compound in question is an environmental chemical or a therapeutic agent. With therapeutics, there is always a risk/benefit evaluation, whereas with environmental chemicals there are typically no benefits, only risk. For chemicals in products, such as consumer products and intentional food additives, one must assume that the chemical has a desired function that is qualified by its potential toxicity risk. Toxicological research, however, can be challenging to conduct because it necessarily investigates adverse effects in living systems, in many instances at doses much higher than those expected to reach humans. Toxicologists generally rely on three sources of scientific data to minimize potential human injury: (1) accidental or routine exposure cases in human populations, or clinical trials with therapeutics in humans, (2) animal studies (in vivo), and (3) cellular studies (in vitro). Accidental or routine exposure scenarios afford toxicologists the rare opportunity to collect human data on environmental chemicals, which is invaluable for understanding human metabolism and susceptibility. Animal studies have traditionally been used to mimic and predict human reactions to toxicants at various levels and in various circumstances. Unlike human exposure cases, animal studies allow the researcher to control more variables and produce statistically stronger data; the task of extrapolating from animals to humans, however, remains a challenge of the method. Cellular studies have grown in popularity and utility in recent decades. They offer the advantages of being highly controllable and economical, and they do not use animal subjects. Cellular studies, however, are generally not as comprehensive as other test methods, and extrapolation to humans is even more challenging from cell lines than from animal models. Currently, scientific advances and ethical concerns are pushing the field of toxicology toward alternative testing methods that spare human, animal, and environmental distress.
A dose is the amount of a substance administered at one time. As simple as this term is, it underlies one of toxicology’s founding precepts: “dose determines response.” Based on this premise, there is a safe and a dangerous level for every substance, whether it is typically considered benign or toxic. A dose–response graph depicts the relationship between exposure and a result, or toxicological endpoint. A typical dose–response graph uses experimental data to plot a range of doses on the x-axis against the frequency of some measurable effect on the y-axis. Most often the graph resembles an “S,” with low doses eliciting no response and high doses eliciting the maximum response; between these regions, the response frequency increases with the dose.
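To make the shape concrete, the following minimal sketch fits a four-parameter log-logistic (Hill) curve to a hypothetical quantal dose–response dataset using Python and SciPy; the dose groups, response frequencies, starting values, and bounds are illustrative assumptions, not data from any study cited here.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, bottom, top, ed50, slope):
    """Four-parameter log-logistic (Hill) dose-response model."""
    return bottom + (top - bottom) / (1.0 + (ed50 / dose) ** slope)

# Hypothetical dose groups (mg/kg) and observed response frequencies (%)
doses = np.array([0.1, 0.3, 1, 3, 10, 30, 100])
responses = np.array([2, 5, 12, 35, 70, 90, 97])

# Fit the sigmoid; p0 gives rough starting guesses, bounds keep parameters sensible
params, _ = curve_fit(
    hill, doses, responses,
    p0=[1, 95, 5, 1],
    bounds=([0, 50, 0.01, 0.1], [10, 110, 100, 5]),
)
bottom, top, ed50, slope = params
print(f"Estimated midpoint (ED50) ~ {ed50:.1f} mg/kg, Hill slope ~ {slope:.2f}")
```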
Key data points on a dose–response curve that appear in the literature and public databases are the no observed adverse effect level (NOAEL) and the lowest observed adverse effect level (LOAEL). These are actual data points on the graph that reflect the highest dose at which no adverse effect is observed and the lowest dose at which an adverse effect is observed. In some cases, these levels are subject to interpretation by a toxicologist as to what is considered adverse in a study. If the level reflected any effect, adverse or not, it would be called the no observed effect level, or NOEL. Other important pieces of information are the effective dose (ED), in therapeutic terms, the toxic dose (TD), and the lethal dose (LD). These doses can be extrapolated from the dose–response graph for any desired portion of the population. For example, the LD50 is the dose that is lethal in 50% of the population being tested. One measure of safety used by toxicologists studying therapeutic entities is the therapeutic index (TI). Usually this is calculated as the ratio of the TD50 over the ED50, that is, the dose that is toxic in 50% of the tested population over the dose at which 50% of the population experiences the relevant therapeutic effect. These 50% levels are usually extrapolations from the dose–response curve. The TI can be misleading, however, because it does not consider the unique shape of each dose–response curve; the shape and slope may differ between toxicity and efficacy.
The shape of a dose–response curve is highly informative. A compound is generally considered safer the further its effective (beneficial) dose is from its toxic dose. The slope of the dose–response line between the TD0 and TD100 helps to visualize this relationship: a steeper slope indicates a narrow range between doses producing no toxicity and those producing maximal toxicity, and thus a potentially dangerous compound whose toxic doses lie close to its effective doses, whereas a shallower slope implies a safer compound with a greater range between the effective and toxic doses. The margin of safety (MOS) uses this information to categorize and compare the safety of different compounds. The MOS is computed as the ratio of the LD1 over the ED99. The larger the MOS, the safer the compound is generally considered. Again, this calculation is generally used for therapeutic compounds. For environmental chemicals, other calculations are used, some of which compare an estimate of acceptable daily intake with levels of intentional or unintentional exposure.
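As an illustration of how these safety ratios are derived from fitted curves, the sketch below computes the TI (TD50/ED50) and the MOS (LD1/ED99) by inverting quantal Hill curves; all curve parameters are hypothetical and stand in for fitted efficacy, toxicity, and lethality curves.

```python
def dose_for_fraction(p, d50, slope):
    """Dose at which a fraction p of the population responds,
    assuming a quantal Hill model with midpoint d50 and the given slope."""
    return d50 * (p / (1.0 - p)) ** (1.0 / slope)

# Hypothetical curve parameters (mg/kg)
ed50, eff_slope = 5.0, 1.2     # therapeutic effect curve
td50, tox_slope = 150.0, 2.0   # toxic effect curve
ld50, leth_slope = 400.0, 2.5  # lethality curve

ti = td50 / ed50                                  # therapeutic index TD50/ED50
ed99 = dose_for_fraction(0.99, ed50, eff_slope)   # dose effective in 99% of the population
ld1 = dose_for_fraction(0.01, ld50, leth_slope)   # dose lethal in 1% of the population
mos = ld1 / ed99                                  # margin of safety LD1/ED99

print(f"TI = {ti:.1f}, MOS = {mos:.2f}")
```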
A few definitions are needed to understand risk assessment. Risk is the likelihood that harm will occur under a given set of circumstances, while hazard refers to the inherent capability of a substance to cause an adverse effect. Furthermore, risk assessment is the practice of identifying hazards, exposures, and risks, and risk management is the regulatory action taken based on scientific, social, and economic factors.
The first publication of standardized risk assessment concepts and terminology was written by the National Academy of Sciences in 1983 [8]. This report outlines four basic steps of risk assessment: (1) hazard identification, (2) dose–response assessment, (3) exposure assessment, and (4) risk characterization. In the first step, a scientist attempts to identify the harm a compound could cause.
This can be determined by animal, cellular, and epidemiology studies or predicted from the chemical structure using a structure–activity relationship (SAR) model. In the dose–response assessment step, available data are mathematically manipulated to address human metabolism and exposure parameters. This usually entails extrapolating a relevant low dose for humans based on high-dose animal studies. A useful value to calculate is the acceptable daily intake (ADI), obtained by dividing the NOAEL by a safety (uncertainty) factor that accounts for uncertainty in cross-species and interindividual extrapolations. The ADI reflects the amount to which an individual can be exposed daily without experiencing adverse effects. The exposure assessment portion identifies the route, frequency, severity, and duration of an exposure event in a given population. Finally, the risk characterization step estimates a substance’s effects in given environments and populations. The characterization must consider the intensity and nature of an effect and the consequences of exposure to multiple substances. Furthermore, this step should evaluate the previous three steps and note areas of uncertainty.
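A minimal sketch of the ADI calculation, assuming a hypothetical NOAEL and the commonly used default 100-fold composite uncertainty factor:

```python
def acceptable_daily_intake(noael_mg_per_kg, uncertainty_factor=100):
    """ADI (mg/kg body weight/day): NOAEL divided by a composite uncertainty
    factor (e.g., 10 for interspecies x 10 for interindividual variability)."""
    return noael_mg_per_kg / uncertainty_factor

# Hypothetical NOAEL of 50 mg/kg/day from a chronic rodent study
print(acceptable_daily_intake(50.0))  # -> 0.5 mg/kg/day
```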
Regulation is the intersection of scientific data and law. Regulation is based on risk assessment and describes the action taken by a government to protect the public from dangerous toxic exposures. Governmental bodies such as the Environmental Protection Agency, Food and Drug Administration, and Occupational Safety and Health Administration are responsible for enacting these protective measures. Regulatory practices include setting exposure limits, mandating protective occupational procedures, and requiring sufficient warning labels. The challenge of regulation is finding an acceptable, realistic exposure level. Calculations must include a variety of factors that may affect exposure, such as water intake, air intake, time spent indoors/outdoors, and compounding exposures. With so many untested chemicals and variable exposure factors, regulation is one of the most challenging and important areas of toxicology.
Toxicokinetics describes the journey of a toxicant within a living system. The process includes four fundamental steps: absorption, distribution, metabolism, and excretion. Simplistically, toxicokinetics may be thought of as “what the body does to a chemical,” in contrast to toxicodynamics, which may be thought of as “what the chemical does to the body.” The amount of toxicant in the blood over a given period constitutes the internal dose and is the most relevant measurement in determining potential effects in humans [9].
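Internal dose is commonly summarized as the area under the blood concentration–time curve (AUC). The sketch below computes an AUC by the trapezoidal rule from a hypothetical concentration–time profile; the sampling times and concentrations are illustrative only.

```python
import numpy as np

# Hypothetical blood concentration-time data after a single oral dose
times_h = np.array([0, 0.5, 1, 2, 4, 8, 12, 24])              # hours
conc_mg_per_l = np.array([0, 1.8, 2.6, 2.1, 1.4, 0.7, 0.35, 0.08])

# Area under the concentration-time curve by the trapezoidal rule,
# a common summary measure of internal dose
auc = np.trapz(conc_mg_per_l, times_h)
print(f"AUC(0-24 h) ~ {auc:.2f} mg*h/L")
```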
For a toxicant to exert a biological effect, it must first gain access to a living individual. The route by which a chemical enters the body can dramatically influence its action. The absorption path is determined by a combination of the toxicant and organism’s inherent chemical properties. In general, lipophilic, nonpolar compounds pass most easily through the body’s lipid cellular membranes. Specialized receptors, however, may bind and uptake selected substances that could not independently enter the cell. Additionally, a process called endocytosis in which a cell engulfs surrounding molecules may enable cellular absorption of polar, hydrophilic molecules. Some of the most common routes of absorption are listed below.
GASTROINTESTINAL TRACT. The gastrointestinal (GI) tract is a complex system comprised of several diverse environments including the acidic stomach and alkaline intestines. Substances that are taken into the body through the mouth follow this absorption path.
The digestive solvents of the stomach provide a favorable environment for absorption of acidic compounds. Within the stomach, acidic molecules are nonionized due to the excess of protons H+. As nonionized compounds, they pass through the stomach’s cellular membranes more easily into general circulation. Most compounds, however, continue through the GI tract to the small and large intestines after the stomach, where different pH gradients exist. Compounds absorbed via the gastrointestinal tract will be distributed to the liver first and subject to first-pass metabolism prior to entering the systemic circulation. This is discussed later in more detail (see Section 11.2.2.4).
RESPIRATORY TRACT. Volatile substances that are inhaled enter the body through the respiratory tract. This includes the nasopharyngeal region, tracheal area, and lungs. The lungs are the exchange site of vital gaseous compounds present in the air and volatile waste compounds from the body. The exchange relies on a large surface area within the lungs composed of a single layer of cells. While efficient for gas exchange, the slim barrier makes the lungs especially vulnerable to volatile toxicants.
DERMAL ROUTE. Substances that are able to penetrate the skin may follow the dermal absorption path. The skin’s outer layer, the epidermis, is a relatively difficult barrier to penetrate. It has many layers of cells, is comparatively dry, and provides only limited access to blood vessels. The outermost layer, the stratum corneum, is packed with the protein keratin, which makes it especially difficult to penetrate.
OTHER ROUTES. The GI tract, respiratory tract, and epidermis are the most important absorption paths of most toxicants, but other routes do exist, especially in a medical context. These include intradermal, subcutaneous, intramuscular, intraperitoneal, and intravenous injections.
Distribution is the step of toxicokinetics in which a toxicant moves from the site of absorption throughout the body. The toxicant may pass cell barriers, enter the circulatory system, or enter the lymphatic system. The distribution mechanism can have profound effects on the action of a toxicant. A substance absorbed through the GI tract will be transported through the portal system to the liver. Compounds absorbed through the lungs or skin, or given by intravenous injection, however, will immediately enter systemic circulation. These two routes are metabolically distinct, as demonstrated in the next section.
Distribution often follows blood flow. Organs that receive the most blood, therefore, are at heightened risk of toxicant exposure. Some toxicants may have special affinity for certain tissues regardless of blood circulation levels. The body also possesses several barriers such as the blood–brain barrier or placental barrier that specifically limit distribution. Distribution is not a uniformly beneficial or detrimental process. A toxicant may be distributed to its site of toxic action or it may be transported to an inert storage tissue. For example, lead is a potent neurotoxin when it reaches the brain. It is chemically similar to calcium, however, and may safely be taken up and stored by bone.
Several distribution models have been developed to describe a toxicant’s movement through the body. These models help estimate the concentration and duration of a toxicant in a living individual. The simplest distribution models treat the body as one compartment. This assumes that the chemical is spread evenly through the body and eliminated at a steady rate proportional to the amount left in the system. In reality, few toxicants follow a one-compartment model. A two-compartment model is slightly more complex and usually more realistic. In this formulation, the first compartment clears the toxicant from the body steadily with time, similarly to the one-compartment model. The toxicant concentration in a second compartment, however, rises as the first declines. This represents the toxicant’s movement from one area of the body to a second compartment, for example, from the blood to adipose tissue (body fat). The concentration in the second compartment also declines with time, but at a different rate. This concept can be expanded to multicompartment models that consider additional regions of the body.
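The sketch below contrasts the two models numerically: a one-compartment first-order decay and a two-compartment system in which the toxicant moves between a central compartment (e.g., blood) and a peripheral compartment (e.g., adipose tissue), with elimination from the central compartment only. The dose and all rate constants are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import odeint

t = np.linspace(0, 24, 200)        # hours after an absorbed dose
dose = 100.0                        # amount in the central compartment at t = 0 (mg)

# One-compartment model: even distribution, first-order elimination
k_el = 0.3                          # elimination rate constant (1/h)
amount_one = dose * np.exp(-k_el * t)

# Two-compartment model: transfer between central and peripheral compartments,
# with elimination occurring only from the central compartment
k12, k21, k10 = 0.5, 0.2, 0.3       # transfer and elimination rate constants (1/h)

def two_compartment(y, t):
    central, peripheral = y
    d_central = -(k10 + k12) * central + k21 * peripheral
    d_peripheral = k12 * central - k21 * peripheral
    return [d_central, d_peripheral]

amounts = odeint(two_compartment, [dose, 0.0], t)
# amounts[:, 0] (central) falls biexponentially; amounts[:, 1] (peripheral)
# rises, peaks, and then declines at a different rate
```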
The body uses metabolic processes to transform toxic chemicals into more hydrophilic, and thus more easily excreted, compounds. In some cases, however, metabolism “bioactivates” a molecule, increasing its toxicity. Metabolism can be broken into two main phases. In phase I, a functional chemical group is added to the compound in preparation for phase II. Phase I enzymes can oxidize, reduce, or hydrolyze a toxicant. The most numerous phase I enzymes are the cytochrome P450s, or CYPs. In phase II, enzymes that recognize the phase I chemical motifs or functional groups add large polar groups to the toxicant. This prevents the compound from passing through cellular membranes and increases the elimination of the conjugate (the substance in its hydrophilic state). Two important phase II reactions are glucuronidation, which adds glucuronic acid (C6H10O7), and sulfation, which adds a sulfate group (SO4^2−) to the chemical structure. Each phase I and II enzyme has a range of accepted substrates, which are well known and in several instances are predictable.
The liver possesses the most metabolic enzymes, which is why it is important for toxicants absorbed from the GI tract to pass through the liver and undergo detoxifying metabolic reactions before entering systemic circulation. This protective process is called first-pass metabolism and, as mentioned earlier, applies mainly to orally absorbed substances. Substances absorbed dermally, by inhalation, or administered intravenously bypass first-pass metabolism; the portion delivered to the liver is then proportional to the fraction of systemic blood flow the liver receives. The kidneys and lungs are lesser sources of detoxification, with approximately 10–30% of the metabolic capacity of the liver.
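A simple way to express the consequence of first-pass metabolism is the fraction of an oral dose that reaches systemic circulation. The sketch below uses the standard relationship between the absorbed fraction and hepatic extraction; the numerical values are hypothetical.

```python
def oral_bioavailability(fraction_absorbed, hepatic_extraction):
    """Fraction of an oral dose reaching systemic circulation after
    first-pass hepatic extraction (simple well-stirred approximation)."""
    return fraction_absorbed * (1.0 - hepatic_extraction)

# Hypothetical compound: 80% absorbed from the GI tract,
# 60% removed by the liver on the first pass
print(oral_bioavailability(0.8, 0.6))  # -> 0.32, i.e., ~32% systemic availability
```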
Excretion is the process by which a compound exits the body. Urinary excretion, fecal excretion, and exhaled air are the most common pathways. Compounds removed from the blood by the kidney are excreted in urine. Within the kidney, functional units called nephrons filter the blood. Some bases and weak acids are actively transported out of the blood. Ions are especially subject to the kidney’s filtration. Large molecules are primarily excreted in fecal matter because of the size exclusion in the glomerulus of the kidney. In addition, large molecules may be secreted in the bile and therefore excreted in the feces. Gases with low solubility and liquids with high volatility are efficiently excreted in exhaled air. Other less prominent excretion pathways include breast milk, sweat, and tears. Certain heavy metals such as Cr, Cd, Co, W, and U will accumulate in hair follicles and be detectable in hair.
The body is an adaptive system that constantly adjusts in response to its changing surroundings. Homeostasis refers to the body’s ability to retain a relatively stable state in variable conditions. In the presence of a disruptive toxicant, cells undergo various response pathways to cope with the imbalance. This may involve an increase or decrease in cellular activity or an alteration to the cell’s morphology and function. At times, the repair system is detrimental to the organism, and this is referred to as pathological adaptation. A relatively new approach in toxicology research deals with the identification of targets or biological molecules that are perturbed and that create a signal of potential outcomes. These signals are called biomarkers and will be discussed later (see Section 11.3.2).
Atrophy is the cellular response of size reduction. This response lessens the cell’s oxygen, organelle (specialized subunit within a cell with a specific function), and nutrient needs. This may be an effective reaction to diminished resources. Hypertrophy is an increase in cell size. This can be beneficial when increased capacity is demanded of a cell that does not normally divide, such as cardiac or skeletal muscle cells. Hyperplasia refers to an increase in cellular number by division. This is only possible in cells capable of mitosis. During metaplasia one mature cell type transforms to another mature cell type. This process is used when scar tissue, for example, replaces normal functioning tissue in response to chronic irritation or inflammation.
Some toxicants induce irreversible cell damage. When cells are unable to effectively adjust to an imbalance, they undergo apoptosis or necrosis. Apoptosis is a normal cellular process in which a cell undergoes programmed death. The cell shrinks and fragments into bodies that are naturally phagocytosed or cleared from the area. This does not initiate an inflammatory response or affect surrounding cells. Necrosis, conversely, is a disorderly and hazardous cellular death pathway. When a toxicant induces cellular necrosis, the cell may lose membrane integrity and leak its internal contents. This initiates an inflammatory response that is potentially dangerous to surrounding cells. These attributes of cells are used to determine the effects of compounds on specific cell types in cellular assays. Such assays are used to develop in vitro dose–response curves in the cells studied.
The central and peripheral nervous systems are composed of specialized neural cells. Toxicants may affect neurons and supporting cells in three main ways. The first is by inducing cell death. The second is by disrupting electrical transmission: neurons function by passing electrical signals down the length of their cell bodies to adjoining cells, and when this process is disrupted, sensory, motor, and thought functions are diminished or lost. The third is by interfering with chemical signaling: some signals and commands are transmitted between neurons by neurotransmitters, chemicals that are released from one neuron and received by another, and toxicants that interfere with this process may increase or decrease neurotransmitter activity and alter normal signaling.
Cancer is a specialized response pathway that is a health concern of high importance to toxicologists. In the initiation stage of cancer, a cell acquires a mutation that confers unregulated growth. In a promotion event, the mutated cell proliferates and forms a mass, or tumor. During progression, individual cells dislocate from the tumor and spread; this is the process of metastasis. Carcinogens are agents that induce mutagenesis, promotion, or progression events. The formation of cancer in animal studies is the endpoint that causes the most widespread concern for both environmental chemicals and therapeutic agents. This will be discussed in more detail later in this chapter (see Section 11.6).
Newer approaches to screening for potential toxic effects in humans rely heavily on the precedent of novel, useful screens that have been incorporated into the drug discovery process of the biopharmaceutical industry [10]. For the most part, the procedures that have been most successful are those focused on understanding chemically induced toxicity, where a correlation between chemical structure and potential toxicity is assessed [11, 12]. These include both in vitro and in vivo studies designed to uncover chemical reactivity or metabolic instability; unintended off-target effects, possibly related to unanticipated interaction with targets or receptors other than those targeted for therapeutic reasons; physicochemical properties of compounds and/or metabolites that affect the absorption, distribution, and elimination of these molecules; and induced or altered biological interactions that could lead to unintended effects. Compounds or classes of compounds that are known to confer toxicity to a specific tissue will typically lead to the development of an in vitro assay with results correlated with potential toxicity endpoints in either animal or human studies. These can become defined, quantifiable endpoints that can lead to a series of predictive toxicological practices. To facilitate the discovery process of therapeutics, chemical libraries are maintained that group chemicals by structural similarities [13]; therefore, detailed quantitative structure–activity relationship (QSAR) modeling can be used as a predictive process. It should be pointed out that these screens are used primarily to rank order compounds in analog series [14]. The challenge is greater for chemical compounds where distinct analog series do not exist, such as industrial chemicals or environmental pollutants. Years of work in this field have led to some important conclusions. First, screens must be predictive, and the predictive endpoint must be human toxicity. If the goal of any screening effort is to predict toxicity in rodents because rodents were used in pivotal toxicology studies, then the toxicity induced in rodents must be relevant and predictive of the same effect in humans. Typically, the most relevant cellular in vitro assays are done in human cell-based systems, which maintain both genotype and phenotype [15]. Second, assays developed retrospectively based on experience or a previously observed effect tend to validate the previous finding but may not be predictive for previously untested compounds. Third, it must be clear whether the screening process is predicting potential toxicity or a plausible mechanism that could potentially lead to toxicity in humans [14]. This lesson was learned in large-scale toxicogenomics screening programs, where a gene expression endpoint was a potential mechanism marker but not necessarily predictive of toxicity [16].
Because of these challenges, and particularly the advantages and disadvantages of relying on results from single endpoint screens, the field has been turning to the integrated approach of incorporating emerging technologies into chemical safety assessment [17]. Three important aspects of this approach are highlighted below.
The mode of action (MoA) approach seeks to gain an understanding of the key events along a causal pathway that lead to a toxicological endpoint. Extensive reviews of MoA examples exist in the literature [18, 19, 20]. Additionally, the International Programme on Chemical Safety (IPCS) has published Human Relevance Frameworks [21, 22], a process that incorporates a weight-of-evidence approach relying heavily on robust mechanistic and experimental data and that is applied through a structured series of questions about whether a MoA can be established in animals and whether it is qualitatively and quantitatively relevant to humans.
A good example of some of the uncertainty and differences in response between species is the evaluation of tumors in rodent studies and the corresponding relevance in humans [23]. Rodent carcinogenicity studies are evaluated and peer reviewed but only in reference to the chemical being tested and the validity of the findings (usually statistical significance) in the study. Studies and results appear in public access databases, and these data become the evidence from which to draw human relevance. Using examples of rat thyroid and pancreatic tumors, relevance conclusions may contain the following type of information.
The rat thyroid is more susceptible to secondary (nongenotoxic) carcinogenesis than the human thyroid because rats lack the high-affinity thyroxine-binding globulin present in humans; instead, rats rely on a low-affinity albumin (approximately 1000-fold lower affinity). Consequently, the thyroid hormone half-life in rats is about 10 times shorter than in humans and turnover is more rapid, requiring more “work” to maintain homeostasis. Furthermore, a greater amount of thyroid hormone is present in the follicular colloid of humans than of rats. Thus, greater demand for thyroid hormone in humans is met by the ready reserve available from thyroxine-binding globulin and colloid, whereas the rat must synthesize more hormone, a demand associated with a greater thyroid follicular proliferative response. Accordingly, rats are more susceptible to hyperplasia and neoplasia upon disruption of the synthesis, secretion, or metabolism of thyroid hormones. In rats, proliferative changes are primarily due to prolonged stimulation by TSH released by the pituitary (an endocrine gland connected to the hypothalamus) in response to decreases in circulating T3 and T4 levels (tyrosine-based hormones produced by the thyroid gland and responsible for regulating metabolism). Alterations in the normal feedback mechanism usually arise from xenobiotic-induced disruption of thyroid hormone synthesis, secretion, or metabolism.
Rats in an altered metabolic condition, particularly with an increased metabolic load and a stimulated pituitary gland, would be likely candidates for thyroid follicular tumors [19, 23, 24, 25, 26, 27]. Furthermore, unlike rats, hypothyroidism in humans, associated with increased TSH, is not related to an increased risk of thyroid cancer. The only recognized human thyroid cancer-inducing agent is radiation.
Pancreatic cancers of the acinar cell type (acinar refers to a cluster of cells resembling berries) can be experimentally induced in rats through a sequence of changes beginning with hyperplasia, progressing to adenomas and ultimately carcinomas. In contrast, humans characteristically develop ductal carcinomas and their pathogenesis does not involve acinar cell hyperplasia or adenomas. High unsaturated fat diets, corn or safflower oil by gavage, trypsin inhibitors (e.g., raw soy flour), and gastrointestinal surgical procedures have all been shown to induce pancreatic acinar cell cancer in rats. The gastrointestinal hormone cholecystokinin (CCK) is a trophic factor for the normal rat pancreas leading to increased acinar cell proliferation and may enhance pancreatic tumors in rats. There is evidence that stimulation of endogenous CCK levels by different xenobiotics will lead to rat pancreatic hypertrophy and hyperplasia [28, 29, 30, 31]. CCK does not act as a trophic factor in mice, hamsters, dogs, nonhuman primates, or humans; CCK does not increase acinar cell proliferations in these species and does not appear to be involved in carcinoma induction in these species.
These two examples show the extent of evaluation needed to establish a MoA approach and judge the relevance of toxicological findings from animals in humans.
The adverse outcome pathway (AOP) approach is characterized by identifying a sequence of events based on chemical–biological interactions at the molecular level [5]. It describes how in vivo chemical perturbations trigger subcellular, cellular, tissue, organ, organism, and ecotoxicological population effects from exposure to the toxicant. Because several of the event endpoints or perturbations are inferred or predicted rather than measured, robust predictive algorithms are needed that can substitute for experimental data. Endpoints in these toxicological systems are frequently referred to as biomarkers. Biomarkers are characteristics that are objectively measured as indicators of normal biologic processes, pathogenic processes, or, in the case of therapeutics, pharmacologic responses to therapeutic intervention. Biomarkers are frequently used to validate an in vitro system and establish that it characterizes toxicity in a way similar to in vivo studies. Biomarkers are an integral part of targeted therapeutic approaches and personalized medicine because they serve as an objective indicator that a molecule is reaching its intended target and eliciting a desired effect. In environmental research and risk assessment, biomarkers are frequently referred to as indicators of human or environmental hazards [32]. The key to discovering or predicting biomarkers through computational means involves the prediction or identification of the molecular targets of toxicants and the association of these targets with perturbed biological pathways.
The threshold of toxicological concern (TTC) describes a level of toxicant exposure that represents negligible risk to human health or the environment. In some situations, this is also referred to as a de minimis level. Data-poor chemicals are often classified using the TTC as a surrogate for definitive toxicity data [6]. Decision trees are often constructed using the original classifications of Cramer and co-workers [33], which assign chemicals to three structural classes of presumptively low, intermediate, or significant oral toxicity, with an exposure threshold associated with each class.
Over time, the TTC principles have been vetted against a number of diverse datasets, and differing opinions have been raised about the robustness of the approach.
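The sketch below illustrates how a TTC comparison might be encoded. The exposure thresholds shown are the widely cited Munro-type defaults for the Cramer classes and a genotoxicity-alert level; they are included here for illustration only and should not be taken as authoritative regulatory values.

```python
# Widely cited TTC defaults (micrograms/person/day), shown for illustration;
# authoritative values and exclusions should be taken from the primary literature.
TTC_UG_PER_DAY = {
    "cramer_I": 1800,       # simple structures, presumption of low oral toxicity
    "cramer_II": 540,       # intermediate structures
    "cramer_III": 90,       # structures suggesting possible significant toxicity
    "genotoxic_alert": 0.15,
}

def exceeds_ttc(estimated_intake_ug_per_day, chemical_class):
    """Flag a data-poor chemical whose estimated intake exceeds the TTC
    associated with its structural class."""
    return estimated_intake_ug_per_day > TTC_UG_PER_DAY[chemical_class]

print(exceeds_ttc(120, "cramer_III"))  # -> True: warrants further evaluation
```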
Structural alerts, or chemical motifs known to be associated with toxicity through either the parent compound or reactive metabolites, have also been used to predict potential toxicity from chemical structures. These expert algorithms appear in commercial software programs and in online open access sites. For instance, seven chemical reaction domains are used to define and predict the covalent interaction between a chemical and a macromolecule (biological target) that leads to the initiating event at the beginning of an AOP [34]. These include Michael addition, acylation, Schiff base formation, aromatic nucleophilic substitution, unimolecular nucleophilic aliphatic substitution (SN1), bimolecular nucleophilic aliphatic substitution (SN2), and reactions involving free radicals. The structural alerts from ToxTree define 57 unique structural features that are categorized into various toxicological domains [34]. These are relevant from a chemical structure standpoint, but it must be remembered that they lack biological context. The alerts have been particularly useful in predicting in vitro genotoxic potential, where results can be validated with higher throughput assays. The website for OpenTox, www.opentox.org, is a source for both the TTC and structural alerts and also for characterizing compounds through a “read across” approach, in which groups of chemicals are categorized based on physicochemical and toxicological properties. Using the assumption that “like structure” correlates with “like effects,” these approaches are used to fill data gaps on data-poor compounds [35].
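As a rough illustration of alert matching (not the ToxTree rule set itself), the sketch below uses RDKit with two deliberately simplified SMARTS patterns; real alert libraries are far more nuanced, and the patterns here are assumptions chosen only to show the mechanics.

```python
from rdkit import Chem

# Simplified, illustrative alert patterns (not the ToxTree definitions):
# an alpha,beta-unsaturated carbonyl (potential Michael acceptor) and an epoxide
ALERTS = {
    "michael_acceptor": Chem.MolFromSmarts("C=CC=O"),
    "epoxide": Chem.MolFromSmarts("C1OC1"),
}

def structural_alerts(smiles):
    """Return the names of alert substructures found in a molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return []
    return [name for name, patt in ALERTS.items() if mol.HasSubstructMatch(patt)]

print(structural_alerts("C=CC(=O)OCC"))  # ethyl acrylate -> ['michael_acceptor']
```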
Chemicals of concern are typically included in lists maintained by authoritative bodies internationally. An excellent open access source of these lists is the PLuM (Public Library of Materials) database maintained by the Berkeley Center for Green Chemistry (http://bcgc.berkeley.edu). The lists included in PLuM (last updated October 2011) range in size from a few dozen to more than 22,000 compounds.
Several databases are available for detailed toxicological information on chemicals. Judson [36] and Voutchkova et al. [37] have compiled database lists, along with web addresses, for several of the most useful resources. The most comprehensive source is from the National Library of Medicine and its Division of Specialized Information Services (SIS), which produces the Toxicology and Environmental Health Information Program (TEHIP). The TOXNET component at http://toxnet.nlm.nih.gov/ is a compilation of several databases on toxicology, hazardous chemicals, environmental health, and toxic releases, including the Hazardous Substances Data Bank (HSDB), TOXLINE, ChemIDplus, and GENETOX.
As mentioned previously, between 80% and 90% of all chemicals reaching the environment, present in the workplace, and/or contained in consumer products lack sufficient toxicological data to develop clear and unambiguous risk assessments for human health. Several new technologies now being used to address these issues include high-throughput screening (HTS), high-content screening (HCS), systems biology and pathway mapping, stem cell screening, and virtual tissues. HTS has a long history of cellular and target screening in the pharmaceutical industry, where millions of compounds have been tested against a multitude of targets with potential therapeutic indications and for off-target toxicity. For the most part, however, these screens have involved single concentrations to set molar threshold levels for target interaction. The goal has been to obtain potency information, rank order series of compounds, create analogs through chemical structure manipulations, and test these analogs for improved potency and specificity of target interactions. Compounds that pass certain screening criteria, or “filters,” progress through several series of tests, including animal assays and eventually human trials. In this way, the relevance of the initial screens can be assessed. For environmental chemicals, which lack relevant human data apart from epidemiological studies, HTS has evolved into a quantitative approach in which chemicals are typically tested at seven or more concentrations in systems that can process over 100,000 compounds per day. Presumed relevance comes from the use of human cells or cell lines and human proteins. The use of assay cells derived from stem cells continues to be researched in terms of applicability to in vitro screening because the cells are capable of self-renewal and exhibit pluripotency. The goal of high experimental throughput is to quickly identify bioactivity signatures that potentially relate to the induction or exacerbation of adverse biological effects [3]. The balance between experimental throughput and human relevance will continue to be debated and, it is hoped, resolved in the near future. Questions that remain to be addressed for toxic responses include the necessity for tissue or organismal context and the need for exposure and/or metabolic context in isolated cell preparations. While it is safe to say that “we aren’t there yet,” the major effort spanning environmental and food and drug agencies suggests that a finish line will eventually come into sight.
Another challenge arising from new technologies is the status of legacy data, both in government agencies and in industrial laboratories. Most current data exist in electronic form; however, a majority of older studies exist only on paper. In addition, some software applications used by industry and government agencies rely on arbitrary scoring systems that are specific to particular commercial products. As we move toward an integrated system utilizing all information and data, these issues must be addressed and resolved.
Computational toxicology involves the application of computer technology and mathematical/computational models to analyze, model, and/or predict potential toxicological effects from (1) chemical structure (parent compound or metabolites), (2) similar compounds, (3) exposure, bioaccumulation, and persistence, (4) differential indicators or patterns related to exposure, and (5) networks of biological pathways affected by the chemical [38]. In addition, it allows the further understanding of mechanisms of toxicity, whether they be organism specific, organ specific, and/or disease specific [3, 39, 40, 41]. Also, it is used to attempt to explain why certain individuals, ethnic groups, or populations are more susceptible to chemical exposures and to draw associations between chemical exposure and increased risk for certain diseases [42]. Key data components are relational databases that allow cross-referencing existing data to create informed predictions about data-poor compounds. Since predictions are only as good as the data models they are built on, a crucial step in this field is innovative approaches to generate and utilize newer data sources. Below we highlight some data sources and tools, primarily open access, which are widely used by toxicologists.
ACTOR (WWW.ACTOR.EPA.GOV). ACToR stands for Aggregated Computational Toxicology Resource and is a data warehouse maintained by the EPA. ACToR incorporates many specialized databases within the EPA including ToxRefDB (animal toxicity data), ToxCastDB (high-throughput screening data, discussed further below), ExpoCastDB (exposure data for chemical prioritization), and DSSTox (chemical structure data). Additionally, ACToR compiles data from over 1000 public sources. In total, ACToR provides toxicity data information for over 500,000 environmental chemicals, searchable by chemical name, identification numbers, or structure. The data warehouse contains a wealth of information including physicochemical properties and in vitro and in vivo toxicological data. ACToR is a comprehensive, easily searchable center for a broad range of chemical and toxicological queries.
TOXCAST (HTTP://WWW.EPA.GOV/NCCT/TOXCAST). ToxCast is a system maintained by the EPA to rapidly screen and prioritize chemicals based on potential risk to human health and the environment. ToxCast utilizes advances in high-throughput (HTP) technology to economically screen large numbers of chemicals. Launched in 2007, the program has already screened more than 300 chemicals. For comparison, the EPA estimates that it took 30 years and $2 billion to screen an equivalent number of compounds with traditional animal toxicity tests. These 300 compounds were part of Phase I, the “proof of concept” portion of the process, in which well-studied chemicals were analyzed with the HTP screening methods. Phase I was completed in 2009 and demonstrated the HTP screens’ ability to deliver results similar to the previously performed animal toxicity tests. Phase II is currently in process; in this phase the EPA is investigating 2000 diverse chemicals from various industries. Data produced thus far are available through the ToxCast database on the EPA website. In addition to being an economical and fast option, ToxCast helps reduce animal testing by using and validating alternative methods. ToxCast has helped illustrate the utility and accuracy of emerging alternative assessment tools.
VIRTUAL ORGANS (HTTP://WWW.EPA.GOV/NCCT/VIRTUAL_TISSUES). The Virtual Tissues Research Project undertaken by the EPA aims to create reliable in silico models (performed via computer simulations) to predict chemical responses and disease progression in important human tissues and organs. The tissue models are constructed from data gleaned in both in vitro and in vivo models. Animal in vivo assays typically provide the high-dose–response data for the model. Alone, this information is difficult to apply to low-dose exposure scenarios in humans. To construct the virtual organ, engineers combine this in vivo data with low dose, human cell-based in vitro assays to complete the dose–response curve. Together, a more complete and predictive in silico model is possible.
Due to its central role in toxicant metabolism, the liver is one of the first organs being constructed in the Virtual Tissue Research Project. Physiologically based pharmacokinetic modeling, cellular systems, and molecular networks are integrated to mimic the multitude of activities performed by the liver. Once completed, this innovative project will be an invaluable resource for accessible, accurate, and responsible prediction of liver toxicity.
Another virtual tissue model being developed is the v-embryo. This simulation investigates teratogenesis, or the production of birth defects, resulting from chemical exposures in a pregnant woman. The model is being constructed largely from zebrafish and stem cell research, two areas that are especially informative for developmental processes. The developing eye has been chosen as the prototype organ for the proof-of-concept phase of the virtual embryo. The eye is an excellent starting point because it has been well studied, is susceptible to both genetic and environmental interference, exhibits a range of phenotypes, and utilizes a multitude of signaling pathways. The next processes to be incorporated into the model will be vascular, limb, and embryonic stem cell development. The virtual embryo is an ambitious and important project that could contribute to improved human health from the moment of conception.
EPI SUITE (HTTP://WWW.EPA.GOV/OPPT/EXPOSURE/PUBS/EPISUITE.HTM). The Estimation Program Interface (EPI) Suite is a collection of predictive tools also maintained by the EPA. The suite comprises approximately 17 individual models that estimate various physicochemical properties, including the log octanol–water partition coefficient, aerobic and anaerobic biodegradability of organic chemicals, and the melting point, boiling point, and vapor pressure of organic chemicals. One model of special importance to toxicologists and green chemists is the Ecological Structure Activity Relationships (ECOSAR) program. This model predicts acute and chronic aquatic toxicity in fish, aquatic invertebrates, green algae, and some saltwater and terrestrial species. EPI Suite is useful in estimating the important physicochemical parameters that must be considered when designing a compound or product.
Structure–activity relationships (SARs) are models founded on the concept that the activity of a compound is a direct function of its chemical structure. These models may be based on qualitative (noncontinuous) or quantitative (continuous) data and together are referred to as QSARs. QSARs are constructed by grouping chemicals by common structural characteristics, or descriptors, such as a functional group, hydrocarbon chain length, or polarity. A collection of well-studied compounds within the group of interest serves as a training set. The training set compounds are plotted according to a structural characteristic and a given endpoint of interest; for example, hydrophilicity may be plotted against acute aquatic toxicity. The models range in complexity from two to many input parameters. Additionally, the model may be built with a simple linear regression or may incorporate complex algorithms, depending on the depth of information available, the complexity of the relationship investigated, and the purposes of the model. The model is evaluated for accuracy by removing each compound from the training set in turn and predicting it with a model built from the remaining compounds (leave-one-out cross-validation); a reliable model will accurately reproduce the value of each held-out compound. At this point, less researched compounds may be evaluated by the model for the endpoint of interest. This is an extremely efficient way to utilize existing data to fill data gaps without additional animal tests and at far less expense. The most reliable models are constructed from structurally related compounds and attempt to predict values for compounds that fit into the chemical space represented by the training set. When a newly evaluated compound falls outside the chemical space of the model, the predictions carry a high level of uncertainty. It is generally recognized that QSAR models should be structured to provide (1) a defined endpoint, (2) an unambiguous algorithm, (3) a defined domain of applicability, (4) appropriate measures of goodness-of-fit, robustness, and predictivity, and (5) a mechanistic interpretation, if possible. Ideally, the selection of the chemical descriptors used in a model takes into consideration the mechanism of action and the rate-limiting step of the biological process being modeled [43].
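A minimal sketch of this workflow, assuming hypothetical descriptor values and endpoint data, is shown below: an ordinary linear regression QSAR is evaluated by leave-one-out cross-validation and then used to predict a new compound assumed to lie within the training chemical space.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# Hypothetical training set: two descriptors (e.g., logP and molecular weight)
# against an aquatic toxicity endpoint such as log(1/LC50)
X = np.array([[1.2, 94.0], [2.1, 122.2], [2.7, 136.2], [3.3, 150.2],
              [3.9, 164.2], [4.5, 178.3], [1.8, 108.1], [2.9, 142.2]])
y = np.array([0.8, 1.5, 2.0, 2.6, 3.1, 3.7, 1.2, 2.2])

model = LinearRegression()

# Leave-one-out cross-validation: each compound is removed in turn and
# predicted from a model built on the remaining compounds
y_pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
q2 = 1 - np.sum((y - y_pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"Cross-validated Q2 ~ {q2:.2f}")

# Fit on the full training set to predict a new, structurally similar compound
model.fit(X, y)
print(model.predict(np.array([[2.4, 130.0]])))
```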
OECD QSAR TOOLBOX (HTTP://WWW.QSARTOOLBOX.ORG). The Organisation for Economic Co-operation and Development (OECD) was founded in 1960 and has 34 member countries including the United States. Their free QSAR Toolbox is especially helpful during the difficult task of grouping compounds for QSAR modeling. The toolbox will evaluate a compound for previously recognized structural motifs and indicate if it is currently part of any regulatory inventories or chemical categories. The toolbox will provide data previously generated for the compound of interest and similar compounds. Additionally, the toolbox can group compounds by common metabolites or mechanisms of action. The OECD QSAR Toolbox provides accessibility to existing QSAR data and assists in the building of new QSAR models by accurately categorizing compounds of interest.
OPENTOX (WWW.OPENTOX.ORG). OpenTox is designed to assist in the creation of predictive computational toxicology tools. The program supports the formation and validation of models for both publicly available and private datasets. OpenTox includes two subprograms—ToxPredict and ToxCreate. Both modules allow those with limited QSAR training to take advantage of the predictive capabilities of QSAR technology. ToxPredict uses public datasets to predict a compound’s toxicity from its structure. ToxCreate aids people in the generation of their own predictive tools, especially for private datasets.
GENEGO (WWW.GENEGO.COM). Genego is a commercial data mining and analytic tool that uses a systems biology approach. Genego consolidates information from a huge library of gene expression studies, SNPs, metabolic profiles, and high-content screening (HCS) assays into interactive maps and pathways. These networks can be filtered, combined, and organized according to the user’s preferences. Additionally, the program maintains several canonical pathways of key importance to human toxicity. Private data or other compounds and networks of interest may be inserted into these canonical pathways to measure their potential influence on that pathway. Genego is a tool to discover and illustrate the complex relationships between compounds and physiological molecules. It is an excellent way to place a compound in the context of greater biological systems.
STITCH (WWW.STITCH.EMBL.DE). STITCH is a free online tool that takes information from experiments, literature, and other databases to map connections between chemicals and proteins. The user may choose how many intermediary compounds are allowable between the input components. The networks indicate by color what type of relationship has been identified between two compounds (e.g., inhibitory, cofactor, activation). Conveniently, these networks may be exported. Each relationship and compound may be clicked on to reveal the source of information. In this way, the user may have access to and evaluate the strength of the supporting data. STITCH is a simple way to find and illustrate chemical and physiological interactions.
THE COMPARATIVE TOXICOGENOMIC DATABASE (WWW.CTDBASE.ORG). The Comparative Toxicogenomic Database (CTD) [44] is a public research tool primarily intended to help investigate the role of environmental chemicals on human health. The online database compiles research from a variety of reputable sources, such as the U.S. National Library of Medicine, so that users may perform broad or specific searches regarding the relationships of chemicals, genes, and diseases. One may search the database from a variety of starting points: chemical, disease, gene, gene ontology, organism, pathway, or reference. Initial input is returned with known interactions from all the other categories. For example, if “toluene” is typed into the chemical search box, the CTD returns information such as genes, diseases, and pathways associated with toluene in humans and other organisms. Additionally, specific information such as chemical codes, official names, basic chemical properties, and citations are provided. These lists are interactive, so any entry may be clicked to find further information. Each search also returns links to external citations, websites, and databases that may be relevant (e.g., GENETOX, household product database).
More experienced users can take advantage of the CTD’s sophisticated analysis and organization tools. Under the “analyze” tab a variety of additional searches are available. One such function is the “batch search” that investigates several input points simultaneously. The search delivers a compact list of basic identifiers and known interactions within a variety of organisms. Other special analysis tools include the Venn diagram based inquiries. This function takes a few entries from the user then maps which genes, chemicals, or diseases are common between all or some of the different queries. The CTD is extremely useful for many problems related to environmental toxicology. For example, if one is concerned about a certain chemical being emitted into the environment, the CTD is an appropriate place to begin research. Upon looking up the chemical, genes that may be affected by the compound and diseases already known to be associated with the chemical will be returned. If the mechanism of action in the human body is poorly understood, the genes, pathways, related chemicals, and references to previous research may help piece together the mechanism. In the case of laboratory work, the CTD can provide invaluable preliminary information regarding which genes and processes to specially monitor in assessing the chemical’s effects. Using the analysis tools, one may also perform these searches in tandem with another chemical of concern or gene of interest to yield more specific data. The Comparative Toxicogenomic Database is an excellent resource to begin, support, or expand research on the molecular relationships between chemicals and human health.
There are both opportunities and challenges in using computational methods to fill data gaps for risk assessment decisions on data-poor chemicals. Opportunities are detailed elsewhere in this chapter, but the major challenge will be the transparency of data and sources used. All computational models carry uncertainty factors, some of which result from the compounds used to construct the model and the structural similarities of the new compounds under evaluation. Detailed analyses of compounds will involve the movement of results of one model into another to finally reach a relevant decision point. This will test the ability of different information technology systems and software applications to communicate, exchange data accurately, and use the information that has been exchanged. Computationally generated data must eventually exist in a metadata format and contain tags to identify (1) the models used (and when), (2) uncertainty factors and how these change when the data travel from one model to the next, and (3) the validity of previously generated data in the context of new information on the chemical from emerging technologies.
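One minimal way to carry such provenance alongside a computed value is a structured metadata record. The field names and values below are purely illustrative assumptions about what such a tag set might contain, not an established schema.

```python
import json
from datetime import date

# Sketch of a provenance record accompanying a computed prediction, so that
# downstream models and reviewers can trace how the value was generated
prediction_record = {
    "chemical_id": "CAS 000-00-0",            # placeholder identifier
    "predicted_endpoint": "oral rat LD50 (mg/kg)",
    "predicted_value": 320.0,
    "model": {"name": "example_QSAR_v1", "run_date": str(date.today())},
    "applicability_domain": "inside",          # was the query within the training chemical space?
    "uncertainty_factor": 10,                  # propagated when the result feeds another model
    "data_sources": ["hypothetical training set DOI or database accession"],
}
print(json.dumps(prediction_record, indent=2))
```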
Two recent events highlight these challenges in expert meetings and scientific workshops. The Council of Canadian Academies convened an Expert Panel on the Integrated Testing of Pesticides and in 2012 published a document entitled Integrating Emerging Technologies into Chemical Safety Assessment [17]. This expansive document discusses many of the topics highlighted in this chapter and will be a major source of information in this field for years to come. The Society of Toxicology convened a Contemporary Concepts in Toxicology (CCT) meeting, “Building for Better Decisions: Multiscale Integration of Human Health and Environmental Data,” in May 2012 at the U.S. EPA campus in Research Triangle Park, North Carolina. Topics at this meeting included the interoperability of models used for decision making on chemicals in the environment. Opinions and recommendations from the meeting will be published in future issue(s) of Environmental Health Perspectives.
Large-scale integration presents both software and data challenges, requiring solutions from software engineering as well as from toxicology data generation.
REACH is the European Commission’s regulatory system for chemicals produced and used in industry. REACH stands for Registration, Evaluation, Authorization, and Restriction of Chemicals; the regulation entered into force on June 1, 2007. This relatively new system aims to increase the collective knowledge of industrially important chemicals, protect human health, and promote environmental sustainability. REACH is committed to achieving these goals while conserving resources and respecting animal life to the highest possible degree. This is achievable through collaborative research and reliable alternative testing methods [45].
REACH is currently being implemented in phases. Part of the REACH strategy is to shift a substantial amount of the responsibility for chemical safety from public authorities to manufacturers. Registration deadlines are determined by the amount of a compound a company manufactures per year: for example, the deadline for substances produced at 100 tons or more per year is June 2013, while the deadline for those produced at 1 ton or more per year is June 2018.
Safety information is also required of manufacturers according to the amount produced per year. High-volume producers must submit a Chemical Safety Report (CSR) that includes key information such as the intended uses of the compound, exposure and risk assessments based on those uses, and risk management strategies to lessen the risks. This information is used to create accurate safety data sheets (SDSs) and labels. Downstream users must alert manufacturers of their intended uses so that those uses may be included in the initial assessments, or they must perform their own assessments. Additionally, downstream users and smaller manufacturers must incorporate the safety information and risk reduction strategies identified by upstream manufacturers.
Compounds classified as mutagens, carcinogens, or reproductive toxicants of high concern must gain special authorization from the Commission before commercial use. Authorization is also required for compounds shown to be persistent in the environment, bioaccumulative, and toxic (PBT compounds). Authorization is granted if containment protocols are demonstrated that significantly lessen the risk associated with these compounds of concern or if external circumstances make replacement of the compound unfeasible.
The Commission uses compound registration documents to distill and distribute safety information as well as identify gaps in knowledge. Additionally, this is an effective method to reduce animal testing, one of REACH’s central goals. REACH mandates that data generated from vertebrate studies be shared among researchers, in part through the registration process. New vertebrate studies must be specially approved to ensure that results will be nonrepetitive and of high scientific quality and relevance. Researchers investigating similar compounds gather in a “Substance Information Exchange Forum” to share vertebrate data and define future needed studies. These strategies help preserve the natural world, protect animal life, unite scientists, consolidate data, and reduce research costs [46].
A dedication to alternative testing methods makes the REACH regulation unique. Assuming proper validation, REACH accepts QSAR and read-across data in place of traditional in vivo assays. New animal studies are permissible only when the desired information is necessary and cannot be reliably obtained with alternative methods. Additionally, the European Commission operates the European Centre for the Validation of Alternative Methods (ECVAM), which investigates novel testing strategies and promotes sound alternative methods. The assays approved by ECVAM encompass a range of endpoints, from genotoxicity to skin sensitization to acute aquatic toxicity. REACH and ECVAM are leaders in the global shift toward responsible, alternative scientific methods.
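To make the read-across idea concrete, the following Python sketch uses RDKit Morgan fingerprints and Tanimoto similarity to estimate an endpoint value for a target chemical as a similarity-weighted average of measured values from close structural analogs. The analog data, similarity cutoff, and weighting scheme are illustrative only and do not represent a validated, REACH-accepted procedure.

```python
# A minimal read-across sketch: the analog values, similarity cutoff, and
# weighting scheme are illustrative, not a validated regulatory method.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem


def fingerprint(smiles: str):
    """Morgan fingerprint (radius 2) as a bit vector."""
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)


def read_across(target_smiles: str, analogs: dict[str, float],
                min_similarity: float = 0.5) -> float | None:
    """Similarity-weighted average of measured values from sufficiently similar analogs."""
    target_fp = fingerprint(target_smiles)
    weights, weighted_values = [], []
    for smiles, measured_value in analogs.items():
        sim = DataStructs.TanimotoSimilarity(target_fp, fingerprint(smiles))
        if sim >= min_similarity:
            weights.append(sim)
            weighted_values.append(sim * measured_value)
    if not weights:
        return None  # no analog close enough; read-across is not justified
    return sum(weighted_values) / sum(weights)


# Hypothetical example: estimate an endpoint for ethylbenzene from
# toluene and o-xylene data (illustrative values only).
analog_data = {"Cc1ccccc1": 0.63, "Cc1ccccc1C": 0.87}
print(read_across("CCc1ccccc1", analog_data))
```

The essential point, regardless of implementation details, is that a read-across estimate is only as defensible as the structural similarity and data quality of the analogs behind it, which is why REACH requires proper documentation and validation of such approaches.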
In 2008, the State of California signed into law two green chemistry bills (AB 1879 and SB 509) establishing a broad policy to be coordinated through the Department of Toxic Substances Control (DTSC). This groundbreaking legislation uses toxicological information on chemicals to improve the safety of consumer products. In AB 1879, the DTSC was authorized to establish a process for product life cycle (cradle-to-grave) evaluation, identify and prioritize chemicals of concern (CoC), evaluate the presence of CoC in consumer products sold in California, and assess alternatives to CoC in products, with the aim of creating safer alternatives to “products of concern” in California. The bill also established a Green Ribbon Science Panel (DEJ is a member) to advise the DTSC. Bill SB 509 required the DTSC to establish and maintain a Toxics Information Clearinghouse for detailed information on specific chemical hazard traits and environmental and toxicological endpoints.
The regulations, referred to as “Safer Consumer Products,” apply to all consumer products that contain a CoC and are sold, offered for sale, supplied, distributed, or manufactured in California. There are several exemptions, such as dangerous prescription drugs and devices and their associated packaging, dental restorative materials, medical devices, food, pesticides, and products used solely to manufacture a product exempted by law. Also exempted are products to be used solely out-of-state and those that are regulated by other regulatory bodies with the same ultimate purpose of safeguarding public health.
There are four steps in the regulatory process: (1) identification of candidate chemicals of concern, drawing on lists already developed by authoritative bodies; (2) identification and prioritization of the consumer products that contain those chemicals (priority products); (3) an alternatives analysis performed by the responsible entity to determine whether a safer alternative is feasible; and (4) a regulatory response by the DTSC, which may range from no action to restrictions on sale or requirements for product redesign.
These Safer Consumer Product regulations are far-reaching and represent one of the first (if not the first) legal standards requiring alternatives assessments of the scope envisioned. They also demand relevant, validated, and curated toxicological information on CoC that can be viewed and accepted by all parties. With this in mind, the selection of CoC draws on several lists of compounds prepared by authoritative bodies internationally and does not require the DTSC to create new information. All of these processes will take place in the public forum through the DTSC website. The more difficult process will be the alternatives analyses and the substitution of CoC with less toxic compounds. This will require the ability to assess compounds with less data than the original CoC and to create a process of data gap identification and intuitive data generation so that the alternative can be judged to be safe, or at least safer than the CoC it replaces.
Predicting human toxicity is challenging, as can be seen in therapeutic research, where one of the major stumbling blocks to successful drug development is the less than optimal prediction of human toxicity. With new medications, the factors involved in predicting adverse effects are complex and include the chemical (drug) itself; the individual, with or without unique susceptibilities and concomitant medications; errors, particularly in compliance; and the combined effects of multiple medications and dietary supplements. All of these factors vary with the underlying conditions or diseases of the affected individual. Relying strictly on the chemical structure, including reactive chemical motifs and the generation of reactive metabolites, has not provided all the answers, particularly in controlled clinical trial situations [16]. This becomes even more critical with environmental chemicals because there are no controlled clinical trials and, in several cases, no relevant human data.
These regulations will require the use and implementation of newer technologies and predictive algorithms and an approach to integrate emerging technologies into chemical safety assessment [17].
The field of toxicology has always been linked directly to the field of chemistry, simply because toxicology describes the adverse interactions between chemicals and biological systems. Years of research on how specific structural features of chemicals influence biological responses under relevant exposure conditions (route and dose) have produced a body of knowledge that can now be accessed via the open access online sources discussed earlier. These so-called structural alerts have been developed through the examination of large data sets such as TOXNET and ACToR and of the millions of compounds screened in the biopharmaceutical industry. This has led to several proposals to use similar directed synthesis programs to reduce the potential toxicity of newly synthesized chemical compounds; Voutchkova et al. [37] present an excellent review of these proposals. As mentioned earlier, most of these structural modification processes are the result of years of practical application in the pharmaceutical industry. It is important to remember, however, that synthesis to modify the potential biological effects of a candidate therapeutic is always balanced by counterscreening for therapeutic potency, and new synthetic schemes must remain within a novel intellectual property chemical space. Frequently, this also involves the synthesis of specific stereoisomers, for which both potency and toxicity can differ significantly [47]. The most important aspect of green chemistry in the biopharmaceutical industry is the large-scale process chemistry used to produce the active pharmaceutical ingredient (API), with its impacts on worker safety and the environment.
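As a simple illustration of how structural alerts can be applied computationally, the following Python sketch uses RDKit SMARTS substructure matching to flag two well-known reactive motifs. The alert list is a tiny, illustrative subset; curated alert collections, such as those reviewed by Enoch and Cronin [34], are far more extensive.

```python
# Sketch of structural-alert screening with SMARTS substructure matching.
# The two alerts below are a tiny illustrative subset of much larger,
# curated alert sets.
from rdkit import Chem

ALERTS = {
    "aromatic nitro": "[a][N+](=O)[O-]",
    "epoxide": "C1OC1",
}


def structural_alerts(smiles: str) -> list[str]:
    """Return the names of alert substructures found in the molecule."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"Could not parse SMILES: {smiles}")
    return [name for name, smarts in ALERTS.items()
            if mol.HasSubstructMatch(Chem.MolFromSmarts(smarts))]


# Nitrobenzene triggers the aromatic nitro alert.
print(structural_alerts("O=[N+]([O-])c1ccccc1"))
```

A flagged alert is a prompt for further evaluation rather than a verdict of toxicity; as the text notes, structure alone has not provided all the answers.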
With the rapid changes occurring in the field of toxicology, the question remains: “How much toxicology does a chemist have to know?” The majority of toxicologists practicing in the field today, whether in academia, industry, or government, do not have a complete, up-to-date understanding of all the new technologies and approaches that are turning the field into an information science. The toxicology field will soon be divided into data generation and “tox-bioinformatics.” There is a need for a specific green chemistry bioinformatics section within the field of toxicology where chemists and other nontoxicologists can access relevant information and models to solve specific green chemistry problems. This is different from simply incorporating toxicology course work into chemistry curricula, after which the chemist must still access a series of databases just to get relevant information. As mentioned earlier, the interoperability of models and data becomes an issue at the individual level.
A green chemistry tox-bioinformatics system does not currently exist, but building one should be a priority for the application of toxicology to green chemistry in the future.
1. Anastas, P. T.; Kirchhoff, M. M. Origins, current status, and future challenges of green chemistry, Acc. Chem. Res., 2002, 35, 686–694.
2. Richard, A. M.; Yang, C.; Judson, R. S. Toxicity data informatics: supporting a new paradigm for toxicity prediction, Toxicology Mechanisms and Methods, 2008, 18, 103–118.
3. Kavlock, R. J.; Dix, D. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard, and risk, J. Toxicol. Environ. Health Part B: Crit. Rev., 2010, 13, 197–217.
4. Dellarco, V. L.; Baetcke, K. A risk assessment perspective: application of mode of action and human relevance frameworks to the analysis of rodent tumor data, Toxicol. Sci. 2005, 86, 1–3.
5. Schultz, T. W. Adverse outcome pathways: a way of linking chemical structure to in vivo toxicological hazards. In: In Silico Toxicology: Principles and Applications, Cronin, M.; Madden, J. (Eds.), The Royal Society of Chemistry, London, 2010, pp. 351–376.
6. Munro, I. C.; Renwick, A. G.; Danielewska-Nikiel, B. The threshold of toxicological concern (TTC) in risk assessment, Toxicol. Lett., 2008, 180, 151–156.
7. Klaassen, C. D. (Ed.). Casarett & Doull’s Toxicology: The Basic Science of Poisons, McGraw-Hill, New York, 2008.
8. Committee on the Institutional Means for Assessment of Risks to Public Health, National Research Council, Risk Assessment in the Federal Government: Managing the Process, The National Academies Press, Washington, DC, 1983.
9. Johnson, D. E.; Wolfgang, G. H. I.; Giedlin, M. A.; Braeckman, R. Toxicokinetics and toxicodynamics. In: Comprehensive Toxicology, Williams, P. D.; Hottendorf, G. H., (Eds.). Elsevier Science, Ltd., Amsterdam 1997, Vol. 2, pp. 169–181.
10. Johnson, D. E.; Wolfgang, G. H. I. Predicting human safety: screening and computational approaches, Drug Discovery Today, 2000, 5, 445–454.
11. Green, N.; Naven, R. Early toxicity screening strategies. Cur. Opin. Drug Discov. Dev., 2009, 12, 90–97.
12. Johnson, D. E.; Wolfgang, G. H. I. Assessing the potential toxicity of new pharmaceuticals, Curr. Top. Med. Chem., 2001, 1, 233–245.
13. Johnson, D. E.; Blower, P. E. Jr.; Myatt, G. J.; Wolfgang, G. H. I. Chem-Tox informatics: data mining using a medicinal chemistry building block approach, Curr. Opin. Drug Discov. Dev., 2001, 4, 92–101.
14. Johnson, D. E.; Sudarsanam, S. Molecular challenges in frontloading toxicity testing of anti-cancer drugs in drug discovery. In: Encyclopedia of Drug Metabolism and Interactions, Lyubimov, A. (Ed.), John Wiley & Sons, Hoboken, NJ, 2012, pp. 1–20.
15. MacDonald, J. S.; Robertson, R. T. Toxicity testing in the 21st century: a view from the pharmaceutical industry, Toxicol. Sci., 2009, 110, 40–46.
16. Johnson, D. E. Predicting drug safety: next generation solutions, J. Drug Metab. Toxicol., 2012, 3, 1–4.
17. Council of Canadian Academies, Expert Panel on the Integrated Testing of Pesticides. Integrating Emerging Technologies into Chemical Safety Assessment, Council of Canadian Academies, Ottawa, 2012.
18. Klaunig, J. E.; Babich, M. A.; Baetcke, K. P.; Cook, J. C.; Corton, J. C.; David, R. M. PPARalpha agonist-induced rodent tumors: modes of action and human relevance, Crit. Rev. Toxicol., 2003, 33, 666–780.
19. Meek, M. E.; Butcher, J. R.; Cohen, S. M. A framework for human relevance analysis of information on carcinogenic modes of action, Crit. Rev. Toxicol., 2003, 33, 591–653.
20. Seed, J.; Carney, E. W.; Corley, R. A.; Crofton, K. M.; DeSesso, J. M.; Foster, P. M. Overview: using mode of action and life stage information to evaluate the human relevance of animal toxicity data, Crit. Rev. Toxicol., 2005, 35, 664–672.
21. Boobis, A. R.; Cohen, S. M.; Dellarco, V. L.; McGregor, D.; Meek, M. E.; Vickers, C. IPCS framework for analyzing the relevance of a cancer mode of action for humans, Crit. Rev. Toxicol., 2006, 36, 781–792.
22. Boobis, A. R.; Doe, J. E.; Heinrich-Hirsch, B.; Meek, M. E.; Munn, S. IPCS framework for analyzing the relevance of a noncancer mode of action for humans, Crit. Rev. Toxicol., 2008, 38, 87–96.
23. Ward, J. M. The two-year rodent carcinogenesis bioassay—will it survive? J. Toxicol. Pathol., 2007, 20, 13–19.
24. McClain, R. M. Mechanistic consideration for the relevance of animal data on thyroid neoplasia in human risk assessment, Mutation Res., 1995, 333, 131–142.
25. Capen, C. C.; Martin, S. L. The effects of xenobiotics on the structure and function of thyroid follicular and C-Cells, Toxicol. Pathol., 1989, 17, 266–293.
26. Alison, R. H. Neoplastic lesions of questionable significance to humans, Toxicol. Pathol., 1994, 22, 179–186.
27. Dellarco, V. L.; McGregor, D.; Berry, C. Thiazopyr and thyroid disruption: case study within the context of the 2006 IPCS human relevance framework for analysis of a cancer mode of action, Crit. Rev. Toxicol., 2006, 36.
28. Dethloff, L. Gabapentin-induced mitogenic activity in rat pancreatic acinar cells, Toxicol. Sci., 2000, 55, 52–59.
29. Longnecker, D. Experimental pancreatic cancer: role of species, sex, and diet, Bull. Cancer, 1990, 77, 27–37.
30. Watanapa, P.; Williamson, R. C. N. Experimental pancreatic hyperplasia and neoplasia: effects of dietary and surgical manipulation, Br. J. Cancer, 1993, 67, 877–884.
31. Rao, K. N.; Takahashi, S.; Shinozuka, H. Acinar cell carcinoma of the rat pancreas grown in cell culture and in nude mice, Cancer Res., 1980, 40, 592–597.
32. Larson, H.; Chan, E.; Sudarsanam, S.; Johnson, D. E. Biomarkers. In: Computational Toxicology, Vol. II, Reisfeld, B.; Mayeno, A. (Eds.), Humana Press, Springer Science, New York, 2012, Chapter 11, pp. 253–273. DOI: 10.1007/978-1-62703-059-5_11.
33. Cramer, G. M.; Ford, R. A.; Hall, R. L. Estimation of toxic hazard: a decision tree approach, Food and Cosmetics Toxicology, 1978, 16, 255–276.
34. Enoch, S. J.; Cronin, M. T. D. A review of the electrophilic reaction chemistry involved in covalent DNA binding. Crit. Rev. Toxicol., 2010, 40, 728–748.
35. Benigni, R.; Tcheremenskaia, O.; Jeliazkov, V.; Hardy, B.; Affentranger, R. Initial Ontologies for Toxicity Data: OpenTox D3.1 Report on Initial Ontologies for Toxicology Data; OpenTox, 2009.
36. Judson, R. S. Public databases supporting computational toxicology. J. Toxicol. Environ. Health Part B: Crit. Rev., 2010, 13, 218–231.
37. Voutchkova, A. M.; Osimitz, T. G.; Anastas, P. T. Toward a comprehensive design framework for reduced hazard, Chem. Rev., 2010, 110, 5845–5882.
38. Raunio, H. In silico toxicology—non-testing methods, Frontiers Pharmacol., 2011, 2, 33.
39. Johnson, D. E.; Rodgers, A. D.; Sudarsanam, S. Future of computational toxicology: broad application into human disease and therapeutics. In: Computational Toxicology: Risk Assessment for Pharmaceutical and Environmental Chemicals, Ekins, S. (Ed.), John Wiley & Sons, Hoboken, NJ, 2007, pp. 725–749.
40. Richard, A. M.; Yang, C.; Judson, R. S. Toxicity data informatics: supporting a new paradigm for toxicity prediction, Toxicology Mechanisms and Methods, 2008, 18, 103–118.
41. Mortensen, H. M.; Euling, S. Y. Integrating mechanistic and polymorphism data to characterize human genetic susceptibility for environmental chemical risk assessment in the 21st century, Toxicol. Appl. Pharmacol., 2011. DOI:10.1016/j.taap.2011.01.015.
42. Chiu, W. A.; Euling, S. Y.; Scott, C. S.; Subramaniam, R. P. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era, Toxicol. Appl. Pharmacol., 2010. DOI:10.1016/j.taap.2010.03.019.
43. Zvinavashe, E.; Murk, A. J.; Rietjens, I. M. Promises and pitfalls of quantitative structure–activity approaches for predicting metabolism and toxicity, Chem. Res. Toxicol., 2008, 21, 2229–2236.
44. Davis, A. P.; King, B. L.; Mockus, S.; Murphy, C. G.; Saraceni-Richards, C.; Rosenstein, M.; Wiegers, T.; Mattingly, C. J. The Comparative Toxicogenomics Database: update 2011, Nucleic Acids Res., 2011, 39, D1067–D1072.
45. European Commission. REACH in Brief, 2007.
46. REACH Implementation Project. REACH Proposal Process Description; June 2004.
47. Tucker, J. L. Green chemistry, a pharmaceutical perspective, Org. Proc. Res. Dev., 2006, 10, 315–319.