This chapter identifies structural and chemical indices that reflect changes in the susceptibility and response of individual trees and forests to various kinds of disturbances. Some of the indices verify the historical frequency of various types of disturbances; others indicate shifts in the availability of carbon, water, and nutrients that predispose ecosystems to disturbance. To test the reliability of these indices one can report how they change under experimental conditions and across biotic, physical, and chemical stress gradients. The chapter identifies a number of structural and chemical properties that have general or specific diagnostic value. Chemical and physical analyses of tree rings are particularly useful to assess historical variation in atmospheric deposition of heavy metals and tree responses to variation in atmospheric pollutants and CO2. The frequency of fire and the intensity of insect outbreaks may also be inferred from scars and historical patterns in growth efficiency. Plant community attributes provide some insight into the populations of browsing animals. Together, biochemical, isotopic, and mineral analyses of foliage, wood, and roots yield clues to the susceptibility or resistance of forests to specific agents of disturbance.
In previous chapters we identified properties of ecosystems that are sensitive to alterations in the availability of resources. Of these, leaf area index was the most general structural variable. During stand development, LAI of the initial complement of species generally increases rapidly and remains relatively stable for some time, particularly when shade-adapted species fill in gaps that develop in the overstory canopy (Chapter 5). We define a disturbance as any factor that brings about a significant reduction in the overstory leaf area index for a period of more than 1 year. This definition parallels one by Oliver and Larson (1996), who define a disturbance as an event that makes growing space available for surviving trees.
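This definition can be made operational for monitoring purposes. The short sketch below, written in Python, flags disturbance years in an annual record of overstory LAI when LAI falls well below its predisturbance value and remains depressed for more than 1 year; the 20% reduction threshold and the example series are hypothetical choices for illustration, not values taken from this chapter.

```python
def flag_disturbances(lai_by_year, reduction=0.20):
    """Flag years in which overstory LAI drops by more than `reduction`
    relative to the previous year's value and remains depressed the
    following year as well (a significant reduction lasting > 1 year).

    `lai_by_year` maps year -> LAI; the 20% threshold is an illustrative
    assumption, not a value from the chapter.
    """
    years = sorted(lai_by_year)
    disturbed = []
    for prev, curr, nxt in zip(years, years[1:], years[2:]):
        baseline = lai_by_year[prev]
        dropped_now = lai_by_year[curr] < (1.0 - reduction) * baseline
        still_low = lai_by_year[nxt] < (1.0 - reduction) * baseline
        if dropped_now and still_low:
            disturbed.append(curr)
    return disturbed

# Hypothetical series: a defoliation event in 1987 that persists into 1988
lai = {1985: 5.2, 1986: 5.1, 1987: 2.4, 1988: 2.9, 1989: 3.6}
print(flag_disturbances(lai))   # -> [1987]
```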
Often it is difficult to identify the underlying cause of a disturbance. For example, in the selective harvesting of trees, soil compaction and direct injury to residual stems may create conditions favorable for the spread of pathogens that otherwise would not occur. Likewise, attempts to protect forests against fire for long periods may result in insect outbreaks (Johnson and Denton, 1975; Fellin, 1980; McCune, 1983). For managers, the most useful analysis provides an indication when an ecosystem is beginning to perform “abnormally” but has not yet been “disturbed.” At such times the forest is “predisposed” to change in structure and composition (Waring, 1987). The actual agents of disturbance, be they windthrow, insects, disease, fire, or, if we choose, selective silvicultural practices, may all lead to recovery of ecosystem function, but with marked differences in potential losses of resources and options for the future.
Although many factors may interact to cause a disturbance, they may be broadly classified as either biotic or abiotic in origin. Making distinctions among an array of biotic and abiotic forces is important because the biota present in an ecosystem have adapted to types of disturbances that have previously occurred at predictable frequencies. By recognizing selective adaptations of the biota we are in a better position to predict changes in the future composition of forests and in the rates of ecosystem functions.
If disturbance is required to perpetuate a certain type of forest, we may wish to mimic the historical “natural” sequence, but the historical sequence may be unrepresentative or hard to duplicate with increased atmospheric deposition of chemicals, introduction of nonnative biota, and changes in management practices. Ecosystem and stand development (succession) models described in preceding chapters provide a means of projecting into the future, but their accuracy depends on the validity of a number of important assumptions and requires projections of climatic variation.
In this chapter we identify structural and chemical indices that reflect changes in the susceptibility and response of individual trees and forests to various kinds of disturbances. Some of the indices verify the historical frequency of various types of disturbances; others indicate shifts in the availability of carbon, water, and nutrients that predispose ecosystems to disturbance. To test the reliability of these indices we report how they change under experimental conditions and across biotic, physical, and chemical stress gradients.
The extent to which biodiversity provides a buffer against various types of disturbances is debatable (Perry, 1994). Species diversity within some tropical forests is amazingly high, with as many as 473 tree species recorded on a single hectare in Ecuador (Valencia et al., 1994). In addition, thousands of invertebrate species fill important niches, along with uncountable species of microorganisms. Some redundancy in functional groups is always desirable, but large numbers of species in a given guild do not necessarily make the ecosystem more buffered against disturbance. Slight variations in the resistance of species to a common stress, however, should permit more rapid recovery of primary production, as demonstrated in grassland experiments following drought (Tilman and Downing, 1994).
First-order ecosystem processes, such as photosynthesis, transpiration, and decomposition, are often relatively insensitive to forest species composition. In wet tropical forests of Costa Rica, the number of tree species was experimentally controlled following clearing of the original stand, and soil organic matter and nitrogen levels returned to their original status about as fast with 12 species of trees present as they did with 120 (Fig. 6.1). A diverse mix of species is of less consequence if a particular type of disturbance rarely occurs. For example, large areas of highly diverse temperate rain forest were converted to tussock grass on the South Island of New Zealand within a short time after the Maori people arrived and introduced extensive fire (Newnham, 1992; Evison, 1993). However, the abundance of tree species in many tropical forests may provide those ecosystems with a buffer against disturbance by herbivores and pathogens, which abound in warm, moist conditions. Spatial isolation of individual tropical tree species has been shown to improve the chances of survival against some pathogens (Gilbert et al., 1994).
Spatial isolation is, however, a disadvantage when it comes to reproduction. Most tropical tree species make heavy investments in flowers and fruits to attract butterflies, bats, birds, primates, and other mammals which pollinate, disseminate, ingest, and fertilize the propagules (Chapter 5). In addition to investment in attracting pollinators and seed disseminators, perennial plants must expend additional energy to produce defensive chemicals to deter attacks by herbivores and pathogens. Nicotine, caffeine, cocaine, and tannins are all natural products that help plants defend themselves.
Defensive chemicals present in plants are broadly classified into nitrogen- and non-nitrogen-containing compounds. Compounds that contain N include cyanogenic glucosides, alkaloids, and nonprotein amino acids. Defensive compounds without N include tannins, terpenes, phytoalexins, steroids, and phenolic acids. Each kind of compound may serve in a variety of ways against various organisms (Table 6.1).
TABLE 6.1
Major Groups of Secondary Plant Metabolites Known to Contain Products Important for Defensea
Class | Number known | Contains N | Protection against |
Alkaloids | 1000 | Yes | Mammals |
Amino acids | 250 | Yes | Insects |
Lignans | 50 | No | Insects |
Lipids | 100 | No | Fungi |
Phenolic acids | 100 | No | Plants |
Phytoalexins | 100 | No | Fungi |
Quinones | 200 | No | Plants |
Terpenes | 1100 | No | Insects |
Steroids | 600 | No | Insects |
aFrom Swain (1977). With permission, from the Annual Review of Plant Physiology, Volume 28, © 1977, by Annual Reviews Inc.
Some plants produce fungistatic and bacteriostatic compounds that prevent colonization by pathogens. Other compounds act as physical barriers, such as waxes on the leaf surface or resins or lignin in cell walls. Increasing fiber and lowering water content decrease the digestibility of plant tissue and reduce herbivore growth rates and survival (Scriber and Slansky, 1981). Tannins precipitate protein, which inhibits most enzyme reactions and makes protein present in plant tissue nutritionally unavailable to most animals and microbes (Zucker, 1983). Phytoalexins are lipid-soluble compounds with antibiotic properties that are activated following attack by pathogens (Harborne, 1982). The alkaloids found in many angiosperms are particularly toxic to a variety of mammals (Swain, 1977).
Changes in host biochemistry may also affect the colonization of organisms helpful to the host plant. These include protective ant colonies, macroorganisms that graze on bacteria, and symbiotic associations of N-fixing bacteria and mycorrhizal root fungi. These beneficial organisms may directly infect or prey on attacking organisms, release antibiotics, or provide essential nutrients. Many of the compounds released as exudates, which include a variety of polysaccharides, organic acids, and amino acids, are essential to beneficial associates, but when these beneficial organisms are absent the exudates can also be assimilated by herbivores and pathogens.
In general, defensive compounds that lack N reach rather high concentrations in cells, often 10–15% by weight, whereas N-containing compounds are usually at concentrations below 1%. Plants expend less total energy in the synthesis of small amounts of N-containing defensive compounds than when producing large amounts of C-rich compounds, but variations in the turnover rates of defensive compounds and differences in relative growth rates must be considered in assessing relative costs (Bryant et al., 1991). Nitrogen-containing compounds are most frequently found in deciduous, fast-growing vegetation, whereas defensive compounds without N are more characteristic of slow-growing plants, particularly evergreens with long leaf life spans (Bryant et al., 1986; Coley, 1988). Regardless of the defensive compound synthesized, no plant is completely immune to attack. Specialized insects and pathogens have evolved that not only detoxify toxic compounds, but actually require them for optimal growth (Bernays and Woodhead, 1982). These highly evolved specialists are restricted to a few host species, but they may attack vigorous as well as weak individuals (McLaughlin and Shriner, 1980). Other less specialized organisms accommodate a wider range of biochemical challenges and attack a wider variety of plants.
Plants that depend on defensive compounds rich in N are at a competitive disadvantage where N is in short supply. On the other hand, plants producing C-rich defensive compounds are at a disadvantage when growing in shade with an abundant supply of N. Those plants adapted to more fertile soils may be expected to build a variety of defensive compounds from N. Thus, alkaloids predominate in the foliage of trees in many lowland tropical forests where N is relatively abundant (McKey et al., 1978). Plants growing in areas where N is scarce generally produce only tannins and related C-based compounds. Foliage is so unpalatable in some tropical forests growing on sterile sands that primates survive mainly by eating fruits (Gartlan et al., 1980). A distribution of vegetation with N-based or C-based defensive compounds similar to that observed in tropical forests has been documented in boreal and temperate forests (Rhoades and Cates, 1976; Bryant et al., 1991). At the time of foliage elongation, when N is relatively available, even plants growing in nutrient-poor habitats may produce a few N-based defensive compounds (Dement and Mooney, 1974; Prudhomme, 1983).
Insects differ from other animals in the way they locate host plants. Birds and larger animals depend on sight to recognize flowers and fruits. Insects rely much more on odors of compounds volatilized or exuded by plants. Adult insects seeking to lay eggs on a suitable host may use their antennae to sense volatile compounds at levels as low as 10−12 g cm−3. By direct tasting, insects may discriminate nonvolatile compounds at concentrations of 1 mg per 1000 cm3 in tissue, which is far below toxic levels (Swain, 1977). To meet the challenge of insects, many plants are able to produce toxic compounds quickly and to construct barriers that consist of dead or resin- or gum-filled tissue almost immediately following attack (Schultz and Baldwin, 1982; Raffa and Berryman, 1983). In response to localized insect activity, foliage throughout an entire tree may become less palatable (Haukioja and Niemelä, 1979; Karban and Myers, 1989). Morphological responses such as stiffer thorns on Acacia trees may also be induced by grazing (Seif el Din and Obeid, 1971). Bark sloughing is a response to attack by the woolly aphid (Adelges piceae) that only reaches epidemic populations when balsam fir (Abies balsamea) replaces native forest species (Kloft, 1957). The implication is that the woolly aphid is at a low-level equilibrium with its native host as a result of long-term evolutionary adaptations but reaches epidemic populations on the relatively defenseless introduced balsam fir.
Induced responses to attack can only be effective if sufficient resources can be quickly mobilized. The rate at which stored carbohydrates or protein can be converted to mobile forms (sugars and amino acids) and transported to sites of attack may limit the capacity of trees to respond (McLaughlin and Shriner, 1980), although this may not affect canopy responses to partial defoliation if photosynthetic rates are high. Changes in the allocation of current photosynthate to remote organs, such as the lower bole or roots, however, cannot be accomplished rapidly because of the distance involved and limitations imposed by phloem transport (Chapter 3). For this reason, concentrations of stored reserves in roots, stems, and twigs are a good indicator of a tree’s potential to survive localized attack by insects or pathogens (Ostrofsky and Shigo, 1984). Wargo et al. (1972) demonstrated that defoliation of sugar maple (Acer saccharum) greatly reduced starch content in roots at the end of the growing season. Low starch reserves in the roots predisposed trees to attack by root pathogens (Wargo, 1972).
Tropical forests are rarely heavily defoliated because the insects and pathogens are highly specialized and their host trees are few and widely separated. In temperate and boreal forests, on the other hand, only a few species of trees may be present. Even with genetic diversity within a population, major outbreaks of insects are common in higher latitude forests. Population outbreaks of insects can completely defoliate a large fraction of trees in a forest within a single year, but, depending on the physiological status of the trees, mortality may be low, as shown in two photographs taken during and 5 years after an outbreak of tussock moth in northeastern Oregon (Fig. 6.2). It is critical to understand whether a forest is resistant or susceptible to defoliating insects or other biotic agents before making management decisions. In the following sections we draw on ecosystem-level experiments to test the reliability of various stress indices and follow changes in susceptibility and response to changing environmental conditions created by biotic disturbance.
Most ecological theories are based on observations following change over time without recourse to experimentation. Because many properties of ecosystems change concurrently following a disturbance, theories are often difficult to reject or to confirm. Theories involved with forest insect outbreaks are particularly difficult to evaluate because the physiological status of host and herbivore both change rapidly following a disturbance. To understand the interactions more fully, the canopy structure and environmental resources available to trees under attack may be experimentally altered at the start of an outbreak and followed through its duration. Four examples bring out different aspects of interactions between the physiological state of the vegetation, the availability of resources, and the population dynamics of (1) defoliating insects, (2) bark beetles, (3) pathogens, and (4) browsing mammals.
In the boreal forests of Canada, outbreaks of eastern spruce budworm (Choristoneura fumiferana) occur at regular intervals in the extensive nearly pure forests of Abies balsamea. Using a correlation established between tree age and the conversion of sapwood to heartwood (Fig. 5.11), Coyea and Margolis (1994) determined the dry matter production of stem wood per unit of leaf area per annum (growth efficiency) of trees which survived or succumbed to an extensive stand defoliation (Fig. 6.3). They found that the insect outbreak occurred as the stand reached its lowest mean growth efficiency, and that mortality was concentrated on trees with growth efficiencies significantly below average. Following the death of many of the less resistant trees, the remaining trees reattained their previous levels of wood production per unit of leaf area. These observations suggest that native defoliating insects play an important role in stand development by reducing competition, which allows surviving trees to maintain moderate growth efficiencies and the ecosystem to produce near maximum NPP, a result similar to a silviculturalist prescribing a precommercial thinning.
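Because growth efficiency recurs throughout this chapter, a minimal sketch of its calculation may be helpful. Here leaf area is estimated from sapwood cross-sectional area, in the spirit of the sapwood-to-leaf-area relationship used by Coyea and Margolis (1994); the specific leaf area:sapwood area ratio and the example tree are hypothetical.

```python
def growth_efficiency(stemwood_increment_g, leaf_area_m2):
    """Growth efficiency: grams of stemwood produced per m2 of leaf area per year."""
    return stemwood_increment_g / leaf_area_m2

def leaf_area_from_sapwood(sapwood_area_cm2, la_sa_ratio_m2_per_cm2=0.15):
    """Estimate projected leaf area from sapwood cross-sectional area.
    The ratio of 0.15 m2 leaf area per cm2 of sapwood is an illustrative
    assumption; appropriate values are species- and site-specific."""
    return sapwood_area_cm2 * la_sa_ratio_m2_per_cm2

# Hypothetical tree: 180 cm2 of sapwood, 2.1 kg of stemwood added this year
leaf_area = leaf_area_from_sapwood(180.0)     # about 27 m2 of foliage
ge = growth_efficiency(2100.0, leaf_area)     # about 78 g wood per m2 leaf area per year
print(round(ge, 1))
```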
A somewhat different series of events has led to extensive outbreaks of western spruce budworm (Choristoneura occidentalis) in forests with mixed populations of coniferous species. Throughout much of the western United States selective harvesting of ponderosa pine trees, combined with fire protection, has created conditions that favor the establishment of grand fir (Abies grandis) and Douglas-fir (Fig. 6.4). These new stands support the maximum leaf area index possible for the environment, which may be twice as high as that observed when frequent ground fires occurred. The additional leaf area requires more nutrients and intercepts more of the annual precipitation. As a result, trees growing over extensive areas have growth efficiencies that average < 30 g wood produced m−2 leaf area year−1 (Waring et al., 1992).
During an outbreak of western spruce budworm, a thinning and fertilization experiment with four replications was installed in a dense, 80-year-old grand fir forest under a few scattered overstory pine. The resulting defoliation was recorded, along with measures of tree growth, foliar chemistry, and the availability of soil nitrogen and water (Mason et al., 1992; Waring et al., 1992; Wickman et al., 1992). Fertilization with nitrogen increased amino acid levels in foliage by nearly threefold (Waring et al., 1992). As a result, budworm larvae needed to consume less foliage to develop to maturity. Increased availability of nitrogen, combined with better illumination, induced trees in the thinned plots to produce additional leaves, which enhanced both terminal and diameter growth, and resulted in more than a twofold increase in tree growth efficiency for the thinned and fertilized treatment (Fig. 6.5a,b). With increased foliage and decreased herbivory, the proportion of foliage consumed by insects during the peak of the epidemic in 1987 dropped from >98% in unfertilized treatments to about 75% in the fertilized treatments (Fig. 6.5c). The outbreak ended in 1989 when pathogens caused the budworm population to return to near minimum levels on all treatment areas (Mason et al., 1992; Wickman et al., 1992).
In the same experiment, N fertilization without thinning also eventually resulted, following insect defoliation, in a significant increase in growth efficiency over that of untreated stands. Summer drought did not exhaust all reserves of water in the rooting zone; predawn Ψ values recorded on twigs fell to only –0.7 MPa (Waring et al., 1992). Drought, however, did restrict most nitrogen uptake during the growing season to the spring and autumn months when the upper soil horizons were moist (Waring et al., 1992). On other sites where water and nutrients are more limiting, epidemic outbreaks of defoliating insects cause more extensive mortality, correlated with extremely low values of tree growth efficiency (Wickman et al., 1992).
When mortality is extensive, the LAI in these kinds of forests may not return to maximum values for half a century. During that time, water, nutrients, and light are more readily available to surviving trees. In the arid summer environments characteristic of most of the western United States, if dead trees are not harvested, they increase the probability of fire. If a seed source is available, a young pine forest, which is unpalatable to spruce budworm and most other defoliating insects, will develop.
In many cases, managers consider applying biocides or pesticides during an outbreak to reduce damage from insect defoliators. Unless great care is taken, these practices rarely halt outbreaks and can perpetuate them by slowing the natural buildup of pathogens in the insect population (Cadogan et al., 1995). Attempts to control outbreaks also forgo the benefits of nutrient recycling that result from insects concentrating elements in their frass at more than twice the concentrations found in fresh foliage (Rafes, 1971). By monitoring a simple structural index such as tree growth efficiency, a judgment can be made regarding the ability of the forest to withstand an epidemic attack of defoliating insects. Whether judicious application of fertilizer might be beneficial once an outbreak commences would depend on the availability of nitrogen (which has changed with increases in atmospheric deposition rates) and its balance with other nutrients (Chapter 4).
Different species of bark beetles attack a wide variety of conifers. Female beetles select susceptible trees based on the presence of terpenes that are generated by the conifers in increasing amounts as temperatures rise (Christiansen et al., 1987). Bark beetles deposit their eggs in galleries excavated in the phloem, cambium, and sapwood of trees. Successful brood production is contingent on the death of these tissues. Most species of bark beetles can only breed in trees that exhibit severe decline or are already dead, and so they merely promote decomposition and mineralization. A few species, however, are able to attack and kill living, sometimes quite healthy trees. Epidemic outbreaks by these “aggressive” species may greatly alter the state and function of forest ecosystems over large areas.
Aggressive species have developed three ways of conquering living trees: (1) by having the first attacking beetles produce chemical attractants to bring other beetles, (2) by tolerating resin secretions, and (3) by inoculating trees with a pathogenic fungus that kills by halting water transport through the sapwood (Christiansen et al., 1987). The degree to which trees can defend themselves successfully is based on the extent to which they can produce resins and mobilize carbohydrates to wall off areas in the phloem and sapwood where beetles have introduced fast-growing strains of blue-stain fungi. Stored reserves are generally insufficient by themselves to protect trees against mass beetle attacks (Christiansen and Ericsson, 1986). In cooler climates where only one beetle population develops in a year, attacks are synchronized with the expansion of new growth. Some variation exists, however, because budbreak is controlled by soil temperature, which constrains water uptake, more than by air temperature to which insect development is closely keyed (Beckwith and Burnell, 1982). Genetic variation also exists in both tree and insect populations.
In another well-replicated experiment, synthetic pheromones were released to attract mountain pine beetles (Dendroctonus ponderosae) to various thinning and fertilization treatments in a 120-year-old forest of lodgepole pine (Pinus contorta) (Waring and Pitman, 1985). Treatments included (1) N fertilizer, (2) N fertilizer combined with a reduction in canopy LAI of about 80%, (3) additions of sugar and sawdust to limit mineralization by microorganisms, and (4) untreated plots. At the start of the experiment, tree growth efficiency averaged less than 70 g wood produced per square meter of leaf area per year with a stand average LAI of 4.7. As beetles killed trees and foliage was shed a year later, more light, nutrients, and water became available to surviving trees.
Within 2 years of the application of fertilizer, surviving trees increased their growth efficiencies by more than 55% to values above 100 g wood m−2 leaf area year−1 (Table 6.2). Surviving trees in untreated stands also increased their growth efficiency by over 40% after 2 years. Only in the sugar and sawdust treatment did tree mortality fail to produce a significant improvement in residual tree growth efficiency. When 100 trees sustaining different levels of attack were compared, tree mortality, measured by the proportion of sapwood observed with blue-stain fungus, was accurately predicted by noting when the ratio of bark beetle attacks (per square meter of bark surface) to growth efficiency (grams of wood produced annually per square meter of foliage) exceeded 1.2 (Fig. 6.6). At values above 100 g wood m−2 leaf area year−1, no successful bark beetle attacks were recorded. The same relationship was demonstrated in other thinning experiments with ponderosa pine (Larsson et al., 1983) and with lodgepole pine grown at different densities (Mitchell et al., 1983). A comparable response has also been reported for the European spruce bark beetle (Ips typographus), cited by Christiansen et al. (1987).
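A minimal sketch of the decision rule implied by these results is given below: a tree is scored as effectively resistant when growth efficiency exceeds 100 g wood m−2 leaf area year−1, and as likely to be killed when the ratio of attacks per square meter of bark to growth efficiency exceeds 1.2. The example trees are hypothetical.

```python
def beetle_mortality_risk(attacks_per_m2_bark, growth_efficiency):
    """Classify a tree's risk of being killed by mountain pine beetle.

    Follows the thresholds described in the text: no successful attacks
    were recorded above a growth efficiency of 100 g wood per m2 leaf
    area per year, and mortality was predicted when attacks per m2 of
    bark divided by growth efficiency exceeded 1.2.
    """
    if growth_efficiency > 100.0:
        return "resistant"
    ratio = attacks_per_m2_bark / growth_efficiency
    return "likely killed" if ratio > 1.2 else "likely survives"

# Hypothetical trees: (attacks per m2 bark, growth efficiency)
for attacks, ge in [(60, 40), (60, 90), (30, 110)]:
    print(attacks, ge, beetle_mortality_risk(attacks, ge))
# 60/40 = 1.5 -> likely killed; 60/90 = 0.67 -> likely survives; GE > 100 -> resistant
```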
TABLE 6.2
Growth Efficiency (Wood Production per Unit Leaf Area) under Various Treatmentsa
aMeans (n = 12) connected by brackets are significantly different at p = 0.05. From Waring and Pitman (1983).
In areas where bark beetle outbreaks occur, thinning may improve the resistance of residual trees if sufficient time is allowed to raise their growth efficiency to a safe level, as demonstrated in a photograph taken after a bark beetle epidemic swept through an even-aged pine forest that had been partially thinned (Fig. 6.7). Thinning alone, however, may not be sufficient to prevent subsequent mortality if other abiotic or biotic factors constrain water, nutrient, and CO2 uptake. Annual growth efficiency may prove an inadequate index if conditions are highly variable from year to year or if beetle attacks extend throughout the growing season (Lorio, 1986).
Many pathogens and parasites are carried by insects, birds, and other vectors from one area to another. Transport of pathogens by humans into forests where native trees lack resistance has caused near extinction of some species such as American chestnut (Castanea dentata). Below ground, many native fungi are present with the capacity to break down cellulose and lignin in plant cell walls (Harvey et al., 1987). With large accumulations of woody debris following windstorms or logging, pathogens may spread from dead to living root systems. This fact has encouraged removal of stumps or their treatment with fungicides (Thies et al., 1994). In some cases, root pathogens become so well established that a rotation of resistant species is recommended (Thies, 1984).
In the absence of fire or wind, root pathogens often play an important role by opening canopies and fostering nutrient cycling. When pathogens kill trees in stands with near maximum LAI, surviving trees may quickly respond in growth to compensate fully for the reduction in stocking levels (Oren et al., 1985). At times, root pathogens do kill all dominant trees, which allows stand replacement to occur. For example, in some subalpine forests of mountain hemlock (Tsuga mertensiana), laminated root rot (Phellinus weirii) causes mortality in wavelike patterns followed by replacement with younger trees (McCauley and Cook, 1980). Field studies showed that mineralizable N in the undisturbed soil was extremely low but increased significantly in the recently disturbed zone following death of the overstory trees (Fig. 6.8). Growth efficiency of trees increased from less than 30 in the old-growth forest to nearly 80 g wood m−2 leaf area year−1 in young regenerating forest. As stand LAI peaked and mineral soil nitrogen returned to the low values common to older stands, growth efficiency was again reduced so that, about 50 m behind the current edge of dying old-growth forests, trees again became susceptible to infection from the extensive inoculum available in decaying roots. Laboratory studies on mountain hemlock seedlings collected from the site indicated increasing susceptibility to the root pathogen when nitrogen was limited or when shade restricted photosynthesis (Matson and Waring, 1984).
In regions where soils are unable to supply trees with a balance of nutrients, an increase in the concentration of amino acids occurs (Ericsson et al., 1993). The concentration of amino acids in the foliage of Pinus radiata has been shown to be an excellent index of a tree's susceptibility to the needle-cast fungus Dothistroma (Turner and Lambert, 1986). Entry et al. (1986) also demonstrated in a laboratory experiment that a balance between N and P was important in determining the resistance of Pinus monticola seedlings to the root pathogen Armillaria mellea. Additional laboratory studies with five species of western conifers confirmed that an adequate balance of both light and nutrients was important in reducing infection by Armillaria on seedlings. The basis for seedling resistance was further related to a critical ratio between the concentration of phenolic or lignin compounds in the root, which inhibit pathogen growth, and the concentration of sugars, which stimulate growth (Entry et al., 1991a).
Entry et al. (1991b) designed a field experiment to test the laboratory-derived biochemical indices. They compared 27 individual Douglas-fir trees with highly variable root biochemistry as a result of having been grown in stands that had been thinned, thinned and fertilized with nitrogen, or left untreated for 10 years. The ratios of phenolic and lignin compounds to sugars in tree roots (expressed as the energy required to degrade phenolics or lignin relative to the energy available from sugars) identified those trees that were most susceptible to infection (Fig. 6.9). Subsequently, other field studies have shown the importance of adding a balance of fertilizers, because nitrogen alone often decreases root resistance against pathogen attack (Mandzak and Moore, 1994). Entry et al. (1991b) recommended that managers commence thinning early in stand development when trees are small, so that any infected roots present will decay rapidly. Where forests have evolved with frequent disturbance initiated by fire or insects but have been protected against these agents, pathogens are likely to play an increasing role in disturbance.
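A sketch of how such a susceptibility index might be computed is given below. The energy coefficients and the example concentrations are placeholders only; Entry et al. (1991b) derived the actual energy values from their own biochemical measurements.

```python
def root_defense_ratio(phenolics_mg_g, lignin_mg_g, sugars_mg_g,
                       e_phenolic=1.0, e_lignin=1.0, e_sugar=1.0):
    """Ratio of the energy required to degrade defensive compounds
    (phenolics, lignin) to the energy available from sugars in root tissue.

    Concentrations are in mg per g dry weight; the energy coefficients
    (e_*) are placeholder weights, not values from Entry et al. (1991b).
    A higher ratio implies greater resistance to Armillaria infection.
    """
    energy_cost = phenolics_mg_g * e_phenolic + lignin_mg_g * e_lignin
    energy_gain = sugars_mg_g * e_sugar
    return energy_cost / energy_gain

# Two hypothetical trees with contrasting root chemistry
print(round(root_defense_ratio(phenolics_mg_g=25, lignin_mg_g=180, sugars_mg_g=40), 2))
print(round(root_defense_ratio(phenolics_mg_g=15, lignin_mg_g=150, sugars_mg_g=80), 2))
```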
Mortality caused by defoliating insects, stem-killing bark beetles, and root pathogens has similar effects in relation to ecosystem responses (Table 6.3). These biotic disturbances all increase the amount and substrate quality of leaf and fine-root detritus. With reduction in LAI, the microclimate for decomposition and mineralization is also improved. Surviving trees, or those that replace the original stand, have improved access to water, nutrients, and light, which result in more photosynthate being available for growth and defense. If recovery in growth efficiency and biochemical balance is not observed following a biotic-induced disturbance, other factors such as atmospheric pollution or climatic variation may be the underlying cause. In such cases, attempts to control the biotic agent of disturbance will have minor long-term benefits and may actually exacerbate the situation.
TABLE 6.3
Ecosystem Responses following Moderate Mortality at Maximum Stand Leaf Area Indexa
Vertebrate animals are generally less selective in their diet than are invertebrates. However, the quality of browse is as important as, or more important than, the amount available, and it changes with plant species, plant age, and environment. This fact is most clearly demonstrated by extensive studies conducted in the boreal forests of Alaska, where the species composition is limited and harsh winters force large numbers of mammals to compete for limited resources. From these and related studies elsewhere, a number of important generalizations have emerged and been summarized in reviews by Bryant et al. (1986, 1991):
• Food selection is not based on “energetic optimization.” A wide variety of browsing animals, which include hare, moose, and deer, avoid eating evergreens such as black spruce although this species contains higher concentrations of lipid and easily digestible sources of energy than more preferred woody plant species. Moreover, the juvenile form and component plant parts of deciduous species with the highest nutritional and energy content are less preferred than older forms and less nutritious components.
• A few specific chemical constituents, which are usually volatile and unstable, play the major role in plant defense against browsing animals. These defense components are often not correlated with the gross fractions of total phenols, tannins, or resins present in tissues.
• Browsing animals usually require more than one highly palatable species in their diet to meet daily dietary needs. As the palatability of dietary components declines, a greater diversity of plant species, growth stages, and component parts is required to maintain animal weight throughout critical periods when food resources are scarce. Thus the dietary generalism exhibited by many browsing animals is necessary to avoid ingesting large quantities of toxic secondary metabolites.
• Woody species that have evolved to grow well on fertile soils or following disturbances allocate fewer resources to defense and have a greater ability to recover from injury by sprouting (or other mechanisms) than those species that have lower requirements and inherently slower growth rates. Fertilization can improve the growth of evergreen species, but it results in increasing their palatability and thus likelihood of injury from browsing animals.
These conclusions, which are based on numerous field and laboratory experiments, suggest that, where browsing animals are native, the vegetation is likely to be well adapted to herbivory. Evolutionary considerations are important because when numerous species of vertebrate herbivores were introduced into New Zealand, a country previously without native mammals (except for bats), large areas of forests were greatly affected by overbrowsing of understory trees and shrubs; possums introduced from Australia have also caused extensive damage to New Zealand forests (Howard, 1964; Coleman et al., 1980). Even within a region where the vegetation has evolved with vertebrate herbivores, conversion of large areas to fast-growing plantations of single species has obvious implications for residual animal populations. The native ungulates, according to the theories summarized above, will find less diversity in their diet and are likely to shift their herbivory to the plantations.
An independent test of these principles was made in southeastern Alaska where extensive logging of old-growth hemlock forests has greatly stimulated the growth of a deciduous blueberry, Vaccinium ovalifolium, favored by the black-tailed deer (Odocoileus hemionus var. sitkensis), but deer populations have fallen rather than increased in the region (Hanley and McKendrick, 1985). Previous to logging activities, deer largely survived the winter by browsing Vaccinium and Cornus, species that grow well in gaps beneath the old-growth canopy. These two shrub species provide deer with high concentrations of readily digestible protein, which is required for lactating females to nurse fawns born in the spring (Hanley, 1993). Because blueberry was present in all stages of stand development, it was hypothesized that the quality of browse may have changed with removal of the overstory canopy. A specific biochemical analysis was developed to define the amount of digestible protein in deer browse (Roberts et al., 1987; Hanley et al., 1992). Little insight into the nutritional value of forage could be gained by analysis of mineral, carbohydrates, or total protein content (Hanley et al., 1992).
Rose (1990) designed a series of laboratory and field experiments to determine how twig growth and leaf biochemistry of blueberry varied with changes in overstory cover and nitrogen availability. The field experiment was conducted on a recently logged area where large numbers of blueberry plants were growing. To obtain a range in incident radiation, zero to three layers of netting were placed over individual blueberry plants; in addition, a range in nutrient availability was provided through application of nitrogen fertilizer with supplements of other nutrients at the start of the growing season. Measurements of twig growth and foliar analyses were made at the end of the growing season before leaves began to senesce.
The results of the field experiment fully supported theoretical predictions. Annual shoot production by blueberry increased with irradiance as expected (Fig. 6.10a). Excess amounts of nitrogen proved damaging to growth, a not uncommon response by ericaceous plants that have limited ability to reduce nitrate nitrogen (Smirnoff et al., 1984). The concentration of tannins in foliage increased sharply with irradiance as theory also predicted (Fig. 6.10b). As a result of the increase in tannins and decrease in amino acid levels (not shown), the specific assay of digestible N for deer (Roberts et al., 1987) indicated a general reduction in palatability as irradiance increased (Fig. 6.10c). Specific leaf weight, an easily measured structural index, varied inversely with the pattern shown for digestible nitrogen (Fig. 6.10d). In the particular region where the field study was conducted, changes in specific leaf weight were closely correlated with the biochemical analyses. Such correlations between structure and biochemistry have proved helpful to managers interested in assessing forage quality that affect winter carrying capacity and the survival of fawns (Van Horne et al., 1988; Hanley, 1993).
Browsing animals exert a number of controls on the rates of important processes in ecosystems. By selectively browsing young plants growing in partial shade, ungulates have effectively removed many deciduous hardwood species from European forests (Wolfe and Berg, 1988). Exclosure experiments in Isle Royale National Park (Michigan) have shown that such selective and intensive herbivory reduces the quantity and quality of litter returned to the soil, and hence N mineralization and NPP (Pastor et al., 1993). Generally, deciduous hardwoods are removed first, with conifers usually less injured by browsing. When animal populations reach levels where they consume the less palatable species, their impact on forest ecosystem function and structure is generally negative. Where populations of ungulates and other vertebrate herbivores are high, the leaves and twigs of aspen and birch that can be reached by browsing animals are less palatable than in areas with significantly lower densities of the animals (Moore, 1994). It is possible that these differences could result in reduced growth and altered rates of decomposition that would have little correlation with the environmental variables that drive most ecosystem process models.
In summary, we note that it is essential to consider plant biochemistry when evaluating the susceptibility of plants to biotic agents of disturbance. The search for general indices of plant susceptibility to biotic disturbance is challenging because only a few biochemical constituents account for defense against particular organisms, and these constituents may differ significantly, depending on the herbivore and species (or race) of plant. Nevertheless, we see value in comparing the ratio of the concentrations of broad groups of defensive compounds (tannins, phenolics, lignin, alkaloids) to readily assimilated compounds (sugars and amino acids) against the amount of plant material consumed by insects, digested by browsing animals, or infected by pathogens. Under some conditions, key biochemical properties may be related to structural indices such as tree growth efficiency and specific leaf weight. When this is the case, changing patterns of host susceptibility or palatability may be readily monitored. The absence of the normal complement of palatable and unpalatable vegetation typically found in different stages of stand development may be indicative of unsustainable management practices that will result in fostering excessive variation in animal populations. Extremely high animal populations are particularly damaging as they often lead to a reduction in biodiversity and long-term site productivity.
In earlier chapters we noted how seasonal variation in climate affected ecosystem function and how species differed in their allocation of carbon to leaves, sapwood, and roots depending on their genetics, the availability of resources, and type of stress. The biotic components of ecosystems, and particularly long-lived trees, must adapt to infrequent events such as fires, floods, and windstorms. Throughout human history, attempts have been made to alter the natural frequency of fires and damage caused by floods and windstorms. We have to accept, however, that relatively catastrophic, natural disturbances will continue to occur. In fact, some practices associated with human activity, such as logging, grazing, and road building, have the potential to destabilize the natural resistance and repair mechanisms that reside in ecosystems. In other cases, human activities provide scarce resources to forest ecosystems by increasing the atmospheric concentrations of CO2 and the deposition of nitrogen and sulfur.
In the following sections we introduce additional functional indices that have diagnostic value and quantify responses to specific kinds of disturbances. In addition, we evaluate how various abiotic disturbances alter the flow of water, carbon, and nutrients through ecosystems. A historical perspective on the frequency of different kinds of abiotic disturbances and their effects on ecosystem processes puts us in a better position to assess the implications of changes imposed by management practices.
Although fires start naturally, human activities have long played a role in their spread and frequency. As Attiwill (1994) points out in a review, the record goes back at least 10,000 years in the Americas, over tens of thousands of years in Australia, and over 1.5 million years in Africa. Fire plays important roles in many forest ecosystems, and there are two main types: (1) ground fires, which consume litter and kill understory trees and shrubs, and (2) crown fires, which usually kill the overstory trees and lead to stand replacement.
The spread and intensity of fire depends on the prevailing climate and on the amount of fuel available. The amount of fuel available varies with stand development as a function of fire frequency and logging practices. In North America the greatest fuel loads accumulate on the forest floor in natural stands of Douglas-fir about 20 years after a crown fire, and 500 years later as western hemlock (Tsuga heterophylla) begins to replace dying old-growth Douglas-fir. Agee and Huff (1987) analyzed the circumstances in which fires occur and concluded from their analyses that to lessen the fire danger to a patch of old-growth forest, surrounding stands should be between 100 and 200 years old rather than made up of younger or older age classes. Their recommendation would also minimize wind damage to old-growth stands by establishing tall, younger forests on all sides.
Fire differs in its effect on soil fertility depending on the region. In the boreal zone, fire occurs frequently in upland spruce forests and muskeg bogs, two areas where organic matter with excessive C : N ratios accumulates and nutrient cycling through decomposition is severely limited. As litter accumulates, it shields the soil from solar radiation, permafrost rises closer to the surface, and the productive capacity of the system decreases. Fire removes much of the surface organic matter, concentrates nutrients in the ash, and allows solar radiation to warm the soil and increase the rooting depth to permafrost. As a result, more productive hardwoods return to occupy upland sites (Van Cleve et al., 1983). In muskeg bogs, willows replace poor quality browse provided by spruce and ericaceous shrubs and support an increase in moose and deer herds, as well as their predators (Viereck et al., 1983).
In temperate forests of the Great Lakes region, fire historically regenerated many of the short-lived tree species at intervals of less than 50 years (Heinselman, 1973). Much of the wildlife is dependent on frequent disturbance to the forest to provide high-quality browse to sustain them through harsh winters (Hansen et al., 1973; Jakubas et al., 1989; Gullion, 1990). With protection from fire and logging, longer lived tree species eventually replace short-lived ones. Shade-tolerant trees are often not well adapted to fire because they lack thick bark, the ability to sprout from roots, or the ability to produce epicormic branches. As organic matter continues to accumulate on the soil surface, a multiaged forest develops, which provides a fuel ladder from the forest floor to the canopy. Under these conditions, the likelihood of stand replacement fires increases (Romme, 1982). An analysis of fire histories from dated fire scars on tree trunks and stumps suggests that frequent, moderate burns were typical in fire-adapted forests and that infrequent fires were generally destructive, particularly to thin-barked trees (Keane et al., 1996b).
Periodic fires at intervals of less than a century have been important in maintaining diversity in other regions. Large areas of the pine forest native to the southeastern United States were maintained by fires; in the absence of fire, hardwood species predominate. Nitrogen-fixing species in many of the drier regions of the Pacific Northwest are dependent on fire for their regeneration and the release of a limiting supply of an essential trace element, molybdenum (Mo), which is sequestered over time in accumulations of organic matter (Silvester, 1989). Douglas-fir forests in many parts of the Pacific Northwest and the Rocky Mountains regenerated following fires at intervals between 125 and 400 years (Romme, 1982). Even the giant sequoia (Sequoiadendron) that may live for more than 2000 years is fire-dependent for seedling establishment in its native habitat (Kilgore and Taylor, 1979).
If we attempt to limit the area burned annually, large fires which are difficult to control will occur at less frequent intervals (Christensen et al., 1989). Individual stands may still be protected if fuel loads are first reduced and periodic ground fires are set and allowed to burn under prescribed conditions. These practices are widely applied to protect and perpetuate many pine plantations throughout the world (Sackett, 1975; Goldammer, 1983; Covington and Moore, 1994). Ground fires have the additional benefit of consuming volatile organic compounds that inhibit decomposition (White, 1986). Volatile compounds are particularly abundant in plantations composed of pine and eucalyptus trees, which raises the danger of fire to surrounding native vegetation and to human settlements.
The extensive and species-rich eucalyptus forests of Australia have been among the best studied in terms of the effects of fire on ecosystem processes (Raison et al., 1993). Although most species of eucalyptus are well adapted to fire, fire intensity and frequency greatly affect nutrient availability, which limits productivity on the highly weathered soils typical throughout much of Australia (Raison, 1980). Fire changes the availability of nutrients by volatilizing C, N, and S, while concentrations of K, P, and divalent cations increase in ash. Soil heating leads to an immediate accumulation of ammonium nitrogen (NH4+) as a result of chemical oxidation of organic matter. The amount released increases with the degree of soil heating for temperatures up to 400°C and with the content of N in the oxidized organic matter (Walker et al., 1983). Additions of ash to acid soils may lead to a decrease in exchangeable aluminum and an increase in soluble silica; as a consequence, soils are likely to fix less P in an unavailable form.
Fire frequency plays a critical role in the eucalyptus forests of Australia. Too long an interval between fires results in excessive damage to soils and loss of nutrients, whereas too short an interval prevents nitrogen-fixing shrubs from restoring N lost in combustion and reduces soil N mineralization rates by 35 to 50% (Raison et al., 1993). In most eucalypt forests the pattern of forest floor fuel accumulation can be described, as shown by Raison et al. (1993), by the sum of two negative exponential relationships:

Xt = Xs[1 − exp(−kt)] + X0 exp(−k′t)

where Xt is the mass (Mg ha−1) of litter accumulated at time t (years), Xs is the mass of litter accumulated under steady-state conditions, k is a decomposition rate constant (year−1), X0 is the residual litter remaining after the previous fire, and k′ is its decay constant. For Australian eucalypt forests, k varies from about 0.1 to 0.3 year−1 and Xs varies from about 10 to 30 Mg ha−1. The accumulation of elements such as N and P in fuels can also be described by similar exponential equations. Because the overstory trees are not generally killed, litterfall rates are maintained, so there is a rapid accumulation of fuels during the initial 5 years after burning, with a plateau approached by 15 years (Fig. 6.11a). Nitrogen in the litter and shrubs follows a trend similar to fuel accumulation because symbiotic N-fixing plants, such as the woody leguminous shrub Daviesia mimosoides or the cycad Macrozamia riedlei, are able to establish themselves after fires and grow rapidly (Raison et al., 1993; Fig. 6.11b). In Australian eucalypt forests, an interval of about 10 years or more allows natural processes time to replace N lost during burning if volatilization of N is limited to approximately 50% of that in fuel (Raison et al., 1993).
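The sketch below evaluates this two-term exponential over time. The chosen k and Xs fall within the ranges quoted above for Australian eucalypt forests, whereas X0 and k′ are illustrative assumptions; the output reproduces the rapid accumulation of fuel over the first 5 years and the plateau approached by about 15 years.

```python
import math

def fuel_load(t, x_steady=20.0, k=0.2, x_residual=2.0, k_prime=0.5):
    """Forest-floor fuel mass (Mg per ha) at t years after a fire:
    new litter accumulating toward a steady state plus decay of the
    residue left by the previous fire.

        Xt = Xs * (1 - exp(-k * t)) + X0 * exp(-k' * t)

    k = 0.2 per year and Xs = 20 Mg per ha fall within the ranges quoted
    for Australian eucalypt forests (0.1-0.3 per year; 10-30 Mg per ha);
    X0 and k' are illustrative assumptions.
    """
    return x_steady * (1.0 - math.exp(-k * t)) + x_residual * math.exp(-k_prime * t)

for years in (0, 5, 10, 15, 25):
    print(years, round(fuel_load(years), 1))
# Rapid accumulation over the first ~5 years, approaching a plateau by ~15
```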
In the tropical regions of Africa, Asia, and South America, fire frequency has increased as forest land is converted to farms and pastures (Matson et al., 1987). Depending on the intensity of fire and the kind of fuel present, large amounts of nutrients may be lost through volatilization, wind dispersal of ash, and surface erosion. Kaufmann et al. (1993) reported in studies of Brazilian second-growth tropical dry forest that maximum losses could exceed 500 kg N ha−1 and more than 20 kg P ha−1 following intense fires. In a study across a broader vegetation gradient in the Amazonian Basin of Brazil, Kaufmann et al. (1994) concluded that N and S were the nutrients lost in highest quantities during fire. Losses of P were intermediate, and losses of K and Ca were negligible. The total N in the rapidly decaying fuel was <4% of that in the surface 10 cm of soil and should, under natural conditions, be replaced in 1 to 3 years. Because of the extensiveness of fire in the tropics, the full implications must be assessed at a landscape and regional level with other methods (Chapters 7–9).
In many areas, fire plays a dominant role in recycling nutrients and in creating conditions favorable for establishing certain species. The historical frequency of fire varies with climatic conditions and with the amount and type of fuel accumulated. Through fire scar, charcoal carbon-dating, and stand-age analyses, it is possible to reconstruct fire frequencies and compare them with present-day conditions (Bradshaw and Zackrisson, 1990; Duncan and Stewart, 1991; Abrams et al., 1995). Although many stand development models take into account the differential responses of vegetation to fire, fuel loads, and climatic conditions (Shugart and Noble, 1981; Kercher and Axelrod, 1984), ecosystem models that predict the effect of fire on the pool of available soil nitrogen (Fig. 6.12a), LAI (Fig. 6.12b), and evapotranspiration (Fig. 6.12c) are just beginning to appear in the literature (Sirois et al., 1994; Keane et al., 1996b). These combination models, when expanded across landscapes, better assess the implications of forest practices under stable or changing climatic conditions (Chapters 7 and 8). In addition, they have the potential to include new species introduced into the ecosystem that may alter nutrient capital, fuel loads, and fire hazards.
As noted in Chapter 4, solid, gaseous, and dissolved compounds in the atmosphere are captured by forest canopies to a greater extent than by other types of vegetation because of the larger volume of exposed leaf surfaces (Lovett, 1994). Over the past 50–100 years, the concentrations of gases such as ozone (O3), sulfur dioxide (SO2), nitrogen oxides (NOx), ammonia (NH3), and methane (CH4) have increased substantially above previous natural levels. The increase in atmospheric CO2, on the other hand, is still within the range of concentrations that encompass the recent evolutionary development of most species (Beerling and Chaloner, 1993; Van de Water et al., 1994). We have discussed the implications of continued increases in CO2 on photosynthesis, growth allocation, and community structure in previous chapters and will defer further discussion of their global scale significance until Chapter 9. In this section, the implications of increased trace gas concentrations are considered as they relate to a gradual reduction in stand LAI over time.
Like CO2, ozone and other gaseous pollutants are absorbed at rates proportional to the concentration gradient between the atmosphere and the leaf, constrained by the leaf stomatal conductance. The same models developed to predict stomatal control on transpiration provide a basis for estimating uptake of gaseous pollutants (Matyssek et al., 1995). Humid atmospheric conditions favor maximum opening of stomata and maximum sensitivity to pollutants. Crop plants and deciduous trees are more sensitive to a given exposure to ozone or other pollutants than are most evergreen species because of differences in maximum stomatal conductances (Reich, 1987). If ozone damage reduces the photosynthetic capacity of leaves more than stomatal conductance (Matyssek et al., 1992), then the carbon isotopic composition of foliage and wood (adjusted for change in atmospheric composition) should become more depleted in 13C and enriched in 12C. On the other hand, SO2 has been reported to cause stomata to close more quickly, which results in an increase in the 13C/12C ratio in foliage (Martin and Sutherland, 1990). Although evergreen species restrict the diffusion of ozone and other pollutants into their leaves more than deciduous species, the longer life span of evergreen leaves increases their total exposure and often results in premature shedding of foliage (McLaughlin, 1985). Rising concentrations of atmospheric CO2 are predicted to induce some stomatal closure that could ameliorate the detrimental effects of gaseous pollutants (Field et al., 1992).
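Stated as a formula, uptake is simply stomatal conductance multiplied by the atmosphere-to-leaf concentration gradient. The sketch below is a minimal version of that calculation, assuming the internal concentration of a reactive gas such as ozone is near zero; the conductance and concentration values are arbitrary examples, not measurements.

```python
def pollutant_uptake(stomatal_conductance, ambient_conc, internal_conc=0.0):
    """Flux of a gaseous pollutant into the leaf (amount per leaf area per time),
    computed as stomatal conductance times the atmosphere-to-leaf concentration
    gradient. Units must be consistent (e.g., conductance in m/s and
    concentration in nmol/m3 gives flux in nmol m-2 s-1). The internal
    concentration is assumed to be ~0 for a reactive gas such as ozone.
    """
    return stomatal_conductance * (ambient_conc - internal_conc)

# Example: a deciduous leaf with twice the stomatal conductance of an
# evergreen leaf absorbs twice the ozone at the same ambient concentration.
ambient_o3 = 2000.0                           # nmol per m3 (illustrative)
print(pollutant_uptake(0.004, ambient_o3))    # deciduous leaf, g_s = 4 mm/s
print(pollutant_uptake(0.002, ambient_o3))    # evergreen leaf, g_s = 2 mm/s
```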
Trees may also produce a number of volatile organic compounds that create haze above forests and indirectly reduce the solar energy load on the canopy. Isoprene, one volatile compound produced by plants, has been shown to protect chloroplasts as temperatures rise to dangerous levels (Sharkey and Singsaas, 1995). Concentrations of isoprene inside leaves increase at temperatures above 35°C; the efflux from leaves continues to increase even though stomata may close completely (Monson et al., 1994; Sharkey et al., 1995). Emissions of monoterpenes also increase greatly with temperature and vapor pressure deficit, which has implications for attracting bark beetles (Waring and Pitman, 1985).
More cations (Ca2+, Mg2+, K+) are often derived from atmospheric deposition than from mineral weathering in many forests (Chapters 4 and 5). Since the 1950s, atmospheric inputs of nitrogen and sulfur have become the dominant sources of these nutrients in many northern temperate forests (Hedin and Likens, 1996). Airborne deposition of heavy metals (Pb, Cu, Cd, Hg, Ni, Zn) has also increased locally where traffic, mining processing plants, and heavy industrial development are concentrated. Heavy metals are often lethal to mosses, lichens, and other species that collect nutrients exclusively from airborne sources or from stemflow (Tyler, 1972). When heavy metals enter the detritus pool, they may inhibit microbial activity and the release of nutrients to higher plants (O’Neill et al., 1977). Changes in heavy metal deposition rates have been quantified by analyzing the contents of dated wood cores extracted from long-lived trees (Ragsdale and Berish, 1988; Jordon et al., 1990; Eklund, 1995).
Because nitrogen and sulfur are transported farther from their source than heavy metals, they have a potentially greater impact. Fortunately, deposition rates of N and S have been widely monitored at a number of sites, with one of the best long-term records available at Hubbard Brook Experimental Forest, New Hampshire (Likens and Bormann, 1995). During the International Biological Program in the 1960s, a host of additional sites began to record nutrient inputs in wetfall. In the 1980s, many more sites were established in the United States and Europe, and these included measurements of elemental deposition in dryfall and cloud/fog precipitation. The more inclusive measurements showed that wetfall often represents less than 50% of total inputs, and less than 25% where cloud/fog precipitation is prevalent (Johnson and Lindberg, 1992). The network of monitoring sites has shown considerable variation in atmospheric deposition over decades, reflecting changes in national policies on emissions of pollutants, land use, and weather patterns (Hedin and Likens, 1996).
Where ecosystems have developed on infertile soils, the additions of nitrogen and sulfur, as well as base cations, can be extremely beneficial. Atmospheric deposition of nitrogen may counterbalance any increase in C : N ratios associated with rising CO2 levels and, by doing so, increase litter decomposition rates and site fertility. Such responses have already been observed in some ponderosa pine forests distributed across a deposition gradient in California (Fenn, 1991; Trofymow et al., 1991). Where sulfur and nitrogen are deposited in excess of what vegetation requires or the soil can store, the excess moves into the groundwater, carrying with it acids and toxic concentrations of aluminum that affect stream and lake chemistry in detrimental ways.
Enough is known about cation exchange and leaching processes to construct fairly general models to predict the transfer or immobilization of sulfur, hydrogen, and other cations through soil and into streams (Cosby et al., 1985; Gherini et al., 1985; Norton et al., 1992; Wright et al., 1995). Models of nitrogen cycling are available, but they are difficult to test because some of the variables cannot be easily measured and long-term predictions are not based on fully established principles (Aber et al., 1991; Johnson, 1992).
Beyond certain limits, N deposition will exceed the uptake potential of vegetation and soil storage. Such ecosystems become “nitrogen saturated” as excess N is released into the groundwater and vented to the atmosphere through denitrification (Chapter 4). We might conclude that annual additions of nitrogen in the range of 25 to 50 kg ha−1 would be entirely beneficial to some types of forest ecosystems. Certainly such rates are well below those applied commercially to enhance forest growth. However, pulse additions of fertilizer are quickly incorporated into the soil or stored in foliage and other tissue to be utilized later by plants. In contrast, chronic, long-term additions of N, even at rates slightly above 10 kg ha−1 year−1, have resulted in site degradation in some parts of Europe (Dise and Wright, 1995; Wright et al., 1995).
Johnson (1992) reviewed the results of 60 fertilization trials and ecosystem studies conducted in temperate and boreal forests where the amount of nitrogen deposited, leached, and incorporated in aboveground increment was measured, and that retained in the soil was calculated by difference (input – increment – leaching). From the analysis, Johnson suggested that leaching of nitrogen from soils could be predicted quantitatively as a function of atmospheric N input minus N increment in wood (Fig. 6.13). At N deposition levels below that required for increment, no leaching was observed; trees were apparently able to obtain nitrogen from the soil in competition with heterotrophic fungi. At higher deposition rates, a fixed proportion of the excess N was stored in the soil, equal to that lost through leaching. There were no obvious relationships between N deposition rates and calculated soil N retention, nor with tree N increment (Johnson, 1992).
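The relationship in Fig. 6.13 amounts to a simple mass balance. Below is a sketch of that reasoning, assuming, as the text describes, that excess deposition is split evenly between soil storage and leaching; the split and the example values are illustrative rather than Johnson's fitted coefficients:

```python
def n_leaching(n_deposition, n_wood_increment, soil_fraction=0.5):
    """Annual N leaching (kg N ha-1 yr-1) following Johnson's (1992) logic.

    Deposition below the amount incorporated in wood increment produces no
    leaching; any excess is partitioned between soil storage and leaching.
    The even split mirrors the verbal description above; the exact
    proportion should be read from Fig. 6.13.
    """
    excess = max(0.0, n_deposition - n_wood_increment)
    return (1.0 - soil_fraction) * excess

# Hypothetical example: 25 kg ha-1 yr-1 deposited, 8 kg ha-1 yr-1 fixed in wood
# implies 8.5 kg ha-1 yr-1 leached and 8.5 kg ha-1 yr-1 stored in the soil.
print(n_leaching(25.0, 8.0))
```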
To appreciate how the nitrogen cycle may be altered with increasing deposition of N, we adopt a convenient classification scheme developed by Aber et al. (1989). We will identify shifts in the relative importance of various processes at different stages and extract some general indices from the reviews by Aber (1992) and Johnson (1992).
Figure 6.14 depicts four stages that represent a transition from conditions improving forest productivity to ones causing a major decline, associated with a massive reduction in LAI. Stage 0 represents conditions with background levels of N deposition of <2 kg N ha−1 year−1 (Hedin et al., 1995). At these levels, most temperate and boreal forests are limited by the slow nitrogen release rates through decomposition and by the ability of mycorrhizal fungi to take up organic forms of N directly (Chapter 4). The concentration of N in foliage and litter potentially limits both photosynthesis and decomposition. Some of the NH4+ released through microbial activity may be accessed by nitrifiers, but no net nitrification occurs in evergreen-dominated systems (Robertson, 1982). In deciduous forests, some net nitrification could be expected, but NO3− uptake by plants and microbes would prevent loss of N, except in dissolved organic matter (Hedin et al., 1995).
In the beginning of stage 1 (Fig. 6.14), nitrogen deposition increases above background levels to a constant 20 kg ha−1 year−1. Such deposition rates may seem low compared to single applications of fertilizer, but they appear sufficient to cause N saturation in central Europe (Dise and Wright, 1995). The capacity of soils and vegetation to incorporate large amounts of nitrogen varies, so stage 1 may continue for only decades or for several centuries (Aber et al., 1991). Foliar N concentrations and foliar biomass peak during the middle of stage 1 as nitrification becomes the dominant process. With a relative excess of nitrogen, total NPP increases, but proportionally less carbon is allocated to roots and mycorrhizae, which reduces the ability of the system to immobilize N in the soil (Stroo et al., 1988) and favors heterotrophic fungi over symbiotic species (Osonubi et al., 1988; Johnson et al., 1991; A. H. Johnson et al., 1994). Less root growth, together with leaching of base cations, is likely to create deficiencies in magnesium and other nutrients (Zinke, 1980; Oren et al., 1988b; Schulze, 1989; Cote and Camire, 1995). As a result, growth per unit of leaf area or per unit of N will decrease (Oren et al., 1988a,b).
At the start of stage 2, foliar biomass begins to decrease as continued additions of N have detrimental consequences (Fig. 6.14). Evergreen coniferous species are particularly sensitive because they have limited ability to increase their photosynthetic capacities per unit leaf area (Reich et al., 1995b) or to utilize large amounts of nitrate (Smirnoff et al., 1984). The added NO3−, if not reduced when taken up by plants, can be toxic and may cause chlorosis of foliage, premature foliage drop and mortality (Schulze, 1989).
The start of stage 3 is defined by extensive mortality of overstory trees, which results in reduced foliar biomass (Fig. 6.14). Nitrogen-saturated ecosystems typically lose large amounts of nitrogen through leaching of NO3− and volatilization of nitrous oxide (N2O). The ability of soils to consume methane (CH4) is also reduced because the enzyme system involved in the reaction no longer distinguishes between NH4+ and CH4 (Steudler et al., 1989). Tree species with higher demands for nitrogen and the ability to induce nitrate reductase activity in their foliage can slowly replace less well adapted species. In the interim, both the overstory and total LAI will be substantially reduced (Ulrich, 1983).
Aber (1992) encapsulates the ideas presented above to explain a lag of a decade to a half century or more between the time when N deposition increases (stage 1) and when excessive nitrate losses are observed (stage 2), as the result of competition between plants, heterotrophic microbes, and nitrifiers for the available pool of ammonium. Net nitrification should remain at zero as long as the demands by plants and decomposers for NH4+ are not fully met. In addition, ammonium is held relatively strongly on cation-exchange sites and can also be immobilized directly into soil organic matter (Johnson, 1992; Paul and Clark, 1996). The relative strengths of these competing forces for NH4+, along with the gross N mineralization rate, determine the residual concentration of inorganic N in the soil solution and leachate (Aber, 1992). Table 6.4 summarizes the principal changes expected in the transition from an N-limited to an N-saturated system.
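The competition argument can be expressed as a bookkeeping exercise on the ammonium pool. The sketch below is an illustrative simplification of the reasoning attributed to Aber (1992), not a published model; the flux names and example values are hypothetical:

```python
def residual_inorganic_n(gross_mineralization, n_deposition,
                         plant_uptake, microbial_immobilization,
                         abiotic_fixation):
    """Residual inorganic N (kg N ha-1 yr-1) left after competing NH4+ sinks.

    Whatever ammonium is not claimed by plants, heterotrophic microbes, or
    exchange-site/organic-matter fixation remains available to nitrifiers
    and, ultimately, to leaching as nitrate.
    """
    supply = gross_mineralization + n_deposition
    demand = plant_uptake + microbial_immobilization + abiotic_fixation
    return max(0.0, supply - demand)

# While demand exceeds supply, net nitrification stays near zero; the lag
# ends only when chronic deposition pushes supply past the combined sinks.
print(residual_inorganic_n(80.0, 20.0, 60.0, 35.0, 10.0))  # 0.0
print(residual_inorganic_n(80.0, 40.0, 60.0, 35.0, 10.0))  # 15.0
```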
TABLE 6.4
Contrasting Characteristics of N-Limited and N-Saturated Forest Ecosystemsa
Characteristic | N-limited | N-saturated |
Soil properties | ||
Soil C : N ratio | High | Low |
Dissolved organic (labile) C in soil | High | Low |
Ratio of gross NO3− immobilization to gross nitrification | 1.0 | <0.01 |
Ratio of gross NH4+ immobilization to gross mineralization | ~0.95 | <0.5 |
Population of nitrifiers | Low | High |
Plant properties | ||
Form of N taken up by plants | NH4+ | NO3− > NH4+ |
Foliar N concentration | Low | High |
Foliar free amino acid concentration | 0 | High |
Transfer properties | ||
Nitrate losses in leaching | 0 | High |
N2O production | 0 | High |
CH4 absorption by ecosystem | High | Low |
aPrincipally from Trends in Ecology and Evolution, Volume 7, J. D. Aber, “Nitrogen cycling and nitrogen saturation in temperate forest ecosystems,” pp. 220–223, 1992, with kind permission of Elsevier Science–NL, Sara Burgerhartstraat 25, 1055 KV Amsterdam, The Netherlands.
What management options might delay N saturation? Fast-growing trees, harvested in short rotations, can extract N, but the wood will contain considerable amounts of base cations and phosphorus, which are likely in short supply. Site preparations that include slash burning would transfer N back into the atmosphere, while the more limiting nutrients would remain on the site. Increased leaching could result with reduced cover, however, and the total amount of N in biomass is usually less than 20% of that stored in the soil (Johnson, 1992). Grinding up large woody debris and mixing the residue into the soil should immobilize considerable N and slowly release base cations back into the system (Turner, 1977; Waring and Pitman, 1985). We advocate that further experimentation be coupled with selected measurements of N cycling controls and ecosystem response to clarify the relative merits of various practices designed for specific soil and climatic conditions.
Forest harvesting involves mechanical extraction of biomass from an ecosystem. In most instances, the removal of woody biomass results in only a small percentage loss of the total content of nutrient elements in forest ecosystems because the largest pool is in the soil (Chapter 4). Sawlog harvest of a mixed oak forest in Tennessee removed from 0.1 to 7.0% of the nutrient pools of N, P, K, and Ca (Johnson et al., 1982). Firewood harvest of a young rain forest in Costa Rica removed from 3% (Ca) to 31% (S) of the pools of various nutrients in the vegetation and surface soil (Ewel et al., 1981). The foliage and branches that are left behind contain a large portion of the nutrient pool in vegetation because of the higher nutrient concentrations in these tissues than in wood. Leaf litter and small branches are likely to decompose rapidly, releasing nutrients for regrowth. With conventional harvest, the number of years needed to replace the nutrients removed, through the accumulation of annual inputs from atmospheric deposition, N fixation, and weathering, is usually less than the harvest cycle (Johnson et al., 1982; Van Hook et al., 1982; Likens and Bormann, 1995).
Harvest of whole trees for pulpwood or biomass for energy results in substantially greater nutrient removals, although these may still be replaced within a regrowth cycle in many instances (Silkworth and Grigal, 1982; Van Hook et al., 1982). Often the available forms of N, P, and other nutrients are derived mainly from the rapid decay of fresh organic matter. In such cases, the analysis of loss during harvest might more reasonably be expressed in reference to the available pool in vegetation and litter rather than to that in the total soil. Whole-tree harvesting may remove 30% of the available Ca in a northern hardwoods forest in New Hampshire, but only 2% of the total in the soil (Hornbeck and Kropelin, 1982). The consequences of removing nutrients in harvested biomass on different sites can, in principle, be assessed by measuring the rate at which the available pool is replenished from various sources (including atmospheric deposition and fertilization) and by monitoring nutrient balances and return in litter over critical periods. In reality, atmospheric deposition is highly variable as a result of rapid changes in pollution control policies (Hedin and Likens, 1996). Moreover, the availability of two of the most critical nutrients, nitrogen and phosphorus, is often unrelated to the total content of these elements in the soil.
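The replacement-time comparison described in the preceding two paragraphs is straightforward arithmetic: nutrients exported in the harvest divided by the combined annual inputs. A minimal sketch with hypothetical values, not those of the cited studies:

```python
def years_to_replace(nutrient_removed_kg_ha, annual_input_kg_ha_yr):
    """Years of atmospheric deposition, N fixation, and weathering needed
    to replace the nutrients exported in harvested biomass."""
    return nutrient_removed_kg_ha / annual_input_kg_ha_yr

# Hypothetical whole-tree harvest exporting 250 kg N ha-1 on a site receiving
# a combined 12 kg N ha-1 yr-1: replacement takes roughly 21 years, still
# shorter than most harvest cycles.
print(round(years_to_replace(250.0, 12.0), 1))
```

The same calculation can be run against the available pool in vegetation and litter rather than the total soil pool, which, as noted above, often gives a more meaningful picture for N and P.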
With the removal of the forest, the soil surface warms, and, with less transpiration, more water is available; these are environmental combinations that generally favor increases in decomposition, mineralization, and nitrification. Normally, herbaceous and shrubby vegetation quickly reoccupies a site following removal of the forest overstory so that little loss of inorganic nutrients occurs (Marks and Bormann, 1972). Attempts to control the growth of herbs and shrubs through application of herbicides, introduction of grazing animals, or by fire may contribute to accelerated losses of nitrate and other nutrients if the capacity of microbial organisms and other components of the soil to immobilize nutrients is exceeded (Vitousek et al., 1982).
The problem of nutrient loss is particularly severe in the wet tropics because, following clearing, a rich supply of carbon and organic nitrogen is initially available (Matson et al., 1987). As a result, large pulses of nitrous oxide, nitric oxide, and methane may be produced while NO3− is leached downward in the profile (Matson et al., 1987; Luizao et al., 1989; Keller and Reiners, 1994). Trace gas emissions are reduced rapidly once either the available carbon or nitrate substrate is exhausted in the surface soils (Matson et al., 1987; Parsons et al., 1993). In most situations, microbial biomass is rapidly reduced following disturbance, so its contribution to immobilizing N is much less than in temperate and boreal forests. On the other hand, tropical soils have a much larger capacity than temperate soils to capture nitrate on anion-exchange sites. Once deep roots are reestablished, this nitrate is available for regrowth of the forest (Matson et al., 1987).
Many tropical forests are converted for growing crops or, more permanently, into pasture. Conventional soil analyses do not provide reliable measures of soil fertility because so much of the nutrient content is bound in various fractions of soil organic matter. A better assessment of available nutrients can be obtained with knowledge of the turnover rates or mean residence times (MRT) of the various organic fractions. The pulse of radioactive carbon-14 put into the atmosphere during the peak years of atomic bomb testing provides a radioactive label to all detritus produced in the late 1950s and early 1960s that differs from that produced before or since. Tiessen et al. (1994) used carbon-14 analyses to determine the mean residence time of the organic matter that holds most of the available nutrients in an extremely poor sandy soil from an Amazon rain forest at San Carlos de Rio Negro in Venezuela and in a sandy soil that supported a semiarid thorn forest in Brazil. With measurements of MRT and the nutrient content of the organic matter, Tiessen et al. calculated that agriculture could be sustained following slash burning of the two forests for only 3 and 6 years, respectively. In contrast, sandy prairie soils from the Canadian Great Plains were cited as being able to support economic agriculture without fertilization for 65 years. Trumbore et al. (1996) applied the same type of carbon-14 analyses to confirm the importance of soil carbon turnover at depths below 1 m in an evergreen Amazonian forest. They showed that managed pastures fertilized with phosphorus and planted with highly productive grasses could quickly add large amounts of carbon to the upper meter of soil, which partially offset predicted carbon losses due to death and decomposition of fine tree roots below 1 m in the soil. Such isotopic analyses, when applied to specific fractions of soil organic matter, provide a sound basis for calculating organic matter production and turnover, two rates that are critical in assessments of regional and global carbon balances (Chapter 9).
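The logic behind the Tiessen et al. calculation can be sketched as a first-order decline of the nutrient-holding organic matter once litter inputs largely cease after clearing and burning. The pool and threshold values below are hypothetical, and the procedure is a simplification of, not a substitute for, the published analysis:

```python
import math

def years_of_cropping(initial_pool, mean_residence_time_yr, minimum_pool):
    """Years of viable cropping before the soil organic matter pool
    (and the nutrients it holds) decays below a minimum threshold.

    Assumes first-order loss with negligible new inputs:
    pool(t) = initial_pool * exp(-t / MRT).
    """
    if initial_pool <= minimum_pool:
        return 0.0
    return mean_residence_time_yr * math.log(initial_pool / minimum_pool)

# Hypothetical example: a pool of 1000 kg ha-1 with a 10-year MRT and a
# cropping threshold of 700 kg ha-1 supports roughly 3.6 years of cropping.
print(round(years_of_cropping(1000.0, 10.0, 700.0), 1))
```

Shorter mean residence times, as in the two tropical soils, shrink the window of viable cropping; the long MRT of prairie soil organic matter explains the much longer period cited for the Canadian Great Plains.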
Erosion is one additional route for nutrient loss when trees are harvested from steep slopes. With tree removal, large-diameter roots begin to decay. As decay progresses, slope stability is reduced until a new network of interlocking roots is reestablished (Ziemer, 1981). To remove wood products, roads must be constructed, and soil compaction often occurs in the process of harvesting. Both activities may contribute to slope instability and accelerated erosion (Harr et al., 1979; Reid and Dunne, 1984; Jones and Grant, 1996). Some physically based models are available to predict the risks of erosion and runoff associated with logging and road construction (Rice and Lewis, 1991; Luce and Cundy, 1994). Identification of unstable slopes and unreliable road construction practices can greatly reduce erosion hazards because most erosion related to forest management occurs on a small fraction of the total forest area (Rice and Lewis, 1991).
In general, extraction of wood products from forests removes only a modest fraction of the total nutrients present in soil and litter. If vegetation is not rapidly reestablished, however, or if the permeability of surface soil is significantly reduced following logging, the stabilizing network of roots can be destroyed, increasing the potential for soil erosion and nutrient loss from the site. Under exposed conditions, the net release of radiative gases that affect Earth’s energy balance also will increase along with the export of nutrients (and pollutants) into the groundwater and adjacent aquatic systems. By careful location and construction of roads and by adhering to good logging practices, disturbance can be minimized, but the largest benefits may accrue by reducing the fraction of LAI removed in a single harvest, as discussed in Chapter 5.
Mechanical forces include the effects of wind, snow, and mass movement of material. Trees that possess flexible branches and leaves suffer little damage at normal wind speeds. Increased stem taper also provides greater resistance to breakage from wind. Where wind is a potential problem, thinning operations must be limited to provide time for residual trees to add more wood to their lower boles and large-diameter roots (Petty and Worrell, 1981). Genetic variation in tree resistance to wind is well documented. Spruce (Picea), with typically shallow roots, is easily blown down, whereas pine (Pinus), with deep taproots, is more resistant to wind. However, variation exists within a genus too; plantations of lodgepole pine in southern Finland are much more susceptible to wind damage than the native Scots pine (Lahde et al., 1982).
The edge of a forest exposed to the prevailing wind is of particular concern. Normally, the edge is shaped so that wind sweeps up and over the canopy and does not penetrate directly through the stand. When roads or logging disturb the edge, wind enters, modifies the microclimate, and may result in extensive blowdown (Chen and Franklin, 1992; Ruel, 1995). By quickly reestablishing the canopy configuration typical of boundaries exposed to wind, the structural integrity and microclimate within a stand can be maintained.
To protect forests, Ruel (1995) concludes that trees should be selected which can adjust to wind by developing more wind-resistant canopies and by growing deeper and larger diameter roots that interlock with one another. Soil and root characteristics determine the sturdiness of the anchoring systems; age, stocking density, and thinning practices modify a stand’s resistance to wind, while topographic exposure and the general climate impose ultimate limits on management options. In western Scotland, the choice of species and rotation age is made from an analysis of topographic exposure. Potential plantation sites are evaluated by installing standard-sized cloth flags, on which ablation is observed after a set time (Lines and Howell, 1963). On the basis of these wind hazard analyses, pine or spruce plantations are established and their “safe” rotation age specified.
Many forests are dependent on wind for maintaining diversity and productivity. High-intensity windstorms associated with hurricanes, typhoons, and tornadoes rarely destroy whole forests. For example, in 1938 a hurricane struck 4.5 million ha of forest in the northeastern United States (Smith, 1946), but only about 5% of the forest experienced heavy damage. Damage was mainly restricted to trees infected by pathogens or to those occupying particularly wind-prone sites (Henry and Swan, 1974; Bormann and Likens, 1979). Likewise, tropical windstorms cause frequent treefall (Lugo et al., 1983), averaging about one tree per hectare per year (Hartshorn, 1978). The regularity of these disturbances limits the maximum age of trees in forests of Costa Rica to a range of 80–140 years and fosters, as a result, relatively stable net primary production over time. Comparable rates of windthrow are found elsewhere in the tropics (Putz and Milton, 1982). On the other hand, hurricanes that topple whole forests create conditions where the original composition is maintained through resprouting and seedling establishment. This sequence is not possible through normal gap formation and may explain the composition and structure of tropical rain forests subjected to frequent hurricanes (Yih et al., 1991; Attiwill, 1994).
Wind can play an important role in maintaining soil fertility. For example, in coastal Alaska, where a temperate rain forest of Sitka spruce (Picea sitchensis) and western hemlock (Tsuga heterophylla) dominates, organic matter accumulates rapidly on the forest floor and leads, even on freshly deposited glacial material, to podzolic horizons in the surface soil after only 100–150 years. Wildfires are uncommon, so continued accumulation of organic matter, combined with leaching of Fe and Al, creates conditions where the water table rises and roots become largely confined to the upper organic-rich, nutrient-depleted horizons. Historically, periodic windstorms uprooted trees, mixing the soil horizons and thus slowing the degradation process (Bormann et al., 1995). Harvesting practices should therefore encourage mixing of the soil horizons and minimize the amount of organic woody debris left on site. Where such mixing does not occur, many forest ecosystems in northern Europe have been transformed into unproductive bogs (Miles, 1985).
Snow and ice are mechanical forces that may create extremely high stresses on tree branches and stems. Under some conditions, the load is sufficient to cause breakage of stems as well as branches. On steep slopes, snow moves slowly downhill in the spring, causing young trees to be uprooted, broken off, or forced to develop a “pistol-butt” shaped lower bole. In temperate regions where snow accumulates at predictable depths in response to elevational gradients, the distribution of forest types is often correlated with the average depth of snowpack. Models described in Chapter 2 provide estimates of snow water content and canopy interception that can serve as a basis for judging the bearing load that snow imposes on vegetation. In the western United States, the upper elevational limits of Douglas-fir, western hemlock, and ponderosa pine, as well as most hardwood species, are set largely by their susceptibility to snow breakage. Subalpine fir (Abies lasiocarpa), mountain hemlock (Tsuga mertensiana), and Jeffrey pine (Pinus jeffreyi) replace less adapted species as the snowpack increases (Waring, 1969).
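Converting a modeled snow water equivalent into a bearing load is direct, since 1 mm of water over 1 m² weighs 1 kg. The sketch below makes only that conversion; the intercepted fraction is a hypothetical input, not an output of the Chapter 2 models:

```python
def canopy_snow_load(swe_mm, intercepted_fraction):
    """Snow mass borne by the canopy, in kg per m2 of ground area.

    swe_mm: snow water equivalent (1 mm of water over 1 m2 = 1 kg).
    intercepted_fraction: share of the snowfall held in the crown, which
    depends on canopy architecture and must be supplied or modeled separately.
    """
    return swe_mm * intercepted_fraction

# Hypothetical example: 80 mm SWE with 30% interception implies ~24 kg m-2
# carried by branches and foliage.
print(canopy_snow_load(80.0, 0.3))
```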
Lemon (1961) suggested that ice storms could speed up the normal successional sequence by selectively damaging the faster growing species that are normally established earlier (Chapter 5). DeSteven et al. (1991) confirmed this prediction in a study of a forest dominated by sugar maple (Acer saccharum), beech (Fagus grandifolia), and basswood (Tilia americana) in which an ice storm caused accretions up to 12 cm in diameter. Mortality of beech and other dominant species allowed saplings of shade-tolerant sugar maple to increase their growth. On the other hand, Nicholas and Zedaker (1989) suggest that ice storms may play an important role in maintaining red spruce (Picea rubens) in higher elevation forests in the southeastern portion of the United States in spite of the presence of more shade-tolerant species.
Mass movement of soil and snow avalanches create conditions that favor regeneration of some species. In the Cascade Range of the Pacific Northwest, two nitrogen-fixing species are well adapted to colonizing areas cleared by soil erosion (Alnus rubra) or snow avalanches (Alnus incana). Similarly, in the Andes of Chile and Argentina, where earthquakes and volcanic eruptions are rather frequent, single-aged stands of Nothofagus species are maintained for up to 300 years and then replaced following another stand-destroying event that exposes bare soil (Kitzberger et al., 1995; Veblen and Alaback, 1996). Trees growing on unstable soils can often be recognized because they tend to shift their centers of gravity over time, which leads to an irregular buttress and nonvertical alignment.
Some of the residue from massive slope failures creates benches of deep soil deposits at midslope. When soil displaced by mass movement reaches streams, it is deposited as alluvium on floodplains. Silt from a single flood may accumulate in deposits more than 50 cm deep around the base of coast redwood trees (Sequoia sempervirens) without preventing a new root system from quickly reestablishing (Stone and Vasey, 1968). In some areas, the sequence has been repeated for over 1000 years, so that the surface on which trees were originally established now lies nearly 10 m below the present surface. In addition, the fresh silt provides a seedbed free of fungi, which allows redwood seedlings to germinate successfully and to colonize new areas on the floodplain (Stone and Vasey, 1968).
Native forests are well adapted to mechanical forces typical of the environments in which they are found. Silvicultural handbooks provide a general ranking of species in regard to their tolerances to various environmental stresses, including responses to mechanical forces (Fowells, 1965; Burns and Honkala, 1990). Often, widely distributed species are still planted in zones where they are ill-adapted because they initially show rapid growth. If rainfall intensity or the accumulation of snowpack were to change significantly, the boundaries of present forest types could shift abruptly.
In this and previous chapters, a number of structural and chemical properties have been identified that have general or specific diagnostic value (Table 6.5). In general, if high levels of amino acids are observed in foliage or other plant tissues, this is a sign of an unstable forest with high susceptibility to biotic-induced disturbance. Chemical and physical analyses of tree rings are particularly useful to assess historical variation in atmospheric deposition of heavy metals and tree responses to variation in atmospheric pollutants and CO2. The frequency of fire and the intensity of insect outbreaks may also be inferred from scars and historical patterns in growth efficiency. Plant community attributes provide some insight into the populations of browsing animals. Together, biochemical, isotopic, and mineral analyses of foliage, wood, and roots yield clues to the susceptibility or resistance of forests to specific agents of disturbance.
TABLE 6.5
Summary of Structural and Chemical Indices for Diagnostic Analysis of Ecosystems Exposed to Different Kinds of Environmental Stresses
Environmental factor | Diagnostic indices |
General stress symptoms | Increase in free amino acids in leaves, reduced wood growth per unit leaf area, nitrate leached from system |
Herbivory | Loss of most palatable species, evergreens replace deciduous species, low ratio of defense/energy compounds in plant tissue |
Soil flooding | Presence of buried stems and soil litter layers in soil profile |
Slope instability | Asymmetrical and leaning trees, mixed soil horizons with buried large roots |
Soil drought | Low sapwood relative water contents, trees with low leaf area/sapwood area ratios, leaf δ13C not correlated with branch length |
Soil nutrient imbalance | Adverse nutrient ratios in foliage, low wood production per unit of N in foliage |
Excess N deposition | Leaf N in excess to other nutrients, leaching of nitrate, volatilization of nitrous oxide |
Fire-prone environments | Charcoal in soils, fire scars on trees that have thick bark, serotinous cones, or other adaptations to fire |
Wind-prone environments | Stem taper in trees with asymmetrical crowns, historical evidence of uprooted trees, predominant direction of blowdowns on exposed sites |
Ozone and SO2 | Loss of lichens, accelerated loss of older evergreen foliage, evidence of sulfur concentrations in annual rings, historical shifts in δ13C in cellulose unrelated to climatic variation or atmospheric concentrations of CO2 |
Historical analysis of stands suggests that moderate mortality caused by wind, fire, insects, disease, and other types of disturbances creates gaps in the canopy that provide benefits to residual trees. These benefits include reduced competition for light, water, and nutrients and improved resistance against many agents of disturbance. When assessed over decades, frequent, moderate disturbances seldom increase the normal rates of mortality beyond that expected with self-thinning, nor is the general forest structure greatly altered. Frequent disturbances, however, may favor some species over others and alter expected trends in forest composition.
Large-scale natural disturbances associated with hurricanes, fire, landslides, and insect and disease outbreaks greatly increase mortality but, in the long run, rejuvenate the system and help maintain or increase its productive capacity. When the frequency or kind of disturbance is significantly altered, new conditions are created that favor mixtures of species different from those previously dominant. With insight into basic ecosystem processes and the use of various indices of disturbance, the long-term consequences of disturbances may be better recognized so that remedial, preventive, or adaptive actions have a sound basis.