Sarah A. Cooley
Department of Neurology, School of Medicine, Washington University in Saint Louis, St. Louis, MO, USA
The relationship between neuropathology and clinical presentation is imperfect. An often‐cited postmortem study of older individuals reported substantial Alzheimer's pathology in the brain tissue of individuals who did not have a corresponding degree of cognitive impairment prior to death (Katzman et al., 1988). The concept of cognitive reserve (CR) was introduced as an important explanation for this disconnect between pre‐ and postmortem disease phenotypes. This chapter reviews the theoretical basis of CR and the putative neural mechanisms underlying the construct. Potential pathways to enhance CR throughout the lifespan are also discussed.
Katzman et al. (1988) reported greater brain weight and neuronal cell counts in cognitively normal individuals with neuropathological evidence of Alzheimer's disease (AD) compared with cognitively impaired individuals with a similar degree of neuropathology. This observation became the basis for the "passive" model of reserve, otherwise known as brain reserve (Satz, 1993). The brain reserve model emphasizes premorbid structural brain integrity, which naturally varies across individuals. Those with higher reserve (more robust neuronal integrity) prior to an injury or the onset of disease are less likely to express symptoms than individuals with lower reserve. Brain reserve is often described as a quantitative model because biological measures of brain integrity are summed to calculate a quantitative index. For example, neuronal/synapse count (Wilson et al., 2013), head circumference (Perneczky et al., 2010), and brain/intracranial volume (ICV) (Sumowski et al., 2013) have been utilized as markers of brain reserve. However, not all studies report significant associations between these markers and clinical outcomes (Jenkins, Fox, Rossor, Harvey, & Rossor, 2000; Tate et al., 2011), suggesting that additional factors shape individual outcomes despite similar levels of neuropathology.
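To make the quantitative character of this model concrete, the short sketch below standardizes several structural markers against normative values and sums the resulting z‐scores into a single index. This is a minimal illustration only: the marker names, reference means and standard deviations, and the unweighted z‐score sum are hypothetical choices, not a formula taken from the studies cited above.

# Illustrative sketch of a quantitative brain reserve index.
# Reference means/SDs below are hypothetical normative values.
REFERENCE = {
    "icv_cm3":       (1450.0, 130.0),  # intracranial volume
    "head_circ_cm":  (56.0, 2.0),      # head circumference
    "synapse_index": (100.0, 15.0),    # synaptic density estimate
}

def brain_reserve_index(measures: dict) -> float:
    """Sum of z-scores across available structural markers."""
    z_scores = []
    for marker, value in measures.items():
        mean, sd = REFERENCE[marker]
        z_scores.append((value - mean) / sd)
    return sum(z_scores)

# One individual's (hypothetical) measurements.
print(brain_reserve_index({
    "icv_cm3": 1520.0,
    "head_circ_cm": 57.0,
    "synapse_index": 110.0,
}))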
CR was proposed as an alternative to brain reserve. CR is described as an "active" model that emphasizes brain function (versus brain structure) and dynamic (versus static) mechanisms (Stern, 2002). As with brain reserve, a person with high CR would be expected to perform better on testing or to have a better clinical outcome than an individual with lower CR at any given level of neuropathology. Importantly, once initial symptoms are expressed, individuals with higher CR experience faster disease progression than those with low CR. These results suggest that a critical threshold governs disease manifestation, such that clinical evidence of severe cognitive dysfunction or dementia is delayed until the threshold of pathology has been met (Hall et al., 2009).
CR is quantified using proxy measures including education (Stern et al., 1994), occupational attainment (Stern et al., 1994), premorbid IQ (Armstrong et al., 2012), literacy (Manly, Schupf, Tang, & Stern, 2005), leisure activities (Scarmeas, Levy, Tang, & Stern, 2001), social networks (Bennett, Schneider, Tang, Arnold, & Wilson, 2006), and bilingualism (Gollan, Salmon, Montoya, & Galasko, 2011). Nucci, Mapelli, and Mondini (2012) demonstrated the added value of aggregating multiple CR proxies into a single composite, referred to as the Cognitive Reserve Index.
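A minimal sketch of this kind of proxy aggregation appears below. It z‐scores each proxy across a sample and averages the standardized values per person. The proxy names, scales, and equal weighting are illustrative assumptions; the published Cognitive Reserve Index questionnaire uses its own scoring rules.

from statistics import mean, stdev

# Illustrative aggregation of CR proxies into a composite index.
# Proxy names, scales, and equal weighting are hypothetical.
def composite_cr_index(sample: list) -> list:
    """Z-score each proxy across the sample, then average per person."""
    proxies = sample[0].keys()
    stats = {p: (mean(d[p] for d in sample), stdev(d[p] for d in sample))
             for p in proxies}
    return [mean((d[p] - stats[p][0]) / stats[p][1] for p in proxies)
            for d in sample]

# Hypothetical participants: education (years), occupation rank (1-5),
# and weekly count of leisure activities.
participants = [
    {"education_years": 16, "occupation_rank": 4, "leisure_per_week": 5},
    {"education_years": 8,  "occupation_rank": 2, "leisure_per_week": 1},
    {"education_years": 12, "occupation_rank": 3, "leisure_per_week": 3},
]
print(composite_cr_index(participants))  # [1.0, -1.0, 0.0]

Standardizing before averaging keeps proxies measured on different scales (years, ranks, counts) from dominating the composite, which is the basic rationale for aggregate indices of this kind.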
Stern et al. (1994) conducted one of the first studies of CR and risk of AD in a large group of older adults. Individuals with low CR (<8 years of education) had a 2.2 times higher risk of developing dementia over a four‐year period compared with those with high CR (≥8 years of education). Additionally, individuals with lower occupational attainment (e.g., skilled trade, office worker) were more than twice as likely to develop AD over the same four‐year period compared with those with higher occupational attainment (e.g., manager, technical professional). A follow‐up study reported that individuals who participated in more frequent leisure activities (high CR) had a 38% lower risk of developing dementia than those engaged in fewer leisure activities (Scarmeas et al., 2001). Perhaps most importantly, late‐life modifications to CR lower the risk of subsequent dementia, suggesting that neuronal plasticity persists across the older adult age range (Valenzuela & Sachdev, 2006).
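To make these risk figures concrete, the sketch below computes a relative risk from cohort counts. The counts are hypothetical and were chosen only to reproduce ratios of the magnitude reported above; note that a 38% lower risk corresponds to a relative risk of 0.62.

# Illustrative relative-risk arithmetic with hypothetical cohort counts.
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Risk ratio: incidence in exposed group / incidence in unexposed group."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical four-year cohort: low-education vs. high-education groups.
print(relative_risk(22, 100, 10, 100))  # -> 2.2, as in Stern et al. (1994)

# A 38% lower risk with frequent leisure activity implies RR = 1 - 0.38.
print(1 - 0.38)  # -> 0.62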
Higher CR is associated with better cognitive health in normal aging (Baker et al., 2017), AD (Scarmeas et al., 2003), and numerous neurological conditions including cerebrovascular disease (Brickman et al., 2009; Lane, Paul, Moser, Fletcher, & Cohen, 2011), human immunodeficiency virus (HIV) (Foley et al., 2012), and Parkinson's disease (Hindle, Martyr, & Clare, 2014). Other studies report the beneficial effects of higher CR on cognitive outcomes in traumatic brain injury (Kesler, Adams, Blasey, & Bigler, 2003), multiple sclerosis (Nunnari et al., 2016), Lewy body disease (Perneczky et al., 2008), hepatitis C (Bieliauskas et al., 2007), and frontotemporal lobar degeneration (Placek et al., 2016).
The literature has utilized two broad terms (neural reserve and neural compensation) to describe the underlying neural mechanisms of CR (Stern, 2017). Neural reserve refers to the interindividual variability in brain regions or networks that contribute to cognitive performance and the resistance of these networks to neuropathology. Neural compensation refers to the recruitment of additional/alternative brain regions or networks to perform a cognitive task among individuals with brain degeneration.
Neural reserve encompasses the efficiency and capacity of the brain regions or networks needed to successfully perform a given task. Efficiency refers to how little activation a network requires to achieve a given level of performance, whereas capacity refers to the maximum degree to which a brain network can be activated to perform a cognitive task, particularly as tasks become increasingly challenging. Accordingly, a person with higher capacity is more likely to succeed at a difficult, resource‐demanding cognitive task than someone with lower capacity. Neuroimaging studies of neural reserve, CR, and cognitive performance suggest that these concepts apply most readily to young adults with intact brain networks and limited neuropathology (for reviews see Anthony & Lin, 2018; Steffener & Stern, 2012; Stern, 2017).
Neural compensation is typically studied within the context of age‐related changes in brain networks as well as pathological (e.g., AD) changes in the brain. Under the neural compensation model, once a primary network reaches capacity or loses function, an alternate (or "compensatory") network is activated to facilitate task completion. Successful performance, however, is not guaranteed. For example, older adults who engage compensatory networks to perform a task tend to exhibit worse overall performance than older adults who do not require engagement of brain regions beyond the primary network (Zarahn, Rakitin, Abela, Flynn, & Stern, 2007). Both neural reserve and neural compensation are associated with CR (Habeck et al., 2003).
In terms of anatomical substrates, Stern et al. (2008) identified the superior frontal gyrus as key to CR‐related benefits. In younger individuals, medial temporal lobe regions (e.g., medial temporal gyrus, parahippocampal gyrus) correspond with higher CR in a manner suggestive of neural reserve, whereas in aging (independent of AD), neural compensation is associated with frontal regions (e.g., superior frontal gyrus, inferior frontal gyrus) (Anthony & Lin, 2018). Additionally, resting‐state functional MRI reveals that activity in the default mode network and the dorsal attention network correlates with CR in older adults (Arenaza‐Urquijo et al., 2013; Bastin et al., 2012).
Life experiences exert a greater influence on CR than on brain reserve. For example, Richards and Sacker (2003) reported strong associations between CR and adult occupational attainment. Other studies suggest that participation in mentally stimulating social or leisure activities (e.g., knitting, gardening) is associated with a lower risk of dementia in older adults (Scarmeas et al., 2001). One study of community‐dwelling older adults estimated that each additional day per week of self‐reported participation in cognitively stimulating leisure activities delayed the onset of accelerated memory decline by 0.18 years; five activity days per week would thus correspond to a delay of roughly 0.9 years (Hall et al., 2009). These results suggest that mental stimulation increases CR and provides protection against cognitive decline, even when implemented in late adulthood. Preliminary evidence indicates these effects may extend even to individuals with mild to moderate AD (Mondini et al., 2016).
CR serves as a protective mechanism against the clinical expression of brain injury and disease. Individuals with higher CR are more likely to exhibit better cognitive performance and a lower risk of expressing clinically relevant symptoms of brain dysfunction. Additionally, the neuroimaging literature supports a neural basis for CR through the identification of specific regions and networks that are associated with proxies of CR, such as education and occupational attainment. Future research is needed to examine the long‐term outcomes of interventions designed to bolster CR throughout the lifespan.
Sarah A. Cooley, PhD, is a research data analyst in the Department of Neurology at Washington University in St. Louis School of Medicine. Her research interests include the neuropsychology of aging in HIV, specifically examining changes in cognitive functioning in older HIV‐positive individuals compared with normal aging, as well as the neuroimaging correlates of these cognitive changes.