© Springer International Publishing Switzerland 2015
David D. Schwartz and Marni E. Axelrad, Healthcare Partnerships for Pediatric Adherence, SpringerBriefs in Public Health, DOI 10.1007/978-3-319-13668-4_6

6. Adherence in Adolescence

Associate Professor of Pediatrics, Department of Pediatrics, Section of Psychology, Baylor College of Medicine, Houston, Texas, USA
Abstract
An understanding of adolescence and adolescent development is critical for clinicians who wish to help their patients with adherence and illness control. Many teens seem perfectly capable of managing a chronic illness, yet to the surprise of many clinicians, adherence is often at its worst in adolescence, as is chronic illness control. In this chapter we explore the reasons why adherence in adolescence is so challenging and frustrating for patients, parents, and providers alike. We argue that some degree of nonadherence is actually likely to be normative, because the developmental, neurodevelopmental, and cognitive changes of this period are almost antithetical to maintaining consistent adherence behaviors. Continued parent involvement is therefore a key component of successful illness management in the teenage years, although this involvement is not without its own challenges and costs. We conclude that supporting patient autonomy (i.e., volitional behavior) without pushing youth independence (i.e., acting alone) can foster youth development without necessitating a withdrawal of needed parental assistance with illness management.
Tomorrow’s life is too late: live today.
—Martial, Epigrams, bk. I, epig. 15. (A.D. 85)
Adolescents struggle more with adherence than any other age group—in fact, adherence is often at its worst in adolescence. Youth with chronic illness experience declining illness control, higher incidence of serious consequences of nonadherence such as organ graft failure and diabetic ketoacidosis, increased stress and depression, and decreased quality of life. At the same time, they find themselves more on their own with illness management, with less support and involvement from parents, and in many cases parent-child conflict increases. Moreover, illness-management goals may conflict with or impede attainment of the normal goals of adolescent development such as achieving a sense of individuality (Seiffge-Krenke 1998). For these reasons, many health professionals find “that managing the complexity and range of health concerns in adolescents is more challenging than for other age groups” (Sawyer et al. 2007).
Adolescence is also a time of significantly increased behavioral risk. In fact, morbidity and mortality in teenagers are primarily attributable to risk-taking and health-risk behaviors such as alcohol and drug use and reckless driving (Kann et al. 2013). In this chapter we argue that these phenomena are related—that nonadherence and risk-taking are both largely attributable to the normal neurodevelopmental changes that occur in adolescence, and to the consequent changes that occur in the parent-child relationship. We review the evidence from developmental neurobiology and current theories of risk-taking to develop the case that nonadherence in adolescence can in many circumstances be considered either a direct risk-taking behavior itself, or as the result of developmental factors that contribute to risk-taking.
Adolescence—Definition
Adolescence is a time of dramatic changes—physically, mentally, socially. It is a time of opportunities and new experiences, as teens begin to separate from their parents and spend more time with friends, explore romantic and sexual relationships for the first time, and take more responsibility for themselves and for their lives. Many of the challenges of adolescence provide formative experiences that help prepare the youth for the transition into adulthood.
Adolescence is popularly associated with the teenage years, although most researchers now see the period as lasting longer. The American Academy of Pediatrics defines adolescence as the period from 11 to 21 years of age (https://brightfutures.aap.org/pdfs/Guidelines_PDF/18-Adolescence.pdf). Others have defined the period functionally, as “the period of life that starts with the biological changes of puberty and ends at the time at which the individual attains a stable, independent role in society” (Blakemore and Robbins 2012). Colver and Longwell (2013) suggest that adolescence “should be considered to extend from 11 to 25 years of age” so as to reflect the substantial brain development that occurs during this period. Interestingly, the latter formulation overlaps with what has recently been termed “emerging adulthood.” As noted by Arnett (2004), who coined the term, “For today’s young people, the road to adulthood is a long one. They leave home at age 18 or 19, but most do not marry, become parents, and find a long-term job until at least their late twenties.” Adherence remains quite challenging throughout this period, which encompasses the transition from pediatric to adult care, a topic deserving a volume in its own right. In this chapter, however, we focus primarily on the period in which most youth are still living in their parents’ homes, i.e., from puberty until around age 18.
Adherence in Adolescence
Health management habits that are established in adolescence set the stage for later self-management. Nonadherence tends to start in adolescence (Kovacs et al. 1992) and, once established, can persist into adulthood. As noted by Rapoff (2010) in his seminal book on pediatric adherence, adolescents are more likely than younger children or adults to adhere poorly to their medical regimen, regardless of the chronic illness in question. Worse adherence in adolescence has been documented for youth with asthma, cancer, cystic fibrosis, diabetes, HIV/AIDS, juvenile rheumatoid arthritis, and organ transplant (Rapoff 2010), among other conditions.
One challenge to adherence at this stage of development is that physical changes associated with puberty can make illness control more difficult. Growth spurts and hormonal changes can reduce the effectiveness of medication. Changes in the immune system can place organ transplant patients at greater risk for graft failure. For youth with diabetes, hormonal changes can also cause blood sugars to increase while insulin sensitivity decreases (Amiel et al. 1986; Helgeson et al. 2009).
These physical changes can make good illness control an unattainable goal for many teens, even when they complete all management tasks as prescribed. This is especially true when healthcare providers and clinical guidelines establish tight parameters for “good” control. For example, current guidelines for glycemic control in youth with type 1 diabetes recommend maintaining hemoglobin A1c below 7.5 % (or below 7 % if this can be achieved “without excessive hypoglycemia”; American Diabetes Association 2014), as lower A1c has been associated with reduced risk for complications. The problem is that this is not an achievable goal for many youth with T1D due to factors outside of their control, setting them up for failure and frustration.
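To make the guideline concrete, its target logic reduces to a simple conditional rule. The sketch below is purely illustrative; the function names and structure are ours, not part of the ADA guideline or any clinical software:

```python
def a1c_target(excessive_hypoglycemia: bool) -> float:
    """Return the applicable A1c target (%) per the ADA (2014) pediatric guideline.

    The general pediatric target is < 7.5%; the stricter < 7.0% target is
    recommended only when it can be achieved without excessive hypoglycemia.
    """
    return 7.5 if excessive_hypoglycemia else 7.0


def meets_target(a1c_percent: float, excessive_hypoglycemia: bool) -> bool:
    """True if the measured A1c falls below the applicable target."""
    return a1c_percent < a1c_target(excessive_hypoglycemia)


# Example: an A1c of 7.2% meets the general target but not the stricter one.
print(meets_target(7.2, excessive_hypoglycemia=True))   # True  (target is 7.5%)
print(meets_target(7.2, excessive_hypoglycemia=False))  # False (target is 7.0%)
```

Even this tiny rule shows how easily a teen whose A1c hovers at, say, 8 % can fall short of “good” control despite completing every prescribed task.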
These frustrations are compounded when parents and healthcare providers believe that youth are doing less to manage their illness than they actually are. Frustration can give way to burnout, leaving many youth feeling “hopeless and helpless” and questioning whether management is worth all the effort. Some simply give up. Even more concerning, mental illnesses such as depression and anxiety often have their onset in adolescence (Kessler et al. 2005), adding an additional layer of risk for teens with chronic conditions (see Chap. 3).
Normal neurodevelopmental changes that occur post-puberty also contribute to the decline in adherence. Adolescence is associated with an increase in the sensation-seeking and reward-seeking behaviors that underlie much general risk-taking in teens. This increase occurs prior to the maturation of the cognitive control networks that underlie adult self-regulation and help temper impulses toward immediate gratification. At the same time, there is an increasing shift toward greater independence with less parent oversight, which reaches its peak by later adolescence when many teens can drive, further limiting parents’ ability to monitor their behavior. Together, these factors create a “perfect storm” of increased risk-taking and increased opportunities for risk-taking, with serious implications for adherence.
Nonadherence as Risk-taking Behavior
Heightened risk taking during adolescence is likely to be normative, biologically driven, and, to some extent, inevitable.—Steinberg 2008.
Risk-taking behavior characterizes adolescence. Of course, not all adolescents are risk takers, but the evidence is clear that risk-taking behavior spikes in adolescence, and the ramifications are profound. According to the Youth Risk Behavior Surveillance study (Kann et al. 2013), the leading causes of morbidity and mortality among youth in the United States are related to six health-risk behaviors: (1) behaviors that contribute to unintentional injuries and violence; (2) tobacco use; (3) alcohol and other drug use; (4) risky sexual behaviors; (5) unhealthy diet; and (6) physical inactivity. As the authors note, “these behaviors frequently are interrelated and are established during childhood and adolescence and extend into adulthood.”
In youth with chronic illness, nonadherence to the medical regimen can potentially be added to this list of health-risk behaviors. In fact, in many instances nonadherence can be seen as a risk-taking behavior, as has been acknowledged by a number of authors (Bender 2006; Kondryn et al. 2011; Sawyer et al. 2007; Taddeo et al. 2008). Every skipped insulin dose or immunosuppressive pill entails some risk. This is not to imply that nonadherence is always or even most often intentional, the result of a reasoned decision-making process (e.g., I’m going to stop taking my Metformin because it isn’t helping me anyway). As noted earlier, nonadherence can often result from a spur-of-the-moment decision not to engage in a specific behavior at a specific time (e.g., If I miss this one dose, it won’t hurt me). Based on clinical experience, we would argue that these spur-of-the-moment risky decisions are very common among teens who struggle with adherence, and that “nonintentional but volitional” risk behavior (Gerrard et al. 2008) may well be characteristic of teens.
Two Paths to Risk-taking
When asked, most adolescents say they have no intention of engaging in behaviors that put their health at risk; and yet, when given the opportunity, many of them do.—Gibbons 2008
For teens, risk-taking is often unplanned and opportunistic, a reaction to social circumstances. Many teens will deny having any intention to engage in a risky behavior (such as getting into a car with a drunk driver) yet will acknowledge that they may be willing to do so if the situation arises (Gibbons et al. 2005). The propensity to take an opportunity for risk when it arises has been termed behavioral willingness (Gibbons et al. 2006). Gibbons, Gerrard, and their colleagues have shown that behavioral willingness is a better predictor of teen health-risk behaviors than behavioral intentions, which are typically arrived at through a deliberative, goal-oriented process (Gibbons et al. 1998; Gibbons et al. 2004). On the other hand, behavioral intentions are very strong predictors of health maintenance behaviors, at least in adults (Gibbons 2008).
Behavioral willingness—and risky decision-making in general—appears to be enhanced in social contexts and emotionally exciting situations (Gerrard et al. 2003), i.e., “in the heat of the moment.” This brings us back to the “hot” and “cool” systems involved in self-regulation discussed earlier (Metcalfe and Mischel 1999). In general, adolescents tend to perform like adults on laboratory tasks assessing “cool” decision-making, although risk-taking can be elicited even in the lab when social factors come into play.
In an oft-cited study (Gardner and Steinberg 2005), teens, younger adults, and older adults were asked to play a computer driving game of “chicken.” The player accumulated more points the farther the car went, but had to stop at a red light or crash (crashing wiped out all of the points). When a yellow light appeared, “players had to decide how much further to allow the car to move, balancing their desire to accumulate points against the possibility of crashing.” The distance the car traveled after the yellow light provided the measure of risk-taking. All three groups performed similarly when playing the game alone, but when subjects played the game with other people in the room (the social condition), dramatic differences emerged. The teens played far more riskily than the young adults, who in turn played more riskily than the older adults; moreover, the older adults did not change their play at all in the social condition.
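For readers who want to see the task’s risk measure in concrete terms, here is a minimal toy simulation of the scoring logic described above. Every name and parameter (the per-step crash probability, the 0-to-1 “willingness” scale, the step cap) is our own illustrative assumption, not the actual implementation used by Gardner and Steinberg (2005):

```python
import random


def play_chicken_round(willingness: float, rng: random.Random) -> int:
    """Toy model of one round of the driving game.

    After the yellow light appears, the player keeps the car moving for a
    number of steps proportional to their risk propensity ('willingness',
    assumed to range from 0 to 1). Each extra step earns a point but risks
    a crash, which wipes out all points for the round.
    """
    crash_prob = 0.05              # assumed per-step crash probability
    steps = int(willingness * 20)  # riskier players keep driving longer
    points = 0
    for _ in range(steps):
        if rng.random() < crash_prob:
            return 0               # crash: all accumulated points are lost
        points += 1
    return points


rng = random.Random(0)
# A highly willing player (e.g., a teen in the social condition) versus a
# cautious one: longer runs mean more points on average but more crashes.
teen_mean = sum(play_chicken_round(0.9, rng) for _ in range(1000)) / 1000
adult_mean = sum(play_chicken_round(0.3, rng) for _ in range(1000)) / 1000
print(f"teen-like player: {teen_mean:.1f}, adult-like player: {adult_mean:.1f}")
```

The point of the sketch is simply that “distance traveled after the yellow light” is a direct behavioral index of risk: the only way to score higher is to accept a higher chance of losing everything.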
Not all decisions are made on the spur-of-the-moment, of course. In fact, most of the research on health behavior in adults has operated under the assumption that decision-making reflects a reasoned process of weighing possible outcomes and then deciding to act (the behavioral intention) based on expectations of success and the subjective values assigned to each outcome (Cohen 1996).
Based on these findings, Gibbons et al. (1998) postulated that there are two pathways to risk-taking behavior: a “reasoned” pathway in which people acknowledge and accept the possibility of negative outcomes but engage in the behavior anyway, and a “reactive” pathway in which risk-taking results from unexpected opportunities that occur most commonly in social situations. The reasoned pathway (assessed by measuring behavioral intentions) is a stronger predictor of health behaviors, whereas the reactive pathway (assessed by measuring behavioral willingness) is a better predictor of risk behaviors (Gibbons 2008). Reyna and Farley (2006) offer a similar typology of risk-takers, differentiating between risky deliberators, who rationally weigh the costs and benefits of decisions, and risky reactors, who take risks more impulsively. As we will see in the next section, current research in both developmental neurobiology and cognitive psychology supports this dual-pathway view.
Clinical Implications of the Dual-pathway Model of Risk
It is very important for clinicians to recognize the distinction between “cool” competence and “hot” reactivity in their patients. Healthcare providers assess their patients’ knowledge, understanding, and intentions in the exam room, a cool-system setting where youth are likely to appear more competent and capable than they will be in “real life.” A patient may be able to answer every question about her illness and its management, but that does not mean she will be able to complete all management behaviors in the face of competing demands (especially social demands).
On occasion, we have heard clinicians suggest that patients are lying when their intentions don’t match up with their behaviors, but theory and research suggest a different explanation. Most teens are probably being quite honest when they say that they intend to take all their medicine, do a better job following dietary restrictions, and so on, but their stated intentions may not capture their willingness to deviate from prescribed care if certain opportunities arise. They may also underestimate their willingness to deviate when queried in cool settings. When the hot system is quiescent, the cool system is better able to show what it can do, which can lead adults to overestimate a youth’s reasoning potential in other settings. In fact, there is some evidence that children may set overly high goals for themselves in the presence of adults; this has been found for children with asthma and diabetes (Hilliard et al. 1985) and children with cancer (Elkin et al. 1998).
Neurodevelopmental Changes in the Adolescent Brain
Recent evidence from neurodevelopmental and neurobiological research suggests that adolescent risk-taking might be the expectable result of normal maturational processes. Specifically, it has been posited that the greater vulnerability to risk-taking in adolescence results from a “temporal disjunction” between the maturation of two brain systems: a social-emotional network that underlies reward-seeking behavior, which peaks in mid-adolescence, and a cognitive-control network that develops more slowly and only reaches maturity in early adulthood (Steinberg 2010). Evidence for these brain changes is reviewed below, after which the discussion will turn to the implications of these findings for adherence.
It is now understood that the brain goes through substantial changes in adolescence, almost as dramatic as those in the first few years of life (Colver and Longwell 2013). It is only a slight exaggeration to say that the adolescent is not the same person as he or she was as a child. First, there are changes in the relative distribution of cerebral gray and white matter (Paus et al. 1999; Lenroot and Giedd 2006). Gray matter peaks at the start of adolescence and declines thereafter, while white matter throughout the brain increases steadily into adulthood, as the result of increasing myelination, increasing axonal diameter, or both. It is currently unclear whether gray matter is “pruned” or whether increasing myelination converts gray to white, but the important outcome of this process is that brain connectivity increases and neural networks become more efficient and probably take on new functional roles (Giedd 2008; Power et al. 2010). Developmental changes in three processing networks that likely play a role in adherence are discussed below.
The social-emotional reward system. In early adolescence there is a dramatic increase in the brain’s sensitivity to reward and to social-emotional stimuli. There is a surge in activity of the neurotransmitter dopamine in pathways linking subcortical areas involved in emotion processing (the limbic system, especially the amygdala) and reward sensitivity (ventral striatum, nucleus accumbens) to the frontal lobes, starting around puberty, increasing through mid- and late adolescence, and declining thereafter (Galván et al. 2006; Steinberg 2010). Dopamine plays an important role in reward-seeking and motivated behavior, and both human and animal studies show that reward-seeking behaviors increase dramatically after puberty (Steinberg 2010). Moreover, the neural reward system sketched above is especially sensitive to immediate reward (McClure et al. 2004), and there is good evidence that a preference for immediate over delayed reward characterizes many teens (Blakemore and Robbins 2012).
Thus, adolescents appear neurodevelopmentally primed to seek out experiences and engage in behaviors that are immediately rewarding (regardless of whether they may have long-term consequences), and their willingness to do so is highly sensitive to social context (Gerrard et al. 2003). It might even be said that heightened activity in the social-emotional network increases behavioral willingness to take risks (Pomery et al. 2009). It makes evolutionary sense that brain systems that underlie social approach and reward-seeking behavior would spike with the onset of reproductive maturity (Casey et al. 2008).
The cognitive-control system. The white matter development that occurs throughout adolescence is especially dramatic in the frontal lobes, which are the last region of the brain to fully mature. The frontal lobes are associated with the development of the executive functions (e.g., planning, organization, working memory) that are so necessary for self-regulation of behavior. There is clear evidence that executive functions play a critical role in chronic illness management (e.g., Duke and Harris 2014), and more generally in cognitive control. Converging evidence from studies on adolescent brain development strongly supports the conclusion that this period is characterized by a progressive increase in cognitive control (Yurgelun-Todd 2007).
Frontal lobe development is only part of the story. There are also dramatic changes in the wiring between frontal control areas and many other regions of the brain, including the limbic system, which is integrally involved in emotion, and the subcortical reward system. The result of this increased functional connectivity is a gradual increase in cognitive control over emotional reactivity, an increased ability to delay gratification, and (probably) decreased risk-taking behavior (Olson et al. 2008; Steinberg 2010; but see Berns et al. 2009). As Reyna and Rivers (2008) note, one of the most important developments in adolescence “is the coordination (through improved connectivity) between cortical and subcortical limbic regions—the dance between affect and thinking.” However, affect leads this dance into late adolescence, and it is only by the middle of the third decade of life that thinking—cognitive control—takes the lead. This is why car insurance rates are so much higher before age 25, and why car rental companies often will not rent to drivers younger than 25.
Critically, neither social-emotional reactivity nor immature frontal lobe functioning is by itself sufficient to account for increased risk-taking in adolescence; it is the combination of these factors, and the temporal gap between their development, that creates such a potent vulnerability to risk (Casey et al. 2010; Steinberg 2010). If adolescent risk-taking were simply a result of immature frontal lobe functioning, children would engage in far more risky behavior than adolescents, and this is simply not the case, as evidenced by the alarming spike in risk-taking with the onset of puberty.
An alternative (though not mutually-exclusive) view is that increased activity in the social-emotional network in adolescence may actually drive subsequent development of frontal control networks (Bernheim et al. 2013)—in other words, that development of cognitive control may be dependent on experiences gained at least in part through normative risk-taking. In this view, risk behaviors may “present adaptive benefits” by allowing adolescents to gain “skills for survival in absence of parental protection.” Of course, the fact that risk-taking may have been evolutionarily adaptive does not necessarily mean that it remains so in the modern world, as the types of risk opportunities have changed (e.g., availability of drugs, guns, and cars) and social constraints have loosened.
Neuroimaging studies of reward processing and decision-making support the developmental lag hypothesis, as they have revealed clear differences in the ways adolescent and adult brains process risk. Adolescents show different neural activation patterns from adults on executive decision-making tasks (Luna et al. 2010), especially when making decisions about risk (Ernst et al. 2005; Galván et al. 2006). Differences in orbitofrontal cortex activation have been especially noted. Compared to children and adults, adolescents show increased activity in the nucleus accumbens relative to the orbitofrontal cortex in response to reward (Galván et al. 2006), consistent with the hypothesis that adolescence is characterized by increased reward responsivity with relatively diffuse cognitive control. Consistent with the neurobiological evidence, data from cognitive studies suggest that when risks and rewards are directly compared, rewards win out for teens but not for adults (Reyna and Farley 2006). Both cross-sectional and longitudinal neuroimaging studies have demonstrated that adolescents’ neural activation patterns become increasingly adult-like as they exhibit more cognitive control (Galván and Rahdar 2013).
The default network. A third brain system that appears to “come online” during adolescence is the so-called default mode network or default network, a distributed system that includes the frontal lobes, posterior cingulate cortex, and lateral parietal/occipital cortices (especially the cuneus and precuneus) (Buckner et al. 2005). Neuroimaging studies have shown that the default network is only sparsely connected or fragmented in children (Fair et al. 2008) and likely goes through significant developmental change throughout adolescence (Blakemore 2012).
The default network becomes activated during resting but awake states and deactivated during goal-directed activity (Broyd et al. 2009). Although its role in cognition is currently debated, it is believed to be involved in introspective thought of some sort, possibly including mental imagery and the creation and review of mental models and alternative possibilities (Buckner et al.), and/or in social cognition (Supekar et al. 2010). There is accruing evidence that the default network might be disrupted by poorly controlled type 1 diabetes (Kaufmann et al. 2011; Perantie et al. 2007), and it has been speculated that default network dysfunction might contribute to adherence difficulties by making it more difficult for individuals to think through the possible consequences of their actions (e.g., what might happen if a diabetic teen does not take his insulin; Schwartz et al. 2014).
Clinical Implications of the Neurodevelopmental Evidence
The neurodevelopmental data strongly support the hypothesis that adolescents are driven by increased social-emotional reward sensitivity while lacking the control mechanisms to temper reward-seeking impulses (Galván et al. 2006). Moreover, they are likely to show a preference for immediate over delayed reward (Blakemore and Robbins 2012), and these qualities are heightened in the heat of the moment, when social-emotional rewards are high.
These factors would seem to make it more likely that they would skip a medication dose when asked to go out with friends, or forgo dietary restrictions when snacks are available and parents are not around, than to put those immediately rewarding behaviors aside in favor of the long-term health gains that come from good adherence. To borrow from McClure et al. (2004), the neurodevelopmental data suggest that adolescents are more likely to act like the impatient and self-indulgent grasshopper from Aesop’s fable, and less like the patient ant who carefully prepares for the long winter.
Adolescents’ increased vulnerability to risk—and the alarming statistics regarding risk-related morbidity and mortality in youth—has led many researchers and professionals with an interest in public health to examine ways in which to reduce these risks and their negative outcomes. This research is discussed more fully in the next section, but for now we will note that one of the few effective approaches to reducing risk has involved reducing opportunities to engage in risk through parental monitoring and supervision (Reyna and Farley 2006; cf. Gibbons et al. 2003). Steinberg (2008) sums this view up nicely:
Strategies such as raising the price of cigarettes, more vigilantly enforcing laws governing the sale of alcohol, expanding adolescents’ access to mental-health and contraceptive services, and raising the driving age would likely be more effective in limiting adolescent smoking, substance abuse, pregnancy, and automobile fatalities than strategies aimed at making adolescents wiser, less impulsive, or less shortsighted. Some things just take time to develop, and, like it or not, mature judgment is probably one of them.
Applying this logic to adherence would mean maintaining a relatively high level of vigilance over illness-management behaviors and reducing opportunities for nonadherence. However, it is also important to acknowledge the opposing view that risk-taking is important for development, that risk behaviors allow adolescents to gain “skills for survival in absence of parental protection” (Bernheim et al. 2013). In this view, reducing risk might reduce opportunities for learning. For example, experiencing an episode of DKA might teach a diabetic teen the dangers of poor adherence to insulin, with hospitalization providing a “wake-up call” that results in better adherence in the future. Arguing against this idea is the evidence that nonadherence in adolescence predicts nonadherence in adulthood. Of course, no one would argue that a teen should be allowed to go into DKA, given the health risks involved, but we have certainly heard the perspective that teens need the freedom to “figure things out for themselves,” which, when it comes to illness management, will inevitably involve some risk.
In the end, it is probably a matter of degree—all parents face the challenge of allowing their children to make mistakes they can learn from while still ensuring their health and safety (Sawyer and Aroni 2005). The question becomes how much risk and how many mistakes are allowed before parents step up their level of supervision. We have here arrived back at one of the central concerns of this book: the tension between parental behavioral control and autonomy support. As we discuss later in this chapter, this tension characterizes changes in the parenting role during adolescence.
Cognitive Factors in Adolescent Decision-making
Brain systems implicated in basic cognitive processes reach adult levels of maturity by mid-adolescence, whereas those that are active in self-regulation do not fully mature until late adolescence or even early adulthood. In other words, adolescents mature intellectually before they mature socially or emotionally, a fact that helps explain why teenagers who are so smart in some respects sometimes do surprisingly dumb things.—Steinberg 2013
Alongside this significant reorganization of brain structure and function, changes also occur in the ways adolescents think. Cognitive changes in adolescence are at least as dramatic as the neurodevelopmental ones.
Many healthcare providers believe that adolescents who do not follow their regimen must not understand how to do it correctly, or why it is important to do so. Yet by mid-to-late adolescence, many youth perform similarly to adults on most reasoning tasks (Steinberg 2010). Unfortunately, we know of no studies directly comparing parent and youth knowledge of illness and illness management (cf. DeWalt and Hink 2009), although there is no reason to expect that older youths would be less capable of reasoning about illness management than their parents. In fact, research suggests that teens are most likely to show adult-like patterns of thinking in areas that are most familiar to them (Carey 1988), which is a good characterization of illness management for teens who have been living with their illness for a while.