Australia has reported the highest rates of food allergy documented using the gold standard of oral food challenge. This phenomenon, which appears linked to the “modern lifestyle” and has coincided with the explosion of new diseases of affluence in the 21st century, dubbed “affluenza,” has spurred a multitude of theories and academic investigations. This review focuses on potentially modifiable lifestyle factors for the prevention of food allergy and presents the first data to emerge in the Australian context, centered on the dual allergen exposure hypothesis, the vitamin D hypothesis, and the hygiene hypothesis.
Key points
- Food allergy is on the rise in developed countries and has been well-described in Australia using challenge-proven outcomes. It is believed to be linked to the modern-day lifestyle.
- The 3 key hypotheses for the rise in food allergy in the 21st century are currently (1) the hygiene hypothesis (which includes microbial diversity), (2) the dual allergen exposure (or Lack) hypothesis, and (3) the vitamin D hypothesis. There are as yet few published data with regard to other factors pertaining to food allergy as an outcome, although there are many studies in progress.
- High rates of food allergy in infants of Asian migrants provide a unique opportunity to explore possible explanations for this modern-day phenomenon.
Introduction
Food allergy appears to have risen in many developed countries around the world, but none more so than in Australia. We reported in 2011 that in a population cohort of more than 5000 1-year-old infants, more than 10% had evidence of challenge-proven food allergy. Although there are now many hypotheses as to why food allergy appears to be rising worldwide, until recently there has been little direct evidence formally evaluating risk factors in populations in which the rise has been demonstrated. The current leading hypotheses of postnatal modifiable factors for the rise in food allergy are (1) the “dual allergen exposure” or Lack hypothesis, (2) the vitamin D hypothesis, and (3) the hygiene hypothesis (which includes factors associated with microbial diversity and the modern lifestyle). This review presents insights from one of the first large-scale studies to formally assess these hypotheses, the Healthnuts study, in which food allergy was confirmed by oral food challenge in all food-sensitized infants. By reviewing the literature, particularly studies that use the gold standard of oral food challenge, this article aims to identify the lifestyle and environmental factors that might be driving the Australian epidemic and reviews other potential hypotheses that are as yet unstudied but may also contribute to this perplexing phenomenon of the 21st century.
How convincing is the evidence for a 10% prevalence of food allergy in Australia?
The Healthnuts study provided evidence of unexpectedly high rates of challenge-proven immunoglobulin (Ig)E-mediated food allergy in infants in Melbourne, an urban population in Australia’s most southern mainland city. These findings may not be generalizable to other, more rural areas of the state of Victoria because of differences in the distribution of potentially protective factors, such as microbial exposure linked to contact with livestock or other rural factors. However, the prevalence of peanut allergy in Healthnuts (3%) is similar to the overall Victorian prevalence of 2.9% parent-reported peanut allergy in the Longitudinal Study of Australian Children, a cohort of more than 4000 children aged 6 to 7 years. Because peanut allergy is uncommonly outgrown and is invariably IgE-mediated, this similarity between the 2 Victorian-based cohorts is reassuring. Findings from Healthnuts are also not necessarily applicable to other Australian states because there is evidence of a latitude gradient of food allergy prevalence in Australia, as there is for North America and Chile, with those living farthest from the equator in the south of Australia (including Victoria) having higher rates than those living farther north.
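As a rough illustration of why these 2 estimates are regarded as consistent, the short sketch below computes approximate 95% confidence intervals around the 3% and 2.9% figures using a simple normal approximation; the cohort sizes are rounded from the figures quoted in the text ("more than 5000" and "more than 4000") and are not the exact study denominators.

```python
from math import sqrt

# Illustrative only: approximate 95% confidence intervals around the two
# Victorian peanut-allergy prevalence estimates quoted above, using a
# normal approximation to the binomial. Denominators are rounded from
# the cohort sizes mentioned in the text, not exact study numbers.
def approx_ci(prevalence, n, z=1.96):
    se = sqrt(prevalence * (1 - prevalence) / n)
    return prevalence - z * se, prevalence + z * se

estimates = [
    ("Healthnuts, challenge-proven, age 1 y", 0.030, 5000),
    ("LSAC, parent-reported, age 6-7 y",      0.029, 4000),
]
for label, prev, n in estimates:
    lo, hi = approx_ci(prev, n)
    print(f"{label}: {prev:.1%} (95% CI {lo:.1%}-{hi:.1%})")
# The two intervals overlap almost completely, which is why the
# similarity between the cohorts is described as reassuring.
```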
Although higher than initially expected when the study was mounted, the high prevalence of food allergy found in Australia is not particularly surprising when viewed in the context of Australian hospital admission figures for food-induced anaphylaxis, which rose fivefold in young children from the mid-1990s to the mid-2000s, with similar increases in allergy waiting lists, which now exceed 12 months for most specialty clinics around the country. To date, these observations are limited to young children, with only modest increases in anaphylaxis admissions for older children and adults and no formal reports of rising rates in the adult population. A high food allergy prevalence in Australian infants is also consistent with the country having one of the highest rates of asthma and eczema in the world, perhaps suggesting a second-wave epidemic of allergic disease.
One last factor to consider is that the Healthnuts study used raw egg for its oral food challenges, which may have overestimated clinically relevant egg allergy, a large determinant of the high prevalence of positive challenges. Countering this, however, is the observation that a history of acute allergic reactions to egg was reported by 6.5% of those exposed to dietary egg by age 1 year (K.J. Allen and J.J. Koplin, personal communication, 2011). As Lack observed, the prevalence of baked egg allergy (the most severe egg allergy phenotype) was 2% in Healthnuts, which is much more in line with published prevalence rates of challenge-proven semicooked egg allergy in the United Kingdom.
Why are the rates of food allergy so high in Australian infants?
There are a multitude of potential explanations for why food allergy may be on the rise. However, it is important to consider them in the context of lifestyle factors that have changed over the past 20 to 30 years, the time period in which the rise in food allergy in developed countries has been noted. At the general public health level, there has been a slow but persistent urbanization of cities, with increasing use of asphalt, cleaner water supplies, cleaner food supplies, a more sedentary lifestyle, and increased intake of a Westernized diet. In addition, there has been a parallel increase in obesity and in wide-scale use of antibiotics, not only in the human population but also at low levels in livestock feedlots to optimize growth, as well as a significant decline in smoking and a decreasing prevalence of Helicobacter pylori infection. In infants, there has been increased uptake of immunization and altered infant feeding patterns. Although these factors all began to change before the epidemic of food allergy, only infant feeding patterns have been significantly temporally linked to the most recent rise in food allergy specifically, as opposed to the rise in allergic disease in general. Last, potential factors that appear somewhat unique to Australia are rising rates of migration from Asia and rising rates of vitamin D insufficiency in both mothers and children. There has been a modest public health response to the latter, with proactive identification of maternal vitamin D insufficiency and antenatal supplementation. However, there are currently no general recommendations for vitamin D fortification of the food chain (other than margarine) or for preventive infant supplementation in the absence of risk factors for vitamin D insufficiency.
The Dual Allergen Exposure Hypothesis (Lack Hypothesis)
This hypothesis proposes that allergic sensitization to foods may occur through exposure to low doses of allergen via the skin, when food allergens in the environment are absorbed through a damaged skin barrier (such as in eczema or in the presence of filaggrin loss-of-function mutations). In contrast, oral exposure to these allergens through consumption of allergenic foods early in infancy, before skin sensitization occurs, leads to lasting oral tolerance and prevents the development of sensitization and allergy even with subsequent skin exposure.
Mechanistic evidence supporting this hypothesis comes from mouse models showing that sensitization can be induced following application of allergen to damaged skin, and that this can be prevented by previous high-dose oral allergen exposure. Recent studies suggest that activation of innate immune pathways in the skin, through thymic stromal lymphopoietin (TSLP) and basophil activation, may play a key role in the development of food allergy secondary to cutaneous sensitization in animal models. Studies of human populations to date have primarily focused on peanut allergy, demonstrating that peanut allergens can be found in the household environment and that higher exposure to environmental peanut antigens appears to increase the risk of peanut allergy in children with either filaggrin loss-of-function mutations or atopic dermatitis.
This hypothesis is appealing in the Australian context because eczema is extremely common in infants in Australia, with up to 25% of Healthnuts infants having a history of doctor-diagnosed eczema or nurse-observed eczema at age 1 year. As reported previously, eczema frequently coassociates with food allergy, with 50% of those with early-onset moderately severe eczema developing food allergy by age 1 year. This, coupled with distinct changes to infant feeding guidelines in the late 1990s/early 2000s recommending delayed introduction of allergenic solids such as egg until 10 months and peanut until age 3 years, provides the correct temporal framework for this hypothesis to have had a potential effect on the epidemic.
Adequate early-life skin barrier function
It is important to note that filaggrin loss-of-function mutations appear to be equally common among individuals with asymptomatic food sensitization and those with true food allergy, suggesting that filaggrin confers a risk for food sensitization, the first step toward food allergy, but not for the progression from sensitization to food allergy itself. Previous studies reporting an association with food allergy were not designed to untangle any differential effect between sensitized tolerant and sensitized allergic individuals. Recent data from the Isle of Wight birth cohort used path analysis to demonstrate that the effect of filaggrin loss-of-function mutations on food allergy at age 10 occurred indirectly, through an effect on eczema and food sensitization in early childhood. Together, these findings suggest that skin barrier function plays a role in sensitization status but not in the second step of food allergy versus tolerance development.
Two exciting new studies were recently published that both undertook randomized controlled trials of daily moisturizing from birth in an attempt to reduce infantile eczema and associated effects. The first demonstrated an impressive 50% reduction in eczema. The second also examined egg sensitization as a secondary outcome: although the intervention was effective at preventing atopic dermatitis, there was no evidence of a reduction in sensitization to egg white in this relatively small study of 118 infants. Follow-up results from these trials and from larger studies will be intriguing. Currently there is little information about early infant bathing and moisturizing practices in Australia, although avoiding soap in the first few weeks of life is generally recommended. The Barwon Infant Study, a prospective prenatal birth cohort study of 1000 infants undertaken in Geelong, 80 km southwest of Melbourne, has gathered this information, which is likely to shed light on this question in Victoria, Australia.
Timing of introduction of solids and infant feeding
The Lack hypothesis suggests that the second factor in the 2 steps to food allergy is delayed oral allergen exposure. This is partially supported by data from the STAR trial, which randomized infants with eczema to egg avoidance or early regular egg consumption from age 4 months, finding a lower prevalence of egg allergy by 12 months in the intervention group (33% vs 51%, P = .11). These findings suggest a potentially protective effect of early allergen introduction that requires confirmation in larger studies. The STAR trial results also indicate that egg sensitization and allergy (at least in infants with eczema) may already be present as early as 4 months; however, introduction of solids before 4 months is likely to be extremely controversial. The recently published landmark study by Du Toit and colleagues is the first randomized controlled trial to demonstrate a protective effect of early introduction of peanut, with a dramatic reduction in the development of peanut allergy when peanut was introduced between 4 and 11 months of age.
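To illustrate why an absolute difference as large as 33% versus 51% can still return P = .11, the sketch below runs a standard two-proportion z-test using hypothetical group sizes of about 40 infants per arm; these are not the STAR trial's actual numbers, and the calculation is intended only to show how a small sample limits statistical precision.

```python
from math import sqrt, erfc

# Hypothetical arm sizes chosen for illustration (NOT the STAR trial's
# actual randomization numbers).
n_egg, n_avoid = 40, 40
allergic_egg = round(0.33 * n_egg)      # ~13 infants with egg allergy
allergic_avoid = round(0.51 * n_avoid)  # ~20 infants with egg allergy

p1, p2 = allergic_egg / n_egg, allergic_avoid / n_avoid
p_pool = (allergic_egg + allergic_avoid) / (n_egg + n_avoid)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_egg + 1 / n_avoid))
z = (p2 - p1) / se
p_value = erfc(abs(z) / sqrt(2))  # two-sided P from the normal distribution

print(f"risk difference = {p2 - p1:.2f}, z = {z:.2f}, P = {p_value:.2f}")
# With ~40 infants per arm, a difference of this size yields P of about
# 0.11: the point estimate favors early introduction, but the trial is
# too small to exclude chance, hence the call for larger studies.
```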
Early evidence, which requires further investigation, suggests that if a window of opportunity for promoting tolerance exists, it may be different for each food. For example, the optimal timing of introduction of milk appears to be earlier than that of egg. In one observational study, infants introduced to milk at 4 to 6 months were more likely to be milk allergic than those introduced to milk earlier. Lower rates of cow’s milk allergy among those who were exposed to cow’s milk formula within the first 14 days of life suggest that very early exposure to cow’s milk protein might promote tolerance, although this requires further investigation.
Weaning practices in Australia, coupled with high eczema rates, may contribute to the high prevalence of food allergy. In Healthnuts, fewer than 5% of infants received solid foods before 3 months of age, compared with 27% in a United Kingdom–based birth cohort study, and delayed introduction of solids is likely to have a flow-on effect in delaying the timing of oral exposure to potentially allergenic foods. High-quality evidence for an association between the timing of introduction of allergenic foods and food allergy remains sparse, but the studies published to date generally show a reduced risk of food allergy in those introduced to specific allergenic foods earlier in infancy.
More recent attention has turned to diet diversity, one factor that has changed in a manner coinciding temporally with the rise in food allergy and that could reflect changes occurring after migration to a new country. Several studies in the past year examined the role of diversity of early-life food exposures in the development of food sensitization and food allergy. A prospective birth cohort study of 856 children reported that increased diversity of complementary foods introduced in the first year of life was associated with a reduced risk of food allergy. In a prospective longitudinal study, dietary patterns in the first year of life consisting of more fresh fruit and vegetables and home-prepared meals were associated with less challenge-proven food allergy by the age of 2 years.
At the population level, changes in the timing of food introduction may contribute to, but are unlikely to completely explain, recent increases in the prevalence of food allergy. Recently we assessed the impact of changing guidelines on infant feeding practices in the general Healthnuts population. The changing guidelines had some impact on the timing of introduction of allergenic foods, which were introduced earlier after the guidelines changed, although changes were less pronounced among those with a family history of allergy and in families of lower socioeconomic status. Despite this, there was no decrease in the overall prevalence of food allergy in the second half of the cohort (when introduction of allergenic solids was less delayed) compared with the first half of the cohort (K.J. Allen and J.J. Koplin, personal communication, 2015), although analysis is ongoing to assess how differential uptake of the guideline changes among those at higher risk of food allergy affects this finding.
Vitamin D Hypothesis
Recent hypotheses that low vitamin D may increase the risk of food allergy are supported by 2 lines of ecological enquiry. First, countries farther from the equator (and thus with lower ambient ultraviolet radiation) have recorded more pediatric admissions to hospital for food allergy–related events, more prescriptions of hypoallergenic formulas for the treatment of cow’s milk allergy, and more prescriptions of adrenaline autoinjectors for the treatment of anaphylaxis in children. These findings appear to be independent of longitude, physician density, and socioeconomic status. Second, season of birth may play a role. For example, children attending emergency departments in Boston with a food-related acute allergic reaction were more likely to have been born in autumn/winter, when vitamin D levels reach their nadir, than in spring/summer, and similar links between food allergy and birth seasonality have been reported in the southern hemisphere. As described previously, children residing in Australia’s southernmost states have twice the odds (95% confidence interval [CI] 1.2–5.0) of peanut allergy at age 4 to 5 years and 3 times the odds (95% CI 1.0–9.0) of egg allergy compared with those in the northern states. Despite a sunny clime, Australia has high rates of vitamin D deficiency, including in infants, with a highly successful “slip, slop, slap, wrap” anti–skin cancer public health campaign (slip on a t-shirt, slop on sunscreen, slap on a hat, and wrap on sunglasses) and the absence of food fortification and universal supplementation presumably contributing to this.
We recently described that infants with vitamin D insufficiency were 3 times more likely to have either peanut or egg allergy, with the odds increasing to fourfold among those with 2 or more food allergies. Furthermore, among food-sensitized infants, those with vitamin D insufficiency were 6 times more likely to be food allergic than tolerant. These effects were observed among infants with Australian-born parents but not those with parents born outside Australia. Investigation of genetic risk factors may help to explain these differences in associations between populations. Genetic polymorphisms contribute to variation in vitamin D–binding protein levels, explaining almost 80% of the variation in levels. Binding protein levels in turn alter the biological availability of serum vitamin D (25OHD3), with lower binding protein levels increasing the free, bioavailable fraction. We recently found that polymorphisms resulting in lower binding protein levels appeared to compensate for the adverse effects of low serum vitamin D on food allergy risk (JJ Koplin and colleagues, manuscript under review), presumably by increasing the ability to utilize the available vitamin D. As well as supporting a potentially causal link between vitamin D and food allergy, these findings suggest that reference ranges used to define low levels of serum vitamin D with a detrimental effect on food allergy risk may need to take into account differences in binding protein levels. Randomized controlled trials stratified by genetic, racial, or migratory status are required to determine whether correction of vitamin D status either prevents infantile food allergy or promotes the development of tolerance in food-allergic infants.
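As a minimal sketch of the free-fraction reasoning above, the code below applies a commonly used Bikle-type approximation for free 25OHD; the affinity constants, DBP concentrations, albumin level, and total 25OHD value are illustrative assumptions rather than measurements from the studies discussed here.

```python
# Commonly cited approximate affinity constants (per mol/L) for binding
# of 25OHD to vitamin D-binding protein (DBP) and albumin; illustrative
# values only, not taken from the studies discussed in this review.
K_DBP = 7e8
K_ALB = 6e5

def free_25ohd(total_nmol_l, dbp_umol_l, albumin_g_l=40.0):
    """Approximate free 25OHD (nmol/L) from total 25OHD and DBP level."""
    albumin_mol_l = albumin_g_l / 66_500   # albumin MW ~66.5 kDa
    dbp_mol_l = dbp_umol_l * 1e-6
    bound_factor = 1 + K_ALB * albumin_mol_l + K_DBP * dbp_mol_l
    return total_nmol_l / bound_factor

total = 50.0  # a low-normal total 25OHD level in nmol/L (illustrative)
for dbp in (5.0, 3.0):  # hypothetical higher vs lower DBP levels, umol/L
    free_pmol_l = free_25ohd(total, dbp) * 1000
    print(f"DBP {dbp} umol/L -> free 25OHD ~ {free_pmol_l:.0f} pmol/L")
# The same total 25OHD yields a larger free fraction when DBP is lower,
# consistent with low-binding-protein genotypes partly offsetting the
# effect of low total serum vitamin D.
```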
Hygiene Hypothesis
There is increasing evidence that the interaction between the host microbiome and the immune system is essential to the development of immune regulation and oral tolerance. The maturation of the mucosal immune system is prompted by exposure to microbes after birth. In searching for explanations for food allergy, attention has turned to the composition and timing of exposure to gut microflora and their possible role in disease development or prevention. One hypothesis to explain the increased incidence of sensitization to food allergens is that the reduction in early childhood infections (the hygiene hypothesis) or in exposure to microbial products (eg, endotoxin) may impede the development of early immunoregulatory responses. This leaves the immune system more susceptible to inappropriate reactivity to innocuous antigens, resulting in an “allergic” reaction.
As described in the landmark paper by David Strachan in 1989, the traditional concept of the “hygiene hypothesis” described a protective effect of an increasing number of siblings in a household on the risk of developing allergic rhinitis. This was thought to potentially relate to shared exposure to common childhood infections transmitted through direct contact with older siblings or through maternal contact with her older children prenatally. Although a protective sibling effect has been confirmed for challenge-proven food allergy outcomes in our own infant cohort study (Healthnuts), and by others for various food sensitization and allergy outcomes, the underlying mechanism of this phenomenon is by no means clear. Although the concept is interesting and reproducible, postwar changes to housing and sanitation, changes in family size, and the emergence of national immunization programs with high uptake moderate our interpretation of the mechanisms underlying the protective effects of siblings. Further evidence of a protective effect of dog ownership on food allergy risk in Healthnuts may point to sharing of microbes or even parasites, the latter underpinning the “old friends” hypothesis, which is predicated on the teleologic emergence of the IgE antibody immune mechanism as primary protection against parasite infestation. More generally, there is some early evidence to suggest a difference in the prevalence of food allergy between rural and urban environments, which appears to be reflected in rising rates of food allergy described in cities in China (such as Chongqing) undergoing rapid urbanization.