Superinfection in malaria: Plasmodium shows its iron will

Sílvia Portugal, Hal Drakesmith, Maria M Mota

Author Affiliations

Sílvia Portugal 1, Hal Drakesmith 2,*, Maria M Mota 1,*

1 Instituto de Medicina Molecular, Faculdade de Medicina da Universidade de Lisboa, Avenue Prof. Egas Moniz, 1649‐028 Lisboa, Portugal
2 Weatherall Institute of Molecular Medicine, John Radcliffe Hospital, University of Oxford, Oxford, OX3 9DS, UK

*Corresponding authors. Tel: +44 (0)1865 222329; Fax: +44 (0)1865 222406; E-mail: alexander.drakesmith@ndm.ox.ac.uk or Tel: +351 21 799 9509; Fax: +351 21 799 9504; E-mail: mmota@fm.ul.pt

Abstract

After the bite of a malaria‐infected mosquito, the Plasmodium sporozoite infects liver cells and produces thousands of merozoites, which then infect red blood cells, causing malaria. In malaria‐endemic areas, several hundred infected mosquitoes can bite an individual each year, increasing the risk of superinfection. However, in infants that are yet to acquire immunity, superinfections are infrequent. We have recently shown that blood‐stage parasitaemia, above a minimum threshold, impairs the growth of a subsequent sporozoite infection of liver cells. Blood‐stage parasites stimulate the production of the host iron‐regulatory factor hepcidin, which redistributes iron away from hepatocytes, reducing the development of the iron‐dependent liver stage. This could explain why Plasmodium superinfection is not often found in young nonimmune children. Here, we discuss the impact that such protection from superinfection might have in epidemiological settings or in programmes for controlling malaria, as well as how the induction of hepcidin and redistribution of iron might influence anaemia and the outcome of non‐Plasmodium co‐infections.

Malaria today

Malaria infection is still a major scourge in the developing world, although there has been a gradual decrease in the number of deaths in the past ten years, and more than ten countries were able to reduce malaria cases by 50% (WHO, 2010). This is believed to be a response to a global movement, as proposed by Melinda Gates in 2007, towards the ambitious goal of malaria eradication. Nevertheless, malaria elimination will clearly not be reached with the available tools, especially in areas of moderate to high transmission (MalERA, 2011). Thus, it remains of the greatest importance to better understand the biology of the causative agent of the disease—the protozoan parasite Plasmodium—and to delineate its complex relationships with human populations and the many species of mosquito by which it is transmitted. This is critical to establish the determinants of endemicity and transmission dynamics and to envisage new and better ways to fight malaria.

Plasmodium and its hosts

The three main players of a malaria infection—human (or other vertebrate hosts), parasite and mosquito—have been known for more than a century. Nevertheless, the exact details of the Plasmodium life cycle and its needs, as well as the host and vector molecules that contribute to or fight infection, are still not well understood. Plasmodium takes many forms during its life cycle, shifting between invasive and replicative stages both in the vertebrate host and in the anopheline vector. Infection starts during a blood meal, when an anopheline mosquito deposits Plasmodium sporozoites into the dermis of the host. Sporozoites use their capacity to traverse host cells to reach the circulation and travel to the liver, where they exit the liver sinusoids towards the hepatocytes; there, each parasite develops and replicates into thousands of new merozoites (reviewed in Prudencio et al, 2006). These are then released into the bloodstream to infect red blood cells (RBCs) during several cycles of asexual replication, leading to the onset of disease. Occasionally, some parasites differentiate into sexual erythrocytic stages, giving rise to female or male gametocytes that, after a subsequent blood meal, reach a mosquito's midgut. Here, fertilization of the gametes occurs, forming ookinetes and, later, sporozoite‐containing oocysts. The sporozoites released from these oocysts invade the mosquito salivary glands and restart the cycle at the next blood meal (Fig 1).

Figure 1.

Plasmodium life cycle. (A) During a blood meal, an anopheline mosquito injects Plasmodium sporozoites into the host dermis. (B) After reaching a blood vessel, sporozoites travel to the liver where—after traversing several hepatocytes—they invade a final one. (C) After asexual replication and development inside the hepatocyte, merozoites are released into the bloodstream. (D) Merozoites infect red blood cells during cycles of asexual replication. (E) Occasionally, replication cycles originate female and male gametocytes. (F) Through another blood meal, a mosquito ingests gametocytes into its midgut. (G) Fertilization of gametes occurs in the mosquito midgut with the formation of ookinetes and later the oocysts. (H) Sporozoites released from the oocyst migrate to the salivary gland of the mosquito and are released during the next blood meal.

Malaria endemicity—the risk of re‐infection

The intensity of malaria transmission can range from unstable epidemic transmission to high perennial transmission, and can be characterized by 100‐fold differences in entomological measures of transmission (or entomological inoculation rates, EIRs). In regions of high malaria transmission, individuals can be exposed to several hundred infected mosquito bites per year (Robert et al, 2003). In such regions, mosquitoes repeatedly transmit Plasmodium sporozoites into individuals who already have Plasmodium parasites from a previous infection, generating a risk of superinfection. This phenomenon occurs when single individuals host more than one Plasmodium species, or different genotypes of the same Plasmodium species, infecting RBCs (Babiker et al, 1991; Bruce et al, 2000; Richie, 1988). Although these mixed infections can be easily explained by consecutive infectious bites, an alternative explanation proposed by Beier in 1988 is that they originate from a single infectious bite from one mosquito infected with more than one Plasmodium species or genotype (Beier et al, 1988). Nevertheless, it is well established that Plasmodium mixed infections more often occur in areas of high transmission (Mayor et al, 2003; Molineaux & Gramiccia, 1980; Owusu‐Agyei et al, 2002; Sama et al, 2006), suggesting that the risk of superinfection generally originates from consecutive infectious bites.

Studies on superinfection have concentrated on the interactions between different genotypes or species of Plasmodium that infect RBCs, and several putative interactions have been described (Richie, 1988). A longitudinal study in Papua New Guinea by Bruce and Day showed oscillatory peaks of infection with no coincidence of the species (Bruce & Day, 2003), whereas a study by Lorenzetti in the Brazilian Amazon proposed that P. vivax reduces the severity of P. falciparum malaria (Lorenzetti et al, 2008). Other studies suggested that acute P. falciparum infection suppresses P. vivax parasitaemia, which increases to detectable levels after treatment (Looareesuwan et al, 1987; Snounou & White, 2004). Along these lines, McKenzie and Bossert made the interesting observation that, in areas where P. vivax and P. falciparum coexist, fewer mixed‐species infections were observed than expected on the basis of the product of the individual species prevalences (McKenzie & Bossert, 1997). Nevertheless, an important but unexplained feature of the epidemiological studies is that concurrent carriage of various parasite genotypes at low asymptomatic parasitaemias is only frequently seen in older, semi‐immune children (Mayor et al, 2003; Molineaux & Gramiccia, 1980; Owusu‐Agyei et al, 2002; Sama et al, 2006). Why, then, is superinfection not frequently observed in younger, less immune children, who ought to be prone to sequential infections?
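The McKenzie and Bossert comparison rests on a simple null model: if the two species infected hosts independently of one another, the prevalence of mixed infections should approximate the product of the single‐species prevalences. The short sketch below illustrates the arithmetic behind that expectation; the prevalence and survey numbers are invented for illustration and are not taken from any study.

```python
# Illustrative arithmetic only: the expected frequency of mixed-species
# infections under statistical independence, as in the McKenzie & Bossert
# comparison. The prevalence and survey figures below are invented.

def expected_mixed_prevalence(p_falciparum: float, p_vivax: float) -> float:
    """Expected fraction of individuals carrying both species if the two
    infections occur independently of one another."""
    return p_falciparum * p_vivax

n = 1000                     # hypothetical survey size
p_pf, p_pv = 0.30, 0.20      # hypothetical single-species prevalences
observed_mixed = 35          # hypothetical number positive for both species

expected = expected_mixed_prevalence(p_pf, p_pv) * n   # 0.30 * 0.20 * 1000 = 60
print(f"expected mixed infections: {expected:.0f}, observed: {observed_mixed}")
# A substantial deficit of observed relative to expected mixed infections
# (here 35 versus 60) is the kind of pattern read as cross-species suppression.
```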

The sequence of events, from Plasmodium deposition and passage through the skin, followed by growth in hepatocytes and invasion of erythrocytes, is probably perturbed in high‐transmission areas, with new infections occurring in individuals harbouring parasites from previous infections. In these situations, the balance between the roles of host and pathogen—which have evolved towards ensuring the survival of the host and the successful transmission of the pathogen—might be disturbed, raising the possibility of secondary interactions between the parasite and the parasitized host.

With that in mind, we recently analysed the impact of an ongoing blood‐stage infection on a subsequent liver infection. Using rodent models of malaria infection, we showed that ongoing blood‐stage infections impair the growth of subsequently inoculated Plasmodium sporozoites, such that they become growth‐arrested in hepatocytes and fail to develop into blood‐stage parasites (Portugal et al, 2011). This impairment of the establishment of a Plasmodium liver infection by an ongoing blood‐stage infection is independent of the Plasmodium species, is transient and occurs only above a certain threshold of blood parasite density. This resembles a quorum‐sensing mechanism described for bacteria, by which the blood stage is able to protect its niche from the threat of superinfection (Portugal et al, 2011).

Protection of young nonimmune children

Epidemiological studies from highly endemic areas consistently show that the concurrent carriage of various parasite genotypes at low asymptomatic parasitaemias is frequently seen in older children but rarely seen in younger ones (Mayor et al, 2003; Molineaux & Gramiccia, 1980; Owusu‐Agyei et al, 2002; Sama et al, 2006). In addition, the incidence of infection initially increases with age in young children before it declines as a result of acquired immunity (Molineaux & Gramiccia, 1980; Sama et al, 2006). Can the observations made in mice—in which ongoing malaria blood‐stage infections impair the establishment of new Plasmodium liver‐stage infections—explain the low frequency of superinfections in young children? By using a simple agent‐based model, we showed that threshold‐density‐dependent inhibition of new liver‐stage infections alone can explain the changes in infection risk and complexity of infections in young individuals observed in the field (Fig 2; Portugal et al, 2011).

Figure 2.

Parasite‐dependent protective effect over age and time. Young children often have high levels of blood‐stage parasitaemia. When these children are re‐infected with sporozoites, little replication or development of parasites happens in the liver, inhibiting superinfection. The blood parasitaemia of an infection decreases with increasing age. Infected individuals become susceptible to further infections only once parasite densities fall below a critical threshold, raising the probability of superinfection.

Notably, modelling showed that this effect is dependent on the transmission intensity and is most prominent under moderate‐to‐high transmission settings, as observed in epidemiological studies. This strongly suggests that the probability of a new Plasmodium infection occurring in humans is also dependent on the level of the peripheral blood parasitaemia.
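To make the logic of such a model concrete, the sketch below implements a deliberately minimal agent‐based simulation in which an infectious bite establishes a new liver‐stage infection only if the host's current blood parasitaemia is below a fixed threshold. It is loosely inspired by, and not a reproduction of, the model in Portugal et al (2011); the bite rate, within‐host dynamics and threshold value are arbitrary choices made only for illustration.

```python
# Minimal, illustrative agent-based sketch of threshold-dependent blocking of
# superinfection. Not the model of Portugal et al (2011); all parameters here
# (bite rate, within-host dynamics, threshold) are invented for illustration.
import random

BITE_RATE = 0.02      # probability of an infectious bite per host per day (assumed)
THRESHOLD = 0.15      # parasitaemia (%) above which new liver infections are blocked
LIVER_DELAY = 7       # days from sporozoite inoculation to blood-stage patency (assumed)

class Host:
    def __init__(self):
        self.infections = []   # ongoing blood-stage infections: {"age": days, "peak": %}
        self.pending = []      # countdowns (days) for liver-stage infections in progress

    def parasitaemia(self):
        # Toy within-host curve: each genotype rises over ~5 days, then decays.
        return sum(inf["peak"] * min(inf["age"] / 5.0, 1.0) * 0.95 ** max(inf["age"] - 5, 0)
                   for inf in self.infections)

    def step(self):
        # Age ongoing infections and drop those that have decayed to negligible levels.
        for inf in self.infections:
            inf["age"] += 1
        self.infections = [i for i in self.infections
                           if i["peak"] * 0.95 ** max(i["age"] - 5, 0) > 0.001]
        # Liver-stage infections that complete development become blood-stage infections.
        self.pending = [d - 1 for d in self.pending]
        matured = sum(1 for d in self.pending if d <= 0)
        self.pending = [d for d in self.pending if d > 0]
        for _ in range(matured):
            self.infections.append({"age": 0, "peak": random.uniform(0.5, 3.0)})
        # A new infectious bite establishes a liver stage only below the threshold.
        if random.random() < BITE_RATE and self.parasitaemia() < THRESHOLD:
            self.pending.append(LIVER_DELAY)

hosts = [Host() for _ in range(500)]
for day in range(365):
    for h in hosts:
        h.step()

complexity = [len(h.infections) for h in hosts]
print("mean concurrent blood-stage genotypes per host:",
      sum(complexity) / len(hosts))
```

In a sketch of this kind, lowering the peak parasitaemia for older hosts (to mimic partial immunity) shortens the time spent above the threshold, which qualitatively reproduces the age pattern summarized in Fig 2.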

Prevention of superinfection by ongoing blood‐stage infections might have direct implications in host protection. The first Plasmodium in the blood protects its niche, possibly trying to ensure transmission on a first‐come–first‐served basis. Thus, the host immune system would only have to fight one circulating parasite, raising its chances of eliminating the circulating infected RBCs and clearing infection, thereby increasing the host's chances of survival. This is particularly relevant for nonimmune individuals—the young ones in endemic regions. In older, semi‐immune children, the risk of severe disease decreases and, as circulating parasitaemia is also lower, the complexity of blood‐stage infections increases. Indeed, it has also been reported that superinfections are much more frequently observed among asymptomatic carriers—who have lower peripheral blood parasitaemias and thus are probably below our proposed threshold of protection—than in clinical cases (al‐Yaman et al, 1997). Plasmodium has co‐evolved with mammals for millions of years, and mechanisms such as this one, of protection from superinfection, seem to have been selected to benefit both the host and the incumbent pathogen by maintaining parasite density at levels that are not life threatening before acquired immunity takes up its role.

Superinfection protection and acquired immunity

Naturally acquired immunity to malaria was initially described almost 100 years ago (reviewed in Doolan et al, 2009) and has been greatly debated. It is dependent on constant exposure to the Plasmodium parasite and on the degree of exposure. Nevertheless, re‐infections and the great impact that an ongoing blood‐stage infection has on a subsequent sporozoite infection have never been taken into consideration when studying the establishment of acquired immunity against Plasmodium. Although an early human study looked at how sequential Plasmodium inoculations could lead to multiple infections (Boyd, 1939), the possibility that the liver stage of infection could be modulated by the presence of another infection and contribute to the acquisition of immunity was not definitively explored. The inhibition of Plasmodium liver‐stage growth in re‐infected individuals might act as a live attenuated vaccine, promoting protection in endemic populations. In addition, the number of liver infections would be higher than the number of episodes of blood‐stage infection, boosting the immune response to the liver stage without exposing the host to an extra blood‐stage infection. This could contribute to the faster acquisition of immunity and protection from severe disease, both of which, coincidentally, are associated with high Plasmodium transmission rates (Okiro et al, 2009; Snow & Marsh, 2002). It is therefore tempting to speculate that re‐infection might also have a role in naturally acquired immunity.

Liver‐stage‐specific, naturally acquired immunity is not believed to be strong enough, or to be acquired fast enough, to provide protection in endemic populations (Marsh & Kinyanjui, 2006). Nevertheless, the infection of both humans and mice with Plasmodium sporozoites followed by chloroquine treatment—which abrogates the establishment of blood stages—protects individuals from subsequent infections (Belnoue et al, 2008; Roestenberg et al, 2009, 2011). Notably, vaccine development efforts have invested significantly in pre‐erythrocytic immunization, which has proven effective in conferring protection for variable periods of time. Indeed, the use of either attenuated forms of the parasite (Clyde, 1975; Waters et al, 2005) or live sporozoites (Roestenberg et al, 2009, 2011) can completely protect humans from malaria infections.

On the other hand, blood stages could suppress immune responses against the liver stage, as blood‐stage infection was shown to reduce the number of CD8+ T cells produced after irradiated‐sporozoite immunization (Ocana‐Morgner et al, 2003). The impairment of Plasmodium liver infection by an ongoing blood‐stage malaria infection occurs in the absence of T cells (Portugal et al, 2011), but it is unclear what would happen after several infections, when acquired immunity would play a stronger role and other interactions could occur between these two stages of Plasmodium infection.

Undoubtedly, population‐based studies of malaria‐endemic areas and highly controlled experimental model systems will be decisive for the full understanding of the relevance of superinfection control in naturally acquired immunity to malaria.

The importance of iron for Plasmodium development

To understand how the inhibition of the liver stage by blood‐stage parasitaemia might have an impact on infection dynamics in malaria‐endemic regions, the molecular basis of the phenomenon must be elucidated. The strain‐independence and species‐independence of the effect, as well as its rapid kinetics, suggest that adaptive immunity is not involved, and this was confirmed experimentally (Portugal et al, 2011). Both the number and size of exoerythrocytic forms are reduced during inhibition of the liver stage, suggesting that nutrient limitation might be involved. Almost all forms of life use iron in processes such as DNA synthesis, oxygen carriage and generation of ATP; iron is needed for growth and proliferation (Schaible & Kaufmann, 2004). Plasmodium is no exception, and the importance of iron for parasite growth has been clear for the past 30 years. A seminal study showed that feeding patients after hospital admission during famine led to an early increase in serum iron and transferrin saturation that often preceded attacks of malaria and parasitaemia (Murray et al, 1975). Similarly, parasitaemia increased in Wistar rats after intramuscular injection of iron, which also led to raised transferrin saturation (Murray et al, 1975). The effect of iron supplementation on malaria has been a subject of much debate, especially after the recent Pemba trial, in which prophylactic supplementation of preschool children with iron and folic acid in a population with high rates of malaria resulted in an increased risk of severe disease and death (Prentice et al, 2007; Sazawal et al, 2006).

The need for iron during parasite replication has been explored in several studies, beginning with work by Scheibel and Adler using the Fe3+ chelator desferrioxamine (Scheibel & Adler, 1980, 1981, 1982). Desferrioxamine has also been used to inhibit the growth of P. falciparum in RBCs in vitro; the removal of available iron caused a block in schizogony (Raventos‐Suarez et al, 1982). Subsequent studies found that desferrioxamine inhibited the growth of blood‐stage parasitaemia in mice (Fritsch et al, 1985), primates (Pollack et al, 1987) and humans (Bunnag et al, 1992; Gordeuk et al, 1992a). However, desferrioxamine is not a good potential drug for wide use in the context of malaria owing to its poor membrane permeability and absorption when given orally, a short half‐life in plasma, and a slow‐to‐develop antimalarial effect (Cabantchik et al, 1996). Perhaps for these reasons, desferrioxamine did not reduce mortality in clinical trials (Thuma et al, 1998a), despite some early promise (Gordeuk et al, 1992b), nor did the orally active iron chelator deferiprone (Thuma et al, 1998b). Nevertheless, the particular sensitivity of Plasmodium blood‐stage growth to iron chelators, noted by Glickstein and colleagues among others (Glickstein et al, 1996), means that the iron chelation approach might be beneficial, and further iron‐binding molecules aimed at halting Plasmodium growth are being developed (Pradines et al, 2006; Rotheneder et al, 2002). One additional factor to bear in mind, however, is that the potent antimalarial artemisinin requires iron for its action—so that co‐administration with iron chelators is probably ill‐advised (Meshnick et al, 1993; Weinberg & Moon, 2009).

Plasmodium proliferates prodigiously in hepatocytes (Prudencio et al, 2006). The rate of parasite growth within hepatocytes exceeds that within RBCs, and such rapid cell division requires iron. Accordingly, iron loading of mice promotes the development of Plasmodium yoelii, and iron supplementation of hepatocyte cultures increases parasite numbers in vitro (Goma et al, 1996). In addition, iron deficiency might reduce parasite growth both in animals and in cultured hepatocytes (Goma et al, 1995; Loyevsky et al, 1999). Together, these data indicate that Plasmodium replication in both the liver and the erythrocytes requires iron. The precise source of iron and the molecular mechanisms by which Plasmodium acquires iron from its host cell remain unclear (see Sidebar A).

Sidebar A | In need of answers

  1. Do young, nonimmune children living in endemic areas of malaria have a natural protection system from superinfection? Are hepcidin and iron the critical factors to protect individuals from superinfection? What other factors might be involved?

  2. What are the molecular mechanism and the impact of hepcidin upregulation for blood‐stage infections and anaemia? And for Plasmodium transmission?

  3. How do malaria‐induced changes in hepcidin and iron influence susceptibility to co‐infections?

  4. How is iron acquired by Plasmodium during either liver or blood stages of infection?

  5. Do hepcidin and iron levels influence relapse from P. vivax (hypnozoite activation)?

Hepcidin controls iron availability and distribution

Iron availability in vivo clearly has important effects on Plasmodium growth, as it does in all other forms of life. Iron transport and homeostasis in mammals is under the control of the iron‐regulatory hormone hepcidin, a circulating 25‐amino‐acid peptide (Ganz & Nemeth, 2011). The levels of iron in humans are normally kept in balance through the regulation of intestinal dietary absorption, through which about 1 mg of iron is acquired per day. About 20–25 mg of iron are recycled daily by macrophages, which ingest senescent RBCs, enzymatically release iron from haemoglobin and export iron back into the serum. Both the duodenal enterocytes and the macrophages of the reticuloendothelial system release iron through the only known mammalian iron‐exporter protein, ferroportin (Abboud & Haile, 2000; Donovan et al, 2000, 2005; McKie et al, 2000), a process that is targeted by hepcidin. The accumulation of iron increases hepcidin synthesis by the liver (Pigeon et al, 2001); hepcidin is released into the circulation and binds to ferroportin. The hepcidin–ferroportin complex is then internalized and degraded, halting iron export (Nemeth et al, 2004a). Hence, increases in hepcidin synthesis lead to reduced iron absorption and inhibited iron recycling, restoring equilibrium.

Genetic defects that disrupt this autoregulatory feedback loop lead to clinically important disorders of iron metabolism. For example, hereditary haemochromatosis is caused by mutations that reduce hepcidin synthesis (Bridle et al, 2003; Ganz & Nemeth, 2011) or that confer hepcidin resistance to variants of ferroportin (Drakesmith et al, 2005). As a result, ferroportin activity is uninhibited and dietary iron absorption is chronically increased. Reduced hepcidin can also be caused by an increased erythropoietic drive; a signal is believed to originate from highly activated bone marrow that acts on the liver to suppress hepcidin levels (Ganz & Nemeth, 2011; Tanno et al, 2007). The iron accumulation that results from low hepcidin can be toxic, as increased serum iron and total body iron levels lead to iron deposition in parenchymal tissues, especially hepatocytes. Conversely, genetic defects that lead to increased hepcidin and block ferroportin can cause iron‐deficiency anaemia (Finberg et al, 2008) through a reduction of dietary iron absorption and sequestration of iron in reticuloendothelial macrophages, including those of the spleen. In this condition, serum iron is very low, and the flow of iron to the bone marrow—where it is needed for incorporation into haemoglobin—is restricted.

Therefore, changes in hepcidin synthesis affect both the amount of iron in the body and its tissue distribution. Because different pathogens target particular tissues—or, as in the case of Plasmodium, the same pathogen infects specific cell types as its life cycle progresses—fluctuations in hepcidin levels have the potential to change iron availability during infections. As iron is important for the growth of pathogens, this suggests that hepcidin levels could influence the course and outcome of infections. However, hepcidin is not the only factor that influences the availability of iron to pathogens. Within cells, iron is present in different compartments, such as stored in ferritin or loosely bound to chaperones in the cytoplasm—known as the 'labile iron pool'. The relative amounts of iron in these intracellular locations are also important in determining its availability for specific pathogens, which in turn have evolved the capacity to scavenge iron from different compartments (Byrd & Horwitz, 1991; Nairz et al, 2007). Exactly how hepcidin influences the intracellular distribution of iron in different cell types remains to be determined, but there are studies that suggest that high hepcidin and low ferroportin activity increase iron storage within ferritin in macrophages (Ganz & Nemeth, 2011).
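As a way of summarizing the feedback loop described above, the toy model below couples an exchangeable iron pool to hepcidin‐controlled ferroportin activity. It is only a directional sketch: the set point, rate constants and the linear hepcidin response are invented, with the absorption and recycling magnitudes taken loosely from the approximate daily fluxes quoted above (about 1 mg per day absorbed, 20–25 mg per day recycled).

```python
# Toy negative-feedback model of the hepcidin-ferroportin loop. Directional
# illustration only: the set point, rate constants and the linear hepcidin
# response are invented; absorption/recycling magnitudes are taken loosely
# from the approximate daily fluxes quoted in the text.

SET_POINT = 100.0      # arbitrary "normal" size of the exchangeable iron pool (mg)
ABSORPTION = 1.0       # mg/day entering via duodenal enterocytes (order of magnitude)
RECYCLING = 25.0       # mg/day recycled by macrophages (order of magnitude)
LOSSES = 1.0           # mg/day obligatory losses (assumed constant)
MARROW_USE = 25.0      # mg/day drawn by erythropoiesis at the set point (assumed)

def hepcidin(iron: float) -> float:
    """Hepcidin rises linearly as iron exceeds the set point (clamped to [0, 1])."""
    return min(max((iron - SET_POINT) / SET_POINT, 0.0), 1.0)

def simulate(iron: float, days: int = 10) -> float:
    for day in range(days):
        h = hepcidin(iron)
        ferroportin = 1.0 - h                      # hepcidin internalizes and degrades ferroportin
        influx = (ABSORPTION + RECYCLING) * ferroportin
        outflux = LOSSES + MARROW_USE * min(iron / SET_POINT, 1.0)
        iron += influx - outflux
        print(f"day {day:2d}  iron {iron:6.1f} mg  hepcidin {h:.2f}")
    return iron

# Starting from iron overload: rising hepcidin shuts down ferroportin-mediated
# export, influx falls below usage, and the pool drifts back towards the set point.
simulate(iron=150.0)
```

Starting the same simulation below the set point keeps hepcidin clamped at zero in this toy, so absorption and recycling run unopposed and the pool recovers, mirroring the direction of regulation described above.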

Hepcidin in Plasmodium infections

Increased hepcidin levels have been found in blood‐stage malaria infections of P. falciparum and P. vivax (de Mast et al, 2009a,b, 2010; Howard et al, 2007), and a potential role of hepcidin in the aetiology of malarial anaemia—an important aspect of malarial illness—has been discussed. Haemolysis, sequestering of erythrocytes and increased erythrophagocytosis all contribute to anaemia, but an additional factor to consider is the inability of erythropoiesis to keep pace with RBC loss (Lamikanra et al, 2007; Rencricca et al, 1974). Dyserythropoiesis also occurs in malaria despite generally high erythropoietin levels; the response to erythropoietin is blunted probably due to the action of haemozoin and cytokines (Casals‐Pascual et al, 2006; Lamikanra et al, 2007). This lack of a strong erythropoietic drive might mean that the signals normally produced by an active bone marrow, which suppress hepcidin, are not synthesized during malaria. This failure to suppress hepcidin allows inflammatory cytokines—possibly including interleukin‐6 (de Mast et al, 2009b; Nemeth et al, 2004b)—to further increase hepcidin levels, which exacerbates dyserythropoiesis by sequestering iron in macrophages. Thus, a vicious cycle could arise, in which loss of erythrocytes during malaria leads to an increase in hepcidin, rather than to the decrease that is seen in other situations of blood loss, which in turn means not enough iron is available to support RBC generation (Fig 3). An alternative, or additional, view of the increase in hepcidin is that it is a host response to infection. From this perspective, high hepcidin levels and subsequent lowering of RBC production and iron content could possibly represent an attempt by the host to reduce the number of available targets and the amount of available iron for Plasmodium. In such a scenario, hepcidin would function as an innate immune molecule (Armitage et al, 2011) that might help to control parasitaemia before the development of adaptive immune responses. More work is clearly needed to clarify these issues (see Sidebar A).

Figure 3.

Malaria, hepcidin and anaemia. (A) In a healthy individual, loss of red blood cells (RBCs)—for example, by haemorrhage—leads to a hypoxia‐induced increase in erythropoietin (EPO) production and a higher erythropoietic drive. Through mechanisms that are not fully understood, hepcidin synthesis by the liver is also reduced, increasing iron availability for incorporation into haemoglobin, and enabling recovery from anaemia. (B) Malaria infection also leads to a loss of RBCs, but causes an increase in hepcidin. Iron is sequestered in macrophages and is therefore not available for an efficient bone marrow response. In addition, the haemozoin released by replicating parasites and the cytokines produced during the inflammatory response to infection further inhibit erythropoiesis.

Hepcidin and Plasmodium liver‐stage inhibition

What about the role of hepcidin in the inhibition of superinfection? It is well established that low levels of hepcidin lead to iron accumulation in hepatocytes, whereas macrophages are relatively spared, as observed, for example, in the iron‐overloading disorder hereditary haemochromatosis (Pietrangelo, 2010). By contrast, high hepcidin levels lead to increased reticuloendothelial iron deposition, and mice overexpressing hepcidin have relatively reduced levels of liver iron (Gardenghi et al, 2010). We found that blood‐stage Plasmodium infections caused an iron‐redistribution effect: more non‐haem iron was found in the spleen and less in hepatocytes. This effect would be consistent with an increase in hepcidin mRNA, which was indeed observed in the livers of mice with ongoing blood‐stage parasitaemia. Earlier studies discussed above had hinted that the liver stage of Plasmodium infection is, similarly to the blood stage, iron‐dependent. This is further supported by in vitro and in vivo experiments showing that iron chelation reduces the size of exoerythrocytic forms and the liver parasite load, whereas iron supplementation increases Plasmodium development in hepatoma cultures and in intact mice (Portugal et al, 2011). Hepcidin inhibits the growth of the liver stage by limiting the access of proliferating parasites to iron, as shown by increasing hepcidin using three experimental approaches: injection of hepcidin peptide, infection with a hepcidin‐expressing recombinant adenovirus and the use of transgenic mice engineered to overexpress hepcidin. In all three conditions, the development of liver‐stage Plasmodium infection was significantly reduced.

Could the increased hepcidin caused by blood‐stage infections be responsible for the inhibition of liver‐stage infections? This is probably the case, as clearance of an ongoing blood‐stage infection with chloroquine returned hepcidin mRNA to usual levels and allowed the normal development of a secondary liver infection. In addition, the waxing and waning of P. chabaudi blood‐stage infections—which are naturally cleared in mice—correlated with a rise and then a fall to baseline of hepcidin levels; secondary liver‐stage infections could only develop in P. chabaudi‐infected mice once hepcidin expression had returned to its normal levels. Injecting increasing numbers of parasitized RBCs caused a corresponding increase in liver hepcidin mRNA; parasitaemias above 0.15% and increases in hepcidin expression of more than 3.3‐fold inhibited the development of a liver‐stage infection initiated by injection of sporozoites. Importantly, this upregulation brings hepcidin to levels similar to those found in the transgenic mice in which liver infection was also inhibited (Fig 4). Thus, hepcidin is sufficient to inhibit the development of the liver stage, and its levels correlate with the inhibition of the liver stage mediated by an ongoing blood‐stage infection. Together, these experiments strongly suggest that iron redistribution caused by hepcidin is a key factor that can determine the outcome of liver Plasmodium infection, acting independently of immunity. This provides new insights into how the availability of a nutrient might have an impact on transmission dynamics and disease protection in endemic settings, in which iron deficiency, anaemia and oral iron supplementation are commonplace.

Figure 4.

Rodent models used to study the role of hepcidin and iron in primary Plasmodium infections and superinfection. (A) During blood‐stage infection, hepcidin upregulation inhibits liver infection, suppressing Plasmodium superinfection. (B) Reduction of hepcidin back to levels in non‐infected mice, by using chloroquine to treat blood‐stage parasitaemia, allows secondary sporozoite infection to progress to blood infection. (C) Increased hepcidin levels in the absence of blood‐stage infection—by using transgenic mice that overexpress hepcidin or through the inoculation of hepcidin peptide or hepcidin‐expressing adenovirus into wild‐type mice—impair sporozoite liver infection.

Co‐infections of Plasmodium and other pathogens

The profound effect of hepcidin and iron on Plasmodium liver infection raises the question of whether the incidence and/or severity of infection by other pathogens are altered by the iron redistribution caused by malaria (see Sidebar A). In principle, co‐infection by macrophage‐tropic and iron‐requiring microbes might be favoured by the presence of blood‐stage Plasmodium parasites, as hepcidin induction during blood‐stage infection increases macrophage iron retention. Infection with non‐typhoidal Salmonella (NTS) could fit into this simple scheme in many respects. NTS multiplication within macrophages is iron‐dependent, and changes in the expression of genes that regulate iron alter bacterial growth (Nairz et al, 2007). In particular, the hepcidin–ferroportin axis influences Salmonella replication, as increased ferroportin expression reduces intracellular bacterial growth, an effect that is counteracted by exogenous hepcidin (Chlosta et al, 2006). Accordingly, mice injected with P. yoelii nigeriensis have been shown to be more susceptible to Salmonella infection (Roux et al, 2010). Haemolysis alone does not account for the increased bacterial growth, suggesting that other factors might be important. Although not investigated in that study, one candidate for such a factor would be Plasmodium‐induced hepcidin.

In human populations, malaria seems to be a risk factor for NTS infection in children, and co‐infections can be lethal (Mabey et al, 1987). Peaks of malaria and NTS infection coincide in areas where malaria transmission is seasonal, and NTS is particularly linked with severe malarial anaemia (Graham et al, 2000; Mabey et al, 1987), which, in turn, is associated with higher parasitaemia and hepcidin. Finally, a decrease in the incidence of malaria in The Gambia was closely associated with a decline in the incidence of NTS (Mackenzie et al, 2010). Together, these studies suggest that Plasmodium blood‐stage parasitaemia might promote and exacerbate NTS infection through the induction of hepcidin, causing an accumulation of iron within the cell type that Salmonella favours for replication—macrophages. Other aspects of host–pathogen interactions, such as potential suppression of immune responses and the effects of anaemia, will clearly also be important for the outcome of disease, but the influence of iron on pathogen growth might be strong. Deleterious outcomes of other important infections, including tuberculosis and HIV‐1, are also associated with the accumulation of iron in macrophages (Boelaert et al, 2007; de Monye et al, 1999), and might therefore be exacerbated by Plasmodium. Studies on HIV‐1 and malaria co‐infections have often focused on the undoubted importance of virally induced immune deficiency for the control of Plasmodium, but the effects of iron redistribution due to malaria might not be without consequence, as HIV‐1 replication is iron‐dependent (Drakesmith & Prentice, 2008).

New findings, new perspectives, more questions

If young, nonimmune individuals have a natural protection system from superinfection—which will be important to analyse in populations where Plasmodium is endemic (see Sidebar A)—the old concern that reducing the risk of Plasmodium infection will not necessarily lead to a decrease in the threat of severe malaria throughout life is emphasized once more (Garnham, 1949).

Examples such as that of Sri Lanka after the Global Malaria Eradication Programme initiated in 1955 hint that there is a risk associated with reducing transmission without achieving elimination. The resurgence of malaria when the intervention stopped gained epidemic proportions, and the levels of mortality and morbidity were higher than those reported before the eradication programme (Wijesundera Mde, 1988). Recent results from intermittent preventive treatment of malaria in children indicate that treatment with sulphadoxine‐pyrimethamine plus amodiaquine, although not associated with an increase in the prevalence of malaria infection, was associated with a small increase in the incidence of clinical malaria in the subsequent malaria transmission season (Dicko et al, 2011; Konate et al, 2011). Conversely, results from the trial of the pre‐erythrocytic vaccine RTS,S show that reducing the Plasmodium liver load in children leads to a significant decrease in the severity of disease (Vekemans et al, 2009).

Further work towards fully understanding host–Plasmodium interactions, and the association between transmission and immunity, will be of major importance in defining malaria control interventions and predicting their outcomes over time. Our knowledge about single malaria infections has increased greatly in the past decades, largely owing to developments in systems biology and large‐scale approaches, and to the use of animal models. Infections should be studied—even in experimental models—as they appear in nature, that is, in the presence of co‐infections: an ongoing infection might influence the next one directly or through its effects on the host, and the incoming infection could in turn have some impact on the primary one. Such an approach has also been proposed by others (Boraschi et al, 2008). A recent report showing that a protozoan parasite can itself be infected by a virus, which contributes strongly to the harm caused to the host (Ives et al, 2011), highlights the complexity of host–pathogen interactions and shows that unravelling such complexity is crucial to understanding infectious diseases. Similarly, the unveiling of an unexpected new relationship between the liver and blood stages of Plasmodium infection has ramifications for strategies to eliminate malaria.

Conflict of Interest

The authors declare that they have no conflict of interest.

Acknowledgements

We thank Lucy Eddowes and Andrew Armitage for helpful comments and discussions. This work was supported by Fundação para a Ciência e a Tecnologia (FCT, Portugal), European Science Foundation (EURYI to M.M.M.), Howard Hughes Medical Institute and the Medical Research Council UK. H.D. is a Beit Memorial Fellow for Medical Research and a Medical Research Council New Investigator. S.P. was supported by FCT (SFRH/BD/31523/2006).

References