Many clinical studies have explored the relationship among iron levels, iron supplementation, and malaria. Taken together, these studies have consistently indicated that iron deficiency reduces malaria risk, while iron supplementation increases malaria susceptibility.
Because of these observations, many public health programs in malaria-endemic regions have suspended efforts to prevent iron-deficiency anemia. A new study, however, provides the first cellular and molecular explanation for why iron supplementation increases malaria susceptibility.
The study, “Host iron status and iron supplementation mediate susceptibility to erythrocytic stage Plasmodium falciparum”, was published online in the journal Nature Communications. Researchers from the University of North Carolina at Chapel Hill’s Gillings School of Global Public Health, the UNC School of Medicine, Duke University, and the London School of Hygiene & Tropical Medicine used an in vitro culture system with freshly isolated red blood cells from iron-deficient human donors recruited through UNC’s Clinical and Translational Research Clinic. This approach allowed the team to minimize confounding factors that have complicated field studies in malaria-endemic areas.
“Our results provide the first experimental validation of field observations reporting protective effects of iron deficiency and harmful effects of iron administration on human malaria susceptibility,” said senior author Dr. Carla Cerami, research assistant professor of epidemiology at the Gillings School.
“We have demonstrated that the malaria parasite is less able to invade and replicate in red blood cells from individuals with iron deficiency,” said lead author Dr. Martha Clark, a recent alumna of the UNC Department of Microbiology and Immunology, “and these effects were reversed by iron supplementation.”
“When people with iron deficiency are given iron supplementation, the rate at which they produce red blood cells increases, which leads to an increase in the numbers of circulating young red blood cells,” said Ms. Morgan Goheen, an MD/PhD student in microbiology and immunology. “These young red blood cells are most susceptible to the malaria parasite.”
The study led researchers to hypothesize that the risk of malaria is highest when people are recovering from iron-deficiency anemia. In other words, a person is least vulnerable while iron-deficient and anemic, more vulnerable once iron-replete, but most vulnerable during the transition between those states, when the body produces a surge of young red blood cells that the parasite preferentially infects.
“Our end goal is to find a way to safely administer iron supplements in malaria endemic regions,” said Dr. Andrew Prentice, professor of international nutrition at the London School of Hygiene & Tropical Medicine, and head of the Nutrition Theme at the MRC Unit in The Gambia.
Given the great burden of anemia, infection, and nutritional disorders in the malaria-endemic world, these findings have broad implications for public health. Throughout the developing world, at least 30 percent of pregnant women and children of preschool age are iron-deficient. The total number of deaths from malaria is declining each year, after peaking at about 1.8 million in 2004; even so, the most recent figure of about 600,000 deaths each year remains high.
The authors note that their study was limited by its use of red blood cells from only a small number of iron-deficient people living in a non-malaria-endemic area. They plan to increase the sample size in further studies, which will be conducted in a malaria-endemic area.
Dr. Cerami’s UNC co-authors, in addition to Dr. Clark and Ms. Goheen, are Dr. Nancy Fisher, of the School of Medicine’s Department of Microbiology and Immunology; graduate student Mr. Jaymin Patel, of the Gillings School’s Department of Epidemiology; Dr. Raj S. Kasturi, of the Department of Medicine’s Division of Hematology/Oncology; and UNC undergraduate student Ms. Marwa A. Elnagheeb.
This study was funded by the Eunice Kennedy Shriver National Institute of Child Health and Human Development and the Bill & Melinda Gates Foundation.