Adolescent Female Blood Donors At Risk For Iron Deficiency And Associated Anemia
New public health measures could help protect this vulnerable population, authors say
Female adolescent blood donors are more likely than adult female blood donors and nondonors to have low iron stores and iron deficiency anemia, which could have significant negative consequences for their developing brains, a new study led by Johns Hopkins researchers suggests. Based on these findings, the authors propose several measures that could help protect this vulnerable population.
Although blood donation is largely a safe procedure, adolescents are at higher risk for acute donation-related adverse events, such as injuries from fainting during donation, explain study leaders Eshan Patel, M.P.H., a biostatistician in the Department of Pathology at the Johns Hopkins University School of Medicine, and Aaron Tobian, M.D., Ph.D., professor of pathology, medicine, oncology and epidemiology at the Johns Hopkins University School of Medicine and director of transfusion medicine at The Johns Hopkins Hospital.
Additionally, they add, blood donation may increase the risk of iron deficiency: each whole blood donation removes about 200–250 milligrams of iron from the donor. Because adolescents typically have lower blood volumes, donating the same amount of blood costs them a proportionally larger share of their hemoglobin, the iron-containing protein in red blood cells that transports oxygen, and consequently of their iron, than it costs adults. Females are at even greater risk of iron deficiency than males because of monthly menstrual blood loss.
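As a back-of-the-envelope illustration of that proportional effect (the donation size and blood volumes below are assumed typical values for illustration, not figures from the study), a short calculation shows why the same donation takes a larger fraction of an adolescent's blood volume:

```python
# Illustrative sketch with assumed typical values (not from the study):
# a standard whole blood donation removes a fixed volume and a fixed
# amount of iron, so a donor with a smaller total blood volume loses
# a larger *fraction* of both.

DONATION_VOLUME_ML = 500  # assumed typical whole blood donation
IRON_LOSS_MG = 225        # midpoint of the 200-250 mg range cited above

donors = {
    "adult female (~4.5 L blood volume, assumed)": 4500,
    "adolescent female (~3.5 L blood volume, assumed)": 3500,
}

for label, blood_volume_ml in donors.items():
    fraction_lost = DONATION_VOLUME_ML / blood_volume_ml
    print(f"{label}: loses {fraction_lost:.1%} of blood volume "
          f"(and roughly {IRON_LOSS_MG} mg of iron) per donation")
```

Under these assumptions, the adult loses about 11 percent of her blood volume per donation while the adolescent loses about 14 percent, even though the absolute iron loss is the same.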
Numerous studies have shown that younger age, female sex and more frequent blood donation are all associated with lower serum ferritin levels (a surrogate marker for total body iron) in blood donor populations. However, Patel and Tobian note, no study using nationally representative data had compared the prevalence of iron deficiency and associated anemia between blood donors and nondonors, particularly among adolescents.
Toward this end, the researchers analyzed data from the National Health and Nutrition Examination Survey, a long-running study conducted by the Centers for Disease Control and Prevention to assess the health and nutritional status of adults and children in the U.S. through physical exams and interviews. From 1999 to 2010, the survey collected blood samples and asked participants about blood donation in the previous 12 months.
The researchers identified 9,647 female participants ages 16–49 who had provided both blood samples and blood donation history; 2,419 of them were adolescents ages 16–19.
They report in the journal Transfusion on Feb. 19 that about 10.7 percent of the adolescents had donated blood within the past 12 months, compared with about 6.4 percent of the adults. Mean serum ferritin levels were significantly lower among blood donors than among nondonors in both the adolescent (21.2 vs. 31.4 nanograms per milliliter) and adult (26.2 vs. 43.7 nanograms per milliliter) groups. The prevalence of iron deficiency anemia was 9.5 percent among adolescent donors and 7.9 percent among adult donors, relatively low figures, but significantly higher than the 6.1 percent prevalence among nondonors in both age groups. In addition, 22.6 percent of adolescent donors and 18.3 percent of adult donors had absent iron stores.
Collectively, the authors say, these findings highlight the vulnerability of adolescent blood donors to donation-associated iron deficiency.
“We’re not saying that eligible donors shouldn’t donate. There are already issues with the lack of blood supply,” Tobian says. “However, new regulations or accreditation standards could help make blood donation even safer for young donors.”
Source Newsroom: Johns Hopkins Medicine