More than 100 years ago, Ludvig Hektoen first suggested that blood transfusion might be made safer by crossmatching the newly discovered blood types between donors and recipients.
This suggestion led to an avalanche of discoveries related to transfusion medicine that would ultimately result in the establishment of the world's first blood banks.
Then in 1950, Carl Walter and W.P. Murphy, Jr. introduced the plastic bag for blood collection, which was "one of the single most influential technical developments in blood banking."*
Seventy years later, the technologies being developed to aid in transfusion and cellular therapies are leaps and bounds beyond plastic bags and include such innovations as machine learning, microfluidic devices and robotics. AABB News recently spoke with several researchers and scientists about some of these recent advances.
Imagine if a computer could be used to identify the precise amount of time that each individual blood donor should wait before returning to donate again. That is exactly what Alton Russell, a PhD student in the Department of Management Science and Engineering at Stanford University, and colleagues did.
"Internationally, there is a lot of disparity in how frequently countries let blood donors give blood," Russell told AABB News. "The U.S. allows people to give blood every 8 weeks, but that is the shortest inter-donation interval of any country."
In Canada, although men can return every 8 weeks, women are required to wait 12 weeks between donations. In the United Kingdom, men must wait 12 weeks and women, 16 weeks.
"There is mounting evidence from the last 30 years that the interval may be too short for some blood donors, and they don't fully recover their iron stores during that time," Russell said. With an increasing reliance on repeat donors, blood bankers must balance the need for a robust blood supply against the risk of deferring iron-deficient donors.
In their study, Russell and colleagues wanted to see if they could use machine learning to identify personalized inter-donation intervals that would balance risk to donors against risk to the blood supply. They used about 2 years of data from the REDS-II RISE dataset, covering 3,162 donations from 1,025 U.S. donors.
"We took those data and trained a machine learning model using characteristics like ferritin level, hemoglobin level, how frequently they had given blood in the past, and questionnaire responses on things like diet and iron supplementation," Russell said. "We wanted to predict—based on how long they wait to come back—the likelihood of hemoglobin deferral, having a completed donation with low iron, a completed donation with absent iron or no problems."
Using the model, they were able to estimate how the risk of adverse outcomes changed in terms of inter-donation intervals and establish a level of risk they were comfortable with.
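The approach described above can be sketched in code: train a classifier on donor characteristics plus the candidate inter-donation interval, then, for a given donor, pick the shortest interval whose predicted risk of an adverse outcome falls below a chosen threshold. This is a minimal illustration only, not the study's actual model; the feature set, the synthetic data and the `personalized_interval` helper are all invented for the sketch.

```python
# Hypothetical sketch: predict adverse-outcome risk from donor features
# and a candidate interval, then choose the shortest acceptable interval.
# All data and thresholds here are synthetic stand-ins, not REDS-II RISE.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic donor records: ferritin (ng/mL), hemoglobin (g/dL),
# donations in the past year, and interval waited (weeks) before return.
n = 2000
X = np.column_stack([
    rng.uniform(5, 150, n),    # ferritin
    rng.uniform(11, 17, n),    # hemoglobin
    rng.integers(0, 6, n),     # prior donations
    rng.integers(8, 53, n),    # interval waited (weeks)
])
# Invented outcome rule: low ferritin plus a short interval raises the
# chance of an adverse outcome (deferral or low/absent-iron donation).
risk_score = (50 - X[:, 0]) / 50 + (20 - X[:, 3]) / 30
y = (risk_score + rng.normal(0, 0.3, n) > 0.4).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

def personalized_interval(ferritin, hemoglobin, prior_donations,
                          max_risk=0.2, candidates=range(8, 53, 2)):
    """Return the shortest candidate interval (weeks) whose predicted
    adverse-outcome probability is at or below max_risk, else None."""
    for weeks in candidates:
        features = [[ferritin, hemoglobin, prior_donations, weeks]]
        if model.predict_proba(features)[0, 1] <= max_risk:
            return weeks
    return None  # no acceptable interval within a year

# A donor with robust iron stores vs. one with depleted stores: the
# model should clear the first to return sooner than the second.
print(personalized_interval(120, 15.0, 1))
print(personalized_interval(10, 12.5, 4))
```

In practice the candidate-interval sweep is what turns a risk model into a policy: the same fitted classifier yields a different recommended wait for each donor profile, which is the "personalized interval" idea the study evaluates.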
"Some donors had very low risk even if they came back only 8 weeks later, and others remained likely to have hemoglobin deferral or absent iron even if they waited a whole year," Russell said. "There was a huge variation in underlying iron status and how long people took to recover."
Based on the REDS-II RISE data, personalized inter-donation intervals would have decreased blood collections by 30%, but they would also have reduced hemoglobin deferrals by 45%, low-iron donations by 12% and absent-iron donations by 73%.
"In principle, a blood center could implement this and, whenever someone comes to give blood, use questionnaire answers and physiologic measurements to personalize inter-donation intervals to each donor," Russell said. "Certain donors, based on their profile, could be told to come back 8 weeks later, and others would be told that they need to wait longer until the risk is low enough."
*Read the full article for references and to learn more, including research on high-quality donors, simplifying blood draws, and scalable platelet supplies from researchers across North America.