From Conjecture to Reality to Controversy: Blood Transfusions and Blood Banking in America

Nate Zaroban is a fourth-year Biological Chemistry major and Neuroscience concentrator at Grinnell College. After graduating from Grinnell, Nate will be attending the University of Nebraska College of Medicine starting in the Fall of 2018. He took the History of American Health and Medicine course to learn more about the profession that he will begin training for in the near future.

Blood has had great symbolic significance throughout history. For example, during the Last Supper, Jesus Christ of Nazareth poured out his blood for his disciples to drink, promising eternal life if they did so [1]. Blood oaths, in which both parties spill their own blood to demonstrate their sincerity and sacrifice, have been widely revered as definitively binding agreements. Count Dracula, the famous monster from the 1897 novel by Bram Stoker, has terrified readers for well over a century because of his desire to bite his victims and consume their blood directly from their bodies [2]. It’s clear that there is, and nearly always has been, a deep-rooted connection between one’s blood and one’s personal identity, soul, and being. So, taking the blood from one person and putting it into the body of another is no small matter. I aim to lay out a road map of how medicine came to rely on blood transfusions and the widespread implementation of blood banking systems, and to discuss the societal issues that had to be confronted to get there, particularly in the United States of America. The scientific advances that slowly brought blood transfusions from mere speculation to mainstream medicine were accelerated by times of desperation and imminent threat to life. And these advances generated waves of societal discourse and debate that forced America to face new issues surrounding race and blood.

Although it is fruitlessly debated who first conceived the idea of taking the blood of one person and putting it into another, it is accepted that the first person to definitively describe blood transfusions was Andreas Libavius in 1615 [3]. Libavius suggested the blood of young, healthy males should be collected and given to weak, elderly people to restore strength and vitality. Historians doubt that Libavius actually performed these transfusions, but his descriptions reinforce the notion that blood is the source of life in humans. The first documented animal-to-animal blood transfusion is credited to Richard Lower of Oxford in 1665, in which he exchanged the blood of two dogs [3]. After knowledge of the possibility of blood transfusions spread, the remainder of the 17th century consisted of an indecipherable cacophony of unregulated transfusion experiments (including animal-animal, human-human, and animal-human transfusions) performed by many scientists all over Europe. These experiments yielded wildly unsuccessful results, including some particularly gruesome subject deaths. Because of the predominant failure of these transfusion experiments, continued study was virtually abandoned until the very late 1800s [3].

The discovery of different blood types by Austrian scientist Karl Landsteiner in Vienna in 1900 kickstarted transfusion experimentation again [4]. Other scientists had already begun to test for differences in blood between species that caused adverse clotting reactions when mixed. But, no one had yet thought to look for molecular differences in human blood. Through the mixing of different subjects’ blood samples, and tracking each mixture’s agglutination status, Landsteiner identified four different blood ‘groups’ (A, B, AB, and O). By 1910, W.L. Moss, an American physician-scientist, had caught up to the work of Landsteiner, also identifying all four blood types [4]. Following Moss’ discoveries, transfusion science programs began at several prestigious university-hospitals in the Northeast United States [4,5]. During this boom of blood transfusion research, a young medical student named Oswald Hope Robertson graduated from Harvard Medical School in 1915 [5]. Robertson’s mentor’s research interests focused on transfusion medicine, and Robertson followed suit. One of Robertson’s most significant contributions to transfusion medicine knowledge was that the “O” blood type could be considered “universally compatible,” as it never coagulated when mixed with other blood types [5]. Around this same time, it was also discovered that the clotting of blood samples taken from patients could be delayed with the addition of citrate [5,6,7]. These discoveries would prove invaluable in the very near future.

When the United States entered World War I in April 1917, Robertson was deployed to the Western Front to serve as a trauma doctor [5]. At this point, nearly all transfusions were still performed directly from donor to recipient [4]. But, Robertson quickly learned that transfusing blood directly from healthy soldiers to injured comrades on the frontlines was a wildly inefficient method. So, he put his knowledge of universal blood and the anti-clotting agent citrate to good use. Robertson formulated a system in which he identified O-type soldiers in his platoon, preemptively collected blood samples from them, added citrate to the samples, and stored them on ice until they were needed for immediate transfusion [5,6]. This blood banking system was the first of its kind, and Robertson’s medical team saw increased patient survivability after its introduction. It was so successful, in fact, that Robertson was sent to train other medical teams to establish their own transfusion blood banks [5]. This massive leap in transfusion medicine was forced by the imminent threat of war and death. Robertson had to make it work (and fast) to save the lives of thousands of soldiers.

After World War I ended, Oswald Robertson eventually became a Professor of Medicine at the University of Chicago in 1927. While there, Robertson became an informal advisor to a man named Bernard Fantus [5,6]. In 1933, Fantus took the position of Director of Therapeutics at Cook County Hospital in Chicago, a public charity hospital that served the impoverished [6,8]. Cook County was struggling when Fantus took over. It didn’t have sufficient funds to pay healthy people to donate blood for transfusions, and many patients who came in requiring transfusions couldn’t pay either [8]. Fantus had learned of the significant impact Robertson’s blood banking methods had overseas during the War and inquired about utilizing similar methods “at Cook County Hospital where large numbers of critically injured patients were admitted who had no friends or relatives to donate the blood needed to save their lives” [6]. With Robertson’s guidance, Fantus opened the first public “blood bank” in America at Cook County Hospital in 1937, using an interesting model for sustainability [8]. When patients who arrived needing blood transfusions could not pay, they were required to repay their debts by donating blood back to the Hospital bank when they were healthy [8]. This method’s success was three-pronged, as it ensured pre-typed blood samples would always be available, made life-saving transfusion interventions immediately accessible to all who needed them, and alleviated some of the Hospital’s financial stress. Just like the successes Robertson saw in the war, Fantus too had successful outcomes in his hospital [8]. This success led Fantus and his colleagues to advocate for the implementation of similar blood banking systems in other Chicago-area hospitals and in other large U.S. city hospitals [8]. Within the following years, blood banks were quickly established in major hospitals in cities such as Boston, Detroit, Los Angeles, Washington, D.C., New York City, and Memphis [8].

As transfusion medicine became more mainstream, the question of whether a transfusion would save or end a patient’s life diminished. Instead, patients began to worry about whose blood they were receiving [7]. Although scientific research suggested that race played no role in the biological differences of blood types, fear of mixing the blood of people of different races lurked on hospital floors. Because blood banks at this point still operated on an individual hospital-to-hospital basis, each hospital dealt with these issues as it saw fit. As in other aspects of society, the severity of the racism depended on how far south in America the hospital was located. Some northern hospitals labeled blood samples with the donor’s race; others did not. Several southern hospitals had separate storage areas for the blood samples of white and black people [7]. The Second World War would bring these issues at the intersection of race and blood to the forefront of national attention.

During the years leading up to World War II, the procedures of sterile blood collection, typing, storage, and transfusion continued to make gradual, but steady, advances [8]. But, just as the First World War had propelled transfusion science forward, the looming threat of a Second World War sparked another rapid push toward widespread transfusion medicine. In June of 1941, the United States Surgeon General requested that the American Red Cross begin a nationwide blood procurement effort to stockpile massive quantities of blood [7,8,9]. As Swanson explains: “During the war the domestic collection of blood for the American armed forces was, at the time, the largest organized medical effort ever undertaken in the United States…” [8]. The aim of these efforts was to send blood and plasma to American and English troops [7,8,9]. The Red Cross’ collection program was massively successful in terms of mobilization, promotion, and the quantity of blood collected, and it continued all the way through the end of World War II [8,9]. It unified the nation, as American citizens across the country felt it was their duty to donate blood to help the troops [8]. At least, this was true for White America.

With the introduction of a centralized national blood collection service, the United States government had to address the concerns surrounding race and blood. Though noting the lack of evidence to suggest that mixing blood of different races was dangerous, the U.S. War Department still issued the following statement advocating for the separation of blood: “For reasons which are not biologically convincing but which are commonly recognized as psychologically important in America, it is not deemed advisable to collect and mix Caucasian and Negro blood indiscriminately for later administration to members of the military forces” [7]. And so, the American Red Cross barred the collection of African-American blood [7,9]. The irony of the situation was that while this ensured white servicemen would receive the blood of white donors, black servicemen also had to receive the blood of white donors. After a backlash from the African-American community, and an increasing demand for blood, the U.S. Government allowed the Red Cross to collect blood from black donors, as long as it was kept separate from white blood [9]. But, this response did not alleviate any of the societal tension: “To many African Americans and their allies, segregation proved every bit as intolerable as exclusion, and so blood became one of the war’s great civil rights causes and flashpoints for debates about race” [9]. America was simultaneously fighting two wars. It just so happened that the one fought on foreign soil against the Axis powers brought forth the one fought on American soil over the politics of blood.

It’s easy to understand why such great significance has been placed on blood. Lose enough of it and we cease to exist. If given the wrong kind, our bodies reject it and we suffer a terrible death. Surely every person involved in the drawn-out narrative of bringing blood transfusions from imagination to modern practice had to have felt the gravity of the miracle they were striving for: replacing an integral part of one human with that of another. In this narrative, every little contribution was important, but the unprecedented bloodshed of the two World Wars undoubtedly catapulted transfusion science forward. And, in this narrative (as we so often see throughout history), as humankind discovered more about life and its own existence, more questions and uncertainties arose. Unfortunately, these newfound uncertainties found a home in racism. But, the American societal discourse rooted within the overlap of race and blood proved to be a point of contention through which Americans could struggle with and learn what exactly it means to be human.



[1] The Holy Bible, King James Version (Philadelphia: National Publishing Company, 1997).

[2] Stoker, Bram, Dracula (London: Archibald Constable and Company, 1897).

[3] Greenwalt, T.J., “A Short History of Transfusion Medicine,” Transfusion, 37 (1997): 550-551.

[4] Schneider, William, “Chance and Social Setting in the Application of the Discovery of Blood Groups,” Bulletin of the History of Medicine 57, no. 4 (1983): 545-62.

[5] Hess, J.R. and Schmidt, P.J., “The first blood banker: Oswald Hope Robertson,” Transfusion, 40 (2000): 110–113.

[6] Telischi, M., “Evolution of Cook County Hospital Blood Bank,” Transfusion, 14 (1974): 623–628.

[7] Schneider, William, “Blood Transfusion Between the Wars,” Journal of the History of Medicine and Allied Sciences 58, no. 2 (2003): 187-224.

[8] Swanson, Kara W. “Banks That Take Donations.” In Banking on the Body, 49-83. Harvard University Press, 2014.

[9] Guglielmo, Thomas, “‘Red Cross, Double Cross’: Race and America’s World War II-Era Blood Donor Service,” The Journal of American History 97, no. 1 (2010): 63-90.

Further Readings

Alter, Harvey and Klein, Harvey, “The hazards of blood transfusion in historical perspective,” Blood, 112, no. 7 (2008): 2617-2626.

Grove, Jairus. “Blood.” In Making Things International 1: Circuits and Motion, 184-200. University of Minnesota Press, 2015.

Kenny, Michael, “A Question of Blood, Race, and Politics,” Journal of the History of Medicine and Allied Sciences, Vol. 61, No. 4 (2006): 456-491.