Part 1
History and Development of Transfusion Medicine
CHAPTER 1
From Blood Transfusion to Transfusion Medicine
Alice Maniatis
Hematology Department, Henry Dunant Hospital, Athens, Greece; Network for Advancement of Transfusion Alternatives (NATA)
Attempts at using blood transfusion for the treatment of bleeding and anemia were made centuries ago, mostly with disastrous effects, and although James Blundell (1818) is credited with the first successful transfusion, it was only during the twentieth century that blood transfusion came of age. The first half of the century was the era of pioneers and ingenious, hardworking individuals who made major breakthroughs.
The second half saw the organization of large institutions charged with developing methods of procurement of safe and effective blood products.
During this time, developments occurred in quick succession in a variety of fields like immunology, biochemistry, microbiology, genetics, molecular biology, and biotechnology, all impacting on transfusion and leading to today’s complex therapeutic intervention and the new specialization of transfusion medicine.
“The history of blood transfusion is marked by numerous bright pages but also some dark moments” as pointed out by Douglas Starr.
The surgical phase
Blood transfusion was introduced by surgeons, as theirs was the main need of finding a method of transferring blood from donor to patient. Alexis Carrel became famous for accomplishing a transfusion through suturing of vessels of the donor to those of the patient, in this case, the father to his baby daughter.
The technique underwent numerous modifications with the use of cannulae and tubes but remained difficult and cumbersome, so it was used only infrequently. By the end of the first decade of the twentieth century, surgeons were performing some 20 transfusions a year at Mount Sinai Hospital, New York.
New York had become home to a number of prominent physicians and scientists like Carrel, Landsteiner, Lindemann (the first full time specialist in transfusion, who introduced the multiple syringe method of transfusion), and others.
Direct donor–patient transfusions performed by surgeons continued to be practiced for decades, even as late as the early 1940s, though as described by Douglas Starr, “nobody liked transfusion as it existed, not the patient or the donor, not even the doctors, who spent more time performing the transfusion than the operation they were using it for.” In addition to being cumbersome, transfusions resulted in severe reactions in more than 30% of instances.
The laboratory phase
By the end of the first decade, Landsteiner’s discovery of the ABO blood groups, dating to 1900, began to enter the transfusion field, to a large extent through the efforts of Ottenberg, who was also the first to use compatibility testing before transfusion. He was thus able to reduce posttransfusion accidents, concluding that “accidents can be absolutely excluded by careful preliminary tests.”
The next problem to be addressed was clotting of the blood, which necessitated either suturing of donor to patient vessels or very rapid removal and reinjection of the blood, which presented technical difficulties.
The syringe technique introduced by Lindemann eliminated the need for suturing blood vessels, as he inserted needles into the veins of donor and patient, and withdrew and reinjected blood with syringes; nevertheless, the method required quick action and clotting was not always prevented. So the next hero proved to be Lewisohn who introduced sodium citrate as the anticoagulant, publishing his method in 1915.
Surgeons, however, were apparently reluctant to accept the simplified procedure offered by anticoagulation (namely the collection of blood in a vessel containing sodium citrate); they wanted to maintain transfusion as “a complicated and lucrative operation.”
Donor recruitment
Despite the use of anticoagulants, blood could not be stored for any length of time, hence the need for proximity of donor and patient. Compatible donors had to be recruited by the doctor, either from the patient’s family or circle of acquaintances, so the donor supply was difficult and unreliable. In London, donor recruitment techniques similar to those used to this day were developed; donors were tested for ABO group and called by telephone when needed. Through the efforts of Percy Oliver, 2500 nonremunerated donors were made available to London hospitals by 1930. Oliver’s example was followed in other countries as well.
Meanwhile in Russia, Dr Alexander Bogdanov in 1926 established the “Central Institute of Hematology,” where research in transfusion was carried out; experimenting with transfusion on himself, he eventually died of massive intravascular hemolysis. In 1930, cadaver blood for transfusion was used for the first time in Russia by Dr Serge Yudin.
Blood banking
It was in Russia that the idea of storing blood originated with Dr Yudin, leading to the institution of blood banks. Blood storage facilities spread throughout the country, and blood was stored for weeks, resulting in a high percentage of reactions. The establishment of blood banks followed in Europe and the United States. In 1937, Bernard Fantus in Chicago established what was initially called the Blood Preservation Laboratory, later changing the name to Blood Bank as it operated with deposits and withdrawals of blood! This, in my opinion, unfortunate name lingers to this day throughout the world, giving false messages to potential donors.
Eventually, with the improvement of storage vessels, anticoagulants, and preservatives, longer storage periods became possible, and in the 1940s, blood collection and transfusion stopped being a surgical enterprise and came into the hands of blood bankers. The year 1940 also marks the separation of plasma from whole blood. With regard to the volume of blood to be collected, based on experiments carried out in the 1930s and 1940s, it was decided that it should not exceed 13% of the donor’s estimated blood volume; the 70 mL/kg rule may not be very accurate, as pointed out by Frank Boulton, but it has prevailed ever since, as has the addition of 120 mL of anticoagulant to each blood unit.
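As a rough illustration of the donation limit just described, the sketch below works through the arithmetic of the 70 mL/kg estimate and the 13% ceiling; the 75-kg donor weight is a hypothetical example, not a figure from the text.

```python
# Minimal sketch of the donation-volume limit described above.
# Assumptions: 70 mL/kg estimated blood volume and a 13% maximum draw,
# as stated in the text; the 75-kg donor is a hypothetical example.
def max_donation_ml(weight_kg: float,
                    ml_per_kg: float = 70.0,
                    max_fraction: float = 0.13) -> float:
    """Estimate the maximum whole-blood draw for a donor of a given weight."""
    estimated_blood_volume = weight_kg * ml_per_kg   # e.g. 75 kg -> 5250 mL
    return estimated_blood_volume * max_fraction     # 13% of that volume

if __name__ == "__main__":
    # A 75-kg donor: ~5250 mL estimated volume, ~682 mL ceiling,
    # comfortably above a standard ~450-500 mL collection.
    print(f"Maximum draw: {max_donation_ml(75):.0f} mL")
```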
World wars
The need for blood transfusion skyrocketed during World War II, leading to a series of developments: glass bottles for blood collection, acid citrate dextrose, developed by Patrick Mollison, for blood anticoagulation, and the separation and fractionation of plasma by Cohn with the production of albumin. Dried plasma and albumin were used as volume expanders on the battlefields during World War II. “By the end of 1943, the military had received more than two and a half million packages of dried plasma and nearly 125,000 ampoules of albumin” as mentioned by Starr.
In parallel with these developments, blood group serology was progressing thus increasing the safety of transfusion. The Coombs test introduced in 1945 for pretransfusion testing reduced significantly the risk of immune hemolysis of transfused RBCs. New methods of antibody detection led to the recognition of blood group systems, an endeavor that continues until today.
Blood collection—blood centers versus hospital blood banks
Blood collection from volunteers was relatively easy during the war but became increasingly difficult after the end of the war. Some countries like France and England managed to proceed to the development of National Blood Transfusion Centers and adopt the idea that blood should be voluntarily given without payment to donors. They developed networks of smaller and larger blood banks for collection and distribution of blood to hospitals.
In other countries such as Switzerland and Canada, it was the Red Cross that assumed the responsibility to recruit volunteer donors and supply blood products. In the United States, the American Association of Blood Banks formed in 1947 emphasized individual responsibility for blood procurement, asking patients to replace transfused units or else reimburse the blood bank; in contrast, the Red Cross supported community responsibility.
In many countries, a multitude of small blood banks collecting blood from paid blood donors prevailed in the 1950s and 1960s. By the late 1960s, it became apparent that most deaths from transfusion worldwide were caused by viruses, bacteria, or parasites in the blood, and that the incidence was higher with paid donor blood, leading to pressure to eliminate paid blood donation. In some instances, this led to the substitution of paid donors by friends and relatives of patients, the so-called “replacement donors.” Replacement donors are safer than paid donors but not as safe as truly volunteer donors.
Even today, very few countries have achieved 100% collections from truly volunteer donors.
Blood components: hemapheresis
Progress in the technology of blood collection allowed the separation of whole blood into cellular components and plasma, making it possible to cover the transfusion needs of more than one patient with one unit of blood. The terms “component therapy” and “blood economy” were coined by Edwin Cohn. In developed countries, whole blood transfusion is a rarity nowadays as each unit is separated into red cells, plasma, and platelets.
Plasmapheresis, a term coined in 1914 by John Jacob Abel, described the removal of plasma while returning the cells to the donor. It was initially conceived as a treatment to remove toxic substances from blood but evolved into a component production technique to provide plasma for transfusion and also for fractionation. Initially it was carried out manually, but it expanded as automation became available in the 1960s. Blood cell separators made the procedure faster and safer and yielded a better product. The need for albumin, gamma globulins, and coagulation factors encouraged the expansion of the fractionation industry, with numerous companies becoming active throughout the world.
Therapeutic plasmapheresis, or rather plasma exchange, has contributed significantly to the treatment of hematologic, autoimmune, and metabolic diseases by the removal of antibodies, immune complexes, monoclonal proteins, or cholesterol.
Selective removal of cells (platelets, granulocytes, erythrocytes, and hemopoietic progenitor cells) with discontinuous or continuous cell separators is carried out today in blood banks around the world. Platelet apheresis, available since the 1970s, is gaining ground, gradually replacing the recovery of random platelets for transfusion. Peripheral blood stem cell collection is also replacing bone marrow harvesting for bone marrow transplantation. Red cell apheresis is the most recent development, with advantages to both donors and patients, but it is limited to larger donors.
Blood safety
The 1970s were marked by progress in the safety of blood through the introduction of screening for hepatitis B virus, which reduced the incidence of posttransfusion hepatitis (PTH), followed by documentation of residual PTH, and the identification of hepatitis C, for which testing was developed in the early 1990s. Unfortunately, the 1980s were marked by the AIDS epidemic, which caused a tremendous amount of grief to both patients and blood providers.
Pathogens continued to emerge calling for constant vigilance; West Nile virus and Chikungunya are the most recent invaders of the blood supply, but such epidemics are quickly brought under control nowadays.
Transfusion risks are not limited to infectious agents; alloimmunization and transfusion reactions, platelet refractoriness due to HLA and antiplatelet-specific antibodies, immunosuppression, transfusion-associated graft versus host disease, and TRALI (transfusion-related acute lung injury) have all received attention in the last 20 years, and measures to prevent them are continuously being studied.
Since a number of risks are attributed to the leukocytes in blood units, leukodepletion, or reduction of leukocytes in blood units by filtration, was introduced some 20 years ago and has proven to be effective in reducing febrile reactions, platelet refractoriness, cytomegalovirus transmission, red cell alloimmunization, and transfusion-induced immunosuppression.
The latest weapon in enhancing the safety of blood products is the inactivation of pathogens.
Solvent detergent treatment of plasma disrupts lipid-enveloped viruses and has been used in pooled plasma since the 1990s, whereas methylene blue, a photoactive virucidal agent, can be added to single units, as it has proven to be safe, especially since it is removed before transfusion. Inactivation of pathogens in cellular components is proving more difficult, although for platelets, treatment with psoralen and UVA light is proving feasible and effective. Although screening for viruses will continue, treatment of blood components could be added to reduce the risk of pathogens that we cannot test for.
Information technology (IT) is also adding to the safety of blood transfusion; electronic medical records, electronic blood donor records, computer crossmatch, and virtual blood inventories are beginning to change the way transfusion medicine is practiced.
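As an illustration of the kind of rule an electronic (computer) crossmatch system encodes, the sketch below checks ABO compatibility of donor red cells against a recipient; this is a simplified assumption-laden example (real systems also verify Rh type, antibody screens, and unit status), and the function and table names are hypothetical.

```python
# Illustrative sketch of an ABO red cell compatibility check, the sort of
# rule a computer crossmatch encodes. Simplified: Rh, antibody screening,
# and unit-status checks are omitted; names are hypothetical.
ABO_COMPATIBLE_DONORS = {
    "O":  {"O"},
    "A":  {"A", "O"},
    "B":  {"B", "O"},
    "AB": {"AB", "A", "B", "O"},
}

def red_cells_compatible(recipient_abo: str, donor_abo: str) -> bool:
    """Return True if donor red cells are ABO-compatible with the recipient."""
    return donor_abo in ABO_COMPATIBLE_DONORS[recipient_abo]

# Example: a group A recipient may receive group O red cells,
# but a group O recipient may not receive group AB red cells.
assert red_cells_compatible("A", "O")
assert not red_cells_compatible("O", "AB")
```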
Alternatives to allogeneic transfusion
The realization that blood can never become 100% safe gave impetus to the development of transfusion alternatives.
Autologous transfusion
- Autologous transfusion, initially by predeposit autologous blood collection before surgery, took off mainly in the 1980s after the AIDS epidemic; its advantages (safety, economy of allogeneic blood) were soon counteracted by disadvantages, mainly cost, and its practice is now limited to selective indications.
- Intraoperative hemodilution, the removal of two units immediately before surgery and replacement of the volume with crystalloid, proved feasible and had the advantage of decreasing the loss of red cells during surgery, but concerns over cardiac ischemia have limited its application to experienced centers.
- Intraoperative red cell salvage, particularly with the automated centrifugation and washing machines introduced in the late 1980s, is gaining ground. The method is safe but is suitable mainly for major procedures with significant predicted blood loss, such as cardiovascular, vascular, and orthopedic operations.
- Postoperative red cell salvage, namely blood collected from drains in the first 6 hours following surgery and reinfused without manipulation, is simple and is adopted mainly by orthopedic teams, but concerns regarding reinfusion of activated plasma proteins and wound debris remain.
Pharmacologic alternatives
Hemopoietic growth factors became available in the 1990s as a result of progress in recombinant technology.
Erythropoietin was the first one to be used, in renal disease, resulting in a drastic decrease in transfusions for these patients. The indications for rhEPO have expanded, reducing the need for transfusion in hematologic disease and cancer patients as well as in the anemia of chronic disease and of prematurity.
Colony-stimulating factors (CSFs), granulocyte (G-CSF) and granulocyte-macrophage (GM-CSF), are widely used for chemotherapy-induced, chronic, and neonatal neutropenia and have resulted in decreased mortality from infection.
The use of thrombopoietin for the treatment of thrombocytopenia has been under investigation for the past 10 years but has not yet had an impact in reducing platelet transfusions.
Hemostatic agents
Almost 50% of blood units are transfused during surgical procedures, so, if perioperative blood loss could be reduced, transfusions would also be reduced.
Antifibrinolytic agents such as tranexamic acid, epsilon-aminocaproic acid, and aprotinin have all been used in the last 20 years and have resulted in significant decreases in the need for transfusions, mainly in cardiovascular surgery; unfortunately, aprotinin was recently implicated in thrombosis and myocardial infarction and has been withdrawn from the market.
Fibrin sealants
Topical agents made of fibrinogen and thrombin or platelet gel applied on surgical surfaces to accelerate hemostasis have been developed in the last 10 years and are used mainly in cardiovascular and orthopedic surgery.
Red cell substitutes
The greatest hope for reducing the need for transfusions was the development of red cell substitutes; perfluorocarbons and hemoglobin-based oxygen carriers have been the subject of intense investigation for more than 20 years but safety problems are still limiting them to clinical studies.
Hemovigilance quality systems
Systematic surveillance of adverse transfusion effects began in the 1990s; France was the first country to implement such a system, in 1993, followed by the United Kingdom in 1996. Today, most European countries have a hemovigilance system, although it is not obligatory in all of them. In addition to disease transmission and reactions, these systems document errors occurring in the entire transfusion chain; by far, the most frequent adverse events were those resulting from errors in the transfusion process leading to the transfusion of ABO-incompatible blood. Implementation of hemovigilance has led to the establishment of new guidelines for a number of procedures.
In the last 15–20 years, emphasis was given to the application of quality systems principles; good manufacturing practices (GMPs) and quality management systems have been implemented in blood centers, leading to better standardization of blood products and reduction of errors and accidents.
Transfusion medicine
Blood transfusion started out as a relatively simple replacement therapy for bleeding or anemic subjects. The last 20 years, however, have seen tremendous progress in the development of a number of blood products and in their safety; at the same time, emphasis was placed on the proper indications for transfusion and on the choice of available specialized blood products to cover the needs of patients. Hemotherapy acquired a complexity that necessitated specialized knowledge, and studies began to show the deficiencies in clinicians’ knowledge when making transfusion decisions. The effectiveness of transfusion came under scrutiny, while the risks remained significant. With the emergence of therapeutic apheresis and stem cell collection for transplantation, blood bank personnel accustomed to dealing with healthy subjects such as blood donors now have to deal with patients; clinical laboratory training is no longer sufficient. These developments created the need for a new medical discipline, namely transfusion medicine. Transfusion specialists trained in laboratory medicine, pharmaceutical production, clinical medicine, epidemiology, stem cell transplantation, and legal, ethical, and administrative aspects could bridge the gap between the blood bank and the clinicians, be they internists, anesthesiologists, or surgeons. Clinician education and audits of transfusion practice are the tools by which transfusion specialists aim to improve the use of blood products.
In 1989, Dr Sackett coined the term evidence-based medicine (EBM), defined as the integration of the best research evidence with the best clinical expertise for good clinical decision making.
Transfusion medicine had to follow the principles and research methodologies that support EBM in order to develop transfusion guidelines based on such evidence, by performing randomized controlled trials (RCTs). According to McCarthy et al., 1000 RCTs on transfusion and apheresis and 70 meta-analyses had been published by 2006.
Borzini et al., in an article published 10 years ago, pointed out that “transfusion medicine had become a self-sufficient autonomous discipline.” They went on to say that in order for TM to be “a stand-alone discipline,” self-recognition of such autonomy was necessary, but not recognition by other disciplines!
I would argue that the latter recognition is important, but unfortunately, 10 years later, the specialty of TM is still not widely recognized. Mueller and Seifried recently questioned why European directives recognizing the professional qualifications of European doctors do not include TM, blood transfusion, or immunohematology at all, although TM is recognized as a specialty by a number of EU member states.
Efforts to this end should continue in order to attract young doctors to the specialty of TM and secure not only the safety and economy of bl...