This introductory article will provide a historical perspective on the use of antibody-based therapies, followed by a high-level overview of what makes antibodies attractive tools for this purpose, including the pros and cons of such therapies compared with antibiotics and the practical and strategic considerations involved in selecting the best format and development path for new antibody-based therapies targeting specific infectious agents. Examples of antibody-based therapies in development for the treatment of infectious diseases will then be presented, and finally a look into the future will summarize the different aspects that will influence what the future might bring for this type of treatment of infectious diseases.
HISTORICAL PERSPECTIVE
The use of antibodies as passive therapy in the treatment of infectious diseases is the story of a treatment concept which dates back more than 120 years, to the late 19th century, and which, through the use of serum from immunized animals, originally provided the first effective treatment options against severe bacterial infections (1, 2). By immunizing horses with bacterial toxins from Clostridium tetani and Corynebacterium diphtheriae, Emil A. von Behring and Shibasaburo Kitasato (3) generated serum containing antibodies capable of neutralizing the toxins produced by these bacteria and successfully provided treatment for these serious diseases, in which the pathogenesis is driven by the effects of the bacterial toxins. For his work on providing treatment for diphtheria, von Behring received the Nobel Prize in Physiology or Medicine in 1901. These radical treatment results quickly prompted the development of multiple additional serum therapies for the treatment of infectious diseases caused by, e.g., Neisseria meningitidis, Haemophilus influenzae, and group A Streptococcus. Since serum therapy involved administration of large amounts of crude mixtures of animal proteins, including antibodies, it was associated with side effects in the form of hypersensitivity and serum sickness (2).
Due to the crude and unpurified nature of these products, side effects were seen even when human serum preparations were administered. Side effects were observed in up to 50% of patients and were attributed to immune complex formation, which resulted in symptoms such as rash, itching, joint pain, fever, and, in serious cases, hypotension and shock. Nevertheless, in the absence of alternative options, these treatments were widely used despite their side effects. Serum was normally administered by intravenous infusion after a test for hypersensitivity in which a small amount of serum was injected subcutaneously (1). As described above, serum therapy in these early days (late 19th and early 20th centuries) involved preparations of serum from rabbits and horses immunized with the infectious agent, in live and/or neutralized form, or with toxins derived from it (1). The costs of keeping the immunized animals and of producing and potency testing the materials made this a relatively expensive treatment. In 1891, data from Klemperer (4) showed serum therapy to protect rabbits from Streptococcus pneumoniae infection and paved the way for this type of treatment and for the development of similar serum-based treatments of streptococcal infections in humans. In humans, early administration of serum could reduce mortality significantly, down to around 5%, whereas serum administered 4 to 5 days after the onset of symptoms was largely without effect. This strongly indicated the need for quick diagnosis and quick treatment to control the infection before it overwhelmed the patient. Consequently, mixtures of sera raised against different serotypes were used so that treatment could begin early even without a serotype-specific diagnosis. The understanding that different pneumococcal serotypes existed, and that efficient treatment relied on using serotype-specific serum, was built up during the 1920s and 1930s through experience from extensive clinical trials.
By the end of the 1930s, serum therapy was the standard of care for the treatment of pneumococcal pneumonia. At that time, the efficacy and potency of the derived sera were assessed in mice in “the mouse protection test,” which measured survival after a concomitant intraperitoneal injection of a lethal dose of pneumococci and the serum to be tested. Due to the inherent variation in this test, survival of two thirds of the animals was the acceptance criterion, and 10 times the lowest dose achieving this defined one unit of the serum. This allowed for large batch-to-batch variation, and the use of different bacterial strains for immunization probably explains some of the treatment failures observed (1). In the early 20th century, a pandemic of meningitis in Europe and the United States, with mortality rates of up to 80%, spurred the development of serum therapy treatment options. Although in the 1930s this became the recommended treatment for children assumed to be suffering from meningitis, the failure to reduce mortality in several meningitis epidemics during that period raised doubts about the general applicability of serum treatment. In those days, serum therapy often involved quite extensive procedures and the infusion of large volumes of serum, as the following example illustrates. Data from Flexner and Jobling (5) on treating meningitis in monkeys led to the development of sera from immunized horses for treatment in humans. The treatment protocol included lumbar puncture and withdrawal of more than 30 ml of spinal fluid, an amount slightly larger than the volume of horse serum to be injected subsequently. The treatment involved daily slow infusions of up to 30 ml of serum until the patient’s condition improved. This treatment was used in outbreaks in New York in 1905 and 1906 and markedly decreased mortality.
After the discovery of penicillin by Fleming in 1928, and the subsequent introduction of antibiotics in the 1930s, serum therapy was largely abandoned over a period of 10 years in favor of these new, more broadly effective, and cheaper treatment options, which also had fewer side effects. Although improvements in the purification of antibodies had resulted in preparations with better safety and side effect profiles, high manufacturing costs and narrow specificity restricted antibody therapy mostly to a small number of selected treatments for snake venoms, bacterial toxins, and some viral infections (1, 2). Currently, antibody administration is used for treatment and prevention of infections with hepatitis B virus, rabies virus, respiratory syncytial virus (RSV), Clostridium tetani, Clostridium botulinum, vaccinia virus, echovirus, and enterovirus. For the most part, these treatments consist of pooled immunoglobulin, also known as IVIG (intravenous immunoglobulin), from several postexposure donors. This results in batch-to-batch variation, a need for relatively large amounts of serum due to low specificity, and restricted supplies due to the reliance on exposed donors.
However, several challenges have created the need for new tools in the treatment and prevention of infectious diseases. The broad and general use of antibiotics in human and veterinary medicine over many years has resulted in the development of multi-resistant bacterial strains, such as methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant S. aureus, with limited to no response to existing treatments. As a result, patients require screening, treatment with several antibacterial agents, and longer treatment times, placing extra strain on patients and health care providers (6; http://www.cdc.gov/drugresistance/threat-report-2013/index.html). According to the WHO and CDC, more than 25,000 people in European Union countries, and similar numbers in the United States, die every year as a result of antibiotic-resistant infections. This, together with the emergence of new pathogens (e.g., severe acute respiratory syndrome, Middle East respiratory syndrome), the re-emergence/epidemics of known pathogens (e.g., Ebola), and the difficulties in treating infections in immune-deficient patients (e.g., HIV patients), has highlighted the need for new solutions. The 2014 Ebola epidemic in West Africa (Liberia, Sierra Leone, Guinea, Nigeria, and Senegal) has underscored this further. No treatment or prophylactic vaccine is available against Ebola infections, which have an average mortality of >50%. Local health authorities in the affected countries are struggling to contain and handle the disease, which threatens to spiral out of control and spread more widely. Various products, mainly antibody cocktails derived from recovered patients, are being used despite a lack of clinical data on their safety and efficacy; even these are available only sporadically, in small amounts, and for a few patients.
Ebola is an example of a disease which normally affects only a small number of individuals and which typically burns out once outbreaks are contained. Given the small number of patients affected by previous Ebola outbreaks, there was no incentive for big pharma companies to research and develop drugs against Ebola. With the rising number of infected (13,567), a death toll of 4,951 (7), and the epidemic yet to be contained, it will be interesting to follow the aftermath of this outbreak and see whether there will be calls for new ways to ensure that vaccines and treatment options are available for Ebola and similar high-mortality and potential bio-warfare infections that currently have no available treatments, along with procedures for mass production upon the first signs/reports of active infections. Although both the CDC and the WHO have special programs focusing on these types of infections, the 2014 Ebola outbreak clearly shows that more financial support for research and development of new diagnostics and treatments is needed. This is one example where antibody-based treatments could play a major role. To put this in perspective, however, one should not forget that other infectious diseases, such as tuberculosis, influenza, and malaria, kill hundreds of thousands of people each year. There is therefore plenty...