The History of Infectious Diseases and Epidemiology in the late 19th and 20th Century

Curator: Larry H Bernstein, MD, FCAP


Infectious diseases are part of the history of English, French, and Spanish colonization of the Americas, and of the slave trade. The many plagues of the Old and New Worlds that have affected the course of history from ancient to modern times were known to the Egyptians, Greeks, Chinese, crusaders, and explorers, and to Napoleon, and were bound up with war, pestilence, and epidemic. Our coverage is mainly concerned with the scientific and public health consequences of the events that preceded WWI and extended to the Vietnam War, highlighted by the creation of a worldwide public health system.

The Armed Forces Institute of Pathology (AFIP) closed its doors on September 15, 2011. It was founded as the Army Medical Museum on May 21, 1862, to collect pathological specimens along with their case histories.

The information from the case files of the pathological specimens from the Civil War was compared with Army pension records and compiled into the six-volume Medical and Surgical History of the War of the Rebellion, an early study of wartime medicine.

In 1900, museum curator Walter Reed led the commission that proved the mosquito was the vector for yellow fever, launching the mosquito eradication campaigns that continued through much of the twentieth century.

Walter Reed

Another museum curator, Frederick Russell, conducted clinical trials of the typhoid vaccine in 1907, making the U.S. Army the first army to be vaccinated against typhoid.

Increased emphasis on pathology during the twentieth century turned the museum, renamed the Armed Forces Institute of Pathology in 1949, into an international resource for pathology and the study of disease. AFIP’s pathological collections have been used, for example, in the characterization of the 1918 influenza virus in 1997.

Prior to moving to the Walter Reed Army Medical Center, the AFIP was located at the Army Medical Museum and Library on the Mall (1887–1969), and earlier, as the Army Medical Museum, in Ford’s Theatre (1867–1886).

Army Medical Museum and Library on the Mall

This institution, originally the Library of the Surgeon General’s Office (U.S. Army), gained its present name, the National Library of Medicine, and was transferred from the Army to the Public Health Service in 1956. In 1962, it moved to its own Bethesda site after sharing space for nearly 100 years with other Army units, first at the former Ford’s Theatre building and then at the Army Medical Museum and Library on the Mall. Rare books and other holdings that had been sent to Cleveland for safekeeping during World War II were also reunited with the main collection at that time.

The National Museum of Health and Medicine, established in 1862, inspires interest in and promotes the understanding of medicine — past, present, and future — with a special emphasis on tri-service American military medicine. As a National Historic Landmark recognized for its ongoing value to the health of the military and to the nation, the Museum identifies, collects, and preserves important and unique resources to support a broad agenda of innovative exhibits, educational programs, and scientific, historical, and medical research. NMHM is a headquarters element of the U.S. Army Medical Research and Materiel Command.

NMHM’s newest exhibit installations showcase the institution’s 25-million-object collection, focusing on topics as diverse as innovations in military medicine, traumatic brain injury, anatomy and pathology, military medicine during the Civil War, the assassination of Abraham Lincoln, human identification, and a special exhibition on the Museum’s own major milestone, the 150th anniversary of the founding of the Army Medical Museum. Objects on display include familiar artifacts and specimens, such as the bullet that killed Lincoln and a leg showing the effects of elephantiasis, as well as recent finds in the collection, all designed to astound visitors to the new Museum.

Today, the National Library of Medicine houses the largest collection of print and non-print materials in the history of the health sciences in the United States, and maintains an active program of exhibits and public lectures. Most of the archival and manuscript material dates from the 17th century; however, the Library owns about 200 pre-1601 Western and Islamic manuscripts. Holdings include pre-1914 books, pre-1871 journals, archives and modern manuscripts, medieval and Islamic manuscripts, a collection of printed books, manuscripts, and visual material in Japanese, Chinese, and Korean; historical prints, photographs, films, and videos; pamphlets, dissertations, theses, college catalogs, and government documents.

The oldest item in the Library is an Arabic manuscript on gastrointestinal diseases from al-Razi’s The Comprehensive Book on Medicine (Kitab al-Hawi fi al-tibb) dated 1094. Significant modern collections include the papers of U.S. Surgeons General, including C. Everett Koop, and the papers of Nobel Prize-winning scientists, particularly those connected with NIH.

As part of its Profiles in Science project, the National Library of Medicine has collaborated with the Churchill Archives Centre to digitize and make available over the World Wide Web a selection of the Rosalind Franklin Papers for use by educators and researchers. This site provides access to portions of the Rosalind Franklin Papers, which range from 1920 to 1975. The collection contains photographs, correspondence, diaries, published articles, lectures, laboratory notebooks, and research notes.

Rosalind Franklin

“Science and everyday life cannot and should not be separated. Science, for me, gives a partial explanation of life. In so far as it goes, it is based on fact, experience, and experiment. . . . I agree that faith is essential to success in life, but I do not accept your definition of faith, i.e., belief in life after death. In my view, all that is necessary for faith is the belief that by doing our best we shall come nearer to success and that success in our aims (the improvement of the lot of mankind, present and future) is worth attaining.”

–Rosalind Franklin in a letter to Ellis Franklin, ca. summer 1940

Smallpox

Although some disliked mandatory smallpox vaccination measures, coordinated efforts against smallpox went on in the United States after 1867, and the disease continued to diminish in the wealthy countries. By 1897, smallpox had largely been eliminated from the United States. In Northern Europe a number of countries had eliminated smallpox by 1900, and by 1914, the incidence in most industrialized countries had decreased to comparatively low levels. Vaccination continued in industrialized countries until the mid-to-late 1970s as protection against reintroduction. Australia and New Zealand are two notable exceptions; neither experienced endemic smallpox, and neither vaccinated widely, relying instead on protection by distance and strict quarantines.

In 1966 an international team, the Smallpox Eradication Unit, was formed under the leadership of an American, Donald Henderson. In 1967, the World Health Organization intensified the global smallpox eradication effort, contributing $2.4 million annually and adopting the new disease surveillance method promoted by Czech epidemiologist Karel Raška. Two-year-old Rahima Banu of Bangladesh was the last person infected with naturally occurring Variola major, in 1975.

The global eradication of smallpox was certified, based on intense verification activities in countries, by a commission of eminent scientists on 9 December 1979 and subsequently endorsed by the World Health Assembly on 8 May 1980. The first two sentences of the resolution read:

Having considered the development and results of the global program on smallpox eradication initiated by WHO in 1958 and intensified since 1967 … Declares solemnly that the world and its peoples have won freedom from smallpox, which was a most devastating disease sweeping in epidemic form through many countries since earliest time, leaving death, blindness and disfigurement in its wake and which only a decade ago was rampant in Africa, Asia and South America.

—World Health Organization, Resolution WHA33.3

Anthrax

Anthrax is an acute disease caused by the bacterium Bacillus anthracis. Most forms of the disease are lethal, and it affects both humans and other animals. Effective vaccines against anthrax are now available, and some forms of the disease respond well to antibiotic treatment.

Like many other members of the genus Bacillus, B. anthracis can form dormant endospores (often referred to as “spores” for short, but not to be confused with fungal spores) that are able to survive in harsh conditions for decades or even centuries. Such spores can be found on all continents, even Antarctica. When spores are inhaled, ingested, or come into contact with a skin lesion on a host, they may become reactivated and multiply rapidly.

Anthrax commonly infects wild and domesticated herbivorous mammals that ingest or inhale the spores while grazing. Ingestion is thought to be the most common route by which herbivores contract anthrax. Carnivores living in the same environment may become infected by consuming infected animals. Diseased animals can spread anthrax to humans, either by direct contact (e.g., inoculation of infected blood to broken skin) or by consumption of a diseased animal’s flesh.

Anthrax does not spread directly from one infected animal or person to another; it is spread by spores. These spores can be transported by clothing or shoes. The body of an animal that had active anthrax at the time of death can also be a source of anthrax spores. Owing to the hardiness of anthrax spores, and their ease of production in vitro, they are extraordinarily well suited to use (in powdered and aerosol form) as biological weapons.

Bacillus anthracis is a rod-shaped, Gram-positive, aerobic bacterium about 1 by 9 μm in size. It was shown to cause disease by Robert Koch in 1876 when he took a blood sample from an infected cow, isolated the bacteria and put them into a mouse. The bacterium normally rests in endospore form in the soil, and can survive for decades in this state. Once ingested or placed in an open wound, the bacterium begins multiplying inside the animal or human and typically kills the host within a few days or weeks. The endospores germinate at the site of entry into the tissues and then spread by the circulation to the lymphatics, where the bacteria multiply.

Robert Koch

Robert Koch

Veterinarians can often tell a possible anthrax-induced death by its sudden occurrence, and by the dark, nonclotting blood that oozes from the body orifices. Bacteria that escape the body via oozing blood or through the opening of the carcass may form hardy spores. One spore forms per vegetative bacterium. Once formed, these spores are very hard to eradicate.

The lethality of anthrax is due to the bacterium’s two principal virulence factors: the poly-D-glutamic acid capsule, which protects the bacterium from phagocytosis by host neutrophils, and the tripartite protein toxin, called anthrax toxin. Anthrax toxin is a mixture of three protein components: protective antigen (PA), edema factor (EF), and lethal factor (LF). PA plus LF produces lethal toxin, and PA plus EF produces edema toxin. These toxins cause death and tissue swelling (edema), respectively.

To enter the cells, the edema and lethal factors use another protein produced by B. anthracis called protective antigen, which binds to two surface receptors on the host cell. A cell protease then cleaves PA into two fragments: PA20 and PA63. PA20 dissociates into the extracellular medium, playing no further role in the toxic cycle. PA63 then oligomerizes with six other PA63 fragments forming a heptameric ring-shaped structure named a prepore.

Once in this shape, the complex can competitively bind up to three EFs or LFs, forming a resistant complex. Receptor-mediated endocytosis occurs next, providing the newly formed toxic complex access to the interior of the host cell. The acidified environment within the endosome triggers the heptamer to release the LF and/or EF into the cytosol.

Edema factor is a calmodulin-dependent adenylate cyclase. Adenylate cyclase catalyzes the conversion of ATP into cyclic AMP (cAMP) and pyrophosphate. Complexation of adenylate cyclase with calmodulin sequesters calmodulin, preventing it from stimulating calcium-triggered signaling. LF inactivates neutrophils so they cannot phagocytose bacteria. Anthrax toxin causes vascular leakage of fluid and cells, ultimately producing hypovolemic shock and septic shock.

Occupational exposure to infected animals or their products (such as skin, wool, and meat) is the usual pathway of exposure for humans. Workers who are exposed to dead animals and animal products are at the highest risk, especially in countries where anthrax is more common. Anthrax in livestock grazing on open range where they mix with wild animals still occasionally occurs in the United States and elsewhere. Many workers who deal with wool and animal hides are routinely exposed to low levels of anthrax spores, but most exposure levels are not sufficient to develop anthrax infections. The body’s natural defenses presumably can destroy low levels of exposure. These people usually contract cutaneous anthrax if they contract the disease at all.

Throughout history, the most dangerous form of inhalational anthrax was called woolsorters’ disease because it was an occupational hazard for people who sorted wool. Today, this form of infection is extremely rare, as almost no infected animals remain. The last fatal case of natural inhalational anthrax in the United States occurred in California in 1976, when a home weaver died after working with infected wool imported from Pakistan. Gastrointestinal anthrax is exceedingly rare in the United States, with only one case on record, reported in 1942, according to the Centers for Disease Control and Prevention.

Various techniques are used for the direct identification of B. anthracis in clinical material. First, specimens may be Gram stained. Bacillus spp. are quite large (3 to 4 μm long), grow in long chains, and stain Gram-positive. To confirm that the organism is B. anthracis, rapid diagnostic techniques such as polymerase chain reaction-based assays and immunofluorescence microscopy may be used.

All Bacillus species grow well on 5% sheep blood agar and other routine culture media. Polymyxin-lysozyme-EDTA-thallous acetate can be used to isolate B. anthracis from contaminated specimens, and bicarbonate agar is used as an identification method to induce capsule formation. Bacillus spp. usually grow within 24 hours of incubation at 35 °C, in ambient air or in 5% CO2. If bicarbonate agar is used for identification, the medium must be incubated in 5% CO2.

B. anthracis colonies are medium-large, gray, flat, and irregular, with swirling projections often referred to as a “medusa head” appearance, and are not hemolytic on 5% sheep blood agar. The bacteria are nonmotile, susceptible to penicillin, and produce a wide zone of lecithinase on egg yolk agar. Confirmatory testing to identify B. anthracis includes gamma bacteriophage testing, indirect hemagglutination, and enzyme-linked immunosorbent assay to detect antibodies. The best confirmatory precipitation test for anthrax is the Ascoli test.
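The presumptive criteria above amount to a simple checklist, and a rule-based sketch makes the logic explicit. This is purely illustrative: the field names and the `presumptive_b_anthracis` helper are invented here, and real identification always requires the confirmatory tests just listed.

```python
# Illustrative sketch only: encodes the presumptive-identification criteria
# described above as a simple rule check. Field names are hypothetical;
# real identification requires confirmatory testing (gamma phage, ELISA,
# Ascoli test) in a reference laboratory.

def presumptive_b_anthracis(isolate: dict) -> bool:
    """Return True if an isolate matches the presumptive criteria above."""
    criteria = (
        isolate.get("gram_stain") == "positive",
        isolate.get("morphology") == "rod",
        isolate.get("colony") == "medusa_head",       # gray, flat, irregular
        not isolate.get("hemolytic", True),           # non-hemolytic on sheep blood agar
        not isolate.get("motile", True),              # nonmotile
        isolate.get("penicillin_susceptible", False),
        isolate.get("lecithinase_wide_zone", False),  # on egg yolk agar
    )
    return all(criteria)

suspect = {
    "gram_stain": "positive", "morphology": "rod", "colony": "medusa_head",
    "hemolytic": False, "motile": False,
    "penicillin_susceptible": True, "lecithinase_wide_zone": True,
}
print(presumptive_b_anthracis(suspect))  # True -> proceed to confirmatory testing
```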

Vaccines against anthrax for use in livestock and humans have had a prominent place in the history of medicine, from Pasteur’s pioneering 19th-century work with cattle (the second effective vaccine ever) to the controversial 20th century use of a modern product (BioThrax) to protect American troops against the use of anthrax in biological warfare. Human anthrax vaccines were developed by the Soviet Union in the late 1930s and in the US and UK in the 1950s. The current FDA-approved US vaccine was formulated in the 1960s.

If a person is suspected as having died from anthrax, every precaution should be taken to avoid skin contact with the potentially contaminated body and fluids exuded through natural body openings. The body should be put in strict quarantine and then incinerated. A blood sample should then be collected and sealed in a container and analyzed in an approved laboratory to ascertain if anthrax is the cause of death. Microscopic visualization of the encapsulated bacilli, usually in very large numbers, in a blood smear stained with polychrome methylene blue (McFadyean stain) is fully diagnostic, though culture of the organism is still the gold standard for diagnosis.

Full isolation of the body is important to prevent possible contamination of others. Protective, impermeable clothing and equipment such as rubber gloves, rubber apron, and rubber boots with no perforations should be used when handling the body. Disposable personal protective equipment and filters should be autoclaved, and/or burned and buried.

Anyone working with anthrax in a suspected or confirmed victim should wear respiratory equipment capable of filtering spore-sized particles or smaller. A respirator approved by the US National Institute for Occupational Safety and Health and the Mine Safety and Health Administration, such as a half-face disposable respirator with a high-efficiency particulate air filter, is recommended.

All possibly contaminated bedding or clothing should be isolated in double plastic bags and treated as possible biohazard waste. The victim should be sealed in an airtight body bag. Dead victims who are opened and not burned provide an ideal source of anthrax spores. Cremating victims is the preferred way of handling body disposal.

Until the 20th century, anthrax infections killed hundreds of thousands of animals and people worldwide each year. French scientist Louis Pasteur developed the first effective vaccine for anthrax in 1881.

Louis Pasteur

As a result of over a century of animal vaccination programs, sterilization of raw animal waste materials, and anthrax eradication programs in the United States, Canada, Russia, Eastern Europe, Oceania, and parts of Africa and Asia, anthrax infection is now relatively rare in domestic animals. Anthrax is especially rare in dogs and cats, as is evidenced by a single reported case in the United States in 2001.

Anthrax outbreaks occur in some wild animal populations with some regularity. The disease is more common in countries without widespread veterinary or human public health programs. In the 21st century, anthrax is still a problem in less developed countries.

B. anthracis bacterial spores are soil-borne. Because of their long lifespan, spores are present globally and remain at the burial sites of animals killed by anthrax for many decades. Disturbed grave sites of infected animals have caused reinfection over 70 years after the animal’s interment.

Cholera

This is an acute diarrheal infection that can kill within a matter of hours if untreated. The standard treatment is oral rehydration therapy: drinking water mixed with salts and sugar. Researchers at EPFL, the Swiss Federal Institute of Technology in Lausanne, report that using rice starch instead of sugar with the rehydration salts could reduce bacterial toxicity by almost 75 percent. That would make the microbe less likely to infect a patient’s family and friends if they are exposed to any body fluids.

The World Health Organization says cholera, a waterborne bacterial disease, infects three to five million people every year, and the severe dehydration it causes leads to as many as 120,000 deaths.

Cholera is an acute diarrheal disease caused by the waterborne bacterium Vibrio cholerae, serogroup O1 or O139. Infection is mainly through ingestion of contaminated water or food. V. cholerae passes through the stomach, colonizes the upper part of the small intestine, penetrates the mucus layer, and secretes cholera toxin, which acts on the small intestine.

Clinically, the majority of cholera episodes are characterized by a sudden onset of massive diarrhea and vomiting accompanied by the loss of profuse amounts of protein-free fluid with electrolytes. The resulting dehydration produces tachycardia, hypotension, and vascular collapse, which can lead to sudden death. The diagnosis of cholera is commonly established by isolating the causative organism from the stools of infected individuals.

There are an estimated 3–5 million cholera cases and 100,000–120,000 deaths due to cholera every year (the arithmetic sketch after this list works through these figures).

Up to 80% of cases can be successfully treated with oral rehydration salts.

Effective control measures rely on prevention, preparedness and response.

Provision of safe water and sanitation is critical in reducing the impact of cholera and other waterborne diseases.

Oral cholera vaccines are considered an additional means to control cholera, but should not replace conventional control measures.
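As a quick sanity check on the WHO figures above, the implied case-fatality range follows from simple division. A minimal sketch; the pairing of range extremes is my own illustrative assumption:

```python
# Back-of-envelope arithmetic using the WHO estimates quoted above.
cases = (3_000_000, 5_000_000)   # estimated cholera cases per year
deaths = (100_000, 120_000)      # estimated deaths per year

# Implied case-fatality ratio: lowest pairs fewest deaths with most cases.
cfr_low = deaths[0] / cases[1]   # 100k deaths / 5M cases -> 2.0%
cfr_high = deaths[1] / cases[0]  # 120k deaths / 3M cases -> 4.0%
print(f"Implied case-fatality ratio: {cfr_low:.1%} to {cfr_high:.1%}")

# Up to 80% of cases are treatable with oral rehydration salts alone.
treatable = tuple(0.8 * c for c in cases)
print(f"Cases treatable with ORS alone: {treatable[0]:,.0f} to {treatable[1]:,.0f}")
```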

During the 19th century, cholera spread across the world from its original reservoir in the Ganges delta in India. Six subsequent pandemics killed millions of people across all continents. The current (seventh) pandemic started in South Asia in 1961, and reached Africa in 1971 and the Americas in 1991. Cholera is now endemic in many countries.

India: environment, pollution

In its extreme manifestation, cholera is one of the most rapidly fatal infectious illnesses known. Within 3–4 hours of onset of symptoms, a previously healthy person may become severely dehydrated and if not treated may die within 24 hours (WHO, 2010). The disease is one of the most researched in the world today; nevertheless, it is still an important public health problem despite more than a century of study, especially in developing tropical countries. Cholera is currently listed as one of three internationally quarantinable diseases by the World Health Organization (WHO), along with plague and yellow fever (WHO, 2000a).

Two serogroups of V. cholerae – O1 and O139 – cause outbreaks. V. cholerae O1 causes the majority of outbreaks, while O139 – first identified in Bangladesh in 1992 – is confined to South-East Asia.

Non-O1 and non-O139 V. cholerae can cause mild diarrhoea but do not generate epidemics.

The main reservoirs of V. cholerae are people and aquatic sources such as brackish water and estuaries, often associated with algal blooms. Recent studies indicate that global warming creates a favorable environment for the bacteria.

Socioeconomic and demographic factors enhance the vulnerability of a population to infection and contribute to epidemic spread; they also determine the extent to which the disease reaches epidemic proportions and modulate the size of the epidemic. Known population-level (local-level) risk factors for cholera include poverty, lack of development, high population density, low education, and lack of previous exposure. Cholera diffuses rapidly in environments that lack basic infrastructure for safe water and proper sanitation. The cholera vibrios can survive and multiply outside the human body and can spread rapidly where living conditions are overcrowded and where there is no safe disposal of solid waste, liquid waste, and human feces.

Mapping the locations of cholera victims, John Snow was able to trace the cause of the disease to a contaminated water source. Remarkably, this was done decades before Koch and Pasteur established the foundations of microbiology (Koch, 1884).
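Snow’s method was, in essence, to tally deaths by their nearest water source. A minimal sketch of that idea; all pump names, coordinates, and counts below are invented for illustration, not Snow’s data:

```python
from math import dist  # Euclidean distance (Python 3.8+)

# Hypothetical data: pump locations and addresses of cholera deaths (x, y).
pumps = {"Broad Street": (0.0, 0.0), "Rupert Street": (5.0, 4.0)}
deaths = [(0.4, 0.2), (0.1, -0.3), (4.8, 4.1), (0.2, 0.5), (-0.1, 0.1)]

# Attribute each death to its nearest pump, as Snow's map did visually.
tally = {name: 0 for name in pumps}
for d in deaths:
    nearest = min(pumps, key=lambda name: dist(pumps[name], d))
    tally[nearest] += 1

print(tally)  # {'Broad Street': 4, 'Rupert Street': 1}
```

A cluster of deaths around one source, far exceeding the others, is the signal Snow read off his map.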

John Snow’s map

Yellow Fever

Yellow fever virus was probably introduced into the New World via ships carrying slaves from West Africa. Throughout the 18th and 19th centuries, regular and devastating epidemics of yellow fever occurred across the Caribbean, Central and South America, the southern United States and Europe. The Yellow Fever Commission, founded as a consequence of excessive disease mortality during the Spanish–American War (1898), concluded that the best way to control the disease was to control the mosquito. William Gorgas successfully eradicated yellow fever from Havana by destroying larval breeding sites, and this strategy of source reduction was then successfully used to reduce disease problems and thus finally permit the construction of the Panama Canal, begun in 1904. Success was due largely to a top-down, military approach involving strict supervision and discipline (Gorgas, 1915). In 1946, an intensive Aedes aegypti eradication campaign was initiated in the Americas, which succeeded in reducing vector populations to undetectable levels throughout most of its range.

The production of an effective vaccine in the 1930s led to a change of emphasis from vector control to vaccination for the control of yellow fever. Vaccination campaigns almost eliminated urban yellow fever but incomplete coverage, as with incomplete anti-vectorial measures previously, meant the disease persisted, and outbreaks occurred in remote forest areas.

It was acknowledged by the Health Organization of the League of Nations (the forerunner to the World Health Organization (WHO)) that yellow fever was a severe burden on endemic countries. The work of Soper and the Brazilian Cooperative Yellow Fever Service (Soper, 1934, 1935a, b) began to determine the geographical extent of the disease, specifically in Brazil. Regional maps of disease outbreaks were published by Sawyer (1934), but it was not until after the formation of the WHO that a global map of yellow fever endemicity was first constructed (van Rooyen and Rhodes, 1948). This map was based on expert opinion (United Nations Relief and Rehabilitation Administration/Expert Commission on Quarantine) and serological surveys. The present-day distribution map for yellow fever is still essentially a modified version of this map.

Global yellow fever risk map

Yellow fever is conspicuously absent from Asia. Although there is some evidence that other flaviviruses may offer cross-protection against yellow fever (Gordon-Smith et al., 1962), why yellow fever does not occur in Asia is still unexplained.

It has been estimated that the currently circulating strains of yellow fever virus (YFV) arose in Africa within the last 1,500 years and emerged in the Americas following the slave trade approximately 300–400 years ago. These viruses then spread westwards across the continent and persist to this day in the jungles of South America.

The 17D live-attenuated vaccine still in use today was developed in 1936, and a single dose confers immunity for at least ten years in 95% of the cases. In a bid to contain the spread of the disease, travellers to countries within endemic areas or those thought to be ‘at risk’ require a certificate of vaccination. The yellow fever certificate is the only internationally regulated certification supported by the WHO. The effectiveness of the vaccine reduces the need for anti-vectorial campaigns directed specifically against yellow fever. As the same major vector is involved, control of Aedes aegypti for dengue reduction will also reduce yellow fever transmission where both diseases co-occur, especially within urban settings.

Dengue

Probable epidemics of dengue fever have been recorded from Africa, Asia, Europe and the Americas since the early 19th century (Armstrong, 1923). Although dengue is rarely fatal, up to 90% of the population of an infected area can be incapacitated during the course of an epidemic (Armstrong, 1923; Siler et al., 1926). Widespread movements of troops and refugees during and after World War II introduced vectors and viruses into many new areas. Dengue fever has unsurprisingly been mistaken for yellow fever as well as other diseases, including influenza, measles, typhoid and malaria. Survivors appear to have lifelong immunity to the homologous serotype.

Far more serious is dengue haemorrhagic fever (DHF), where additional symptoms develop, including haemorrhaging and shock. The mortality from DHF can exceed 30% if appropriate care is unavailable. The most significant risk factor for DHF is when secondary infection with a different serotype occurs in people who have already had, and recovered from, a primary dengue infection.

Dengue has adapted to changes in human demography very effectively. The main vector of dengue is the anthropophilic Aedes aegypti, which is found in close association with human settlements throughout the tropics, breeding mainly in containers in and around houses, and feeding almost exclusively on humans. As a result, dengue is essentially a disease of tropical urban areas. Before 1970, only nine countries had experienced DHF epidemics, but by 1995 this number had increased fourfold (WHO, 2001). Dengue case numbers have increased considerably since the 1960s; by the end of the 20th century an estimated 50 million cases of dengue fever and 500,000 cases of DHF were occurring every year (WHO, 2001).

The appearance of DHF stimulated large amounts of dengue research, which established the existence of the four serotypes and the range of competent vectors, and led to the adoption of Aedes aegypti control programs in some areas (particularly South-East Asia) (Kilpatrick et al., 1970).

There have been several attempts to estimate the economic impact of dengue: the 1977 epidemic in Puerto Rico was thought to have cost between $6.1 and $15.6 million ($26–$31 per clinical case) (Von Allmen et al., 1979), while the 1981 Cuban epidemic (with a total of 344,203 reported cases) cost about $103 million (around $299 per case) (Kouri et al., 1989).
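These per-case figures are easy to sanity-check. A minimal sketch using only the numbers quoted above; the pairing of range endpoints is my own assumption:

```python
# Sanity-checking the quoted per-case figures from the estimates above.
cuba_total_cost = 103_000_000   # 1981 Cuban epidemic, USD
cuba_cases = 344_203
print(f"Cuba 1981: ${cuba_total_cost / cuba_cases:,.0f} per case")  # ~$299, as quoted

# Puerto Rico 1977: the cost range and per-case range together imply
# a clinical case count of very roughly 200,000 to 600,000.
pr_cost = (6_100_000, 15_600_000)
pr_per_case = (26, 31)
implied = (pr_cost[0] / pr_per_case[1], pr_cost[1] / pr_per_case[0])
print(f"Puerto Rico 1977: ~{implied[0]:,.0f} to {implied[1]:,.0f} cases implied")
```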

There is no cure for dengue fever or for DHF. Currently, the only treatment is symptomatic, but this can reduce mortality from DHF to less than 1% (WHO, 2002). Unfortunately, the extent of dengue epidemics means that local public health services are often overwhelmed by the demands for treatment.

Malaria

Malaria is a serious and sometimes fatal disease caused by a parasite that is transmitted to humans by mosquitoes. People who get malaria are typically very sick, with high fevers, shaking chills, and flu-like illness. About 1,500 cases of malaria are diagnosed in the United States each year. The vast majority of cases in the United States are in travelers and immigrants returning from countries where malaria transmission occurs, many from sub-Saharan Africa and South Asia. Malaria has been noted for more than 4,000 years. It became widely recognized in Greece by the 4th century BCE, and it was responsible for the decline of many of the city-state populations. Hippocrates noted the principal symptoms. In the Susruta, a Sanskrit medical treatise, the symptoms of malarial fever were described and attributed to the bites of certain insects. A number of Roman writers attributed malarial diseases to the swamps.

Following their arrival in the New World, Spanish Jesuit missionaries learned from indigenous Indian tribes of a medicinal bark used for the treatment of fevers. With this bark, the Countess of Chinchón, the wife of the Viceroy of Peru, was cured of her fever. The bark from the tree was then called Peruvian bark and the tree was named Cinchona after the countess. The medicine from the bark is now known as the antimalarial, quinine. Along with artemisinins, quinine is one of the most effective antimalarial drugs available today.

Cinchona calisaya

Cinchona officinalis is a medicinal plant, one of several Cinchona species used for the production of quinine, which is an anti-fever agent. It is especially useful in the prevention and treatment of malaria. Cinchona calisaya is the tree most cultivated for quinine production.

A number of other alkaloids are extracted from this tree, including cinchonine, cinchonidine, and quinidine (Wikipedia).

Charles Louis Alphonse Laveran, a French army surgeon stationed in Constantine, Algeria, was the first to notice parasites in the blood of a patient suffering from malaria in 1880. Laveran was awarded the Nobel Prize in 1907.

Alphonse Laveran

Camillo Golgi, an Italian neurophysiologist, established that there were at least two forms of the disease, one with tertian periodicity (fever every other day) and one with quartan periodicity (fever every third day). He also observed that the forms produced differing numbers of merozoites (new parasites) upon maturity and that fever coincided with the rupture and release of merozoites into the blood stream. He was awarded a Nobel Prize in Medicine for his discoveries in neurophysiology in 1906.

Malaria life cycle

Ookinete, sporozoite, merozoite

The Italian investigators Giovanni Battista Grassi and Raimondo Filetti first introduced the names Plasmodium vivax and P. malariae for two of the malaria parasites that affect humans in 1890. Laveran had believed that there was only one species, Oscillaria malariae. William H. Welch reviewed the subject and, in 1897, named the malignant tertian malaria parasite P. falciparum. In 1922, John William Watson Stephens described the fourth human malaria parasite, P. ovale. P. knowlesi was first described by Robert Knowles and Biraj Mohan Das Gupta in 1931 in a long-tailed macaque, but the first documented human infection with P. knowlesi was in 1965.

Anopheles mosquito

Ronald Ross, a British officer in the Indian Medical Service, was the first to demonstrate, in 1897, that malaria parasites could be transmitted from infected patients to mosquitoes. In further work with bird malaria, Ross showed that mosquitoes could transmit malaria parasites from bird to bird, establishing the sporogonic cycle (the time interval during which the parasite develops in the mosquito). Ross was awarded the Nobel Prize in 1902.

Ronald Ross, 1899

A team of Italian investigators led by Giovanni Battista Grassi collected Anopheles claviger mosquitoes and fed them on malarial patients. The complete sporogonic cycles of Plasmodium falciparum, P. vivax, and P. malariae were demonstrated. Mosquitoes infected by feeding on a patient in Rome were sent to London in 1900, where they fed on two volunteers, both of whom developed malaria.

The construction of the Panama Canal was made possible only after yellow fever and malaria were controlled in the area. These two diseases were a major cause of death and disease among workers in the area. In 1906, there were over 26,000 employees working on the Canal. Of these, over 21,000 were hospitalized for malaria at some time during their work. By 1912, there were over 50,000 employees, and the number of hospitalized workers had decreased to approximately 5,600. Through the leadership and efforts of William Crawford Gorgas, Joseph Augustin LePrince, and Samuel Taylor Darling, yellow fever was eliminated and malaria incidence markedly reduced through an integrated program of insect and malaria control.

William Crawford Gorgas, MD

During the U.S. military occupation of Cuba and the construction of the Panama Canal at the turn of the 20th century, U.S. officials made great strides in the control of malaria and yellow fever. In 1914 Henry Rose Carter and Rudolph H. von Ezdorf of the USPHS requested and received funds from the U.S. Congress to control malaria in the United States. Various activities to investigate and combat malaria in the United States followed from this initial request and reduced the number of malaria cases in the United States. USPHS established malaria control activities around military bases in the malarious regions of the southern United States to allow soldiers to train year round.

U.S. President Franklin D. Roosevelt signed a bill that created the Tennessee Valley Authority (TVA) on May 18, 1933. The law gave the federal government a centralized body to control the Tennessee River’s potential for hydroelectric power and improve the land and waterways for development of the region. An organized and effective malaria control program stemmed from this new authority in the Tennessee River valley. Malaria affected 30 percent of the population in the region when the TVA was incorporated in 1933. The Public Health Service played a vital role in the research and control operations and by 1947, the disease was essentially eliminated. Mosquito breeding sites were reduced by controlling water levels and insecticide applications.

Chloroquine was discovered by a German, Hans Andersag, in 1934 at the Bayer I.G. Farbenindustrie A.G. laboratories in Elberfeld, Germany. He named his compound resochin. Through a series of lapses and confusion brought about during the war, chloroquine was finally recognized and established as an effective and safe antimalarial in 1946 by British and U.S. scientists.

Felix Hoffmann, Gerhard Domagk, Hermann Schnell (Bayer)

A German chemistry student, Othmar Zeidler, synthesized DDT in 1874 for his thesis. The insecticidal property of DDT was not discovered until 1939, by Paul Müller in Switzerland. Various militaries in WWII utilized the new insecticide, initially for control of louse-borne typhus. DDT was used for malaria control at the end of WWII, after British, Italian, and American scientists had proven it effective against malaria-carrying mosquitoes. Müller won the Nobel Prize for Medicine in 1948.

Paul Müller

Malaria Control in War Areas (MCWA) was established to control malaria around military training bases in the southern United States and its territories, where malaria was still problematic. Many of the bases were established in areas where mosquitoes were abundant. MCWA aimed to prevent reintroduction of malaria into the civilian population by mosquitoes that would have fed on malaria-infected soldiers, in training or returning from endemic areas. During these activities, MCWA also trained state and local health department officials in malaria control techniques and strategies.

The National Malaria Eradication Program, a cooperative undertaking by the state and local health agencies of 13 southeastern states and the CDC, originally proposed by Louis Laval Williams, commenced operations on July 1, 1947. By the end of 1949, over 4,650,000 house spray applications had been made. In 1947, 15,000 malaria cases were reported. By 1950, only 2,000 cases were reported. By 1951, malaria was considered eliminated from the United States.

With the success of DDT, the advent of less toxic, more effective synthetic antimalarials, and the enthusiastic and urgent belief that time and money were of the essence, the World Health Organization (WHO) submitted at the World Health Assembly in 1955 an ambitious proposal for the eradication of malaria worldwide. Eradication efforts began and focused on house spraying with residual insecticides, antimalarial drug treatment, and surveillance, and would be carried out in 4 successive steps: preparation, attack, consolidation, and maintenance. Successes included elimination in nations with temperate climates and seasonal malaria transmission.

Some countries, such as India and Sri Lanka, had sharp reductions in the number of cases, followed by increases to substantial levels after efforts ceased; other nations had negligible progress (such as Indonesia, Afghanistan, Haiti, and Nicaragua); and still others were excluded completely from the eradication campaign (sub-Saharan Africa). The emergence of drug resistance, widespread resistance to available insecticides, wars and massive population movements, difficulties in obtaining sustained funding from donor countries, and lack of community participation made the long-term maintenance of the effort untenable.

The goal of most current National Malaria Prevention and Control Programs and most malaria activities conducted in endemic countries is to reduce the number of malaria-related cases and deaths. To reduce malaria transmission to a level where it is no longer a public health problem is the goal of what is called malaria “control.”

The natural ecology of malaria involves malaria parasites infecting successively two types of hosts: humans and female Anopheles mosquitoes. In humans, the parasites grow and multiply first in the liver cells and then in the red cells of the blood. In the blood, successive broods of parasites grow inside the red cells and destroy them, releasing daughter parasites (“merozoites”) that continue the cycle by invading other red cells.

Anopheles mosquito

The blood stage parasites are those that cause the symptoms of malaria. When certain forms of blood stage parasites (“gametocytes”) are picked up by a female Anopheles mosquito during a blood meal, they start another, different cycle of growth and multiplication in the mosquito.

After 10-18 days, the parasites are found (as “sporozoites”) in the mosquito’s salivary glands. When the Anopheles mosquito takes a blood meal on another human, the sporozoites are injected with the mosquito’s saliva and start another human infection when they parasitize the liver cells.
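The two-host cycle just described is what classical Ross–Macdonald models formalize: humans infect mosquitoes, mosquitoes infect humans, and transmission intensity depends on biting rate and mosquito density. A minimal simulation sketch; all parameter values are illustrative assumptions, not measurements:

```python
# Minimal Ross-Macdonald-style sketch of the human <-> mosquito cycle
# described above. All parameter values are illustrative assumptions.
a = 0.2    # bites per mosquito per day on humans
b = 0.3    # probability an infectious bite infects a human
c = 0.3    # probability a mosquito biting an infectious human is infected
m = 2.0    # mosquitoes per human
r = 0.01   # human recovery rate per day
g = 0.1    # mosquito death rate per day

x, y = 0.01, 0.0   # infected fraction of humans, of mosquitoes
dt = 0.1
for _ in range(int(365 / dt)):                 # one simulated year, Euler steps
    dx = m * a * b * y * (1 - x) - r * x       # humans: infected via bites, recover
    dy = a * c * x * (1 - y) - g * y           # mosquitoes: infected by biting, die
    x, y = x + dx * dt, y + dy * dt

print(f"After one simulated year: {x:.1%} of humans, {y:.1%} of mosquitoes infected")
```

With these assumed values the system settles toward an endemic equilibrium, which is the qualitative point: transmission persists as long as the mosquito-human loop stays open.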

Malaria (Wikipedia)

A Plasmodium from the saliva of a female mosquito moving across a mosquito cell

Thus the mosquito carries the disease from one human to another (acting as a “vector”). Unlike the human host, the mosquito vector does not suffer from the presence of the parasites.

All the clinical symptoms associated with malaria are caused by the asexual erythrocytic or blood stage parasites. When the parasite develops in the erythrocyte, numerous known and unknown waste substances such as hemozoin pigment and other toxic factors accumulate in the infected red blood cell. These are dumped into the bloodstream when the infected cells lyse and release invasive merozoites. The hemozoin and other toxic factors such as glycosylphosphatidylinositol (GPI) stimulate macrophages and other cells to produce cytokines and other soluble factors, which act to produce the fever and rigors associated with malaria.

Ookinete, sporozoite, merozoite

Plasmodium falciparum-infected erythrocytes, particularly those with mature trophozoites, adhere to the vascular endothelium of venular blood vessel walls. When they become sequestered in the vessels of the brain, this is a factor in causing the severe disease syndrome known as cerebral malaria, which is associated with high mortality.

Following the infective bite by the Anopheles mosquito, a period of time (the “incubation period”) goes by before the first symptoms appear. The incubation period in most cases varies from 7 to 30 days. The shorter periods are observed most frequently with P. falciparum and the longer ones with P. malariae.

Malaria life cycle

Antimalarial drugs taken for prophylaxis by travelers can delay the appearance of malaria symptoms by weeks or months, long after the traveler has left the malaria-endemic area. (This can happen particularly with P. vivax and P. ovale, both of which can produce dormant liver stage parasites; the liver stages may reactivate and cause disease months after the infective mosquito bite.)

The Influenza Pandemic of 1918

The Nation’s Health

If you had lived in the early twentieth century, your life expectancy would have been much shorter than it is today. Today, life expectancy for men is 75 years; for women, it is 80 years. In 1918, life expectancy for men was only 53 years. Women’s life expectancy, at 54, was only marginally better.

Why was life expectancy so much shorter?

During the early twentieth century, communicable diseases, that is, diseases which can spread from person to person, were widespread. Influenza and pneumonia, along with tuberculosis and gastrointestinal infections such as diarrhea, killed Americans at an alarming rate, but non-communicable diseases such as cancer and heart disease also exacted a heavy toll. Accidents, especially in the nation’s unregulated factories and workshops, were also responsible for maiming and killing many workers.

High infant mortality further shortened life expectancy. In 1918, one in five American children did not live beyond their fifth birthday. In some cities, the situation was even worse, with thirty percent of all infants dying before their first birthday. Childhood diseases such as diphtheria, measles, scarlet fever and whooping cough contributed significantly to these high death rates.

Osler at a bedside

By 1900, an increasing number of physicians were receiving clinical training. This training provided doctors with new insights into disease and specific types of diseases. [Credit: National Library of Medicine]

Scarlet fever

Quarantine signs such as this one warned visitors away from homes with scarlet fever and other infectious diseases. [Credit: National Library of Medicine]

Rat Proofing

Cities often sponsored Clean-Up Days. Here, Public Health Service employees clean up San Francisco’s streets in a campaign to eradicate bubonic plague. [Credit: Office of the Public Health Service Historian]

Clean-Up Days

A young woman is seated with a baby on her lap in the center of the photo. On the right are two young children; one is standing, the other seated in a crib. A woman in a long white apron stands by the stove on the left side of the photo, pulling a bottle out of a pan on the stove.

Nurse helps with baby formula

A public health nurse teaches a young mother how to sterilize a bottle. [Credit: National Library of Medicine]

Seeking Medical Care

Feeling Sick in 1918?

If you became sick in nineteenth-century America, you might consult a doctor, a druggist, a midwife, a folk healer, a nurse or even your neighbor. Most of these practitioners would visit you in your home.

By 1918, these attitudes toward health care were beginning to change. Some physicians had begun to set up offices where patients could receive medical care, and hospitals, which emphasized sterilization and isolation, were also becoming popular.

However, these changes were not yet universal, and many Americans still lived their entire lives without visiting a doctor.

How Did Ordinary People View Disease?

Folk Medicine:

In 1918, folk healers could be found all over America. Some of these healers believed that diseases had a physical cause, such as cold weather, but others believed they had a supernatural cause, such as a curse.

Treatments advocated by these healers ran the gamut. Herbal remedies were especially popular. Other popular remedies included cupping, which entailed attaching a heated cup to the surface of the skin, and acupuncture. Many people also wore magical objects which they believed protected the wearer from illness.

During the influenza pandemic of 1918, when scientific medicine failed to provide Americans with a cure or preventative, many people turned to folk remedies and treatments.

Scientific Medicine

In the 1880s, building on developments which had been in the making since the 1830s, a growing number of scientists and physicians came to believe that disease was spread by minute pathogenic organisms, or germs.

This new theory, whose rise is often called the bacteriological revolution, radically transformed the practice of medicine. But while this was a major step forward in understanding disease, doctors and scientists continued to have only a rudimentary understanding of the differences between types of microbes. Many practicing physicians did not understand the differences between bacteria and viruses, and this sharply limited their ability to understand disease causation and disease prevention.

Drugs and Druggists:

Although the early twentieth century witnessed growing attempts to regulate the practice of medicine, many druggists assumed duties we associate today with physicians. Some druggists, for example, diagnosed and prescribed treatments which they then sold to the patient. Some of these treatments included opiates; few actually cured diseases.

Desperate times called for desperate remedies, and during the influenza pandemic many patients turned to these and other drugs in the hope that they would provide a cure.

Nurses:

Between 1890 and 1920, nursing schools multiplied and trained nurses began to replace practical nurses. Isolation practices, sterility, and strict routines, all associated with professionally trained nurses, increasingly became standard during this period. In 1918, nurses served as the physician’s hand, assisting doctors as they made their rounds. During the pandemic, many nurses acted independently of doctors, treating and prescribing for patients.

Physicians:

Throughout the eighteenth and much of the nineteenth centuries, practically anyone had the right to call themselves a physician. By the late nineteenth century, growing calls for reform had begun to transform the profession.

In 1900, every state in the Union had some type of medical registration law, with about half of all states requiring physicians to possess a medical diploma and pass an exam before they received a license to practice. However, grandfather clauses which exempted many older physicians meant that many physicians who practiced in 1918 had been poorly trained.

Quack doctor

Poor training and loose regulations meant that some doctors were little more than quacks. [Credit: National Library of Medicine]

Drug advertisement

Drug advertisers routinely promised quick and painless cures. [Credit: National Library of Medicine]

While access to the profession was tightening, women and minorities, including African-Americans, entered the profession in growing numbers during the early twentieth century.

What Did Doctors Really Know?

Growing understanding of bacteriology enabled early twentieth-century physicians to diagnose diseases more effectively than their predecessors, but diagnosis continued to be difficult. Influenza was especially tricky to diagnose, and many physicians may have incorrectly diagnosed their patients, especially in the early stages of the pandemic.

Bacteriology did not revolutionize the treatment of disease. In the pre-antibiotic era of 1918, physicians continued to rely heavily on traditional therapeutics. During the pandemic, many physicians used traditional treatments, such as sweating, which had their roots in humoral medicine.

Reflecting the uneven structure of medical education, the level and quality of care which physicians provided varied wildly.

The Public Health Service

Founded in 1798, the Marine Hospital Service originally provided health care for sick and disabled seamen. By the late nineteenth century, the growth of trade, travel and immigration networks had led the Service to expand its mission to include protecting the health of all Americans.

In a nation where federal and state authorities had consistently battled for supremacy, the powers of the Public Health Service were limited. Viewed with suspicion by many state and local authorities, PHS officers often found themselves fighting state and local authorities as well as epidemics—even when they had been called in by these authorities.

Chelsea Marine Hospital in 1918

A network of hospitals in the nation’s ports provided seamen with access to healthcare. [Credit: Office of the Public Health Service Historian]

In 1918, there were fewer than 700 commissioned officers in the PHS. Charged with the daunting task of protecting the health of some 106 million Americans, PHS officers were stationed not only in the United States but also abroad.

Because few diseases could be cured, the prevention of disease was central to the PHS mission. Under the leadership of Surgeon General Rupert Blue, the PHS advocated the use of scientific research, domestic and foreign quarantine, marine hospitals and statistics to accomplish this mission. When an epidemic emerged, the Public Health Service’s epidemiologists tracked the disease, house by house. The 1918 influenza pandemic occurred too rapidly for the PHS to develop a detailed study of the pandemic.

Typhoid map

This map was used to trace a smaller typhoid epidemic which erupted in Washington, DC in 1906. [Credit: Office of the Public Health Service Historian]

The spread of disease within the US was a serious concern. However, PHS officers were most concerned about the importation of disease into the United States. To prevent this, ships could be, and often were, quarantined by the PHS.

Fever quarantine station, 1880

Travelers and immigrants to the United States were also required to undergo a medical exam when entering the country. In 1918 alone, 700,000 immigrants underwent a medical exam at the hands of PHS officers. Within the United States, PHS officers worked directly with state and local departments of health to track, prevent and arrest epidemics as they emerged. During 1918, PHS officers found themselves battling not only influenza but also polio, typhus, typhoid, smallpox and a range of other diseases. In 1918, the PHS operated research laboratories stretching from Hamilton, Montana, to Washington, DC. Scientific researchers at these laboratories ultimately discovered both the causes and cures of diseases ranging from Rocky Mountain spotted fever to pellagra.

Sewers and Sanitation:

In the nineteenth century, most physicians and public health experts believed that disease was caused not by microorganisms but rather by dirt itself.

Sanitarians, as these people were called, argued that cleaning dirt-infested cities and building better sewage systems would both prevent and end many epidemics. At their urging, cities and towns across the United States built better sewage systems and provided citizens with access to clean water. By 1918, these improved water and sewage systems had greatly contributed to a decline in gastrointestinal infections and a significant reduction in mortality rates among infants, children and young adults.

But because diseases are caused by microorganisms, not dirt, these tactics were not completely effective in ending all epidemics.

Sanitation: Controlling problems at source

Box 1: Sharing toilets in Uganda

A recent survey by the Ministry of Health in Uganda suggested that there is only one toilet for every 700 Ugandan pupils, compared with one for every 328 pupils in 1995. Of the 8,000 schools surveyed, only 33% have separate latrines for girls. The deterioration in sanitary conditions was attributed to increased enrolment in schools. UNICEF surveyed 90 primary schools in crisis-affected districts of north and west Uganda: only 2% had adequate latrine facilities (IRIN, 1999).

Box 2: Sanitation and diarrhoeal disease

Gwatkin and Guillot (1999) have claimed that diarrhoea accounts for 11% of all deaths in the poorest 20% of all countries. This toll could be reduced by key measures: better sanitation to reduce the causes of water-linked diarrhoea, and more widespread use of oral rehydration therapy (ORT) to treat its effects. Improving water supplies, sanitation facilities and hygiene practices reduces diarrhoea incidence by 26%. Even more impressive, deaths due to diarrhoea are reduced by 65% with these same improvements (Esrey et al., 1991). Of the 2.2 million people that die from diarrhoea each year, many deaths are caused by one bacterium, Shigella. Simple hand washing with soap and water reduces transmission of Shigella and other diarrhoeal pathogens by 35% (Kotloff et al., 1999; Khan, 1982). ORT is effective in reducing deaths due to diarrhoea but does not prevent the disease. The sketch below works through the scale these figures imply.
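A minimal back-of-envelope sketch using only the figures quoted in Box 2; treating the percentages as directly applicable to the global death toll is a simplifying assumption:

```python
# Back-of-envelope arithmetic with the Box 2 figures; treating the quoted
# percentages as directly applicable is a simplifying assumption.
annual_deaths = 2_200_000   # diarrhoeal deaths per year (quoted above)

# Esrey et al. (1991): water, sanitation and hygiene improvements
# reduce diarrhoeal deaths by 65%.
averted = 0.65 * annual_deaths
print(f"Averted by water/sanitation/hygiene: {averted:,.0f} per year")  # 1,430,000

remaining = annual_deaths - averted
print(f"Remaining deaths: {remaining:,.0f} per year")                   # 770,000
```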

http://www.who.int/water_sanitation_health/sanitproblems/en/index1.html

Garbage: a polluted creek

Influenza Strikes

Throughout history, influenza viruses have mutated and caused pandemics or global epidemics. In 1890, an especially virulent influenza pandemic struck, killing many Americans. Those who survived that pandemic and lived to experience the 1918 pandemic tended to be less susceptible to the disease.


Influenza ward

When it came to treating influenza patients, doctors, nurses and druggists were at a loss. [Credit: Office of the Public Health Service Historian]

The influenza pandemic of 1918-1919 killed more people than the
Great War, known today as World War I (WWI): somewhere
between 20 and 40 million people. It has been cited as the most
devastating epidemic in recorded world history. More people died of
influenza in a single year than in the four years of the Black Death
(bubonic plague) from 1347 to 1351. Known as the “Spanish Flu” or “La Grippe,”
the influenza of 1918-1919 was a global disaster.

The Grim Reaper by Louis Raemaekers

In the fall of 1918 the Great War in Europe was winding down and
peace was on the horizon. The Americans had joined in the fight,
bringing the Allies closer to victory against the Germans. Deep within
the trenches, these men lived through some of the most brutal conditions
imaginable. Then, in pockets
across the globe, something erupted that seemed as benign as the
common cold. The influenza of that season, however, was far more
than a cold. In the two years that this scourge ravaged the earth,
a fifth of the world’s population was infected. The flu was most deadly
for people ages 20 to 40. This pattern of morbidity was unusual for
influenza which is usually a killer of the elderly and young children.
It infected 28% of all Americans (Tice). An estimated 675,000
Americans died of influenza during the pandemic, ten times as
many as in the world war. Of the U.S. soldiers who died in Europe,
half of them fell to the influenza virus and not to the enemy (Deseret
News). An estimated 43,000 servicemen mobilized for WWI died
of influenza (Crosby). 1918 would go down as unforgettable year
of suffering and death and yet of peace. As noted in the Journal
of the American Medical Association final edition of 1918:   “The 1918
has gone: a year momentous as the termination of the most cruel war
in the annals of the human race; a year which marked, the end at
least for a time, of man’s destruction of man; unfortunately a year in
which developed a most fatal infectious disease causing the death
of hundreds of thousands of human beings. Medical science for
four and one-half years devoted itself to putting men on the firing
line and keeping them there. Now it must turn with its whole might to
combating the greatest enemy of all–infectious disease,” (12/28/1918).

From Kansas to Europe and Back Again:

scourge ravaged the earth

Where did the 1918 influenza come from? And why was it so lethal?

In 1918, the Public Health Service had just begun to require state
and local health departments to provide it with reports about
diseases in their communities. The problem? Influenza wasn’t
a reportable disease.

But in early March of 1918, officials in Haskell County, Kansas,
sent a worrisome report to the Public Health Service. Although
these officials knew that influenza was not a reportable disease,
they wanted the federal government to know that “18 cases
of influenza of a severe type” had been reported there.

By May, reports of severe influenza trickled in from Europe. Young
soldiers, men in the prime of life, were becoming ill in large
numbers. Most of these men recovered quickly but some developed
a secondary pneumonia of “a most virulent and deadly type.”

Within two months, influenza had spread from the military to the
civilian population in Europe. From there, the disease spread outward—to Asia, Africa, South America and, back again, to North America.

Wave After Wave:

In late August, the influenza virus probably mutated again and
epidemics now erupted in three port cities: Freetown, Sierra
Leone; Brest, France; and Boston, Massachusetts. In Boston,
dockworkers at Commonwealth Pier reported sick in massive
numbers during the last week in August. Suffering from fevers
as high as 105°F, these workers had severe muscle and
joint pains. For most of these men, recovery quickly followed. But
5 to 10% of these patients developed severe and massive
pneumonia. Death often followed.

Public health experts had little time to register their shock at the
severity of this outbreak. Within days, the disease had spread
outward to the city of Boston itself. By mid-September, the epidemic
had spread even further with states as far away as California, North
Dakota, Florida and Texas reporting severe epidemics.

The Unfolding of the Pandemic:

The pandemic of 1918-1919 occurred in three waves. The first
wave had occurred when mild influenza erupted in the late
spring and summer of 1918. The second wave occurred with an
outbreak of severe influenza in the fall of 1918 and the final wave
occurred in the spring of 1919.

In its wake, the pandemic would leave about twenty million dead
across the world. In America alone, about 675,000 people in
a population of 105 million would die from the disease.


Mobilizing to Fight Influenza:

Although taken unaware by the pandemic, federal, state and local
authorities quickly mobilized to fight the disease.

On September 27th, influenza became a reportable disease. However,
influenza had become so widespread by that time that most states
were unable to keep accurate records. Many simply failed to
report to the Public Health Service during the pandemic, leaving
epidemiologists to guess at the impact the disease may have
had in different areas.

World War I had left many communities with a shortage of trained
medical personnel. As influenza spread, local officials urgently
requested the Public Health Service to send nurses and doctors.
With fewer than 700 officers on duty, the Public Health Service was
unable to meet most of these requests. On the rare occasions when
the PHS was able to send physicians and nurses, they often became
ill en route. Those who did reach their destination safely often found
themselves both unprepared and unable to provide real assistance.

In October, Congress appropriated a million dollars for the Public
Health Service. The money enabled the PHS to recruit and pay
for additional doctors and nurses. The existing shortage of doctors
and nurses, caused by the war, made it difficult for the PHS to locate and hire qualified practitioners. The virulence of the disease also meant that many nurses and doctors contracted influenza
within days of being hired.

Confronted with a shortage of hospital beds, many local officials
ordered that community centers and local schools be transformed
into emergency hospitals. In some areas, the lack of doctors meant
that nursing and medical students were drafted to staff these
makeshift hospitals.

The Pandemic Hits:

Entire families became ill. In Philadelphia, a city especially hard hit,
so many children were orphaned that the Bureau of Child Hygiene
found itself overwhelmed and unable to care for them.

As the disease spread, schools and businesses emptied. Telegraph
and telephone services collapsed as operators took to their
beds. Garbage went uncollected as garbage men reported sick.
The mail piled up as postal carriers failed to come to work.

State and local departments of health also suffered from high
absentee rates. No one was left to record the pandemic’s spread
and the Public Health Service’s requests for information went
unanswered.

As the bodies accumulated, funeral parlors ran out of caskets
and bodies went uncollected in morgues.

Protecting Yourself From Influenza:

In the absence of a sure cure, fighting influenza seemed an
impossible task.

In many communities, quarantines were imposed to prevent
the spread of the disease. Schools, theaters, saloons, pool
halls and even churches were all closed. As the bodies
mounted, even funerals were held outdoors to protect mourners
against the spread of the disease.

An Emergency Hospital for Influenza Patients

The effect of the influenza epidemic was so severe that the
average life span in the US was depressed by 10 years.
The influenza virus had a profound virulence, with a mortality
rate of 2.5%, compared with previous influenza epidemics, in which
it was less than 0.1%. The death rate from influenza and pneumonia
among 15- to 34-year-olds was 20 times higher in 1918 than in
previous years (Taubenberger). People were struck
with illness on the street and died rapid deaths.
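As a rough consistency check, the sketch below combines three figures quoted in this article (a population of 105 million, the 28% attack rate, and the 2.5% case-fatality rate) and recovers the right order of magnitude for the roughly 675,000 American deaths. This is a back-of-the-envelope illustration, not an epidemiological estimate.

```python
# Back-of-the-envelope check using figures quoted in this article.
us_population_1918 = 105_000_000  # "a population of 105 million"
attack_rate = 0.28                # "It infected 28% of all Americans (Tice)"
case_fatality_rate = 0.025        # "a mortality rate of 2.5%"

implied_deaths = us_population_1918 * attack_rate * case_fatality_rate
print(f"Implied US deaths: {implied_deaths:,.0f}")
# -> Implied US deaths: 735,000 (the same order as the ~675,000 reported above)
```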

One anecdote shared of 1918 was of four women playing bridge
together late into the night. Overnight, three of the women died
from influenza (Hoagg). Others told stories of people on their way
to work suddenly developing the flu and dying within hours
(Henig). One physician wrote that patients with seemingly
ordinary influenza would rapidly “develop the most vicious
type of pneumonia that has ever been seen” and that later, when
cyanosis appeared in the patients, “it is simply a struggle for air
until they suffocate,” (Grist, 1979). Another physician recalled
that the influenza patients “died struggling to clear their airways
of a blood-tinged froth that sometimes gushed from their nose
and mouth,” (Starr, 1976). The physicians of the time were
helpless against this powerful agent of influenza. In 1918 children
would skip rope to the rhyme (Crawford):

I had a little bird,
Its name was Enza.
I opened the window,
And in-flu-enza.

schools inspected

The influenza pandemic circled the globe. Most of humanity felt the
effects of this strain of the influenza virus. It spread following
the path of its human carriers, along trade routes and shipping lines.
Outbreaks swept through North America, Europe, Asia, Africa, Brazil
and the South Pacific (Taubenberger). In India the mortality rate was
extremely high at around 50 deaths from influenza per 1,000
people (Brown). The Great War, with its mass movements of men
in armies and aboard ships, probably aided in its rapid diffusion
and attack. The origins of the deadly flu disease were unknown but
widely speculated upon. Some of the Allies thought of the epidemic as a
biological warfare tool of the Germans. Many thought it was a result of
the trench warfare, the use of mustard gases and the generated “smoke
and fumes” of the war. A national campaign began using the ready
rhetoric of war to fight the new enemy of microscopic proportions. A
study attempted to explain why the disease had been so devastating
in certain localized regions, looking at the climate, the weather and
the racial composition of cities. Its authors found humidity to be linked with
more severe epidemics as it “fosters the dissemination of the bacteria,”
(Committee on Atmosphere and Man, 1923). Meanwhile the new
sciences of the infectious agents and immunology were
racing to come up with a vaccine or therapy to stop the epidemics.

First-hand accounts preserve the experiences of people in military camps
during the influenza pandemic: an excerpt from the memoirs of a survivor
of the pandemic at Camp Funston; a letter to a fellow physician describing
conditions during the influenza epidemic at Camp Devens; and a collection
of letters of a soldier stationed at Camp Funston.

The origins of this influenza variant are not precisely known. It is thought
to have originated in China in a rare genetic shift of the influenza virus.
The recombination of its surface proteins created a virus novel to
almost everyone and a loss of herd immunity. Recently the virus
has been reconstructed from the tissue of a dead soldier and is
now being genetically characterized.

The name of Spanish Flu came from the early affliction and large
mortalities in Spain (BMJ, 10/19/1918), where it allegedly killed 8
million in May (BMJ, 7/13/1918). However, a first wave of influenza
appeared early in the spring of 1918 in Kansas and in military
camps throughout the US. Few noticed the epidemic in the midst of
the war. Wilson had just given his Fourteen Points address. There was
virtually no response to, or acknowledgment of, the epidemics in March
and April in the military camps. It was unfortunate that no steps were
taken to prepare for the usual recrudescence of the virulent influenza
strain in the winter. The lack of action was later criticized when the
epidemic could not be ignored in the winter of 1918 (BMJ, 1918).
These first epidemics at training camps were a sign of what was
coming in greater magnitude in the fall and winter of 1918 to the
entire world.

The war brought the virus back into the US for the second wave
of the epidemic. It first arrived in Boston in September of 1918
through the port busy with war shipments of machinery and supplies.
The war also enabled the virus to spread and diffuse. Men across
the nation were mobilizing to join the military and the cause. As they
came together, they brought the virus with them and to those they
contacted. The virus killed almost 200,000 in October of 1918
alone. The end of the war on November 11, 1918 enabled a resurgence.
As people celebrated Armistice Day with parades and large parties
(a complete disaster from the public health standpoint), a rebirth of
the epidemic occurred in some cities. The flu that winter was beyond
imagination as millions were infected and thousands died. Just as
the war had affected the course of influenza, influenza affected
the war. Entire fleets were ill with the disease and men on the front
were too sick to fight. The flu was devastating to both sides, killing
more men than their own weapons could.

With the military patients coming home from the war with battle wounds
and mustard gas burns, hospital facilities and staff were taxed
to the limit. This created a shortage of physicians, especially in the
civilian sector as many had been lost for service with the military.
Since the medical practitioners were away with the troops, only
the medical students were left to care for the sick. Third- and fourth-
year classes were closed and the students assigned jobs as
interns or nurses (Starr, 1976). One article noted that “depletion has
been carried to such an extent that the practitioners are brought
very near the breaking point,” (BMJ, 11/2/1918). The shortage was
further compounded by the added loss of physicians to the epidemic.
In the U.S., the Red Cross had to recruit more volunteers to contribute
to the new cause at home of fighting the influenza epidemic. To respond
with the fullest utilization of nurses, volunteers and medical supplies, the
Red Cross created a National Committee on Influenza. It was involved
in both military and civilian sectors to mobilize all forces to fight Spanish
influenza (Crosby, 1989). In some areas of the US, the nursing shortage
was so acute that the Red Cross had to ask local businesses to
allow workers to have the day off if they volunteered in the hospitals
at night (Deseret News). Emergency hospitals were created to
take in the patients from the US and those arriving sick from overseas.

Chelsea Marine Hospital in 1918

Red Cross public health nurse

The pandemic affected everyone. With one-quarter of the US and
one-fifth of the world infected with the influenza, it was impossible
to escape from the illness. Even President Woodrow Wilson suffered
from the flu in early 1919 while negotiating the crucial Treaty of
Versailles to end the World War (Tice). Those who were
lucky enough to avoid infection had to deal with the public health
ordinances to restrain the spread of the disease.

The public health departments distributed gauze masks to be worn
in public. Stores could not hold sales, funerals were limited
to 15 minutes. Some towns required a signed certificate to
enter and railroads would not accept passengers without
them. Those who ignored the flu ordinances had to pay steep
fines enforced by extra officers (Deseret News). Bodies piled up
as the massive deaths of the epidemic ensued. Besides the
lack of health care workers and medical supplies, there was a shortage
of coffins, morticians and gravediggers (Knox). The conditions in 1918
were not so far removed from the Black Death in the era of the
bubonic plague of the Middle Ages.

Iowa flu

In 1918-19 this deadly influenza pandemic erupted during the final
stages of World War I. Nations were already attempting to deal with
the effects and costs of the war. Propaganda campaigns and war
restrictions and rations had been implemented by governments.
Nationalism pervaded as people accepted government authority.
This allowed the public health departments to easily step in and
implement their restrictive measures. The war also gave science
greater importance as governments relied on scientists, now armed
with the new germ theory and the development of antiseptic surgery,
to design vaccines and reduce mortalities of disease and battle
wounds. Their new technologies could preserve the men on
the front and ultimately save the world. These conditions
created by World War I, together with the current social attitudes
and ideas, led to the relatively calm response of the public and
application of scientific ideas. People tolerated strict measures
and loss of freedom during the war, placing the needs of the
nation ahead of their personal needs. They had already
accepted the limitations that came with rationing and drafting.
The responses of the public health officials reflected the new
allegiance to science and the wartime society. The medical
and scientific communities had developed new theories and
applied them to prevention, diagnostics and treatment of the
influenza patients.

The Medical and Scientific Conceptions of Influenza

Scientific ideas about influenza, the disease and its origins,
shaped the public health and medical responses. In 1918
infectious diseases were beginning to be unraveled. Pasteur
and Koch had solidified the germ theory of disease through
clear experiments and clever science. The bacilli responsible
for many infections, such as tuberculosis and anthrax, had
been visualized, isolated and identified. Koch’s postulates
had been developed to clearly link a disease to a specific
microbial agent.

Robert Koch

The petri dish was widely used to grow sterile cultures of bacteria
and investigate bacterial flora. Vaccines had been created for
bacterial infections and even the unseen rabies virus by
serial passage techniques. The immune system was explained by
Paul Ehrlich and his side-chain theory. Tests of antibodies, such as
the Wassermann reaction, and coagulation experiments were becoming commonplace.
Science and medicine were on their way to their complete entanglement
and fusion as scientific principles and methodologies made their way
into clinical practice, diagnostics and therapy.

The Clinical Descriptions of Influenza

Patients with the influenza disease of the epidemic were generally
characterized by common complaints associated with the flu. They had
body aches, muscle and joint pain, headache, a sore throat and an
unproductive cough with occasional harsh breathing (JAMA, 1/25/1919).

The most common sign of infection was the fever, which ranged from
100 to 104°F and lasted for a few days. The onset of the epidemic influenza
was peculiarly sudden, as people were struck down with dizziness, weakness
and pain while on duty or in the street (BMJ, 7/13/1918). After the
disease was established the mucous membranes became reddened
with sneezing. In some cases there was a hemorrhage of the
mucous membranes of the nose and bloody noses were commonly
seen. Vomiting occurred on occasion, and also sometimes diarrhea
but more commonly there was constipation (JAMA, 10/3/1918).

The danger of an influenza infection was its tendency to progress into
the often fatal secondary bacterial infection of pneumonia. In the
patients who did not rapidly recover after three or four days of fever, there
was an “irregular pyrexia” due to bronchitis or bronchopneumonia (BMJ,
7/13/1918). The pneumonia would often appear after a period of
normal temperature with a sharp spike and expectoration of bright
red blood. The lobes of the lung became speckled with “pneumonic
consolidations.” The fatal cases developed toxemia and vasomotor
depression (JAMA, 10/3/1918). It was this tendency for secondary
complications that made this influenza infection so deadly.

pneumonia

A military hospital ward in 1918

In the medical literature characterizing the influenza disease, new
diagnostic techniques were frequently used to describe the clinical
appearance. The most basic clinical guideline was the temperature,
a record of which was kept in a table over time. Also closely
monitored was the pulse rate. One clinical account said that
“the pulse was remarkably slow,” (JAMA, 4/12/1919) while others
noted that the pulse rate did not increase as expected. With the
pulse, the respiration rate was measured and reported to provide
clues of the clinical progression.
Patients were also occasionally “roentgenographed” or chest x-rayed,
(JAMA, 1/25/1919). The discussion of clinical influenza also often
included analysis of the blood. The number of white blood cells was
counted for many patients. Leukopenia was commonly associated
with influenza. The albumin was also measured, since it was noted that
transient albuminuria was frequent in influenza patients. This was
done by urine analysis. The Wassermann reaction was another
newly added blood test for antibodies (JAMA, 10/3/1918).
These new measurements enabled physicians to project an
image of action and knowledge through scientific instruments. They
could record precisely the progress of the influenza infection and perhaps
forecast its outcome.

The most novel of these tests were the blood and sputum cultures.
Building on the germ theory of disease, the physicians and their
associated research scientists attempted to find the culprit for this
deadly infection. Physicians would commonly order both blood and sputum
cultures of their influenza and pneumonia patients mostly for research
and investigative purposes. At the military training camp
Camp Lewis during an influenza epidemic, “in all cases of pneumonia,
a sputum study, white blood and differential count, blood culture
and urine examinations were made as routine,” (JAMA, 1/25/1919).

The bacterial flora of the nasopharynx of some patients was also cultured
since droplet infection was how the disease disseminated. The
collected swabs and specimens were inoculated onto blood agar in
petri dishes. The resulting bacterial colonies were closely studied to
find the causal organism. Commonly found were pneumococcus,
streptococcus, staphylococcus and Bacillus influenzae (JAMA, 4/12/1919).

pneumonia

These new laboratory tests used in the clinical setting brought in a solid
scientific, biological link to the practice of medicine. Medicine had
become fully scientific and technologic in its understanding and
characterization of the influenza epidemic.

Treatment and Therapy

The therapeutic remedies for influenza patients varied from the
newly developed drugs to oils and herbs. The therapy was much less
scientific than the diagnostics, as the drugs had no clear explanatory
theory of action. The treatment was largely symptomatic, aiming to
reduce fever or pain. Aspirin, or acetylsalicylic acid, was a common remedy.
For secondary pneumonia, doses of epinephrine were given. To
combat the cyanosis, physicians gave oxygen by mask, or some
injected it under the skin (JAMA, 10/3/1918). Others used salicin, which
reduced pain, discomfort and fever and was claimed to reduce the infectivity
of the patient. Another popular remedy was cinnamon in powder or oil form
with milk to reduce temperature (BMJ, 10/19/1918). Finally, salt of quinine
was suggested as a treatment. Most physicians agreed that the patient should
be kept in bed (BMJ, 7/13/1918). Along with this came the advice of plenty of
fluids and nourishment. The application of cold to the head, with
warm packs or warm drinks was also advised. Warm baths were used
as a hydrotherapeutic method in hospitals but were discarded for
lack of success (JAMA, 10/3/1918). These treatments, like the
suggested prophylactic measures of the public health officials, seemed to
originate in the common social practices and not in the growing field of
scientific medicine. It seems that as science was entering the medical
field, it served only for explanatory, diagnostic and preventative
measures such as vaccines and technical tests. This science had
little use once a person was ill.

However, a few proposed treatments did incorporate scientific ideas
of germ theory and the immune system. O’Malley and Hartman
suggested treating influenza patients with the serum of convalescent
patients, utilizing the theorized antibodies to boost the immune
system of sick patients. Other treatments were “digitalis,” and the
intravenous administration of isotonic glucose and sodium bicarbonate,
which was done in military camps (JAMA, 1/4/1919). Ross and
Hund too utilized ideas about the immune system and properties of the
blood to neutralize toxins and circulate white blood cells. They believed
that the best treatment for influenza should aim to: “…neutralize or render
the intoxicant inert…and prevent the blood destruction with its destructive
leukopenia and lessened coagulability,” (JAMA, 3/1/1919). They tried
to create a therapeutic immune serum to fight infection. These therapies
built on current scientific ideas and represented the highest
biomedical, technological treatment like the antitoxin to diphtheria.

influenza

In July, an American soldier said that while influenza caused a heavy
fever, it “usually only confines the patient to bed for a few days.” The
mutation of the virus changed all that. [Credit: National Library of Medicine]

Recovering from influenza

An old cliché maintained that influenza was a wonderful disease as
it killed no one but provided doctors with lots of patients. The 1918
pandemic turned this saying on its head. [Credit: The Etiology of
Influenza in 1918]

During the 1890 influenza epidemic, Pfeiffer found what he
determined to be the microbial agent that caused influenza.
In the sputum and respiratory tract of influenza patients in 1892,
he isolated the bacterium Bacillus influenzae, which was
accepted as the true “virus” though it was not found in localized
outbreaks (BMJ, 11/2/1918). However, in studies of the 1907-8
epidemic in the US, Lord had found the bacillus in only 3 of 20 cases.
He also found the bacillus in 30% of cultures of sputum from TB patients.
Rosenthal further refuted the finding when he found the bacillus in 1 of 6
healthy people in 1900 (JAMA, 1/18/1919). The bacillus was also
found to be present in all cases of whooping cough and many cases
of measles, chronic bronchitis and scarlet fever (JAMA, 10/5/1918).
The influenza pandemic provided scientists the opportunity to confirm
or refute this contested microbe as the cause of influenza. The sputum
studies from the Camp Lewis epidemic found only a few influenza cases
harboring the influenza bacilli, and mostly type IV pneumococcus. They
concluded that “the recent epidemic at Camp Lewis was an acute
respiratory infection and not an epidemic due to Bacillus influenzae,”
(JAMA, 1/25/1919). This finding, along with others, suggested to most
scientists that Pfeiffer’s bacillus was not the cause of influenza.

In the 1918-19 influenza pandemic, there was a great drive to find the
etiological agent responsible for the deadly scourge. Scientists in their
labs were working hard, using the cultures obtained from physician clinics,
to isolate the etiological agent for influenza. As a report early in the
epidemic said, “the ‘influence’ of influenza is still veiled in mystery,”
(JAMA, 10/5/1918). The nominated Bacillus influenzae
seemed to be incorrect, and scientists scrambled to isolate the true cause.
In the journals, many authors speculated on the type of agent: was
it a new microbe, was it a bacterium, was it a virus? One journal offered
that “the severity of the present pandemic, the suddenness of onset…
led to the suggestion that the disease cannot be influenza but some other
and more lethal infection,” (BMJ, 11/2/1918). However, most accepted that
the epidemic disease was influenza based on the familiar symptoms
and known pattern of disease. The respiratory disease of influenza was
understood to give warning in the late spring of its potential effects
upon its recrudescence once the weather turned cold in the winter
(BMJ, 10/19/1918). One article with foresight stated that “there can
be no question that the virus of influenza is a living organism…

flu virus EM

it is possibly beyond the range of microscopic vision,” (BMJ, 11/16/1918). Another
article confirmed the idea of an “undiscovered virus” and noted that pneumococci
and streptococci were responsible for “the gravity of the secondary pulmonary
complications,” (BMJ, 11/2/1918). The article went on to offer the idea of a
symbiosis of virus and secondary bacterial infection combining to make it
such a severe disease.

As the investigators attempted to find the agent responsible for the influenza
pandemic, they were developing ideas of infectious microbes and the concept of the
virus. The idea of the virus as an infectious agent had been around for years.
The articles of the period refer to the “virus” in their discussion but do not
consistently use it to mean an infectious microbe distinct from bacteria; the
term virus had the same usage and application as bacillus. In 1918, a virus
was defined scientifically as a submicroscopic infectious entity which could
be filtered but not grown in vitro. In the 1880s Pasteur had developed an attenuated
vaccine for the rabies virus by serial passage, well ahead of his time. Ivanovski’s
work on the tobacco mosaic virus in the 1890s led to the discovery of the virus:
he found an infectious agent that acted as a micro-organism, in that it multiplied,
yet passed through a sterilizing filter as no microbe could. By the 1910s
several viruses, defined as filterable infectious microbes, had been identified
as causing infectious disease (Hughes). However, the scientists were still
conceptually behind in defining a virus; they distinguished it only by size
from bacteria and not as an obligate parasite with a distinct life cycle
dependent on infecting a host cell.
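The period’s operational definition reduces to two observable criteria: does the agent pass a sterilizing filter, and can it be grown in vitro? The sketch below encodes that 1918-era decision rule; the function name and category labels are modern paraphrases added for illustration, not period terminology.

```python
def classify_agent_1918(passes_sterilizing_filter: bool, grows_in_vitro: bool) -> str:
    """Classify an infectious agent by the operational criteria of 1918.

    A 'virus' was then defined only as a submicroscopic infectious entity
    that could be filtered but not grown in vitro; size, not life cycle,
    was the distinguishing property.
    """
    if passes_sterilizing_filter and not grows_in_vitro:
        return "filterable virus (submicroscopic)"
    if not passes_sterilizing_filter and grows_in_vitro:
        return "bacterium (culturable, retained by the filter)"
    return "indeterminate by 1918 criteria"

# Pfeiffer's Bacillus influenzae: culturable on blood agar, retained by filters.
print(classify_agent_1918(passes_sterilizing_filter=False, grows_in_vitro=True))
# The agent in a filtered bronchial expectorate: passes the filter, no culture.
print(classify_agent_1918(passes_sterilizing_filter=True, grows_in_vitro=False))
```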

The influenza epidemic afforded the opportunity to research the etiological
agent and develop the idea of the virus. Experiments by Nicolle and Le Bailly in
Paris were the earliest suggestions that influenza was caused by a “filter-passing
virus,” (BMJ, 11/2/1918). They filtered out the bacteria from bronchial expectoration
of an influenza patient and injected the filtrate into the eyes and nose of two monkeys.
The monkeys developed a fever and a marked depression. The filtrate was later
administered subcutaneously to a volunteer, who developed typical signs of influenza.
They reasoned that the inoculated person developed influenza from the filtrate since
no one else in their quarters developed influenza (JAMA, 12/28/1918). These scientists
followed Koch’s postulates as they isolated the causal agent from patients with the
illness and used it to reproduce the same illness in animals. Through these studies,
the scientists proved that influenza was due to a submicroscopic infectious agent
and not a bacterium, refuting the claims of Pfeiffer and advancing virology. They were
on their way to discerning the virus and characterizing the orthomyxoviruses that
cause the disease of influenza.

These scientific experiments, which unraveled the cause of influenza, had immediate
preventive applications. They would assist in the effort to create an effective
vaccine to prevent influenza. This was the ultimate goal of most studies, since
vaccines were thought to be the best preventive solution in the early 20th century.
Several experiments attempted to produce vaccines, each with a different
understanding of the etiology of fatal influenza infection. A Dr. Rosenow devised
a vaccine, prepared from the serum of patients, to target the multiple bacterial
agents involved. He aimed to raise immunity against the bacteria, the “common causes
of death,” and not the cause of the initial symptoms, by inoculating with the
bacterial proportions found in the lungs and sputum (JAMA, 1/4/1919). The vaccines
made for the British forces
took a similar approach and were “mixed vaccines” of pneumococcus and
lethal streptococcus. The vaccine development therefore focused on the culture
results of what could be isolated from the sickest patients and lagged behind the
scientific progress.

Fading of the Pandemic:

In November, two months after the pandemic had erupted, the Public Health Service
began reporting that influenza cases were declining.

Communities slowly lifted their quarantines. Masks were discarded. Schools were
re-opened and citizens flocked to celebrate the end of World War I.

But the disease continued to be a threat throughout the spring of 1919.

By the time the pandemic had ended, in the summer of 1919, nearly 675,000
Americans were dead from influenza. Hundreds of thousands more were orphaned
and widowed.

The Legacy of the Pandemic

No one knows exactly how many people died during the 1918-1919 influenza
pandemic. During the 1920s, researchers estimated that 21.5 million people died
as a result of the 1918-1919 pandemic. More recent studies have placed
global mortality from the 1918-1919 pandemic at anywhere between 30 and 50
million. An estimated 675,000 Americans were among the dead.


The sections that follow cover research after the pandemic, the forgetting of the pandemic of 1918-1919, scientific milestones, and the twentieth century’s other influenza pandemics or global epidemics.

The influenza pandemic occurred in three waves in the United States throughout
1918 and 1919.

More Americans died from influenza than died in World War I. [Credit: National Library of Medicine]

All of these deaths caused a severe disruption in the economy. Claims against life
insurance policies skyrocketed, with one insurance company reporting a 745 percent
rise in the number of claims made. Small businesses, many of which had been unable to operate during the pandemic, went bankrupt.

Joseph Goldberger

Joseph Goldberger, one of the leading researchers in the PHS, studied influenza
during the pandemic. But Goldberger had multiple interests and influenza research
became less important to him in the years following 1918. [Credit: Office of the Public
Health Service Historian]

In the summer and fall of 1919, Americans called for the government to research
both the causes and impact of the pandemic. In response, both the federal government
and private companies, such as Metropolitan Life Insurance, dedicated money
specifically for flu research.

In an attempt to determine the effect influenza had on different communities, the Public
Health Service conducted several small epidemiological studies. These studies,
however, were conducted after the pandemic and most PHS officers
admitted that the data which was collected was probably inaccurate.

PHS scientists continued to search for the causative agent of influenza in their
laboratories as did their fellow scientists in and outside the United States.

But while there was a burst of enthusiasm for funding flu research in
1918-1919, the funds allocated for this research were actually fairly meager.
As time passed, Americans became less interested in the pandemic and its
causes. And even when funding for medical research dramatically increased
after World War II, funding for research on the 1918-1919 pandemic remained
limited.

Forgetting the 1918-1919 Pandemic:

In the years following 1919, Americans seemed eager to forget the pandemic.
Given the devastating impact of the pandemic, the reasons for this forgetfulness
are puzzling.

It is possible, however, that the pandemic’s close association with World War I
may have caused this amnesia. While more people died from the pandemic than
from World War I, the war had lasted longer than the pandemic and caused
greater and more immediate changes in American society.

Influenza also hit communities quickly. Often it disappeared within a few weeks of
its arrival. As one historian put it, “the disease moved too fast, arrived, flourished
and was gone before…many people had time to fully realize just how great
was the danger.” Small wonder, then, that many Americans forgot about the
pandemic in the years which followed.

Scientific Milestones in Understanding and Preventing Influenza:

In the early stages of the pandemic, many scientists believed that the agent
responsible for influenza was Pfeiffer’s bacillus. Autopsies and research conducted
during the pandemic ultimately led many scientists to discard this theory.

In late October of 1918, some researchers began to argue that influenza was
caused by a virus. Although scientists had understood that viruses could cause
diseases for more than two decades, virology was still very much in its infancy at
this time.

It was not until 1933 that the influenza A virus, which causes almost every type
of endemic and pandemic influenza, was isolated. Seven years later, in 1940,
the influenza B virus was isolated. The influenza C virus was finally isolated in 1950.

Influenza vaccine was first introduced as a licensed product in the United States in
1944. Because of the rapid rate of mutation of the influenza virus, the
effectiveness of a given vaccine usually lasts for only a year or two.

By the 1950s, vaccine makers were able to prepare and routinely release vaccines
which could be used in the prevention or control of future pandemics. During the
1960s, increased understanding of the virus enabled scientists to develop both
more potent and purer vaccines.

Mass production of influenza vaccines continued, however, to require several
months lead time.

Twentieth-Century Influenza Pandemics or Global Epidemics:

The pandemic which occurred in 1918-1919 was not the only influenza pandemic
of the twentieth century. Influenza returned in a pandemic form in 1957-1958
and, again, in 1968-1969.

These two later pandemics were much less severe than the 1918-1919 pandemic.
Estimated deaths within the United States for these two later pandemics
were 70,000 excess deaths (1957-1958) and 33,000 excess deaths (1968-1969).

Tuberculosis

Mycobacterium tuberculosis was first discovered in 1882 by Robert Koch and is one of almost 200 mycobacterial species which have been detected by molecular techniques. The genus Mycobacterium (given its own family, the Mycobacteriaceae, within the phylum Actinobacteria) includes pathogens known to cause serious diseases in mammals, including tuberculosis (MTBC) and leprosy (M. leprae). Mycobacteria are grouped neither as Gram-positive nor Gram-negative bacteria. MTBC consists of M. tuberculosis, M. bovis, M. bovis BCG (bacillus Calmette-Guérin), M. africanum, M. caprae, M. microti, M. canettii and M. pinnipedii, all of which share genetic homology, with no significant variation between sequences (∼0.01 to 0.03%), although differences in phenotypes are present. Cells in the genus have a typical rod or slightly curved shape, with dimensions of 0.2 to 0.6 μm by 1 to 10 μm.

Mycobacterium tuberculosis has a waxy mycolic acid lipid complex coating on its cell surface. The cells are impervious to Gram staining, so a common staining procedure used is Ziehl-Neelsen (ZN) staining. The outer compartment of the cell wall contains lipid-linked polysaccharides, is water-soluble, and interacts with the immune system. The inner wall is impermeable. Mycobacteria have some unique qualities that are divergent from members of the Gram-positive group, such as the presence of mycolic acids in the cell wall.

MTBC and M. leprae replication occurs in the tissues of warm-blooded human hosts. This airborne pathogen is transmitted from an active pulmonary tuberculosis patient by coughing. Droplet nuclei, approximately 1 to 5 μm in size, “meander” in the air and are transmitted to susceptible individuals by inhalation. Mycobacteria are incapable of replicating in or on inanimate objects. The risk of infection depends on the load of bacilli inhaled, the level of infectiousness of the source, the closeness of contact and the immune competency of potential hosts. Because of the small size of the inhaled droplets, the infection penetrates the defense systems of the bronchi and enters the terminal alveoli. Invading bacteria are then engulfed by alveolar macrophages and dendritic cells.
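The paragraph above names four determinants of infection risk (the inhaled bacillary load, the source’s level of infectiousness, the closeness of contact, and the host’s immune competency) without giving a formula. The toy scoring function below simply combines those four named factors; the multiplicative form and the 0-to-1 scales are hypothetical choices for illustration, not a validated transmission model.

```python
def tb_infection_risk_score(bacillary_load: float,
                            source_infectiousness: float,
                            contact_closeness: float,
                            host_immune_competency: float) -> float:
    """Toy score combining the four risk determinants named in the text.

    Each argument is a unitless value in [0, 1]; higher means more of that
    factor. The multiplicative form is an illustrative assumption only.
    """
    exposure = bacillary_load * source_infectiousness * contact_closeness
    return exposure * (1.0 - host_immune_competency)

# A close household contact of a highly infectious case, weak immune system:
print(tb_infection_risk_score(0.8, 0.9, 0.9, 0.2))  # relatively high score
# A brief, distant contact with an immunocompetent host:
print(tb_infection_risk_score(0.1, 0.3, 0.2, 0.9))  # much lower score
```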

The cell-mediated immune response limits the multiplication of M. tuberculosis and halts infection. Infected individuals with strong immune systems are generally able to combat the infection within 2 to 8 weeks post-infection, when the active cell-mediated immune response stops further multiplication of M. tuberculosis. Tuberculosis infection shows several significant clinical manifestations in pulmonary and extra-pulmonary sites. Prolonged coughing, severe weight loss, night sweats, low-grade fever, dyspnoea and chest pain are the clinical symptoms of pulmonary infection.

Fort Bayard, N.M., T.B. service assignment

Fort Bayard, NM

Fort Bayard, NM Post Hospital circa 1890

U.S. Army, General Hospital, Fort Bayard, New Mexico, General View

Tuberculosis, (Pvt.) Richard Johnson said, was “regarded as a much dreaded disease that was easily contracted by association.” In fact, so many hospital corpsmen requested transfers out that the Surgeon General established a policy that no such requests would be considered until after two years of service. Consequently, Johnson noted, “During my time there we had a high percentage of desertions.” For example, all four of the men who arrived with Johnson deserted within a year—“two of them,” he dryly observed, “owing me money.”

Four years later another young man arrived at Fort Bayard. He, too, remarked on the long journey by rail through the “desert waste of New Mexico,” and then the wagon ride over “dry desolate foothills,” to the post. But his reaction was different from Johnson’s. Capt. Earl Bruns moved from being a patient to a physician at the hospital. For Bruns Fort Bayard was “a veritable oasis in the desert, studded with shade trees, green lawns, shrubbery, and flowers.” He credited the hospital commander, Colonel (Col.) George E. Bushnell, writing that, “[i]n this one spot one man had made the desert bloom like a rose.”

Johnson’s and Bruns’ different views from 1904 and 1908, respectively, may reflect the fact that Johnson was healthy and assigned grudgingly to work at the tuberculosis hospital, whereas Bruns had few other options and came in hopes of regaining his health—or it may reflect the improvements Bushnell made during his first years in command. But every week for the more than twenty years that Fort Bayard was an Army tuberculosis hospital, workers and patients arrived with dread and foreboding, or joy and relief—or a mix of them all.

The approach Fort Bayard and George Bushnell took to tuberculosis was similar to how physicians manage the disease today in that it involved isolating the patient, treating the disease, and educating the patient and his family on how to maintain their health. The hospital offered patients sanctuary from the demands, fears, and prejudices regarding tuberculosis in the outside world. Fort Bayard treated tuberculosis patients with prolonged bed rest, fresh air, and a healthy diet, but undertaking this “rest treatment”—confining oneself to bed for months—proved difficult if not impossible for many patients. Fort Bayard also guided patients’ adaptation to new lifestyles as people with tuberculosis. Finally, Fort Bayard managed patients’ transition back to the outside world.

One of the most striking aspects of Fort Bayard was that many of the medical staff had tuberculosis themselves, including George Bushnell. Tuberculosis weakened Bushnell’s lungs and shaped his life in numerous ways. He tired easily, had to carefully monitor his health, and as Earl Bruns observed, “was never a well man.” Bushnell had active tuberculosis five times in his life: the fourth time in 1919 with a breakdown from the strain of wartime work; and the fifth and final illness in 1924 that led to his death at age 70. In 1911 he advised his superiors that, “I did not consider myself strong enough to carry on the work of commanding this Hospital and keeping myself in condition for active duty.” The War Department generally required officers in poor physical condition to retire, but the Surgeon General secured a waiver for Bushnell, because “the interests of the service would suffer by his retirement.” After a leave of absence in 1909–10, Bushnell’s annual reports on the competency of his officers included his own name on the list of those competent for hospital duty, but “unfit for active field service.”

“What would our sanatorium movement and our anti-tuberculosis crusade amount to,” wrote tuberculosis expert Adolphus Knopf, “were it not for the labors of tuberculous physicians, or one-time tuberculous physicians, who, because of their infirmity, had become interested in tuberculosis?” Well-known leaders in the antituberculosis movement such as Edward Trudeau and Lawrence Flick established their sanatoriums after they recovered from tuberculosis in order to offer others the treatment. Twenty-one of the first thirty recipients of the Trudeau Medal, established in 1926 for outstanding work in tuberculosis, had the disease. James Waring, a tuberculosis physician who arrived at a Colorado Springs sanatorium on a stretcher in 1908, later wrote, “It has been my good fortune to serve three separate and extended ‘hitches’ as a ‘bed patient,’ the time so spent numbering in all about nine years.” He, like many physicians, saw his personal experience as an asset in his practice. The three key figures in the Army tuberculosis program during World War I were Bushnell, Bruns, and Gerald Webb of Colorado Springs who started a tuberculosis sanatorium after his wife died of the disease.

Bushnell turned tuberculosis into an asset for the Army Medical Department, making Fort Bayard a center of national expertise on the disease. His personal experience with chronic pulmonary tuberculosis gave him good rapport and credibility with many of his patients. Medical officer Earl Bruns wrote that, “[H]e went among the patients and talked to them individually” and thereby provided “a living example of a cure due to rational treatment.” Bruns described how Bushnell spent his days attending to patients, carrying out administrative duties, and devoting hours to supervising the work in the gardens and grounds of Fort Bayard.

(Who’s Who in America, 1924-25. E. H. Bruns in American Review of Tuberculosis, June 1925. G. B. Webb in Outdoor Life, Sept. 1924. Lancet, Lond., 1924. Jour. Am. Med. Ass’n., 1924, p. 374.)

General George M. Sternberg

In addition to being an Army surgeon, Sternberg was also a noted bacteriologist who, in 1880, had translated Antoine Magnin’s The Bacteria, which presented the latest research in germ theory. Sternberg’s work contributed to preparing American understanding of Robert Koch’s pronouncement in 1882 of the existence of the tubercle bacillus (Ott 1996:55). Over the next two decades Koch’s analysis gained converts, leading to the universally accepted belief that tuberculosis represented a bacterial infection that could be diagnosed and then monitored by microscopic inspection of a patient’s sputum.

Sternberg was no doubt aware of the efforts of Edward Livingston Trudeau. Beginning in the 1870s, when he undertook his own recovery from consumption by withdrawing to the Adirondack Mountains, Trudeau had become an advocate of extended bed rest in remote, healthful environments. Quickly accepting Koch’s research, Trudeau argued that those afflicted by the tubercle bacillus could best be healed when removed from cities and placed under the care of physicians who carefully monitored their weight and sputum and who prescribed constant bed rest with exposure to fresh air. Preferring the term “sanatorium,” derived from the Latin word “to heal,” to “sanitarium,” derived from the Latin term for health, Trudeau founded his Adirondack Cottage Sanatorium at Saranac, New York, in 1885. This spawned the opening of hundreds of similar institutions throughout the country (Caldwell 1988:70).

In 1899, Fort Bayard remained within the Army under the auspices of the Army Medical Department. The Army’s decision to retain the fort, even after it had outlived its military usefulness, grew from the strong interest that General George M. Sternberg, Surgeon General of the Army, had in pulmonary tuberculosis and its treatment.
Sternberg was also aware of the relatively good health that the Army’s soldiers had enjoyed serving in the higher elevations of the American West. Members of Zebulon Pike’s expedition of 1810 and of Fremont’s exploratory parties of the 1840s had witnessed their health improve while in the Rocky Mountains.

………………………………………………………………………………………………………………………………………………..

Upon assuming command in 1904, Bushnell, who had studied botany for years, immediately began to plant flowers, shrubs, and trees. When President Theodore Roosevelt created the Gila Forest Reserve in 1905, Bushnell ensured that Fort Bayard, which adjoined the Reserve, was part of a government reforestation project. The first year alone the Forest Service gave the hospital 250 seedlings of Himalayan cedar and yellow pine. Bushnell also got approval to fence in land for pasturing dairy cattle and arranged to recultivate long-neglected garden plots. The first year he predicted that the garden would generate “about 1300 dollars worth of produce.” After the quartermaster located an underground water source, Bushnell redoubled his cultivation efforts, planting trees, flowers, and grass to mitigate the wind and dust, and “to beautify the Post.” In later years Bushnell successfully grew beans from ancient cave dwellers (Anasazi beans), and made a less successful effort to grow Giant Sequoia from California. By 1910 Fort Bayard had four acres of vegetable gardens, a greenhouse, an orchard of 200 fruit trees, and alfalfa fields and hay fields for the dairy herd of 115 Holsteins, which the Silver City Enterprise proclaimed “one of the finest in the west.” The hospital also raised all of the beef consumed at the hospital (thereby avoiding Daniel Appel’s purchasing problems) and consumed pork at small expense by feeding the pigs the waste food. The hospital laboratory raised its own Belgian hares and guinea pigs for experiments.

Bushnell oversaw years of construction at Fort Bayard. In the wake of Florence Nightingale’s writings, nineteenth-century sanitation practices stressed cleanliness and ventilation, giving rise to pavilion style hospitals, narrow one- or two-story buildings lined with windows to provide patients with ample ventilation. In March 1904, Bushnell sent the Surgeon General plans for an “open court building” in modified pavilion style (Figure 2-1).

Plan for a tuberculosis patient ward, as designed by George E. Bushnell, providing fresh-air porches for each patient, United States Army Tuberculosis Hospital in New Mexico.

The building consisted of a quadrangle of long, narrow dressing rooms around an open court with porches along both the exterior and interior of the building. The rooms could be used for sleeping in inclement weather and the porches allowed patients to seek sun or shade as they wished. Wide doors enabled the easy movement of beds between the rooms and the porches. “The object of this style of building is to facilitate sleeping out of doors, which is now considered so important in modern sanatoria for the treatment of tuberculosis,” Bushnell explained.

The United States escaped the cauldron of WWI until April 1917. But after years of trying to maintain neutrality, President Woodrow Wilson’s administration mobilized the nation to fight in the most deadly enterprise the world had ever seen. Modern industrialized warfare would kill millions of soldiers, sailors, and civilians and unleash disease and famine across the globe. Typhus flourished in Eastern Europe and a lethal strain of influenza exploded out of the Western Front in 1918, producing one of the worst pandemics in history. Although eclipsed by such fierce epidemics, tuberculosis also fed on the war.

Bushnell was ordered to the office of The Surgeon General on June 2, 1917, and placed in charge of the Division of Internal Medicine, and on June 13 there appeared S. G. O. Circular No. 20, Examinations for pulmonary tuberculosis in the military service, establishing a standard method of examination of the lungs for tuberculosis. Through his efforts a reexamination of all personnel already in the service was made by tuberculosis examiners, and about 24,000 were rejected on that score. He had charge of the location, construction, and administration of all Army tuberculosis hospitals, of which eight were built with a capacity of 8,000 patients.

With his relief from service in 1919 he took up his residence on a small farm at Bedford, Mass., where he prepared his Study of the Epidemiology of Tuberculosis (1920) and later Diseases of the Chest (1925) in collaboration with Dr. Joseph H. Pratt of Boston. As chief delegate of the National Tuberculosis Association he attended the first meeting of the International Union Against Tuberculosis in London in 1921. During the winter of 1922-23 he delivered a series of lectures on military medicine at Harvard University. In the summer of 1923 he moved to California and took up his residence at Pasadena.

………………………………………………………………………………………………………………………………………………..

In eighteen months the Selective Service registered twenty-five million men for the draft, examined ten million for military service, and enlisted more than four million soldiers, sailors, and Marines. To the dismay of many people, medical screening boards across the nation soon discovered that American men were not as strong and healthy as they had assumed. Of those eligible for military service, 30 percent were physically unfit; a number of them deemed ineligible to serve had tuberculosis. Therefore, in 1917 Surgeon General William Gorgas called George Bushnell to Washington, DC, to establish the Office of Tuberculosis in the Division of Internal Medicine, leaving Bushnell’s protégé, Earl Bruns, in charge of Fort Bayard. Given the Medical Department’s mission to maintain a strong and healthy fighting force, Bushnell’s new job was to minimize the incidence of tuberculosis among active-duty soldiers and avoid the high cost of disability pensions for men who incurred the disease during military service. It was a tall order.

Wartime tuberculosis had already received attention in 1916, when reports circulated that the French army had sent home 86,000 men with the disease, raising the specter that life in the trenches would generate hundreds of thousands of cases. One investigator found that tuberculosis rates in the British army were double those in peacetime, reversing the prewar downward trend. The head of the New York City Public Health Department, Hermann Biggs, declared that “tuberculosis
offers a problem of stupendous magnitude in France.” Subsequent studies revealed that only 20 percent or less of the French soldiers sent home with tuberculosis actually had the disease; others were either misdiagnosed or had had tuberculosis prior to entering the military and therefore had not contracted it in the trenches. The reports nevertheless galvanized public health officials to address the tuberculosis problem. The Rockefeller Foundation, for example, in cooperation with the American Red Cross, established a Commission for the Prevention of Tuberculosis in France to help the French and protect any Americans from contracting tuberculosis “over there.”

Bushnell established four “tuberculosis screens” by (1) examining all volunteers and draftees before enlistment, (2) checking recruits again in the training camps, (3) examining soldiers already in the Army for tuberculosis, and (4) screening military personnel at discharge to ensure they returned to civil life in sound condition. To implement these activities, Bushnell developed a protocol under which physicians could quickly examine men for tuberculosis as part of the larger physical examination process. He standardized the procedures for examinations throughout the Army, and crafted a narrow definition of what constituted a tuberculosis diagnosis to enable the Army to enlist as many young men as possible. Despite these efforts, soldiers developed active cases of tuberculosis throughout the war. Bushnell’s office also created eight more tuberculosis hospitals in the United States and designated three hospitals with the American Expeditionary Forces (AEF) in France to care for soldiers who developed active tuberculosis in the camps and trenches. Short of resources and knowledge, however, the Army Medical Department at times struggled just to provide beds for tuberculosis patients, let alone deliver the individual care Bushnell and his staff had provided at Fort Bayard before the war.
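Bushnell’s four “screens” amount to a sequential filtering pipeline applied to the same men at successive stages of service. The sketch below models such a pipeline in miniature; the stage names follow the text, while the cohort size, prevalence, and per-screen detection probability are invented for illustration.

```python
import random

random.seed(1918)

SCREENS = ["enlistment examination",  # (1) volunteers and draftees before enlistment
           "training-camp recheck",   # (2) recruits checked again in camp
           "in-service examination",  # (3) soldiers already in the Army
           "discharge screening"]     # (4) personnel examined at discharge

def run_screens(n_men: int, tb_prevalence: float, detection_prob: float) -> None:
    """Send a hypothetical cohort through the four sequential screens.

    Each screen independently catches each remaining undetected case with
    probability `detection_prob`. All numbers are invented for illustration;
    this is not a model of actual Army data.
    """
    cases = round(n_men * tb_prevalence)
    cleared = n_men - cases
    for screen in SCREENS:
        caught = sum(random.random() < detection_prob for _ in range(cases))
        cases -= caught
        print(f"{screen}: flagged {caught}, {cleared + cases} men continue")

run_screens(n_men=10_000, tb_prevalence=0.01, detection_prob=0.6)
```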

Overburdened medical personnel worked long hours in often poor conditions. Thousands of tuberculosis patients resented the diagnosis and protested the conditions in which at times they were virtually warehoused. The draft, which brought millions of young men under government control and responsibility, also exposed the Army Medical Department to public scrutiny. Congress launched an investigation in 1919. World War I, which so dramatically changed the world, profoundly altered the Army’s tuberculosis program as well. It also challenged George Bushnell’s expertise. The Army’s tuberculosis expert had founded his policies on assumptions that, although widely held at the time, proved to be inaccurate and costly in lives and treasure. Wartime tuberculosis, therefore, shows the power of disease to overwhelm both knowledge and institutions.

Bushnell and his contemporaries were familiar with the concept of immunity and the power of vaccination, and the Army Medical Department vaccinated soldiers against smallpox and typhoid. Extending this concept of immunity to tuberculosis, medical officers differentiated between primary infection in childhood and secondary infection later in life. Observing that tuberculosis was often fatal for infants and young children, they reasoned that for survivors, an early infection with tubercle bacilli immunized a person against the disease later in life.
A “primary infection,” wrote Bushnell, gave a person some immunity, which “while not sufficient in many cases to prevent extension of disease [within the body]…is sufficient to counteract new infections from without.” In an article on “The Tuberculous Soldier,” the revered physician William Osler agreed. For years autopsies had uncovered healed tuberculosis lesions in people who had died in accidents or of other diseases. Although it was not known how many men between the ages of eighteen and forty harbored the tubercle bacillus, Osler wrote, “We do know that it is exceptional not to find a few [lesions] in the bodies of men between these ages dead of other diseases.” Thus, he argued, “In a majority of cases the germ enlists with the soldier. A few, very few, catch the disease in infected billets or barracks.” Bushnell reasoned that if adults developed tuberculosis, “they do it on account of failure of their resistance.”

At one point Bushnell told the chief surgeon of the AEF, “Personally I have no fear of the contagion of tuberculosis between adults and see no reason why patients of this kind should not be treated in the ordinary hospital.” He asserted that the “really cruel persecution of the consumptive…through the fear that he will infect others, is based on what I must characterize as highly exaggerated notions of the danger of such infection.” This, too, was the prevailing view. Boston bacteriologist Edward O. Otis, who served as a medical officer during the war, wrote that “Undue fear of the communicability of pulmonary tuberculosis from one adult to another is unwarranted in the present state of our knowledge.”
Bushnell reasoned that if men infected with tuberculosis could indeed easily spread it to others, there would be much more tuberculosis in the Army than there was. British physician Leslie Murry reasoned that although the crowded and damp conditions of trench warfare would have unfavorable effects on soldiers’ health, living outside with plenty of fresh air, good food, and hygienic practices would improve their resistance to tuberculosis. Public health specialist George Thomas Palmer countered that although reactivation might not be higher in the military than in civil life, the United States had enough men without tuberculosis to bar anyone suspected of it from the military and thereby avoid an “added financial burden to the nation.” The challenge was to keep tuberculosis out of the Army and tuberculars off the disability rolls, but not to exclude so many men as to impair the nation’s ability to amass an army.

Bushnell’s views of tuberculosis immunity, contagion, interaction with military life, and the risk of overdiagnosis shaped the Army Medical Department programs for screening recruits. He knew he could not guarantee that all tuberculosis could be eliminated from the Army, but asserted that “a sufficiently rigid selection of promising material in itself practically excludes tuberculosis.” In addition to enlisting the strongest men, Bushnell believed that a massive screening program would pay for itself by eliminating those who would later cost the government in medical services and disability benefits.

But the nation at war did not have the time or resources for the meticulous one-hour examination practiced at Fort Bayard, so Bushnell developed a protocol for civilian and military physicians to examine volunteers, draftees, trainees, and soldiers for tuberculosis in a matter of minutes. Circular No. 20 detailed how physicians should examine recruits, and became the single most important Army tuberculosis document during the war. The circular explained that the apices, or the tops of lungs, were the most common location for tuberculosis lesions, and that “the only trustworthy sign of activity in apical tuberculosis is the presence of persistent moist rales.” Circular No. 20 directed that “the presence of tubercle bacilli in the sputum is a cause for rejection,” and that “no examination for tuberculosis is complete without auscultation following a cough.” It recommended that a sputum sample “be coughed up in [the examiner’s] presence,” to ensure that it was actually from the examinee.

The last one-third of the document detailed X-ray examinations, summarizing eight kinds of conditions that might appear, which would be grounds for rejection, and which would not. By 1915, a Fort Bayard medical officer stated that X-ray technology “has become one of the most valued procedures in the diagnosis of pulmonary tuberculosis.” Medical officers F. E. Diemer and R. D. MacRae at Camp Lewis, Washington, argued in the pages of JAMA that X-rays should be the primary diagnostic tool, not an “adjunct.” Ultimately, however, World War I did encourage X-ray technology by revealing its power to thousands of physicians, stimulating the search for technical advances, and demonstrating the importance of specialization in reading X-rays. By the end of the war, the Army Medical Department had shipped hundreds of X-ray machines to France for use in Army hospitals and at the bedside, and developed various modes of X-ray equipment, including X-ray ambulances.

Calculating that it would require 600 examiners for the screening process, the Medical Department turned to training general practitioners from civil life who knew little about tuberculosis. Bushnell’s office established a six-week tuberculosis course to prepare physicians. The first course at the Army Medical School in Washington, DC, was so popular that instructors offered it at several other training camps in the country. General Hospital No. 16, operating in conjunction with Yale Medical School, also offered a course on hospital administration to train medical officers to run tuberculosis hospitals.

Public health officials and the National Tuberculosis Association asked to be informed of any tuberculous individuals being sent to their communities, including the name and address of the “party assuming responsibility for such continued treatment and care.” The journal American Medicine published an article by British tuberculosis specialist Halliday Sutherland, who expressed concern that if men declined treatment and returned home they could spread tuberculosis to their families. He suggested that the U.S. Army retain men diagnosed with tuberculosis so that the government could provide treatment and discipline them if they resisted. Members of Congress opposed simply discharging men with tuberculosis. Representative Carl Hayden of Arizona argued that such men had given up their civilian lives upon induction into the Army, only to discover “that they were afflicted with a dread disease which prevents them from earning a livelihood.” He suggested that “some provision should be made for the care of such men until they are able to provide for themselves.”

While Bushnell’s policies succeeded in suppressing tuberculosis rates in the Army, the narrow definition of a tuberculosis diagnosis explicitly allowed men with healed lesions in their lungs to serve, and the rapid screening system caused some examiners to miss cases of active disease. Bushnell recognized that “a standard, though imperfect, is believed to be an indispensable adjunct in Army tuberculosis work not only to support the examiner but also to secure the necessary uniformity of practice in the matter of discharge for tuberculosis.” Nationwide, local draft boards and training camps rejected more than 88,000 men for tuberculosis, about 2.3 percent of the 3.8 million men examined. Postwar assessments calculated that of the more than two million soldiers who went to France to serve in the AEF, only 8,717 were evacuated with a diagnosis of tuberculosis, an incidence of only 0.4 percent.

In early 1918 a strep infection in the training camps in the United States caused medical officers to send hundreds of trainees to Army hospitals misdiagnosed with tuberculosis, crowding hospitals and generating paperwork and confusion. For a time, therefore, the Office of The Surgeon General ordered that no one should be discharged for tuberculosis from the training camps unless he had bacilli in his sputum—meaning only the most severe cases. More than 50 percent of the patients being sent back to the United States from France with a diagnosis of tuberculosis did not actually have the disease. Bushnell viewed such overdiagnoses as “evil” because they took men out of the AEF and overburdened tuberculosis hospitals and naval transports, which had to segregate suspected tuberculosis cases in isolation rooms or on open decks.

Faced with what he called “leaking” of soldiers from the AEF due to erroneous tuberculosis diagnoses, Bushnell turned for assistance to a specialist: Gerald B. Webb (Figure 4-3) of Colorado Springs. An Englishman by birth, Webb had married an American, and when she developed tuberculosis the couple traveled to Colorado Springs, Colorado, for treatment. His wife struggled with the disease for ten years until her death in 1903, and afterward Webb stayed on in Colorado Springs, remarrying and building a medical practice specializing in tuberculosis. In addition to his medical practice, Webb pioneered research into the body’s immune function, searched for a tuberculosis vaccine, and was a founder of the American Association of Immunologists (1913). Still somewhat bored in Colorado Springs, Webb volunteered for the Medical Corps soon after the United States declared war and helped organize and run tuberculosis screening boards at Camp Russell, Wyoming, and Camp Bowie, Texas. Bushnell appointed him senior tuberculosis consultant for the AEF. After meeting with Bushnell in Washington and attending the Army War Course for senior officers at Columbia University, Webb sailed to France in March 1918.

Figure 4-3. Gerald B. Webb, World War I. Photograph courtesy of Special Collections, Tutt Library, Colorado College, Colorado Springs, Colorado (Gerald B. Webb Papers).

Immunity in Tuberculosis: Further Experiments (1914).

Webb instituted a screening process similar to that in the United States, distributing Circular No. 20 and preparing an illustrated version for medical officers in the field. He established a policy directing that only patients with sputum positive for tuberculosis should be sent back to the United States. Others would be tagged “tuberculosis observation” and sent to one of three hospitals designated as tuberculosis observation centers. There, specialists—Bushnell’s “good tuberculosis men”—would distinguish tuberculosis from other lung problems such as bronchitis and pneumonia, return to duty any man found free of disease, and send back to the homeland only those patients who were indeed positive for tuberculosis.

Webb traveled to field and base hospitals throughout France. He would typically spend three days at a hospital, examining patients, leading conferences, giving lectures, and, according to his biographer, Helen Clapesattle, “preaching his gospel of fresh air and absolute rest.” He recruited a radiologist to teach the proper reading of X-ray plates, and advocated the early detection of tuberculosis, explaining, “Just as the wounded do better if they are got to the surgeons quickly, so the tuberculosis-wounded are more likely to recover if they are spotted and sent to the doctors early.”

In the 1930s, as Webb had concluded in 1919, scientists came to recognize that early tuberculosis infections did not provide protection and that adults could be reinfected with tuberculosis and develop active disease. In the meantime, with his AEF work done, in January 1919 Webb returned to his family and medical practice in Colorado Springs. The National Tuberculosis Association recognized Webb’s war work by electing him president in 1920, and Webb set the Association on a course of tuberculosis research on the immunity question and the standardization of X-ray diagnostics. He did not return to military service, but was a mentor for young physicians Esmond Long and James Waring, who would be leaders in the Army Medical Department’s tuberculosis program during the next war.

In May 1941, as the United States stood on the brink of another world war, Benjamin Goldberg, president of the American College of Chest Physicians, recited some stunning figures at the association’s annual meeting in Cleveland, Ohio. He calculated that from 1919 to 1940 the Veterans Administration had admitted 293,761 tuberculosis patients to its hospitals. These patients had received government care and benefits for a total of 1,085,245 patient-years, at a cost of $1,185,914,489.56. Goldberg’s remarks reveal that although tuberculosis rates in the United States were declining 3 to 4 percent annually during the interwar years, the government’s burden to care for tuberculosis patients remained heavy. The Army was only three-quarters the size it was before World War I (131,000 versus 175,000 strength) and experienced no major epidemics, so that suicide and automobile accidents became the leading causes of death in the peacetime Army. Although hospital admissions of active-duty personnel for tuberculosis declined during the decade, tuberculosis admissions at Fitzsimons Hospital in Denver remained constant due to a steady stream of patients who were veterans of the war. Tuberculosis, in fact, became a leading cause of disability discharges from the Army and, with nervous and mental disorders, generated the greatest amount of veterans’ benefits between the wars.

The story of tuberculosis in the Army after World War I, then, is one of increasing demand and decreasing resources, a dynamic that left Fitzsimons financially strapped even before the country entered the Great Depression. An examination of Fitzsimons’ postwar environment—the modern hospital and technology, the ever-changing landscape of veterans’ benefits, and new, invasive treatments for tuberculosis—illuminates these stresses.

President Franklin Delano Roosevelt proclaimed a “limited national emergency” on 8 September 1939, a week after Germany invaded Poland. But due to underfunding during the interwar period, one observer wrote, “to prepare for war the Medical Department had to start almost from scratch.” Given the lean years of the 1920s and 1930s and the Army Medical Department’s policy of discharging officers with tuberculosis from duty, Surgeon General James C. Magee had to turn to the civilian sector for a tuberculosis expert. He recruited Esmond R. Long, M.D., Ph.D., director of the Henry Phipps Institute for the Study, Prevention and Treatment of Tuberculosis in Philadelphia. He could not have made a better choice. Long was also professor of pathology at the University of Pennsylvania, director of medical research for the National Tuberculosis Association, and, at age forty-two (in 1932), the youngest person to be awarded the Trudeau Medal for his tuberculosis research. He would now become the Army’s point man on the disease and stand at the front lines of the Medical Department’s struggle with tuberculosis from before Pearl Harbor until well after V-J (Victory over Japan) Day.

His mission to reduce the effect of tuberculosis on the Army differed from that of Colonel (Col.) George Bushnell in the previous war because disease was less of a threat. In fact, World War II would be the first war in which more American personnel died of battle wounds than of disease. Of 405,399 recorded fatalities, battle deaths outnumbered those from disease and nonbattle injuries more than two to one: 291,557 to 113,842. Malaria, sexually transmitted diseases, and respiratory infections did sicken millions of soldiers, sailors, Marines, and airmen, but most survived. Thanks in part to sulfa drugs and, beginning in 1943, penicillin to treat bacterial infections, the Army Medical Department had only 14,904 deaths among 14,998,369 disease admissions worldwide, a 0.1 percent death rate. Tuberculosis declined, too, representing only 1 percent of Army hospital admissions for disease—1.2 cases per 1,000 per year, a rate much lower than the 12 per 1,000 per year during World War I. The Medical Department concluded that “tuberculosis was not a major cause of non-effectiveness during the war.”
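The mortality figures above are easy to check. Here is a minimal Python sketch (the numbers are those quoted in this paragraph; the variable names are ours) that reproduces the two-to-one ratio and the 0.1 percent disease death rate:

```python
# Check the World War II mortality arithmetic quoted above.
battle_deaths = 291_557
disease_nonbattle_deaths = 113_842
assert battle_deaths + disease_nonbattle_deaths == 405_399  # recorded total fatalities

ratio = battle_deaths / disease_nonbattle_deaths
print(f"battle vs. disease/nonbattle deaths: {ratio:.2f} to 1")  # ~2.56, "more than two to one"

disease_deaths = 14_904
disease_admissions = 14_998_369
rate = 100 * disease_deaths / disease_admissions
print(f"disease death rate: {rate:.2f} percent")  # ~0.10 percent
```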

But Sir Arthur S. McNalty, chief medical officer of the British Ministry of Health (1935–40), called tuberculosis “one of the camp followers of war.” War abetted tuberculosis, he explained, because of the “lowering of bodily resistance and increased physical or mental strain or both.” It also found fertile ground in crowded barracks and camps, and ran rampant in the World War II prison camps and Nazi concentration camps. And just one active case of tuberculosis per thousand in the Army meant thousands of tuberculosis sufferers among the 11 million Americans in uniform, each of whom consumed Medical Department resources: the average hospital stay per case during the war was 113 days.

But if tuberculosis was a camp follower, Esmond Long (Figure 8-1) was a tuberculosis follower. He tracked it down, studied it, and tried to prevent its spread at every stage of American involvement in the war. With war looming in 1940, the National Research Council asked Long to chair the Division of Medical Sciences, Subcommittee for Tuberculosis, to advise the government on preventing and controlling tuberculosis in both civilian and military populations during war mobilization. Once the United States entered the war, Long received a commission as a colonel in the Medical Corps and moved his family from Philadelphia to Washington, DC. Working out of the Office of The Surgeon General, Long set up a screening process with the Selective Service to keep tuberculosis out of the Army and then traveled to more than ninety induction camps to ensure adherence to the procedures. He also oversaw the expansion of tuberculosis treatment facilities in the United States, inspected Fitzsimons and other Army tuberculosis hospitals, advised medical officers on treating patients, kept abreast of research developments in the labs, monitored outbreaks of tuberculosis in the theaters of war, and wrote articles for medical and lay periodicals to publicize the Army’s antituberculosis program.

In 1945 Long traveled to the European theater to inspect hospitals caring for tubercular refugees and liberated prisoners of war (POWs). There he saw the horrors of the concentration camps at Buchenwald and Dachau where Army medical personnel cared for thousands of former prisoners sick and dying of typhus, starvation, and tuberculosis. After the war Long organized the tuberculosis control program for the Allied occupation of Germany, and returned annually in the 1950s to assess its progress. He split his time between the Army Medical Department and the Veterans Administration (VA) to supervise the transition of the federal tuberculosis treatment program from the War Department to the VA. He also helped organize and evaluate the antibiotic trials, which ultimately led to an effective cure for tuberculosis. After returning to civilian life Long continued to study tuberculosis in the Army, and he wrote the key tuberculosis chapters for the Army Medical Department’s official history of the war.

With Long as a guide, this chapter shows how war once again served as handmaiden to disease around the globe. This time the Army Medical Department assumed not only national but international responsibilities for the control of tuberculosis in military and civilian populations, among friend and foe. Long and the Army Medical Department did succeed in demoting tuberculosis from the leading cause of disability discharge for American World War I personnel (13.5 percent of discharges) to thirteenth position during the years 1942–45 (1.9 percent of all discharges), behind conditions such as psychoneuroses, ulcers, respiratory diseases, arthritis, and other diseases. But this achievement required continued vigilance, an Army-wide surveillance program, and dedicated personnel and resources. The first step was to keep tuberculosis out of the Army.

After war broke out in Europe, Congress passed the Selective Training and Service Act of 1940, which established the first peacetime military draft in U.S. history, increasing Army strength eightfold from 210,000 in September 1939 to almost 1.7 million (1,686,403) by December 1941. This resulted in a 75 percent rise in the number of patients in military hospitals, straining the Medical Department, which had only seven general hospitals and 119 station hospitals in 1939.

Figure 8-1. Esmond R. Long, who directed the Army tuberculosis program during World War II. Photograph courtesy of the National Library of Medicine, Image #B017302.

“Good Tuberculosis Men”

Funds were soon appropriated freely, with “all of the resources of the country” pledged to meet the crisis, and the War Department constantly readjusted to the escalating emergency.

The National Research Council Committee on Medicine, Subcommittee on Tuberculosis, chaired by Long, met for the first time on 24 July 1940 and prioritized its responsibilities: first, develop recommendations on how to screen draft registrants for tuberculosis; second, screen civilians in federal service and wartime industries; third, figure out how to care for people rejected by the draft for the disease; and finally, help civilian and military agencies prepare for tuberculosis in war refugee populations. In its first nine-hour meeting, the subcommittee decided on centralized tuberculosis screening centers at 200 recruiting stations and generated a list of tuberculosis specialists nationwide to evaluate recruits and interpret X-rays at those centers. Subcommittee members stressed the importance of maintaining good records for processing any subsequent benefits claims and, most importantly, called for X-ray screening of all inductees—not just those who looked like they might have tuberculosis.

The War Department leadership initially rejected such comprehensive screening of inductees as expensive and time-consuming. The fact that tuberculosis death rates in the country had fallen two-thirds from 140 per 100,000 people in 1917 to 45 per 100,000 people in 1941, and in the Army from 4.6 per 1,000 in 1922 to 1.4 per 1,000 in 1940, may have led to complacency. But Long, his colleagues, and the national tuberculosis community, mindful of the cost to the nation in sickness, death, and disability benefits in the previous war, persisted. The American College of Chest Physicians asked in July 1940, “Shall We Spread or Eliminate Tuberculosis in the Army?” and its president, Benjamin Goldberg, reported that the VA had spent almost $1.2 billion on tuberculosis patients through 1940. One medical officer calculated that 31 percent of all veterans who died as a result of World War I service and whose dependents received benefits had died of tuberculosis. Even the lay press chimed in with a TIME magazine article, “TB Warning,” that stressed the importance of chest X-rays. Advocates pointed out that X-ray technology was more available and less expensive than in the previous war, and radiologists were more plentiful and skillful. They were also confident that new technology, such as the development of a lens that allowed the direct and rapid photography of a fluoroscopic image and new 4 x 5 inch films, which made storage and transport easier than with the 11 x 14 inch films, rendered screening more practical than in 1917–18.

The Army Medical Department agreed with the National Research Council subcommittee. Since 1934 it had required X-rays for all Army personnel assigned overseas, but it had not yet convinced the War Department on universal screening. In June 1941, Brigadier General (Brig. Gen.) Charles Hillman, Chief, Office of The Surgeon General Professional Service Division, told the National Tuberculosis Association chairman, C. M. Hendricks, that “the desirability of routine X-rays had long been recognized by the Surgeon General’s Office,” but “considerations other than medical entered the picture and the character of induction examinations had to be adapted to the limitations of time, place, and available equipment.” When Fitzsimons informed Hillman later that new recruits were arriving at the hospital with tuberculosis, he responded almost plaintively. “I am working with the Adjutant General to devise some method by which every volunteer for enlistment in the Regular Army will have a chest X-ray and serological test before acceptance.” He asked for all available evidence of sick recruits, explaining that “data on Regular Army men of short service now in Fitzsimons with tuberculosis will help me get the thing across.” As the data and advice accumulated, the Adjutant General required in January 1942 that all voluntary applicants and reenlisting men be given chest X-rays. Finally, on 15 March 1942, mobilization regulations made chest X-rays mandatory in all induction physicals.

With universal screening in place, Long, as chief of the tuberculosis branch in the Office of The Surgeon General, oversaw the screening process and faced a task similar to that of George Bushnell in 1917–18: finding the fine line that excluded as much tuberculosis as possible from the Army while rejecting neither too many men nor too few. Conscious of his predecessor’s miscalculations, Long was careful not to criticize Bushnell’s tuberculosis program, at one point noting that World War I medical officers were “not to be reproached for not having knowledge that came into existence only later, any more than the chief of the Army air service in 1917 is to be reproached because more efficient airplanes are available now than then.”

The wartime emergency produced a public health campaign regarding tuberculosis and other disease threats. A War Department pamphlet, What Every Citizen Should Know about Wartime Medicine, presented the issue as one of maintaining troop health and limiting public costs. “The strenuous activity of soldiering is likely to cause extension of an incipient (early) tuberculous invasion of the lungs, or to precipitate the breakdown and reactivation of arrested cases,” it explained. Such illness could result in disability “and the necessity of providing long care of these patients in military hospitals where they must remain isolated from nontuberculous patients.” The Public Health Service also created a tuberculosis office to handle the expected increase in tuberculosis, and, as the National Research Council Subcommittee recommended, gave war industry workers chest examinations.

As military and civilian screening boards found thousands of people with active tuberculosis and sent many of them to tuberculosis sanatoriums and hospitals, they generated what a public health nurse referred to as “potentially the greatest case finding program that workers in tuberculosis control have ever known.” At the same time, however, war mobilization drew civilian medical personnel into the military, reducing staffing in home front institutions. Army medical personnel ultimately numbered more than 688,000, including 48,000 physicians in the Medical Corps, 14,000 dentists in the Dental Corps, and 56,000 nurses in the Army Nurse Corps—a large portion of the nation’s medical professionals. To maintain his nursing staff, VA Director Frank Hines even asked the Army Nurse Corps in May 1942 not to hire VA nurses away from his hospitals.

Army tuberculosis rates during World War II, while lower than during World War I, showed a similar “U” curve (Figure 8-2): rates were high at the beginning of the war, as the Selective Service built up the military forces and cases that had eluded screening became active during training or combat; they fell as radiologists became more proficient at identifying tuberculosis infections; and they rose again, sharply and higher, at the end of the war as discharge examinations found people who had developed active tuberculosis during their service. Postwar studies also revealed a seemingly paradoxical phenomenon: during the war, military personnel serving overseas had lower tuberculosis rates than those serving in the United States, yet higher rates when they returned home.

Figure 8-2. Chart comparing the incidence curves of tuberculosis in the Army during World War I and World War II. From Esmond R. Long, “Tuberculosis,” in John Boyd Coates, Robert S. Anderson, and W. Paul Havens, eds., Internal Medicine in World War II, Medical Department, U.S. Army in World War II, vol. 2, Infectious Diseases (Washington, DC: Office of The Surgeon General, Department of the Army, 1961), chart 17, p. 335. Available at http://history.amedd.army.mil/booksdocs/wwii/infectiousdisvolii/chapter11chart17.pdf.

The Medical Department of the United States Army in the World War. Vol. IX, Communicable and Other Diseases. Washington, DC: U.S. Government Printing Office, 1928, pp. 171-202.
Letter, The Adjutant General, to Commanding Generals of all Corps Areas and Departments, 25 Oct. 1940, subject: Chest X-rays on Induction Examinations.
M.R. No. 1-9, Standards of Physical Examination During Mobilization, 31 Aug. 1940 and 15 Mar. 1942.
Long, E. R.: Exclusion of Tuberculosis: Physical Standards for Induction and Appointment. [Official record.]
Long, E. R., and Stearns, W. H.: Physical Examination at Induction; Standards With Respect to Tuberculosis and Their Application as Illustrated by a Review of 53,400 X-ray Films of Men in the Army of the United States. Radiology 41: 144-150, August 1943.
Long, Esmond R., and Jablon, Seymour: Tuberculosis in the Army of the United States in World War II: An Epidemiological Study with an Evaluation of X-ray Screening. Washington, DC: U.S. Government Printing Office, 1955.

It is estimated that, before roentgen examination became mandatory (MR No. 1-9, 15 March 1942), one million men had been accepted without this form of examination. Where roentgen examination was practiced, it resulted in a rejection rate of about 1 percent for tuberculosis. Applying this figure, it can be estimated that some 10,000 men were accepted who would have been rejected if they had been subjected to chest roentgen-ray study. Various studies have shown that approximately one-half of these would have been cases of active tuberculosis.

http://history.amedd.army.mil/booksdocs/wwii/PM4/CH14.Tuberculosis.htm
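That estimate is simple back-of-the-envelope arithmetic. A minimal Python sketch (figures taken from the passage above; variable names are ours) makes the chain of assumptions explicit:

```python
# Reconstruct the Medical Department's estimate of tuberculous men
# accepted before chest X-rays became mandatory in March 1942.
unscreened_inductees = 1_000_000  # accepted without roentgen examination
rejection_rate = 0.01             # ~1 percent rejected for tuberculosis where X-ray was used
active_fraction = 0.5             # studies: about half of missed cases were active disease

missed = unscreened_inductees * rejection_rate
print(f"accepted men whom X-ray would have rejected: {missed:,.0f}")  # 10,000
print(f"of these, estimated active tuberculosis: {missed * active_fraction:,.0f}")  # 5,000
```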

Troops who developed tuberculosis often were not discovered until their separation examinations, conducted when they were once again in the United States.

In the end, the screening process rejected 171,300 men for tuberculosis as the primary cause (thousands more had tuberculosis in addition to the disqualifying condition), and Long calculated that this saved the government millions of dollars in hospitalization costs. After the war, however, Long identified two factors that allowed tuberculous men into the Army: the failure to screen all inductees until March 1942, and the 4 x 5 inch stereoscopic (fluorographic) films, which were used in the interest of speed but which Long believed caused examiners to miss about 10 percent of minimal tuberculosis lesions in recruits. To better understand the latter problem he had two radiologists read the same X-rays and found substantial disagreement between their findings. Long therefore concluded that “if the induction films had each been read by two different radiologists, undoubtedly many more of the men who had tuberculosis at entry could have been excluded from service.” The Army ultimately discharged 15,387 enlisted men for tuberculosis during the war, which earned it thirteenth position as a cause of disability discharge.

American military forces fought in nine theaters of war—five in the Pacific and Asia, the other four in North Africa, the Mediterranean, Europe, and the Middle East. The Allies gave priority to defeating Germany and Italy in Europe beginning with operations in North Africa and the Mediterranean. After fighting in Tunisia in 1942–43, the Allies invaded Sicily on 10 July 1943, and moved up the Italian peninsula. By April 1944—in preparation for the D-Day invasion on 6 June 1944—the United States had more than 3 million soldiers in Europe, supported by 258,000 medical personnel managing a total of 318 hospitals with 252,050 beds. The war against Japan got off to a slower start as U.S. military forces developed the means to execute an island war across vast expanses of ocean. After fighting began in the Southwest Pacific, military forces grew from 62,500 troops in March 1942 to 670,000 in the summer of 1944 with 60,140 medical personnel. Even though military personnel developed tuberculosis in all of the nine theaters, the numbers were not high and tuberculosis was not a major military problem. In the Southwest Pacific theater, for example, only sixty-four of more than 40,000 hospital admissions were for the disease.

Tuberculosis was of the greatest consequence in the North Africa and Mediterranean theaters, in part due to poor screening early in the war, but also because, according to historian Charles Wiltse, it was the theater “in which the lessons of ground combat were learned by the Medical Department as much as by the line troops.” In general, medical personnel learned the importance of treating battle casualties as promptly as possible and keeping hospitals and clearing stations mobile and far forward to shorten evacuation and turnaround times. With regard to tuberculosis, the Medical Department had to relearn the World War I lesson of the importance of having skilled practitioners—or “good tuberculosis men”—in theater. They also ascertained which treatments were appropriate close to the battle lines and which were not, and when and how best to evacuate tubercular patients to the United States.

When soldiers with tuberculosis began to appear at Army medical stations in North Africa in late 1942, Major General (Maj. Gen.) Paul R. Hawley, chief of medical services for the European theater of operations, called for a tuberculosis specialist. On Long’s recommendation, Hawley appointed Col. Theodore Badger (Figure 8-3) as senior consultant in tuberculosis on 2 January 1943. A professor of medicine at Harvard Medical School, Badger had served in the Navy during World War I and then attended Yale and Harvard, where he earned his medical degree. As chief of the medical service of the 5th General Hospital (GH), organized out of Harvard, Badger would play a role similar to that played by Gerald Webb during World War I—medical specialist, teacher, and troubleshooter.

Assessing the tuberculosis situation in the Mediterranean theater, Badger identified five hazards: (1) the development of active disease in American troops who had not been X-rayed upon induction; (2) association with British troops and civilians who had not been screened for tuberculosis; (3) drinking of nonpasteurized and possibly infected milk that could transmit tuberculosis; (4) battlefield conditions that could activate soldiers’ latent infections; and (5) the undetermined effects of other respiratory infections. Badger soon got the Army to use pasteurized milk and to establish X-ray centers with the proper equipment and trained staff, but he was not able to examine the thousands of American soldiers in the war zone. To gauge the extent of the tuberculosis problem he therefore arranged for a mobile X-ray unit to conduct spot surveys of troops in the field. Three examinations of some 3,000 troops each found only about 1 percent with signs of tuberculosis. To avoid losing manpower, Badger reported in mid-1943 that “up to the present time no individual has been removed from duty because of X-ray findings, and follow-up study has, so far, not indicated the necessity for it.” He planned to recheck those with suspicious films every few months to see if the signs had advanced. Badger recommended that patients with pleural effusion, the accumulation of fluid between the layers of the membranes that line the lungs and chest cavity that often indicates tuberculosis, be evacuated back to the United States. He also ended the practice of transporting some tuberculosis patients sitting up.

As the first true air war, World War II saw the introduction of air evacuation when Army aeromedical squadrons deployed in early 1943. After successful trials in the Pacific and North Africa, air evacuation increased so that during the Battle of the Bulge (1944–45), some patients arrived in U.S. hospitals within three days of being wounded. Some medical officers were concerned about the effects of transporting tuberculosis patients by air, where they would be exposed to high speeds, jolting, and reduced air pressure. Tuberculosis specialists in New Mexico and Colorado therefore studied 143 white male military patients, twenty-two to twenty-eight years old, with active tuberculosis flown to Army hospitals in nonpressurized air ambulances for any signs of trouble. Fearing the worst, they instead found that “severe discomfort, pulmonary hemorrhage, and spontaneous pneumothorax did not occur in the series either during or following the flight,” and concluded that air transport up to 10,000 feet was safe and preferable to time-consuming travel by water. By the end of the war the consensus was that rapid air evacuation to the United States also reduced the need to give a tuberculosis patient a pneumothorax in the field.

From the roof of Fitzsimons’ new building in April 1943, Rocky Mountain News reporter John Stephenson could see the Rocky Mountain Arsenal, the Denver Ordnance Plant, and Lowry Field, “places where the Army studies how to kill people.” But, he wrote, “The Army is merciful. It lets the right hand of justice know what the left hand of mercy is doing at Fitzsimons General Hospital.” The largest Army hospital in the world, Fitzsimons had 322 buildings on 600 acres, paved streets with traffic lights, a post office, barbershop, pharmacy school, dental school, print shop, bakery, fire department, and chapel. It was, wrote Stephenson, “a city of 10,000.” No longer a liability, Fitzsimons was the pride of the Army Medical Department. One Army inspector reported that “it is apparent that no expense has been spared in this extraordinary building or in the general equipment and maintenance of the whole hospital plant.” As Congressman Lawrence Lewis had hoped, Fitzsimons’ mission now extended beyond caring for tuberculosis patients to meeting the general medical and surgical needs of the wider military community in the Denver region.

During the war the hospital maintained about 3,500 beds, reaching its highest daily patient population after the war—3,719 on 3 February 1946. Annual occupancy, calculated in patient days, increased from 603,683 in 1942 to a high of 1,097,760 in 1945, about 85 percent of capacity.
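The capacity figure can be sanity-checked against the bed count. A minimal sketch, assuming roughly 3,500 beds were available every day of the year (that assumption is ours; the other figures come from the passage):

```python
# Sanity-check the "about 85 percent capacity" figure for 1945.
beds = 3_500
possible_patient_days = beds * 365  # 1,277,500 patient-days if every bed were full all year
patient_days_1945 = 1_097_760
occupancy = 100 * patient_days_1945 / possible_patient_days
print(f"1945 occupancy: {occupancy:.0f} percent of capacity")  # ~86 percent, consistent with the text
```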

With the reduction of tuberculosis in the Army over the years, the percentage of tuberculosis patients among all those at Fitzsimons had declined from 80 to 90 percent in the 1920s to 40 to 50 percent in the late 1930s. As the Army grew, it rose again. During the war Fitzsimons admitted more than 8,100 patients with tuberculosis. In fact, in 1943, only eighteen patients had battle injuries; the rest were in the hospital for illness and noncombat injuries. Unlike during the previous war, however, the Medical Department now had a network of more than fifty veterans’ hospitals to which it could transfer patients too disabled by tuberculosis or other disease or injury to return to duty. Instead of allowing patients to stay in the service and receive the benefit of hospitalization in the hope that they would recover and return to duty, the Medical Department discharged patients to VA hospitals as soon as they were determined to be unfit for military service, thereby reserving capacity for active-duty personnel. Maj. D. P. Greenlee had returned from a training course in penicillin therapy at Bushnell General Hospital in Utah to supervise the administration of the new drug for a variety of infections. He soon reported a cure rate of 93 percent. There were fewer victories in tuberculosis treatment.

During the war about one-quarter of all tuberculosis patients were treated with pneumothorax. Fitzsimons surgeon Col. John B. Grow and other surgeons also tried lung resection to treat tuberculosis, with few patient deaths. In 1946, however, when Grow’s staff contacted thirty patients who had had such surgery, they found that half of them were doing well, but three others had died, seven were seriously ill, and the rest were still under treatment. “It was felt that pulmonary resection in the presence of positive sputum was extremely hazardous and the indications were consequently narrowed down.”

Outside the operating rooms, the “City of 10,000” had a rich social life with people arriving at the post from all corners of the country. With Congressman Lewis’s acquisition of the School for Medical Technicians, Fitzsimons assumed the role of medical trainer, offering six- to twelve-week courses in technical training for dental, laboratory, X-ray, surgical, clinical, and pharmacy assistants. By 1946 the School had graduated more than 28,000 such technicians to serve around the world. The Women’s Army Corps arrived at Fitzsimons in February 1944 when 165 women attended the medical technicians school as part of the first coeducational class. Members of the Women’s Army Corps, rehabilitation aides, Education Department staff, dietitians, as well as nurses increased the female presence at Fitzsimons, as did activities of welfare organizations such as the Gold Star Mothers, the Red Cross, and the Junior League. Fitzsimons’ patients and staff also enjoyed visits from celebrities, including Jack Benny, Miss America, Gary Cooper, Dorothy Lamour, and other entertainers such as the big band leader Fred Waring and his Pennsylvanians, the Denver Symphony Orchestra, and an African American Methodist Church children’s choir from Denver. Like communities across the country, the hospital participated in war bond campaigns and had a huge war garden that produced thousands of ears of sweet corn and bushels of other vegetables.

Despite national mobilization and generous congressional funding, the Army could not escape the strain on its hospitals. By July 1944, Fitzsimons had reached capacity, so the Medical Department designated two more hospitals as specialty centers for tuberculosis. Earl Bruns’ widow Caroline, who lived in Denver at the time, was no doubt pleased when the department named Bruns General Hospital in Santa Fe, New Mexico, in honor of her husband. Bruns, along with Moore General Hospital in Swannanoa, North Carolina, cared for enlisted patients with minimal or suspected tuberculosis.

As Allied troops liberated France in 1944 and crossed into Germany they encountered thousands of refugees or “displaced persons”—escaped prisoners from Nazi concentration camps, exhausted and terrified Jews, slave laborers, political prisoners, Allied POWs, and other victims. The Nazi camps that held these people served as incubators for diseases such as tuberculosis and typhus, and the frightened, sick, and starved refugees inundated Army hospitals in late 1944 and early 1945. Theodore Badger reported one of the first waves, which arrived on 18 December 1944 when 304 men, most of them Russians, came to the 50th GH in Commercy, France. They had been in the Nazi labor camps for the mines and heavy industries, where thousands died and survivors were malnourished and sick. All of the 304 had tuberculosis, 90 percent with moderate or advanced disease. Four were dead on arrival, eight more died in the first week, and one-third of the patients would die by May. Alarmed, Gen. Hawley, Chief Surgeon of the European Theater of Operations, ordered that all displaced civilians and recovered military personnel be examined for signs of tuberculosis “to establish the gravity of the situation.” The situation was dire. At one time the 46th GH had more than 1,000 tuberculosis patients, all recovered Allied POWs, causing Esmond Long to remark that the hospital “had the largest number of tuberculosis patients of any Army hospital in the world.”

The 46th GH from Portland, Oregon, which had cared for tuberculosis patients in the Mediterranean theater, also stood on the front lines of the tuberculosis problem in Europe. Serving at Besancon, France, the hospital would receive the Meritorious Service Unit Plaque and Col. J. G. Strohm, the commanding officer, the Bronze Star Medal for service during the liberation of France. During the spring of 1945, the 46th GH admitted 2,472 Russian, 41 Polish, and 128 Yugoslav POWs and former slave laborers freed by American forces. The influx began on 12 March, and within four days the 46th GH had admitted 1,200 such patients.

“The hospital staff was agast [sic] at the terrible physical condition of these people,” reported the hospital commander. When Badger visited the 46th GH in March 1945 he said the patients “constitute one of the most seriously affected groups with tuberculosis and malnutrition that I have ever seen,” explaining that most of them suffered “acute fulminating, rapidly fatal disease, mixed with chronic, slowly progressive, fibrotic tuberculosis.” Medical personnel (Figure 8-4) cared for these patients as best they could, comforting many of them as they died. They began the rest treatment with some men but, as Badger reported, convincing Allied POWs to submit to absolute bed rest after months of confinement was “practically impossible.” Badger was able to report that after a month “those men who did not die of acute tuberculosis showed marked improvement.”

Figure 8-4. 46th General Hospital nurses who cared for former prisoners of war. Photograph courtesy of Oregon Health Sciences University, Historical Collections and Archives, Portland, Oregon.

26th General Hospital, World War II, North Africa.

In late 1944 Hawley requested 100,000 additional hospital beds for the displaced persons and POWs he expected to encounter after the German surrender, but Gen. George Marshall and Secretary of War Henry L. Stimson denied the request, believing they could not spare resources of that magnitude. The European Theater, they decided, must use German medical personnel and hospitals to care for the prisoners. Only after the war did American hospital units transfer their equipment and supplies to German civilians and Allies for their use.

The liberation of Europe also freed American POWs, who, not surprisingly, had higher rates of tuberculosis than other American military personnel. Captured British medical officer Capt. A. L. Cochrane cared for some of them in the prison where he was confined and noted sardonically that imprisonment was “an excellent place to study tuberculosis; [and] to learn the vast importance of food in human health and happiness.” German prison guards gave POWs only 1,000 to 1,500 calories per day, so Red Cross food parcels, which provided an additional 1,500 daily calories per person, were critical to preventing malnutrition and physical breakdown. Cochrane observed that the American and British POWs received the most parcels and had the lowest tuberculosis rates in the camp, while the Russians received nothing at all and had the highest rates. During the eighteen months that French POWs received the Red Cross parcels, he noted, just two men of 1,200 developed tuberculosis, but when parcels for the French ceased to arrive in 1945, their tuberculosis rate rose to equal that of the Russians. The situation, he concluded, showed the “vast importance of nutrition in the incidence of tuberculosis.” Not all Americans got their parcels, though. William H. Balzer, with an American artillery unit, was captured in February 1943, and remembered how German guards stole the Americans’ packages. Balzer survived imprisonment but never recovered from the ordeal. Severely disabled (70 percent), he died in 1960 on his forty-sixth birthday.

Exact tuberculosis rates among American POWs are not known because the rush of events surrounding the liberation of prisoners from German and Japanese control prevented a systematic X-ray survey. Rates did appear to be higher, though, for prisoners of the Japanese than for prisoners of the Germans. Long reported that about 0.6 percent of recovered troops from European POW camps had tuberculosis, whereas data from the Pacific theater suggested that 1 percent of recovered prisoners had tuberculosis. Moreover, an analysis of the chest X-rays done at West Coast debarkation hospitals revealed that 101 (or 2.7 percent) of 3,742 former POWs of the Japanese showed evidence of active tuberculosis. John R. Bumgarner was a tuberculosis ward officer at Sternberg General Hospital in Manila, the Philippines, before the war. A POW for forty-two months after the Japanese invasion, he described his experience in Parade of the Dead. Bumgarner did what he could to care for many of the 13,000 prisoners in the camp, but knew that “my patients were poorly diagnosed and poorly treated.” The narrow cots were so close together, he wrote, “the crowding and the breathing of air loaded with this bacilliary miasma from coughing ensured that those mistakenly segregated would be infected.”

Bumgarner was able to stay relatively healthy throughout his imprisonment. His luck ended, however, because “on my way home across the Pacific I had the first symptoms of tuberculosis.” Severe chest pain and subsequent X-rays at Letterman Hospital in San Francisco revealed active disease. “I had gone through more than four years of hell—now this!” Discharged on disability for tuberculosis in September 1946, he began to work at the Medical College of Virginia but soon had a lung hemorrhage. This time it took eight years of rest, surgery, and new antibiotic treatment for him to recover. By 1956, however, Bumgarner had married his sweetheart, Evelyn, and begun a medical career in cardiology that lasted for thirty years.

Tuberculosis continued to take its toll on POWs for years after the war. The VA followed POWs as a special group because, explained Long, of “the hardships that many of these men endured, and the notorious tendency for tuberculosis to make its appearance years after the acquisition of infection.” A follow-up study published in 1954 reported that for American POWs during the six years after liberation tuberculosis was the second highest cause of death, after accidents.

If the challenges Army medical personnel faced in caring for sick and starving POWs and refugees were unprecedented, the scale of disease and suffering they encountered in the Nazi concentration camps was almost unimaginable. Allied troops had heard about secret and deadly camps but were not prepared for what they found. As the Allies converged on Berlin from the East and the West, the Nazis evacuated thousands of prisoners—most of them Jews seized from across Europe, as well as POWs—to interior camps to hide their crimes and prevent the inmates from falling into Allied hands. These evacuations became death marches as SS (abbreviation of Schutzstaffel, which stood for “defense squadron”) guards beat and murdered people, and failed to feed them for days on end. Survivors were crowded into camps such as Buchenwald and Dachau, making them even more chaotic and deadly. Americans, therefore, liberated camps that were rife with disease, especially typhus, tuberculosis, and malnutrition.

The Allies liberated Buchenwald on 11 April 1945. The following day the world learned that Franklin Roosevelt had died. Americans then liberated Dachau on 29 April, the day Italian partisans executed Mussolini in Milan, and the next day Hitler killed himself in his bunker. Dachau (Figure 8-5) had been the first of hundreds of concentration camps in the German Reich to which the Nazis sent political enemies, the disabled, people accused of socially deviant behavior, and, increasingly after the Kristallnacht pogroms of 1938, Jewish men, women, and children. In January 1945 Dachau held 67,000 prisoners, but with troops of the Seventh U.S. Army approaching, the SS began evacuating and killing prisoners. Capt. Marcus J. Smith, a medical officer in his thirties, arrived at Dachau on 30 April 1945, the day after liberation, part of a small team trained to treat persons displaced by the war. Horror greeted him outside the camp in a train of forty boxcars loaded with more than two thousand corpses. Smith called the frost that had formed on the bodies in the intense cold “Nature’s shroud.” Inside Dachau he encountered more grotesque piles of naked, skeletal bodies of prisoners and scattered, mutilated bodies of German guards.

Figure 8-5. Dachau survivors gather by the moat to greet American liberators, 29 April 1945. Photograph courtesy of the United States Holocaust Memorial Museum, Washington, DC.

Smith found more than 30,000 prisoners, mostly Jews of forty nationalities, and all men except for about 300 women the SS had kept in a brothel. They were in desperate condition. Typhus and dysentery raged, at least half of the prisoners were starving, and hundreds had advanced tuberculosis. “The well, the sick, the dying, and the dead lie next to each other in these poorly ventilated, unheated, dark, stinking buildings,” Smith told his wife. The men were “malnourished and emaciated, their diseases in all stages of development: early, late, and terminal.” He wondered, “What am I going to write in my notebook?” and then started a list of needed supplies: clothes, shoes, socks, towels, bedding, beds, soap, toilet paper, more latrines, and new quarters. He almost despaired. “What are we going to do with the starving patients? How will we care for them without sterile bandages, gloves, bedpans, urinals, thermometers, and all the basic material? How do we manage without an organization? No interns, no nursing staff, no ambulances, no bathtubs, no laboratories, no charts, and no orderlies, no administrator, and no doctors.… I feel helpless and empty. I cannot think of anything like this in modern medical history.”

American efforts did prevent a deadly typhus epidemic from sweeping postwar Europe and helped contain tuberculosis rates in Germany, but the Nazis had created a human catastrophe so immense that even the most dedicated efforts would at times fall short.

Faced with horror on such a scale, Smith and other Army Medical Department personnel assigned to the concentration camps threw themselves into the work of cleansing, comforting, treating, and nurturing their patients. American commanders called in at least six Army evacuation hospitals (EH) to care for the sick and dying in the liberated camps. EH No. 116 and EH No. 127 began arriving at Dachau on 2 May with some forty medical officers, forty nurses, and 220 enlisted men. Consulting with Smith and his team, the units set up in the former SS guard barracks. They tore out partitions to create larger wards, scrubbed the walls and floors with cresol solution, sprayed them with dichloro-diphenyl-trichloroethane (DDT), and then set up cots to create two hospitals of 1,200 beds each. Medical staff also discovered physician-prisoners who had cared for the sick and injured as well as they could and who could now advise, assist, and in some cases translate for the medical staff. In two days the hospitals were ready to admit patients by triage, segregating them by disease and prognosis. Laurence Ball, the EH No. 116 commander, noted that more than 900 patients had “two or more diseases, such as malnutrition, typhus, diarrhea, and tuberculosis.” Staff bathed and deloused them, gave them clean pajamas, and put them to bed.

Death by overeating, the result of feeding starved bodies more than they could tolerate, was but one of the dangers that the prisoners faced. During May 1945, American hospitals at Dachau had more than 4,000 typhus patients and lost 2,226 to typhus and other diseases. Typhus, a rickettsial disease transmitted by body lice, had a mortality rate as high as 40 percent. With no medical cure, treatment consisted of supportive care—keeping patients clean and nourished—to mitigate effects of prolonged fever, such as the breakdown of tissue into gangrene. The Americans knew that typhus had taken three million lives in Eastern Europe after World War I, but now they had a means of prevention and better weapons—a typhus vaccine and DDT. On 2 May, the day the evacuation hospitals arrived, the commander of the Seventh Army imposed quarantines for typhus and tuberculosis, and summoned the U.S. Typhus Commission, which had controlled a typhus outbreak in Naples, Italy. A typhus team arrived the next day and began to immunize American personnel and dust them with DDT. On 7 May staff began to vaccinate inmates but kept typhus patients isolated for at least twenty-one days from the onset of illness to prevent transmission to others. This meant that the Americans did not immediately enter the inner camp barracks—the worst, most typhus-infested part of the camp—nor did they quickly relieve crowding there for fear of spreading typhus-bearing lice. It took over a week for personnel to prepare more spacious and clean quarters.

Smith wrote his lists, reported to his wife, and kept track of the daily death toll, finding comfort as the number of people who died daily fell from 200 during the first week to twenty by the end of May. Another medical officer performed autopsies. He chose ten of the dead bodies, five from the death train and five from the camp yard, to see what had caused their deaths. All had typhus and extreme malnutrition, eight had advanced tuberculosis, and some bodies had signs of fractures and head injuries.

Survivors in Dachau, 1 May 1945

By the end of May, conditions at Dachau had improved. Typhus was abating and American officials began to release groups of inmates by nationality. Beyond Dachau, the U.S. Typhus Commission tracked down new cases of typhus in civilian and military populations, deloused one million people, sprayed fifteen tons of DDT, and created a cordon sanitaire on the Rhine requiring all who crossed from Germany to be vaccinated and dusted to prevent the spread of disease. Thus the Army averted a broader typhus epidemic. The tuberculosis situation was more complicated and presented the Americans with a conundrum: what to do with thousands of people suffering from a long-term, infectious, and deadly disease?

As with the American POWs, tuberculosis continued to follow Dachau survivors into their new lives. Thousands of Jewish survivors emigrated to what would become the state of Israel. Fifteen years after liberation, the Israeli Minister of Health reported that although concentration camp survivors comprised only 25 percent of the population, they accounted for 65 percent of the tuberculosis cases in the country. Tuberculosis continued to thrive in Europe as well.

Historian Albert Cowdrey has credited the American actions with preventing a number of postwar scourges: “No one can prove that a great typhus epidemic, mass deaths of prisoners of war, or widespread outbreaks of disease among the German population would have taken place without the efforts of Army doctors of the field forces and the military government.” But, he continued, “conditions were ripe for such tragedies to occur, and Army medics brought both professional knowledge and military discipline to forestalling what might have been the last calamities of the war in Europe.” Thus, as usual, in public health the good news is no news at all.

Thousands of men survived the Vietnam War because of the quality of their hospital care. US hospitals in Vietnam were the best that could be deployed, incorporating several improvements from previous field hospitals. Army doctors were better trained, and they had good facilities at the semi-permanent base camps. As a result, more advanced surgical procedures were possible: more laparotomies, thoracotomies, vascular repairs (including even aortic and carotid repairs), advanced neurosurgery for head wounds, and other medical procedures. Blood transfusions were performed, with massive quantities of blood available for seriously wounded patients; some patients received as many as 50 units of blood. Advances in equipment resulted in the development of intensive care units with mechanical ventilators. There were far more medications available for particular diseases than in earlier conflicts.

With about 30 physicians assigned, the 12th Evacuation Hospital could keep four or five operating tables going all day, and two or three all night. A common practice was delayed primary closure for wounds with a high likelihood of infection. Instead of stitching the wound closed immediately, dirt and contaminants were flushed out, bleeding was controlled, dead flesh was removed (debrided), the wound was packed with sterile gauze, and antibiotics were administered. For a few days the patient healed, while nurses changed the bandages and made sure the wound did not get worse. Then doctors removed any remaining contaminants or dead flesh and stitched up the wound. This procedure reduced the incidence of infection compared with immediate wound closure, at the cost of a larger scar.

In any given year in Vietnam, about one soldier in three was hospitalized for disease. The main causes for hospitalization were malaria, psychiatric problems, and ordinary fevers. Although many men fell sick, competent care was available and most recovered quickly and returned to duty.

The war spurred advances in surgery and medical trauma research. New surgical techniques allowed limbs that previously would have been amputated to remain functional. Nurse anesthetist Rosemary Sue Smith recalled the development of new blood-handling procedures:

We started separating blood into its components, because we were getting a lot of aggregates that were causing a lot of disseminated intravascular coagulopathy in patients, and causing a lot of blood clots, and pulmonary thrombosis, and a lot of ARDS, Adult Respiratory Distress Syndrome, which started in Da Nang and was called Da Nang Lung initially. It has developed into today being called Adult Respiratory Distress Syndrome, and they did a lot of research on this, and they were having us separate our blood into its components, into fresh frozen plasma and into platelets, and then we started doing blood tests to see which the patients would need. If their platelets were low, or if their blood clotting factors were low, we would just give them the particular products. We actually started breaking these products down and administering them in the Vietnam War, and it’s carried over into civilian life now. They’re used today in acute trauma to prevent disseminated intravascular coagulopathy and prevent Adult Respiratory Distress Syndrome on massive traumas that have to be naturally resuscitated with blood and blood products.

In the 1960s, intensive care was still quite new and the 12th had only one (later two) intensive care wards fully equipped and staffed. A key piece of equipment was the ventilator, then called “respirator.” Ventilators worked on pure oxygen until 1969, when research revealed physiological problems from prolonged breathing of pure oxygen. Early ventilators required considerable maintenance; valves needed frequent cleaning or the machines broke down.

Antibiotics were important because of the wide variety of bacteria and large number of penetrating wounds; in the face of a possible systemic infection (the development of sepsis), antibiotics were delivered through an IV. Nurse Rosie Parmeter recalled having to prepare antibiotics to be delivered through an IV several times a day for each patient, a necessary but time-consuming task.

About two-thirds of patients cared for by the 12th were US military; the other third were mainly Vietnamese but also included nonmilitary Americans and Free World Military Assistance Forces personnel. Staff regularly dealt with the Vietnamese, both military and civilian, enemy and friendly. There were wards set aside for enemy prisoners (who were stabilized, then transferred to hospitals at POW camps) and civilians. Wounded South Vietnamese Army soldiers were also stabilized and transferred to hospitals run by the Army of the Republic of Vietnam (ARVN). Civilian patients often stayed longer because the war swamped the available hospitals for Vietnamese civilians.

Through the years of the Vietnam War, US forces sustained 313,616 wounded in action; at peak strength, there were 26 American hospitals. The 12th Evacuation Hospital was at Cu Chi for 4 years and treated just over 37,000 patients. Records for the 12th are incomplete, but the average died-of-wounds rate in Vietnam was about 2.8% of patients who reached a hospital alive. Applied to the 12th, that rate amounted to about 1,036 patients, including prisoners and Vietnamese as well as Americans. But over 36,000 people survived and could return home because of the treatment they received at the 12th Evac.
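For readers who want to check the arithmetic, the figures quoted above combine as follows (a trivial sketch; the numbers are those given in the text, not new data):

```python
# Arithmetic check of the died-of-wounds estimate quoted above.
patients_treated = 37_000       # patients treated at the 12th Evac over 4 years
died_of_wounds_rate = 0.028     # average Vietnam DOW rate among hospital arrivals

estimated_deaths = patients_treated * died_of_wounds_rate
print(round(estimated_deaths))  # about 1,036 patients
```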

Sources:

Fort Bayard, by David Kammer, Establishment of Fort Bayard Army Post
http://newmexicohistory.org/places/fort-bayard
George Ensign Bushnell, Colonel, Medical Corps, U. S. Army
THE ARMY MEDICAL BULLETIN, NUMBER 50 (OCTOBER 1939)
http://history.amedd.army.mil/biographies/bushnell
Chapter One, The Early Years: Fort Bayard, New Mexico
http://www.cs.amedd.army.mil/borden/FileDownloadpublic.aspx?docid=332041d7-dbd2-4edf-823f-29a66c0b65ef
Dachau concentration camp (Wikipedia)
http://en.wikipedia.org/wiki/Dachau_concentration_camp
Office of Medical History – United States Army
Esmond R. Long, M. D., TUBERCULOSIS IN WORLD WAR I
Chapter 14 – Tuberculosis
http://history.amedd.army.mil/booksdocs/wwii/PM4/CH14.Tuberculosis.htm

Chapter Four, Tuberculosis in World War I
Chapter Five, “A Gigantic Task”: Treating and Paying for Tuberculosis in the Interwar Period
Chapter Six, “Good Tuberculosis Women”: Tuberculosis Nursing during the Interwar Period
Chapter Seven, Surviving the Great Depression: Fitzsimons and the New Deal
Chapter Eight, Camp Follower: Tuberculosis in World War II
http://www.cs.amedd.army.mil/FileDownload.aspx?
“Good Tuberculosis Men”: The Army Medical Department’s Struggle with Tuberculosis. Carol R. Byerly
http://www.cs.amedd.army.mil/borden/FileDownloadpublic.aspx?docid=986faf8a-b833-46a8-a251-00f72c91da2f

The Global Distribution of Yellow Fever and Dengue
D.J. Rogers1, A.J. Wilson1, S.I. Hay1,2, and A.J. Graham1
Adv Parasitol. 2006 ; 62: 181–220. http://dx.doi.org:/10.1016/S0065-308X(05)62006-4.

http://www.historyofvaccines.org/content/timelines/yellow-fever

History of yellow fever
http://en.wikipedia.org/wiki/History_of_yellow_fever

Additional Reading:

Open Wound: The Tragic Obsession of William Beaumont.
Jason Karlawish.
Univ Mich Press. 2011.

The Great Influenza.
John M. Barry.
Penguin. 2004.

Flu. The story of the great influenza pandemic of 1918 and
the search for the virus that caused it.
Gina Kolata.
Touchstone. 1999

Pestilence. A Medieval Tale of Plague.
Jeani Rector
The Horror Zine. 2012

Knife Man: The extraordinary life of John Hunter, Father of Modern Surgery
Wendy Moore.
Broadway Books. 2005

Hospital.
Julie Salamon.
Penguin Press. 2008.

Overdosed America.

John Abramson.
Harper. 2004.

Sick.
Jonathan Cohn.
Harper Collins. 2007.


Metabolomics, Metabonomics and Functional Nutrition: the next step in nutritional metabolism and biotherapeutics


Reviewer and Curator: Larry H. Bernstein, MD, FCAP 

 

The human genome is estimated to encode over 30,000 genes, and to be responsible for generating more than 100,000 functionally distinct proteins. Understanding the interrelationships among

  1. genes,
  2. gene products, and
  3. dietary habits

is fundamental to identifying those who will benefit most from or be placed at risk by intervention strategies.

Unraveling the multitude of

  • nutrigenomic,
  • proteomic, and
  • metabolomic patterns

that arise from the ingestion of foods or their

  • bioactive food components

will not be simple but is likely to provide insights into a tailored approach to diet and health. The use of new and innovative technologies, such as

  • microarrays,
  • RNA interference, and
  • nanotechnologies,

will provide needed insights into molecular targets for specific bioactive food components and

  • how they harmonize to influence individual phenotypes (1).

Nutrigenetics asks how individual genetic disposition, manifesting as

  • single nucleotide polymorphisms,
  • copy-number polymorphisms and
  • epigenetic phenomena,

affects susceptibility to diet.

Nutrigenomics addresses the inverse relationship, that is how diet influences

  • gene transcription,
  • protein expression and
  • metabolism.

A major methodological challenge and first pre-requisite of nutrigenomics is integrating

  • genomics (gene analysis),
  • transcriptomics (gene expression analysis),
  • proteomics (protein expression analysis) and
  • metabonomics (metabolite profiling)

to define a “healthy” phenotype. The long-term deliverable of nutrigenomics is personalised nutrition (2).

Science is beginning to understand how genetic variation and epigenetic events

  • alter requirements for, and responses to, nutrients (nutrigenomics).

At the same time, methods for profiling almost all of the products of metabolism in a single sample of blood or urine are being developed (metabolomics). Relations between

  • diet and nutrigenomic and metabolomic profiles and
  • between those profiles and health

have become important components of research that could change clinical practice in nutrition.

Most nutrition studies assume that all persons have average dietary requirements, and the studies often

  • do not plan for a large subset of subjects who differ in requirements for a nutrient.

Large variances in responses that occur when such a population exists

  • can result in statistical analyses that argue for a null effect.

If nutrition studies could better identify responders and differentiate them from nonresponders on the basis of nutrigenomic or metabolomic profiles,

  • the sensitivity to detect differences between groups could be greatly increased, and
  • the resulting dietary recommendations could be appropriately targeted (3), as the simulation sketch below illustrates.
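Here is a minimal simulation, with invented effect sizes and sample counts, of how a subgroup of responders can be diluted into an apparent null effect in a pooled analysis yet detected once the population is stratified:

```python
# Hypothetical simulation of responder dilution in a nutrition trial.
# All parameters (effect size, responder fraction, n) are illustrative
# assumptions, not values from the text.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 200                        # subjects per arm
effect = 0.5                   # true benefit (in SD units), responders only
responder_fraction = 0.3       # share of subjects with the responsive genotype

is_responder = rng.random(n) < responder_fraction
control = rng.normal(0.0, 1.0, n)
treated = rng.normal(0.0, 1.0, n) + effect * is_responder

# Pooled analysis: the responder signal is diluted toward a null result.
_, p_pooled = stats.ttest_ind(treated, control)

# Stratified analysis: compare only genotype-identified responders.
_, p_resp = stats.ttest_ind(treated[is_responder], control[is_responder])

print(f"pooled:     p = {p_pooled:.3f}")
print(f"responders: p = {p_resp:.3f}")
```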

In recent years, nutrition research has moved from classical epidemiology and physiology to molecular biology and genetics. Following this trend,

  • Nutrigenomics has emerged as a novel and multidisciplinary research field in nutritional science that
  • aims to elucidate how diet can influence human health.

It is already well known that bioactive food compounds can interact with genes affecting

  • transcription factors,
  • protein expression and
  • metabolite production.

The study of these complex interactions requires the development of

  • advanced analytical approaches combined with bioinformatics.

Thus, to carry out these studies

  • Transcriptomics,
  • Proteomics and
  • Metabolomics

approaches are employed together with an adequate integration of the information that they provide (4).

Metabonomics is a diagnostic tool for metabolic classification of individuals with the asset of quantitative, non-invasive analysis of easily accessible human body fluids such as urine, blood and saliva. This feature also applies to some extent to Proteomics, with the constraint that

  • the latter discipline is more complex in terms of composition and dynamic range of the sample.

Apart from addressing the most complex “Ome”, Proteomics represents

  • the only platform that delivers not only markers for disposition and efficacy
  • but also targets of intervention.

Application of integrated Omic technologies will drive the understanding of

  • interrelated pathways in healthy and pathological conditions and
  • will help to define molecular ‘switchboards’,
  • necessary to develop disease related biomarkers.

This will contribute to the development of new preventive and therapeutic strategies for both pharmacological and nutritional interventions (5).

Human health is affected by many factors. Diet and inherited genes play an important role. Food constituents,

  • including secondary metabolites of fruits and vegetables, may
  • interact directly with DNA via methylation and changes in expression profiles (mRNA, proteins)
  • which results in metabolite content changes.

Many studies have shown that

  • food constituents may affect human health and
  • the exact knowledge of genotypes and food constituent interactions with
  • both genes and proteins may delay or prevent the onset of diseases.

Many high throughput methods have been employed to get some insight into the whole process and several examples of successful research, namely in the field of genomics and transcriptomics, exist. Studies on epigenetics and RNome significance have been launched. Proteomics and metabolomics need to encompass large numbers of experiments and linked data. Due to the nature of the proteins, as well as due to the properties of various metabolites, experimental approaches require the use of

  • comprehensive high throughput methods and a sufficiency of analysed tissue or body fluids (6).

New experimental tools that investigate gene function at the subcellular, cellular, organ, organismal, and ecosystem level need to be developed. New bioinformatics tools to analyze and extract meaning

  • from increasingly systems-based datasets will need to be developed.

These will require, in part, creation of entirely new tools. An important and revolutionary aspect of “The 2010 Project” is that it implicitly endorses

  • the allocation of resources to attempts to assign function to genes that have no known function.

This represents a significant departure from the common practice of defining and justifying a scientific goal based on the biological phenomena. The rationale for endorsing this radical change is that

  • for the first time it is feasible to envision a whole-systems approach to gene and protein function.

This whole-systems approach promises to be orders of magnitude more efficient than the conventional approach (7).

The Institute of Medicine recently convened a workshop to review the state of the various domains of nutritional genomics research and policy and to provide guidance for further development and translation of this knowledge into nutrition practice and policy (8). Nutritional genomics holds the promise to revolutionize both clinical and public health nutrition practice and facilitate the establishment of

(a) genome-informed nutrient and food-based dietary guidelines for disease prevention and healthful aging,

(b) individualized medical nutrition therapy for disease management, and

(c) better targeted public health nutrition interventions (including micronutrient fortification and supplementation) that

  • maximize benefit and minimize adverse outcomes within genetically diverse human populations.

As the field of nutritional genomics matures, which will include filling fundamental gaps in

  • knowledge of nutrient-genome interactions in health and disease and
  • demonstrating the potential benefits of customizing nutrition prescriptions based on genetics,
  • registered dietitians will be faced with the opportunity of making genetically driven dietary recommendations aimed at improving human health.

The new era of nutrition research translates empirical knowledge to evidence-based molecular science (9). Modern nutrition research focuses on

  • promoting health,
  • preventing or delaying the onset of disease,
  • optimizing performance, and
  • assessing risk.

Personalized nutrition is a conceptual analogue to personalized medicine and means adapting food to individual needs. Nutrigenomics and nutrigenetics

  • build the science foundation for understanding human variability in
  • preferences, requirements, and responses to diet and
  • may become the future tools for consumer assessment

motivated by personalized nutritional counseling for health maintenance and disease prevention.

The primary aim of “omic” technologies is

  • the non-targeted identification of all gene products (transcripts, proteins, and metabolites) present in a specific biological sample.

By their nature, these technologies reveal unexpected properties of biological systems.

A second and more challenging aspect of “omic” technologies is

  • the refined analysis of quantitative dynamics in biological systems (10).

For metabolomics, gas and liquid chromatography coupled to mass spectrometry are well suited for coping with

  • high sample numbers in reliable measurement times with respect to
  • both technical accuracy and the identification and quantitation of small-molecular-weight metabolites.

This potential is a prerequisite for the analysis of dynamic systems. Thus, metabolomics is a key technology for systems biology.

In modern nutrition research, mass spectrometry has developed into a tool

  • to assess health, sensory as well as quality and safety aspects of food.

In this review, we focus on health-related benefits of food components and, accordingly,

  • on biomarkers of exposure (bioavailability) and bioefficacy.

Current nutrition research focuses on unraveling the link between

  • dietary patterns,
  • individual foods or
  • food constituents and

the physiological effects at cellular, tissue and whole body level

  • after acute and chronic uptake.

The bioavailability of bioactive food constituents as well as dose-effect correlations are key information to understand

  • the impact of food on defined health outcomes.

Both strongly depend on appropriate analytical tools

  • to identify and quantify minute amounts of individual compounds in highly complex matrices–food or biological fluids–and
  • to monitor molecular changes in the body in a highly specific and sensitive manner.

Based on these requirements,

  • mass spectrometry has become the analytical method of choice
  • with broad applications throughout all areas of nutrition research (11).

Recent advances in high data-density analytical techniques offer unrivaled promise for improved medical diagnostics in the coming decade. Genomics, proteomics and metabonomics (as well as a whole slew of less well known “omics” technologies) provide a detailed descriptor of each individual. Relating the large quantity of data on many different individuals to their current (and possibly even future) phenotype is a task not well suited to classical multivariate statistics. The datasets generated by “omics” techniques very often violate the requirements for multiple regression. However, another statistical approach exists, one already well established in areas such as medicinal chemistry and process control but new to medical diagnostics, that can overcome these problems. This approach, called megavariate analysis (MVA),

  • has the potential to revolutionise medical diagnostics in a broad range of diseases.

It opens up the possibility of expert systems that can diagnose the presence of many different diseases simultaneously, and

  • even make exacting predictions about the future diseases an individual is likely to suffer from (12). A minimal sketch of such a megavariate workflow follows.
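The sketch below shows what such a workflow can look like in practice. The data and parameter choices are assumptions, and PCA plus a simple classifier stands in for the dedicated MVA tools the passage alludes to:

```python
# Minimal "megavariate" sketch: project thousands of correlated omics
# variables onto a few latent components, then classify. Synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_subjects, n_metabolites = 60, 2000    # far more variables than samples
X = rng.normal(size=(n_subjects, n_metabolites))
y = rng.integers(0, 2, n_subjects)      # disease vs. healthy label
X[y == 1, :10] += 1.0                   # a weak true signal in 10 variables

# Ordinary multiple regression is ill-posed here (2000 predictors, 60 cases);
# reducing to a handful of latent components makes the problem tractable.
model = make_pipeline(PCA(n_components=5), LogisticRegression(max_iter=1000))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", round(scores.mean(), 2))
```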

Cardiovascular diseases

Cardiovascular diseases are the leading cause of morbidity and mortality in Western countries. Although coronary thrombosis is the final event in acute coronary syndromes,

  • there is increasing evidence that inflammation also plays a role in development of atherosclerosis and its clinical manifestations, such as
  • myocardial infarction, stroke, and peripheral vascular disease.

The beneficial cardiovascular health effects of

  • diets rich in fruits and vegetables are in part mediated by their flavanol content.

This concept is supported by findings from small-scale intervention studies with surrogate endpoints including

  1. endothelium-dependent vasodilation,
  2. blood pressure,
  3. platelet function, and
  4. glucose tolerance.

Mechanistically, short term effects on endothelium-dependent vasodilation

  • following the consumption of flavanol-rich foods, as well as purified flavanols,
  • have been linked to an increased nitric oxide bioactivity.

The critical biological target(s) for flavanols have yet to be identified (13), but we are beginning to see over the horizon.

Nutritional sciences

Nutrition sciences apply

  1. transcriptomics,
  2. proteomics and
  3. metabolomics

to molecularly assess nutritional adaptations.

Transcriptomics can generate a

  • holistic overview on molecular changes to dietary interventions.

Proteomics is most challenging because of the higher complexity of proteomes as compared to transcriptomes and metabolomes. However, it delivers

  • not only markers but also
  • targets of intervention, such as
  • enzymes or transporters, and
  • it is the platform of choice for discovering bioactive food proteins and peptides.

Metabolomics is a tool for metabolic characterization of individuals and

  • can deliver metabolic endpoints possibly related to health or disease.

Omics in nutrition should be deployed in an integrated fashion to elucidate biomarkers

  • for defining an individual’s susceptibility to diet in nutritional interventions and
  • for assessing food ingredient efficacy (14).

The more elaborate tools offered by metabolomics opened the door to exploring an active role played by adipose tissue that is affected by diet, race, sex, and probably age and activity. When multifactorial influences are brought into play, and the effects of changes in diet and activity are studied, we leave the study of metabolomics and enter the world of “metabonomics”. Adiponectin and adipokines arrive (15-22). We shall discuss “adiposity” later.

Potential Applications of Metabolomics

Either individually or grouped as a profile, metabolites are detected by either

  • nuclear magnetic resonance spectroscopy or mass spectrometry.

There is potential for a multitude of uses of metabolome research, including

  1. the early detection and diagnosis of cancer and as
  2. both a predictive and pharmacodynamic marker of drug effect.

However, metabolomics, its technical challenges, and its clinical applications remain underappreciated

  • even though when used as a translational research tool,
  • it can provide a link between the laboratory and clinic.

The precise number of human metabolites is unknown, with estimates ranging from the thousands to tens of thousands. Metabolomics is a term that encompasses several types of analyses, including

(a) metabolic fingerprinting, which measures a subset of the whole profile with little differentiation or quantitation of metabolites;

(b) metabolic profiling, the quantitative study of a group of metabolites, known or unknown, within or associated with a particular metabolic pathway; and

(c) target isotope-based analysis, which focuses on a particular segment of the metabolome by analyzing

  • only a few selected metabolites that comprise a specific biochemical pathway (a toy illustration of these three modes follows below).
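The distinction among the three modes can be illustrated on a toy metabolite table. All names and intensities below are invented for illustration:

```python
# Toy illustration of fingerprinting vs. profiling vs. targeted analysis.
import pandas as pd

profile = pd.DataFrame({
    "metabolite": ["glucose", "lactate", "citrate", "alanine"],
    "pathway":    ["glycolysis", "glycolysis", "TCA", "amino acid"],
    "intensity":  [1.00, 0.42, 0.18, 0.07],
})

# (a) fingerprinting: the whole intensity vector is treated as a pattern,
#     with little identification or quantitation of individual metabolites
fingerprint = profile["intensity"].to_numpy()

# (b) profiling: quantify the metabolites of one pathway
glycolysis = profile[profile["pathway"] == "glycolysis"]

# (c) targeted analysis: only a few pre-selected metabolites
targets = profile[profile["metabolite"].isin(["citrate"])]

print(glycolysis, targets, sep="\n\n")
```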

 

Dynamic Construct of the –Omics

Iron metabolism – Anemia

Hepcidin is a key hormone governing mammalian iron homeostasis and may be directly or indirectly involved in the development of most iron deficiency/overload and inflammation-induced anemias. The anemia of chronic disease (ACD) is characterized by macrophage iron retention induced by cytokines and hepcidin regulation. Hepcidin controls cellular iron efflux by binding to the iron export protein ferroportin. Some patients present with both ACD and iron-deficiency anemia (ACD/IDA), the latter resulting from chronic blood loss. Iron retention during inflammation occurs in macrophages and the spleen, but not in the liver. In ACD, serum hepcidin concentrations are elevated, which is related to reduced duodenal and macrophage expression of ferroportin. Individuals with ACD/IDA have significantly lower hepcidin levels than ACD subjects. ACD/IDA patients, in contrast to ACD subjects, were able to absorb dietary iron from the gut and to mobilize iron from macrophages. Hepcidin elevation may affect iron transport in ACD and ACD/IDA, and it is more responsive to iron demand in IDA than to inflammation. Hepcidin determination may aid in selecting appropriate therapy for these patients (23).

Serum hepcidin, iron status, and inflammatory indicators have been compared across patients with anemia of chronic disease (ACD), ACD with concomitant iron-deficiency anemia (ACD/IDA), pure IDA, and acute inflammation (AcI). Hepcidin levels differed significantly among the anemia types, from high to low: ACD, AcI > ACD/IDA > the control > IDA. Serum ferritin levels were significantly increased in ACD and AcI patients but were decreased significantly in ACD/IDA and IDA. Elevated serum EPO concentrations were found in ACD, ACD/IDA and IDA patients but not in AcI patients and the controls. A positive correlation exists between hepcidin and IL-6 levels only in the ACD/IDA, AcI and control groups. A positive correlation between hepcidin and ferritin was marked in the control group, while a negative correlation between hepcidin and ferritin was noted in IDA. A significant negative correlation between hepcidin expression and reticulocyte count was marked in both the ACD/IDA and IDA groups. If hepcidin plays a role in the pathogenesis of ACD, ACD/IDA and IDA, it could be a potential marker for the detection and differentiation of these anemias (24).
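The kind of correlation analysis reported in these studies is straightforward to reproduce. A hedged sketch on hypothetical per-patient values follows; the variable names and the way the data are generated are assumptions, not the studies' data:

```python
# Pearson correlations between hepcidin and candidate markers,
# computed on invented data for a single diagnostic group.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 40
il6 = rng.lognormal(1.0, 0.5, n)                # pg/mL, hypothetical
hepcidin = 2.0 * il6 + rng.normal(0, 1.5, n)    # ng/mL, built to co-vary
ferritin = rng.lognormal(3.0, 0.6, n)           # ng/mL, independent here

r_il6, p_il6 = pearsonr(hepcidin, il6)
r_fer, p_fer = pearsonr(hepcidin, ferritin)
print(f"hepcidin vs IL-6:     r = {r_il6:+.2f}, p = {p_il6:.3g}")
print(f"hepcidin vs ferritin: r = {r_fer:+.2f}, p = {p_fer:.3g}")
```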

Cancer

Because cancer cells are known to possess a highly unique metabolic phenotype, development of specific biomarkers in oncology is possible and might be used in identifying fingerprints, profiles, or signatures to detect the presence of cancer, determine prognosis, and/or assess the pharmacodynamic effects of therapy (25).

HDM2, a negative regulator of the tumor suppressor p53, is over-expressed in many cancers that retain wild-type p53. Consequently, the effectiveness of chemotherapies that induce p53 might be limited, and inhibitors of the HDM2–p53 interaction are being sought as tumor-selective drugs. A binding site within HDM2 has been identified which can be blocked with peptides, inducing p53 transcriptional activity. A recent report demonstrates the principle using drug-like small molecules that target HDM2 (26).

Obesity, CRP, interleukins, and chronic inflammatory disease

Elevated CRP levels and clinically raised CRP levels were present in 27.6% and 6.7% of the population, respectively. Both overweight (body mass index [BMI], 25-29.9 kg/m2) and obese (BMI ≥30 kg/m2) persons were more likely to have elevated CRP levels than their normal-weight counterparts (BMI <25 kg/m2). After adjusting for potential confounders, the odds ratio (OR) for elevated CRP was 2.13 for obese men and 6.21 for obese women. In addition, BMI was associated with clinically raised CRP levels in women, with an OR of 4.76 (95% CI, 3.42-6.61) for obese women. Waist-to-hip ratio was positively associated with both elevated and clinically raised CRP levels, independent of BMI. Restricting the analyses to young adults (aged 17-39 years) and excluding smokers, persons with inflammatory disease, cardiovascular disease, or diabetes mellitus and estrogen users did not change the main findings (27).
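Odds ratios like those above are computed from a 2x2 exposure-outcome table. A minimal sketch of the computation follows; the counts are invented, not the study's data:

```python
# Odds ratio with a Woolf 95% confidence interval from a 2x2 table.
import math

#                               elevated CRP   normal CRP
obese_elevated, obese_normal = 120, 280        # invented counts
lean_elevated, lean_normal = 60, 540           # invented counts

odds_ratio = (obese_elevated * lean_normal) / (obese_normal * lean_elevated)

# Woolf's method: standard error on the log-odds scale.
se = math.sqrt(1/obese_elevated + 1/obese_normal + 1/lean_elevated + 1/lean_normal)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```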

A study of C-reactive protein and interleukin-6 with measures of obesity and of chronic infection as their putative determinants related levels of C-reactive protein and interleukin-6 to markers of the insulin resistance syndrome and of endothelial dysfunction. Levels of C-reactive protein were significantly related to those of interleukin-6 (r=0.37, P<0.0005) and tumor necrosis factor-α (r=0.46, P<0.0001), and concentrations of C-reactive protein were related to insulin resistance as calculated from the homoeostasis model and to markers of endothelial dysfunction (plasma levels of von Willebrand factor, tissue plasminogen activator, and cellular fibronectin). A mean standard deviation score of levels of acute phase markers correlated closely with a similar score of insulin resistance syndrome variables (r=0.59, P<0.00005) and the data suggested that adipose tissue is an important determinant of a low level, chronic inflammatory state as reflected by levels of interleukin-6, tumor necrosis factor-α, and C-reactive protein (28).

A number of other studies have indicated the inflammatory ties of visceral obesity to adipose tissue metabolic profiles, suggesting a role in “metabolic syndrome”. There is now a concept of altered liver metabolism in “non-alcoholic” fatty liver disease (NAFLD) progressing from steatosis to steatohepatitis (NASH) (31,32).

These unifying concepts were incomprehensible 50 years ago. It was only known that insulin is anabolic and that insulin deficiency (or resistance) would have consequences at the point of entry into the citric acid cycle, which, together with oxidative phosphorylation, yields most of the cell’s ATP. In fat catabolism, triglycerides are hydrolyzed to break them into fatty acids and glycerol. In the liver the glycerol can be converted into glucose via dihydroxyacetone phosphate and glyceraldehyde-3-phosphate by way of gluconeogenesis. The cycle thus ties in with both catabolism and anabolism.

 

TCA reactions
http://www.newworldencyclopedia.org/entry/Image:TCA_reactions.gif

 

For bypass of the Pyruvate Kinase reaction of Glycolysis, cleavage of 2 ~P bonds is required. The free energy change associated with cleavage of one ~P bond of ATP is insufficient to drive synthesis of phosphoenolpyruvate (PEP), since PEP has a more negative ΔG of phosphate hydrolysis than ATP.

The two enzymes that catalyze the reactions for bypass of the Pyruvate Kinase reaction are the following:

(a) Pyruvate Carboxylase (Gluconeogenesis) catalyzes:

pyruvate + HCO3− + ATP → oxaloacetate + ADP + Pi

(b) PEP Carboxykinase (Gluconeogenesis) catalyzes:

oxaloacetate + GTP → phosphoenolpyruvate + GDP + CO2
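The energetic reasoning can be made explicit with textbook standard free energies (approximate values assumed for illustration; they are not given in the source):

```latex
\begin{align*}
\mathrm{ATP} \rightarrow \mathrm{ADP} + \mathrm{P_i} &: \quad \Delta G^{\circ\prime} \approx -30.5~\mathrm{kJ/mol} \\
\mathrm{PEP} \rightarrow \mathrm{pyruvate} + \mathrm{P_i} &: \quad \Delta G^{\circ\prime} \approx -61.9~\mathrm{kJ/mol} \\
\mathrm{pyruvate} + \mathrm{ATP} \rightarrow \mathrm{PEP} + \mathrm{ADP} &: \quad \Delta G^{\circ\prime} \approx +61.9 - 30.5 = +31.4~\mathrm{kJ/mol}
\end{align*}
```

The strongly positive value for the one-ATP route is why the bypass spends two high-energy phosphate bonds, one ATP at pyruvate carboxylase and one GTP at PEP carboxykinase, bringing the overall pyruvate-to-PEP conversion close to thermodynamic feasibility.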

The concept of anomalies in the pathways with respect to diabetes was sketchy then, and there was much to be filled in. This has been substantially done, and is by no means complete. However, one can see how this comes into play with diabetic ketoacidosis accompanied by gluconeogenesis and in severe injury or sepsis with peripheral proteolysis to provide gluconeogenic precursors. The reprioritization of liver synthetic processes is also brought into play with the conundrum of protein-energy malnutrition.

The picture began to be filled in with the improvements in technology that emerged at the end of the 1980s with the ability to profile tissue and body fluids by NMR and by MS. There was already a good inkling of a relationship of type 2 diabetes to major indicators of CVD (29,30). And a long suspected relationship between obesity and type 2 diabetes was evident. But how did it tie together?

End Stage Renal Disease and Cardiovascular Risk

Mortality is markedly elevated in patients with end-stage renal disease. The leading cause of death is cardiovascular disease.

As renal function declines,

  • the prevalence of both malnutrition and cardiovascular disease increase.

Malnutrition and vascular disease correlate with the levels of

  • markers of inflammation in patients treated with dialysis and in those not yet on dialysis.

The causes of inflammation are likely to be multifactorial. CRP levels are associated with cardio-vascular risk in the general population.

The changes in endothelial cell function,

  • in plasma proteins, and
  • in lipids in inflammation

are likely to be atherogenic.

That cardiovascular risk is inversely correlated with serum cholesterol in dialysis patients suggests that

  • hyperlipidemia plays a minor role in the incidence of cardiovascular disease.

Hypoalbuminemia, ascribed to malnutrition, has been one of the most powerful risk factors that predict all-cause and cardiovascular mortality in dialysis patients. The presence of inflammation, as evidenced by increased levels of specific cytokines (interleukin-6 and tumor necrosis factor a) or acute-phase proteins (C-reactive protein and serum amyloid A), however, has been found to be associated with vascular disease in the general population as well as in dialysis patients. Patients have

  • loss of muscle mass and changes in plasma composition—decreases in serum albumin, prealbumin, and transferrin levels, also associated with malnutrition.

Inflammation alters

  • lipoprotein structure and function as well as
  • endothelial structure and function

to favor atherogenesis and increases

  • the concentration of atherogenic proteins in serum.

In addition, proinflammatory compounds, such as

  • advanced glycation end products, accumulate in renal failure, and
  • defense mechanisms against oxidative injury are reduced,

contributing to inflammation and to its effect on the vascular endothelium (33,34).

Endogenous copper can play an important role in postischemic reperfusion injury, a condition associated with endothelial cell activation and increased interleukin 8 (IL-8) production. Excessive endothelial IL-8 secreted during trauma, major surgery, and sepsis may contribute to the development of systemic inflammatory response syndrome (SIRS), adult respiratory distress syndrome (ARDS), and multiple organ failure (MOF). No previous reports have indicated that copper has a direct role in stimulating human endothelial IL-8 secretion. Copper did not stimulate secretion of other cytokines. Cu(II) appeared to be the primary copper ion responsible for the observed increase in IL-8 because a specific high-affinity Cu(II)-binding peptide, d-Asp-d-Ala-d-His-d-Lys (d-DAHK), completely abolished this effect in a dose-dependent manner. These results suggest that Cu(II) may induce endothelial IL-8 by a mechanism independent of known Cu(I) generation of reactive oxygen species (35).

Blood coagulation plays a key role among the numerous mediating systems that are activated in inflammation. Receptors of the PAR family serve as sensors of serine proteinases of the blood clotting system in the target cells involved in inflammation. Activation of PAR-1 by thrombin and of PAR-2 by factor Xa leads to a rapid expression and exposure on the membrane of endothelial cells of both adhesive proteins that mediate an acute inflammatory reaction and of the tissue factor that initiates the blood coagulation cascade. Other receptors that can modulate responses of the cells activated by proteinases through PAR receptors are also involved in the association of coagulation and inflammation, together with the receptors of the PAR family. The presence of PAR receptors on mast cells is responsible for their reactivity to thrombin and factor Xa, essential to the inflammation and blood clotting processes (36).

The understanding of regulation of the inflammatory process in chronic inflammatory diseases is advancing.

Evidence consistently indicates that T-cells play a key role in initiating and perpetuating inflammation, not only via the production of soluble mediators but also via cell/cell contact interactions with a variety of cell types through membrane receptors and their ligands. Signalling through CD40 and CD40 ligand is a versatile pathway that is potently involved in all these processes. Many inflammatory genes relevant to atherosclerosis are influenced by the transcriptional regulator nuclear factor κ B (NFκB). In these events T-cells become activated by dendritic cells or inflammatory cytokines, and these T-cells activate, in turn, monocytes / macrophages, endothelial cells, smooth muscle cells and fibroblasts to produce pro-inflammatory cytokines, chemokines, the coagulation cascade in vivo, and finally matrix metalloproteinases, responsible for tissue destruction. Moreover, CD40 ligand at inflammatory sites stimulates fibroblasts and tissue monocyte/macrophage production of VEGF, leading to angiogenesis, which promotes and maintains the chronic inflammatory process.

NFκB plays a pivotal role in co-ordinating the expression of genes involved in the immune and inflammatory response, evoking tumor necrosis factor α (TNFα), chemokines such as monocyte chemoattractant protein-1 (MCP-1) and interleukin (IL)-8, matrix metalloproteinase enzymes (MMP), and genes involved in cell survival. A complex array of mechanisms, including T cell activation, leukocyte extravasation, tissue factor expression, MMP expression and activation, as well induction of cytokines and chemokines, implicated in atherosclerosis, are regulated by NFκB.

Expression of NFκB in the atherosclerotic milieu may have a number of potentially harmful consequences. IL-1 activates NFκB, upregulating expression of MMP-1, -3, and -9. Oxidized LDL increases macrophage MMP-9, associated with increased nuclear binding of NFκB and AP-1. Expression of tissue factor, initiating the coagulation cascade, is regulated by NFκB. In atherosclerotic plaque cells, tissue factor antigen and activity were inhibited following over-expression of IκBα and dominant-negative IKK-2, but not by dominant-negative IKK-1 or NIK. This supports the concept that activation of the “canonical” pathway upregulates pro-thrombotic mediators involved in disease. Many of the cytokines and chemokines which have been detected in human atherosclerotic plaques are also regulated by NFκB. Over-expression of IκBα inhibits release of TNFα, IL-1, IL-6, and IL-8 in macrophages stimulated with LPS and CD40 ligand (CD40L). This report describes how NFκB activation upregulates major pro-inflammatory and pro-thrombotic mediators of atherosclerosis (37-41).

This review is both focused and comprehensive. The details of evolving methods are avoided in order to build the argument that a very rapid expansion of discovery has been evolving depicting disease, disease mechanisms, disease associations, metabolic biomarkers, study of effects of diet and diet modification, and opportunities for targeted drug development. The extent of future success will depend on the duration and strength of the developed interventions, and possibly the avoidance of dead end interventions that are unexpectedly bypassed. I anticipate the prospects for the interplay between genomics, metabolomics, metabonomics, and personalized medicine may be realized for several of the most common conditions worldwide within a few decades (42-44).

References

  1. Trujillo E, Davis C, Milner J. Nutrigenomics, proteomics, metabolomics, and the practice of dietetics. J Am Diet Assoc. 2006;106(3):403-13.
  2. Kussmann M, Raymond F, Affolter M. OMICS-driven biomarker discovery in nutrition and health. J Biotechnol. 2006;124(4):758-87.
  3. Zeisel SH. Nutrigenomics and metabolomics will change clinical nutrition and public health practice: insights from studies on dietary requirements for choline. Am J Clin Nutr. 2007;86(3):542-8.
  4. García-Cañas V, Simó C, León C, Cifuentes A. Advances in Nutrigenomics research: novel and future analytical approaches to investigate the biological activity of natural compounds and food functions. J Pharm Biomed Anal. 2010;51(2):290-304.
  5. Kussmann M, Blum S. OMICS-derived targets for inflammatory gut disorders: opportunities for the development of nutrition related biomarkers. Endocr Metab Immune Disord Drug Targets. 2007;7(4):271-87.
  6. Ovesná J, Slabý O, Toussaint O, Kodícek M, et al. High throughput ‘omics’ approaches to assess the effects of phytochemicals in human health studies. Br J Nutr. 2008;99 E Suppl 1:ES127-34.
  7. Workshop Report: “The 2010 Project”. Chory J, Ecker JR, Briggs S, et al. A Blueprint for Understanding How Plants Are Built and How to Improve Them. Plant Physiology 2000;123:423–425, http://www.plantphysiol.org.
  8. Stover PJ, Caudill MA. Genetic and epigenetic contributions to human nutrition and health: managing genome-diet interactions. J Am Diet Assoc. 2008 Sep;108(9):1480-7.
  9. Kussmann M, Panchaud A, Affolter M. Proteomics in nutrition: status quo and outlook for biomarkers and bioactives. J Proteome Res. 2010;9(10):4876-87.
  10. Wolfram Weckwerth. Metabolomics in Systems Biology. Annual Review of Plant Biology 2003; 54: 669-689.
  11. Kussmann M, Affolter M, Nagy K, Holst B, Fay LB. Mass spectrometry in nutrition: understanding dietary health effects at the molecular level. Mass Spectrom Rev. 2007;26(6):727-50.
  12. Grainger DJ. Megavariate Statistics meets High Data-density Analytical Methods: The Future of Medical Diagnostics? IRTL Reviews 2003;1:1-6.
  13. Heiss C, Keen CL, Kelm M. Flavanols and Cardiovascular Disease Prevention. European Heart Journal 2010;31(21):2583-2592.
  14. Kussmann M, Rezzi S, Daniel H. Profiling techniques in nutrition and health research. Curr Opin Biotechnol. 2008;19 (2):83-99.
  15. Ohashi N, Ito C, Fujikawa R, Yamamoto H, et al. The impact of visceral adipose tissue and high-molecular weight adiponectin on cardio-ankle vascular index in asymptomatic Japanese subjects. Metabolism 2009;58:1023-9. [CAVI and VAT and HMW adiponectin levels]
  16. Zha JM, Di WJ, Zhu T, Xie T, et al. Comparison of gene transcription between subcutaneous and visceral adipose tissue in Chinese adults. Endocr J 2009;56:934-44. [TLR4 signaling, 11 beta-HSD1 and GR levels in VAT]
  17. Albert L, Girola A, Gilardini L, Conti A, et al. Type 2 diabetes and metabolic syndrome are associated with increased expression of 11 beta-hydroxysteroid dehydrogenase 1 in obese subjects. Int J Obesity (Lond) 2007;31:1826-31.
  18. Fabbrini E, Markos F, Mohammed BS, Pietka T, et al. Intrahepatic fat, not visceral fat, is linked with metabolic complications of obesity. PNAS 2009;106:15430-5.
  19. Tong J, Boyko EJ, Utzschneider KM, McNeely MJ, et al. Intraabdominal fat accumulation predicts the development of the metabolic syndrome in non-diabetic Japanese-Americans. Diabetologia 2007;50:1156-60.
  20. Kim K, Valentine RJ, Shin Y, Gong K. Association of visceral adiposity and exercise participation with C-reactive protein, insulin resistance, and endothelial dysfunction in Korean healthy adults. Metabolism 2008;57:1181-9. [VAT-EC exhibits a marked angiogenic and proinflammatory state]
  21. Villaret A, Galitzky J, Decaunes P, Exteve D, et al. Adipose tissue endothelial cells from obese human subjects: differences among depots in angiogenic, metabolic, and inflammatory gene expression and cellular senescence. Diabetes 2010;59:2755-63.
  22. van Dijk SJ, Feskens EJ, Bos MB, Hoelen DW, et al. A saturated fatty acid-rich diet induces an obesity-linked proinflammatory gene expression profile in adipose tissue of subjects at risk of metabolic syndrome. Am J Clin Nutr 2009;90:1656-64. [MUFA in LDL lowering]
  23. Theurl I, Aigner E, Theurl M, Nairz M, et al. Regulation of iron homeostasis in anemia of chronic disease and iron deficiency anemia: diagnostic and therapeutic implications. Blood 2009;113(21):5277-86.
  24. Cheng PP, Jiao XY, Wang XH, Lin JH, Cai YM. Hepcidin expression in anemia of chronic disease and concomitant iron-deficiency anemia. Clin Exp Med. 2010 May 25. [Epub ahead of print]
  25. Spratlin JL, Serkova NJ, and Eckhardt SG. Clinical applications of metabolomics in oncology: a review. Clin Cancer Res 2009;15(2):431-440.
  26. Fischer PM, Lane DP. Small molecule inhibitors of the p53 suppressor HDM2: have protein-protein interactions come of age as drug targets? Trends in Pharm Sci 2004;25(7):343-346.
  27. Visser M, Bouter LM, McQuillan GM, Wener HM. Elevated C-reactive protein levels in overweight and obese adults. JAMA 1999;282:2131-2135.
  28. Yudkin JS, Stehouwer CDA, Emeis JJ, Coppack SW. C-reactive protein in healthy subjects: associations with obesity, insulin resistance, and endothelial dysfunction: a potential role for cytokines originating from adipose tissue? Arterioscler Thromb Vasc Biol 1999;19:972-978.
  29. Visvikis-Siest S, Siest G. The STANISLAS cohort: a 10-year follow-up of supposed healthy families. Gene-environment interactions, reference values and evaluation of biomarkers in prevention of cardiovascular diseases. Clin Chem Lab Med 2008;46:733-47.
  30. Schmidt MI, Duncan BB. Diabesity: an inflammatory metabolic condition. Clin Chem Lab Med 2003;41:1120-1130.
  31. Fenkci S, Rota S, Sabir N, Akdag B. Ultrasonographic and biochemical evaluation of visceral obesity in obese women with non-alcoholic fatty liver disease. Eur J Med Res 2007;12:68-73. [VAT, HOMA]
  32. Lee JW, Lee HR, Shim JY, Im JA, et al. Viscerally obese women with normal body weight have greater brachial-ankle pulse wave velocity than non-viscerally obese women with excessive body weight. Clin Endocrinol (Oxf) 2007;66:572-8. [visceral obesity – high triglycerides, high baPWV, greater SFA and thigh SFA]
  33. Kaysen GE. The microinflammatory state in uremia: causes and potential consequences. J Am Soc Nephrol 2001;12:1549-1557.
  34. Kaysen GE. Role of inflammation and its treatment in ESRD patients. Blood Purif 2002;20:70-80.
  35. Bar-Or D, Thomas GW, Yukl RL, Rael LT, et al. Copper stimulates the synthesis and release of interleukin-8 in human endothelial cells: a possible early role in systemic inflammatory responses. Shock 2003;20(2):154-158.
  36. Dugina TN, Kiseleva EV, Chistov IV, Umarova BA, and Strukova SM. Receptors of the PAR family as a link between blood coagulation and inflammation. Biochemistry (Moscow) 2002;67(1):65-74. [Translated from Biokhimiya 2002;67(1):77-87]
  37. Monaco C, Andreakos E, Kiriakidis S, Feldmann M, and Paleolog E. T-cell-mediated signalling in immune, inflammatory and angiogenic processes: the cascade of events leading to inflammatory diseases. Current Drug Targets – Inflammation & Allergy 2004;3:35-42.
  38. Monaco C, Grosjean J, and Paleolog E. The role of the NFκB pathway in atherosclerosis.
  39. Libby P, Ridker PM, and Maseri A. Inflammation and atherosclerosis. Circulation 2002;105:1135-43.
  40. Karin M, Yamamoto Y, Wang QM. The IKK NF-kappa B system: a treasure trove for drug development. Nat Rev Drug Discov 2004;3:17-26.
  41. Karin M, Ben-Neriah Y. Phosphorylation meets ubiquitination: the control of NF-[kappa]B activity. Annu Rev Immunol 2000;18:621-63.
  42. Lee DY, Bowen BP, and Northen TR. Mass spectrometry-based metabolomics, analysis of metabolite-protein interactions, and imaging. BioTechniques 2010;49:557-565.
  43. Faca V, Krasnoselsky A, and Hanash S. Innovative proteomic approaches for cancer biomarker discovery.
  44. Sharp P, and MIT faculty. ‘Convergence’ offers potential for revolutionary advance in biomedicine. The Third Revolution: Convergence of the Life Sciences, Physical Sciences and Engineering. White paper. Reported in Biotechnology Jan 5, 2011. [Convergence is a new paradigm that can yield critical advances in a broad array of sectors]

 
