Plagues and the Paradox of Progress
Thomas J. Bollyky
Rating: 8.2
“Plagues and the Paradox of Progress provides a smart warning about our future without discounting the successes of our past.”
– Health Affairs
Plagues and parasites shaped migration patterns and the evolution of cities and states.
Viruses, bacteria, fungi, protozoa, and other microorganisms cause infectious diseases. These contagions influenced human history and shaped the development of cities, trade routes and wars.
Ancient hunter-gatherer societies lived in small bands of 30 to 40 individuals. These foragers moved frequently, and their tribes were scattered over wide areas, providing fewer opportunities for germs to spread. The establishment of villages and towns put more people in close contact, which made contagion more likely. The rates of infectious disease increased as groups of people shifted to growing crops and building settlements.
“No other development over the last 100 years – not world wars, not the internet, not the spread of democracy – has had as widespread and transformative an impact on the human experience as the decline in infectious diseases.”
The development of agriculture added to infectious disease hazards, but it also promoted better health. Farming led to larger quantities of available food. More food enabled more people to stay in one area for longer periods of time. People had time to develop immunity to local microbes. Staying in one place allowed them to care for their sick, store food, and develop and use heavy ceramic pots, which improved the safety of cooking.
However, as people domesticated animals, that also created opportunities for infectious diseases to develop. Herding placed large numbers of animals in close contact with humans. Researchers believe cattle, goats and sheep passed measles, diphtheria and rotavirus to humans. Pigs and poultry may have spread trichinosis and influenza; camels may be to blame for the common cold.
Farming also resulted in larger communities, which produced still more food and, thus, more waste. Garbage attracted rodents and insects, which spread yellow fever, toxoplasmosis and rabies. Herding produced ditches that held pools of rainwater – the perfect breeding ground for mosquitoes that can carry malaria and other diseases. Food storage provided more available nutrition, but improperly stored food could cause salmonella poisoning, botulism and more. As towns and cities developed, they brought more people into close contact, so contagious diseases spread more easily from person to person.
Contagions pose the greatest risk to children and unexposed populations.
When an infectious disease strikes, people who lack previous exposure to the specific virus or bacterium suffer more severe reactions. Those who survive a disease often become immune to it.
When Christopher Columbus arrived in the New World in 1492, its native population had never been exposed to smallpox, measles, cholera, influenza, or many other diseases. Infectious disease – mostly smallpox and measles – killed 75% of the native population within 200 years.
“Understanding the role of plagues and parasites in world affairs still provides key insights into the evolution of the state, the growth and geography of cities, the disparate fortunes of national economies, and the reasons why people migrate.”
Throughout history, isolated communities faced germs brought to them via exploration, trade and conquest. Children in particular have no prior immunity to disease agents, so such illnesses hit them hardest. In eras when parents faced a high risk of losing children to rampant contagions, mothers bore more children to compensate. Such factors shaped women’s societal roles and education rates.
Today, controlling infection depends on government action. Prevention is possible only when neighbors, communities and nations and their governments work together. Unified action matters because epidemics affect all facets of a community beyond its health, including its economic life and future growth.
Infectious diseases spread fastest in dense populations, and they have compelled governments to take action.
As towns formed, more people lived indoors and clustered closer together without direct sunlight or air circulation. Such damp and dark conditions can support tuberculosis, leprosy and influenza. More clusters of people led to more human waste, which can seep into the water supply or fester on the ground – leading to hookworm, cholera and schistosomiasis.
The growth of communities and increased trade among them led to epidemics of diseases in Europe, beginning in the 1330s. A plague known as the Black Death came out of Central Asia and spread along the Silk Road and sea routes. By 1353, a third of Europeans had fallen victim to the plague. It resurfaced several times over the following centuries, sustained by the dense populations of fleas and rats that spread the disease.
In the 19th century, textile mill and factory jobs led American and European laborers to relocate to urban areas. Many workers lived in horrendous conditions in hastily constructed buildings with poor ventilation. The smoke and soot from the factories kept people inside, where they grew ill due to poor-quality air, water and sanitation.
“The role that malaria has played in history has been as more of a disease of colonies – a hindrance to exploration and settlement – than a help to military conquests.”
Public flush toilets first appeared in US and European cities in 1810. In 1856, New York City had one toilet for every 63 people. Wastewater overflowed into the streets along with trash, sewage and pollutants from tanneries and butcheries. Thousands of pigs, goats and dogs roamed the streets, feeding on garbage and filth. New York City’s 150,000 horses left 30 pounds (14 kg) of manure a day per horse on the streets during the 19th century.
Infectious diseases killed many migrants seeking work in European and American cities. Respiratory and waterborne infection rates multiplied. Tuberculosis – also known as consumption – was the most frequent cause of death in industrialized countries during the 1800s. The discovery of pasteurization in 1865 helped reduce the bovine form of tuberculosis that spread among milk drinkers.
Cholera killed far fewer people than tuberculosis, but it attacked healthy people suddenly and dramatically. Public fear of cholera led people to organize local and national associations to promote sanitary practices, and many local governments authorized urban sewer systems. Improved infrastructure meant fewer people died.
Underdeveloped countries have only recently enjoyed the decrease in infectious disease and the increase in child survival rates that occurred in developed countries in the 1800s and early 1900s. By the mid-1900s, urban development was accelerating in poorer nations. Growth places people and resources close together, which facilitates the exchange of ideas, entrepreneurship and – unfortunately – the spread of disease.
“The public health bodies required to establish and maintain effective sanitation systems have continued to drive reductions in infectious disease and change the role of government in the daily lives of people.”
Cities enable people to assemble, share grievances and mobilize supporters. The fast growth rate, larger size and sprawling geography of poor cities increase the incidence of violence, instability and uprisings. In the underdeveloped world, poor cities are more likely to emerge in countries with autocratic governments, but impoverished cities also pose a danger to dictators. The social disruption and violence in these cities “may not necessarily be a bad thing in the long run,” since such social movements can eventually force democratic reform, improved government institutions and, subsequently, healthier conditions.
Throughout history, contagious illnesses did tremendous damage and reshaped societies.
Malaria had a major impact on conquest and warfare and their outcomes. It prevented the Han Dynasty (202 BC–220 AD) from expanding into China’s Yellow River flood plain and the Yangtze Valley. During the American Civil War, more than a million soldiers suffered from the disease. During World War I, malaria affected troop movements in Macedonia, East Africa, Mesopotamia and the Jordan Valley.
By the 1880s, better drainage systems reduced the number of mosquitoes in malarial breeding grounds. The US Public Health Service studied malaria control methods to determine the most cost-effective strategy. It also sought public support. Beginning in 1912, research showed the benefits – including increased worker productivity – of draining mosquito habitats.
Federal regulations and lawsuits required the US hydropower industry to change the design of dams to eliminate standing water. By the time the US Malaria Control in War Areas program – which became the US Centers for Disease Control and Prevention – implemented newly developed pesticides, malaria rates had dropped significantly due to drainage efforts. During the Great Depression of the 1930s, the United States counted more than a million malaria cases, but the nation eliminated the disease by 1952.
Vaccination campaigns have shown great success.
Shortly after English physician Edward Jenner’s discovery of a smallpox vaccine in 1796, the Spanish and British Empires began vaccination campaigns in poorer countries. British officials selected orphans as living carriers to relay the smallpox vaccine to Bombay in their bodies. The children received the live vaccine by having it rubbed into a series of small cuts.
After an incubation period of nine to 10 days – during which they were transported to India – the children’s cuts erupted into lesions filled with fluid containing the live virus. Health care workers used the fluid to vaccinate other people. Subsequent vaccination campaigns used calves to transport the vaccine until the advent of heat-stable, freeze-dried versions decades later. The vaccination programs, which may have protected half a million people, generated goodwill, improved the productivity of the colonies’ workforce, and protected expat Spanish and English officials.
Measles is so contagious that people can catch it simply by breathing near someone who has it. Complications such as encephalopathy can turn measles deadly. Prior to the invention and deployment of the measles vaccine, thousands of American children died or were left mentally handicapped each year. The vaccine saves 1.5 million lives a year. The advent of vaccines for measles, mumps and rubella in the 1960s improved children’s lives in the Western hemisphere. Some opponents of the vaccines claim a connection between vaccination and autism, but that claim has been completely discredited. Some parents still forgo vaccinations, a refusal that caused a resurgence of measles in the United States, with the number of cases jumping from just 63 in 2010 to 667 in 2014.
As of 1974, fewer than 5% of children in lower-income nations had access to these vaccines. Measles and other preventable diseases were killing more than five million children per year. In 1982, the World Health Organization (WHO) and the United Nations International Children’s Emergency Fund (UNICEF) began the Child Survival Revolution program. The goal was to vaccinate 70% of the world’s children against the six major childhood diseases within an eight-year span, aiming to halve the number of children who died of them.
The vaccine program had to convince despots such as Jean-Claude “Baby Doc” Duvalier of Haiti and Mengistu Haile Mariam of Ethiopia that its campaign would make them popular. In conjunction with additional UNICEF programs promoting clean water and good nutrition, the campaign met its goals, saving the lives of an estimated 25 million children.
Meningitis, an acute inflammation of the brain and spinal cord, thrives almost exclusively in the climates where some of the world’s poorest people live – especially in West Africa. Cold winter winds drive people indoors, crowded into shelters. The dry climate parches throats, and the type A Neisseria meningitidis bacterium spreads easily.
“All this progress that has been achieved against plagues and parasites is one of the great achievements of humankind, but it is also spurring new and daunting challenges.”
In the late 2000s, the World Health Organization, the US government, and the Bill and Melinda Gates Foundation supported efforts to conquer meningitis and other “neglected diseases” that ravage the world’s poorest people. In 2010, PATH – an NGO – developed a type A Neisseria meningitidis vaccine and, by 2014, health care workers had immunized some 215 million people in West and Central Africa. Since then, no new cases of type A meningitis have been reported among those vaccinated.
In the 20th century, HIV/AIDS became a foreign policy priority.
From the beginning of the 1980s until the end of 2016, the HIV epidemic led to 35 million deaths worldwide. The majority of cases occurred in Africa. In 1998, controversy erupted as the public became aware that although the world’s wealthier countries had access to AIDS treatment, many people in sub-Saharan Africa didn’t.
“The risk of dangerous disease events is ever-present with the continued emergence of new plagues such as HIV/AIDS and the evolution of existing microbes, especially influenza.”
Public investment in the health of people in developing countries increased. Charities and governments raised billions of dollars to research, develop, and distribute medicines to the world’s poorest people. Competition, public protests against the high cost of treatment, and drug companies’ voluntary price cuts drastically reduced the price of the antiretroviral medications now helping bring AIDS under control.