6. How Does Technology Change Us?

We are mirrored by our machines…
we mirror our machines.
The question is not whether
we should let them change us,
but whether we are to be lifted up
or dragged down in the process.

John Lienhard

The pilot of a small propeller plane empties his bottle of Coca-Cola and drops it out the window. It whistles briskly down to the barren and unpopulated Kalahari Desert in Africa…unpopulated except for the Bushmen.

The Bushmen, with their distinctive language of clicks and whistles nearly irreproducible by Western mouths, find the glass bottle intact and presume it a gift from the gods. The bottle is unlike anything they have ever seen. Clear like water, yet harder than any of their sticks or bones (their territory has no stones), the bottle becomes immediately useful for grinding seeds and grains for food. It also whistles when blown across, helps to stretch hides, and stamps perfectly round circles of colored pigment onto those hides as decoration. The new technology changes how they prepare food, play, and create art.

But while they have multiple bows, arrows, hides, and other possessions, they have only the one bottle. In a tribe that shares everything, this scarcity inspires competition, jealousy, and violence previously unknown. Convinced that a gift that causes discord must be a mistake, the tribe’s leader throws it back to the gods. The gods apparently refuse it, allowing it to fall back to earth, accidentally striking a child on the head. Since the gods won’t accept it back, the tribe’s leader buries it. Unfortunately, the scents they have left on it attract hyenas that dig it up, leaving it in plain view.

This story of how technology changes us comes from the movie The Gods Must Be Crazy. But technology need not be as novel as a Coke bottle in the Kalahari to have a profound effect on us. Throughout history, it has changed our lifespan, work, thought process, and the very nature of our species:

  • By improving nutrition, enhancing sanitation, and fighting disease, technology has extended average human lifespan from 20 to 30 years in prehistory to about 78 in the modern developed world. It has also left us more vulnerable to disease by concentrating us into cities, providing more calories than we need, and allowing us to become sedentary. Still, the overall trend has been toward longer lives, and that has changed how we lead them.
  • The planting stick and the plow changed how we worked to feed ourselves. The surplus of agriculture allowed specialization, leading to occupations beyond hunting, gathering, or farming. Technology has made some jobs obsolete even as it has created new ones. Cars and trucks eliminated demand for blacksmiths and a host of horse-related occupations, while creating work for mechanics, engineers, oil riggers, and car washers.
  • Centuries ago, what people considered “fact” was only what they or close friends had witnessed. That changed with the printing of books and newspapers. Now our perception of reality is also influenced by 24-hour television. Through technology we also find out how we are doing and if we should be worried. Drugs such as methamphetamine that simulate the sense of accomplishment can further modify our perception of reality. Future technology may incorporate that mechanism into a dangerously seductive virtual reality.
  • In a world governed by “survival of the fittest,” technology has changed what “fittest” means and, so, it has affected the evolution of our species. Those who threw spears accurately or worked within the growing social network tended to survive and pass their genes along. Eyeglasses and medical treatments made many physical conditions irrelevant in determining who could propagate their genes. With genetic engineering, changes to our species may no longer be slow. Physically incorporating technology into our bodies may have even greater impact.

Technology is part of our environment, and as our environment changes, so do we. In the previous chapter, we looked at how technology changes as if we were able to stand outside its field of influence. But the observer and the observed are inseparable, and part of our understanding of technology must include how it changes us. We start with investigating lifespan and what technology has done to change it.

Ever since we crawled out of the ocean
and stood upright on the land
There are some things that we just don’t understand:
Relieve all pain and suffering
and lift us out of the dark
Turn us all into Methuselah—
But where we gonna park?

Don Henley

Methuselah’s Burden

The patriarch Methuselah was reputed to have lived 969 years. If true, that would have been much longer than the 20- to 30-year average of human prehistory. Since then, average lifespan has greatly increased, with technology both helping and hindering its progress. If the trend continues, we may create a world in which Methuselah would be average. Some of the technological effects on human lifespan:

  • Agricultural technology (harvesting sickles, scythes, plows, irrigation, crop rotation, chemical fertilizers, selectively bred plants, pesticides) created denser populations.
  • Transportation and communication technology (trucks, trains, ships, airplanes, telecommunications) connected populations, creating a ripe environment for infectious and parasitic diseases.
  • We combated those diseases with advances in sanitation and medicine. Water treatment, vaccines, and a host of related technologies greatly increased average lifespan, which in turn exposed us to the diseases of old age.
  • New technology addresses the degenerative afflictions of the old: circulatory diseases, cancer, diabetes, and the general breakdown and mutation of our cells.
  • Technology still in the conceptual stage could augment our natural immune and regeneration systems to the point of giving us near immortality.

Start with agriculture. About 12,000 years ago, humans used sickles made with flint blades and stone handles to harvest naturally occurring grains. Since then, we have made our environment increasingly specialized, changing an ecosystem that happened to support humans into one designed exclusively for feeding them.

The yield of food from a given area shows this progress. Slash-and-burn agriculture with primitive implements, such as the scythe, produces 225 to 450 kilograms of cereal grain per hectare of land. This approach was typical of Northern Europe during the Bronze and early Iron Age. Add animals for plowing and slaves for weeding, and the harvest increases to 500 to 750 kilograms. Add the heavy plow, which dug deeper than earlier plows (turning the soil over instead of just parting it), and the yield rises to 600 to 900.

And where the soil is so light that even a curved stick will turn it (as in irrigated desert agriculture), yields reach 1200 or more kilograms of cereal grain per hectare. Sometimes much more: the Nile River basin produced as much as 2500 kilograms of cereal grain per hectare of land during Roman control. The key there was fertilizer: the Nile brought potash from the Abyssinian Plateau in Ethiopia and decomposing vegetation from Lake Victoria all the way down through what is now Lake Nasser.

Where soil had no such natural mechanism for fertilization, people developed techniques such as crop rotation. Planting nitrogen-fixing legumes after a grain harvest restored the soil so that it could support another planting of grain. In England this produced 1500 kilograms of cereal grain per hectare of land.

To do better than that required nitrate fertilizers, which allow a hectare to produce 2000 to 3000 kilograms. Breeding new varieties of cereal crops that can absorb more fertilizer (without becoming overstimulated) pushes yields to 3000 to 6000 and more.

Specialization of ecosystems in the East paralleled that in the West. In Japan, rice production advanced from less than 1300 kilograms per hectare under primitive farming to well over 5000 using irrigation and other modern innovations. And now the world is one, with pesticides and genetically modified crops available—if not necessarily affordable—to all. Boats, airplanes, trucks, and trains move these agricultural technologies, as well as the foods they help produce, around the globe. Between hothouses and the global transport network, fresh fruits and vegetables are available year round.

Increasing food production from several hundred kilograms per hectare to more than 30 times that has led to population increases of even greater proportions. Population not only increased, but also became much denser because farming surplus allowed people to congregate in cities, specializing in non-farming activity.
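Gathering those scattered yield figures in one place makes the progression easier to see. Below is a minimal Python sketch that simply tabulates the ranges quoted above and checks the arithmetic behind “more than 30 times”; the “several hundred” baseline is taken as roughly 200 kilograms per hectare purely for illustration.

```python
# Cereal yields in kilograms per hectare, as quoted in the text.
yields_kg_per_ha = {
    "slash and burn, primitive implements":  (225, 450),
    "animal plowing plus weeding":           (500, 750),
    "heavy plow (turns the soil over)":      (600, 900),
    "irrigated desert agriculture":          (1200, None),  # "1200 or more"
    "Nile basin under Roman control":        (2500, None),  # "as much as 2500"
    "crop rotation with legumes (England)":  (1500, None),
    "nitrate fertilizers":                   (2000, 3000),
    "bred varieties plus fertilizer":        (3000, 6000),  # "and more"
}

for method, (low, high) in yields_kg_per_ha.items():
    print(f"{method:40s} {low}" + (f"-{high}" if high else "+"))

baseline = 200          # "several hundred" kg/ha: an assumed round figure
print(6000 / baseline)  # 30.0 -- the text's "more than 30 times that"
```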

Thousands of years ago, when all this was first happening, we could not have known that urban concentrations would be attractive environments for infectious disease and, in particular, epidemics. Now we know three reasons they are: sanitation, density, and size.

1. For most of the time we have had cities, they have been unsanitary, with garbage and sewage spilling out onto streets and waterways. Although domestic bathroom plumbing first appeared 4000 years ago (on the island of Crete) and the modern flush toilet was invented in 1884, major cities today continue to dump untreated sewage. For instance, Karachi, a Pakistani city of 15 million, releases sewage directly into the Lyari River, and most of its residents lack access to clean water. Cholera and other diseases spread through fecal contamination of water.

2. Disease loves cities because new hosts are packed close together. Whether a pathogen is transmitted through direct contact or through intermediaries like rats, fleas, or contaminated water, it finds new victims far more easily than it would in sparsely populated rural areas.

3. Epidemic diseases such as smallpox, measles, cholera, and influenza leave their hosts either immune or dead. These diseases are on “sinking ships” and must find fresh hosts to survive. Fortunately for them—and unfortunately for us—cities support large populations with enough newborns and immigrants to keep all but the most deadly epidemics circulating.

A devastating example of epidemic diseases requiring a minimum population comes from the Faeroes, a small group of islands between Scotland and Iceland. The islands suffered a measles epidemic in 1781, but, with fewer than 10,000 inhabitants, they could not support that disease. With many dead and all the survivors immune, measles died out, disappearing for 65 years until a visitor from mainland Europe reintroduced it.
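The Faeroes episode is a textbook case of an epidemic burning through a closed population, and even a crude model reproduces it. The sketch below is a toy deterministic SIR (susceptible-infected-recovered) model in Python; the transmission and recovery rates are invented for illustration, not calibrated to real measles data.

```python
def epidemic(S, I, R, beta=0.4, gamma=1/8):
    """Toy SIR model: run until fewer than one person remains infected.

    beta (daily transmission rate) and gamma (daily recovery rate) are
    assumed values for illustration, not real measles parameters.
    """
    day = 0
    while I >= 1.0:
        n = S + I + R
        new_infections = beta * S * I / n
        recoveries = gamma * I
        S, I, R = S - new_infections, I + new_infections - recoveries, R + recoveries
        day += 1
    return day, S, R

# 1781: measles reaches a nearly all-susceptible population of ~10,000.
days, never_infected, immune = epidemic(S=9999.0, I=1.0, R=0.0)
print(f"burns out after {days} days: {never_infected:.0f} untouched, {immune:.0f} immune")

# 65 years on, the immune have largely been replaced by susceptible
# descendants, so a single visitor can restart the cycle.
days2, *_ = epidemic(S=never_infected + immune, I=1.0, R=0.0)
print(f"reintroduced measles burns out again after {days2} days")
```

With everyone either dead or immune, the infection has nowhere to go; only a large city, constantly resupplied with susceptible newborns and immigrants, keeps such chains of transmission alive.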

Some diseases are unable to survive in even larger populations. A direct overland trade route between China and Europe brought, among many other things, bubonic plague or “Black Death.” Killing one in four, it burned out even among a population of millions. And, by the end of World War I, the world was so well connected that an influenza pandemic killed 21 million before it ran out of victims. In the 21st century, transportation technology connects almost seven billion people into a community of hosts large enough for epidemics even more deadly than that. People now circle the world in much less time than it takes for most diseases to kill their host…or even display symptoms.

In the chapter on where technology comes from, we noted the great benefit of concentrated (and, by the same argument, connected) populations in allowing one innovation to trigger another. Like disease, ideas thrive in environments where many hosts can share them.

The people suffering in filthy, disease-ridden cities shared ideas about improving the situation. Two of the most significant areas of improvement were in sanitation and medicine. Water wells, chlorination, and waste treatment have enhanced sanitation, protecting us from water-borne diseases like cholera; sanitation in food-handling practices protects us from food-borne bacteria like salmonella. Antiseptic techniques, quarantine, antibiotics, vaccines, public education, and the sharing of medical knowledge have also extended our average lifespan.

And human lifespan has lengthened remarkably. As we noted earlier, thousands of years ago, human lifespan was between 20 and 30 years. Between 1800 and 1900 in England, it climbed from the mid-30s to about 50, and then beyond 78 by 2004. A major factor has been childhood mortality—dying before the age of five. In prehistory it was as high as 50%. Even if the surviving 50% had all lived to 50, which they did not, average lifespan would have been little more than 25. Worldwide childhood mortality dropped to 19.7% by 1960, 8.3% by 2000, and 8.1% by 2002. Angola and England continue to bracket this, holding steady in recent years at 26.0% and 0.7%, respectively.
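The arithmetic behind “little more than 25” is easy to check. A minimal sketch, assuming for illustration that children who die before five do so at an average age of about two:

```python
child_mortality = 0.50   # prehistoric childhood mortality, as high as 50%
age_at_child_death = 2   # assumed average age at death for those dying young
adult_lifespan = 50      # the text's generous "even if survivors all lived to 50"

average = (child_mortality * age_at_child_death
           + (1 - child_mortality) * adult_lifespan)
print(average)  # 26.0 -- "little more than 25"
```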

Child mortality does not affect everyone equally, but tends to weed out those who are weakest. When it is high, mostly the strongest survive to old age. When technological advances cause childhood mortality to fall, we might expect the swelling ranks of older people to include many who are so weak that they succumb to other ailments. And, yet, life expectancy at older ages has increased where child mortality has dropped. In the U.S., a 65-year-old living in 1960 could have expected another 14.3 years; one living in 2000, another 17.9 years (life expectancy at older ages differs from life expectancy at birth, which is now 77 in the U.S.). So, while the largest improvement to average lifespan has come from reduced child mortality, older people are living longer, too.

There is much room for progress in improving sanitation and medicine around the world. In 2002, 1,798,000 people died from diarrhea, 1,566,000 from tuberculosis, and 1,272,000 from malaria. Where modern sanitation or medicine is absent, lifespan is short (e.g. less than 37 years in Angola and 36 years in Zambia), largely due to the absence of these technologies and the presence of others. Weapons technology has made it easier to wage civil wars, which shorten lifespan both directly and by preventing development of the infrastructure needed to deliver clean water, safe food, vaccines, and basic medicine. Conflict also displaces people, and refugees can neither farm nor readily conduct commerce.

In the developed world, free of civil war and generally replete with food, medicine, and basic sanitation, lifespan faces new challenges. More food does not necessarily lead to longer life. Not only can we consume calories with little nutrition, we can consume so many calories that we become vulnerable to diseases of overnutrition. Obesity contributes to both circulatory disease and diabetes. So while our technology can provide almost any foods, many of those in a position to choose opt for fried animal products high in saturated fat and processed foods high in refined sugar. Such lifestyle choices reduce average lifespan, though a variety of drugs and interventions can keep us alive, if neither healthy nor independent. Perhaps surprisingly, some of the longest-living individuals subsist on low-calorie diets.

For those who make the healthiest lifestyle choices, there are still limits to lifespan. Genetic imperfections can lead to a variety of ailments, including heart disease. And even without genetic problems from birth, each year of life exposes us to genetic mutations. These come from ultraviolet sunlight (melanoma), radiation and chemicals (leukemia), and time (various cancers and old age). The human body has 50 to 75 trillion cells, many of which are replaced on cycles of hours (white blood cells live about 10 hours) or days (skin cells live 19 to 34 days). Every cell division carries a chance of DNA copying errors, so the longer we live, the more mistakes we accumulate. Usually, the body detects mutated cells and destroys them, but there is always the chance that some will be missed and become cancerous.

Time may also play a more basic role. Some human cells divide only a certain number of times, with a telomere sequence at the end of each chromosome counting down the divisions. On average, the skin cells of younger people have longer telomere sequences than those of older people. This mechanism may help prevent the runaway division found in cancer, whose cells divide without limit, but it may also impose a limit on human lifespan. Whatever further research concludes about the role of telomeres, the human body may have various built-in counters, limiting the lifespan of even a perfectly nourished, disease-free human.
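One crude way to picture telomeres is as a countdown built into each cell lineage. In the toy Python sketch below, the starting length and the cost of each division are invented numbers, purely to make the counting mechanism concrete; real telomere dynamics are far messier. The rebuild parameter loosely models telomerase, the enzyme cancer-like cells use to restore the telomere and divide without limit.

```python
def divisions_until_limit(telomere_length, rebuild=0, max_count=10_000):
    """Count divisions before the telomere runs out (all numbers illustrative)."""
    count = 0
    while telomere_length > 0 and count < max_count:
        telomere_length -= 1        # each division trims the chromosome end
        telomere_length += rebuild  # cancer-like cells restore what was lost
        count += 1
    return count

print(divisions_until_limit(60))             # a "younger" cell: 60 divisions left
print(divisions_until_limit(20))             # an "older" cell: far fewer
print(divisions_until_limit(20, rebuild=1))  # runaway division, capped at 10,000
```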

This means that technological progress in nutrition, sanitation, and disease can help spread our “best practices” to the billions still without them, but may do little for those already benefiting. However, advances in emerging areas, such as genetic engineering and nanotechnology, may break this lifespan ceiling.

If nanotechnology develops to the point where robots just hundreds of nanometers on a side (nanobots) could patrol our bloodstreams, then that might lead to the next leap forward in lifespan. Imagine millions of submarines, each smaller than a blood cell, programmed to identify unhealthy cells and destroy them. Our natural immune system is remarkably resilient, developing new antibodies when exposed to new threats, but the environment in which it evolved has changed and continues to change at an accelerating rate. Nanobots could help us keep up, downloading distinguishing characteristics of the latest viruses or forms of cancer and protecting us from afflictions even before exposure.

Our individual immune systems would, in effect, be networked globally. The World Health Organization, the U.S. Centers for Disease Control and Prevention (CDC), or some similar entity could distribute “fingerprints” of viruses, cancers, or new bioweapons as soon as they were identified. To extend lifespan further, though, it may not be enough to eradicate the dangerous elements from our bodies. We may also have to develop ways to promote healthy development of replacement cells.
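Sketched as software, what this imagines is essentially a signature-distribution protocol, much like the virus-definition updates computers already receive. Everything in the sketch below (the Fingerprint record, the swarm’s update and flags operations) is hypothetical, invented only to make the architecture concrete.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Fingerprint:
    """A distinguishing marker for a pathogen or cancer variant (hypothetical)."""
    name: str
    marker: str  # stand-in for whatever molecular signature nanobots would match

@dataclass
class NanobotSwarm:
    """One person's bloodstream patrol and the signatures it can act on."""
    known: set = field(default_factory=set)

    def update(self, bulletin):
        """Download the latest signatures from a central registry."""
        self.known |= bulletin

    def flags(self, cell_marker):
        """Would this swarm recognize a cell bearing this marker?"""
        return any(fp.marker == cell_marker for fp in self.known)

# A WHO- or CDC-like registry publishes one bulletin; every subscriber is
# protected before any local exposure -- the "networked immune system" above.
bulletin = {
    Fingerprint("influenza-variant", "HA-x17"),
    Fingerprint("engineered-bioweapon", "syn-9"),
}
alice, bob = NanobotSwarm(), NanobotSwarm()
for swarm in (alice, bob):
    swarm.update(bulletin)
print(alice.flags("HA-x17"), bob.flags("unknown-marker"))  # True False
```

The property worth noticing is that learning happens once, centrally, and protection then propagates as pure information—the same property the sidebar below observes in chess programs and bacteria.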

Chess-Playing Computers and Bacteria

A computer programmed to play chess illustrates the power of being able to share what we learn. Although it may take a long time to program a computer to play grandmaster-level chess (just as it takes human grandmasters a long time to learn), that program can then be copied to millions of other computers. Once a process is defined, refined, and automated, the algorithm to perform it is replicable and shareable. It is just information.

By contrast, our biological immune systems learn but cannot readily share that learning with the immune systems of other people (the exception: a mother does confer temporary immunity on her infant). Imagine if whatever one person’s immune system learned could be shared with millions of others. As fantastic as it sounds, the sharing of biological immunity happens today, though only among bacteria.

Because they do not shield their genes within a double-walled nucleus, bacteria can share strings of their genome with each other. Once a bacterium develops resistance to an antibiotic (perhaps by random mutation), it may share the genetic plan for that resistance with its neighbors, and they with their neighbors. Bacteria that incorporate something as useful as resistance to penicillin are more likely to live, multiply, and share that genetic information. Similarly, those incorporating genetic information harmful to themselves are more likely to die and be unable to share it with others.
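A toy simulation shows how quickly such sharing can spread a trait. In the Python sketch below, a single bacterium acquires resistance by mutation, and each round every resistant bacterium may hand the gene to a random neighbor; the 30 percent transfer chance and the colony size are invented for illustration.

```python
import random

def spread_of_resistance(pop_size=100, rounds=12, transfer_chance=0.3, seed=1):
    """Toy model of horizontal gene transfer in a bacterial colony."""
    rng = random.Random(seed)
    resistant = {rng.randrange(pop_size)}  # one random mutant gains resistance
    history = [len(resistant)]
    for _ in range(rounds):
        for donor in list(resistant):
            if rng.random() < transfer_chance:   # gene handed to a neighbor
                resistant.add(rng.randrange(pop_size))
        history.append(len(resistant))
    return history

# Resistant count per round: roughly exponential growth until saturation.
print(spread_of_resistance())
```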

Plants and animals, including humans, cannot do this naturally because our cells, which are eukaryotic, do shield their genes within a double-walled nucleus. Eukaryotic cells evolved from prokaryotic cells (such as bacteria) about 1.5 billion years ago. Technology may restore to us an ability to share immunity that we lost long ago—this time operating not just locally but globally, and at the speed of light.

Rather than begin a detailed technical analysis of how nanobots could function, or worry about how that technology could go horribly awry, let’s consider the implications if nanobots or some other technology could greatly extend lifespan. Over a long period of time, lifespan has trended ever higher, and there is little reason to believe that trend would suddenly stop.

So, suppose some humans started living far longer than any do now. Human culture and mythology are aligned with limited life. Although lifespan has more than doubled from less than 30 years in prehistory, this has been a very slow process and we have had time to become accustomed to patterns of childhood, adolescence, discovery, settling down, reevaluating life, creating a legacy, and reconciling with death.

We have already noticed some changes in less than the past century. People are living well beyond the usual age for retirement from work. When lifespan averaged 60, many died shortly after they left their jobs. Now, many are reinventing themselves, going back to school or starting entirely new careers or new families. If some technology makes human life indefinite, how will people cope? Will they exploit the ever-growing percentage of life that can be directed by experience and wisdom—avoiding the joke “I just figured it out and now I’m too old to do anything about it”? Or will indefinite life sap the urgency to seize the day?

The more important question to some: will we be healthy enough to enjoy those extra years? The “opportunity” to live an extra few hundred years connected to life-support machinery is appealing only to those in abject terror of death. Independence and vitality until we choose to conclude our lives may be the only conditions under which many would opt for far longer lives.

The longer people live, the longer they tend to wait before having children, and the fewer children they have. The highest birthrates are in the poorest countries, which also have the highest infant and overall mortality rates. Birthrates in some developed countries are insufficient to balance death rates: Italy averaged 8.9 births and 10.1 deaths per 1000 people in 2002, so were it not for immigration, Italy’s population would shrink (the arithmetic is sketched below). Would near-immortality cause population to explode? Or would decreased death rates further depress birthrates, keeping a balance?
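A quick check of Italy’s natural population change; the population figure is an assumption, roughly Italy’s size at the time, used only for illustration.

```python
births_per_1000 = 8.9    # Italy, 2002 (from the text)
deaths_per_1000 = 10.1
population = 57_000_000  # assumed: roughly Italy's population around 2002

natural_change = (births_per_1000 - deaths_per_1000) * population / 1000
print(f"{natural_change:+,.0f} people per year")  # about -68,000 without immigration
```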

How would limitless lifespan affect education and work? How would it affect our systems of government? Or family relationships, which could grow far beyond three or four contemporary generations? How would 20-year-olds and 700-year-olds interact when they appear the same age? Would society segregate by age so that experienced and cunning 500-year-olds do not prey on the young and innocent? Would we find that nearly everyone who has lived for centuries loses interest in those who have not? After all, they may have little in common. Or, would wildly unpredictable reactions come of such diversity?

Although homicides and suicides are committed for a variety of reasons, their rates of incidence have followed long-term trends along with health and longevity. In the year 1300, England had nearly 20 homicides but fewer than one suicide per 100,000 inhabitants per year. Homicide rates dropped and suicide rates increased, crossing around 1700 at about five per 100,000. By the year 2000 in the United Kingdom, homicides had fallen to 1.3 and suicides had climbed to 8.7. Would an immortal be much more likely to commit suicide than to kill someone?

Longevity affects us in many ways. Is it possible to live 1000 years without perceiving the Universe differently? Until recently, such thought games have been of use only in intellectual circles to probe abstract ethical and moral issues. But with some technologies changing at accelerating rates, as we noted in the previous chapter, we may live to see our questions answered.

 

This webpage is adapted from the book
Technology Challenged: Understanding Our Creations & Choosing Our Future
available at Amazon