5. How does technology change?

 

To fathom the changes
our marvelous tools will undergo,
we must draw on insights from
those who study evolution
of life and society.

 

 

If technology did not change, this book might not be very important. Or interesting. Early in human history, glaciers melted faster than technology changed—or at least both processes were comparably imperceptible to us. The first known plow dates back more than 6,500 years, and more than 1,000 years passed after that before the first known wheel…plenty of time for people to forget that there had once been a time before the plow. And then the wheel probably crept onto the scene slowly and inconspicuously.

Even if there had been newspapers, no headline would have read, “First Plow, Now Wheel:  Change Rocks Civilization.”  No, this book and its search for patterns underlying the vast parade of technology would not have found an audience. Technology was then a static feature of the environment, and patterns in its nature, change, and evaluation must have seemed irrelevant. But times change and technology accelerates, so we find ourselves facing…

  • New weapons and new protections
  • New poisons and new medicines
  • Newly-obsolete industries and new careers
  • New assaults on our privacy and new forms of creative expression.

 

We need not remember 1000 years back because less than a century ago the world had no SCUD missiles, early-warning radar, weaponized anthrax, gene therapy, computer industry, TV commercials, or jazz quartets that burn and sell their own CDs. The faster that technology changes, the more likely we are to ask questions about it.

How do we explain, for instance, how technology becomes so inconspicuous?  Cellular telephones, once as magic as Dick Tracy’s wrist radio, are now completely unremarkable…unless one rings in a quiet theater. Some technology seems to be a solution in search of a problem. Isn’t necessity supposed to be the mother of invention?  In cases where the problem is clear, what causes some solutions to be adopted and others not?

How do we explain the blistering pace of improvements in electronics?  Today we can buy a computer that is twice as powerful as one costing the same just a couple years back. Is this an anomaly of our time?  Have any other technologies followed this pattern?  What can we learn from science and the study of biological change—evolution—that will help us understand technological change?  In this chapter we answer these questions by exploring five patterns:

  1. Disappearing technology: Technology can become so common that we stop noticing it (e.g. light bulbs). It can also be built into other technologies so we can’t see it (microprocessors).
  2. Necessity’s mother & daughter: An old saying goes, “Necessity is the mother of invention,” but the gasoline engine and the microprocessor show us that invention can come before we even figure out the need.
  3. Advantage, compatibility, risk, and visibility: What influences whether a new technology is adopted or not?  We find out why printing with moveable type succeeded with Gutenberg’s press in the 15th century, but failed several millennia earlier.
  4. Autocatalysis: Technology acts on itself to change itself, accelerating change, sometimes to an exponential degree. This applies not only to computers, but also to mechanical clocks and biology.
  5. Evolution & memes: Natural selection guides both biological change, based on genes, and technological change, based on memes.

 

And while the stories are interesting, the patterns will help us understand and evaluate today’s technology, and—when it comes—tomorrow’s, too. That last part is an important goal of this book:  learning enduring patterns so that we can more consciously create our future. This is different from predicting the future—something far more difficult.

For instance, Microsoft failed to predict the importance of the Internet, forcing its legion of brilliant people to play catch-up with Netscape’s web browser. In 1948 Thomas Watson, then head of IBM, predicted “a world market for about five computers.”  In 1977 computer pioneer Ken Olsen claimed, “There is no need for any individual to have a computer in his home.”  Predicting specific technologies and applications is a gamble. Influencing our future need not be.

We can make informed, thoughtful, critical, evaluative decisions throughout our lives—and, collectively, that is how we create our future. Context is the foundation for those decisions. Whether something new matches or differs from one of these patterns, the context we gain by recognizing those patterns makes it easier to identify, categorize, and understand the new. Let’s start with something familiar, something that you or I would not think twice about, but that stunned the Emperor of Brazil.

 

Disappearing Technology

In 1876, Dom Pedro, Emperor of Brazil, presided over the Centennial Exposition in Philadelphia. His long trip was made worthwhile when he saw Alexander Graham Bell’s demonstration of the first telephone. He was stunned, but so was everyone else. Squeezing someone’s voice into tiny metal wires, only to have it pop out on the other side of the building, was magic. But this was not a trick; it was the foundation for an industry.

A few years later, on New Year’s Day, 1880, crowds traveled to Edison’s laboratory in Menlo Park, New Jersey, just to see his new incandescent bulbs light up. Today, telephones and light bulbs blend into our landscape. Who even notices them?

In the 19th century an airplane passing overhead (years before the first one was invented) would have caused excitement and panic. Today one may notice the contrail from a jet, but even wide-eyed children soon learn to ignore the regular passing of passenger, cargo, and military aircraft.

Three centuries ago, England’s Parliament offered a prize equivalent to several million of today’s dollars for the invention of a timepiece that would work on a sailing ship. Both highly accurate time and star charts were necessary for locating oneself at sea; missing your destination and ending up lost at sea or on a strange, uninhabited shoreline could mean financial disaster or even death.

John Harrison, a clockmaker, invented the first clock impervious to the rocking and rolling of a ship. This was a tremendous accomplishment in the 18th century, but today, how many of us notice that we have a descendant of these miraculous devices strapped to our wrists?

 

How Quickly Technology Becomes Popular

Familiarity breeds a form of blindness, and technology is becoming familiar ever more quickly. The table below shows how many years passed before each new technology was adopted by 25% of the U.S. population. Late 19th century technologies such as electricity (which we date from when it was first available for purchase), the telephone, and the automobile took decades to reach the masses:  46, 35, and 55 years, respectively. But late 20th century technologies such as the personal computer, cellular phone, and the World Wide Web took about a decade:  16, 13, and 7 years, respectively. When every fourth person owns something, it is not surprising that it tends to blend into the background.

Product                  Year Invented    Years to Spread
Electricity                   1873               46
Telephone                     1876               35
Automobile                    1886               55
Airplane                      1903               64
Radio                         1906               22
Television                    1926               26
VCR                           1952               34
Microwave oven                1953               30
Personal computer             1975               16
Cellular phone                1983               13
World Wide Web                1991                7

 Take a look around you. How far back would you have to travel in a time machine with some of these familiar objects in order to be called a magician or witch?  How long will it take for today’s most advanced technology to become commonplace?  Of course this pattern of “familiarity leading to a form of blindness” is not specific to technology. The new catches our attention because, as animals, we are wired to notice rapid changes in our surroundings (e.g. hungry leopards). The new could be a threat or an opportunity, and it takes attention to evaluate it.

 

However, there is something different about how technology disappears. A bizarre new hairstyle or a practice like body piercing may become so common that it blends into the background, but technology is actually burrowing out of sight. Physical connections between telephones (i.e. cords) are disappearing with the increasing use of cellular phones and wireless computers. Microprocessors are disappearing into the engines of our cars (adjusting fuel and air mixtures, operating the antilock brakes, etc.) and the doorknobs of our hotels (for which you use a magnetic or punched card instead of a conventional key). Why is this?

One reason is that technology is assembled from parts and those parts are often other technologies. The microprocessor becoming a building block for cars fits this pattern. But there is more to this “burrowing out of sight” than that. In the chapter on why we use technology, we saw that our reasons for using it last much longer than the specific technologies themselves. Beyond the novelty value—which includes the temporary status that flaunting some new device may confer—most of us just don’t care about the technology. It might as well disappear and get out of our way.

Future bathrooms may recognize who enters, check their vital signs (e.g. blood pressure), remind them about taking medicine, and even contact health care providers. They may also monitor health by analyzing what we leave in the toilet. Accenture (a technology and services consulting firm) developed a prototype bathroom of the future that incorporated many of these features.

With its ubiquitous computing strategic initiative, IBM seeks to make computer technology hide in anything from milk cartons to bottles of prescription drugs, from sprinklers to door locks, from clothing to the toilet. A company called emWare produces $1 computers just intelligent enough to communicate over the Internet.

Connected computers like these, hidden in everyday objects, are now joining billions of embedded microcontrollers that have long established themselves inside factory machines, elevators, lighting/heating/cooling systems, and other places, and we have already stopped noticing them.

If future bathrooms analyze our toilet deposits to predict health problems, it will not be entirely new. Long ago, the Chinese Emperor had his stools checked by royal doctors to make sure he was in good health, but in our time there has been no visible groundswell for this luxury. If consumers are not demanding it, then why is someone inventing it?  Isn’t necessity the mother of invention?  Interestingly, it is sometimes the daughter.

 

 

It was by chance that the recorded sound
found a lasting commercial use.
With the business market for the phonograph
faltering [wax cylinder recordings did not replace
business letters], manufacturers scrambled
to come up with other applications. In 1889,
the first coin-operated phonograph was placed
in an arcade. For a nickel, listeners could hear
a two-minute recording.

— Mark Robinson

 

Necessity’s Mother & Daughter

The incandescent light bulb was invented before Thomas Edison was born. But it did not work well, which is why Edison, using improvements in vacuum technology and a few other tricks to make it practical, is credited with the first bulb. Widespread demand for safe and economical lighting made the light bulb an example of “necessity being the mother of invention,” but history shows that the relationship is often reversed.

Edison invented the phonograph to record people’s dying words or to replace business letters (which the earlier invention of the typewriter had started to transform). There was little demand for deathbed speeches or wax cylinder business correspondence (spoken letters). Instead, Edison lived to see entertainment, which he declared frivolous, take over his invention.

Ordinary people dropping coins into phonograph machines to hear a brief song was hardly what Edison had in mind, yet that led to huge industries based on records, tapes, and, eventually, compact discs. Listening to recorded audio (as well as the very similar technology of recorded video) has become so popular and pervasive that it has become nearly a necessity…and the phonograph was mother to that.

The modern computer industry has its own example of invention being the mother of necessity. The microprocessor was invented in 1971 to power a calculator. When the price of calculators plummeted from $250 in 1972 to just $10 a year later, the microprocessor was an invention in search of a need. This orphan has since found its way into hundreds of millions of homes, hiding in microwave ovens, TVs, video games, thermostats, and stereos, as well as the obvious personal computers. Prior to the microprocessor, these technologies were doing quite well without it (e.g. thermostats that turned on when it got cold, not an hour before our alarm clocks ring) or simply did not exist (e.g. personal computers). Since then, consumers have come to appreciate the new features that microprocessors allow, and manufacturers have come to appreciate the cost savings of controlling their product with a programmable device.

A story with twists and turns between necessity and invention concerns the steam engine, driver of the Industrial Revolution. Steam engines require lubrication or else they overheat and seize. Necessity. Mineral oil was distilled from petroleum to be that lubricant. Invention. Here, necessity was the mother of invention. However, mineral oil also led indirectly to the demise of the steam engine in a case of invention being the mother of necessity.

For years, distilling mineral oil from petroleum left a waste product:  oil too light to be a good lubricant. This dangerously flammable liquid was dumped into rivers or anywhere convenient. In the 19th century, inventors were hard at work improving both steam and natural gas engines, so eventually someone thought of using this light oil—let’s call it gasoline—for fueling an engine. Shooting drops of gasoline into an engine cylinder exposed only the surface of each drop to oxygen, something crucial to burning. Once the surface burns off, oxygen can reach a layer deeper, but that means a slow explosion going layer by layer, like peeling an onion. As it turns out, the solution to this problem was invented to solve a different problem.

Scientific experimentation with liquids led to the “spray atomizer,” which was quickly adapted to create a fine mist of perfume. Bathing was not common before central plumbing and hot water heaters, making perfume understandably popular. The medical belief that disease came from bad smells made development of the atomizer urgent. But atomizing a liquid into many very small drops is also exactly what gasoline needs to get maximum exposure to oxygen, and therefore the quickest and most energetic explosion.

Once the atomizer was recognized as an important part of the solution, it was adapted and refined into a new invention, the carburetor (a form of which we still have in our cars). As gasoline engine technology improved, the steam engine lost popularity (there were other factors contributing to its demise, but we need to return to our point). Necessity was the mother of the carburetor; by the time it came about there was a clear need for improved efficiency in the very-promising new engine. Still, the two grandmothers of the carburetor were not necessities, but, rather, other inventions: a distilling process that just happened to produce an explosive waste product and a type of engine that could use it.

Which came first, necessity or invention?  For technology in general, the answer is the same as for the riddle, “Which came first, the chicken or the egg?”  The chicken came before the egg, which came before the chicken. Technology satisfies needs that develop in response to capability, which develop in response to earlier needs. Our Cro Magnon ancestors of tens of thousands of years ago had no need for telephones, resealable sandwich bags, or accounting software. They had their own needs of communication, food, and organization, for which they invented solutions, which revealed new needs, and so on all the way to the present day.

By recognizing the natural swing between necessity and invention, we are better equipped to understand current and future changes. The next time we see a frivolous, but remarkable, technology, we may start looking around for sprouting necessities.

 

 

 

The four stages of response to
any new and revolutionary development:
(1) It’s crazy!

(2) It may be possible—so what?

(3) I said it was a good idea all along.

(4) I thought of it first.

– Arthur C. Clarke

Advantage, Compatibility, Risk, Visibility

If an invention can create its own necessity, why do some inventions become no more than an idea or prototype?  One reason is lack of infrastructure. Leonardo da Vinci envisioned the helicopter, but the strong, lightweight materials and power source were centuries away. The helicopter reached no farther than his notebook. In the chapter on how technology works we saw that the stealth bomber’s shape and dimension appeared decades earlier as the flying wing, but that without small, fast computers, the design was unstable in flight. The prototypes were not adopted by an air force. Is lack of supporting technology the only reason that inventions fail?  Once an invention clears this rather obvious technical hurdle, it faces the social hurdles.

Advantage, compatibility, risk, and visibility—technical and social factors—determine whether a viable technology spreads or withers. The success of the transistor over the vacuum tube illustrates these four factors:

  1. Transistors have the advantage of being smaller, requiring less energy, producing less heat, and lasting longer than vacuum tubes. A transistor is a solid sandwich of silicon or another semiconductor. A vacuum tube resembles its parent, the incandescent light bulb.
  2. Transistors amplify or switch electrical signals on and off, which is just what vacuum tubes do, so they sound compatible with that existing technology. In practice they were not, because manufacturing transistors and designing them into radios and televisions differed greatly from the processes established for vacuum tubes.
  3. Because the transistor performs the same function as a vacuum tube, it is simple to imagine replacing one with the other. This lack of complexity means that the new technology has a low risk of failure.
  4. The transistor’s advantages over the vacuum tube were easily visible. The pocket-size transistor radio introduced by Sony in the 1950s ran on batteries and became very popular. The much larger vacuum tube radio used so much energy that it had to be plugged into a wall outlet.

 

Transistors succeeded as a technology because they had advantages over vacuum tubes, those advantages were easily visible, and the risk of failure was low. These factors overwhelmed the lack of compatibility that transistors had with the vested interests of manufacturers and designers. So a technology need not triumph in all four categories in order to succeed.

In the case of the transistor, further advances showed how overwhelming its advantages truly were. Integrated circuits, invented a decade after the transistor, miniaturized entire electronic circuits of multiple transistors onto a single chip of semiconductor. Tens of millions fit into postage stamp sized areas and run for years without a failure.

By contrast, vacuum tubes cannot be miniaturized, in part because they use lots of energy, which produces lots of heat, which requires space for cooling. Their reliability has improved since the middle of the 20th century:  on average, one out of the 19,000 vacuum tubes in the ENIAC computer had to be replaced every seven minutes. But it has not improved enough to make practical a computer with tens of millions of tubes. Vacuum tubes do live on in giant form as cathode ray tubes (CRTs) in televisions and computer monitors. Ranging from 19 to more than 30 inches across, these displays were not threatened half a century ago by transistors, but are now by liquid crystal displays (LCDs), which have been used in wristwatches for decades.

These four factors are also a useful tool for understanding why printing with moveable type failed when it appeared thousands of years before Pi Sheng (11th century China) and Johannes Gutenberg (15th century Germany). Unearthed on the island of Crete and dating from 1700 BC, the Phaistos Disk is a clay disk imprinted repeatedly with 45 different symbols. Each symbol appears identically every time, indicating that it was probably carved into wood or cast into metal. Then, each time that symbol was needed, the carving or casting was pressed onto the disk.

Why didn’t printing with moveable type “take” way back then instead of waiting more than two millennia to encroach on writing by hand?  Because the Phaistos Disk failed all four tests:

  1. It lacked advantage because, with only a handful of people (scribes) who could read in each kingdom, there was little writing of any kind to automate and no market for mass-produced documents.
  2. Controlling writing was in the interest of the scribes. Why would they want to make it something that just anyone could do by pressing a symbol into moist clay? This made printing incompatible with vested interests.
  3. Carving or casting 45 different symbols into various materials to test the endurance of each was time consuming. Testing various types of clay and processes for firing it would have been complex. This increased the risk of failure and likely discouraged many people.
  4. Advantages were not visible because the process of carving or casting symbols and then pressing them into the moist clay was slower than writing on it with a stylus. Visibility would ultimately come with the demand for and capability of high-volume printing.

 

Many things changed by the 15th century. One of the most important was the availability of paper. Invented in China, the process of making paper traveled to the west, where it replaced vellum, the skin of lambs. A copy of the Gutenberg Bible would have consumed 150 lambs. Still, this “vellum” had been an improvement over hauling around heavy and fragile clay tablets. In Mel Brooks’ irreverent movie History of the World, Part I, Moses comes down from the mountain with three stone tablets. Just as he is about to announce the “Fifteen Commandments,” one tablet falls and shatters, leaving ten. If only rag paper had been available!  Paper was one of many reasons that Gutenberg’s printing press succeeded:

  1. Advantage: With thousands of literate monks spending their lives hand-copying manuscripts and many merchants in need of documents to facilitate long-distance trading, mass-production of written documents had great economic advantage. Just the century before, the Black Death (bubonic plague) had killed off cheap labor. Further, Europe was fascinated with machinery:  clockworks, locks, and water gardens with fountains shooting up to do “tricks.”  A machine to automate printing had prestige.
  2. Compatibility: The printing press was compatible with the values of the Church (printing Bibles, prayer books, and papal indulgences), the Protestant movement (spreading word of an alternative to the Church), and merchants (selling books of all sorts).
  3. Risk: Developing an infrastructure for mass printing would be time-consuming, complex, and, therefore, risky if done from scratch. But, by the 15th century, papermaking had spread from China, olive presses could press inked letters into paper, and metallurgy developed for clocks and door locks could also cast perfectly standard letters and symbols. Risk was low.
  4. Visibility: People could easily observe that printing copies of the Bible was far better than hand copying (each of which could have taken a monk his entire life). Further, the fixed costs of printing could be quickly recovered. Setting up to print a book cost roughly three times as much as having a scribe copy it, so savings would be realized once just a few copies were printed.

 

But we are getting ahead of the story. Why did printing with movable type not flourish in 11th century China?  Lack of advantage and compatibility. The Chinese writing system has thousands of characters. Casting them into stamps, storing several of each character (since a character can appear multiple times on each page), and setting up the press would have been an immense effort. An advantage of movable type is in replicating fewer symbols more times (the power of replication we noted in the previous chapter). Also, printing with movable type was not compatible with Confucian values, which prohibited commercialization of printing. The government in China could distribute documents, but merchants could not sell them.

Modern printing is done with computers, and we can look to a computer keyboard to see most of the symbols available. Where did  the keyboard’s layout come from?  Why are letters, numbers, and other symbols placed where they are?  The adoption of the present day computer keyboard illustrates a battle in which advantage and visibility lost out to compatibility and risk.

The keyboard on nearly every computer is similar to the one that was on nearly every typewriter. Called the QWERTY layout because of the letters across the top left, it was intentionally inefficient. It originated because early mechanical typewriters would jam if one typed too quickly. Inefficient placement of letters slowed typists so that they would not jam the machines, which used many metal arms, each with a symbol (or two, when used in conjunction with the shift key) in relief, arranged in a semicircle. Pressing a key caused an arm to swing up and hammer its imprinted letter against an ink ribbon and the paper.

But by the time the technology was no longer subject to jamming (with electric typewriters and the no-moving-parts word-processing computers we use now), many people had learned the QWERTY layout. So changing to a more efficient layout, such as Dvorak, would require learning a whole new layout. QWERTY and Dvorak split our four factors:

  1. Advantage: Although QWERTY minimized key jamming on manual typewriters, Dvorak allowed much faster typing on electric typewriters and computers. Dvorak wins.
  2. Compatibility: Few typists could touch-type on Dvorak keyboards, but a huge labor pool with QWERTY skill was ready for hiring by the same companies that had to choose which keyboards to buy. QWERTY wins.
  3. Risk: Trying a new system such as Dvorak is risky because it requires changing lots of equipment and retraining lots of people. It is much safer to just stick with what has long worked, however inefficient it might be. QWERTY wins.
  4. Visibility: The U.S. Navy reported that Dvorak was so much more efficient that retraining would pay for itself within 10 days, an easily visible advantage…if a business takes the risk of trying it for itself. Dvorak wins.

 

Just because each keyboard layout “won” two factors does not make the battle a tie. In this case, compatibility was so important that the technically superior Dvorak keyboard was marooned as little more than a prototype. Few use it and 21st century computer keyboard manufacturers continue to use the 19th century keyboard layout.


 

In cases such as keyboard layout, compatibility outweighs all other factors. Economists call this “lock-in” or “network effect,” which is what happens when a technology establishes itself so strongly that consumers are reluctant to switch, even to an apparently superior technology.

Lock-in is one factor in the success of the Microsoft Windows™ operating systems and of personal computers in general:  there are so many of them in the market that any new product—hardware or software—is bound to support them. Buying a niche operating system or machine exposes you to the risk that future add-on products will not be made available for you…and that the market might abandon your system entirely. Economies of scale also favor the entrenched party with falling prices, but even being entrenched is no guarantee when the economic advantages of a new technology are overwhelming.

Eventually, a technology with an advantage can threaten one with compatibility, as the inexpensive Linux operating system has come to slowly take some market share away from the omnipresent Windows. While it is impossible to predict the outcome of Linux vs. Windows, we do know how the 19th century “war of the currents” turned out.

Edison built the first electrical power plants to produce direct current (DC). This means that electricity always flows in just one direction, electrons coming out of one slot on your wall plug, going through your appliance to drive it, and back in the other slot. Nikola Tesla, an odd genius who went from rags to riches and back again, advocated alternating current (AC), which reverses the direction of electron flow many times each second (completing 60 back-and-forth cycles each second in the U.S.). The war between DC and AC had high stakes. Electricity would clearly revolutionize the world; selling it and all the devices that used it would be very profitable.

Edison, established as virtually a “god of technology” in America, had invented the many technologies necessary to efficiently generate electricity, distribute it, measure how much a customer used, and charge for it. Making matters worse for AC, electric motors that ran in factories needed DC. Everything was compatible with DC and very little with AC. What could change the status quo?

To answer that question, we need to explain a bit about electrical voltage, current, and power. Voltage measures the force pushing the electrons and current measures how many electrons are moving. The speed of a river is analogous to voltage and the quantity of water to current. Electrical power is the voltage multiplied by the current, so we can choose high voltage and low current or low voltage and high current without changing the amount of power delivered. The practical application of this comes into play when sending electricity over a long distance. It turns out that the power lost to heat in the wires grows with the square of the current, so the farther a utility sends its energy, the higher voltage and lower current it prefers. On the other hand, high voltage can be dangerous since it can arc right through the insulation on wires, causing fires or painful shocks, so consumers want low voltage and high current.
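To make the arithmetic concrete, here is a minimal sketch with made-up round numbers (one megawatt sent over an assumed 5-ohm line), not data from any actual power grid. It shows why raising the transmission voltage slashes the heat lost in the wires.

```python
# Illustrative only: round numbers chosen for clarity, not real grid data.
def line_loss(power_watts, voltage_volts, line_resistance_ohms):
    """Heat lost in a transmission line carrying a given power at a given voltage."""
    current = power_watts / voltage_volts          # power = voltage * current
    return current ** 2 * line_resistance_ohms     # loss = current^2 * resistance

power = 1_000_000       # one megawatt to deliver
resistance = 5          # ohms of wire resistance (assumed)

for voltage in (1_000, 10_000, 100_000):
    loss = line_loss(power, voltage, resistance)
    print(f"At {voltage:>7,} volts: {loss:>12,.0f} watts lost to heat")
```

Raising the voltage tenfold cuts the current tenfold and the heat loss a hundredfold; at the lowest voltage in this toy example, the wires would waste more power than they deliver, which is one way to see why Edison had to keep his DC generators within a few miles of his customers.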

Whatever the proportion of voltage and current, the amount of energy is the same, requiring the same amount of coal, oil, uranium, or flowing water to generate. But here’s the hitch: transforming one voltage to another is much easier with AC than DC. Because he needed to transmit his DC at the same voltage as consumers used it, Edison located his generators within a few miles of them. AC provided the best of both worlds, allowing transmission from faraway sources at efficient high voltage (hundreds of thousands of volts) and household or business delivery at safe low voltage (120 and 220 volts in the U.S.). As AC technology developed, this presented a critical advantage that DC could not match.

Niagara Falls, the first big hydroelectric generator, transmitted high voltage AC to cities many miles away, making visible the advantage of AC. DC still thrives in short distance applications, such as inside electronics, which run on the DC from batteries or convert AC “from the wall” into DC. Just as QWERTY compatibility overwhelmed Dvorak advantage, AC advantage overwhelmed DC compatibility.

 

Technology is autocatalytic
in that it acts on itself
to change itself.

Autocatalysis

When something acts on itself to change itself, as some technologies appear to, it is called autocatalytic. There is a web of interrelationships and dependencies that suggests technology works this way. Tools create better tools, which create still better tools. This happened with primitive stone, bone, and wood tools being used to make improved versions. The light bulb needed a good vacuum pump in order to avoid burning out. Integrated circuits needed photoresist chemicals to etch complex patterns on silicon (those chemicals were discovered just a few years before the integrated circuit was invented).

Try going back in time to invent something and you’ll probably be stumped by the absence of supporting technologies. We saw some of this in the dead-end Phaistos Disk, the early attempt at printing with moveable type on clay tablets, but it is easy to imagine more difficult situations.

If Mark Twain’s Connecticut Yankee in King Arthur’s Court had tried to invent the electronic computer, he would have had to invent much of the electrical and electronics industries. So, too, would Daniel Defoe’s Robinson Crusoe, stranded on a desert island without even the metallurgy and labor that the Connecticut Yankee might have found.

Although interrelationships of technology may prevent an invention from coming before its time, they also propel a variety of technologies faster and faster. Viewed mathematically, autocatalysis can cause exponential change. Exponential change became familiar to many of us toward the end of the 20th century in the form of personal computers doubling in power every few years, but costing the same or less. This repeated doubling can be traced back to the beginning of the 20th century, when computers were built from electromagnetic relays and vacuum tubes.
The graph below shows computing power per constant (1998) dollar from 1900 to 2001. In 1908, one calculation every 50 seconds cost $154,000 with the Hollerith Tabulator, the computer used for the U.S. Census. In 2001, one billion calculations every second cost about one thousand dollars in a common PC. In 2004, manufacturers shifted their focus from ever-faster clock speeds to squeezing multiple processors into one package, continuing the trend.

Suppose that growth from the Hollerith Tabulator and onward progressed linearly instead of exponentially. In 1909, a computer costing the same $154,000 would have done two calculations in 50 seconds. In 1910, three, and in 1911, four. By 2005, it would perform 98 calculations in 50 seconds, or less than two each second for $154,000. What a difference exponential growth makes!
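A quick back-of-the-envelope calculation, using only the two data points quoted above, shows how steep the exponential path actually was. This is illustrative arithmetic, not a rigorous benchmark of either machine.

```python
import math

# The two data points quoted in the text:
# 1908: the Hollerith Tabulator did 1 calculation per 50 seconds for $154,000.
# 2001: a common PC did one billion calculations per second for about $1,000.
calcs_per_second_per_dollar_1908 = (1 / 50) / 154_000
calcs_per_second_per_dollar_2001 = 1_000_000_000 / 1_000

growth = calcs_per_second_per_dollar_2001 / calcs_per_second_per_dollar_1908
doublings = math.log2(growth)
years = 2001 - 1908

print(f"Overall improvement: about {growth:.1e} times")
print(f"That is roughly {doublings:.0f} doublings in {years} years,")
print(f"or one doubling about every {years / doublings:.1f} years.")
```

The result, a doubling roughly every two years, is the compounding that linear growth, adding one more calculation per 50 seconds each year, cannot begin to approach.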

Digital storage technology—computer memory and disks—has increased exponentially. Communications technology has, too, dropping the cost to send a given amount of information. Robotics and improved algorithms sped the sequencing of the human genome—early predictions had it taking thousands of years. We find similar exponential improvement in mechanical clocks, which were preceded by water clocks. One type of water clock, called a “clepsydra,” consisted of a large bowl filled with water and a smaller bowl with a tiny hole in the bottom. The smaller bowl floated on the water in the large bowl until enough water leaked in to sink it. The final rush of water into the sinking bowl made a “ploonk” sound, indicating that, whatever you were doing, your time was up.

Water clocks had the unfortunate characteristic of clogging with sediment from the never-quite-pure water, which affected their accuracy. The first mechanical clocks—certified to be sediment free—lost about 15 minutes each day, the same as the best water clocks. But then they followed a curious trend. Every 30 years, or so, mechanical clocks doubled in accuracy, right up to the 20th century. Fashion in watch technology followed a similar trend with the thickness of British and Swiss pocket watches shrinking:

  • 41 millimeters in 1700
  • 25 millimeters in 1812
  • 12 millimeters circa 1815-1825
  • 11 millimeters in 1846
  • 6 millimeters in 1850
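To see what the accuracy doubling described above implies, here is a rough compounding calculation. The 15-minutes-per-day starting error comes from the text; the six-century span is an assumption for illustration, not a precise historical figure.

```python
# Rough compounding illustration: clock accuracy doubling every 30 years.
starting_error_minutes_per_day = 15   # the first mechanical clocks (per the text)
doubling_period_years = 30
span_years = 600                      # assumed: roughly early mechanical clocks to 1900

doublings = span_years / doubling_period_years        # 20 doublings
final_error = starting_error_minutes_per_day / 2 ** doublings

print(f"{doublings:.0f} doublings shrink a 15-minute daily error "
      f"to about {final_error * 60 * 1000:.1f} milliseconds per day")
```

Twenty doublings amount to a million-fold improvement, the kind of compounding no steady, linear refinement could deliver.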

 

But exponential change in clock accuracy and watch thickness did not continue, nor did faster air speeds. Over nearly a century, the maximum speed achieved in air doubled every nine years. When commercial supersonic transport (SST) went into service in the 1970s, many imagined SSTs to be the future of air travel. Three decades of technological advance later, the successful airplanes are more economical, but not as fast. In April 2003, Britain and France announced that the only remaining SST, the Concorde, would no longer fly. So much for exponentially increasing speeds of air transport.

Given the time that all air passengers must spend just getting to and through airports, the shortened flying time was rarely worth the price. Unlike color television and the videocassette recorder, which also debuted as expensive toys for the rich, the cost of flying an SST did not plummet.

While some technologies are not autocatalytic, those that are—computation, storage, communication—will change the world. And even those that change more slowly are subject to selective pressures from their environment, and that may help explain why some thrive while others become extinct.

 

 

Technology picks right up
with the exponentially quickening
pace of evolution.

— Ray Kurzweil

Evolution & Memes

It took billions of years to evolve from the first single-cell life form to multicellular plants and animals, but only half a billion to get from there to mammals. Then, less than a sixth of a billion from there to monkeys and apes, and 1/36 of a billion from there to simple technology. The calendar opening “Chapter 3: Where does Technology Come from?” showed, on the timescale of millions of years, just how recently important technology has been invented. For example, electricity in any form more practical than novelty is barely two centuries old. That is just one five-millionth of a billion years. It is as if an accelerating arms race among predators and prey spilled over from biology to technology:

[accelerating change table]

We turn to evolution because it may shape both biology and technology. If we want to understand what is accelerating the change of technology, we need to examine how evolution works, and in particular natural selection or “survival of the fittest.”  While this has been associated with biology and genes, it is more universal than that, applying quite well to technology and memes.

Could it be that the same forces of evolution that have governed biological change on Earth also apply to technological change?  It might be easy to jump to the conclusion that, lacking genes, technology cannot evolve. Our Stone Age ancestors might have concluded that, lacking stone, one could not have a knife. More recently, we might have concluded that a knife must have a sharp edge…until we saw how effectively a laser can cut through eye corneas or steel.

So let’s step back from the biological implementation of evolution to understand its process. What is necessary for evolution by natural selection?  The answer is just three things, none of which require genes:  variation, selection, and retention. Applying these to biology, which does use genes:

  1. Variation – Sexual combination, random mutations, symbiogenesis, and viral penetration of the nucleus can cause variation in genes. A simple, if unrealistic, example: a giraffe with a medium-length neck has short-, medium-, and long-neck offspring.
  2. Selection – Survival of the fittest selects some organisms to procreate and pass on their genes before dying and others not. The short-neck giraffe dies before procreating.
  3. Retention – Offspring retain part or all of the genetic code of those successful enough to procreate. The medium-neck giraffe tends to have medium-neck offspring and the long-neck giraffe tends to have long-neck offspring.


Symbiogenesis is the emerging theory that variation is largely the product of combinations of life forms. An example is lichen (as seen affixed to trees and stone), which is the symbiotic relationship of fungi and bacteria. The fungi store water and provide protection. The bacteria photosynthesize nutrients. If the branches of the evolutionary tree rejoin to create new species then, according to scientist Lynn Margulis, “Animal evolution resembles the evolution of machines, where typewriters and television-like screens integrate to form laptops, and internal combustion engines and carriages merge to form automobiles. The principle stays the same: Well-honed parts integrate into startling new wholes…”  So we apply variation, selection, and retention to the steam engine, which does not use genes:

  1. Variation – James Watt created a variation on Thomas Newcomen’s primitive steam pump.
  2. Selection – Industry selected Watt’s steam engine over competing technologies, such as waterwheels.
  3. Retention – New steam engines retained the basic design while introducing variations, such as Richard Trevithick’s more efficient high-pressure approach, bringing us full circle.

 

The difference:  biology stores the information necessary for evolution in genes and technology stores it in human brains. Scientist Richard Dawkins coined the term “memes” for a “unit of cultural transmission.”  Memes could include ideas, designs, practices, or even musical melodies. They spread by word of mouth, through books, in classrooms, and on television. They spread imperfectly, with some individuals perceiving a variation of the original meme. Survival of the fittest selects which memes will be repeated to friends and which will sell on the media. The meme we get retains part or all of the information from the original meme.

The specific implementation of an evolutionary system—whether genes or memes or something else—does not matter, so long as the system incorporates variation, selection, and retention. Another way to put it:  this system is substrate independent (a term that has come up a few times in this book).

 

 

Imagine a world full of hosts for memes (e.g., brains)
and far more memes than can possibly find homes.
Now, ask which memes are more likely to find
a safe home and get passed on again?

– Susan Blackmore

 

 

This is not a radically different way to view technological change. It is conventional wisdom that (1) people innovate, (2) some innovations thrive while others are abandoned, and (3) others copy the innovations that work. What is different is that we are suggesting that some of the analytic tools developed for biological evolution may be turned on technology.

Suddenly it becomes reasonable to ask if there are technological counterparts to weeds, parasites, and viruses. What about inbreeding and the weakness that results from absence of diversity?  For instance, is the Internet more vulnerable to computer worms and viruses because 90% of personal computers run the Windows operating system?  Is this analogous to the danger that bananas, now virtually all the same genetically, could be wiped out worldwide by a single pest?

Naturalists have learned much about ecosystems, and now that may shed light on the increasingly complex and diverse world of technology. But we have new tools to explore the world of ecosystems, tools that let us simulate environments and squeeze years into minutes.

Computer scientists have created primitive virtual environments inside computers, which introduce variation, select the fittest, and retain many of their characteristics. These simulated worlds of ones and zeros instead of molecules and energy are still very simple, but they make it possible to see what emerges after thousands of generations subject to the rules of natural selection.
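As a concrete illustration, here is a toy sketch of such a virtual environment, not a description of any particular research system. It applies variation, selection, and retention to strings of bits, with an arbitrary fitness rule that rewards strings containing more ones; after enough generations the population converges on all ones.

```python
import random

# Toy illustration of variation, selection, and retention (not a real research system):
# evolve bit strings toward the arbitrary goal of "all ones".
GENOME_LENGTH = 20
POPULATION_SIZE = 30
GENERATIONS = 100
MUTATION_RATE = 0.02

def fitness(genome):
    """Selection criterion: more ones means more fit."""
    return sum(genome)

def mutate(genome):
    """Variation: each bit occasionally flips."""
    return [1 - bit if random.random() < MUTATION_RATE else bit for bit in genome]

# Start with a random population.
population = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
              for _ in range(POPULATION_SIZE)]

for generation in range(GENERATIONS):
    # Selection: the fitter half survives.
    population.sort(key=fitness, reverse=True)
    survivors = population[:POPULATION_SIZE // 2]
    # Retention with variation: offspring copy a survivor's genome, with mutations.
    offspring = [mutate(random.choice(survivors)) for _ in range(POPULATION_SIZE // 2)]
    population = survivors + offspring

best = max(population, key=fitness)
print(f"After {GENERATIONS} generations, the best genome has {fitness(best)} ones out of {GENOME_LENGTH}")
```

Swap in a different fitness rule, say how quickly a candidate program sorts numbers, and the same three-step loop becomes a crude engine for evolving designs rather than bit strings.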

 

Two hundred years after [Benjamin] Franklin, artificially generated
lightning—tamed, measured, and piped through wires into buildings
and tools—is the primary organizing force in our society,
particularly our digital society. Two hundred years from now,
artificial adaptation—tamed, measured, and piped into every type
of mechanical apparatus we have—will become the central
organizing force in our society.

– Kevin Kelly

 

 

A breeding experiment by scientist Danny Hillis provides an illustration. Now Danny is not a virologist, breeding bugs in test tubes, but a computer scientist breeding programs in a computer. He let his evolution simulation run for 10,000 generations. His digital “creatures” competed for resources and the most successful mated, passing their genetic code on to digital children.

Occasionally, that code mutated, usually creating monsters that could not survive. Once in a while, though, the mutation was beneficial and the mutant thrived, producing long lines of children. Simulating 10,000 generations is quick on a computer with 64,000 interconnected processors running simultaneously, yet it still evolved a program so “fit” that it was only one step longer than the leanest, most efficient one that computer scientists had spent decades devising and improving.

That program sorted numbers into order, but as simulated environments become richer, with further programming and increasingly fast computers to run them, all sorts of technology may be evolved rather than designed. Why would we do this?  Hillis answers, “There are only two ways we know of to make extremely complicated things. One is by engineering, and the other is evolution. And, of the two, evolution will make the more complex.”

Evolution is an extraordinarily powerful force that created dinosaurs and humans out of energy and stardust. What will we create in the miniature universes boxed under our desks?  How will we decide what to release into the outside world?

 

 

 

_________________________

 

 

 

Two billion years ago our ancestors were microbes;
a half-billion years ago, fish;
a hundred million years ago, something like mice;
ten million years ago, arboreal apes;
and a million years ago, proto-humans
puzzling out the taming of fire.
Our evolutionary lineage is
marked by mastery of change.
In our time, the pace is quickening.

— Carl Sagan

 

 

When Darwin’s theory of evolution by natural selection was young, it had a competitor in Lamarckian evolution. Jean-Baptiste Lamarck’s theory held that whatever traits an animal develops before it has offspring, it can pass on to those offspring. For instance, a giraffe may stretch its neck every day to reach higher leaves on trees, just as an athlete gets stronger by training. When the giraffe has children, that longer neck trait would be passed along to them. Interesting, but wrong.

Stretching its neck, the giraffe does not alter the DNA code in its sperm or egg (i.e. germ) cells. Since its offspring are products of that code, they will not be affected by how much the parent stretches its neck. So Lamarck, speculating long before Watson and Crick discovered DNA and the underlying mechanisms of evolution, was wide of the mark. What we believe does happen is that random mutation in its genetic code can create a longer neck and that affords the giraffe access to leaves out of reach of its competitors. Better fed and stronger, that giraffe can have more offspring, which share the code for a longer neck. So, the percentage of giraffes in the total population with longer necks increases because that trait has positive selection pressure.

Why dredge up an old evolutionary theory that was wrong?  Because it may apply to technology. Retention in technology does not depend on germ cells. Information is retained around the technology as designs, so a design improved over the life of a technology can be incorporated into its successors. That may allow technology to evolve much faster than biological systems, but it raises a question. Why does life not use Lamarckian evolution?

Perhaps it was just chance that biological systems never developed a mechanism to exploit that process. It may require complex systems that are unlikely to arise through random mutation. Or, perhaps, there are disadvantages. If evolution is too fast, can that make for weakness?  Is it less risky to allow only slower changes and test them over many generations for fitness before allowing them to crowd out the old, but proven designs?  We may find out.

In the meantime, we already know much about how technology changes. While it may follow Lamarckian evolution more than Darwinian, technology does exhibit the three key elements of any kind of evolution:  variation, selection, and retention. Memes rather than genes carry the information in this process. Although artificial evolution—the invisible battles of information ecosystems residing in computer memory—may appear but a novelty now, it does employ the powerful forces that created dinosaurs, humans, and AIDS. And it accelerates those processes to evolve thousands, if not millions, of generations each second. The results may not stay hidden inside computers.

Using computers to simulate evolution and develop new technology that we might never think to design is a perfect example of autocatalysis. Computers act on themselves to change themselves, evolving software and hardware so complex that no human team could design it. Then that software and hardware could be used to evolve systems even more complex, and so on. Autocatalysis started simply enough with crude tools allowing humans to fashion finer tools. Since technology has come to extend not only our muscles, but also our brains, the autocatalytic process is freeing itself of human dependence. The programmer may not even understand how the products of simulated evolution work because they are so complex. When humans are not needed even to set up the environment and define the goals, then simulated evolution can accelerate even more.

Can humans keep up with this accelerating rate of change?  In the next chapter, we will look at how we have tried, but for now, let’s consider a theory that claims we cannot. The many processes that become exponentially faster suggest to some that we are approaching a point at which change will be wholly incomprehensible to humans. They call that point the “Singularity.”

Although technological generations lasting just a few years (e.g. computers and cellular phones) have been challenging to many of us, generations lasting hours or even minutes would be too quick for anyone to understand and evaluate. No sooner would we start than a hundred successive generations would evolve, leaving the one under our consideration extinct or at least obsolete. And exponentially accelerating change would continue past that point, creating new generations of technology we cannot even imagine within fractions of seconds. And then faster.

If the Singularity occurs, we will not be able to understand the details of change, but we will still recognize the forces of change. Technology will still act on itself to change itself. Advantage, compatibility, risk, and visibility will still be selection factors for what survives, though these four factors may be seen through eyes that are more technology than human. Necessity will still create new inventions, and inventions will still create new necessities. Technology will still disappear by being incorporated into other technology and by becoming so common it is no longer noticed (It is not just human psychology that focuses on the novel. Understanding the Universe in complete detail is impossible, so any perceiving organism or system must filter for those details that matter most).

But back to the present. Deciding if we want the Singularity to happen, or if we even have control over the outcome, rests on understanding the forces of change. Unfortunately, few people examine technological change, and when they do, they tend to focus on a particular technology: The computer purchased last year is already slow by current standards; the cellular phone purchased last year is already large by current standards. But technology is like a forest, and those particular examples are but trees. To see the grand trends, we must uncover the sorts of patterns we did in this chapter. And there is urgency to this. As Carl Sagan pointed out, the pace is quickening.

 
