Six innovations that transformed the world and paved the way for a host of other ubiquitous technologies today
Central ideas:
1 – Angelo Barovier of Murano, Venice, burned seaweed, rich in potassium and manganese oxide, to create ash, which he then added to molten glass. The new product was called cristallo. Modern glass was born.
2 – Willis Carrier, an engineer, was hired by a printing company to devise a scheme to keep ink smear-free during the humid summer months. Carrier’s invention not only removed the humidity from the printing room, it also cooled the air.
3 – Perhaps the most important legacy of the telephone was Bell Labs, an organization that was born alongside AT&T. From Bell Labs came fundamental tools of modern life: radios, laser beams, vacuum tubes, transistors, televisions, solar cells, microprocessors, computers, and fiber optics.
4 – The confirmation that diseases such as cholera and typhoid are caused not by smell but by invisible organisms in contaminated water was aided by new microscopes equipped with Zeiss lenses. Robert Koch used them to identify the cholera bacterium, among other pathogens.
5 – At the NIF (National Ignition Facility) in Northern California, scientists are bringing light full circle, using lasers to pursue a new energy source based on nuclear fusion, recreating the process that occurs naturally in the core of the Sun, the original source of natural light.
About the author:
Steven Johnson is one of today’s great non-fiction authors, always taking us on a journey through the history of technology. With more than ten books published and translated into several languages, he is the creator and host of the series How We Got to Now, which gave rise to this book.
By the way, there are no intelligent robots in this book. The innovations presented here belong to everyday life, not to science fiction: light bulbs, sound recorders, air-conditioning, a glass of drinking water, a wristwatch, and a glass lens. But I have tried to tell the story of innovations from a perspective similar to that of De Landa’s robot historian. If a light bulb could write the history of the last three hundred years, it, too, would tell a very different story. We would see how much of our past was devoted to the pursuit of artificial light, how much ingenuity and effort was spent in the battle against darkness, and how the inventions we engineered triggered changes that, at first glance, seemed to have nothing to do with light bulbs.
This is an approach that I have called, in another instance, “long zoom” history: the attempt to explain historical changes by examining multiple scales of experience at the same time – from the vibrations of sound waves on the eardrum membrane to mass political movements. It might be more intuitive to limit the historical narrative to the scale of individuals or nations, but at a fundamental level, these limits constrain the accuracy of the analysis. History happens on the plane of atoms, on the plane of planetary climate change, and on every plane in between. If we want to get history right, we need an interpretive approach that does justice to all these different levels.
There is something undeniably compelling about the story of a great inventor or scientist – Galileo and his telescope, for example – paving the way toward a transformative idea. But there is another, deeper story that can also be told: how the ability to make lenses depended on a quantum mechanical property of silicon dioxide and the fall of Constantinople. Telling the story from this long-zoom perspective does not diminish the traditional importance focused on Galileo’s genius, it only adds to it.
Chapter 1. Glass
Glass began to make its transition from ornament to advanced technology during the height of the Roman Empire, when glassmakers discovered ways to make the material stronger and less cloudy than naturally formed glass, such as that on Pharaoh Tutankhamen’s scarab. During this period, glass windows were built for the first time, laying the foundation for the glittering glass towers that now populate the skyline of cities around the world. The visual aesthetic of wine tasting emerged as people began to drink wine in semi-transparent glass containers and store it in bottles of the same material.
But the fall of Constantinople also triggered a seemingly minor event, lost amidst this vast reordering of geopolitical and religious power and ignored by most historians of the time. A small community of glassmakers from Turkey sailed west across the Mediterranean and settled in Venice, where they began to practice their trade in the thriving new city rising from the marshes on the shores of the Adriatic Sea.
In 1291, in a dual effort to hone the glassmakers’ skills and protect public safety, the city administration sent the glassmakers into exile again – only this time the journey was a short one: a mile and a half across the Venetian lagoon to the island of Murano. Unknowingly, the Venetian doges had created a center of innovation by concentrating the glassmakers on a single island the size of a small neighborhood, where they unleashed a wave of creativity, giving birth to an environment with what economists call “information spillover”.
There, Angelo Barovier stood out. He burned seaweed, rich in potassium oxide and manganese, to create ash, and then added it to molten glass. When the mixture cooled, he had created a remarkably clear type of glass. Struck by its resemblance to the most translucent quartz crystals, Barovier called it cristallo. Modern glass was born.
In the monasteries of the 12th and 13th centuries, monks bent over religious manuscripts in candle-lit rooms and used pieces of glass to aid reading. They used bulky magnifying glasses over the page to magnify the Latin inscriptions. No one is sure exactly when or where this happened, but at that time, somewhere in northern Italy, glassmakers arrived with an innovation that would change the way we see the world, or at least make everything clearer: they molded the glass into small disks with a curvature in the center, put each disk in a frame, and joined the frames together at the top. They had created the world’s first glasses.
What followed Gutenberg’s creation of the printing press was one of the most extraordinary cases of the hummingbird effect in modern history. Gutenberg made it possible to print relatively inexpensive and portable books, which caused an advance in literacy, exposed a flaw in the visual acuity of a considerable part of the population, and created a new market for eyeglass manufacturing.
In 1590, in the small town of Middelburg in the Netherlands, Hans and Zacharias Janssen, father and son spectacle makers, experimented with aligning two lenses one in front of the other, instead of placing them side by side as in spectacles. They observed that objects appeared enlarged, and thus invented the microscope. Twenty years after the invention of the microscope, a group of Dutch lens makers, including Zacharias Janssen and Hans Lippershey, arrived at the telescope at about the same time; Lippershey was the first to apply for a patent. Within a year, Galileo got word of the new device and modified Lippershey’s design, achieving a magnification of ten times normal vision. In January 1610, just two years after Lippershey filed his patent, Galileo used the telescope to observe moons orbiting Jupiter.
The Keck telescopes on the peak of Mauna Kea in Hawaii look like direct descendants of Hans Lippershey’s creation, except they don’t rely on lenses to work their magic. To capture light from the far corners of the universe, you would need a lens the size of a pickup truck; at that size, the glass could not physically support itself and would distort the image. So the scientists and engineers working on Keck employed another technique to capture extremely faint traces of light: the mirror. Each telescope has 36 hexagonal mirror segments that together form a ten-meter reflective surface. The light they gather is reflected onto a second mirror and sent to a set of instruments, where the images can be processed and viewed on a computer screen. On top of all this paraphernalia, a system called “adaptive optics” is used: lasers are fired into the night sky above Keck, effectively creating an artificial star in the firmament as a kind of reference point.
Chapter 2. Cold
Frederic Tudor of Massachusetts knew from personal experience that a block of ice could last well into the summer heat if it was shielded from the sun – or at least until late spring in New England. This knowledge would plant the seed of an idea in his head, an idea that would eventually cost him his sanity, his fortune, and his freedom – before making him an immensely wealthy man.
Part of the beauty of ice, no doubt, was that it was basically free. Tudor only needed to pay workers to cut the blocks from the frozen lakes. The New England economy also produced another virtually worthless product: sawdust, the main waste of its sawmills. After years of experimenting with different solutions, Tudor discovered that sawdust was an excellent insulator for ice. Blocks layered one on top of another, separated by sawdust, lasted almost twice as long as unprotected ice.
This was Tudor’s frugal genius: he took three things that had zero cost – ice, sawdust, and an empty ship – and turned them into a thriving business.
Refrigeration using ice changed the map of the United States, and nowhere was the transformation more pronounced than in Chicago. Chicago’s initial explosion of growth came about because of the canals and rail lines that connected the city to both the Gulf of Mexico and the East Coast cities. Its prime location as a distribution center – owed both to nature and to one of the most ambitious engineering feats of the century – meant that wheat flowed from the abundant plains to the population centers of the Northeast. But meat could not make that journey without spoiling, so Chicago developed a large trade in canned pork.
It was ice that ultimately provided a way around this impasse. In 1868, pork magnate Benjamin Hutchinson built a new packing plant, featuring refrigerated rooms with natural ice that allowed pork to be shipped year-round.
Young physician John Gorrie began mulling over a more lasting solution for the hospital where he worked: making his own ice. Fortunately for Gorrie, it happened to be the perfect time for the idea. In his spare time, Gorrie began building a refrigeration machine that used the power of a pump to compress air. The compression heated the air; the machine then cooled the compressed air by passing it through pipes chilled with water. When the air was allowed to expand again, it drew heat from its surroundings, cooling the ambient air. The machine could even be used to produce ice.
Incredibly, Gorrie’s machine worked. No longer dependent on ice shipped from a thousand miles away, it reduced the fever of patients with homemade cold. Gorrie applied for a patent, correctly envisioning a future in which artificial cold, as he wrote, “could better serve mankind…Fruits, vegetables, and meats will be preserved in transit by my refrigeration system, and thus will be enjoyed by all!”
The first “air handling apparatus” was dreamed up by a young engineer named Willis Carrier in 1902. The story of Carrier’s invention is a classic in the annals of chance discoveries. A 25-year-old engineer, Carrier was hired by a printing company in Brooklyn to devise a scheme that would help them keep their ink spotless in the humid summer months. Carrier’s invention not only removed the humidity from the printing room but also cooled the air. He noticed that suddenly everyone wanted to eat lunch next to the presses, so he began designing the contraption to regulate the temperature and humidity in the interior spaces. The first big test took place on Memorial Day weekend in 1925, when Carrier debuted an experimental air-conditioning system in the Rivoli cinema, the new flagship of Paramount Pictures.
Chapter 3. Sound
In the 1850s, a Parisian printer named Édouard-Léon Scott de Martinville came across one of those Enlightenment-era books on the anatomy of the ear. It sparked in him an amateur’s interest in the biology and physics of sound.
Scott also studied shorthand and had published a book on the history of shorthand years before he started thinking about sound. At the time, stenography was the most advanced form of voice recording technology; no system could capture the spoken word with the precision and speed of a trained stenographer. But as he observed those detailed illustrations of the inner ear, a new concept began to take shape in Scott’s mind: perhaps the process of transcribing the human voice could be automated. Instead of a human being writing the words, a machine could record the sound waves.
In March 1857, two decades before Thomas Edison invented the phonograph, France’s patent office granted Scott a patent for a machine that recorded sound. The contraption channeled sound waves through a cornucopia-like device that ended with a parchment membrane. The sound waves caused vibrations in the parchment that were transmitted to a needle made of a pig’s bristle. The needle recorded the waves on a page blackened with charcoal soot. He called his invention the “phonautograph,” the self-writing of sound. However, Scott’s invention was crippled by a fundamental – even comical – limitation. He invented the first sound recording device in history. He forgot, however, to include playback.
Scott’s blind spot would not remain unsolved forever. Fifteen years after his patent, another inventor began experimenting with the phonautograph, modifying Scott’s original design and even incorporating the ear of a cadaver in order to better understand acoustics. With this new configuration, he arrived at a method of capturing and transmitting sound. This man’s name was Alexander Graham Bell.
Perhaps the telephone’s most important legacy, however, lies in a strange and wonderful organization that grew out of it, Bell Labs, a company that was to play a key role in the creation of almost every major technology of the 20th century. Radios, vacuum tubes, transistors, televisions, solar cells, coaxial cables, laser beams, microprocessors, computers, cell phones, fiber optics – all these fundamental tools of modern life descend from ideas originally generated at Bell Labs. No wonder the company became known as the “idea factory”.
After all, how did these ideas reach the market, the American public? Justice Department lawyers opposed to the telephone monopoly worked out an intriguing agreement, officially established in 1956. AT&T would be allowed to maintain its monopoly on telephone service, but any patented invention that originated at Bell Labs was to be freely licensed to any American company that found it useful, and all new patents would have to be licensed for a modest fee.
Chapter 4. Hygiene
With its railroad and transportation network expanding at an extraordinary rate, Chicago more than tripled in size during the 1850s. This growth imposed challenges on the city’s housing and transportation resources, but the biggest problem of all was something more scatological: when nearly 100,000 new residents arrive in a city, they generate a lot of excrement. A local editorial declared, “The culverts are so filthy that even the pigs wrinkle their noses in supreme disgust.”
Chicago officials commissioned Ellis Chesbrough, experienced in the railroad industry, to find an alternative to this chaos. Ellis used a tool he had seen when he was a young railroad worker: the screw jack, a device used to lift locomotives weighing tons off the tracks.
Aided by the young George Pullman, who would later make his fortune building rail cars, Chesbrough launched one of the most ambitious engineering projects of the 19th century. Building by building, Chicago was lifted on screw jacks by an army of men. While the jacks raised the buildings inch by inch, laborers dug holes under the buildings’ foundations and installed large wooden boxes for support, while masons built a new foundation under each structure. Sewer pipes were laid under the buildings, with the main galleries running beneath the streets, which were then covered with landfill dredged from the Chicago River, raising the entire city by nearly ten feet on average.
Chicago’s experience was replicated around the world; sewers removed human waste from people’s basements and backyards, but more often than not it was simply dumped into drinking water supply sources, either directly, as in Chicago’s case, or indirectly, by torrential rains. Making designs for sewers and water pipes on the scale of the city was not in itself sufficient to the task of keeping the big city clean and healthy. We also needed to understand what was happening at the scale of micro-organisms. We needed a theory that linked germs to disease – and we needed to keep those germs from harming us.
Working in Vienna’s General Hospital, Semmelweis came across an alarming natural experiment: the hospital had two maternity wards, one for the well-to-do, staffed by doctors and medical students, and one for the working class, who were cared for by midwives. For some reason, puerperal fever mortality rates were much lower in the working-class ward. After investigating the two environments, Semmelweis found that the elite doctors and their students alternated between deliveries and research on cadavers in the morgue. It was clear that some kind of infectious agent was being transmitted from the cadavers to the mothers. With a simple application of a disinfectant, such as chlorinated lime, the cycle of infection could be broken. Semmelweis was eventually fired, simply because he recommended that doctors wash their hands when delivering babies and dissecting cadavers on the same afternoon.
The modern synthesis that was to replace the miasma hypothesis – that diseases such as cholera and typhoid fever are caused not by smell, but by invisible organisms growing in contaminated water – was finally established, once again, thanks to an innovation in glass. In the early 1870s, German craftsmen at the Zeiss lens factory began producing new microscopes – devices that, for the first time, were built from mathematical formulas describing the behavior of light. These new lenses made possible the microbiological work of scientists such as Robert Koch, one of the first scientists to identify the cholera bacterium. (After receiving the Nobel Prize for his work in 1905, Koch wrote to Carl Zeiss, “I owe much of my success to your excellent microscopes.”) Like his great rival Louis Pasteur, Koch and his microscopes helped develop and spread the germ theory of disease.
Chapter 5. Time
They are now motionless [the chandeliers of Pisa’s cathedral, the one with the leaning tower], but legend has it that in 1583 a nineteen-year-old student at the University of Pisa was in the cathedral and, while daydreaming in the church pew, saw one of the chandeliers swaying back and forth. While his concentrated classmates recited the Nicene Creed beside him, the student was mesmerized by the regular movement of the chandelier. Regardless of the width of the swing, the chandelier seemed to take the same amount of time to go back and forth. As the arc of the trajectory decreased, the speed of the chandelier decreased as well. To confirm his observations, the student measured the chandelier’s swing with the only reliable clock he could find: his own pulse.
Most nineteen-year-olds would think of less scientific ways to entertain themselves while attending mass, but it turns out that that college freshman was Galileo Galilei.
Galileo spent the next twenty years teaching mathematics, experimenting with telescopes, and more or less inventing modern science, but he kept alive in his memory the image of the swinging chandelier in the cathedral. Increasingly obsessed with dynamics – the study of how objects move through space – he decided to build a pendulum that would recreate what he had observed in the Duomo of Pisa so many years before. He discovered that the time a pendulum takes to swing depends not on the size of the arc or the mass of the swinging object, but on the length of the string. “The wonderful property of the pendulum,” he wrote to a colleague, the scientist Giovanni Battista Baliani, “is to make all oscillations, large or small, in equal times.”
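Galileo’s claim is easy to check numerically. The sketch below is an illustrative simulation, not anything from the book: it integrates the pendulum equation θ″ = −(g/L)·sin θ and times one full swing. Note that mass never appears in the equation at all, and for modest arcs the computed period barely changes with amplitude, while it clearly changes with the length of the string.

```python
import math

def pendulum_period(length_m, amplitude_deg, g=9.81, dt=1e-5):
    """Estimate the period of a pendulum released from rest.

    Simulates theta'' = -(g/L) * sin(theta) with semi-implicit Euler,
    timing the quarter swing from the release angle down to vertical,
    then multiplying by 4. The bob's mass cancels out of the equation.
    """
    theta = math.radians(amplitude_deg)  # release angle
    omega = 0.0                          # angular velocity
    t = 0.0
    while theta > 0.0:
        omega -= (g / length_m) * math.sin(theta) * dt
        theta += omega * dt
        t += dt
    return 4.0 * t

# Small-angle formula T = 2*pi*sqrt(L/g) for a 1 m string:
print(round(2 * math.pi * math.sqrt(1.0 / 9.81), 3))  # 2.006

print(round(pendulum_period(1.0, 5), 3))   # close to the small-angle value
print(round(pendulum_period(1.0, 20), 3))  # only slightly longer
print(round(pendulum_period(4.0, 5), 3))   # 4x the length: double the period
```

For large arcs the period does grow measurably (the isochronism is only approximate), which is why practical pendulum clocks keep the swing small.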
After 58 years of preparation, his slow hunch about the “wonderful property” of the pendulum finally began to take shape. The idea lay at the intersection of several disciplines and interests: Galileo’s recollection of the Duomo chandelier, his studies of motion and of Jupiter’s moons, the emergence of the global shipping industry, and the new demand for clocks that could measure the second. Aided by his son, he began to draw up plans for the first pendulum clock.
By the end of the next century, the pendulum clock would become commonplace throughout Europe, particularly in England – in workplaces, squares, and even in the more prosperous homes. Without the pendulum clock, the industrial take-off that began in England in the mid-18th century would have taken much longer to reach escape velocity.
The British dealt with the problem of discrepancies among the country’s many clocks by standardizing the entire nation on Greenwich Mean Time (GMT) in the late 1840s, synchronizing the railroad clocks by telegraph. (To this day, clocks in all air traffic control centers and flight cabins around the world follow Greenwich time; GMT is the only time zone in the sky.)
The remarkable ability of quartz crystal to expand and contract in “equal time” was first exploited by radio engineers in the 1920s, who used this property to keep radio transmissions at stable frequencies. In 1928, W. A. Marrison of Bell Labs built the first clock that kept time from the regular vibrations of a quartz crystal.
Studying the behavior of electrons orbiting in a cesium atom, Bohr noticed that they moved with surprising regularity; unperturbed by the chaotic drag of mountain ranges and tides that disturbs the Earth’s rotation, the electrons pulsed at a rate several orders of magnitude more reliable than the spinning of the planet. The first atomic clocks were built in the mid-1950s and soon set a new standard for precision. We could now measure nanoseconds, a thousand times more accurate than the microseconds of quartz.
Chapter 6. Light
To this day, scientists are not sure why sperm whales produce so much spermaceti. (An adult sperm whale can hold up to 1,890 liters of spermaceti inside its skull.) Some believe that whales use the spermaceti to float; others think that the substance helps the mammal’s echolocation system. New Englanders, however, soon discovered that spermaceti burned with a light much stronger and clearer than that of tallow candles, and without the annoying smoke. In the second half of the 18th century, spermaceti candles became the most valued form of artificial light in the USA and Europe.
One of the ocean’s most extraordinary creatures was spared because humans discovered deposits of fossil fuels below the earth’s surface. Fossil fuels would become fundamental to almost every aspect of life in the 20th century, but their first commercial use revolved around light. The new kerosene lamps were twenty times brighter than any candle had ever been, and their brighter glow generated an explosion in magazine and newspaper publishing in the second half of the 19th century.
The electric light bulb marked a milestone in the history of innovation, but for quite different reasons. It would be a stretch to say that the light bulb was created by a joint effort, but to claim that a single man named Thomas Edison invented it is an even more serious distortion.
The traditional story goes something like this: After a triumphant start to his career, inventing the phonograph and the stock ticker by the age of thirty, Edison spent a few months touring the American West – perhaps not coincidentally a region much darker at night than the gas-lit streets of New York and New Jersey. Two days after returning to his lab in Menlo Park in August 1878, Edison drew three diagrams in his notebook, calling them “electric light.” In 1879, he filed a patent application for an “electric light bulb” that exhibited all the major features of the bulb we know today.
The bulb was a product of network innovation, and so it is fitting that electric light ultimately revealed itself more as a network or system than as a single entity. Edison’s real victory came not with the bamboo filament incandescing in a vacuum, but with the lighting of the Pearl Street district two years later. For that to happen, the light bulb had to be invented, yes, but it also required a reliable source of electric current, a distribution system to carry that current across the district, a mechanism to connect individual bulbs to the grid, and a meter to gauge the amount of electricity used in each house.
Just like Edison’s light bulb, the true story of the origin of the photographic flash is a complicated, networked affair; big ideas emerge from smaller incremental advances. The astronomer Charles Piazzi Smyth may have been the first to conceive the idea of combining magnesium with an oxygen-rich fuel element. Flash photography, however, would become standard practice only two decades later, when two German scientists, Adolf Miethe and Johannes Gaedicke, mixed fine magnesium powder with potassium chlorate, creating a much more stable mixture that made it possible to take pictures at high shutter speeds in low light. They called the technique Blitzlicht – literally, “flashlight”.
If early science fiction fans of War of the Worlds and Flash Gordon were disappointed to find the mighty laser reduced to scanning packs of gum – a bright, concentrated light used for inventory management – they would probably be more excited by the National Ignition Facility (NIF) at Lawrence Livermore Laboratory in Northern California, where scientists have built the world’s largest and highest-energy laser system. Artificial light began as simple illumination, so that we could read and enjoy ourselves after dark; soon it was transformed into advertising, art, and information. But at the NIF they are bringing light full circle, using lasers to pursue a new source of energy based on nuclear fusion, recreating the process that occurs naturally in the core of the Sun, the original source of natural light.
FACTSHEET
Original Title: How We Got to Now
Author: Steven Johnson
Photos: Claudio Rolli, Alexandre Lecocq, Maksym Pozniak-Haraburda, Maskmedicare Shop, Lucian Alexe, Alex Litvin / Unsplash
Review: Rogério H Jönck