10 American Inventions That Changed the World

March 16, 2022 | Updated: December 3, 2022

Electricity

Efforts to understand and harness electricity began in the 18th century. Scientists thought electricity could provide a cheap way for people to light their homes. One of the most notable pioneers in electricity was Thomas Edison, who developed the first practical electrical light bulb in the late 1870s. Edison launched a company that would later become General Electric and opened America’s first central power plant in New York in 1882. Edison’s electrical system used direct current (DC), which was the standard in the United States during the early years of electricity. However, Nikola Tesla and George Westinghouse favored alternating current (AC), which transmitted electricity over long distances more economically than DC. Worried that he would begin losing profits from his DC patents, Edison began a campaign to discredit AC. This bitter dispute became known as the War of the Currents. The “war” came to an unofficial end during the 1893 World’s Fair in Chicago, when Westinghouse beat out General Electric in a bid to supply electricity to the fair using AC. AC soon became dominant in the electric power industry. Most of the electricity delivered today is AC, although DC has seen a bit of a resurgence in recent years.
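
To see why AC’s long-distance transmission was more economical, it helps to work the numbers. The short Python sketch below (with illustrative figures, not historical ones) shows how stepping up the voltage, which AC transformers made easy, slashes the power wasted as heat in the wires:

    # Resistive line loss for a fixed delivered power P: the current is
    # I = P / V, and the wires dissipate P_loss = I^2 * R. Raising the
    # voltage tenfold cuts the loss a hundredfold. Numbers are illustrative.

    def line_loss(power_w, voltage_v, resistance_ohm):
        """Power (watts) wasted as heat in a transmission line."""
        current = power_w / voltage_v           # I = P / V
        return current ** 2 * resistance_ohm    # P_loss = I^2 * R

    P = 100_000  # 100 kW delivered to customers
    R = 5        # total line resistance in ohms

    for volts in (500, 5_000, 50_000):
        loss = line_loss(P, volts, R)
        print(f"{volts:>6} V: {loss:>9,.0f} W lost ({100 * loss / P:.2f}% of the load)")
    # At 500 V the wires waste twice the delivered load; at 50,000 V, just 20 W.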

Telephone

During the early 19th century, several inventors started creating devices that used electric signals to transfer messages. Elisha Gray and Antonio Meucci both designed devices that could transmit speech electrically. However, Alexander Graham Bell is credited with the invention of the telephone, as he received the first U.S. patent for it. Bell’s success with the telephone grew out of his efforts to create a harmonic telegraph, a device that could transmit multiple messages over a single wire at the same time. After an accidental discovery during one of his experiments, Bell began to explore a different idea: transmitting the human voice over the wires. He succeeded in developing the telephone in 1876 and created the Bell Telephone Company (now known as AT&T) one year later. The telephone completely transformed human communication. It allowed people to connect in real time and share information with far greater efficiency. It also spurred the development of telephone lines and, later, cellular phone networks. In 1994, IBM released the Simon, a touchscreen cellular phone with personal digital assistant (PDA) capabilities and a precursor to the Apple iPhone. Modern smartphones, more computers than telephones, have turned the phone into not only a productivity tool but also a multimedia powerhouse.

Moving Assembly Line

Henry Ford’s development of the Model T brought about another significant innovation: the moving assembly line. In the early 1900s, automobiles were growing in popularity in the United States. The Model T became a favorite due to its affordability, yet Ford was constantly looking for ways to lower the price even further. In 1913, Ford introduced the idea of building cars one piece at a time instead of one car at a time. He used a conveyor belt to pull a vehicle down a line, where it could be built step by step, thereby creating the first moving assembly line. This reduced the time it took to build a car from over 12 hours to around 90 minutes. The assembly line enabled workers to be more efficient by dividing up the labor, which increased productivity and profits. It also allowed Ford to drop the price of the Model T from $825 in 1908 to just $260 in 1925. However, Ford’s employees quickly found assembly line work monotonous. In an effort to reduce turnover, Ford raised his employees’ wages and shortened shifts by one hour. Soon, other factories began replicating the production process Ford created. Assembly lines are still used in most factories today, though many of the production steps are now performed by machinery.
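
The speedup has a simple logic, familiar to programmers as pipelining: once the work is split into short steps performed simultaneously on different cars, finished vehicles roll off at the pace of the slowest step rather than the sum of all steps. A toy Python model (the station times below are invented for illustration, not Ford’s actual figures):

    # Why a moving assembly line is faster: a toy pipelining model.
    # One craftsman doing every step finishes a car every sum(steps) minutes.
    # A moving line with one worker per step finishes a car every max(steps)
    # minutes once the line is full. Step times are invented for illustration.

    steps_min = [20, 15, 25, 10, 30, 20]   # minutes per station

    one_at_a_time = sum(steps_min)          # craft production: 120 min/car
    moving_line   = max(steps_min)          # steady-state line:  30 min/car

    print(f"Craft production: one car every {one_at_a_time} minutes")
    print(f"Moving line:      one car every {moving_line} minutes (once full)")
    print(f"Speedup: {one_at_a_time / moving_line:.1f}x")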

Personal Computer

Early computers were hardly personal or portable. One of the first computers, the Electronic Numerical Integrator and Computer (ENIAC), was built for U.S. military applications during World War II. It weighed 60,000 pounds and took up almost 2,000 square feet of space. The invention of the microprocessor in 1971 helped make the idea of a “personal” computer a reality. A microprocessor could run computer programs, yet it was only the size of a thumbnail. This led to the “microcomputer,” which eventually developed into the personal computer. One of the first microcomputers, the Altair, quickly became popular with the public after Bill Gates and Paul Allen developed software that made it easier to use. Gates and Allen formed Microsoft (short for microcomputer software), which went on to lead the development of software for personal computers. Meanwhile, Steve Jobs and Stephen Wozniak formed the Apple Computer Company and launched several computers that expanded upon the capabilities of the Altair. In 1977, Jobs and Wozniak introduced the Apple II, which included a color display, a keyboard, and expansion slots. In 1981, IBM released a powerful machine called the Personal Computer, or PC, and the personal computer revolution was underway. Other innovations, like the graphical user interface and the computer mouse, made PCs even more accessible to the masses.

Internet

Early computers were quite powerful, but also large and immobile, so researchers had to travel great distances to access and share information. At the height of the Cold War, the U.S. government began to investigate means of sharing information that would not be easily disrupted by a Soviet nuclear attack. In the 1960s, the U.S. Advanced Research Projects Agency (ARPA) developed ARPANET, a network that eventually grew into the modern internet. As more networks of computers joined ARPANET, it became necessary to develop a set of rules for how to transmit data among them. On January 1, 1983, ARPANET adopted the Transmission Control Protocol/Internet Protocol (TCP/IP), a standard way for computers to “talk” to each other. This universal language for computers eventually enabled researchers to build a globally interconnected network of networks, also known as an “internetwork,” or internet for short. It also paved the way for the creation of the World Wide Web, which is the most common way of accessing data online today.
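
TCP/IP remains that universal language today. For a flavor of what a computer-to-computer “conversation” looks like, here is a minimal Python sketch in which a client and server on one machine exchange a message over TCP (the port number and message are arbitrary choices for this example):

    # A minimal TCP "conversation" between two programs on one machine,
    # using the same TCP/IP protocols ARPANET adopted in 1983.
    import socket
    import threading

    HOST, PORT = "127.0.0.1", 50007   # arbitrary local address and port

    # Server side: bind an address, listen, and answer one client.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((HOST, PORT))
    srv.listen(1)

    def serve_one():
        conn, _ = srv.accept()             # wait for a client to connect
        with conn:
            data = conn.recv(1024)         # read the client's message
            conn.sendall(b"ACK: " + data)  # reply over the same connection

    threading.Thread(target=serve_one, daemon=True).start()

    # Client side: connect and exchange one message.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))          # TCP's handshake happens here
        cli.sendall(b"hello, internet")
        print(cli.recv(1024).decode())     # -> ACK: hello, internet

    srv.close()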

Airplane

The earliest attempts at flight were made by mimicking bird flight. The idea of the airplane as a fixed-wing aircraft did not emerge until the late 18th century. Many scientists later experimented with controlled flights in gliders but were unable to achieve self-powered, sustained flight. Inspired by these pioneers of aviation, brothers Orville and Wilbur Wright began experimenting with their own aircraft designs and developing steering systems for powered flight. On December 17, 1903, the Wright brothers made the first successful self-powered flight of a heavier-than-air aircraft at Kitty Hawk, North Carolina. By 1905, they had developed their aircraft into the first fully practical airplane. During one test flight in this 1905 flyer, Wilbur covered a distance of 24.5 miles in 39 minutes. However, the public took little notice of their achievements until 1908, when the Wright brothers made their first public flights in Europe. Engineers from around the world soon began to study and improve upon the Wright brothers’ design, and World War I accelerated the militarization and manufacturing of airplanes.

Air Conditioning

Although the concept of air conditioning has been around since ancient Egypt, the first modern electrical air conditioning unit was developed to solve an industrial problem. In 1902, extreme humidity at the Sackett-Wilhelms Lithographing and Publishing Company in Brooklyn caused pages to shrink and swell, disrupting the printing process. Willis Carrier, a 25-year-old experimental engineer, invented a machine to reduce the humidity around the printing plant. Carrier quickly realized his invention had the added benefit of producing cooled air for comfort, which could benefit many other industries. By 1922, Carrier had refined his invention into the centrifugal refrigeration compressor, a forerunner of modern air conditioning systems. However, the general public did not have much exposure to the idea of “comfort cooling” until the 1920s, when cooling systems became a fixture in movie houses. Public theaters at the time were known for being humid and smelly, so Carrier’s invention was a revolution for the theater industry. The air conditioner gradually moved from public spaces to private homes, and by the late 1960s, most new U.S. homes were being built with central air conditioning. Today, the United States consumes more energy for air conditioning than the rest of the world combined.

Plastic

In the 19th century, excessive hunting of elephants made ivory scarce, which created a problem for billiards suppliers. Plastic came about when suppliers began to look for a substitute for the ivory used to make billiard balls. In 1869, John Wesley Hyatt, a printer from Starkey, New York, took up the challenge and ended up inventing the first plastic, a synthetic material he called celluloid. Celluloid could be crafted into different shapes and patterned to imitate various natural substances, including ivory. This breakthrough meant that human manufacturing was no longer constrained by natural materials. Other advances in plastics soon followed, including the development of polystyrene, vinyl, acrylics, and nylon. This flood of new materials also brought about new manufacturing technologies like injection molding, which allowed plastics to be produced inexpensively and on a massive scale. During World War II, the U.S. military relied heavily on synthetic alternatives to conserve natural resources, driving a fourfold increase in plastic production in the United States. In the postwar years, plastic manufacturers shifted their attention to consumer products. Plastics allowed goods to be made cheaper, safer, and stronger, which helped raise people’s standard of living. However, the durability of plastics, combined with overconsumption, has also resulted in a huge buildup of waste, creating environmental concerns that are still being addressed today.

GPS

When the Soviet satellite Sputnik was launched in 1957, American physicists discovered they could accurately track the satellite’s movements thanks, in part, to the Doppler effect. This discovery intrigued the U.S. government, since it could aid in planning operations for the U.S. military. For decades, three men received recognition for their contributions to the U.S. Defense Department’s GPS project, which began in 1973: Roger L. Easton, Ivan Getting, and Bradford Parkinson. It wasn’t until 2018 that the contributions of Dr. Gladys West were recognized. West’s calculations provided an accurate model of the geometric shape of the Earth, without which GPS would not have been possible. GPS is a navigation system that uses satellites to determine the precise location of points on Earth. Many civilian technologies now use GPS, including smartphones, smartwatches, and wireless headphones. Real-time traffic apps like Google Maps and Waze, and even mobile games like Pokémon Go, rely on GPS technology.
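
Conceptually, a GPS receiver times how long each satellite’s signal takes to arrive, converts those times into distances (distance = speed of light × travel time), and solves for the one position consistent with them all, a calculation called trilateration. Below is a simplified two-dimensional Python sketch with made-up coordinates; a real receiver works in three dimensions, must also correct its own clock error, and therefore needs at least four satellites:

    # 2D toy version of the GPS position fix (trilateration). Coordinates
    # and ranges are invented for illustration.
    import math

    def trilaterate(sats, ranges):
        """Solve for (x, y) given three satellite positions and distances."""
        (x1, y1), (x2, y2), (x3, y3) = sats
        r1, r2, r3 = ranges
        # Subtracting the circle equations pairwise leaves a linear system:
        #   a1*x + b1*y = c1,  a2*x + b2*y = c2
        a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
        c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
        a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
        c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
        det = a1 * b2 - a2 * b1                 # solve by Cramer's rule
        return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

    sats = [(0, 0), (100, 0), (0, 100)]         # made-up satellite positions
    true_pos = (30, 40)
    # Ranges a receiver would derive from each signal's travel time:
    ranges = [math.dist(s, true_pos) for s in sats]

    print(trilaterate(sats, ranges))            # -> approximately (30.0, 40.0)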

Transistors

The world is more connected than ever, and we largely have one tiny device to thank for it: the transistor, which makes much of our modern technology possible. A transistor is a semiconductor device that switches or amplifies an electrical current within a larger machine or piece of technology; it is integral to modern computers because it makes the processing and transfer of information possible. Transistors are present in radios, smartphones, and televisions, and their most common use in the 21st century is within the memory chips inside computers. This makes the transistor one of the most important technological advances of the 20th century.
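
One rough way to see how a simple current switch “makes the transfer of information possible” is to model each transistor as an ideal on/off switch and wire those switches into logic gates, the building blocks of all digital computation. The Python sketch below is an idealized software model, not a circuit simulation:

    # Idealized model: a transistor as a voltage-controlled switch. Real
    # transistors are analog devices, but digital logic treats them as on/off.
    # Two transistors in series form a NAND gate, and NAND alone is enough
    # to build every other gate -- and, ultimately, a computer.

    def transistor(gate: bool) -> bool:
        """Conducts (True) when the gate voltage is high."""
        return gate

    def nand(a: bool, b: bool) -> bool:
        # Output is pulled low only when BOTH series transistors conduct.
        return not (transistor(a) and transistor(b))

    # Every basic gate, built from NAND alone:
    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))

    for a in (False, True):
        for b in (False, True):
            print(f"a={a:d} b={b:d}  NAND={nand(a, b):d}  "
                  f"AND={and_(a, b):d}  OR={or_(a, b):d}")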

The transistor was first patented by Austro-Hungarian physicist Julius Edgar Lilienfeld in 1926, but he was unable to produce a working prototype. It wasn’t until 1947 that three physicists, John Bardeen, Walter Brattain, and William Shockley, working at Bell Laboratories in Murray Hill, New Jersey, built the first working transistor. Their experiments were aimed at understanding how electric fields affect solid matter. Their breakthrough came when they replaced the bulky glass vacuum tube with a small slab of germanium and gold contacts, creating a working semiconductor amplifier. This achievement, dubbed by some the most important invention of the 20th century and by others the most world-changing device ever invented, earned them the Nobel Prize in Physics in 1956.

This article was originally published in American Essence magazine.
