Some people may believe that 20th and 21st century physics research has
less of a direct impact on their daily lives than biology, chemistry, engineering, and other fields. Perhaps they think of physics as an abstract, enigmatic, or purely academic endeavor. Others
think that physics only contributes to national
defense and medical
imaging. I created this page to dispel those myths.
Nearly everyone would agree that the computer, the transistor, and the
World Wide Web are among the greatest inventions of the 20th
century. Economists and laymen alike know that today's entire world
economy is inextricably linked to these technologies. The daily lives of
a large fraction of Earth's inhabitants would be substantially different were it
not for these inventions. Most would agree that America's preeminence
in computer and information technology is at least partly responsible for its
status as an "economic superpower." The wealth of other nations, such as Japan, Taiwan, and the countries of Western Europe, is also due in part to their embrace of, and contributions to, the information age.
Read below to learn these little-known facts: the electronic digital computer, the
transistor, the laser, and even the World Wide Web were all invented by
physicists. These inventions form the foundation of modern technology.
More than 100,000 research papers have been written on the phenomenon of high-temperature superconductivity, yet there is still no accepted explanation of why these materials superconduct at the relatively "high" temperatures they do. Driven by the desire to create materials that superconduct at even higher temperatures (say, room temperature), and by the many current and potential applications,
this remains one of the most active areas of research in physics today. It is widely expected that whoever works out the correct mathematical
description of high-temperature superconductivity will win a Nobel Prize as well.
Before talking more about specific inventions, I want to introduce the
fundamental science that made them possible.
When physicists such as Planck, Bohr, de Broglie, Heisenberg, Schrödinger,
Dirac, and Einstein formulated quantum mechanics from 1900 to 1930, they
were trying to understand the fundamental laws of the universe, not invent something
of great economic importance. But it turns out they did, as we shall explain below.
And when the great physicist Paul Dirac said
in 1929 that all of chemistry
could, in principle, be explained in terms of the newly formulated theory of quantum
mechanics, probably few people believed him. But it turns out he was
right. As far as we know, the
structure of every atom in the universe is determined by quantum
mechanics. Today, chemists and materials scientists are trained extensively in quantum mechanics. Biologists like Francis Crick, who shared
the 1962 Nobel Prize
in Physiology or Medicine for the discovery of the structure of DNA, realized many years ago that even biology
is ultimately governed by the laws of physics and quantum mechanics.
A thorough understanding of quantum mechanics is necessary to engineer solid-state devices such as transistors. Transistors are the building blocks of
electronics and computers. It is impossible to understand semiconductors (the building blocks of transistors), or any material for that matter, with classical physics alone (i.e., the physics known before the discoveries of quantum mechanics and relativity). The physics of lasers
and the interaction of light with matter are described by what's called quantum
electrodynamics. Even the light entering your eye from this computer
screen requires quantum mechanics to understand! Elementary
particle physics describes the fundamental building blocks of the
universe in the language of relativistic quantum field theory, which is
basically quantum mechanics mixed with Einstein's relativity. Without
quantum mechanics, the "information age" (and much of modern
science) would not exist today.
The discovery of the electron by
physicist J.J. Thomson in 1897 was probably underappreciated when it occurred,
just like the development of quantum mechanics. After all, in 1897 it probably sounded like a waste of money to do experiments on a particle too tiny to ever see. But
of course, our civilization now depends on electronics, chemistry, materials science, medicine, etc.--all of which require an understanding of the electron.
It is difficult to estimate how much of current U.S. gross
domestic product would not exist without the discoveries of the
electron and quantum mechanics, but the figure would likely reach into the
trillions of dollars. The inventions of the computer, the transistor, and the World Wide Web are likewise at the root of trillions of dollars of economic activity. The laser is used in fiber optics, the basis of a global telecommunications industry worth over a trillion dollars.
The first electronic digital computer was built in the basement of the
physics department at Iowa State University in 1939 by Professor
John Atanasoff, who had a Ph.D. in theoretical physics from the University of Wisconsin,
and his physics graduate student Clifford Berry.
Atanasoff was awarded the National Medal of Technology in 1990 by U.S. President George Bush
in recognition of his invention. John Gustafson, an Ames Laboratory computational scientist,
pointed out: "It is not an exaggeration to say that Atanasoff's work is at
the root of trillions of dollars of our economy."
Atanasoff and Berry have never received proper recognition, at least from the general
public, few of whom know that an electronic digital computer was built as early as 1939, or that
it was designed and built by physicists (perhaps many think Bill Gates invented the computer?).
It is amazing to think that the computer industry, now worth hundreds of billions of dollars, owes
its existence to a brilliant physics professor and his talented graduate student, working away
at Iowa State University on a $650 research grant (no, that is not a typo), driven by their
own curiosity to think, design, and build something truly novel. Surely
they never dreamed their modest machine would have such a profound impact on the world.
The second electronic digital computer,
also proposed and designed by a physicist,
was completed in 1945. This computer, called the ENIAC, was largely based
on Atanasoff's pioneering work, and is discussed below.
In 1947, young physicists at Bell Laboratories in New Jersey inserted two gold contacts 1/64th of an inch apart into a slab of germanium and, by wiring up some electronics, discovered that the signal coming out of this semiconductor had at least 18 times the power of the signal going in--in other words, they had achieved amplification! Walter Brattain wrote in his lab notebook: "This circuit was actually spoken over and by switching the device in and out a distinct gain in speech level could be heard and seen on the scope presentation with no noticeable change in quality."
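To put that result in more familiar engineering terms, amplifier power gain is usually quoted in decibels. A quick sketch (the factor of 18 comes from the account above; the decibel conversion is the standard formula):

```python
import math

# The Bell Labs team measured at least an 18x power gain.
# Engineers quote power gain in decibels: dB = 10 * log10(P_out / P_in).
power_gain = 18
gain_db = 10 * math.log10(power_gain)
print(f"{gain_db:.1f} dB")  # about 12.6 dB
```

By modern standards a 12.6 dB gain is modest, but in 1947 it was the first solid-state amplification ever achieved.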
The transistor is the building block of all modern electronics and
computers (everything from a battery-operated watch, to a coffee maker, to a cell phone, to a
supercomputer). Microprocessors for modern personal computers, such as the Intel Pentium 4 processor, contain around 55 million transistors each. Unless you printed this page and are reading it in the woods, there are millions of transistors within a meter of you right now (and even in the woods you probably have a cell phone and are wearing a watch).
Before the invention of the transistor, computers used vacuum tubes. A single bulky vacuum tube did the job of one transistor--and the smallest transistors today are only about 80 atoms wide. Computers built from vacuum tubes filled huge rooms, yet were not powerful by today's standards. In 1945 the U.S. Army built a vacuum tube computer called the ENIAC, proposed and developed in part by physicist John W. Mauchly, who borrowed (or perhaps stole) many of his ideas and designs from physicist John Atanasoff (discussed above). The ENIAC cost about $500,000 ($5 million adjusted for inflation), took up a room the size of a suburban home, weighed 60,000 lbs, used 18,000 vacuum tubes, and was the fastest computer of its time. The vacuum tubes and cooling system consumed huge amounts of power--about $650 per hour worth of electricity.
But despite its size and cost, the vacuum tube-based ENIAC was only capable of about 1000 math operations per second, compared to around 1 billion operations per second for today's transistor-based personal computers. To put this in perspective, sometimes I perform physics calculations on a modern desktop computer that take about 30 minutes to run. It's a good thing I am not using the ENIAC, or these calculations would each take 60 years! Of course, modern supercomputers are even faster than desktops. A calculation that takes just 15 seconds on today's fastest supercomputer would take 19,000 years on the ENIAC, meaning we would have had to start the calculation during the ice age for it to be finished by now.
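The comparison above is easy to check. A back-of-envelope sketch using the round figures quoted in the text (about 1000 operations per second for the ENIAC, about 1 billion for a modern desktop):

```python
# Rough sanity check of the ENIAC-vs-desktop comparison; all values are
# the order-of-magnitude estimates quoted in the text above.
eniac_ops = 1_000            # ~1000 math operations per second
desktop_ops = 1_000_000_000  # ~1 billion operations per second

speedup = desktop_ops // eniac_ops  # a factor of one million

# A 30-minute desktop calculation, run instead on the ENIAC:
eniac_minutes = 30 * speedup
eniac_years = eniac_minutes / (60 * 24 * 365)
print(round(eniac_years))  # ~57 years, i.e. roughly the "60 years" quoted
```

The million-fold speedup is the whole story: any job measured in minutes on a desktop becomes a job measured in decades on the ENIAC.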
Thanks to transistors, today's personal computers pack all their computational power into a tiny microchip the size of a cracker that costs only a couple hundred bucks and uses very little electricity. If a modern notebook computer were made with vacuum tubes, the tubes, power system, wiring, and cooling equipment would fill an entire skyscraper! It has also been calculated that a cell phone would be the size of the Washington Monument. Vacuum tubes were not only big, expensive, and hot; they were also unreliable and burned out frequently. Because the tubes glowed and gave off heat, they attracted moths and other bugs, which caused short circuits. Technicians had to periodically "debug" the machine--literally shutting down the computer and cleaning out the dead bugs (which is
why, to this day, fixing computer problems is called debugging).
The vacuum tube-based computers
built for the SAGE project, the brainchild
of physicist George E. Valley, were even larger than the ENIAC--in fact, they are the largest computers ever built. Each of the 23 computers in the SAGE network, the last of which went online in 1962, contained 55,000 vacuum tubes, weighed 250 tons, and took up two stories of a building. The total cost of SAGE was around $10 billion ($60 billion adjusted for inflation), more than double the inflation-adjusted price of the Manhattan Project that built the first atomic bomb! Their power consumption was enormous; if you ran a SAGE computer in your house today, your electricity bill would be around $150,000 per month. The SAGE computers were a breakthrough for their day, but today an $8 handheld calculator built with transistors will outperform them, and run on watch batteries to do it. The affordability, small size, and power of modern computers and electronics would never have been achieved without the invention of the transistor. The information age as we know it simply would not exist.
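The $150,000-per-month figure is about what you would expect for a multi-megawatt machine. A hypothetical check, assuming each SAGE computer drew roughly 3 megawatts and electricity costs about $0.07 per kilowatt-hour (both inputs are my assumptions, not figures from the text):

```python
# Hypothetical check of the "$150,000 per month" electricity estimate.
# Assumed inputs (not from the text): ~3 MW draw per SAGE computer,
# and a residential rate of about $0.07 per kWh.
power_kw = 3_000           # 3 megawatts, expressed in kilowatts
hours_per_month = 24 * 30  # a 30-day month of continuous operation
rate_per_kwh = 0.07        # dollars per kilowatt-hour

monthly_bill = power_kw * hours_per_month * rate_per_kwh
print(f"${monthly_bill:,.0f} per month")  # about $151,200
```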
In the 1980s, the thousands of physicists at the CERN Particle Physics
Laboratory in Geneva needed a better way to exchange information with their
colleagues working at universities and institutes all over the world. Tim Berners-Lee, a graduate of Oxford University with first-class
honors in physics, invented the World Wide Web at CERN in 1990 to meet this demand. Along with creating the first web browser and web server, he
developed the software conventions that are key to the Web's usefulness,
with acronyms like URL (uniform resource locator) and HTTP (hypertext
transfer protocol). Berners-Lee's supervisor was physicist D. M. Sendall, who gave him the initial go-ahead on the project.
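Those conventions are visible in every web address. A minimal sketch using Python's standard library (the URL here is a placeholder for illustration, not an address from the original page):

```python
from urllib.parse import urlparse

# A hypothetical URL, broken into the parts Berners-Lee's conventions define.
url = "http://www.example.org/physics/inventions.html"
parts = urlparse(url)

print(parts.scheme)  # 'http' -> the transfer protocol (HTTP)
print(parts.netloc)  # 'www.example.org' -> the host serving the resource
print(parts.path)    # '/physics/inventions.html' -> the resource on that host
```

The scheme names the protocol, the host names the server, and the path names the document: together they form a uniform way to locate any resource on the Internet.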
Between 1990 and 1993, the Web was mostly used by scientists to collaborate on their research. In 1993 it began to spread to the rest of the world, and today the majority of Americans surf the Web. The number of websites grew from just 130 in June 1993 to around 9 million in 2002. Now
over a trillion dollars worth of commerce takes place over the Internet every year!
Much of this e-commerce is done over the World Wide Web.
(As you may know, the terms Web and Internet do not mean the same thing. The Web, which you are surfing now, uses the Internet but is not the only communication service on it. Before the invention of the Web, few people in the general population used the Internet, but it did exist.) What began as a better way for physicists to manage information and communicate--the World Wide Web--is now a vast "global information superhighway," accessible to all.
In 1999 Time magazine dubbed Berners-Lee one of the 100 greatest minds of the century. In 2004, he won the first annual Millennium Technology Prize, an "international acknowledgement of outstanding technological innovation that directly promotes people's quality of life," with an award of $1.2 million.
CD players, CD-ROMs, CD-burners, and DVD players all use lasers to read data.
Without fundamental research in physics by Einstein,
the inventors of the laser, and others, the CD and other applications of the laser, such as fiber optics, which underpin
industries worth billions of dollars, would
not exist. It is ironic that, like so many other discoveries in physics, the laser was at first thought by many to have no practical use whatsoever.
It is imperative that the federal government and private industry continue to fund fundamental research in physics so that
physicists can continue to make discoveries and inventions as important as those of the past. Indeed,
it would make economic sense to increase that funding: the small share of GDP that goes toward fundamental research in physics is only a tiny fraction of the trillions of dollars that physicists' inventions have contributed to the economy. Unfortunately, in the United States, federal R&D expenditures for all the physical
sciences combined were only 0.7 percent of GDP in 2000, and federal funding for physics declined 20% between 1993 and 2000. If we wish to remain a prosperous, innovative nation, this trend cannot continue.
Only time will tell what the next groundbreaking invention by physicists will be, but if history is any guide, we can be sure there will be one. Perhaps it will be a quantum computer, capable of speeds millions of times faster than current computers. Or perhaps it will be a 2000 MPH levitating train, made possible by research in superconductors. But most likely, it will be something no one has thought of yet.