1804: Dalton’s Atomic Theory

The quest to discover the basic building blocks of matter has always been an important goal of physicists.  Today most people are familiar with the atomic theory, but when it was first fully developed in the early 19th century it was considered a revolutionary conceptual leap.  The first complete atomic theory was proposed by John Dalton, and his atomic theory of matter laid the groundwork for modern physics and chemistry.  

Background and Development of Dalton’s Atomic Theory

The concept of atoms as fundamental units of matter had been around for centuries, dating back to the ancient Greek philosophers.  However, the ideas of the Greeks were purely philosophical and lacked any experimental evidence. They therefore lay mostly dormant for many centuries.

By the start of the 19th century chemistry was undergoing significant changes.  The old ideas of alchemy were becoming obsolete and were being replaced by a more systematic approach.  Alchemy was the predecessor of modern chemistry, but it was mostly concerned with the transmutation of materials.  Beginning in the 17th century, scientists such as Robert Boyle began challenging the ideas of alchemy.  However, the true chemical revolution came in the late 18th century, particularly with the work of Antoine Lavoisier.  

Lavoisier is often referred to as the “father of modern chemistry,” and with good reason.  He conducted meticulous quantitative experiments that established the Law of Conservation of Mass and refuted the phlogiston theory, the prevailing explanation for combustion and rusting.  The Law of Conservation of Mass was a crucial idea that John Dalton used in formulating his atomic theory.  

One final idea that Dalton used to establish his atomic theory was developed by the French chemist Joseph Proust and is known as the law of definite proportions.  Proust demonstrated that chemical compounds always contain the same proportion of elements by mass, implying that elements combine in fixed ratios.  

The Development of Dalton’s Atomic Theory

John Dalton’s A New System of Chemical Philosophy, 1808

John Dalton was an English Quaker who was the first person to put forth a complete atomic theory of matter.  Dalton did not come from a wealthy background, and his scientific interests developed gradually after he was influenced by a wealthy local Quaker named Elihu Robinson. Robinson spurred Dalton’s interest in meteorology, and in March 1787 Dalton began a meteorological diary in which he recorded daily observations until his death in 1844.

Dalton’s keen interest in meteorology led him to think deeply about the ways gases mix, providing the initial steps in the formulation of his atomic theory. He eventually conducted several experiments focused on gases, especially what happened when they combined.  In 1803, Dalton published his first list of atomic weights for a number of elements based on his experiments and observations.  He soon became convinced of a few key ideas that became the central principles of his atomic theory. Dalton presented his full ideas in lectures to the Royal Institution in December 1803 and January 1804.

Dalton’s atomic theory consisted of these key principles:

  1. Elements are composed of atoms: Dalton proposed that all elements are composed of indivisible particles called atoms.  These atoms are unique to each element and have certain properties that determine the characteristics of the element.
  2. Atoms are indestructible and indivisible: Dalton believed that atoms were indestructible and did not consist of any smaller, more fundamental particles. 
  3. Atoms of the same element are identical: Dalton suggested that atoms of the same element are identical in mass, size, and chemical properties.
  4. Compounds are formed by combinations of atoms: Dalton suggested that compounds are formed by combining atoms of different elements in fixed ratios.
  5. The law of multiple proportions: This law states that when two elements combine to form more than one compound, the masses of one element that combine with a fixed mass of the other element are in small whole-number ratios (a small worked example follows this list).
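
As a worked example of the law of multiple proportions, consider the two oxides of carbon: fixing the mass of carbon, the masses of oxygen in carbon monoxide and carbon dioxide stand in a 1:2 ratio. Here is a minimal sketch using rounded modern composition figures, which Dalton of course did not have:

```python
# Illustrative check of the law of multiple proportions. The oxygen mass
# fractions below are rounded modern values for the two carbon oxides;
# Dalton reasoned from cruder measurements of the same kind.
compounds = {
    "carbon monoxide": 0.571,  # fraction of mass that is oxygen
    "carbon dioxide":  0.727,
}

# Mass of oxygen that combines with 1 gram of carbon in each compound:
oxygen_per_gram_carbon = {
    name: frac_o / (1 - frac_o) for name, frac_o in compounds.items()
}
print(oxygen_per_gram_carbon)  # ~1.33 g and ~2.67 g

ratio = (oxygen_per_gram_carbon["carbon dioxide"]
         / oxygen_per_gram_carbon["carbon monoxide"])
print(round(ratio, 2))  # ~2.0 -- a small whole-number ratio
```
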
Dalton’s depiction of various atoms, taken from A New System of Chemical Philosophy, 1808

In 1808, he published a book titled A New System of Chemical Philosophy, the seminal work that laid the foundation for modern atomic theory. Dalton’s work, however, did not make an immediate impact, as many people still found it hard to accept the idea of atoms. Most considered it a useful heuristic device, a tool to describe how elements behave, but were not convinced that atoms are actually real, physical entities. It took nearly another 100 years for the first definitive proof of atoms, when Albert Einstein published his 1905 paper on Brownian motion, whose predictions were later confirmed experimentally by Jean Perrin.

Impact and Legacy of Dalton’s Atomic Theory

Dalton’s atomic theory had a profound impact on the development of the emerging science of chemistry, and it laid the groundwork for many further scientific discoveries.  It contributed to the development of the periodic table of elements as scientists discovered more elements and their associated properties.  It also helped scientists understand chemical reactions better by viewing these reactions as interactions between individual atoms.  Although many aspects of Dalton’s atomic theory have been revised and updated based on new discoveries, his core principle of atoms as the basic building blocks of the elements remains fundamental to modern atomic theory.  

Continue reading more about the exciting history of science!

1745: The Leyden Jar

The invention of the Leyden Jar marked a significant moment in the history of electrical engineering. The Leyden Jar can be thought of as the first electrical capacitor – a device that stores and releases electrical energy.

The Invention of the Leyden Jar

During the 18th century the mysterious phenomenon of electricity was becoming a hot topic among learned men of science. Electricity could only be created and observed in the moment. One of the mysteries to be solved was whether electricity could be stored for later use and how to accomplish it. The invention that solved the mystery became known as the Leyden jar, named after the city of an early inventor of the device.

Drawing of a Leyden Jar

The Leyden Jar is typically credited to two individuals, who independently came up with the same idea.  In Germany, Ewald Georg von Kleist was experimenting with electricity.  He was attempting to store electricity with a medicine bottle filled with water and a nail inserted through a cork stopper.  He charged the jar by touching the nail with an electrostatic generator, and he assumed that the glass jar would prevent the electricity from escaping.  While holding the glass jar in one hand, he accidentally touched the nail and received a significant shock, proving that electricity was indeed stored inside the jar.  

Von Kleist’s experiments were not well known, and around the same time another man experimenting with electricity, Pieter van Musschenbroek of Leyden in the Netherlands, stumbled upon the same invention.  Musschenbroek’s device was much like von Kleist’s jar.  It consisted of a glass jar filled with water, with a metal rod passing through a cork that sealed the top of the jar.  The outside of the jar was coated with metal foil.  When an electric charge was applied to the metal rod, it was found that electricity could be stored in the jar.  Unfortunately for the person touching the metal rod, a significant shock was received. As Musschenbroek recorded of the first time he touched the rod:

Suddenly I received in my right hand a shock of such violence that my whole body was shaken as by a lightning stroke…. I believed that I was done for.

It didn’t take long before Musschenbroek’s Leyden jar was being used and improved by others.  The following year, 1746, the English physician William Watson improved the jar’s storage capacity by coating both the inside and outside with metal foil.  That same year, the French physicist Jean-Antoine Nollet discharged a Leyden jar in front of the French King Louis XV.  During the demonstration, Nollet arranged a circle of 180 Royal Guards holding hands, and he passed the charge from the Leyden jar through the circle.  The shock was felt almost instantaneously by all members of the Royal Guard, to the delight of the King and his court.  Demonstrations such as this brought widespread attention to the exciting new field of electricity.  

Impact and Legacy of the Leyden Jar

Prior to the invention of the Leyden jar, electricity could only be observed and experimented with at the moment it was created.  The Leyden jar changed this by allowing scientists to store electrical energy and use it when needed.  Researchers could now conduct various experiments related to electric discharge, conductivity, and other electrical phenomena.  As an added bonus, the jar was easily transported, especially compared to the electrostatic generators of the day.  Jars could also be linked together to provide additional storage capacity.  These abilities contributed to the growth of knowledge in the field of electrical science.  
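
In modern terms, a Leyden jar is simply a capacitor, so its stored energy follows E = ½CV², and linking jars in parallel adds their capacitances. The sketch below is a minimal illustration with assumed values (a typical jar stores on the order of a nanofarad; the voltage is an assumption for the example):

```python
# Hypothetical sketch: energy stored in a bank of Leyden jars treated as
# ideal capacitors wired in parallel. All values are illustrative
# assumptions, not measurements from any historical experiment.
def bank_energy_joules(capacitances_farads, voltage_volts):
    # Capacitors in parallel: total capacitance is the sum.
    c_total = sum(capacitances_farads)
    # Energy stored in a capacitor: E = 1/2 * C * V^2
    return 0.5 * c_total * voltage_volts ** 2

jars = [1e-9] * 4  # four jars of ~1 nF each (assumed)
print(bank_energy_joules(jars, 20_000))  # ~0.8 J at an assumed 20 kV charge
```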

As one example, the American scientist and statesman Benjamin Franklin famously used the Leyden jar during his kite experiment in 1752.  In that experiment, Franklin flew a kite during a lightning storm in an attempt to prove that lightning was a form of electricity.  He attached a metal key to the kite string, and the key was then connected to a Leyden jar.  Despite some popular accounts of the experiment, lightning likely never struck the kite directly, or else Franklin would have been killed. However, he was able to observe that the Leyden jar was being charged, thus proving the electric nature of lightning.  

The Leyden jar is also considered the first electrical capacitor, which today is a fundamental component in modern electric circuits.  The invention of the Leyden jar laid the groundwork for the development of more sophisticated capacitors.  While Leyden jars briefly fell out of use after the invention of the battery, the basic idea of the Leyden jar capacitor found renewed use at the end of the 19th century in the electronic devices that followed, albeit in a much smaller form. 

Overall, the Leyden jar played a pivotal role in the early explanation and understanding of electricity.  Its impact can be seen in the subsequent development of electrical technology and science.  

Continue reading more about the exciting history of science! 

1932: Discovery of the Neutron

The neutron was discovered by the British physicist Sir James Chadwick in 1932, marking a pivotal moment in the understanding of atomic structure. It was, in a way, the culmination of a series of scientific investigations of the subatomic particles of the atom that spanned several decades. The identification of the neutron provided answers to questions about the mass of the atom, ultimately leading to important developments in nuclear physics.

Developments that Led to the Discovery of the Neutron

Nuclear Structure of the Atom
Nuclear Structure of the Atom
In 1897, J. J. Thomson discovered the electron, a subatomic particle with a negative electrical charge. This discovery provided the first evidence that atoms were composed of smaller particles. Two decades later, Ernest Rutherford discovered the proton, a subatomic particle with a positive electrical charge. Rutherford proposed a model of the atom with a dense, positively charged nucleus at its center, orbited by negatively charged electrons. However, this model presented a problem: the positively charged protons in the nucleus should all repel each other, causing the nucleus to burst apart. Yet this does not happen, as the nucleus is obviously stable, and the reasons for this stability were not understood at the time. The existence of a neutral particle was postulated by Rutherford as early as 1920.

The Discovery of the Neutron

The discovery of the neutron was the culmination of a series of experiments carried out by several scientists in the late 1920s and early 1930s. In Germany, Walther Bothe found that beryllium exposed to alpha particles produced a new form of radiation. Bothe attempted to explain this radiation in terms of gamma rays, because it was not deflected by either electric or magnetic fields, and all known particles at the time (electrons and protons) carried a charge.

Taking this information a step further, the French husband-and-wife team of Irène Joliot-Curie (the daughter of Pierre and Marie Curie) and Frédéric Joliot reported results from an experiment in January 1932 that led to the neutron’s discovery. In the same vein as Bothe, their experiment involved the bombardment of beryllium by alpha particles. They noticed that the radiation could eject protons from hydrogen-rich substances such as paraffin.  This was puzzling because gamma rays should not have enough energy to knock out protons in this manner. In other words, if this unknown radiation was indeed gamma rays, then the law of conservation of energy was being violated.

Experiment by James Chadwick that led to the Discovery of the Neutron
(Credit: scienceready.com)

Back at the Cavendish Laboratory in Cambridge, James Chadwick, a colleague of Rutherford, quickly became interested in these results.  Chadwick and Rutherford had been working on and off over the previous decade to identify the missing neutral particle suspected to be in the atomic nucleus.  This background allowed Chadwick to move quickly.  He conducted a series of experiments in which he bombarded light elements, such as beryllium, with alpha particles.  He observed the same radiation being emitted, undeflected by electric or magnetic fields.  However, he interpreted the results differently than the others who had conducted similar experiments. His observations led him to correctly conclude that the radiation was composed of uncharged particles. He used the laws of conservation of momentum and conservation of energy to calculate that the neutron has a mass similar to that of a proton.  He presented this as evidence of a new subatomic particle, which he detailed in a paper published in 1932, naming the particle the neutron.  For his work he was awarded the Nobel Prize in Physics in 1935.  
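
Chadwick’s mass calculation can be reconstructed from elastic-collision kinematics: a neutral particle of mass m striking a stationary nucleus of mass M head-on gives it a maximum recoil speed u = 2mv/(m + M), and taking the ratio of the recoil speeds for two different target nuclei eliminates the unknown neutron speed v. The sketch below uses recoil speeds of the order Chadwick reported; treat the numbers as illustrative rather than as his exact data.

```python
# Reconstruction of Chadwick's mass estimate from elastic-collision
# kinematics. For a head-on elastic collision, a projectile of mass m and
# speed v gives a stationary nucleus of mass M a maximum recoil speed
#     u = 2 m v / (m + M).
# The ratio of recoil speeds for two targets eliminates the unknown v:
#     u_H / u_N = (m + M_N) / (m + M_H)  ->  solve for m.
M_H, M_N = 1.0, 14.0  # target masses in atomic mass units
u_H = 3.3e9           # max proton recoil speed, cm/s (order reported)
u_N = 4.7e8           # max nitrogen recoil speed, cm/s (order reported)

r = u_H / u_N
m_neutron = (M_N - r * M_H) / (r - 1)
print(round(m_neutron, 2))  # ~1.16 amu, close to the proton's mass
```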

Impact of the Neutron’s Discovery

The discovery of the neutron was a key piece of the puzzle that allowed scientists to understand the binding energy of the atomic nucleus.  It had enormous impacts in both applied and theoretical physics.  

The discovery of the neutron explained the missing mass in atomic nuclei.  This in turn helped to explain the existence of isotopes – variants of elements with the same number of protons but different atomic weights – and therefore different numbers of neutrons in the atomic nuclei.  It also advanced the understanding of radioactive decay processes.  

The most important impact of the discovery of the neutron was in nuclear physics.  The neutron – a particle without an electric charge – was the crucial component in the development and study of nuclear fission, discovered in 1938 by Otto Hahn and Lise Meitner.  Nuclear fission was quickly applied to the development of nuclear energy and weapons.  The neutron plays the key role in the chain reactions that occur in both nuclear reactors and atomic bombs.  It is probably fitting, then, that James Chadwick was placed at the head of the British team that worked on the Manhattan Project, which produced the world’s first atomic bomb. 

Beyond nuclear fission, the discovery of the neutron aided our understanding of the nuclear processes that power the stars through nuclear fusion.  The study of this process has advanced our understanding of the origins and evolution of the elements.  

Continue reading more about the exciting history of science!

1915: General Relativity

It has been famously stated that if Albert Einstein hadn’t published his theory of special relativity, someone else likely would have within the decade. But if Einstein hadn’t published his theory of general relativity we would still be waiting on that discovery to this day. The theory of general relativity, proposed by Einstein in 1915, is one of the most profound and revolutionary achievements in the history of science and transformed our understanding of space, time, and gravity.

Background on the Theory of General Relativity

Curvature of Spacetime, as described by General Relativity
(Credit: blackholecam.org)

For nearly two centuries, Isaac Newton’s law of universal gravitation stood unquestioned and unchallenged in the field of physics.  However, around the end of the 19th century some tiny yet troubling cracks in its armor were beginning to emerge.  Physicists noticed it was unable to explain certain phenomena, such as the orbit of Mercury.  They were also struggling to reconcile classical mechanics with new observations of the nature of light and electromagnetism.  A relatively unknown physicist at the time named Albert Einstein provided the solutions to these vexing problems.

Einstein’s journey towards his theory of general relativity began in 1905 with the publication of his theory of special relativity.  Special relativity primarily focuses on objects moving at a constant speed.  It does not treat acceleration or gravity, which Einstein later determined to be essentially the same thing once he formulated general relativity.  

Between 1907 and 1915 Einstein worked on extending his theory of special relativity to include gravity.  The mathematical complexity of this task proved enormous, and Einstein had to learn various advanced mathematical techniques to complete his ideas.  He worked with mathematicians to help him understand the underlying math needed to formulate his theory. 

Space, Time, and a Sliver of Curvature

In November 1915, Einstein presented his revolutionary ideas on general relativity to the Prussian Academy of Sciences. His theory describes how mass and energy curve the fabric of spacetime, effectively producing the force of gravity.  A major implication of general relativity is that it redefined gravity: no longer a force acting at a distance, gravity became a geometric property of spacetime itself. The core of general relativity is a set of field equations, which relate the geometry of four-dimensional spacetime to the distribution of mass and energy within it. These equations are known as the Einstein field equations.  
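
In modern notation (including the cosmological constant $\Lambda$ that Einstein introduced a few years later), the field equations can be written compactly as

$$G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}$$

where $G_{\mu\nu}$ encodes the curvature of spacetime and $T_{\mu\nu}$, the stress-energy tensor, describes the distribution of mass and energy: the left-hand side is geometry, the right-hand side is matter.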

General relativity also implies an equivalence principle, which states that the effects of gravity are locally indistinguishable from the effects of acceleration. In other words, someone in a closed space cannot determine whether they are feeling the effects of gravity or of acceleration.  A classic example is the elevator thought experiment.  Imagine you are in an elevator with no windows.  If you drop an object, it falls to the floor at an acceleration of 9.81 m/s² due to Earth’s gravity.  However, you could also be in distant space, far away from any gravitational field, accelerating at a constant rate of 9.81 m/s².  In the second scenario, when you drop an object it will also fall to the floor at an acceleration of 9.81 m/s² – the effects of the two scenarios are indistinguishable.  

Testing and Acceptance

General relativity has been repeatedly verified by observation and experimentation.  One of the first key confirmations came in 1919 during a solar eclipse expedition led by Sir Arthur Eddington.  The goal of the expedition was to measure the apparent shift in the positions of stars near the Sun during the eclipse.  Indeed, the observed shift matched the predictions made by general relativity.  

Photograph of the solar eclipse taken by Arthur Eddington’s expedition in 1919

Another significant test of general relativity was the perihelion precession of Mercury’s orbit.  Mercury’s orbit exhibits a small deviation in the orientation of its ellipse over time, a phenomenon known as perihelion precession.  Prior to general relativity, astronomers struggled to account for this advance; Newtonian mechanics failed to explain the discrepancy.  However, when Einstein’s equations of general relativity were applied to the problem, they yielded an answer that matched the observed rate of precession.  This successful prediction, along with the results of Eddington’s expedition, provided crucial empirical evidence for the validity of general relativity.  General relativity also predicts other phenomena, such as gravitational redshift and time dilation, both of which have been confirmed through experiments as well.

As empirical validation rolled in, general relativity gradually gained acceptance within the scientific community.  By the latter half of the 20th century, it had become fully accepted as one of the twin pillars of physics, along with quantum mechanics.  The acceptance of general relativity can be viewed as a watershed moment in the history of science, marking the transition from the Newtonian worldview to a relativistic one.  

Continue reading more about the exciting history of science!

1860s: The Electromagnetic Spectrum

The existence of a form of light other than visible light was inconceivable prior to the 19th century. However, in nature there exists a continuum of radiation waves, all traveling at the speed of light, with a nearly infinite range of possible wavelengths and frequencies. This is the electromagnetic spectrum, and the first new form of light was unexpectedly discovered in 1800 by the British scientist William Herschel.

The Discovery of Infrared Rays

William Herschel’s experiment leading to the discovery of infrared rays
(Credit: NIRS Research)

William Herschel is best known for his discovery of the planet Uranus, but his interests also extended beyond the visible universe. Herschel was fascinated by the nature of heat and light, and his work in this area led to the discovery of a completely new and unexpected form of light.

In 1800 Herschel conducted an experiment in which he directed sunlight through a glass prism to create a spectrum of visible light. He then measured the temperature of the different colors of light and noticed that the temperature increased as he moved from the violet to the red end of the spectrum. Then, in a moment of true scientific curiosity, he took his experiment one step further. He decided to measure the temperature just beyond the red light, where no sunlight was visible. This region showed the highest temperature of all, leading Herschel to conclude that some form of invisible light must be present that we cannot see. He named this invisible radiation infrared, from the Latin ‘infra’ meaning “below” – in this case, below the red in the spectrum.

The importance of this discovery can hardly be overstated. It added an entirely new dimension to our perception of the universe and to this day has had significant practical applications in a variety of fields such as astronomy, telecommunications, healthcare, and environmental science. It also suggested that there may be other forms of light yet to be discovered.

The Electromagnetic Spectrum
(Credit: www.miniphysics.com)

A Continuous Spectrum of Electric and Magnetic Fields Oscillating Together

Shortly after Herschel’s discovery of infrared rays, the German chemist Johann Wilhelm Ritter discovered another form of invisible rays. Ritter was inspired by Herschel’s discovery, and he decided to experiment at the opposite end of the spectrum, beyond violet. Ritter also focused sunlight through a glass prism to create a spectrum of colors. He noticed that silver chloride, a chemical known to darken when exposed to sunlight, darkened faster when exposed to the region beyond violet light. He had discovered what would later become known as ultraviolet rays, from the Latin ‘ultra’ meaning “beyond” – in this case, beyond violet in the spectrum. The discovery of ultraviolet and infrared rays prompted the search for other types of electromagnetic rays.

The first X-ray photograph of Anna Bertha Ludwig’s hand, the wife of Wilhelm Roentgen. The bones and wedding ring are visible.

In the 1860s the British physicist James Clerk Maxwell revolutionized the world of physics with his electromagnetic theory. He developed four mathematical equations, now known as Maxwell’s equations, that describe how electric and magnetic fields interact. These equations unified the previously separate fields of electricity and magnetism. His theory suggested that light was a form of electromagnetic wave, and that visible light was just a small part of a much broader spectrum: the electromagnetic spectrum. In 1867, Maxwell predicted that there should be wavelengths of light longer than the infrared rays discovered by Herschel.
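
In the modern vector notation later introduced by Oliver Heaviside, the four equations (shown here with charge density $\rho$ and current density $\mathbf{J}$) read

$$\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}$$

In empty space these combine into a wave equation whose solutions travel at speed $c = 1/\sqrt{\mu_0 \varepsilon_0}$, the speed of light, with wavelength and frequency related by $c = \lambda \nu$ – the relation that ties every band of the spectrum together.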

It took another two decades, but in 1887 the German physicist Heinrich Hertz confirmed Maxwell’s predictions when he produced radio waves in his laboratory. Light with shorter wavelengths was discovered next. In 1895 Wilhelm Roentgen accidentally discovered what became known as X-rays while experimenting with cathode ray tubes. It took nearly another two decades until scientists were able to determine that these X-rays were indeed another form of light. The last part of the spectrum to be discovered was gamma rays. The French physicist Paul Villard discovered these in 1900 while studying the radiation emitted by radium.

Far Reaching Technological Impacts

The discovery of the electromagnetic spectrum has had far-reaching impacts on 20th century technology and beyond. The development of radio and television technology was predicated on the understanding of radio waves. In similar fashion, the invention of radar during World War II was possible due to the understanding of microwaves.

From science and technology to medicine and daily life, the impacts of the electromagnetic spectrum are almost too vast and numerous to mention. But to name a few – radio and television broadcasting, mobile phones and the internet, X-rays used in healthcare, and space telescopes and satellite-based sensors are just some of the many areas in which electromagnetic radiation has had an impact on civilization.

Continue reading more about the exciting history of science!

1927: The Big Bang

What is the origin of the universe? This is the ultimate origin question, and the Big Bang is the scientific theory that proposes to answer it. According to the Big Bang theory, the observable universe began in a singularity of infinite density and temperature at a specific moment in time some 13.8 billion years ago. It then rapidly expanded into the vast and complex structure that we see today. The Big Bang theory has revolutionized our understanding of the universe by providing a framework to explain the origin and evolution of everything we see around us.

The Big Bang Expansion
(Credit: Wikimedia Commons)

Deconstructing the Problem of the Origin of the Universe

The origin of the universe has been a topic of human interest since the beginning of recorded history, and was likely discussed and debated from the time language evolved. Only recently have we been able to make significant progress in outlining a theory of the origin of the universe grounded in observable evidence. This is mainly because, up until the last century, scientists had limited observational evidence with which to accurately account for the universe’s beginnings. Powerful telescopes and other advanced instruments have since allowed scientists to learn far more about the universe’s structure, composition, and history.

In the early 20th century there were two main competing theories of the origin of the universe. The first and most popular was the steady-state theory, which postulated that the universe had no beginning or end and existed in a state of perpetual equilibrium. The second was the expanding universe theory, which eventually came to be known as the Big Bang; it postulated that the universe began as an incredibly hot and dense point, a singularity, and has been expanding and cooling ever since. These are two very different views of the universe’s evolution. Eventually, key lines of evidence began to emerge that settled the debate, leaving only the Big Bang theory standing.

Key Lines of Evidence for the Big Bang Theory

The Evolution and Structure of the Universe
(Credit: European Space Agency)

There are three convincing lines of evidence typically put forward in support of the Big Bang theory.  When all three are taken together, they provide a compelling validation of the theory.

  1. Universal expansion of galaxies – this refers to the observation that the galaxies are moving away from each other, with the most distant galaxies receding the fastest.  This phenomenon was first discovered by the astronomer Edwin Hubble in the 1920s when he observed that light from distant galaxies is always shifted towards the red end of the spectrum (a minimal calculation follows this list). This line of evidence taken by itself does not prove the Big Bang: there could be a distant center where new matter is being created and ejected, pushing everything away. 
  2. Cosmic microwave background radiation – Arno Penzias and Robert Wilson, while working with a radio antenna at Bell Labs, accidentally discovered the cosmic microwave background (CMB).  The CMB is a form of electromagnetic radiation that fills the universe, is detectable in all directions, and is believed to be the residual heat left over from the Big Bang. It has a mostly uniform temperature of approximately 2.7 Kelvin; the small variations in that temperature have been studied to provide important insights into the early universe, such as the distribution of matter and the formation of galactic structure.
  3. Relative abundance of light elements in the universe – The first few minutes of the early universe featured extremely hot and dense conditions, with temperatures of billions of degrees and densities of billions of particles per cubic centimeter. As the universe rapidly expanded and cooled, nuclear fusion could occur, and the protons and neutrons that made up the early universe began to combine to form the lightest elements, in a process called Big Bang nucleosynthesis. The observed ratios of light elements such as hydrogen, helium, and lithium closely match the predictions made by the Big Bang theory.
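
The expansion evidence in item 1 is quantified by Hubble’s law, v = H₀d: recession velocity grows linearly with distance. A minimal sketch, assuming a commonly quoted modern value of H₀ ≈ 70 km/s per megaparsec (Hubble’s own early estimate was far larger):

```python
# Minimal sketch of Hubble's law, v = H0 * d. The value of H0 is an
# assumption for illustration (~70 km/s per Mpc, a commonly quoted
# modern figure, not Hubble's original estimate).
H0 = 70.0  # km/s per megaparsec

def recession_velocity_km_s(distance_mpc):
    # Velocity is proportional to distance: the hallmark of expansion.
    return H0 * distance_mpc

for d in (10, 100, 1000):  # galaxies at 10, 100, and 1000 Mpc
    print(d, recession_velocity_km_s(d))  # 700, 7000, 70000 km/s
```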

Along with these three key lines of evidence there are other pieces of evidence that strongly support the Big Bang theory. The large scale structure of the universe and the age of the universe are both consistent with the predictions of the theory. Taken together, these lines of evidence provide strong support for the Big Bang theory, making it the most plausible explanation for the origin and evolution of the universe.

Continue reading more about the exciting history of science!

1960s: Plate Tectonics

Plate tectonics is a scientific theory that explains the movement of the Earth’s surface and many of its most prominent geological features. It is responsible for everything from the deepest trench in the ocean to the tallest mountain on land – and in the ocean, as we’ll soon see.  The uppermost layer of the Earth, called the lithosphere, is composed of large rocky plates that sit on top of a hotter, partially molten layer of rock called the asthenosphere.  Convection currents between the two layers cause the plates to glide at a rate of a few centimeters per year.  While that may not seem like much, given enough time this process has formed many of the geologic features of our planet.  It has resulted in the formation of the Himalayas and the separation of the South American and African continents. 
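
To get a feel for the timescales involved, a drift rate of a few centimeters per year compounds into continental-scale displacement over geologic time. The sketch below assumes a steady 3 cm per year over roughly 130 million years; both figures are deliberately rough, illustrative values:

```python
# Rough arithmetic: how far do plates moving a few centimeters per year
# travel over geologic time? Rate and timespan are illustrative assumptions.
rate_cm_per_year = 3.0   # assumed steady drift rate
years = 130_000_000      # ~130 million years, roughly since the
                         # South Atlantic began to open

distance_km = rate_cm_per_year * years / 100 / 1000  # cm -> m -> km
print(distance_km)  # 3900 km -- about the width of the South Atlantic
```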

History of Plate Tectonics

The theory of plate tectonics is a relatively new idea, though fragments of it can be found in earlier times. In 1596 the cartographer Abraham Ortelius observed that the coastlines of Africa and South America could be fitted together like pieces of a jigsaw puzzle.  He speculated that the continents may once have been joined but had since been ripped apart by earthquakes or a flood.  Creationists have seized on this idea to “prove” the existence of Noah’s Flood.  It does provide an arresting image, as Richard Dawkins points out, of South America and Africa racing away from each other at the speed of a tidal wave. In the 1850s a noticeable correlation was found between the rock types and fossils discovered in coal deposits across the two continents.  The Mid-Atlantic Ridge was mapped in 1872. 

In the early 20th century a few versions of continental drift began making an appearance, but one stood out in its influence on the development of the Earth sciences. In 1912 Alfred Wegener published two papers proposing his controversial theory of continental drift.  He suggested that in the past all landmasses were joined together in a supercontinent he called Pangaea (meaning “all lands”).  However, he did not have a geological mechanism for how the landmasses drifted apart, and the idea was met with much skepticism at first. Over a period of decades Wegener continued to amass evidence for his theory and to promote his model of continental drift. He gathered evidence from paleoclimatology to provide a further boost to the idea, noting glaciations in the distant past occurring simultaneously on continents that were not connected to each other and lay outside the polar regions. Unfortunately, Wegener died in 1930 while on an expedition in Greenland gathering some of this data. Many of his specific details have turned out to be incorrect, but his overall concept that the continents are not static and move over time has been proven correct.

During the middle of the 20th century more evidence began to support the idea that the continents move. Huge mountain ranges were being mapped on the ocean floor. It had been assumed by both supporters and opponents of continental drift that the sea floor was ancient and covered with thick layers of sediment washed off the continents. When samples were finally obtained, they showed all of this was wrong: there was hardly any sediment, and the rocks were young, with the youngest rocks found near the ocean ridges. In 1960 the American geologist Harry Hess pieced this evidence together in his theory of sea-floor spreading. The ocean ridges were produced by molten rock rising from the asthenosphere. As it rose to the surface the magma cooled, forming the young rocks and spreading the ocean floor in a conveyor-belt-like motion driven by the slow process of convection. While some parts of the Earth were creating new oceanic crust and spreading the continents apart, other parts were doing just the opposite. Along the western edge of the Pacific, for example, thin oceanic crust was being forced beneath thicker crust and driven down into the mantle below. This explains the presence and high frequency of earthquakes and volcanoes in places like Japan.

By the beginning of 1967 the evidence for continental drift and sea-floor spreading was quite strong and ready to be assembled into a complete package. That year Dan McKenzie and his colleague Robert Parker published a paper in Nature introducing the term plate tectonics for the overall package of ideas describing how the Earth’s plates move and the geological features their movement creates. To this day many details are still being fleshed out, but this moment marked a revolution in the Earth sciences. It is relevant not only in geology but also in its synthesis with the evolution of life on Earth.

Continue reading more about the exciting history of science!

1876: The Telephone

The story of the invention of the telephone is a complex one, rife with rival claims over the invention itself, and the product of numerous discoveries and individual contributions. The Italian inventor Antonio Meucci appears to have invented an apparatus he called the telettrofono in the late 1840s. However, credit for the invention of the telephone is given to the Scottish-born inventor Alexander Graham Bell, who was awarded the first telephone patent in 1876.

From Telegraph to the Invention of the Telephone

The invention of the electric telegraph in the 1830s marked the beginning of telecommunications. Now, for the first time in history, messages could be sent over enormous distances in an instant. The telegraph quickly gained popularity, and attempts to modify and improve the device were fervent. Successive improvements to the telegraph ultimately led to the invention of the telephone.

Alexander Graham Bell’s Box Telephone, 1877-78.
(Credit: National Museums Scotland)

Both the telegraph and the telephone are wire-based electrical systems and thus similar in concept. Alexander Graham Bell was experimenting with how to send multiple messages along the same wire at one time, a technique known as multiplexing. This was the goal of several inventors at the time, including the American inventor Elisha Gray. In the early 1870s Gray was using a harmonic telegraph, which consisted of a transmitter and a receiver connected to sets of metallic reeds, to investigate speech transmission. A harmonic telegraph was known to be capable of sending multiple messages at the same time on a single wire by using varying frequencies. Bell also began his experiments in 1873 using a harmonic telegraph in his search for a multiplexing device, but he quickly hit on the idea of using it for speech transmission too.

By 1875 Bell had proved that different frequencies could vary the strength of the electrical signal in a wire. Now all he needed was a working transmitter that could send varying electric currents and a working receiver that could reproduce those frequencies in audible form. On March 7, 1876 Bell was awarded a patent for his telephone as an “apparatus for transmitting vocal or other sounds telegraphically”. It is considered one of the most valuable patents ever awarded by the U.S. Patent Office.

Bell’s telephone was composed of various elements that remained largely unchanged for many years. First, a power source such as a battery was needed to supply the current in the line. The transmitter converted the speaker’s voice into a varying electric current, and on the other end a receiver converted the current back into an audible voice. When not in use, the receiver hung on a hook fitted with a switch, known as a switch hook. The switch hook connects the telephone to the direct current through a loop: when the telephone is on the hook, contact with the loop is broken; pick up the telephone, and contact is restored, with current now flowing through the loop. A dialer and a ringer were also critical components. Lastly, an anti-sidetone circuit, an assemblage of transformers, resistors, and capacitors, was used to reduce various forms of noise and electrical feedback.
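
A minimal way to picture the switch-hook logic described above: the handset’s position either breaks or completes the DC loop, and current flows only when the loop is closed. The sketch below is a toy model, not a description of any real exchange; the 48-volt supply and 600-ohm loop resistance are assumed values typical of later common-battery systems.

```python
# Toy model of the switch-hook behavior described above: the hook either
# breaks or completes the DC loop to the exchange. Values are assumptions.
SUPPLY_VOLTS = 48.0   # common-battery supply (assumed)
LOOP_OHMS = 600.0     # combined line + telephone resistance (assumed)

def loop_current_amps(on_hook: bool) -> float:
    if on_hook:
        return 0.0                   # hook down: loop broken, no current
    return SUPPLY_VOLTS / LOOP_OHMS  # hook up: loop closed, Ohm's law

print(loop_current_amps(True))   # 0.0 A while the receiver hangs on the hook
print(loop_current_amps(False))  # 0.08 A once the handset is lifted
```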

The Telephone: A Revolution in Telecommunications

The telephone continued to evolve and improve over time. Early telephones needed to be connected directly to each other, and Bell quickly realized that every telephone needed to be reachable from every other telephone in service for the device to be practical.  By the late 1870s the first switchboard had been invented, solving the problem.  By the 1880s telephones were being assigned numbers to make operations easier, introducing the first telephone numbers.  From that point on, the innovations haven’t stopped. 

The 20th century saw enormous leaps in telephone technology. The dial tone was introduced in Germany in 1908, but it took until the 1920s for it to be adopted in the United States.  That same decade witnessed the first transatlantic call, made in 1927 from the United States to the United Kingdom and transmitted by radio waves. By the 1920s it is estimated that there were over ten million telephones in service in the United States. The 1960s saw the appearance of the first Touch-Tone telephones and the launch of the world’s first international communications satellite, Telstar.  Around this time the first mobile phones had also started to appear. Digital telephone technology combined with cellular phones drastically changed the telephone in the latter part of the 20th century. Today most people have moved on from their land lines to carrying around versatile smartphones.

Continue reading more about the exciting history of science!

1830s: The Electric Telegraph

For most of human history, information could travel only as fast as a person could run, a horse could ride, or a ship could sail.  This persisted until only very recently.  For instance, only around 250 years ago, during the American Revolution, it took months to communicate from the colonies to Britain across the Atlantic.  Today that same communication can happen in an instant.  The electric telegraph was the first major breakthrough of modern telecommunications.

Early Forms of Telegraphy

A Tower with a Semaphore Signal
(Credit: Wikimedia Commons)

A telegraph is a form of long-distance communication, as spelled out in the root meaning of its name: tele – at a distance, and graph – to write.  Some of the earliest forms of long-distance communication include light signals, drum beats, and smoke signals.  These measures were employed with varying but mostly minimal success. In ancient China, soldiers positioned in towers along the Great Wall used smoke signals to warn of impending attackers.  Native Americans were particularly well known for using smoke signals to communicate over long distances: tribes would first agree on a communication system, such as one puff for a greeting, two puffs for danger, and so on.  Despite some effectiveness, however, these communication systems were limited in their ability to convey complex messages and required certain weather conditions. 

When people think of a telegraph today, most are probably thinking of the electric telegraph.  However, the first true telegraph put into widespread use was the optical telegraph, invented by Claude Chappe in the late 1700s during the tumultuous time of the French Revolution.  With events happening at a frantic pace, the French government was in need of fast and reliable communication. Claude Chappe, along with his brother Ignace, began working on the problem in the summer of 1790.  The system they developed used semaphore signals displayed atop towers that corresponded to letters.  Relay towers were positioned approximately every 20 miles, and messages could be transmitted over distances of more than a hundred miles at a rate of approximately two words per minute.  

The Electric Telegraph Transforms Communications

The electric telegraph harnessed the new science of electricity to revolutionize long-distance communication.  Early experiments in the 18th century demonstrated the effects of electricity and showed that it could be transmitted across a wire almost instantly. Alessandro Volta then developed the battery, providing a steady source of current. The electric telegraph combined these new discoveries, making it the first valuable invention of applied electricity before the 1860s, the decade in which James Clerk Maxwell brought together all the laws of electromagnetism in his mathematical formulation.

To transmit a telegraph electrically, two or more stations must be connected by a wire. The first working electrical telegraph was made by a British inventor named Francis Ronalds in 1816. His invention was met with little fanfare by the British government. Over the next few decades several other inventors built improved versions of the electric telegraph with limited success. It was Samuel Morse in the 1830s and 1840s who finally brought the invention into the mainstream.  The telegraphic system Morse developed is still used sparingly today and is known to most as Morse code. Morse code encodes 26 letters and 10 numbers into a series of dots and dashes that can be used to communicate a written message electrically across a wire.
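
To give a flavor of how the encoding works, here is a minimal sketch of a Morse encoder covering just a handful of characters; the full International Morse table assigns a dot-dash pattern to every letter and digit:

```python
# Minimal sketch of Morse encoding: each letter maps to a short pattern of
# dots and dashes. Only a handful of characters are included here; the
# full table covers all 26 letters and 10 digits. Spaces and unknown
# characters are simply skipped for brevity.
MORSE = {
    "A": ".-",  "D": "-..", "G": "--.", "H": "....", "O": "---",
    "R": ".-.", "S": "...", "T": "-",   "U": "..-",  "W": ".--",
}

def encode(message: str) -> str:
    # Letters are separated by spaces in the output.
    return " ".join(MORSE[ch] for ch in message.upper() if ch in MORSE)

print(encode("what hath god wrought"))
```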

Morse Telegraph
(Credit: Wikimedia Commons)

Morse used his new system to send his first message across a two-mile wire in Morristown, New Jersey in 1838.  The United States Congress quickly realized the benefits of Morse’s system and provided the funds to set up a wire from Baltimore to Washington, D.C. in 1843.  On May 24, 1844 the famous first message, “What hath God wrought”, was transmitted across this 44-mile line. Within two years the telegraph had become such an integral part of communication in the United States that the Associated Press was formed.  The telegraph grew so rapidly that by 1875 there were over 250,000 miles of telegraph wires in the United States alone and over 100,000 miles of wire undersea, linking the world together as never before.

A Short-Lived Reign

The electric telegraph took only a few generations to complete its full product life cycle – an ominous sign of things to come for the telecommunications industry. The telephone, invented 40 years after the telegraph, can be said to mark the beginning of the end for the telegraph, and radio communications only accelerated its decline. Today telecommunication technologies can rise and fall in a single generation. Yet many of these technologies owe their scientific roots to the invention of the electric telegraph.

Continue reading more about the exciting history of science!

1947: The Transistor

The ability to transmit and acquire information through a means of communication may be our species’ greatest quality. The first step on this journey was the evolutionary development of spoken language. For that, we thank nature. The second influential step was the development of writing. While speaking and writing were effective, they could not mass-produce information. The invention of the printing press, possibly the most important achievement in the history of science, changed the landscape and ushered in a scientific revolution that changed the world. However, printing still worked at a speed that humans could comprehend. The next frontier was electronics, which worked at the speed of light. The invention of the transistor made electronics a practical reality and reaffirmed that good things do come in small packages.

A Transistor
(Credit: Amazon.com)

Electronics Before the Transistor

Electronic devices were in use well before the invention of the transistor. The problem with these devices was that they were powered by large structures called vacuum tubes. A vacuum tube looks like a medium-sized light bulb and consists of two electrodes, a cathode and an anode, placed at opposite ends of a closed glass tube. The cathode is heated, causing electrons to be released, which flow through the vacuum in the tube to the anode, creating a current. Eventually a third electrode called the control grid was added to these tubes to control the flow of electrons from the cathode to the anode.

The earliest computers were made from vacuum tubes. These computers were enormous, sometimes requiring a full room to house them. They generated a tremendous amount of heat and required a huge amount of energy to run. The first large-scale, general-purpose vacuum tube computer was the ENIAC – Electronic Numerical Integrator and Computer – designed and constructed for the United States Army and completed in 1945. By the end of its use in 1956 it consisted of over 20,000 tubes, weighed 27 tons, occupied 1,800 square feet, and consumed 150 kilowatts of electricity. Instead of 20,000 tubes, today’s much more powerful electronic devices such as smartphones contain billions of transistors. Imagine trying to pack a billion vacuum tubes inside your smartphone.

A Panel of Vacuum Tubes from the ENIAC Computer
(Credit: Wikimedia Commons)

The Invention of the Transistor

For electronics to really take off, a new device that was substantially smaller and consumed less power was needed. The transistor was invented by John Bardeen, Walter Brattain, and William Shockley of Bell Labs in 1947 as a replacement for the inefficient vacuum tube. The concept of the transistor is based on the field-effect transistor (FET) principle, which uses electric fields to control the flow of electricity. The principle was first patented by the physicist Julius Edgar Lilienfeld in 1925. Shockley was familiar with the FET concept and had been working on the idea for over a decade. Unable to produce a functioning transistor on his own, Bell Labs teamed Shockley with Brattain and Bardeen. Within two years they had constructed a working transistor, which they first demonstrated to the world on December 23, 1947.

The design of the transistor has been improved over time. The first transistors were constructed from germanium, a chemical element with properties similar to silicon. In 1965 Gordon Moore (the co-founder of Intel) published a paper describing the steady doubling of transistor counts, which he later revised into what is now known as Moore’s Law: the observation that the number of transistors on a circuit doubles roughly every two years. The observation has roughly held for decades.
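
Moore’s Law compounds quickly. Here is a back-of-the-envelope sketch assuming a clean doubling every two years from a 1971-style baseline (the Intel 4004 of that year had about 2,300 transistors); treat the curve as illustrative, since real progress has only roughly followed it:

```python
# Back-of-the-envelope Moore's Law: transistor count doubling every two
# years. The baseline is illustrative (the Intel 4004 of 1971 had ~2,300
# transistors); actual chips have only roughly tracked this curve.
def transistors(year, base_year=1971, base_count=2300):
    return base_count * 2 ** ((year - base_year) / 2)

for y in (1971, 1991, 2011):
    print(y, f"{transistors(y):,.0f}")
# 1971 -> 2,300; 1991 -> ~2.4 million; 2011 -> ~2.4 billion
```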

The Transistor’s Impact on Technology

Transistors and vacuum tubes perform the same function in that they amplify and control the flow of electrical signals in a circuit. The difference lies in their size, heat generation, and power requirements. A transistor generates very little heat and requires very little power. Importantly, it can be manufactured on a miniature scale. Transistor technology has been shrunk to the nanometer level, with today’s transistors measuring between ten and twenty nanometers in length. That tiny size allows millions of them to be deployed in microchip technology. A microchip, also called an integrated circuit, is produced from a silicon wafer and can contain millions of transistors in an area the size of a grain of rice. The compactness of these microchips makes it possible to pack immense processing power into a tiny space, such as in today’s smartphones. These chips are deployed in all of our modern electronic devices: televisions, radios, personal computers, and numerous household and industrial appliances all incorporate microchips whose function depends on the transistor. The transistor has changed the world beyond recognition in its nearly 80-year history, and it staggers the imagination to think what the next 80 years hold.

Continue reading more about the exciting history of science!