1763: Bayes’ Theorem

Bayes’ theorem is a fundamental concept in probability theory. It was developed by the English statistician and Presbyterian minister Thomas Bayes and presented to the Royal Society in 1763, two years after his death.

History of Bayes’ Theorem

Thomas Bayes was born in London around 1701 and studied at the University of Edinburgh, where he was exposed to some of the leading mathematicians and philosophers of his day. He was elected a Fellow of the Royal Society, where he may have served as a mediator of intellectual debates. He later returned to London to become a minister, but he continued to pursue an interest in mathematics, specifically in probability theory.

Heading for Bayes’ Doctrine of Chances (1764)

Bayes wrote his theorem in order to address the question of how to revise beliefs in the light of new evidence. More interestingly, it appears that he likely wrote it, at least in part, as a mathematical means to defend Christianity and to combat an argument by David Hume in his 1748 essay Of Miracles, from his book An Enquiry Concerning Human Understanding. In this essay, Hume made the case for dismissing miracles, such as the resurrection of Christ, on the grounds of probability. In effect he argued that a miracle (a violation of the laws of nature) is always more improbable than the possibility that the reports of the miracle are inaccurate. While there is no absolute or direct evidence that Bayes’ sole motivation in composing his work was to refute Hume’s essay, there is good circumstantial evidence that this was at least part of his motivation, given the details surrounding the events of his later life and the eventual publication of his work.

Whatever the real motivation for his work may have been, Bayes’ essay was published two years after his death, when his friend Richard Price brought it to the attention of the Royal Society, where it was read on December 23, 1763. It was published the following year, both in the Philosophical Transactions, the journal of the Royal Society, and as a separate offprint. The now famous essay was titled An Essay towards solving a Problem in the Doctrine of Chances. It should be noted that in 1767 Price published a book titled Four Dissertations, in which he explicitly took on the work of Hume and challenged his probabilistic arguments in Of Miracles. He used Bayes’ results in an attempt to show that Hume failed to recognize the significance of multiple independent witnesses to a miracle, and that the accumulation of even imperfect evidence could overcome the statistical improbability of an event.

As sometimes happens in the history of science, the theorem was initially largely forgotten, until it was independently rediscovered by the brilliant French mathematician Pierre-Simon Laplace in 1774. The theorem describes the conditional probability of an event: conditional probability tells us the probability of a hypothesis given that some event has happened.

The Goal of Getting Closer and Closer to the Truth

Bayes’ Theorem involves beginning with an educated guess, called a prior probability, and then revising your prediction when new evidence comes in. As the new evidence is considered, the probability of the event is updated, giving you the posterior probability. Bayes’ Theorem provides a useful way of thinking by approximation, getting closer and closer to the truth as we accumulate new and relevant evidence. This is an important point because we are nearly always working with incomplete information.

Formula of Bayes’ Theorem
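
In modern notation the theorem reads P(H|E) = P(E|H) × P(H) / P(E): the posterior probability of a hypothesis H given evidence E equals the likelihood of the evidence under the hypothesis, multiplied by the prior probability of the hypothesis and divided by the overall probability of the evidence.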

A Bayesian way of thinking requires us to constantly update our probabilities as new evidence becomes available. This revision does not happen just once but can happen continually. We may never know the truth with 100% certainty; for example, we can never be 100% certain the sun will rise tomorrow. But with Bayesian thinking we can be 99.999999% sure, which tells us we’re getting very close to the truth and gives us a high degree of confidence in the proposition. Bayes’ theorem helped to revolutionize probability theory by introducing the idea of conditional probability – probability conditioned by evidence. If you have an extraordinary hypothesis, it should require extraordinary evidence to convince you that it’s true.
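
To make the updating process concrete, here is a minimal sketch in Python (the numbers are hypothetical, chosen only to illustrate the mechanics of a single update and of feeding the posterior back in as the next prior):

```python
def bayes_update(prior, likelihood, false_positive_rate):
    """Posterior P(H | E) from the prior P(H), the likelihood P(E | H),
    and the false-positive rate P(E | not H)."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical example: a condition with 1% prevalence, tested with a
# method that is 95% sensitive and has a 5% false-positive rate.
posterior = bayes_update(prior=0.01, likelihood=0.95, false_positive_rate=0.05)
print(f"after one positive result: {posterior:.3f}")   # ~0.161

# The revision is continual: the posterior becomes the prior for the
# next, independent piece of evidence.
posterior = bayes_update(prior=posterior, likelihood=0.95, false_positive_rate=0.05)
print(f"after a second positive:   {posterior:.3f}")   # ~0.785
```

Notice that a single positive result only raises the probability to about 16%, while a second independent result pushes it near 80% – the “getting closer and closer to the truth” dynamic described above.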

Practical Uses of Bayes’ Theorem

Bayes’ Theorem has relevance in almost any avenue of life because it is a form of probabilistic thinking. If you think about it, everything you do and everything that happens to you in life is probabilistic in nature. The theorem’s flexibility and versatility make it possible to reach both life and business decisions under conditions of uncertainty. Here are a few examples of Bayesian theory used in the real world. In biology it is used for medical diagnosis, genetics, and modeling the spread of infectious diseases. In computer science it is used in speech recognition, search engine algorithms, spam filtering, and weather forecasting. Its practical applications are almost limitless. Ultimately, it is a learning process, with more observations and evidence leading to greater certainty. Let’s take a look at one interesting application of Bayes’ theorem in a real-world setting.

The theorem was used to crack the Nazi Enigma code during WWII. Enigma was an encryption machine that the Germans used to send secure messages, and much of its effectiveness came from the fact that its cipher settings were changed daily. Alan Turing, the brilliant British mathematician, used Bayesian reasoning to narrow down an almost infinite number of possible decryptions by focusing on the messages most likely to be transmitted. For example, messages from German U-boats were most likely to contain information about the weather or Allied shipping. These strong priors greatly reduced the number of possible decryptions to check and sped up the time to crack the code. Eventually he and his staff invented a machine known as the Bombe, which ultimately cracked the German Enigma code. The use of Bayes’ theorem in cracking the Enigma code was a monumental breakthrough for the Allies, as it provided them access to critical information about German military operations. It provided a significant strategic advantage in the war effort and played a key role in their eventual victory.

Bayes’ theorem continues to impact statistics and society to this day. In recognition of Bayes’ contribution to the development of probability theory, the journal Bayesian Analysis was established in 2006 as a peer-reviewed academic journal dedicated to Bayesian statistics. Additionally, the Thomas Bayes Award is awarded every two years by the Royal Statistical Society to recognize outstanding contributions to the field of Bayesian statistics. The continuing relevance of Bayes’ theorem is a testament to the enduring legacy of Thomas Bayes and his contribution to the field of probability theory.

Continue reading more about the exciting history of science!

1905: Special Relativity

Special Relativity is a theory proposed by Albert Einstein that explains the relationship between space, time, and motion. Einstein outlined his proposal in a scientific paper titled “On the Electrodynamics of Moving Bodies,” published on September 26, 1905.

Approaching the Limits of Classical Physics

Einstein’s famous E=mc^2 equation shows the equivalence of mass and energy

Isaac Newton’s laws of motion were a wildly successful explanation of the physics of motion and a dominant force in the scientific understanding of physics for over two centuries. However, by the turn of the 20th century some critical shortcomings in the explanatory power of Newton’s famous laws were becoming evident. One of the major phenomena classical physics couldn’t explain was the behavior of light. Classical physics assumes that space and time are absolute and that the speed of light depends on the motion of its source: the source’s speed is simply added to or subtracted from the speed of the light it emits. However, experimental evidence, particularly from the Michelson-Morley experiment of 1887, which demonstrated that the speed of light in a vacuum is independent of the motion of the Earth about the Sun, suggested this is not the case. Their experiments suggested that the speed of light is constant for all observers, regardless of their own speed.

Working out of a Swiss patent office, the 26-year-old Einstein solved these problems with his Theory of Special Relativity and later his Theory of General Relativity. Special relativity describes motion at a constant speed in a straight line (a special case of motion, hence the name), while general relativity describes accelerating motion. Interestingly, his famous equation E=mc^2 did not actually appear in the original paper but was added essentially as an addendum in a short follow-up paper a few months later.

Einstein’s Theory of Special Relativity is based on two principles:

  1. The laws of physics are the same in all inertial frames of reference
  2. The speed of light is the same in all inertial frames of reference (186,000 miles/second)

This implies that an observer at rest observes light traveling at 186,000 miles/second, while another observer traveling at 180,000 miles/second relative to the observer at rest also observes light traveling at 186,000 miles/second. Based on these principles several interesting conclusions can be worked out; a short numerical sketch after the list below makes them concrete. These conclusions are, of course, relative to an observer at “rest.”

  1. Time slows as speed increases
  2. Mass increases as speed increases
  3. Mass and energy are equivalent, as exemplified by Einstein’s famous E=mc^2 equation
  4. The length of an object contracts as speed increases
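
All of these effects fall out of a single quantity, the Lorentz factor γ = 1/√(1 − v²/c²). The following Python sketch (illustrative only, with a hypothetical observer speed) computes that factor and the relativistic velocity addition that keeps the speed of light the same for every observer:

```python
import math

C = 299_792_458.0  # speed of light in a vacuum, m/s

def lorentz_factor(v: float) -> float:
    """gamma = 1 / sqrt(1 - v^2/c^2); grows without bound as v approaches c."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def add_velocities(u: float, v: float) -> float:
    """Relativistic velocity addition: (u + v) / (1 + u*v/c^2)."""
    return (u + v) / (1.0 + u * v / C**2)

v = 0.9 * C  # a hypothetical observer moving at 90% of light speed
gamma = lorentz_factor(v)
print(f"time dilation factor: {gamma:.3f}")     # ~2.294: moving clocks run slow
print(f"length contraction:   {1/gamma:.3f}")   # ~0.436 of the rest length
print(f"rest energy of 1 kg:  {C**2:.3e} J")    # E = mc^2, ~9.0e16 joules

# Light emitted by the moving observer still travels at c for everyone:
print(f"{add_velocities(v, C) / C:.6f} * c")    # 1.000000
```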

Testing the Theory and Acceptance

As unlikely as these conclusions seem, they have been verified repeatedly through many experiments. The conclusions seem counter-intuitive because many of these effects are only perceptible at near-light speeds. Perhaps this is why the acceptance of special relativity was not immediate and why it was met with harsh criticism by many scientists. The idea that measurements of time and length could change with velocity conflicted with everyday experience. However, the aesthetic beauty of its mathematics initially attracted interest among some scientists, prompting further inquiry.

The tide of acceptance began to turn as the predictive power of relativity was confirmed by experimentation. The most famous example was the 1919 solar eclipse experiment, led by Sir Arthur Eddington, in which light was shown to bend around the mass of the Sun. This experiment was really a test of general relativity; however, the success of general relativity helped to bolster the acceptance of its counterpart, special relativity.

Experimental confirmation also came from the field of atomic physics. Direct experimental verification of E=mc^2 came through nuclear reactions in 1932, the same year James Chadwick discovered the neutron. Subsequent experiments in atomic physics involving nuclear fission and fusion continued to confirm the validity of E=mc^2.

Another famous experiment that validated special relativity was a time dilation experiment conducted in 1971 using high-precision atomic clocks flown on commercial airliners. By comparing the time measured by the atomic clocks on the airplanes to those on the ground, experimenters confirmed that the clocks on the airplanes ran slightly slower than the stationary clocks on the ground, in accordance with relativity. Further experiments using particle accelerators and high-energy physics have provided validation by demonstrating the increase in mass and the time dilation of particles moving at relativistic speeds.

By the beginning of the 21st century special relativity had been fully accepted and integrated into the fabric of modern physics. It plays a crucial role in many technologies, including GPS, particle accelerators, and nuclear energy.

Continue reading more about the exciting history of science.

Reasons to be an Atheist #1: Science Works

The reasons to be an atheist today are numerous and varied. This article is the first in a series outlining the many reasons to be an atheist. The first reason acknowledges that we have a better system for explaining the world than a religious one. The world is big, complex, and many times confusing. This is one of the reasons people turn to religion: they feel that religion will provide answers to some of life’s biggest questions. However, there is a more accurate system of thought out there that provides these answers and, unlike religion, it actually works. It’s called science.

Science vs Faith
Quote by Tim Minchin

Civilizations have invented thousands of religions over the centuries in an attempt to explain the world around them. After centuries of debate the results of religion are inconclusive on most topics, and it has been a spectacular failure on the rest. To take one popular example: in the early 17th century the Catholic Church forbade Galileo Galilei from teaching the Copernican view of the Solar System, which placed the Sun at the center of the system with the Earth and the other planets orbiting around it. According to the church at the time, this view was in conflict with the teachings of the Bible. Clearly, the Bible got this basic fact wrong. On the other hand, there is overwhelming evidence that science works very well as a system of thought for explaining the world around us. By adopting a scientific worldview we have no reason to be religious.

Science works because of its method and its commitment to the principles of scientific objectivity. Religion relies on blind faith in revealed prophecy. These two systems of thought for attempting to understand the world could not be more opposed. Let’s take a look at why science works by examining the five steps of the scientific method and the five principles of scientific objectivity.

The Scientific Method

The scientific method consists of five steps.

  1. Making an observation – this involves observing some phenomenon that requires an explanation
  2. Asking a question – the purpose here is to identify a specific problem and to narrow the focus of the inquiry
  3. Formulating a hypothesis – the hypothesis is an educated guess that can be tested
  4. Testing the prediction in an experiment – this is the investigation to see if the real world behaves as the hypothesis predicts
  5. Analyzing the results – here is where the conclusion is drawn on whether the evidence supports or refutes the hypothesis

The Scientific Method as an Ongoing Process
(Credit: Wikimedia Commons)

Scientific Objectivity

The scientific method works because it is objective. While it is true that scientists are people, and people’s perceptions of the world are subjective, that doesn’t mean that science hasn’t figured out a way around that problem to become objective in practice. Scientific objectivity also consists of five principles.

  1. Observability – for something to be scientifically objective it must be observable. This includes things that our senses cannot observe directly, such as infrared radiation, whose effects we can observe through equipment.
  2. Universality – for something to be scientifically objective it must consider and account for all relevant data.
  3. Self-consistency – all of the observable data must fit into a self-consistent pattern to produce accurate results.
  4. Reproducibility – the data must be reproducible by other people.
  5. Debatability – the results must be debatable. This is the error-correcting process in science, since individual people form strong emotional attachments to their ideas or sometimes make mistakes.

The Difference Between Science and Religion in Explaining How the World Works

The scientific method, along with its adherence to scientific objectivity, provides the strongest tools we have for answering questions about the world around us. The universe works how it works, and it’s up to us to discover how. That’s what science does: it discovers how the world works through observation and experimentation. Inventing superstitious religious stories and institutionalizing them over the generations doesn’t prove they must be correct, especially when observation and experimentation say otherwise. There is an enormous wealth of observable evidence that fits into a self-consistent pattern that we call the theory of evolution by natural selection, which explains how humans, and all other species, evolved on this planet. There is absolutely no observable evidence for the creation myth of Adam and Eve in the book of Genesis.

As the body of scientific knowledge has grown over time it has replaced religious explanations with scientific explanations. Religion still thinks it can provide answers to questions that science can’t yet answer, but that doesn’t mean that science won’t ever answer them. The remarkable progress of scientific knowledge over the past few centuries is one of the strongest reasons to abandon your religion and become an atheist.

Further Reading: The Demon-Haunted World: Science as a Candle in the Dark by Carl Sagan; The Magic of Reality by Richard Dawkins; The God Delusion by Richard Dawkins; A Devil’s Chaplain by Richard Dawkins; Letter to a Christian Nation by Sam Harris; God and the Folly of Faith: The Incompatibility of Science and Religion by Victor Stenger

1977: Voyager Program

The Voyager Program represents an ambitious undertaking in exploring the boundaries of the Solar System and beyond. The program consists of two spacecraft, Voyager 1 and Voyager 2, launched by NASA in 1977 to probe the four outer planets of the Solar System as well as their moons, rings, and magnetospheres.

Background and Objectives of the Voyager Space Program

The primary objective of the Voyager space program was to complete a comprehensive study of the outer reaches of our Solar System. The mission was made possible by a bit of good luck: a rare geometric alignment of the four outer planets, occurring roughly once every 175 years, allowed a single mission – known as the Grand Tour – to fly by all four planets with relative ease.

Originally, the four-planet mission was deemed too expensive and difficult, and the program was only funded to conduct studies of Jupiter and Saturn and their moons. It was known, however, that a fly-by of all four planets was possible. In preparation for the mission, over 10,000 trajectories were studied before two were selected that allowed for close fly-bys of Jupiter and Saturn. The flight trajectory for Voyager 2 also allowed for the option to continue on to Uranus and Neptune.

The Different Instruments of the Voyager Spacecraft
(Credit: Nasa.gov)

The two Voyager spacecraft are identical, each equipped with several instruments used to conduct a variety of experiments. These include television cameras, infrared and ultraviolet sensors, magnetometers, and plasma detectors, among other instruments.

In addition to all of its instruments, each Voyager spacecraft carried an additional interesting item called the Golden Record. The Golden Record is a 12-inch gold-plated copper disk designed to be playable on a standard phonograph turntable. It was designed as a kind of time capsule, intended to communicate the story of humanity to any extraterrestrial civilization that might come across it. The Golden Record contains a variety of sounds and images intended to portray the diversity of culture on Earth. This includes:

  • greetings in 55 languages, including both common and lesser-known languages
  • a collection of music from different cultures and eras, including Bach, Beethoven, Peruvian panpipes and drums, Australian Aboriginal songs, and more
  • a variety of natural sounds such as birds, wind, thunder, and water waves, and human-made sounds such as laughter, a baby’s cry, and more
  • various images such as human anatomy and DNA, plants and animals, landscapes, the Solar System with its planets, and more
  • a “Sounds of Earth” interstellar message, featuring a message from President Jimmy Carter and a spoken introduction by Carl Sagan

The Golden Record from the Voyager Space Mission

A committee chaired by the astronomer Carl Sagan was responsible for selecting the content put on the record. The value of the Golden Record is best expressed in Sagan’s own words:

“Billions of years from now our sun, then a distended red giant star, will have reduced Earth to a charred cinder. But the Voyager record will still be largely intact, in some other remote region of the Milky Way galaxy, preserving a murmur of an ancient civilization that once flourished — perhaps before moving on to greater deeds and other worlds — on the distant planet Earth.”

Carl Sagan

The Launch, Voyage and Discoveries

Voyager 2 was launched on August 20, 1977 from the NASA Kennedy Space Center at Cape Canaveral, Florida, sixteen days earlier than Voyager 1. The year 1977 provided a rare opportunity in which Jupiter, Saturn, Uranus, and Neptune were all in alignment, allowing Voyager 2 to fly by each of the four planets. Voyager 1 was on a slightly different trajectory and only flew by Jupiter, Saturn, and Saturn’s largest moon, Titan.

Voyager Space Probe

Voyager 2’s fly-bys of Jupiter and Saturn produced some important discoveries. They provided detailed, close-up images of both planets and their moons. While much was learned about each planet, it’s useful to note one important discovery at each. On its fly-by of Jupiter, Voyager 2 revealed information about the giant red spot, such as its size and structure (a complex storm with a diameter greater than the Earth’s!), its dynamics, and its interaction with the surrounding atmosphere. On its fly-by of Saturn, Voyager 2 revealed information about the rings, such as their structure (close-up images revealed the rings are made up of countless individual particles), their dynamics, and various other features.

After the success of the Jupiter and Saturn fly-bys, NASA increased funding for Voyager 2 to fly by Uranus and finally Neptune. Currently both spacecraft are leaving the Solar System as they continue to transmit data back to Earth.

Continue reading more about the exciting history of science!

Richard Dawkins

Richard Dawkins (1941 – present) is an evolutionary biologist known for emphasizing the role of the gene in biological evolution.  His 1976 book The Selfish Gene popularized this gene-centered view of evolution.

Richard Dawkins

Richard Dawkins was born in Kenya to parents very much interested in the natural sciences, who cultivated his curiosity about the natural world. At the age of eight his family moved to England, where he attended public school before enrolling at Balliol College, Oxford – a constituent college of Oxford University. He had a traditional Anglican upbringing, but by this point in his life he had abandoned Christianity in favor of a scientific worldview.

After a brief teaching job at UC Berkeley in the USA, Dawkins returned to the UK in 1970 to become a lecturer at the University of Oxford. In 1976 he published The Selfish Gene, a wildly popular book that was effective both at explaining the gene-centered view of evolution to laypeople and at persuading other scientists of the validity of the idea. Not since Charles Darwin’s revolutionary book On the Origin of Species had a work of scientific literature been so successful at achieving both of those important ends. Interestingly, Dawkins coined the now famous term meme in this book, a term that has become an icon of internet culture.

Dawkins has gone on to author numerous other best-selling science books while working as a professor at the University of Oxford. Since 1970 he has been a fellow of New College, Oxford. In 1995 he was appointed to the Simonyi Professorship for the Public Understanding of Science, a position endowed by Charles Simonyi with the instruction that Richard Dawkins be the first to hold it. In 2006 he founded the Richard Dawkins Foundation for Reason and Science – a nonprofit science education organization.

More recently, Dawkins has become an outspoken critic of religion, especially of creationism. In 2006 he published his most popular book, The God Delusion, in which he debunks the notion of an intervening God in our universe and masterfully illustrates religious faith as a delusion. The book has proven provocative to some, yet it has sold over 1 million copies and become an international best seller. In public he has been a leading figure in promoting the virtues of atheism while lambasting the vices of religion. As of May 2019, Dawkins is authoring another book titled Outgrowing God: A Beginner’s Guide to Atheism, expected out in September 2019.

1953: The Structure of DNA

In April 1953, Francis Crick and James Watson published a paper in Nature which established the structure of DNA. Their paper described one of the most significant discoveries in all of biology, giving rise to the field of molecular biology while answering the question of how life reproduces itself.

The Hunt for the Hereditary Material

DNA double helix and its nitrogenous bases

In the middle of the 19th century, the Austrian monk Gregor Mendel conducted a series of experiments with pea plants in which he established some fundamental principles of heredity. His pioneering work laid the groundwork for the field of genetics. However, the physical basis for the hereditary material of life remained a mystery for the time being.

In the early 20th century, the search for the hereditary material intensified. The American geneticist Thomas Hunt Morgan conducted important research at Columbia University with the fruit fly Drosophila that advanced our understanding of the role of chromosomes in heredity. Chromosomes had been discovered in 1888, twelve years before the rediscovery of Mendel’s work on pea plants, and it was suspected that they were involved in the passing of hereditary traits. It was observed that patterns of trait transmission corresponded with the movement of chromosomes, an idea that became known as the Chromosomal Theory of Inheritance. But in the early 20th century it wasn’t clear how to prove this.

Morgan and his team conducted a breeding and crossbreeding program involving millions of fruit flies, and through this program they were able to work out correlations between particular traits and individual chromosomes, showing that chromosomes were involved in the passing of genetic material. More specifically, Morgan established that one of the mutations he found in his fruit flies, a change in eye color, appeared only in males. This could be explained if the responsible gene resided on a sex chromosome, which Morgan was also able to establish.

Chromosomes are made up mostly of proteins, with a bit of nucleic acid. The search was on to discover the molecule that transmitted the hereditary material, and the early assumption was that it was a protein. By the 1940s various biological techniques had been devised to break down the different biological materials – proteins, lipids, carbohydrates, and nucleic acids – and only when nucleic acids were broken down was the hereditary material not transmitted. However, there was a bias towards assuming that proteins were the hereditary material, and more evidence was needed to convince the scientific community.

This evidence came from the experiments of Alfred Hershey and Martha Chase in the early 1950s. They designed an elegant set of experiments using radioactive tracers in viruses. This was accomplished by growing viruses in solutions with different radioactive isotopes: one with radioactive sulfur (a main component of proteins but not found in DNA) to label the protein, and one with radioactive phosphorus (a main component of DNA but not found in proteins) to label the DNA. They allowed the labeled viruses to infect a culture of E. coli bacteria and initiate reproduction, and were then able to trace which material was responsible for hereditary transmission by a process called centrifugation. The results of this experiment produced strong and convincing evidence that DNA, and not protein, carried the hereditary material.

This laid the groundwork for the quest to understand deoxyribonucleic acid, or DNA, which had now been shown to be the physical material responsible for carrying hereditary information.

Unraveling the Structure of DNA

By this point in the middle of the 20th century, most biologists realized that a detailed knowledge of the three-dimensional structure of DNA was critical to understanding how the molecule works. This is the classic biological principle that form follows function. Many people were racing towards this discovery, and indeed Crick and Watson’s work depended on the work of others, most notably Rosalind Franklin and her X-ray crystallography. Franklin took a purified DNA crystal and produced the clearest X-ray crystal image to date. On a visit to London, James Watson was able to view this image and, working with Crick, deduce some important structural elements from it. First, they learned that the molecule had to form some type of helix. Second, they were able to figure out the width of this helix, which was about two nanometers – roughly the width of two nucleotides side by side.

Watson and Crick with their model of DNA

There was one other piece of information important to determining the structure of DNA. It had been observed through experiments that there were consistent ratios between the nitrogenous bases of DNA: the amount of adenine always equaled the amount of thymine, and the amount of guanine always equaled the amount of cytosine.

Working with this information, Watson and Crick began building literal models of DNA strands. Through trial and error, they eventually built a model that fit all of the data: the famous double helix.

The double helix model that Crick and Watson established contained a few major features. The outside consists of two sugar-phosphate backbones, with the nitrogenous bases on the inside; hydrogen bonds between paired bases hold the two strands together. There are four types of bases – adenine (A), cytosine (C), guanine (G), and thymine (T) – with pairing always occurring between A & T and between C & G. The copying mechanism works by the helix “unzipping” into two separate strands, each of which acts as a template for a new molecule thanks to the pairing rules.
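
The pairing rules are simple enough to express in a few lines of code. Here is a minimal illustrative sketch (not from the original paper, and ignoring strand directionality) of how complementary pairing makes faithful copying possible:

```python
# Watson-Crick pairing rules: A pairs with T, C pairs with G.
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def templated_strand(strand: str) -> str:
    """Return the complementary strand that the input would template."""
    return "".join(PAIR[base] for base in strand)

original = "ATGCCGTA"  # a made-up sequence for illustration
copy = templated_strand(original)
print(copy)  # TACGGCAT

# Copying is faithful: using the copy as a template regenerates the original.
assert templated_strand(copy) == original
```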

Continue reading more about the exciting history of science!

1900 – 1920s: Quantum Mechanics


Quantum mechanics is a paradigm-shifting theory of physics that describes nature at the smallest scales. Understanding quantum mechanics requires a journey into the realm of the infinitesimally tiny, where the rules that govern our everyday reality no longer apply. The theory was developed gradually in the early part of the 20th century.

The Birth of Quantum Mechanics

The field of quantum mechanics began emerging very early in the 20th century as a revolutionary framework for understanding the behavior of particles at the atomic and subatomic levels. The theory was formed from the observations and experiments of a handful of scientists of that period. As the 19th century was coming to a close, classical physics was reaching its limits: new phenomena were being observed that it couldn’t explain. Quantum mechanics first entered mainstream scientific thought in 1900, when Max Planck used quantized properties in his attempt to solve the black-body radiation problem. Planck introduced the concept of quantization, proposing that energy is emitted or absorbed in discrete units called quanta. It was initially regarded as a mathematical trick but later proved to be a fundamental aspect of nature.

Five years later Albert Einstein offered a quantum-based theory to describe the photoelectric effect, work that earned Einstein the Nobel Prize in Physics in 1921. The next major leap came from Niels Bohr in 1913. One of the problems that puzzled physicists of the day was that, according to the electrodynamic theory then current, orbiting electrons should radiate away their energy almost immediately and crash into the nucleus. Bohr’s solution was to propose a model of the atom in which electrons orbited the nucleus in definite energy levels, or ‘shells’. In this new theory, electrons would move instantaneously between one orbit and another without traveling through the space in between, an idea that became known as a quantum leap. Bohr published his work in a paper called On the Constitution of Atoms and Molecules, and for this unique insight he won the Nobel Prize in Physics in 1922.
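
Bohr’s quantized shells can be illustrated numerically. Here is a small Python sketch using the standard Bohr-model formula E_n = -13.6 eV / n^2 for hydrogen (the particular transition shown is just an example):

```python
RYDBERG_EV = 13.6  # hydrogen ground-state binding energy, in electron-volts

def shell_energy(n: int) -> float:
    """Bohr-model energy of the nth shell of hydrogen, in eV."""
    return -RYDBERG_EV / n**2

def quantum_leap(n_from: int, n_to: int) -> float:
    """Energy of the photon emitted when an electron jumps n_from -> n_to."""
    return shell_energy(n_from) - shell_energy(n_to)

# Because only discrete shells exist, only discrete photon energies appear.
# The n=3 -> n=2 jump produces hydrogen's red H-alpha spectral line (~1.9 eV).
print(f"{quantum_leap(3, 2):.2f} eV")
```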

With these discoveries and others, quantum mechanics became a revolutionary field in physics. It also became one of the strangest fields in science to study and attempt to understand, because things on the subatomic level don’t behave like anything we are used to in our everyday experience. Because of this strangeness some physicists never liked quantum mechanics very much, including Albert Einstein. Despite its strangeness and messiness, quantum mechanics is known for the extremely high accuracy of its predictions. In the decades that followed, quantum mechanics was combined with special relativity to form quantum field theory, quantum electrodynamics, and the Standard Model.

Foundational Principles of Quantum Mechanics

The main principle of quantum mechanics is that energy is emitted in discrete packets, called quanta.  This differs from classical physics where all values were thought possible and the flow was continuous.  Other attributes of quantum mechanics include:

  • Uncertainty Principle: this states that certain pairs of physical properties, such as position and momentum, cannot be simultaneously known with absolute precision. It was first formulated by Werner Heisenberg in 1927 and arises from the wave-like nature of particles at the quantum level (see the numerical sketch below the figure).
  • Wave-Particle Duality: this describes the dual wave-like and particle-like behavior exhibited by objects at the quantum level. Electrons, for instance, can exhibit both particle-like behavior (localized position and momentum) and wave-like behavior (interference and diffraction) under different experimental conditions.
  • Quantum Entanglement: this is the curious phenomenon in quantum mechanics that occurs when a pair or group of particles becomes correlated in such a way that the quantum state of one particle is directly tied to that of the others. Measuring one particle instantly determines the corresponding property of the other, regardless of the distance between them. This instantaneous correlation has been confirmed in numerous experiments with photons, electrons, and even individual molecules.

Double slit experiment showing both the wave and particle behavior of light
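
To get a concrete feel for the uncertainty principle mentioned above, here is a minimal Python sketch (the confinement length is a hypothetical, atom-sized example):

```python
HBAR = 1.054571817e-34     # reduced Planck constant, J*s
ELECTRON_MASS = 9.109e-31  # kg

def min_momentum_spread(delta_x: float) -> float:
    """Heisenberg bound: delta_p >= hbar / (2 * delta_x)."""
    return HBAR / (2.0 * delta_x)

# Confining an electron to an atom-sized region (~0.1 nm) forces a large
# spread in its velocity: certainty in position costs certainty in motion.
delta_p = min_momentum_spread(1e-10)
print(f"minimum velocity spread: {delta_p / ELECTRON_MASS:.2e} m/s")  # ~5.8e5
```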

Together, these four principles provide solutions for physical problems that classical physics cannot account for. Quantum mechanics therefore provides a much more comprehensive framework than classical physics for understanding the behavior of matter and energy at the smallest scales. Its significance goes further than that, however: it has transformed our understanding of matter and energy and upended our notions of predictability and determinism, with interesting philosophical implications. It is one of those areas of science where we are reminded of the power of human curiosity, ingenuity, and perseverance in unraveling the mysteries of the universe.

Continue reading more about the exciting history of science!

Max Planck

Max Planck

Max Planck (1858 – 1947) was a German physicist whose revolutionary research laid the foundation of quantum mechanics and quantum theory. Quantum theory was an entirely new type of physics that replaced classical physics on the atomic scale.

Max Planck was born in Kiel, Germany in 1858. In his youth his family moved to Munich, where he enrolled in the Maximilian Gymnasium school and was introduced to mathematics, astronomy, and mechanics. He graduated at the age of 17 and decided to pursue a career in physics. He promptly enrolled in the University of Munich, but after a year he transferred to Friedrich Wilhelms University, where he attended lectures by two of Germany’s most eminent physicists – Hermann von Helmholtz and Gustav Kirchhoff. He was intrigued by thermodynamics and began to read the papers of Rudolf Clausius, one of the pioneers of the field. Planck graduated and began teaching physics, first at the University of Munich, then at the University of Kiel, and finally at Friedrich Wilhelms University in Berlin, where he became a full professor in 1892. It was in this position that he conducted some of his most important research.

In 1894 Planck turned his attention to the problem of black-body radiation. In 1859 Kirchhoff had identified a black-body as a perfect absorber and emitter of radiation at all wavelengths. Physicists could create a black-body curve by displaying how much radiant energy is emitted at different frequencies for a given temperature of the black-body. Classical theory had difficulty matching its predictions to these observations. In order to solve the problem, Planck made a radical proposal: energy could only be emitted in certain discrete amounts called quanta, whereas classical theory allowed all possible values of energy. From this assumption he was able to derive a formula that accurately predicted the energy radiated by a black body, built on the relation E = hν, where h is Planck’s constant and ν is the frequency of the radiation. This was the beginning of quantum mechanics, and for this work he won the 1918 Nobel Prize in Physics.
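
To get a feel for the scale of these quanta, here is a small illustrative computation in Python (the frequency is a hypothetical example, roughly that of green light):

```python
H = 6.62607015e-34  # Planck's constant, J*s

def quantum_energy(frequency_hz: float) -> float:
    """Energy of a single quantum of radiation, E = h * nu."""
    return H * frequency_hz

e = quantum_energy(5.6e14)  # green light, ~5.6e14 Hz
print(f"one quantum: {e:.2e} J")       # ~3.7e-19 joules
print(f"quanta per joule: {1/e:.2e}")  # ~2.7e18
```

The individual packets are so tiny that energy appears perfectly continuous at everyday scales, which is part of why the quantization went unnoticed for so long.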

Unlike many German scientists of his day, Planck stayed in Germany his entire life and lived through World War II. His home in Berlin was destroyed by an Allied air raid, in which he lost all of his scientific papers. He died in 1947, shortly after the war ended. The following year the Kaiser Wilhelm Society was renamed the Max Planck Society in his honor.

1938: Nuclear Fission

The discovery of nuclear fission, a process that releases an enormous amount of energy by splitting the nucleus of an atom, was an explosive moment in the history of science and technology. This incredible discovery directly led to the development of nuclear weapons and nuclear energy production, both of which would prove world-changing.

The Birth of Atomic Physics

1979 German postage stamp depicting a diagram of splitting uranium atoms

The discovery of nuclear fission began with the birth of atomic physics and its related research into the components of the atom. In 1897, J. J. Thomson discovered the first subatomic particle, the electron, while working on experiments with cathode ray tubes. This discovery prompted further research into the structure of the atom. Fourteen years later, Ernest Rutherford discovered the nucleus of the atom when, to his surprise, alpha particles directed at a thin sheet of gold foil were occasionally reflected straight back toward the source. At this point the nucleus was thought to contain only positively charged protons; however, in the 1920s Rutherford hypothesized the existence of the neutron, a theoretical subatomic particle in the nucleus with no electric charge, to account for certain observed patterns of radioactive decay.

The neutron was discovered soon after by James Chadwick, a colleague and mentee of Ernest Rutherford, in 1932. The discovery of the neutron proved to be a critical step in the development of nuclear fission technology. Scientists soon realized that they could use the neutron to split heavy atomic nuclei. Since neutrons carry no electrical charge, they are not repelled by the positively charged nucleus the way alpha particles are, so they can penetrate and be absorbed by the nucleus. This makes the nucleus unstable and causes it to split into two or more smaller nuclei, releasing a tremendous amount of energy in the process.

The Discovery of Nuclear Fission

Shortly after the discovery of the neutron, scientists began using it to probe the structure of the atom further. In 1934 Enrico Fermi began using neutrons to bombard uranium atoms. He thought he was producing elements heavier than uranium, as was the conventional wisdom of the time.

The first experimental evidence for nuclear fission came in 1938, when the German scientists Otto Hahn, Fritz Strassmann, and Lise Meitner and their team also began bombarding uranium atoms with neutrons. As was so often the case in the early days of atomic physics, their results were completely unexpected. Instead of creating heavier elements, the neutrons split the nucleus to produce smaller, lighter elements such as barium among the radioactive decay products. At the time it was thought improbable that a neutron could split the nucleus of an atom. The experiments were quickly confirmed, and the first recognized instance of nuclear fission had been achieved.

It was quickly realized that if enough neutrons were emitted by the fission reaction it would create a chain reaction, releasing an enormous amount of energy in the process. By 1942 the first sustained nuclear fission reaction had taken place in Chicago. Hahn was awarded the Nobel Prize in Chemistry in 1944 “for his discovery of the fission of heavy nuclei”.

An Explosive Impact on Civilization

Nuclear fission is the process of splitting an atom into smaller fragments. The sum of the masses of the fragments is slightly less than the mass of the original atom, usually by about 0.1%. The missing mass is converted into energy according to Albert Einstein’s E=mc^2 equation.
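
That 0.1% figure is enough to reproduce the well-known energy scale of fission. A back-of-the-envelope Python sketch (the uranium-235 atomic mass used here is approximate):

```python
C = 299_792_458.0    # speed of light, m/s
U235_MASS = 3.9e-25  # approximate mass of one uranium-235 atom, kg

# Roughly 0.1% of the atom's mass is converted to energy per fission event.
energy_joules = 0.001 * U235_MASS * C**2
energy_mev = energy_joules / 1.602e-19 / 1e6  # convert joules -> MeV
print(f"~{energy_joules:.1e} J per fission (~{energy_mev:.0f} MeV)")
```

This lands on the order of the commonly quoted ~200 MeV per fission; a chemical reaction, by contrast, releases only a few electron-volts per atom, which is why fission is millions of times more energetic.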

The discovery of nuclear fission ushered in the atomic age, leading to inventions such as nuclear power and the atomic bomb, with world-changing consequences. Almost immediately after its discovery, scientists realized the immense power that could be unleashed by splitting the atom. In 1939, a group of influential scientists including Albert Einstein drafted a letter to President Franklin D. Roosevelt warning of the potential military applications of nuclear fission and urging the United States government to initiate its own nuclear research program. They speculated that Nazi Germany might be developing nuclear weapons of its own.

Cooling reactors of a nuclear power plant
(Credit: Wikimedia Commons)

In response, an Advisory Committee on Uranium was formed, which eventually led to the creation of the Manhattan Project. The Manhattan Project was officially launched in 1942 and led by J. Robert Oppenheimer at a secret facility in Los Alamos, New Mexico. The result of this massive scientific and engineering project was the development of the world’s first atomic bombs, which were ultimately used against Japan at the end of World War II.

A more positive benefit to civilization than atomic weapons is the development of nuclear energy. Nuclear energy produces extremely low amounts of greenhouse gases, making it a much cleaner alternative to fossil fuels. If humanity is to solve its climate crisis in the coming century, nuclear energy may prove to be a saving technology.

Continue reading more about the exciting history of science!

Marie Curie

Marie Curie

Marie Curie (1867 – 1934) was a Polish physicist and chemist who overcame gender discrimination in the sciences to conduct groundbreaking work on radioactivity. Her incredible scientific career earned her two Nobel Prizes in two different fields and the distinction of being the first woman to win the award.

Marie Curie was born in Warsaw to parents who were both teachers and very interested in science. She was the top student in her high school, passionate about science, and wanted to pursue a higher education; however, there were obstacles in her way. She was unable to enroll in traditional higher education institutions because she was a woman, and her family had little money to support her. To earn money for herself and to help support her sister’s studies she worked as a tutor, reading books on physics, chemistry, and mathematics in her free time. In 1891 she departed Poland for Paris, France to join her sister and pursue her studies. There she studied physics, chemistry, and mathematics, and once again was the top student in her class. She earned degrees in physics and mathematics, and in 1895 she married Pierre Curie. That same year Wilhelm Roentgen discovered x-rays. The following year Henri Becquerel discovered a new type of ray, resembling x-rays yet different, emanating from uranium.

Curie decided to study these new rays emanating from uranium and made a handful of remarkable discoveries. Her husband became interested in her work and joined her. Their joint work resulted in the discovery of two new elements – polonium, named for Curie’s home country of Poland, and radium, named for the word ray. They discovered that radium would continuously produce heat without any chemical reaction occurring, and that it emitted rays in far greater quantity than uranium. The term they coined for the phenomenon they were observing was radioactivity. The term stuck.

In December 1903 Marie Curie became the first woman ever to be awarded a Nobel Prize. Along with her husband Pierre, Marie won the prize in physics for her work in the field of radiation. The award brought recognition and money to the two scientists; however, they would not be able to enjoy it together for long. Pierre was killed in a tragic accident in 1906 when he was hit by a horse-drawn carriage while crossing the street. Marie Curie continued her scientific work after her husband passed away and was awarded a second Nobel Prize in 1911, this time in the field of chemistry. By then she had cemented her reputation as one of the elite scientists of her time.

Curie continued to work up until her death in 1934, when she died from a rare bone marrow disease. The disease was likely caused by her long-term exposure to radiation without proper protection. Her legacy is that of one of the greatest scientists of her time, and her work broke barriers for other women to pursue work in the scientific fields of their choosing.