Now, it should be noted that most cosmologists currently do not expect that the big crunch will happen. The best guess based on current observation and theory is that the universe is open; that is, it will expand forever. In fact, since 1998, it has been established that the expansion of the universe is accelerating. Rather than a big crunch, we are heading toward an ultimately dead universe composed of only photons and neutrinos, or whatever are the remaining stable particles. So, attractive as it may seem for those who want to live forever, Tipler's scenario is not very promising.

REDUCTIONISM VERSUS HOLISM

 

A major contrast between religious or spiritual thinking and science concerns whether or not physical phenomena can simply be reduced to the sum of their parts.

As apologist William Grassie notes, the word science derives from the Latin scire, “to know,” and perhaps also from the Latin scindere, “to split,” and the Sanskrit chyati, “to cut off.” Presumably these roots suggest the scientific method of breaking things into parts. The word religion came from the Latin religare, “to bind together.” Grassie suggests, “The concepts of reductionism and holism are embedded in the very etymology of the two words.”20

In chapter 6 we saw that the current standard model of elementary particles and forces is fully reductionist. This notion disturbs those who see themselves as part of a great, integrated whole. While notions of a holistic science are bandied about, nothing much has come of them (recall S-matrix theory in chapter 6). The dominant methodology of science remains reductionist.

In the holistic view of life, every event is part of a grand scheme that applies, under divine guidance, to the whole system, from bacteria to humans and from billions of years ago to the present and indefinite future. In the view of quantum spiritualists, subatomic events are part of a grand scheme that applies holistically to every particle, from an electron in a french fry at McDonald's to a photon in the cosmic background radiation billions of light-years away and billions of years in the past. In the reductionist view, physics as well as biology is broken down into a series of events local in space and time—collisions between subatomic particles such as electrons and photons.

Classical physics is reductionist. While direct proof of the existence of atoms was not found until the twentieth century, they were implicit in Newtonian mechanics, which was able to describe all of the behavior of macroscopic material systems—gases, liquids, and solids—in terms of the motions of their parts.

The great twentieth-century advances of relativity and quantum mechanics did not, as you often hear, “prove Newton wrong.” Any scientific model is valid in a limited domain. While the advance of science since Newton has been remarkable, it is not as incredible as it is often blown up to be. Science evolves, so progress is to be expected. Nevertheless, the physics of Galileo and Newton, their predecessors such as Descartes, and their successors such as Laplace, Einstein, and Feynman, remains much the same in basic principles and methods. Today's physics students still study Descartes' analytic geometry and his Cartesian coordinates, Newton's and Leibniz's calculus, Galileo's relativity, and Newton's laws. They learn the same definitions of space, time, velocity, acceleration, mass, momentum, and energy that are found in Newton's mechanics.

It's true that some of the equations have been modified for high speeds, and some new equations have been added for small distances. Still, the equations of relativity derived by Einstein, both the special and general versions, reduce to the Newtonian equations when the speeds of the bodies involved are small compared to the speed of light in a vacuum, c. Similarly, the equations of quantum mechanics, derived by Max Planck, Niels Bohr, Erwin Schrödinger, Max Born, Pascual Jordan, Werner Heisenberg, Paul Dirac, Richard Feynman, and others, reduce to the Newtonian equations when Planck's constant h is set equal to zero.
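To make the first limit concrete (a sketch, not worked out in the text), the relativistic kinetic energy of a body of mass m reduces to the familiar Newtonian expression when its speed v is small compared to c:

$$ KE = \frac{mc^{2}}{\sqrt{1 - v^{2}/c^{2}}} - mc^{2} \;\approx\; \frac{1}{2}mv^{2} \quad (v \ll c). $$

Likewise, the quantum commutation relation between position and momentum, [x, p] = iħ, goes over to ordinary commuting classical variables as Planck's constant goes to zero.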

The basic scientific method of observation and mathematical model-building remains unchanged. We physicists still model the world as material bodies, and we build mathematical models to describe what we observe of those bodies every day around us, in the laboratory, and in the cosmos.

New Age spiritualists and Christian apologists have appropriated quantum mechanics to claim a more holistic picture of nature. However, the standard model of particles and forces developed in the 1970s has agreed with all the data gathered at particle accelerators since then, and is only now being seriously tested at the Large Hadron Collider in operation in Geneva. Discoveries at the LHC are unlikely to change the general reductionist scheme.

In short, reductionism in science remains consistent with all the data. It isn't defeated just by the fact that it can't derive everything that happens. It still works. Holism has no evidentiary support. It doesn't work. Holism is nothing more than the wishful thinking of those who have the hubris to believe they are an important part of some cosmic plan.

WEAK EMERGENCE

 

Nevertheless, many observers think they sense a deviation from strict reductionism in the scientific observations made above the level of elementary particles. They argue that new principles “emerge” at those levels that do not simply arise from particle interactions. Grassie asserts, “The concept of emergence says simply that the whole is greater than the sum of its parts.”21

Although comprising only 5 percent of the total mass and energy of the universe, up and down quarks, electrons, and photons are all that are needed as ingredients of conventional matter in a working model for those observable phenomena that are of direct concern to most humans. Whether you are a condensed matter physicist, a chemist, a biologist, a neuroscientist, a sociologist, a surgeon, or a carpenter, all the stuff you deal with in your work is made of up and down quarks, electrons, and photons. Only elementary particle physicists and cosmologists worry about the other 95 percent of matter.

The conventional reductionist picture envisages a series of levels of matter. From elementary particles (or strings, or whatever are the most elementary) we move to the nuclei of atoms, then to the atoms themselves and to the molecules that are composed of chemical atoms. While only on the order of a hundred different chemical atoms exist, the number of molecules is endless—especially the huge structures built around carbon that form the ingredients of life and our fossil fuels, as well as many synthetic materials, from plastics to polyesters.

The objects of our everyday experience are composed of molecules. Living organisms are an important component at this level, at least to us living organisms. How important they are on a cosmic scale is more dubious. Humans organize themselves into societies, so we can regard social systems, politics, and economics as a yet higher level of material existence. Beyond that we have Earth and its complex systems, the solar system, our galaxy, other galaxies, and whatever else is out there, such as black holes, the cosmic background radiation, dark matter, dark energy, and other universes.

In 1972, Philip W. Anderson, the eminent condensed matter physicist who won the Nobel Prize in physics in 1977, wrote an article in Science with the title “More Is Different.” In the article, he complained about the implication of the reductionist notion that all the animate and inanimate matter of which we have detailed knowledge is controlled by the same set of fundamental laws. In that case, Anderson stated, “The only scientists who are studying anything really fundamental are those working on those laws. In practice this amounts to some astrophysicists, some elementary particle physicists, some logicians and other mathematicians, and a few others.”22 Anderson pointed out that many properties of complex systems cannot be derived from particle physics. Wow! No one had ever realized that before.

Obviously an elementary particle physicist cannot take her equations and produce a derivation of every physical property we observe at all levels. She cannot calculate the structure of DNA from “first principles” or predict the stock market. At every level of matter, from the smallest bodies to the largest, we have specialists developing the principles that apply at that level by applying the standard method of science—observation, model building, and hypothesis testing. The principles that apply at each level are said to “emerge” from the level below.

But the fact that we cannot derive these emergent principles from particle physics does not prove that everything cannot be just collections of particles. The practical irreducibility of emergent principles to particle physics is trivially an epistemological irreducibility. The key question is whether it is also an ontological irreducibility.

Many scientists who work at the higher levels of experience think emergence is ontologically irreducible. They argue that there is more to it than simply emergence from below. Since they never need to use any particle physics in their work, and are rarely exposed to it during their training, it seems to them that the regularities they uncover arise from an independent set of principles that apply at a higher level than quarks and electrons.23

For example, biologist Stuart Kauffman proposes that self-organization plays a role in biological evolution and the origin of life, which, he insists, cannot be reduced to chance and natural selection alone.24

Theologians and religious apologists are quick to agree. William Grassie writes, “From the surface tension of water in a glass to superfluidity and superconductivity in a physicist's lab, the behavior of huge numbers of particles cannot be deduced from the properties of a single atom or molecule.”25 This is misleading. Of course we don't deduce the behavior of a group of many particles from just the properties of a single particle; rather, we deduce that behavior by considering a system of many particles, each with its individual properties.

In chapter 5 we described two examples from physics that lend credence to the notion that higher-level principles emerge from those below with no further inputs required. In the nineteenth century, thermodynamics developed from the need to understand the engines of the Industrial Revolution. By the usual combination of creative thinking nourished by empirical data, principles such as the first and second laws of thermodynamics and the ideal gas law were discovered without any reference to the underlying nature of matter.
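For reference (the equation itself is not written out in this passage), the empirical ideal gas law relates pressure P, volume V, and absolute temperature T for n moles of gas, with R the gas constant:

$$ PV = nRT. $$

Nothing in this relation refers to atoms or molecules; it was established purely from macroscopic measurements.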

Then, James Clerk Maxwell, Ludwig Boltzmann, and Josiah Willard Gibbs showed that the principles of thermodynamics could be derived from the assumption that matter is composed of huge numbers of tiny particles that move around and collide with one another according to the laws of Newtonian mechanics. They knew it was impossible to predict the motion of each particle individually, so they used probability theory to predict the average behavior of many-particle systems in what is called statistical mechanics.

Assuming the particles in a gas in equilibrium moved around randomly, Maxwell, Boltzmann, and Gibbs showed that the pressure in a container of a gas resulted from the momentum transferred when the particles collided with the walls of the container. The absolute (Kelvin) temperature of the gas was proportional to the average kinetic energy of the particles. The first law followed from conservation of energy. The second law followed from the tendency of a system of many particles to become more disorderly with time, as multiple collisions of particles with one another tend to wipe out any initial local regularity.
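A minimal sketch of that kinetic-theory argument, under the usual textbook assumptions (N identical molecules of mass m moving randomly in a volume V): the pressure on the walls and the temperature work out to

$$ PV = \tfrac{1}{3} N m \langle v^{2} \rangle, \qquad \tfrac{1}{2} m \langle v^{2} \rangle = \tfrac{3}{2} k T, $$

which together give PV = NkT, the same gas law that had earlier been found empirically. The macroscopic law is recovered from nothing but particle mechanics and averaging.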

In short, thermodynamics emerged naturally from statistical particle mechanics. Every emergent thermodynamic principle that was originally introduced to explain macroscopic phenomena was derived from particle mechanics and statistics. The new picture was simpler, more elegant, and far more useful as physics entered the quantum era. And the whole was still equal to the sum of its parts.

A second example of “emergent” principles demonstrating how the whole is still equal to the sum of its parts comes from quantum mechanics. Two kinds of particles exist: fermions with half-integer spins, and bosons with integer spins. The two types have diametrically opposed behavior when they are part of a collection of particles. No two identical fermions, such as electrons, can exist simultaneously in the same quantum state. This is called the Pauli exclusion principle.

Starting with hydrogen, each successive element is formed by adding a proton to the nucleus (along with enough neutrons to keep the nucleus stable) and an electron to keep the atom electrically neutral. These electrons cannot all fit in the lowest energy level, and so some must move to higher levels, or “shells.” In this way the Periodic Table of the chemical elements is built up. Without the Pauli exclusion principle, the complexity of atoms would not exist, and neither would life as we know it.
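As a quick illustration (not spelled out in the text), the exclusion principle limits the nth shell to at most

$$ N_{\max}(n) = 2n^{2} $$

electrons, giving capacities of 2, 8, 18, 32, and so on. That is why, for example, helium (2 electrons) and neon (2 + 8 electrons) have closed shells and are chemically inert, while sodium's eleventh electron must start a new shell, making it highly reactive.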

On the other hand, identical bosons, such as helium atoms, tend to congregate in a single state, making possible phenomena such as lasers, superconductivity, superfluidity, and boson condensates.

These are all “emergent” phenomena that occur only for collections of particles. This might be regarded as the result of some holistic principle. However, their behavior can be derived from the basic physics of individual particles and their interactions with other individual particles.

As a third example, chaos theory, described in chapter 6, can be used to further demonstrate how the existence of emergent principles does not imply the whole is greater than the sum of its parts. Basic statistical mechanics applies to simple systems of many particles that are at or near equilibrium, that is, fairly uniform and homogeneous throughout with the same temperature and pressure everywhere within the system. However, most many-particle systems are not that simple. In particular, a special kind of system exists called a nonlinear dissipative system, in which the motions of the bodies are so complicated that the mathematical equations describing those motions cannot be solved by standard methods and equilibrium statistical methods do not apply. Earth's atmosphere is a prime example, with its turbulence and large temperature and pressure variations from place to place and time to time.
