Changing Constants
We have so far been entertaining the possibility that the distribution
of matter, or the rate of expansion of the universe, might differ from place to
place or on the average. This does not involve ditching any cherished notions
of physics. But it is interesting to consider the consequences of a world in
which the constants of physics are slightly changed -- either in a hypothetical
universe with different true constants or a world (which might be ours) in
which they are not all truly constants but either vary from place to place very
slowly or have their origins in processes with quasi-random aspects which could
have fallen out differently. The first detailed considerations of this sort
were made by Hoyle in 1953 when he recognised that the presence of a
significant level of carbon in the universe hinges upon a fine coincidence of
physical constants taking values which just allow the carbon nucleus to possess
a resonance for the production of carbon from helium, while the oxygen nucleus
just fails to possess a resonance for the reaction that would then burn all of
the carbon away into oxygen.
Later, Freeman Dyson pointed out the significance of the non-existence of the
diproton, helium-2, in Nature. If it did exist then very rapid hydrogen burning
would allow stars to race through their evolutionary history, producing black
holes and dead relics long before planets could form or life could evolve.
Considerations like these, together with the light that superstring
theories have shed upon the origins of the constants of Nature, mean that we
should assess how narrowly defined the existing constants of Nature need to be
in order to permit biochemical complexity to exist in the Universe. For example, if we were to allow the ratio
of the electron and proton masses, β = m_e/m_N, and the fine structure constant,
α, to change their values (assuming that no other aspect of physics is changed
by this assumption -- which is clearly going to be false!) then the allowed
variations are very constraining. Increase β too much and there can be no
ordered molecular structures, because the small value of β ensures that
electrons occupy well-defined positions in the Coulomb field created by the
protons in the nucleus; if β exceeds about 5 x 10^-3 α^2 then
there would be no stars; if modern grand unified gauge theories are correct
then α must lie in the narrow range between about 1/180
and 1/85 in order that protons not decay too rapidly and a fundamental
unification of non-gravitational forces can occur. If, instead, we consider the
allowed variations in the strength of the strong nuclear force, α_s, and in α,
then roughly α_s < 0.3 α^(1/2) is required for the stability of
biologically useful elements like carbon. If we increase α_s
by 4% there is disaster, because the helium-2 isotope can then exist (in
practice it just fails to be bound, by about 70 keV) and allows very fast
direct proton + proton → helium-2 fusion. Stars would rapidly exhaust their
fuel and collapse to degenerate states or black holes. In contrast, if α_s
were decreased by about 10% then the deuterium nucleus would cease to be bound
and the nuclear astrophysical pathways to the build-up of biological elements
would be blocked. Again, the conclusion is that there is a rather small region
of parameter space in which the basic building blocks of chemical complexity
can exist.
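To make the logic of these windows concrete, here is a minimal sketch in Python
(not part of the original essay) that simply encodes the rough bounds quoted
above as predicates. The normalisations of β, α and α_s are taken at face value
from the wording of the text, and the trial values plugged in at the end are
purely hypothetical, chosen only to exercise the tests rather than to represent
measured constants.

    # The life-permitting "windows" described above, encoded as simple
    # predicates.  The bounds follow the wording of the text; the
    # normalisations of beta, alpha and alpha_s are taken at face value.

    def stars_possible(beta, alpha):
        """No stars if beta exceeds about 5e-3 * alpha^2."""
        return beta < 5e-3 * alpha**2

    def protons_stable_and_unified(alpha):
        """Grand unified gauge theories: alpha between about 1/180 and 1/85."""
        return 1/180 < alpha < 1/85

    def carbon_stable(alpha_s, alpha):
        """Roughly alpha_s < 0.3 * alpha^(1/2) for stable biological elements."""
        return alpha_s < 0.3 * alpha**0.5

    def nuclear_pathways_open(alpha_s, alpha_s_reference):
        """A ~4% increase in alpha_s binds helium-2 (catastrophically fast
        p + p burning); a ~10% decrease unbinds deuterium.  Both are fatal."""
        shift = (alpha_s - alpha_s_reference) / alpha_s_reference
        return -0.10 < shift < 0.04

    # Hypothetical trial values, chosen only to exercise the predicates.
    alpha, beta, alpha_s, alpha_s_ref = 1/137.0, 1.0e-7, 0.02, 0.02
    for label, ok in [
        ("stars", stars_possible(beta, alpha)),
        ("proton stability / unification", protons_stable_and_unified(alpha)),
        ("carbon stability", carbon_stable(alpha_s, alpha)),
        ("nuclear pathways (helium-2 / deuterium)",
         nuclear_pathways_open(alpha_s, alpha_s_ref)),
    ]:
        print(f"{label}: {'allowed' if ok else 'blocked'}")

Run with those trial numbers, every window is reported as 'allowed'; nudging
α_s upward by more than 4% of its reference value, or moving α outside the
range 1/180 to 1/85, flips the corresponding verdict to 'blocked'.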
We should stress that conclusions regarding the fragility of living
systems with respect to variations in the values of the constants of Nature are
not fully rigorous in all cases. The values of the constants are simply assumed
to take different constant values from those that they are observed to take, and the
consequences of changing them one at a time are examined. However, if the
different constants are fully linked together, as we might expect for many of
them if a unified Theory of Everything exists, then many of these independent
variations may not be possible. The consequences of a small change in one
constant would have further necessary ramifications for the allowed values of
other constants. One would expect the overall effect to narrow still further
the range of allowed variations that are life-supporting. For examples of such coupled
variations in string theories see refs.
These considerations are likely
to have a bearing on interpreting any future quantum cosmological theory. Such
a theory, by its quantum nature, will make probabilistic predictions. It will
predict that it is ‘most probable’ that we find the universe (or its forces and
constants) to take particular values. This presents an interpretational problem
because it is not clear that we should expect the most probable values to be
the ones that we observe. Since only a narrow range of the allowed values for,
say, the fine structure constant will permit observers to exist in the
Universe, we must find ourselves in the narrow range of possibilities which
permit them, no matter how improbable they may be. This means that in order to fully test the
predictions of future Theories of Everything we must have a thorough
understanding of all the ways in which the possible existence of observers is
constrained by variations in the structure of the universe, in the values of
the constants that define its properties, and in the number of dimensions it
possesses.
Contributed by: Dr. John Barrow