The world around us is a remarkable structure, despite all of its annoying substructures such as hypocrisy, bureaucracy, environmental advocacy; communism, Nazism, feminism, terrorism, alarmism, NGOism; insanity, loop quantum gravity, pity; regulation, explosion, Californication; IRS, INS, HIV, AIDS, NOW, PC, NSDAP; silliness, laziness, and the Loch Ness monster.
What are the most general features of the natural laws and the environment that are required for the existence of worlds that qualitatively resemble ours? The entries below may look too constraining to some readers and too vague to others but a working draft may turn out to be useful anyway. In each case, I will try to explain why the feature is important and whether we understand its origin.
One time
The world is interesting because many things happen in it. The important processes that will be discussed below take place in this Cosmos. The future may support more organized structures than the past. Something that looks like a time coordinate is necessary. We can also see that this kind of world requires exactly one time coordinate, not more. If you have (at least) two large time coordinates t1, t2, then you can continuously change the direction of worldlines in the t1-t2 plane. This means that there can't exist any sharp difference between the past and the future.
In perturbative string theory, we naturally predict the spacetime to have one time coordinate from the requirement that the physical Hilbert space is positive-definite - a requirement discussed in the following paragraph. The Virasoro algebra is exactly as large as we need to decouple the negative-norm states arising from one set of timelike oscillators (and one set of longitudinal, i.e. spacelike, oscillators decouples at the same moment). Similar arguments hold in all approaches to physics that admit the light-cone gauge.
Postulates of quantum mechanics
Although quantum mechanics was born less than 100 years ago, I think it is fair to say that the basic principles of quantum mechanics are probably necessary for life, too. By basic principles, I mean the existence of a complex Hilbert space that encodes possible states of the Universe, the existence of Hermitian linear operators on this space that represent real observables that can physically distinguish different states and/or are responsible for their evolution, and the procedure to calculate probabilities of different outcomes of a sequence of events as squared absolute values of certain complex amplitudes. The last postulate requires the Hilbert space to be equipped with a positive-definite norm because negative probabilities make no sense even at the logical level. The evolution operators should be unitary in order for the total probability to be preserved.
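To make these postulates concrete, here is a minimal numerical sketch (a hypothetical two-dimensional toy Hilbert space, not tied to any particular physical system): a Hermitian observable, Born-rule probabilities as squared amplitudes, and a unitary evolution that preserves the total probability.

```python
import numpy as np

# a normalized state in a two-dimensional complex Hilbert space
psi = np.array([1.0, 1.0j]) / np.sqrt(2)

# a Hermitian observable (the Pauli matrix sigma_x) and its eigenbasis
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
eigenvalues, eigenvectors = np.linalg.eigh(sigma_x)

# Born rule: probability of each outcome = |<eigenvector|psi>|^2
probabilities = np.abs(eigenvectors.conj().T @ psi) ** 2
assert np.isclose(probabilities.sum(), 1.0)      # positive norm => probabilities add up to one

# a unitary evolution operator U = exp(-i * theta * sigma_x) preserves the norm
theta = 0.7
U = np.array([[np.cos(theta), -1j * np.sin(theta)],
              [-1j * np.sin(theta), np.cos(theta)]])
psi_t = U @ psi
assert np.isclose(np.linalg.norm(psi_t), 1.0)    # total probability is conserved
print(probabilities)                             # [0.5 0.5] for this particular state
```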
Quantum mechanics is needed in order to create sharply separated discrete states - such as the bases of DNA discussed below - in a continuous world: I am talking about discrete energy eigenstates. It is also necessary for allowing some rather unlikely processes such as quantum tunnelling. In a classical world, different possibilities would be connected by a continuum, the existence of discrete and binary codes such as the DNA code or binary computers would be extremely problematic, and small effects would always modify the uncorrected process by small amounts rather than allowing qualitatively different processes with small probabilities. Burning of stars, the transistor effect, and many other things would be at risk. Of course, in a canonical classical world that is easy to write down, the atoms would be unstable which would have truly catastrophic consequences, but even if you tried to design a more stable set of classical laws, for example by replacing point-like electrons by solid balls, I guess that life would not start in such a world.
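As a hedged toy illustration of these sharply separated discrete states (arbitrary units with hbar = m = 1 and a made-up finite square well), diagonalizing a finite-difference Schrödinger Hamiltonian yields a handful of well-separated bound-state energies rather than a continuum:

```python
import numpy as np

N, L = 1000, 20.0                            # grid points and box size (arbitrary units)
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

V = np.where(np.abs(x) < 2.0, -5.0, 0.0)     # a finite square well of depth 5 and width 4

# H = -(1/2) d^2/dx^2 + V(x), discretized with the standard three-point stencil
diagonal = 1.0 / dx**2 + V
off_diagonal = -0.5 / dx**2 * np.ones(N - 1)
H = np.diag(diagonal) + np.diag(off_diagonal, 1) + np.diag(off_diagonal, -1)

energies = np.linalg.eigvalsh(H)
bound_states = energies[energies < 0]        # negative energies = discrete bound states
print(bound_states)                          # a few sharply separated levels, then a continuum above zero
```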
According to the state-of-the-art picture of physical reality, the postulates of quantum mechanics as described above are exact. It seems hard to modify them or deform them and they don't need to be modified even if we want to describe processes that are seemingly as incompatible with quantum mechanics as black hole evaporation. As far as we can say, the postulates of quantum mechanics need to be imposed at the very beginning. String theory wouldn't have most of its cool properties such as dualities without quantum mechanics, but at present, quantum mechanics itself can't be derived from a deeper principle. Many leading string theorists believe that the quantum postulates will be "unified" with other features of the real world such as the geometry of various spaces (configuration spaces, phase spaces, moduli spaces, spacetime) in the future which could make quantum mechanics look much more inevitable than today.
Enough spacetime for complicated structures
One of the general features of our Universe is that it is somewhat large. There are many animals and people living on our planet, which orbits one star among tens of billions in a galaxy that is itself just one galaxy among tens of billions. You need billions of bits to store the information from this blog and about 10^{100} quantum bits to store the less important information about the rest of the Universe.
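A rough back-of-the-envelope check of how large the particle content of the observable Universe is (standard textbook values, order of magnitude only; the CMB photons are by far the most numerous particles):

```python
import math

n_photons = 4.11e8                     # CMB photon number density, ~411 per cm^3, expressed per m^3
R = 4.4e26                             # radius of the observable Universe in meters (~46 billion light years)
V = 4.0 / 3.0 * math.pi * R**3         # ~3.6e80 m^3
print(f"{n_photons * V:.1e}")          # ~1.5e89 photons, consistent with the ~10^90 quoted below
```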
Evolution needed a lot of space and time to run its course and there probably exist fundamental reasons why the complex processes could not have been too much faster. At the beginning, I wrote that time was essential and you could have asked why I didn't mention space. That was because I think that from a non-physics perspective, space that approximately follows the rules of Euclidean geometry is less essential. Nevertheless, now we see that we need a lot of room where the information may be stored. If you have a lot of "room", it does not have to be organized according to the rules of Euclidean geometry. But Euclidean geometry is certainly a natural choice. If you want to be more specific, life like ours requires 3+1 dimensions. But theoretical physicists would certainly not count a theory as "physically inconsistent" just because the dimension of its spacetime differs from 3+1.
At the end of the whole article, I will also argue that a symmetry between space and time is needed for some aspects of life.
How can we explain that the world is large? Where do the large numbers such as 10^{90}, encoding the number of particles in the Cosmos, come from? The best explanation we have is inflationary cosmology. Even if you start with a Planckian universe where all quantities are of order one, inflation can lead to a dramatic, exponential increase of the total volume and the total mass of the Universe: the energy density of the vacuum is essentially constant during inflation while the volume expands exponentially. Many quantities of this kind can naturally be calculated as exponentials of numbers that are expected to be of order 10 or 100. Although we cannot yet calculate the exact number of e-foldings from first principles, it is clear that the mystery of why the Universe is so huge significantly diminishes once we appreciate the power of inflation. Inflation simultaneously explains many other mysteries that are potentially necessary for decent life, such as the flatness and isotropy of the Cosmos.
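A minimal numerical sketch of the exponential bookkeeping behind this explanation (the number of e-foldings, taken here to be 60, is an assumed typical value rather than a derived one):

```python
import math

N_efolds = 60                               # an assumed number of e-foldings
growth_linear = math.exp(N_efolds)          # ~1.1e26: growth of the scale factor
growth_volume = math.exp(3 * N_efolds)      # ~1.5e78: growth of the volume
print(f"{growth_linear:.1e}  {growth_volume:.1e}")
# Because the vacuum energy density stays roughly constant during inflation, the
# total energy in the patch is also multiplied by ~exp(3N): a huge number obtained
# as the exponential of a modest one.
```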
Needless to say, a large spacetime in the era of thriving life also depends on a small value of the cosmological constant. We don't have a satisfactory explanation why it's so small. In other cases, I would be ready to accept the anthropic explanation of some principles of our world because these principles are qualitative ones - for example, the existence of one time coordinate or the validity of the quantum postulates. It is much harder for me to accept an anthropic explanation for quantitative features of reality.
Hierarchies and large dimensionless numbers
Inflation has allowed us to calculate some mysteriously large numbers as exponentials of more reasonable numbers which reduced the degree of mystery hiding in many numbers of that kind. However, there are other large universal constants that have not yet been explained by this exponential mechanism or a similar mechanism. One of them is the ratio of the Planck mass and the proton mass - the kind of number that governs the size of neutron stars (and perhaps other stars, too).
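A standard order-of-magnitude estimate of this point (factors of order one are ignored): the maximum number of baryons in a compact star is roughly the cube of the Planck-to-proton mass ratio.

```python
m_planck = 1.22e19             # Planck mass in GeV
m_proton = 0.938               # proton mass in GeV
m_proton_kg = 1.67e-27         # proton mass in kilograms
M_sun_kg = 2.0e30              # solar mass in kilograms

ratio = m_planck / m_proton                # ~1.3e19
N_baryons = ratio**3                       # ~2e57 baryons
M_star = N_baryons * m_proton_kg           # ~3.7e30 kg
print(f"N ~ {N_baryons:.1e}, M ~ {M_star / M_sun_kg:.1f} solar masses")
# within a factor of a few of the observed masses of neutron stars
```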
If evolution needs a lot of space and time, the stars that are naturally created and that must pump energy into this whole process must be large enough. Their size is dictated by various hierarchies. The ratio between the Planck scale and the QCD scale is a key player that helps to determine the size of some stars. The huge size of this particular ratio is understood, too. It's because the QCD scale is defined as the scale where the QCD coupling is of order one. It's plausible that the QCD coupling at the Planck scale is a reasonable number such as 1/25, and because this coupling only depends on the logarithm of the energy scale, we must go to very low energy scales to change 1/25 to 1. This is why the QCD scale is so much lower than the Planck scale. This is why the protons are so much lighter than the Planck mass, i.e. why their gravitational interaction is so much weaker than the other interactions between them. This is one of the conditions for evolution to have enough spacetime to proceed. Again, you see another example of an exponential explanation of a large constant.
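Here is that exponential explanation as a hedged one-loop sketch (a single beta-function coefficient with a constant number of flavors and no thresholds, so the numbers are purely illustrative): the scale where the coupling blows up is Lambda = M_Pl * exp(-2*pi / (b0 * alpha_Pl)).

```python
import math

M_planck = 1.22e19                    # GeV
b0 = 11 - 2 * 6 / 3                   # one-loop coefficient, b0 = 7 for six flavors

for alpha_planck in (1 / 25, 1 / 52): # 1/52 is roughly where the measured coupling extrapolates
    Lambda = M_planck * math.exp(-2 * math.pi / (b0 * alpha_planck))
    print(f"alpha(M_Pl) = 1/{1 / alpha_planck:.0f}  ->  Lambda ~ {Lambda:.1e} GeV")
# A modest change of the coupling at the Planck scale moves Lambda by many orders
# of magnitude - exactly the exponential sensitivity described above.
```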
In the previous paragraph, I was comparing the strong force with the force of gravity. The qualitative explanation of their difference exists. However, if we compare the electroweak interactions with gravity, we don't quite know where the huge ratio between their strengths comes from - not even qualitatively. This is called the hierarchy problem. Supersymmetry helps to make this problem less serious and the smallness of the weak scale more natural and more stable against corrections - but no one has found a sketch of a convincing calculation that would describe the ratio of the weak scale and the Planck scale as the exponential of a more reasonable number. If you were able to show that the weak scale and the QCD scale must be close to each other, it could also become a solution.
There exist other large dimensionless numbers in the real world that are important for life.
The proton-electron mass ratio, about 1836.15, must be large in order for the nuclei to behave classically, while the electrons determine the effective interactions between them.
The fine structure constant is about 1/137.036. Its smallness is necessary not only for the impressive quantitative success of perturbative QED, but it is also needed for non-relativistic physics to be a good approximation for the physics of atoms, because the speed of electrons in atoms divided by the speed of light is controlled by the fine structure constant.
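A small worked check of these two numbers (standard values, Bohr-model level of accuracy): the electron in a hydrogen atom moves at v/c ~ alpha, and nuclear motion is suppressed relative to electronic motion by a power of the mass ratio.

```python
alpha = 1 / 137.036            # fine structure constant
mass_ratio = 1836.15           # proton / electron mass ratio

print(f"v/c ~ {alpha:.4f}")                         # ~0.007: atomic electrons are safely non-relativistic
print(f"sqrt(m_e/m_p) ~ {mass_ratio ** -0.5:.3f}")  # ~0.023: nuclei move much more slowly, almost classically
```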
Nowadays, all parameters of the natural laws we use to describe the world are reduced to 30 or so parameters of the Standard Model (with neutrino masses) expressed in Planck units, plus the cosmological constant. The tiny cosmological constant is not really understood. Some patterns in the Standard Model parameters are qualitatively understood. For example, the neutrino masses are roughly where they are because of the see-saw mechanism, assuming that a relevant scale such as a GUT scale exists. Other constants have potential quantitative explanations - such as one relation between the three gauge couplings that can be deduced from grand unification. The hierarchy between the Planck scale and the Higgs vev (the weak scale) is not understood too well, much like the large ratios between the different Yukawa couplings. However, the top quark Yukawa coupling may have a semi-quantitative explanation, much like some relations between the bottom quark and tau lepton couplings. Some mysteries seem less serious than others, some large parameters seem to be more necessary for anthropic considerations than others, and so on - a typical example of chaos that is still waiting for a better explanation.
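As an example of how one of these qualitative explanations works numerically, here is the see-saw estimate with assumed illustrative inputs (an electroweak-scale Dirac mass and a heavy scale near a GUT-like scale; both numbers are chosen only for illustration):

```python
m_dirac = 100.0          # GeV, an assumed electroweak-scale Dirac mass
M_heavy = 1.0e15         # GeV, an assumed heavy right-handed neutrino scale

m_nu = m_dirac**2 / M_heavy              # the see-saw formula m_nu ~ m_D^2 / M
print(f"m_nu ~ {m_nu * 1e9:.2f} eV")     # ~0.01 eV, in the right ballpark for neutrino masses
```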
Causality and locality, at least approximate ones
When I mentioned that a time coordinate was needed, I implicitly required that this time coordinate gives us an ordering that can separate the past from the future by the present. The logic of our world is based on the past that can influence the future, but not the other way around. Using a more careful quantum language, the knowledge of the data from the past is used to predict the probabilities of different outcomes in the future, but I am convinced that you should never use these procedures backwards unless you become capable of accounting for the fact that the entropy in the past should be lower than today, which is dictated by another principle, the low-entropy beginning of the Universe. This asymmetric relation between the past and the future is what we call "causality".
We also mentioned that the Universe should be large. For structures and ideas to be developed independently and to have a value that can approximately be separated from the rest of the world, the phenomena should occur locally, without too strong an influence from the rest of the Universe. The most natural way to achieve this goal is to impose strict locality: a signal can only get from A to C if it first passes through some point B in between. At the end of the article, I will also discuss that the principle of relativity may be needed for life in general. String theory predicts that the rules of special relativity are always respected locally in spacetime. Special relativity strengthens the constraints of causality - you are not only unable to influence your past but you can't influence spacelike-separated events in spacetime either. In fact, locality and causality become more or less synonyms in the relativistic context.
Although one could perhaps imagine more general schemes where life would be possible and locality and causality would be just an approximation, string theory and observations seem to agree about a rather strong way to satisfy these constraints: local physics is Lorentz-invariant and satisfies the laws of special relativity.
The existence of classical limits
While some important processes in the world have a quantum-mechanical essence, it is fair to say that an overwhelming portion of the key processes in the Cosmos can be interpreted using the language of classical physics, at least if you don't care about the detailed microscopic origin of these phenomena. The functioning of a classical computer, the human brain, or DNA reproduction probably belongs to the classical realm: the decoherence time in all these cases is much shorter than the typical timescales at which these processes occur, which is why the classical intuition is justified.
The existence of justifiable classical limits is therefore guaranteed by a sufficiently large environment and sufficiently strong interactions with the environment. Once again, these things are large because the number of atoms in the brain and similar numbers are large. In this sense, the previous points about the existence of large numbers also explain why there must inevitably exist processes that admit a classical description. The very same large parameters also explain why thermodynamics is a good approximation, and so forth. There are many relationships between these principles even though they could seem independent a priori.
Nearly permanent sources of usable energy
Everyone knows that the energy needed for terrestrial life comes from the Sun's thermonuclear energy after all. It's a system of processes that clearly works and no sane person can argue otherwise. A question is whether very different alternatives are plausible.
I am not so sure. The energy from the Sun is modulated by various oscillating patterns at several timescales. It's a good driving force for many interesting events. You can hide from the Sun if you need to. The biosphere is getting about one part per million of its energy from geothermal sources. You could certainly design planets where this fraction would be higher. Some people have argued that life on such planets would not require any stars. I am not quite certain. Both answers seem plausible to me.
The large stars arise if the hierarchies explained above exist and if other parameters such as the cosmological constant belong to a certain window. Because this topic is often discussed in the anthropic context, I won't repeat these considerations here.
Large enough environment to get rid of entropy
Life leads to an evolution and creation of more organized structures. In some sense, more organized structures carry a smaller entropy than less organized ones. A decrease of the total entropy would violate the second law of thermodynamics. In reality, of course, the decrease of the entropy of an animal is overcompensated by the increase of the entropy of its environment. There must exist a sufficiently large environment that acts as a dumping ground for the entropy generated by life. Indeed, the Earth and perhaps the Universe are large enough for us to throw our entropy away. The environmentalist simpletons could call this emission of entropy away from the biosphere pollution, just like they do in the case of carbon dioxide. They call it pollution; we call it life.
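A standard rough estimate of why this dumping ground works so well (order-of-magnitude thermodynamics only, using dS ~ dE/T): each unit of energy arrives from the Sun as high-temperature radiation and leaves the Earth as low-temperature infrared, carrying much more entropy away than it brought in.

```python
T_sunlight = 5800.0       # K, effective temperature of the incoming solar photons
T_infrared = 290.0        # K, temperature of the re-radiated infrared photons

entropy_gain_factor = T_sunlight / T_infrared    # entropy per unit energy scales like 1/T
print(f"entropy exported / entropy imported ~ {entropy_gain_factor:.0f}")   # ~20
# That factor of ~20 per unit of energy is the budget that lets local structures
# (life) lower their own entropy without violating the second law.
```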
There are other processes involving entropy that may be needed for life. Non-gravitational physical systems typically maximize the entropy if they become completely uniform (like a purple gas). Gravity can change the rules of the game and clumped objects can actually carry a higher entropy and become natural final states of many processes. Black holes in particular carry the maximum entropy you can squeeze into a given volume. This feature of gravity was necessary for structure formation during the childhood of our Cosmos, and I recommend Brian Greene's "The Fabric of the Cosmos" for additional words about this interesting semi-philosophical topic.
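A worked example of the claim that black holes maximize entropy (standard constants; the Bekenstein-Hawking formula S/k_B = 4*pi*G*M^2/(hbar*c) for a roughly solar-mass black hole):

```python
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
hbar = 1.055e-34       # J s
c = 2.998e8            # m/s
M = 2.0e30             # kg, roughly one solar mass

S_over_kB = 4 * math.pi * G * M**2 / (hbar * c)
print(f"S/k_B ~ {S_over_kB:.1e}")      # ~1e77
# For comparison, the ordinary thermal entropy of the Sun is "only" of order 1e58 k_B,
# so gravitational collapse into a black hole raises the entropy enormously.
```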
Metastable carriers of information
Animals, plants, as well as computers need to sharply define the message they represent. They do so in the form of DNA codes or computer files. Some of these structures must occur naturally – so let us start with the DNA molecules. The information is encoded in the A/C/G/T(U) bases. If the sharp and discrete information were impossible, everything would be vague and fuzzy. Everything could continuously collapse to one particular configuration - a ground state. However, physical systems in the real world admit discrete states - for example discrete bound states that are energy eigenstates.
Mechanisms to xerox bits of information
A DNA molecule carries a message - a piece of life that wants to spread. A necessary ingredient for life to spread is a procedure that can copy bits of information. Because of the quantum xerox no-go theorem (the no-cloning theorem), these bits must always be classical bits and the mechanisms must fit into the framework of classical physics. As you know, the DNA molecule is a double helix, or a sequence of pairs of complementary bases. When it splits into two strands, new bases absorbed from the environment replace the old members of the pairs and two copies of the original DNA appear. The mechanisms that copy bits in classical computers differ substantially but in both cases, the copying of classical information is a must. Life before DNA - with RNA or protein folding only - was probably not able to lead to intelligent, complex life because of some rather general constraints, but some experts in biology will correct me if I am wrong.
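A toy sketch of that classical copying step (purely illustrative, ignoring all the actual biochemistry): split the double helix and rebuild each strand's partner from the complementary A-T / C-G pairing, which yields two identical copies - something the quantum no-cloning theorem forbids for unknown quantum states.

```python
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    """Build the complementary strand base by base."""
    return "".join(COMPLEMENT[base] for base in strand)

def replicate(double_helix: tuple[str, str]) -> list[tuple[str, str]]:
    """Split the two strands and give each one a freshly assembled partner."""
    strand_a, strand_b = double_helix
    return [(strand_a, complement(strand_a)), (complement(strand_b), strand_b)]

original = ("ACGTTAGC", complement("ACGTTAGC"))
copy_1, copy_2 = replicate(original)
assert copy_1 == copy_2 == original     # classical bits can be copied perfectly
```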
At any rate, the existence of these processes in biology and computation can be demonstrated in a simple non-relativistic quantum mechanical description of these systems. The presence of the relevant ingredients is a rather generic consequence of constraints from other sections of this article.
The reproduction of the information is guaranteed to be imperfect in the real world - and this imperfection and the resulting mutations are in fact necessary for the whole process of evolution to work. The technically difficult part is for Nature to make these processes reliable enough; making them a bit unreliable is almost never a difficult problem.
Processes that depend on the information and evaluate it
Classical computers can behave according to a program encoded in the memory. In the same way, cells are able to produce proteins if they interpret the DNA. The protein production can be viewed as a translation from the binary discrete world of the DNA codes to a more continuous world of mechanics and macroscopic biology. Computers often come with analogous devices that can translate a binary code to a physical object embedded in the continuous world.
Framework for natural selection
These physical objects - proteins or pictures printed on your printer - then behave according to more general, continuous laws of physics. Different programs, DNA codes - or scientific theories, for that matter - fight for a place under the Sun. Their interaction with the real world and with each other acts as a method to evaluate the value of the original algorithms, memes, genes, or DNA codes - and the fittest statistically survive more often than the less fit ones. This mechanism leads to progress and the evolution of more viable life forms. In the context of scientific theories, "more viable" should really mean "more true", and it does if the scientific methods to evaluate ideas are "artificially" inserted instead of the fights between the animals and other life forms. Everyone should know that evolution works through a combination of the key mechanisms above. Completely analogous mechanisms are at work when better ideas are being developed in science or when better products, technologies, or policies to regulate society are being looked for by humans.
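A minimal toy model of this copy-mutate-select loop (entirely schematic: a hypothetical target "genome", fitness function, and mutation rate, with nothing biological about the numbers): imperfect replication plus differential survival steadily pushes the population toward fitter genomes.

```python
import random

TARGET = "ACGTACGTACGT"                 # an arbitrary "ideal" genome, used only to define fitness
BASES = "ACGT"

def fitness(genome: str) -> int:
    """Count how many positions match the target."""
    return sum(a == b for a, b in zip(genome, TARGET))

def mutate(genome: str, rate: float = 0.02) -> str:
    """Copy the genome imperfectly: each base has a small chance to change."""
    return "".join(random.choice(BASES) if random.random() < rate else base
                   for base in genome)

population = ["".join(random.choice(BASES) for _ in TARGET) for _ in range(50)]
for generation in range(200):
    offspring = [mutate(parent) for parent in population for _ in range(2)]
    population = sorted(offspring, key=fitness, reverse=True)[:50]   # the fittest half survives

best = max(population, key=fitness)
print(best, fitness(best))              # typically converges to the target within a few hundred generations
```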
Once again, the microscopic realization of the basic steps differs substantially if we study different systems that are able to improve themselves. However, some counterpart of these assumptions is needed in all cases.
Less general and more technical requirements:
The principle of relativity
Theories that are normally considered in the context of modern theoretical high-energy physics are relativistic theories: they respect the laws of special relativity. In perturbative string theory, short distance physics automatically respects the Lorentz invariance because nonlinear sigma-models, when expanded at very short distances, always resemble the Polyakov action in which the symmetry between all spacetime coordinates (scalar fields on the worldsheet) is manifest. Open strings can break the Lorentz symmetry spontaneously by introducing a non-commutativity. But at some fundamental level, the Lorentz symmetry still holds. And these conclusions are probably true non-perturbatively, too. I certainly think that string theory predicts that special relativity will be verified successfully with ever greater accuracy - except for phenomena that result from well-understood properties of the environment. String theory predicts that similar violations of well-motivated principles, violations that are predicted by not-so-consistent alternatives, will remain absent well beyond the level at which they have been verified as of today.
Is the principle of relativity necessary for life? I would tend to answer Yes, it is. Our planet is moving within the Solar system and the Solar system is moving within the Milky Way. Our Galaxy is in motion with respect to the cosmic microwave background, too. These nonzero velocities are more or less inevitable consequences of the violent history of our world that was also necessary for the creation of life. We expect the "viability" of the life structures selected by natural selection to be independent of this motion at many different distance scales. Moreover, the life forms should still respect the rotational symmetry. I did not include the rotational symmetry among the essentials of life because we could hypothetically imagine a living world with a preferred direction, even though it is helpful if animals can use the very same methods to move and look in all directions. But if a sufficiently accurate rotational symmetry relevant for the life forms is a condition, the previous argument shows that the principle of relativity should be another condition.
In the actual theories we use and believe, the rotational symmetry and the principle of relativity are of course naturally unified in the Lorentz group. Because we are aware of no experimental evidence of a Lorentz violation, because such a violation does not seem necessary for life, and because it seems incompatible with the deepest description of the real world we have, namely string theory, I am among those who think that the Lorentz symmetry of local physics is probably an exact law of Nature. In our world, the principle of relativity is the relativistic one, with a finite speed of light; no, this sentence is not a tautology. It is certainly necessary for life and cosmology as we know them but I don't know whether you could construct life in a non-relativistic world. At any rate, if you believe that special relativity and quantum mechanics from the beginning are conditions, it also means that the world should obey the rules of quantum field theory at long distances because quantum field theory follows from quantum mechanics and special relativity.
In combination with the requirement of gravity below, we can also deduce that string theory is needed for life even though we can’t yet offer a proof that every honest person with IQ above 100 would have to accept immediately.
The existence of fermions (and atoms)
The most obvious observation that makes fermions, particles that obey the Pauli exclusion principle, essential for life is the diversity of atoms. Chemistry as we know it is based on a table of elements with very different properties. Some of the qualitative properties are a quasiperiodic function of the atomic number. As you know, this is only possible because at most one electron can occupy a state with all quantum numbers fixed. The existence of fermions is probably important for other aspects of life, too. In the context of string theory, fermions are one of the reasons not to take bosonic string theory seriously. In other words, fermions force us to consider superstring theory, which has other virtues such as the absence of bulk tachyons, a necessary condition for the perturbative stability of the vacuum. In the presence of fermions, supersymmetry is a natural extension of the well-established symmetries of spacetime. Some people feel certain that supersymmetry strictly below the Planck scale - and probably close to the electroweak scale - is an inevitable prediction of string theory for the real world. Others disagree.
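A schematic sketch of how the exclusion principle produces this quasiperiodicity (the Madelung/aufbau filling order is taken as given and all subtleties of real chemistry are ignored): each (n, l) subshell holds only 2(2l+1) electrons, so the outermost configuration, and with it the chemistry, repeats.

```python
SUBSHELLS = ["1s", "2s", "2p", "3s", "3p", "4s", "3d", "4p"]   # Madelung order, truncated at Z = 36
CAPACITY = {"s": 2, "p": 6, "d": 10}                           # 2 * (2l + 1) states per subshell

def valence_subshell(atomic_number: int) -> str:
    """Fill subshells in order, respecting the exclusion principle, and return the outermost one."""
    electrons = atomic_number
    for subshell in SUBSHELLS:
        capacity = CAPACITY[subshell[-1]]
        if electrons <= capacity:
            return subshell
        electrons -= capacity
    raise ValueError("atomic number beyond this truncated table")

# Lithium (3), sodium (11) and potassium (19) all end with a lone s electron,
# which is why the alkali metals behave so similarly; without the exclusion
# principle, every electron would simply sit in the 1s state.
print([valence_subshell(z) for z in (3, 11, 19)])   # ['2s', '3s', '4s']
```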
The existence of a universally attractive force (gravity)
As mentioned previously, gravity has the ability to clump matter without violating the second law of thermodynamics that normally drives systems (such as gases) towards uniformity. This is related to the fact that gravity is universally attractive: it is sourced by a positive-definite quantity, namely the mass. Forces with both signs of charge, such as electromagnetism, tend to create neutral systems such as atoms; the leading force between neutral systems vanishes. This "neutral" outcome is even more obvious for confining forces such as the strong force because charged (colored) systems are completely unphysical and cannot appear in isolation. It seems that a force similar to gravity must survive after the "neutralization" time scale if we still want planets that orbit their stars at a fair distance. Needless to say, gravity in the very narrow sense - a spin 2 force that respects the diffeomorphism gauge symmetry at low energies - is an undeniable prediction of string theory.
The existence of a U(1) gauge field at low energies (electromagnetism)
This is a more debatable, technical feature of the real world. Nevertheless, life similar to the life we know requires at least one unbroken U(1) at low energies. Ions with both signs of electric charge seem to be essential building blocks for many compounds and the key players in processes in chemistry and biochemistry. Electromagnetism controls virtually everything we know about life and technology and it is of course very hard to imagine a world without electromagnetism. Although I have no complete proof, my prejudice is that a world whose only force is the SU(3) Yang-Mills interaction of QCD could not produce intelligent life. One problem with such a world would be that animals could not see - photons, and arbitrarily soft photons in particular, are important for transferring information without paying huge amounts of energy.
The existence of hierarchies, the Born-Oppenheimer approximation, and many "nuclei" (strong force)
At the beginning, I mentioned the importance of large dimensionless numbers that allow the world to be large and complex. One of the minor examples was the proton-electron mass ratio that allows the motion of nuclei to be interpreted classically, as the Born-Oppenheimer approximation suggests. But do we need the nuclei at all? We have mentioned that different atoms were needed, which required the Pauli exclusion principle. This principle controlled the electrons. But frankly speaking, electrons are not enough. You want the atoms to be neutral. There must exist something like the nuclei that can carry many values of the charge. If we want to avoid the introduction of new special parameters for each nucleus, the structure of the nuclei should be governed by more fundamental microscopic laws of physics. The laws we know from our world are described by QCD and I am not able to imagine a working, appealing theory that would allow many nuclei and atoms while avoiding QCD-like physics. Again, I have no proof of a no-go theorem.
Summary
As you can see, many features of particle physics did not appear in the lists above. It is fair to say that I am not aware of any truly solid arguments that the existence of some life requires entities such as weak interactions, additional families of (heavy) quarks and leptons, CP-violation (except for baryogenesis or leptogenesis), a small CP-angle, and so forth. The apparent fact that some features are needed and some features are not is enough to convince yourself that the anthropic principle can't be the universal answer to all the questions that can't be answered at present. In other words, you have every right to say that the superficial successes of the principle in some cases are just coincidences.
The anthropic principle, much like this text, is a method to confirm that our world makes sense after all. But it is not a good framework for producing new predictions.