Thursday, January 11, 2007

Raphael Bousso: probabilities in the landscape

Raphael Bousso (Berkeley) gave a very inspiring talk about the probabilities in the landscape.

This topic may be viewed as a controversial one but in the seminar rooms, the controversy appears at a slightly different and more quantitative level than what the critics of science and - let me be extremely polite - their fellow [potentially offending word has been removed] readers would lead you to believe. No one offers comments like "this is all unscientific" because the people in the room actually have to look for real answers to some of the questions instead of just screaming at other people. ;-)

Why Raphael Bousso and others think that the anthropic principle is almost inevitable

As our discussions with him have clearly revealed, Raphael is a firm anthropic believer. And The Reference Frame admits that he definitely has some reasons for his opinions. As described e.g. in Polchinski's review of the cosmological constant problem, it seems to be a very difficult task to find a non-anthropic explanation of the small value of the cosmological constant, for several reasons. Most of these reasons are, in fact, independent of string theory, for example:
  • the value of the cosmological constant is so small that this effect played almost no role in the Planckian era, so it is hard to see how a physical mechanism could have depended on this tiny quantity and how the hypothetical mechanism could have selected the right value
  • many subtle but unacceptably large contributions to the vacuum energy coming e.g. from loops of electrons or nuclear matter etc. seem to exist because the underlying quantum effects have been tested in many experiments; by the equivalence principle, which has also been tested, it seems necessary to assume that these contributions gravitate, too
  • unlike the masslessness of fermions such as neutrinos that can be explained by the chiral symmetry that seems to be pretty much unbroken experimentally, the known symmetries that could underlie the vanishing/smallness of the cosmological constant - such as supersymmetry - have been observed to be pretty strongly violated and the corresponding contributions from this violation to the vacuum energy are already far too high; Raman Sundrum's ghosts are an example of how weird the theories you must consider become if you want to find another symmetry principle
  • if you imagine any mechanism that worked before the electroweak symmetry breaking and that is supposed to have chosen the right value of the vacuum energy we observe today, you run into a contradiction because at that time, our Universe was actually in a different regime in which the vacuum energy differed by roughly a TeV^{4}, and it seems acausal to think that this mechanism was deciding about the tiny future value of the constant that we see today; a similar comment applies to many other phase transitions including the inflationary era when the actual vacuum energy was much higher as well
Fine. None of these things is a proof of the anthropic lack of principles, as Raphael agrees - there can still exist a dual description of reality that makes the smallness of the cosmological constant manifest and can be shown to be equivalent to quantum field theory and the known formulations of string theory - but the union of these arguments leads him to believe that the anthropic explanation that depends on the observers is more or less inevitable. He is, however, unsatisfied with the details of the existing picture. The goal of this approach is to get from the anthropic philosophy to predictions, which, according to Raphael, may be done in three steps:
  • constructing and understanding the set of vacua, the stringy landscape
  • geometrizing and realizing the points in the landscape as regions of a real multiverse, typically with eternal inflation or something similar
  • counting the observers according to some criteria
Raphael leaves the first task to those string theorists who actually like to study flux compactifications of string theory etc. and focuses on the other two points. He has something to say about both of them.

Volume in eternal inflation

If we want to calculate the probability that an observer ends up in a given vacuum, we need to draw a Penrose diagram of a typical inflating universe and compute the volumes, weighted by some appropriate densities of observers, that are associated with the individual minima in the landscape.

That seems to be an extremely ambiguous and ill-defined task. We have mentioned these difficulties in the context of Vilenkin's talk about the same topic. In most cases, your Penrose diagram will look like de Sitter space in the far future - a horizontal straight line terminates your spacetime. But there can be many V-like triangles embedded into each other and attached to the horizontal line that describe expanding de Sitter universes that tunnel into each other.

If you end up in flat space with a vanishing cosmological constant, you should decorate a horizontal segment with an A-like triangle at the top - with the future half of the Minkowski space Penrose diagram. If you end up with a negative cosmological constant, you will evolve into a Big Crunch: the horizontal line interval becomes a horizontal wiggly line interval representing a singularity. There has been some disagreement about whether the shape of this wiggly line can be globally removed by a conformal transformation but let's not discuss it here.

Fine. The anthropic counters would now draw a horizontal slice through this spacetime - near the future boundary of this spacetime - and compute the fraction of the volume occupied by various types of vacua. The result will depend on how you draw the slice. And it's a complete mess.

Raphael views this approach as a flawed one. He thinks that the Penrose diagram of an evaporating black hole is an important lesson that tells us that we should never simultaneously look at many portions of a Penrose diagram that no observer can see. In the case of the black hole, a nearly horizontal slice could lead you to believe that the initial information has been xeroxed - it appears both in the interior as well as in the Hawking radiation - which is bad. You should never look at the interior and the exterior of this black hole simultaneously. If you restrict your description of reality to the perspective of the external observer only or the internal observer only, you will see no xeroxed information or other obvious contradictions.

Raphael rightfully argues that if this lesson is true in the black hole setup, it should have broader implications, especially for cosmology. The primary implication for the anthropic probabilistic calculus is that we shouldn't draw the long horizontal lines in the upper parts of Penrose diagrams.

Instead, we should always focus on a causal diamond. A causal diamond is a region of spacetime associated with a particular time-like worldline. It is the intersection of the
  • union of the (solid) future light cones of all points along the worldline
  • union of the (solid) past light cones of all points along the worldline
Physically, it means that the causal diamond consists of all events in spacetime that can be affected by a decision of the observer in the past but that can still influence the observer on the same worldline in the future so that the observer can say that she has probed the event.
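
In symbols - my compact restatement using the standard causal-structure notation J^{±}, not notation taken from the talk - the diamond of a worldline gamma is:

```latex
% Causal diamond of a timelike worldline \gamma: the intersection of
% the union of future light cones and the union of past light cones
% of its points (J^{\pm}(p) = causal future/past of the event p).
D(\gamma) \;=\; \Big( \bigcup_{p\,\in\,\gamma} J^{+}(p) \Big)
          \;\cap\; \Big( \bigcup_{p\,\in\,\gamma} J^{-}(p) \Big)
```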

Raphael used the word "observer" but because this word has led to some confusion - whether one can believe that the density of observers is negligible in some situations even though we seem to have an observer - he retracted the word "observer" and only talked about "worldlines" in the context of these causal diamonds.

The principle that you should only look at the causal diamonds instead of bigger regions in spacetime is usually described by the adjective "holographic" because it has something to do with black hole complementarity, as we have shown above, which may itself be viewed as a manifestation of holography. But you can view the word "holographic" to be nothing else than "Boussian" because Bousso himself is holographic.

Instead of calculating the volumes on a horizontal spatial slice, Raphael wants to focus on the worldline - a typical worldline, whatever it exactly means (there have been discussions whether worldlines of people orbiting a black hole or the Sun may ever be typical) - and count how many observers living in different Universes the non-existent observer living on that worldline will see. ;-) This counting, if done properly, decides the relative voice of different vacua in the landscape choir, according to Raphael Bousso.

So Raphael constructs the matrix ETA encoding the rate-per-unit-time of transitions between pairs of vacua. The weights of the different vacua can be determined as the coordinates of the eigenvector V of the matrix ETA associated with the eigenvalue one. We have discussed that this approach is analogous to Google's PageRank algorithm, see the last two paragraphs, that determines the weight of different web pages on the web. In this analogy, vacua correspond to web pages and tunnelling between them corresponds to hypertext links.
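
If you want to see the linear algebra, here is a minimal sketch in Python - the 3×3 matrix and all of its entries are made up purely for illustration, not taken from Raphael's talk:

```python
import numpy as np

# Toy transition matrix ETA for three hypothetical vacua:
# eta[i, j] = probability (per unit time step) of ending up in
# vacuum i when you start in vacuum j. The numbers are invented.
eta = np.array([
    [0.90, 0.05, 0.10],
    [0.05, 0.90, 0.20],
    [0.05, 0.05, 0.70],
])  # each column sums to 1, so eta is column-stochastic

# The weights of the vacua are the components of the eigenvector
# with eigenvalue one - the same fixed point that PageRank computes
# for the web's link matrix.
eigenvalues, eigenvectors = np.linalg.eig(eta)
idx = np.argmin(np.abs(eigenvalues - 1.0))
weights = np.real(eigenvectors[:, idx])
weights /= weights.sum()   # normalize to a probability distribution

print(weights)  # the relative voice of each vacuum in the landscape choir
```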

Counting observers

The anthropic explanations always rely on the existence of observers or intelligent observers and it is very hard and probably impossible to define such a notion or count how many of them you have. Can a pile of silicon, a small chimp, a tiny chip, or Peter Woit be counted as intelligent observers? While the intuitive answers are quite clearly "No", we don't have any quantitative method to define them, count them, and prove that the intuition is correct.

Raphael's proposal is the following: take your favorite cosmology and a causal diamond in it and compute the entropy production in this causal diamond. The more entropy you create, the more observers have lived in that diamond. The idea behind this counting is that every observer needs to produce a certain nonzero amount of entropy for every observation or every step in their reasoning. There will exist less efficient observers - elephants will create way too much entropy for too small intellectual outcomes - but there can also exist more efficient observers.

This rule will automatically suppress the role of the exponentially long periods of time in the far future of nearly empty de Sitter spaces. Only the beginning of your de Sitter cosmology will contribute to the probabilistic measure substantially because entropy is not being produced when the space is empty. This beginning of the expanding cosmology is where most of the intelligent life occurs. That's why the process of tunnelling itself - the identity of the two vacua on both sides and the rate - will play a much more important role than the large spacetime volumes.

OK. Let's accept Raphael's counting. Embed a diamond into a particular expanding cosmology that has already selected a vacuum. You want to pick a diamond that maximizes the entropy production - the entropy crossing the future light-like boundary minus the entropy crossing the past light-like boundary. Such a diamond will be as large as it can get but you will try to draw it close to the Big Bang where the production of entropy is strong. Try to write down this entropy production - the renormalized number of observers that have lived in that Universe - as a function of Lambda.
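
Schematically - this is my paraphrase of the rule, not a formula displayed in the talk - the weight of a diamond D is:

```latex
% Entropy production in a causal diamond D, used as a proxy for the
% number of observers: the entropy flowing out through the future
% light-like boundary minus the entropy flowing in through the past one.
N_{\text{obs}}(D) \;\propto\; \Delta S(D)
  \;=\; S_{\text{future boundary}}(D) \;-\; S_{\text{past boundary}}(D)
```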

What you will find out is that this entropy production will get an extra factor proportional to "1/Lambda" - from the size of the diamond - relative to some more straightforward methods to assign weight to different values of the cosmological constant. For example, Weinberg's original anthropic quasicalculation of the cosmological constant led him to a distribution that, for small enough "Lambda", behaved like
  • dP = C · dLambda
i.e. it was uniform in "Lambda" but this distribution was truncated at some point by Weinberg's upper bound. Nevertheless, when you look at Weinberg's prediction quantitatively, the most likely predicted value of the cosmological constant is about 1,000 times higher than the observed value and the observed value is pretty unlikely in this distribution. A multiplicative discrepancy by a factor of 1,000 is a smaller disaster than a factor of 10^{120} but it is still pretty bad. Of course, people have tried to introduce various kinds of biases to reduce the factor of 1,000, but none of them was very natural.
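
To see how unlikely, here is the one-line arithmetic - an illustration assuming the truncation scale really sits near 1,000 times the observed value:

```latex
% With a flat prior dP = C\,d\Lambda truncated at Weinberg's bound
% \Lambda_{max} \approx 1000\,\Lambda_{obs}, the chance of drawing a
% value at least as small as the observed one is tiny:
P(\Lambda \le \Lambda_{\text{obs}})
  \;=\; \frac{\Lambda_{\text{obs}}}{\Lambda_{\text{max}}}
  \;\approx\; \frac{1}{1000}
```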

With Raphael Bousso's additional rule, the probability measure gets modified by a factor of "1/Lambda" that is proportional to the size of the diamond, so the distribution becomes
  • dP = D · dLambda / Lambda
which, as you can see, is a nearly uniform distribution as a function of "ln(Lambda)" - a uniform distribution on the logarithmic scale. With this modification, smaller values of "Lambda" become more likely than Weinberg thought. When the numbers are inserted, the maximum of Raphael's probability distribution for the cosmological constant happens to be almost exactly at the observed value.
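
The logarithmic uniformity is just a change of variables:

```latex
% dP \propto d\Lambda / \Lambda is the statement that the measure is
% flat in \ln\Lambda:
dP \;=\; D\,\frac{d\Lambda}{\Lambda} \;=\; D\, d\big(\ln \Lambda\big)
```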

His numerical result is much better than Weinberg's result although he had to refine the analysis in a certain way. The precise agreement is a matter of good luck: the width of his distribution is comparable to one order of magnitude so the prediction is not terribly quantitative. But it is still good.

What kind of principle is it that tells you that the entropy production wants to be maximized? You can't really view this thing as a fundamental principle of physics. Raphael makes it clear that he doesn't claim that this principle - a proposed measure that counts observers - is an exact law of physics. For example, he removes the entropy of the black holes created in the causal diamond from his calculation even though he suspects that including it wouldn't spoil his results. That's a strange approach: fundamentally, the entropy stored in black holes is as good as the entropy stored in anything else.

But in Raphael's setup, it is marginally plausible to make such a subtraction. He seems to offer a phenomenological rule to count the number of observers that is as independent of the detailed physics inside the universe - such as carbon dynamics - as possible. The entropy can be defined in all minima of the landscape. And even the subtraction of the horizon entropy can be made in all of them as long as the semiclassical approximation is good. So why not?
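
Written out - again in my schematic notation for the rule just described, not an equation from the talk - the entropy that actually counts is:

```latex
% The black-hole (horizon) entropy produced inside the diamond is
% subtracted before the entropy production is used to count observers:
\Delta S_{\text{counted}} \;=\; \Delta S_{\text{total}}
  \;-\; \Delta S_{\text{black hole horizons}}
```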

My discomfort

Needless to say, I still feel very uncomfortable with this kind of explanation for anything. The step that Raphael has made can be interpreted differently. Weinberg's quasireligious calculation simply gave too high a value of the cosmological constant. So Raphael has picked some additional ingredients that favor smaller values of the cosmological constant. Is the order of magnitude of one particular constant of Nature enough to decide which ingredients are correct and which ingredients are not?

Or should we view Raphael's approach or any similar approach as guesswork that is a much less reliable path to the correct theory than a careful and exact analysis of a lot of patterns, data, and phenomena that impose much stronger and more accurate constraints on the theory than a single number, the vacuum energy density? But do we actually know such stronger constraints that would guide us along a more accurate path of progress?

We report, you decide. ;-)