- Probabilities in the landscape
The goal: distributions
The goal of the talk, as I understood it, was to take the first steps toward determining probability distributions for various quantities using hardcore anthropic reasoning. According to this reasoning, these distributions are the only numbers we should be trying to figure out. Unlike in quantum mechanics, where distributions can be checked precisely by repeating an experiment many times, here we only have one Universe to measure.
The first choice is to fix the parameters of the Standard Model and low-energy physics, and to vary only the cosmological constant "Lambda" and perhaps "Q". That is of course unjustifiable, but let's not stop at this point. For our purposes, the quantity "Q" is defined as "delta rho / rho", the typical relative density fluctuation reflected in the temperature fluctuations of the cosmic microwave background (CMB); it equals roughly "10^-5" in our Universe. Its value depends on the energy scales and other parameters of the inflaton potential, which is "very high-energy physics" that does not affect local life.
The probability distributions are written, in analogy with the Drake equation for the number of telephone contacts with extraterrestrial civilizations, as products of many quantities - the simplest example is
- P (Lambda) = P_{prior} (Lambda) P_{formation} (Lambda)
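To make the factorization concrete, here is a minimal toy sketch in Python: it multiplies an assumed flat prior by an assumed galaxy-formation factor and normalizes the result. Both functional forms (and the cutoff around 100 times the observed Lambda) are invented purely for illustration; they are not Vilenkin's actual ingredients.

```python
import numpy as np

# Toy version of P(Lambda) = P_prior(Lambda) * P_formation(Lambda).
# Both factors below are invented for illustration, not Vilenkin's actual ones.

Lam = np.linspace(0.0, 300.0, 10_001)   # Lambda in units of the observed value
dLam = Lam[1] - Lam[0]

P_prior = np.ones_like(Lam)             # assumption: flat prior near Lambda = 0
P_form = np.exp(-(Lam / 100.0) ** 2)    # assumption: galaxy formation shuts off
                                        # once Lambda exceeds ~100x the observed value

P = P_prior * P_form
P /= P.sum() * dLam                     # normalize to a probability density

cdf = np.cumsum(P) * dLam
print("median Lambda ~", round(float(Lam[np.searchsorted(cdf, 0.5)]), 1), "(observed value = 1)")
```

Even in this toy version, the observed value sits far down in the low tail of the distribution, which is the usual sense in which the anthropic argument gets Lambda right only "to within a couple of orders of magnitude".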
Density of galaxies
The second factor, P_{formation}, is taken to be proportional to the number of observers that a given Universe generates. Vilenkin argued that it can be taken to be proportional to the number of galaxies in the Universe. This particular quantity, the number of galaxies, is pretty much calculable, as many of Vilenkin's simulations and graphs of the volume of various regions in the inflating "pocket Universe" showed (the three regions are, roughly speaking: quantum diffusion near the top of the potential; slow roll in the middle; thermalization near the bottom). But it is much more controversial whether this is the right number to become a factor in the probability distribution.
As you see, Vilenkin considers one galaxy to be a unit of life. P_{formation} should really measure the number of observers that will exist (or be born) in a given Universe - the total amount of intelligent life, so to speak. The precise definition of P_{formation} - the quantity affected by the details of the anthropic (non)rules - belongs to the humanities, not to science, I think. For example, Nima kept asking "what is the moment at which you measure the number of observers", and he suggested that the number of observers integrated over the whole spacetime should be the relevant one. Vilenkin disagreed and only wanted to count the observers at one moment (I am still not sure which one, and I was not the only person confused by the rules for choosing the "right" slice). Nima's proposal seems the more plausible one to me, but neither of them is scientifically justifiable.
Intermezzo: defining and predicting intelligent life
But imagine that you would like to build an argument of this sort that is more than hand-waving and in which some other parameters are varied. You would have to decide who counts as an intelligent observer. For example, a Universe with different parameters could produce small planets with weaker gravity. This would probably affect the typical size of the animals, the size of their brains, and consequently (politically correct people, please forgive me) their expected average intelligence. Should we count the "bugs" living in such a Universe as observers just as good as larger animals like us? This choice can change the distribution by 10 orders of magnitude, or parametrically by huge additional power laws.
Note that in order to count these things realistically, we would really need to know virtually all factors in the actual Drake equation. In other words, physics beyond the Standard Model is now supposed to be less solid than the search for extraterrestrial civilizations (SETI), because we need all the answers from SETI, plus some additional ones, to figure out what the new physics beyond the Standard Model should look like.
Moreover, should we just count the total number of observers that live in the Universe, or should we also take their lifetime into account? The number of words they can say per lifetime? If humans lived 1000 years, like in the Bible, should they increase the probability of a given Universe by an order of magnitude? Were the long-lived heroes of the Old Testament ten times as good people as we are? If the humans in some civilization move 1000 times faster than we do and lead lives 1000 times richer, does that increase the probability of their Universe and the quantity P_{formation} by three orders of magnitude? Shouldn't we count the number of cells rather than the number of people, which would add 9 orders of magnitude for our kind of life? Or should we count villages or nations as the independent units of intelligent life, which would subtract 9 orders of magnitude?
Fortunately, I am not the only one who is convinced that these questions will never be resolved and cannot be resolved scientifically. There are hundreds of similar questions, and we can answer them in many different ways that reflect our prejudices. Different answers lead to different outcomes, and by "properly" adjusting all of our prejudices, we can obtain almost any answer we want. No doubt there will be people who argue that only one choice (theirs) is correct - but they will never be able to show why their choice is better than the others. Moreover, there are many ways to tune the prejudices so as to obtain one desired set of answers. The prejudices are untestable. They are not a subject of scientific verification. There is no natural "one-dimensional" scalar function of the properties of animals that could tell us how much they contribute to the probability distribution.
Input vs. output
Some people may try to intimidate others with the conjectured large number of vacua in string theory. But the number of different versions of the anthropic principle - the number of formulae through which we could determine "the amount of intelligent life in the Universe", times the number of ways we could count the "number/volume of Universes of a certain kind" - is far higher than even the largest existing finite estimates of the number of vacua. In other words, we can always fine-tune our anthropic "principle" so that virtually any kind of Universe comes out as the preferred one. Is that science? Even if there existed a Universe that could not be obtained as the most likely one by adjusting the anthropic (non)laws, we could not really eliminate it, because we would not know whether our catalogue of possible anthropic (non)laws was complete.
Weinberg's success (?) story
If I return to slightly more technical topics, one of them was the distribution of the cosmological constant. You know, Steven Weinberg showed that, roughly speaking, it should be between -1 and +100 times the currently observed value for galaxies to be able to form. If the cosmological constant were too negative, the Universe would approach the Big Crunch too early; if it were too large and positive, it would expand and dilute before the matter could clump into galaxies. This calculation of Weinberg's is a source of pride and inspiration for the people who have switched to the anthropic mode because it was an actual "prediction" of a positive cosmological constant. Weinberg's predicted value differs from the right one by two orders of magnitude - which is 30 times better than the naturally predicted value of "Lambda" after SUSY breaking, which differs from the observed one by 60 orders of magnitude. The price for this factor of 30 in the exponent is that the methods of physics are supposed to be replaced by the methods of the humanities.
Note that Weinberg's argument implies a rough solution to the coincidence problem ("why now") - the question of why we live in an era in which the cosmological constant is comparable to the density of ordinary matter. Weinberg's answer is that the probability distribution for Lambda is naturally peaked near the matter density at the time when galaxies form (which is comparable to the era in which we live), as his calculation shows. This simply follows from the requirement that galaxies can still form (the density is still sufficient) when the cosmological constant starts to dominate - this is roughly where his allowed interval ends.
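If you want the numbers, a back-of-envelope version of the bound fits in a few lines of Python. The inputs (present-day Omega_m and Omega_Lambda, and a galaxy-formation redshift around 4) are rough values I am plugging in for illustration, not numbers taken from Weinberg's paper.

```python
# Back-of-envelope Weinberg bound: Lambda can be at most comparable to the matter
# density at the epoch when galaxies form, rho_m(z) = rho_m0 * (1 + z)^3.
# Omega_m, Omega_Lambda and the formation redshift are rough assumed inputs.

Omega_m, Omega_L = 0.3, 0.7
z_form = 4.0                                   # assumed typical galaxy-formation redshift

bound = (Omega_m / Omega_L) * (1.0 + z_form) ** 3
print(f"Lambda may exceed its observed value by a factor of roughly {bound:.0f}")
# prints roughly 50, i.e. the "two orders of magnitude" mismatch quoted above
```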
Can we make Weinberg's computation more accurate? Depending on your assumptions, you can achieve any distribution you want. The total distribution may be nearly constant near "Lambda=0", but it can also behave as "exp(C/Lambda)", which hugely prefers tiny positive values of Lambda, and the constant "C" matters a great deal for the details. Note that the "constant distribution" has absolutely no invariant meaning because distributions depend on the coordinates that we use to parameterize the parameter space.
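Here is a tiny numerical illustration of that last point, assuming nothing more than a uniform prior on an arbitrary interval: the very same ensemble looks "flat" in Lambda but behaves like 1/x^2 in the variable x = 1/Lambda.

```python
import numpy as np

# A "flat" prior has no invariant meaning: the same ensemble, uniform in Lambda,
# has density proportional to 1/x^2 in the variable x = 1/Lambda.
# The interval and the sample size are arbitrary choices for this illustration.

rng = np.random.default_rng(0)
Lam = rng.uniform(0.01, 1.0, size=1_000_000)   # uniform ("flat") in Lambda on [0.01, 1]
x = 1.0 / Lam                                  # the same points, described by x = 1/Lambda

hist, edges = np.histogram(x, bins=np.geomspace(1.0, 100.0, 21), density=True)
centers = np.sqrt(edges[:-1] * edges[1:])
print(np.round(hist * centers ** 2, 2))        # roughly constant, confirming p(x) ~ 1/x^2
```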
Trying to falsify the anthropic framework
Another thing. Banks, Dine, and Gorbatov argued that if we allow not only "Lambda" to vary but also "Q", then we obtain wrong predictions for "Lambda" and "Q", because Universes with correlated higher values of "Lambda" and higher values of "Q" will be preferred. If "Q" is larger, the matter density is less uniform and the density of galaxies increases; this allows "Lambda" to be larger, because the matter clumps into galaxies before the cosmological constant can dilute it. In my opinion, this is the right approach to these questions - try to falsify them instead of endlessly adapting them.
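To see why the worry is quantitative, assume - as the usual estimates suggest, if I remember them correctly - that the anthropic ceiling on Lambda scales roughly like the third power of Q (the prefactor is ignored in this sketch). Then letting Q float upward by a single order of magnitude relaxes the bound on Lambda by three.

```python
# Assumed scaling for illustration only: anthropic ceiling on Lambda ~ Q^3 (prefactor ignored).
for Q in (1e-5, 1e-4, 1e-3):
    print(f"Q = {Q:.0e}  ->  Lambda ceiling grows by a factor {(Q / 1e-5) ** 3:.0e}")
```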
Vilenkin used a trick that led him, once again, to a factorized probability distribution for "Lambda" and "Q", so that any change of "Q" is irrelevant for the statistical predictions of "Lambda". I forgot what the trick was, and I am not sure whether I should be sad about it. A lot of extra discussion focused on the bounds on "Q". Of course, some "mainstream" bounds tell you that "Q" is not that far from the observed value "10^-5". Concretely, they say, it should be true that
- 10^-6 < Q < 10^-4.
Googol vs. googolplex
Regardless of the value of "Lambda", it's always possible that a Universe appears as a gigantic fluctuation. The probability of such a fluctuation is not the inverse googol (10^{-100}) but rather the inverse googolplex (10^{-10^{100}}). Obviously, as Nima likes to say, in the anthropic framework we need to allow unlikely events whose probabilities are of order the inverse googol, because they are necessary for picking a Universe with an unnaturally tiny Lambda, but we should not allow phenomena whose probabilities are of order the inverse googolplex, because that would also allow us to say that the Universe (including the fossils) was created 6,000 years ago, in agreement with Genesis.
The only good reason why they want to allow googol-like probabilities but not googolplex-like probabilities is that they're not Christians: they only want to support the weak form of the religion (the anthropic principle) but not the strong version (literal belief in the Bible). You may think that I am joking, but I am not. The choice of these bounds of acceptability, once again, influences the conclusions dramatically. Moreover, it is not quite clear how the probabilities should be calculated. We can always imagine that there are a googolplex of different galaxies in a Big Universe, and in one of them, life can arise as a gigantic fluctuation. In fact, I believe that one could arrange various fluctuations whose probability would be much closer to one (much greater) than the inverse googolplex.
Let me mention a very different example showing why it is important for conclusions to be stable with respect to perturbations of the "conventional" parameters. If people in the 1970s had looked at the temperature records since 1850, they would have seen a mild warming trend. If they looked at the temperature records starting in the 1940s, which is what they actually liked to do, they could have deduced that a new ice age was beginning. Both conclusions were unjustified. If the conclusion depends on the choice of the year in which you start to measure, or on the smallest probability you are willing to accept, then your conclusion is unstable and scientifically irrelevant. It is nothing better than guessing.
Summary
These ideas based on infinite possibilities and an infinite Universe can never lead to convincing, i.e. new, quantitative results. The number of bits that we insert as parameters - the answers to the various "subtle" questions about how the anthropic principle should work (plus, in reality, the assumptions about the behavior of the fundamental theory that we don't yet fully know) - is hugely larger than the amount of (fuzzy) information that we can derive. A scientific theory should generate more predictions than the number of assumptions we insert into it.
The only way such considerations can work scientifically is if we understand the full theory at the fundamental scale well enough - including its unique Hartle-Hawking-like framework, which must be free of all assumptions and can only use the standard path integral of the theory, or an equivalent of it, to calculate any probabilities - to compute the initial state of the Universe, and then take this initial state to calculate the probabilities of the subsequent evolution. Another possibility is to simply ignore the cosmological selection considerations altogether and continue trying to identify the correct "vacuum" by matching the properties of particle physics, leaving the early cosmological questions to the very end.
Finally, I am sure that various people who share a similar opinion about anthropic thinking will use this admitted frustration as a weapon against string theory. Unfortunately, I must assure you that the expansion of the anthropic principle is a problem for the whole of theoretical physics, not just string theory - and this talk was not a string theory talk after all. Incidentally, one of the authors of the entertaining article Supersplit Supersymmetry just explained to me that it was making fun primarily of the anthropic approach, not of supersymmetry.