Tuesday, October 31, 2006

Landscape: floating correlations

Dienes and Lennek show a general difficulty with the statistically predictive approach to the landscape: the floating correlations. This problem has been known to me for some time and it was one of the reasons why I have not considered the statistically predictive landscape approach to be a promising one. Below, I also try to propose the opposite approach, although some details will be missing. ;-)

If you want to statistically predict the probability "P" that e.g. a vacuum contains an SU(3) factor in its gauge group, you will find out that "P" is not really a constant but a slowly increasing function of the number of models "N" that you tried. Dienes and Lennek tell you much more.

Of course, if the number of all models in your ensemble were finite, namely "N0", then the probability "P", assuming the unrealistic democratic distribution, would be simply
  • P = Nyes / N0
where "Nyes" is the number of models among "N" that have an SU(3) factor. The correct "P" would be obtained as the limit for very large values of "N" - the number of the models that you tried. This limit "N" goes to "N0" defines the extreme anthropic approach to the set of vacua. A reasonable measure of this kind only exists for a finite "N0".

In the numerical attempts to compute similar probabilities, "P" doesn't seem to converge to a constant for large values of "N". This simply means that if you consider a larger set of possible models, e.g. heterotic models, such a set must secretly carry an approximate label "I" that increases with "N", so that probabilities such as "P" above can depend on "I".
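The floating behavior itself is easy to imitate in a toy model. In the sketch below I simply assume - and this is purely an assumption of the sketch, not a result - that models enumerated later carry a larger hidden label "I" and that the chance of an SU(3) factor slowly grows with "I". The running estimate then keeps creeping upward instead of converging to a plateau.

  import math
  import random

  # Toy model of a "floating correlation": assume (purely for illustration) that
  # models enumerated later carry a larger hidden label I and that the chance of
  # containing an SU(3) factor slowly grows with I. The running estimate
  # P(N) = Nyes(N)/N then drifts with N instead of converging.
  random.seed(1)

  def p_su3(I):
      """Invented probability of an SU(3) factor as a function of the hidden label I."""
      return 0.2 + 0.1 * math.log10(1 + I)

  n_yes = 0
  for N in range(1, 1_000_001):
      I = N / 1000.0                 # hypothetical: the label grows as more models are enumerated
      n_yes += random.random() < p_su3(I)
      if N in (1_000, 10_000, 100_000, 1_000_000):
          print(f"N = {N:7d}: estimated P = {n_yes / N:.4f}")  # slowly increasing, no plateau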

I used the letter "I" because "I" stands for the average irrelevance. My approach to the right vacuum selection is exactly the opposite one from the extreme anthropic approach: we should be looking at sets of vacua with the minimal possible value of "I". These are the most relevant vacua for phenomenology. If you look at a set of vacua with a small value of "I", it will typically be a small set and its vacua are likely to be relevant. Also, these vacua will look simple and/or special in some way.

The quantity "I" is positively semidefinite for all subsets of vacua and it is equal to zero for the set that contains the vacuum describing our Universe only. We should be looking at sets of vacua with the minimal possible value of "I", and it won't be hard to find the right vacuum then.

Unfortunately, there is not enough room in this blog article for me to write the full definition of "I". :-) Nevertheless, it is clear that some large sets of generic complicated vacua will have a large value of "I". Also, I tend to think that the sets of Calabi-Yau vacua with high values of the Hodge numbers will have a high value of "I", too. People shouldn't waste too much time with these vacua.

Also, a decreasing function of "I" such as "exp(-I)" is likely to appear as an additional prefactor in the Hartle-Hawking wavefunction. It is conceivable that the value of "I" may be determined by or correlated with the mass spectrum at low energies. For example, it is conceivable that low values of "I" are associated with theories whose mass spectrum has large hierarchies and/or a uniform distribution of mass scales on the logarithmic scale. A pretty good feature.
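Just to spell out what such a prefactor would do, here is a short Python sketch that weights a few candidate vacua by "exp(-I)" and normalizes. The "I" values are made up for illustration.

  import math

  # Sketch of the suggested exp(-I) prefactor: weight hypothetical candidate vacua
  # by exp(-I) and normalize into relative probabilities. The I values are invented.
  candidates = {"vacuum_A": 0.5, "vacuum_B": 3.0, "vacuum_C": 12.0}  # hypothetical I values

  weights = {name: math.exp(-I) for name, I in candidates.items()}
  total = sum(weights.values())
  for name, w in weights.items():
      print(f"{name}: relative weight {w / total:.4f}")
  # Vacua with a small I dominate; large-I vacua are exponentially suppressed.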

In principle, we could just measure physics up to the Planck scale and decode the topology, fluxes, and other labels of the physically relevant vacuum from experiment, scale by scale. I never understood the point of talking about the NP-completeness of the problem of finding the vacuum from the known cosmological constant alone. There are many other, arguably more important numbers about the real world that we can measure today and others that people will measure in the future. With a sufficient hierarchy, one can determine aspects of the stringy compactification feature by feature, or scale by scale. One new scale can tell you something about a throat, another scale can tell you about the shape of the compact Calabi-Yau, and so forth.

More generally, I think that the focus should be on finding the right vacuum - one with a small "I" - instead of doing statistics of the wrong vacua. Also, there can hypothetically exist new processes in cosmology that suppress vacua with a large "I", and there can be additional, yet unknown, decay channels that will make the vacua with a large "I" unstable, decaying towards the more elementary, more fundamental, less "excited" and less "contrived" vacua with a small "I" that are conjectured to be more likely to be the right description of this Universe.

The homework problem:
  1. Calculate the value of "I" for some heterotic vacua and KKLT vacua
  2. Find the set with one element "{U}" whose "I" is equal to zero
  3. Use the model found in 2. to calculate the masses of quarks and leptons. ;-)