This hypergentle approach arguably works inside the expert community, but it may fail once it is applied outside its boundaries. There is often a huge gap between the experts on one side and the public - even the broader scientific public - on the other side. Such a gap allows a wrong idea - an idea that every single expert realizes is wrong - to massively influence the media, the readers of popular books, and other audiences.
This is also the case with cosmological natural selection. That's the hypothesis that a black hole singularity is the seed of a new Universe whose properties are close to those of the parent Universe but not identical: this mechanism emulates the framework of Darwin's theory, and the fittest Universes - those able to produce the maximum number of black holes - must therefore dominate.
That's an idea that a smart elementary school student may find attractive and logical, but her friend in college or a renowned Stanford professor ;-) who also knows some physics beyond the high-school level may notice very serious problems with it, for example:
- the production of everything, including black holes, is dominated by the parameters of inflation; eternal inflation is an exponentially better mechanism for reproduction, which is why the criterion wouldn't constrain particle physics even if it were true
- it is impossible to define what counts as one black hole when black holes merge
- it is impossible to define what a black hole means because there is no qualitative difference between black holes and elementary particles: black holes are just heavily excited elementary particles (or strings)
- it is unlikely that the daughter Universe, even if it could be created in this fashion, would resemble the parent Universe; more likely, it would have very different values of all constants, especially the cosmological constant, because the adjacent vacua in the landscape have similar values of fluxes but very different values of low-energy parameters
- the creation of a new Universe inside a black hole seems to contradict the conservation of information which apparently holds in Nature
Any single one of these serious problems would be enough for a serious physicist to abandon the idea. We have at least five of them here.
You could also argue that modified parameters of the Standard Model - if you adjust them in the right direction - can increase the rate of black hole production in the astrophysical context.
As you know, various crackpot and flawed ideas have become - at least according to the media - serious competitors of cutting-edge physics, including string theory, so physicists and cosmologists have to take an interest in these flawed ideas. ;-) OK, so what do they tell you?
Today, Alex Vilenkin chose a different, characteristically cosmological method to rule the hypothesis out: he argues that in the long run, we approach de Sitter space and we have an infinite spacetime volume anyway. Because it is infinite, even very unlikely processes must be taken into account as long as they can occur in empty space. Black hole nucleation is one of them and its rate, calculated by Ginsparg and Perry from an "S2 x S2" gravitational instanton, is proportional to
- exp(-pi/Lambda)
in Planck units with G=1 (otherwise write "Lambda G" instead of "Lambda"). It is enough to increase the cosmological constant Lambda: the rate of black hole nucleation will increase, which will also increase, in the long run, the (infinite) number of produced black holes.
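Just to make the monotonicity explicit, here is the quoted exponent evaluated for a few illustrative values of Lambda (a trivial sketch in Planck units; the values of Lambda below are arbitrary and the unknown prefactor is ignored):

```python
# Sketch: exponent of the Ginsparg-Perry nucleation rate, log(rate) ~ -pi/Lambda,
# in Planck units with G = 1.  A larger Lambda makes the exponent less negative,
# i.e. it makes the nucleation rate larger.
import math

for Lam in (1e-3, 1e-2, 1e-1):          # illustrative values, not the observed one
    print(f"Lambda = {Lam:.0e}   log(rate) ~ {-math.pi / Lam:,.1f}")
```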
Well, although I agree with Vilenkin's conclusion, there could be an extra subtlety in his particular argument: the Poincaré recurrence time. Recall that the maximum meaningful time in de Sitter space is of order
- exp(S)
where S is the de Sitter entropy
- 24 pi^2 / Lambda.
So the volume is not really infinite. The total time of the Universe is cut off at something like
- T = exp (24 pi^2 / Lambda)
- V_4 = exp (96 pi^2 / Lambda).
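For concreteness, here are the same cutoffs written out for one illustrative value of Lambda (a sketch only; just the logarithms, since the numbers themselves would overflow, and with the coefficients exactly as quoted above):

```python
# Sketch: the quoted de Sitter entropy and the resulting cutoffs on the total time
# and four-volume, in Planck units, for an arbitrary illustrative Lambda.
import math

Lam = 1e-2                        # illustrative value only
S = 24 * math.pi**2 / Lam         # de Sitter entropy as quoted above
print("S       =", S)
print("log T   =", S)             # T ~ exp(S)
print("log V_4 =", 4 * S)         # V_4 ~ T^4 = exp(96 pi^2 / Lambda)
```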
Then the black hole production would be just another argument that the cosmological constant should be as small as possible. In field theory, such a prediction is ruled out because you can adjust the vacuum energy in both directions. In the full theory, string theory, it is always possible to imagine that our Universe is a vacuum with the minimum allowed positive value of the cosmological constant.
Nevertheless, Smolin's argument would differ from other arguments that the cosmological constant should be small (the wavefunction of the Universe; the anthropic principle) by its obvious disagreement with other laws of physics. As you can see, I tend to think that Susskind's arguments are probably more solid than Vilenkin's, but it could be just a cultural artifact. ;-)
The "8.pi" factors in the units should be re-checked. It is plausible that there is a "64.pi^2" error in the entropy's exponent. Then, surprisingly, it wouldn't mean that Vilenkin's calculation is right either: it's because if the suppressing, exponentially small factor for the black hole production wins, then the number of black holes produced in the full recurrence four-volume will be much smaller than one, and the nucleation can't be used as a dominant black hole production process. You should look back in the astrophysical processes. Again, you will find many ways how to increase the production i.e. by reducing the hierarchy between the weak scale and the Planck scale (or by changing almost any other Standard Model parameter).
Update: Prof. Vilenkin in the slow comments would like the true interpretation of the recurrence time to be clarified. So would I. Does the space disappear after the recurrence time? I don't think so. It starts to recur. By waiting for times of order "exp(S)", the Universe starts from scratch, or at least it approaches the initial state arbitrarily closely. Treating moments separated by much more than "exp(S)" as independent moments is a sort of double counting, I think.
Classically, the volume of the phase space is equal to the dimension "exp(S)" of the Hilbert space (times a power of "2.pi.hbar"). Divide the phase space literally into small cells that correspond to quantum states. The evolution is a trajectory in the phase space. It takes about "exp(S)" units of time to visit all accessible cells of the phase space before you have to return to the original state, plus or minus the error that you tolerated in the definition of the quantum state.
This recurrence time for a given state doesn't mean, I think, that the evolution operator over this time is close to the unit operator, because other states will evolve into something different and their precise recurrence times could differ, too. But it probably means that for every state you start with, you find a time comparable to the recurrence time after which the evolution returns you to the state you started with, within the tolerated error.
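Just as a toy illustration of this kind of quasi-periodic return - a random few-level quantum system, nothing to do with the actual de Sitter calculation - the first time a state comes back close to itself grows very quickly with the Hilbert space dimension, which plays the role of "exp(S)" here:

```python
# Toy illustration only: first approximate return of a state evolved by a fixed
# random unitary, as a function of the Hilbert space dimension d.
import numpy as np

rng = np.random.default_rng(0)

def first_return_time(d, threshold=0.9, max_steps=300_000):
    """First n with |<psi0|U^n|psi0>|^2 > threshold, or None if it never happens."""
    h = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    energies, vectors = np.linalg.eigh((h + h.conj().T) / 2)   # random Hamiltonian
    psi0 = np.zeros(d, dtype=complex)
    psi0[0] = 1.0
    weights = np.abs(vectors.conj().T @ psi0) ** 2             # |<E_k|psi0>|^2
    for n in range(1, max_steps):
        amplitude = np.sum(weights * np.exp(-1j * energies * n))   # <psi0|U^n|psi0>
        if np.abs(amplitude) ** 2 > threshold:
            return n
    return None                                                # no return within max_steps

for d in (2, 4, 8, 16):
    print(f"d = {d:2d}   first approximate return: {first_return_time(d)}")
```

Larger dimensions typically produce no return at all within the scanned window, which is the finite-dimensional analogue of the statement that the recurrence time is of order "exp(S)".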
If the evolution after a time of order "exp(S)" were the identity operator, then I would find it obvious that we should count the production over the recurrence time only once - the time is, in a sense, periodic. In the real de Sitter space, I am not so sure. But still, any prescription that depends on "counting of objects" should say how to deal with infinities such as the volume of the de Sitter spacetime. The original CNS proposal arguably solves no such problems with its defining rules and I am not sure what the right clarification should look like.