Tuesday, August 29, 2006

Gr-qc papers on Tuesday

Luca Baiotti and Luciano Rezzolla propose a paradigm that could conceivably become a breakthrough in numerical relativity. When you're numerically integrating the equations of general relativity, you must tell the computer what coordinates should be used. Even with well-defined initial conditions, the solution is only determined up to coordinate transformations. Which coordinates should you choose? Moreover, physical singularities may form - what should you do with them? Normally, the regions around likely singularities are amputated - "excision", in the jargon - and a gauge choice is made for the rest. The present authors use different coordinates and keep the singularities, arguing that this setup is good for calculating the gravitational-wave emission from collapsing stars and perhaps even the quasinormal ringing modes. The usual calculations would break down much earlier, they argue.
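For the readers who want to know what such singularity-avoiding coordinates look like in practice: the standard choice in this business (I am quoting the generic "1+log" slicing and "Gamma-driver" shift conditions of the BSSN-type codes; the paper's exact conditions may differ in the details) is

\[
\partial_t \alpha = \beta^i \partial_i \alpha - 2\alpha K, \qquad
\partial_t \beta^i = \tfrac{3}{4} B^i, \qquad
\partial_t B^i = \partial_t \tilde\Gamma^i - \eta B^i,
\]

where α is the lapse, β^i the shift, K the trace of the extrinsic curvature, and Γ̃^i the conformal connection functions. The lapse collapses towards zero wherever a singularity tries to form, so the evolution effectively freezes there and nothing has to be amputated.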

Louis J. Rubbo introduces Bayesian reasoning - and, more importantly, more general assumptions - into the analysis of gravitational-wave detection. Instead of assuming a particular waveform, as most people nowadays do, he uses Bayesian inference to find a more refined Ansatz for the profile. The experimental discussion focuses on LISA. Rest assured that I am not irritated by the Bayesian terminology, because it looks like a good strategy to me. Bayesian reasoning is often a good framework for choosing a strategy to analyze data and look for fits (and win probabilistic games); it is not a good framework for presenting and defending the final answers. Scientific results are only solid if they're independent of the strategies by which they were found.
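If you want to see the logic of the Bayesian step in the simplest possible toy model - this is my illustration with a made-up waveform, prior, and noise model, not the author's code - here is a sketch in Python:

    import numpy as np

    # Toy Bayesian inference for a single waveform parameter (hypothetical
    # illustration, not Rubbo's method): the data are a sinusoid of
    # unknown frequency buried in Gaussian noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 500)
    f_true, sigma = 1.3, 0.5                 # "unknown" frequency, noise level
    data = np.sin(2 * np.pi * f_true * t) + rng.normal(0.0, sigma, t.size)

    # Flat prior over a grid of candidate frequencies.
    freqs = np.linspace(0.5, 2.0, 300)

    def log_likelihood(f):
        """Gaussian log-likelihood of the data given a candidate waveform."""
        residual = data - np.sin(2 * np.pi * f * t)
        return -0.5 * np.sum(residual ** 2) / sigma ** 2

    log_l = np.array([log_likelihood(f) for f in freqs])
    posterior = np.exp(log_l - log_l.max())  # unnormalized (prior is flat)
    posterior /= np.trapz(posterior, freqs)  # normalize over the grid

    print("posterior mean of the frequency:", np.trapz(freqs * posterior, freqs))

The point is that the data update a whole posterior over candidate waveforms rather than a single assumed template - that is the sense in which the Ansatz gets refined.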

Evgeny Sorkin uses numerical methods to study wiggly black strings. Recall that according to a perturbative calculation of Gregory and Laflamme, sufficiently long and/or thin uniform black-string solutions become unstable - because of a negative mode - and decay either to black holes or to non-uniform strings. Horowitz and Maeda have argued that the black hole can't be the final state. Sorkin probably thinks that their paper is wrong because he doesn't cite it at all. Finally, Sorkin deduces some scaling laws for the "nearly pinched" black strings from the numerical data. An interesting point is that the behavior of the black strings changes when the spacetime dimension is around eleven - superficially, this has nothing to do with the calculations of critical dimensions of superstring/M-theory, because it is a purely classical result of general relativity. It would be shocking if someone found an explanation of this coincidence.
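By the way, the heuristic behind the Gregory-Laflamme instability is a simple entropy comparison. Schematically, in D spacetime dimensions and units with G = 1, a localized black hole and a black string wrapped on a circle of circumference L satisfy

\[
S_{\rm BH} \sim M^{\frac{D-2}{D-3}}, \qquad
S_{\rm string} \sim L \left(\frac{M}{L}\right)^{\frac{D-3}{D-4}} \sim M^{\frac{D-3}{D-4}}\, L^{-\frac{1}{D-4}},
\]

so at fixed mass the string entropy decreases with L and a sufficiently long string is entropically disfavored relative to the black hole, in agreement with the perturbative negative mode.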

Daniele Oriti and Tamer Tlas try to refine some questions about causality in the spin foam models. It's a nice effort, but I think that all these papers are manifestly wrong. They remove the acausal configurations from the path integral by hand - in order to get rid of its acausal, non-smooth, crumpled behavior - which makes the resulting theory non-unitary and unphysical. In simpler language, it is no longer true that the evolution operator from A to B composed with the evolution operator from B to C gives you the evolution operator from A to C, because the restrictions on the A-C interval are stronger than the union of the restrictions on the two partial intervals. More generally, it is critical for Feynman's path integral to sum over all configurations of the allowed degrees of freedom, not just some politically correct configurations, to get meaningful results. I won't read any of these papers, at least until someone addresses these rather serious issues.
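The composition law I am referring to is nothing else than the sewing property of the path integral: writing K(C,A) for the amplitude, completeness of the intermediate configurations at B guarantees

\[
K(C,A) = \int dB\; K(C,B)\, K(B,A),
\]

and this identity holds precisely because the sum over histories from A to C includes every configuration, however crumpled, passing through every intermediate B. Once you delete the "acausal" histories from the A-C sum, the left-hand side receives fewer contributions than the right-hand side, the equality fails, and unitarity goes with it.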

Pablo Laguna studies bounces in loop quantum cosmology. Loop quantum cosmology is obtained by applying certain simplifying rules to the isotropic cosmological solutions. These rules are analogous to the rules that cripple quantum gravity down to loop quantum gravity. But this doesn't mean that loop quantum cosmology follows from loop quantum gravity. In loop quantum cosmology, people keep on celebrating that they have removed the initial singularity. I think that all these statements are completely nonsensical because the only thing they have done is to include an arbitrarily chosen, unphysical ultraviolet cutoff. There is nothing interesting or unusual about their specific cutoff and nothing physical about the Planckian results of such a treatment. This also makes the results of the calculations of the bounce - which are similar to "shallow water" calculations, as Laguna says - unphysical. Indeed, these are the shallow waters of quantum gravity. Loop quantum cosmology is even bigger nonsense than loop quantum gravity.
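For completeness, the bounce in this framework is usually encoded in the effective equation (I quote the form commonly cited in the loop quantum cosmology literature; consult Laguna's paper for his exact equations)

\[
H^2 = \frac{8\pi G}{3}\,\rho \left(1 - \frac{\rho}{\rho_c}\right),
\]

where the Hubble rate H vanishes once the density ρ reaches a critical Planckian value ρ_c and the contraction turns into expansion. Note that ρ_c is exactly the arbitrarily chosen cutoff I complained about: the whole bounce hangs on this one parameter.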

Gregory J. Galloway has continued to investigate the recent impressive result - obtained with Richard Schoen - that black hole horizons in any dimension must have topologies that admit positive scalar curvature, which implies, in 3+1 dimensions, the familiar result that the horizons must have spherical topology. In the present paper, he sharpens some of the statements and rules out, for example, toroidal horizons.
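The 3+1-dimensional special case follows from the Gauss-Bonnet theorem: for a closed orientable two-dimensional horizon of genus g,

\[
\int_{\mathcal H} R\, dA = 4\pi\chi = 8\pi(1-g),
\]

so a metric of positive scalar curvature forces χ > 0, i.e. g = 0 and the sphere. The torus, with g = 1 and vanishing Euler characteristic, is the borderline case that the new paper eliminates.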

Lorenzo Iorio fights against Ciufolini and Pavlis because of their radical proposals to improve (?) the satellite tests of the Lense-Thirring effect. Recall that the effect, essentially synonymous with "frame-dragging", is a gravitomagnetic, "Machian-like" prediction of general relativity in which the rotation of the Earth drags the orbits of satellites such as LAGEOS I and LAGEOS II (as well as Gravity Probe B); the ocean and solid Earth tides are among the main systematic errors that plague these tests. Iorio argues that all of the proposals of his two colleagues to improve the setup are incorrect.
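To quantify what is being fought over: the secular Lense-Thirring precession of the node of a satellite orbit is

\[
\dot\Omega_{\rm LT} = \frac{2GJ}{c^2 a^3 (1-e^2)^{3/2}},
\]

where J is the Earth's angular momentum, a the semimajor axis, and e the eccentricity - roughly 30 milliarcseconds per year for the LAGEOS orbits, which is why the mismodeled tidal and multipole perturbations matter so much.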

Finally, Peter K.F. Kuhfittig argues that de Sitter 3-branes embedded in a 4+1-dimensional bulk with negative vacuum energy cause the bulk to shrink. If the bulk has a positive vacuum energy, it is either doing nothing or everything - where "everything" means blindly following the inflation on the brane. The bulk is assumed to be more general than the anti de Sitter space of the Randall-Sundrum models, and I am a bit skeptical about the physical realism of such a more general setup.
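To see why the sign of the bulk vacuum energy matters, recall the standard brane-world relation for the effective cosmological constant on the brane (I quote the Randall-Sundrum-type formula, which is at most a special case of Kuhfittig's more general setup):

\[
\Lambda_4 = \frac{\kappa_5^2}{2}\left(\Lambda_5 + \frac{\kappa_5^2 \sigma^2}{6}\right),
\]

where σ is the brane tension and κ₅² = 8πG₅. A de Sitter brane, Λ₄ > 0, requires the tension term to overcome a negative bulk Λ₅, and it is the mismatch between the two terms that makes the bulk evolve.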