Off-topic - anniversary: On Monday night, the counter in the sidebar is expected to show 600,000. The first person who sends a screenshot with 599,999 or 600,000 can post an article or give this right to someone else. Also, all people with 5,000+ hep-th citations are granted free access to make postings on this blog (submitted by e-mail or via new accounts), up to 1 posting per day, so they shouldn't be shy if they have something to say.
Let me start with a paper by Radford Neal, a professor of statistics, on the math arXiv:
Statistician comments on anthropic reasoning
It should be interesting for everyone who has been thinking about these issues to see what statisticians would tell us. The author resolves various doomsday and other paradoxes of anthropic reasoning by emphasizing that all known facts, not just the existence of intelligent life, should be taken as assumptions, and the corresponding conditional probabilities should be evaluated instead of the unconditional ones.
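If you want the idea in one formula - my schematic paraphrase and notation, not necessarily Neal's - the prescription is to compute the probability of a hypothesis T conditioned on everything you know,

P(T \mid D_{\rm all}) \;=\; \frac{P(D_{\rm all}\mid T)\,P(T)}{\sum_{T'} P(D_{\rm all}\mid T')\,P(T')},

where D_{\rm all} includes all known facts - your memories, the measured constants, and so on - rather than conditioning only on the bare statement that intelligent observers exist.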
That's a perfectly principled approach for a statistician and, to some extent, the only non-religious method to decide which things we are allowed to assume and which we aren't. ;-) But it's the same approach that Stephen Hawking proposed at a conference. David Gross summarized Hawking's idea as the "extreme anthropic principle" because the message to the experimentalists is: "don't measure anything else; every new measurement is a new problem for us because we will have to add its results to our increasingly awkward list of conditions".
Clearly, physics is not just statistics or just botany, just like it is not just philosophy. Physics depends on a subtle balance between the cool and boring statistical data from the experiments and the hot, speculative, potentially far-reaching, and philosophically-sounding hypotheses. When the balance is broken, physics deteriorates either into philosophy or into botany. In the purely statistical (=botanical) approach to questions in physics, the main goal of physics - namely to predict new phenomena without assuming them - seems to be lost. I am ready to sacrifice the explanations of some facts - and justify them anthropically - but certainly not all facts. So I don't know how to reconcile these deep statistical thoughts with the rest of physics.
Let's now switch to hep-th papers from Sunday night.
Stephen Hsu proposes a solution to the information loss problem. When a black hole is formed, a baby universe is born and detached from its parent. The baby carries the information so that the evolution, including the data in the baby universe, remains unitary, but the information is lost in the parent universe. This seems to clearly contradict the situation in AdS/CFT where no baby universes exist after the black hole evaporates (and information is not lost), so my reasoning about this new possibility is academic because I won't believe it in the end anyway. ;-)
But even at the academic level at which we're ready to abandon everything we have learned about string theory, I feel that this solution shares all the disadvantages of the remnant theory and adds some new ones.
Florian Bauer, Tomas Hallgren, and Gerhart Seidl study the deconstruction of two additional discrete dimensions. They deconstruct not gauge theories but rather a theory of gravity - which involves concepts like massive gravity that are likely to be meaningful at the level of effective field theory only. And their two deconstructed dimensions describe a negatively curved manifold - the Poincaré disk if you wish - which leads to some phenomenological possibilities. The authors study the generation of small fermionic masses as a way to get hierarchies from the large discrete volume.
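To recall what deconstruction means in the simplest textbook case - the circular moose with N sites, not their hyperbolic disk - link fields with vev v and gauge coupling g give gauge boson masses

M_n \;=\; 2\,g\,v\,\sin\!\left(\frac{\pi n}{N}\right),\qquad n=0,1,\dots,N-1,

which for small n reproduces the Kaluza-Klein tower n/R of a latticized circle of circumference N/(gv). Take this as a reminder of the general philosophy only; their negatively curved case is of course more involved.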
E. Antonyan, J. A. Harvey, and D. Kutasov study intersecting braneworlds. Recall, you're in a world where all matter lives in the bi-fundamental representations of U(N) groups. In this picture, various Weyl fermions live at different intersections of the type IIB D-branes. If you assume that the Higgs lives at a third intersection, you may use disk instantons to generate the Yukawa couplings, and the bare masses if the Higgs has a vev. But that's not what they look at. Instead, they look at a direct generation of mass terms - a pairing of the fermions at two intersections - that results from chiral symmetry breaking. Bulk fields are important for the chiral symmetry to be broken and the effect only occurs for certain dimensions and relative orientations of the D-branes. Chiral symmetry breaking has also been studied via AdS/CFT duals, but this is a non-holographic stringy realization of chiral symmetry breaking in these scenarios, although some relations could exist.
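Just to remind you how the disk instantons work in such models: schematically, a Yukawa coupling between fields at three intersections is suppressed by the area A of the worldsheet triangle stretched between them,

Y \;\sim\; e^{-A/(2\pi\alpha')},

which is the standard way intersecting braneworlds generate hierarchies of couplings (up to prefactors and conventions I won't vouch for here).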
Shinji Tsujikawa and M. Sami study the influence of the Gauss-Bonnet term (the Euler density which is quadratic in the Riemann tensor), multiplied by a function of some scalar fields, on the transition between the matter-dominated era and the dark-energy-dominated era of cosmology. I have not quite understood what positive and observable features the term can bring. Nevertheless, it may be a good idea to look at it because the Gauss-Bonnet term is the most typical higher-derivative correction generated in string theory, although the assumed dependence on the scalars seems less justified by a fundamental theory to me.
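For concreteness, the Gauss-Bonnet combination is

\mathcal{G} \;=\; R^2 \;-\; 4\,R_{\mu\nu}R^{\mu\nu} \;+\; R_{\mu\nu\rho\sigma}R^{\mu\nu\rho\sigma},

and models of this type add a term like f(\phi)\,\mathcal{G} to the action. In four dimensions the Gauss-Bonnet term is a total derivative unless it is multiplied by a non-constant f(\phi), which is why the scalar-dependent prefactor is essential - and also exactly the ingredient whose stringy justification I find shaky.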
A. P. Balachandran, T. R. Govindarajan, G. Mangano, A. Pinzul, B. A. Qureshi, and S. Vaidya look at the quantum deformation of the Poincaré group. Using some reasoning that could perhaps be more comprehensible to the Bogdanoff brothers than it is to me, they conclude that the Drinfeld twist in the theories they study influences the spins and statistics of particles. The quantum groups are also linked to non-commutative geometry, which I had thought were two very different things, and they probably argue that the UV-IR mixing of non-commutative field theories is removed by the twist, unless I misunderstood them. It's too abstract for me and I tend to believe that no physical theory that follows these strangely deformed rules can exist, so all of this is a game with symmetries of non-existent theories, but I may easily be wrong.
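For orientation, the simplest example of the structure they have in mind is the Moyal plane with

[x^\mu, x^\nu] \;=\; i\,\theta^{\mu\nu},

whose Poincaré symmetry survives as a Hopf algebra twisted by the Drinfeld element \mathcal{F} = \exp\!\left(\tfrac{i}{2}\,\theta^{\mu\nu} P_\mu \otimes P_\nu\right) acting on the coproduct, \Delta \to \mathcal{F}\,\Delta\,\mathcal{F}^{-1}. Signs and factors depend on conventions, so treat this as a sketch of the standard setup rather than a summary of their paper.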
Mithat Unsal and Laurence G. Yaffe investigate some hypotheses that resulted from the attempts to construct the AdS dual of QCD. In the effort to get rid of the unwanted matter of N=4 super-Yang-Mills, our most successful AdS/CFT representative, people have looked at various gauge theories with orientifolds or, equivalently, with matter fields in the tensor representations, claiming that two theories - one of which is similar to N=4 and another of which is similar to QCD - are equivalent in the limit of many colors. The authors disprove this hypothesis when the time direction is compactified on an S^1 - multiplied by the usual S^3. I think that the authors agree with me that the theories may still become equivalent in the decompactification limit of the S^1, so the result, which could be viewed as negative by some AdS/QCD people, is not too negative. ;-)
Kazuyuki Furuuchi reviews the AdS/CFT correspondence in the regime of highly curved spacetime or, equivalently, small 't Hooft coupling. That's the opposite limit from the one where AdS/CFT is studied most of the time. Most of the things that are known about this unusual regime are related to Hagedorn-like phase transitions of the strings made out of string bits or, equivalently, phase transitions of the corresponding black holes such as the Hawking-Page transition.
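For reference, the flat-space worldsheet version of the Hagedorn temperature - where the single-string density of states grows like e^{\beta_H E} and the canonical ensemble breaks down - is, in common conventions,

T_H \;=\; \frac{1}{4\pi\sqrt{\alpha'}} \quad\text{(bosonic string)},\qquad T_H \;=\; \frac{1}{2\sqrt{2}\,\pi\sqrt{\alpha'}} \quad\text{(type II)},

although the quantitative statements in the highly curved AdS regime are of course phrased in terms of the gauge theory on S^3 rather than these flat-space numbers.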
Bindusar Sahoo and Ashoke Sen explicitly and fully evaluate the entropy of dyonic black holes in heterotic string theory by including all four-derivative terms. They confirm all the prejudices from incomplete calculations that have appeared in the literature: you get the right result if you only include the Gauss-Bonnet terms; you also get the right result if you include all the squared curvature terms plus their supersymmetry partners but nothing else; and you also get the right result if you approximate everything by the AdS_3 near-horizon geometry. In the non-supersymmetric cases, only the last simplification is possible, but the authors now give you the full calculation anyway. Previous papers are confirmed by a much more rigorous (and perhaps tedious) calculation. Be ready for alpha' being equal to 16 in their units. ;-)
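The general principle behind such higher-derivative entropy calculations is Wald's formula - schematically

S \;=\; -2\pi \oint_{\rm horizon} d^{D-2}x\,\sqrt{h}\;\frac{\partial \mathcal{L}}{\partial R_{\mu\nu\rho\sigma}}\,\varepsilon_{\mu\nu}\,\varepsilon_{\rho\sigma},

with \varepsilon_{\mu\nu} the binormal to the horizon, which reduces to A/4G for the Einstein-Hilbert term - although Sen and collaborators actually package it into the entropy function formalism for the near-horizon geometry. Take the formula as background, not as a transcription of their computation.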
O. B. Zaslavskii computes some quantities related to the backreaction on black holes caused by things like their Hawking radiation, but in the context of two-dimensional dilaton gravity. Some previous numerical results are confirmed exactly and seem to be less singular than naively expected. One of the philosophical lessons of these papers is, I believe, that a lineland may differ from a 3+1-dimensional world (or higher-dimensional ones) in qualitative ways.
Matthias R Gaberdiel and Ingo Runkel analyze, for the first time, the open string version of a special kind of conformal field theory, the c=-2 logarithmic triplet theory. What are these logarithmic theories about? They're relatives of the solvable rational conformal field theories. However, they differ by the logarithmic (as opposed to power-law) character of many of their correlators, which is caused by "bigger" (not highest-weight) representations that appear in the spectrum. Because they want the open string version, they must include the boundaries, which modify some of the previous calculations involving this most elementary logarithmic CFT. Be ready for things like boundary states similar to those that you know from ordinary flat-space string theory.
Yoichi Chizaki and Shigeaki Yahikozawa perform the full modern covariant quantization of strings propagating in a pp-wave limit of a geometry with NS-NS flux - probably something you would get in "AdS3 x S3 x K3". If you like all the commutators, Fourier expansions, and the BRST operator to be written down explicitly, checked, and re-checked, you will like the paper. The longitudinal X^{-} coordinate - the one whose value is solved for in the light-cone gauge - plays a subtle role in their calculations.
Robert H. Brandenberger, Sugumi Kanno, Jiro Soda, Damien A. Easson, Justin Khoury, Patrick Martineau, Ali Nayeri, and Subodh Patil rewrite some results of a subset of the authors (plus others) about their stringy alternative to inflation from the string frame to the Einstein frame. Recall that their picture uses a rather exotic thermodynamic behavior of string theory near the Hagedorn temperature which, as the authors believe, is able to produce the scale-invariant spectrum much like inflation does. Their older papers used the string frame - the choice of the units of distance and the metric tensor in which the Einstein-Hilbert term has an exp(-2 phi) prefactor. In the Einstein frame, this phi-dependence is removed and some of the phases of their novel cosmological model are re-interpreted and put into a new light.
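If you want the dictionary between the two frames: in D dimensions the Einstein-frame metric is obtained from the string-frame one by

g^E_{\mu\nu} \;=\; e^{-4\phi/(D-2)}\,g^s_{\mu\nu},

which turns e^{-2\phi}\sqrt{-g^s}\,R^s into \sqrt{-g^E}\,R^E plus dilaton kinetic terms; in four dimensions the rescaling is simply e^{-2\phi}. (Conventions with a shifted dilaton \phi - \phi_0 are also common; nothing here depends on that.)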
Let me mention that some cosmologists have doubts whether the density perturbations that the authors of this interesting proposal seem to produce are physical modes or just pure gauge modes (removable by a coordinate redefinition) that are misinterpreted as physical perturbations. More generally, I am still a bit puzzled whether the geometric intuition is allowed or prohibited in these stringy-dominated configurations. More seriously, a new paper by Kaloper et al. will appear tomorrow, arguing that not only do Ali Nayeri et al. predict n=5 instead of n=1, which is unacceptable, but their speculations that this problem could be fixed by strong coupling effects violate the null energy condition. Assuming that Kaloper et al. don't have serious errors in their analysis, inflation is likely to remain a unique solution. I hope that our friends - Ali, Cumrun, Robert, et al. - won't cry as much as Neil Turok! ;-)
Jai Grover, Jan B Gutowski, and Wafic Sabra show you what a modern but classic supergravity paper about fancy topics that remain mostly disconnected from string theory looks like. They focus on five-dimensional gauged supergravities (supergravities with gauge fields: in their case, several U(1) multiplets) with 16 supercharges. The spinorial geometry method, whatever this impressive method exactly is, is used to look for solutions and to prove that no solution can preserve exactly 3/4 of the supercharges.
Bhaskar Dutta and Jason Kumar propose a new scenario in which the hidden sector may be useful for creating realistic baryogenesis. Recall that our Universe contains many more baryons than anti-baryons: a result of an imperfectly balanced annihilation in the past (one billion and one baryons against one billion antibaryons). The overall baryon number B had to be created somewhere if it was zero at the beginning.
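Quantitatively, the "one billion and one vs. one billion" picture corresponds to the observed baryon-to-photon ratio

\eta \;\equiv\; \frac{n_B}{n_\gamma} \;\approx\; 6\times 10^{-10},

a number measured consistently by big-bang nucleosynthesis and by the CMB, and it is this tiny asymmetry that any baryogenesis scenario - hidden-sector or not - has to explain.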
As Sakharov pointed out, one must violate the C and CP symmetries, thermal equilibrium, and the baryon number at some time in the past - otherwise no matter survives in the universe to allow for life. Various mechanisms for how this could have happened exist, much like many astrophysical bounds. These bounds may look unnatural, but the authors claim that they are naturally satisfied by their mixed triangle anomaly - which mixes the baryon number with the hidden sector. The setup is naturally realized in intersecting braneworlds of string theory and it predicts some rather detailed properties of exotic quarks produced at the LHC. That's why this paper is the 8,724th proof that the people who suggest that string theory deals with untestable physics are, politely speaking, morons. It is, on the contrary, very likely that this particular class of models will be falsified rather soon. ;-)
N. Yokomizo, P. Teotonio-Sobrinho, and J. C. A. Barata are interested in three-dimensional theories involving gauge fields and spin degrees of freedom. Their point is that a low-temperature limit makes the dynamics of this system topological. A Hopf algebra is identified, the number of ground states may be counted from topological invariants, and a low-temperature expansion around the topological description is studied. These investigations seem to imply new equivalences between classical spin models.
Hisham Sati looks at the role of E8 gauge theory in the classification of charges in string theory and in relation to gerbes. Many mathematical structures are argued to be interrelated in this setup, including twisted K-theory, whose Dixmier-Douady class is identified with the NS-NS 3-form H-field. I guess that Jarah Evslin likes the paper, too.
Quite obviously and impressively, Sati seems to argue that he can derive why the E8 gauge theory is directly relevant for type IIA string backgrounds, because the dual Coxeter number of E8 magically appears even though a more general group G is used to create the loop groups in type IIA etc. In some sense, this is a new direct proof of the Diaconescu-Moore-Witten framework that describes the integrality properties of the M-theory fluxes. The E8 group is also linked to high-level Wess-Zumino-Witten models.
One of the speculations that arise naturally from this reasoning is that the 10D type IIA theory is a boundary theory for another 11D "bulk" theory with a "hypergravity" in it, via a holographic duality. It all sounds very intriguing. The paper is an example of why I believe that papers with fancy mathematical structures are much more likely to have a deep point than papers and diatribes against mathematics in physics.