Thursday, August 31, 2006

Alain Connes: a theory of everything

I want to point out the paper by Alain Connes:

that appeared on hep-th tonight. So far, it looks fascinating. I plan to study it in detail, and let me admit that at this moment I remain unable to tell you any details about why it should be wrong and/or entirely vacuous. Believe me that the previous sentence is written not only because Alain Connes is an amazing thinker and a genius. It is also partly written because most of page 10 and one third of page 11 are dedicated to a "formula" that looks much like the Standard Model Lagrangian. No octopi and no unworldliness are involved. ;-)

The basic paradigm is that one assumes something like extra dimensions, but their structure is not described by a metric tensor but rather by a Dirac operator. The possibilities for its structure are more general than in the case of the metric tensor with its corresponding box operator that would describe a Riemannian geometry. The Dirac operator is capable of encoding a certain kind of non-commutative geometry - a generalized manifold with a finite-dimensional space of possible functions on it. We will discuss whether the notion of a "geometry" is justified or whether it is just a fancy word.
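To give the flavor of the formalism - this is standard Connes lore, nothing specific to the new paper - a "geometry" is encoded in a spectral triple (A, H, D): an algebra "A" of "functions" acting on a Hilbert space "H", together with a Dirac operator "D". For an ordinary manifold, the geodesic distance between two points can be recovered from these data alone:

  • d(p,q) = sup { |f(p) - f(q)| : f in A, || [D,f] || <= 1 }

which is why one can afford to throw away the metric tensor and keep the Dirac operator only.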



The hidden discrete or noncommutative Connes manifold "F" has the KO-dimension equal to 6, just like in perturbative string theory, making the total KO-dimension of spacetime equal to 10. This dimension is only defined modulo 8, so you can also imagine that the dimension of spacetime equals 26 as implied by the left-movers of the heterotic string. Recall that in string theory, the chiral bosons must be added in groups of 8 for modular invariance. The dimension 10 mod 8 is probably also necessary from a spacetime perspective, to get the right violation of the C, P, and CP symmetries. I will mention a few more details about it later.




In previous papers co-authored by Alain Connes, it was argued that the weak SU(2) gauge group follows from a non-commutative solution of certain constraints for the Dirac operator. The colors had to be added by hand and the model suffered from a fermion doubling problem. Alain Connes now claims that these problems have been resolved. His proposed theory is supposed to include

  • the Standard Model and gravity
  • right-handed neutrinos with the seesaw mechanism
  • the same gauge coupling unification as in SU(5) GUT theories (i.e., with the SU(5) normalization of the hypercharge, g3^2 = g2^2 = (5/3) g1^2 and sin^2(theta_W) = 3/8 at the unification scale)
  • possibly some other relations between the parameters.

Believe me that I have not yet seen a paper with a theory of everything that I couldn't deconstruct and reliably debunk in 15 minutes - which may be partly because I did not read Connes' previous physics papers too carefully. This is the first such paper in my life, but I hope to be able to return with a more meaningful analysis of what's nontrivial about this proposal later. ;-)

Some things were however clear after the first minute. Connes can't explain the number of generations that is put in. Also, he only discusses the classical Lagrangian and his treatment has nothing to say about the UV problems of gravity and the Standard Model itself.

After an hour or so

It seems clear that the main task for a physicist trying to fully understand the paper is to be able to separate things that are just a translation of the usual laws of physics to a particular mathematician's language from things that are non-trivial and different from any kind of translation.

In equation (2), Connes defines an algebra whose parts seem to be in one-to-one correspondence with the factors of the Standard Model gauge group, up to having two copies of the quaternions, which is probably needed to match the fact that the fundamental representation of SU(2) is pseudoreal. Some involutions are added. To summarize, my feeling is that Connes postulates some data that are equivalent to the choice of the gauge group, so he doesn't derive the gauge group.

The most plausible candidate for a non-trivial result is the derivation of the 90-dimensional representation "H_F" that encodes the quarks and leptons. The number 90 is mentioned at the bottom of page 2. It makes it clear that his spectrum of fermions, or at least the part called "H_F", does NOT contain right-handed neutrinos: 90 = 2 x 3 x 15 is a multiple of 10+5, using the dimensionalities of the SU(5) representations (presumably 15 Weyl fermions per generation, times three generations, times two for the antiparticles), but it is not a multiple of 16 which would be needed to construct the full representation of SO(10) including the right-handed neutrinos. So I am already a bit suspicious how the see-saw mechanism is going to work without adding the right-handed neutrinos by hand (which is clearly not Connes' discovery or prediction).

The rest of page 4 is, in my opinion, dedicated to introducing the features that are needed for the right reality properties of the various representations. Three generations are put in by hand. As I indicated above, I am actually rather unimpressed by the statement that he can formally derive that the KO-dimension of his compact quasi-manifold F equals 6 modulo 8.

Imagine that you start with something like heterotic string theory but in D dimensions where D is not necessarily ten. D must obviously be even to get a chiral theory from any non-singular compactification (G2 manifolds for M-theory must be singular in order to be realistic) - and his construction is a counterpart of a non-singular although arguably highly non-standard compactification. Moreover, the spinors of SO(D-4) must have complex representations in order to be tensor-multiplied with the complex Weyl spinors of SO(3,1). The groups SO(8k+6) are the only even-dimensional Euclidean orthogonal group that have complex spinor representations, inequivalent to their complex conjugates, and that simultaneously lead to real spinor representations in the full spacetime whose dimension is higher than 4, namely the spacetime with the Lorentz group SO(8k+9,1). If you changed the dimension of "F" by four, you would get pseudoreal spinor representations in the full spacetime, which would lead to a doubling of fermionic degrees of freedom in 4D.
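For the record, the relevant mod 8 facts about the Weyl spinors - standard material that you can find e.g. in the spinor appendices of Polchinski's book - are:

  • d = 0 mod 8: the two Weyl representations of SO(d) are inequivalent and each of them is real
  • d = 2 or 6 mod 8: the two Weyl representations are complex and they are conjugate to each other
  • d = 4 mod 8: the two Weyl representations are inequivalent and each of them is pseudoreal

In the Lorentzian signature, Majorana-Weyl spinors exist exactly when the total dimension is 2 mod 8, which is the counting used above: 4 + (8k+6) = 8k+10 = 2 mod 8 works, while shifting the internal dimension by four gives the pseudoreal case and the doubling.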

I think that no string theorist would thus be really surprised that in this kind of generalized construction, the dimension of the compact manifold must be 6 mod 8 (which is the periodicity of the qualitative properties of the spinors) because Connes just analyzes elementary properties of spinors in various dimensions. The correct choice, of course, follows from string theory. The string theorists have never emphasized these things but with extra non-singular dimensions, 10 mod 8 is really the only possible correct dimensionality that admits realistic physics. Those people who say that any number would be equally good as 10 clearly misunderstand not only the purely stringy reasons why 10 is unique, but also the spacetime (spinor) arguments why 10 mod 8 is necessary.

So I am sufficiently confident that page 4 only contains facts about the duality and conjugation properties of different representations that we have known before. Various pieces of discrete information are rearranged a bit, but I feel that the set of assumptions about his structures is equivalent to what we normally assume. Because of the Standard-Model-like algebra assumed in (2), it is not shocking that the Standard Model gauge group is derived on page 5.

At this moment, we already have the right gauge group because an equivalent piece of information was assumed. On page 6, the Dirac operators start to be discussed. We know very well physically what Dirac operators we can have - and his definition of the operator seems conventional: we face the choice between the usual Dirac operators with the covariant derivatives corresponding to different representations for the elementary fields.

Connes can't derive three generations, as we mentioned, so the question is whether he can say something non-trivial about the representations of the gauge group in which one generation appears.

I am reasonably confident that on page 6 in theorem 2.7, the representations for the fermions are already assumed, not derived from anything. It is only shown that the usual Dirac operators for the fermions satisfy some rules that more or less agree with the physics rules for the Dirac operator anyway. Note that theorem 2.7 is an existence theorem. Of course, the classical Lagrangian of the Standard Model exists and the Dirac operators satisfy the elementary rules.

It seems almost certain at this moment that more complicated representations - and maybe not just anomaly-free representations - of the Standard Model gauge group could be assumed at this point, and a corresponding existence theorem 2.7 for the Dirac operators could be proved. This fact makes me reasonably certain that there is nothing important we didn't know up to page 6. Note that the right-handed neutrinos are added by hand, via "M_R", with symmetric mass matrices, and an introduction to the see-saw mechanism is offered at the top of page 7.

The rest of page 7 seems to confirm some trivial facts about gamma matrices in 4 dimensions and the reality conditions of the spinors in different dimensions, as discussed above. Various involutions resulting from the reality and other discrete conditions are listed. The fermion doubling problem is resolved, by using the correct dimension mod 8 that normally follows from string theory. That's probably just a correction of a simple error from the older papers.

Deriving the Higgs, and the whole SM, from a dozen letters

When you start to think that everything is trivial, things change and explode at the top of page 8. It is argued that all bosons of the Standard Model, including the Higgs, are derived from some unimodularity of certain fluctuations. ;-) Moreover, the Standard Model action is written on half a line - see the first displayed formula on page 8.
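If I decode the formula correctly, it is the spectral action of Chamseddine and Connes - schematically something like

  • S = Tr f(D_A / Lambda) + (1/2) <J psi, D_A psi>

where "f" is a cutoff function, "Lambda" is the cutoff scale, "D_A" is the fluctuated Dirac operator, and "J" is the real structure - and the heat kernel expansion of the trace in powers of "1/Lambda" is what is supposed to generate the Einstein-Hilbert term plus the bosonic terms of the Standard Model.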

This compact formula is supposed to give you the whole Standard Model, as derived later, with the usual gauge coupling unification - although some of the compactness has a similar origin to the compactness of the "unworldliness theory of everything". ;-) I was trying to figure out whether the Higgs mass is determined because "alpha_h" seems to be determined on page 11. Later I thought that the Higgs mass was adjustable because of an extra parameter "M". UPDATE: However, Prof. Connes has recently pointed out to me that "M" stands for the mass of the W-boson.

The conclusions say that the Higgs mass at the unification scale could be fixed and that the running, under the big desert hypothesis, determines its value at low energies, but the precise numbers don't seem to be available.

ANOTHER UPDATE: In equation 39 and below it, a derived relation between the Yukawa couplings is written down which effectively predicts the top quark mass to be 198 GeV in the zeroth approximation. (The top mass was already argued to be predicted by the previous papers.) Connes argues that the large discrepancy may be fixed by the tau neutrino Yukawa coupling, which I find very unlikely.

The calculations and their discussion continue up to page 12.

To summarize:

  • the parameters of the Standard Model are geometrized, assuming that the word "geometry" can also mean "non-commutative geometry" of Connes' type with a finite-dimensional space of functions on it - a geometry that is determined by the Dirac operator
  • it happens that the particular "geometry" F - which we can start to call the Connes manifold - predicts gauge coupling unification
  • as far as I see right now, the construction has enough arbitrary assumptions to set (not derive) the gauge group, the representation for the fermions, and all parameters except for two of the coupling constants that are determined by the gauge coupling unification
  • it is still not clear to me whether the origin of the gauge coupling unification is based on some hidden respect to the grand unified group or whether Connes' argument for the coupling unification is entirely independent
  • it is not yet clear to me whether the construction actually implies a natural argument why there should be one Higgs doublet and no triplets or other representations, or whether the Higgs representations are effectively inserted via some of the sufficiently convoluted pieces of Connes' formalism
  • as far as I understand, the unification of gravity only means that gravity is used for 4 dimensions, while the other forces are associated with the Connes manifold and interpreted as the rest of the same gravity, but there is no actual unification because the internal Connes manifold F follows different rules than the four-dimensional Minkowski space; note that the unification in string theory is much less vacuous because the forces morally arise from geometry of the extra dimensions that follow the same laws as the large dimensions - they're the same fields on the worldsheet, to use a perturbative example

I would say that there seem to be at least two general features that are predicted rather naturally from the Connes framework:

  • KO-dimension of "F" is 6 mod 8; I have already discussed that this seems to be trivially required for any non-singular compactification of anything that shares qualitative features with string theory
  • gauge coupling unification; recall that gauge coupling unification holds in many vacua of string theory, even those that break the grand unified symmetry at the string scale (including e.g. some intersecting braneworlds); the Connes manifold could be a more general example suggesting that the prediction is rather general

Connes doesn't predict SUSY but I don't quite see what is better about his fermion-only model as opposed to a model in which he would put the full supermultiplets. After all, he needs SUSY so that the high-precision gauge coupling unification whose validity he predicts is not falsified. A SUSY Connes model would have to predict two Higgs doublets.

To summarize: Connes seems to describe some common features of a class of compactifications. You could imagine that all the Connes-like semi-realistic compactifications of string theory must formally reduce, at energies below the Calabi-Yau compactification scale (using the heterotic language), to the discrete Connes manifold. This Connes manifold, in some sense, only knows about the Dirac zero modes on the Calabi-Yau manifold (or its generalization). My main problem is, of course, that I have not quite understood in what sense the Connes manifold is a geometry rather than the same old physics of the covariant derivatives inside Dirac operators pretending to be "a new kind of geometry".

The deepest wisdom about the real world is, in some sense, generalized geometry. Many of us believe this statement. However, one shouldn't use the notion of "generalized geometry" for random concepts without a good reason - for example, chess in SWF is not a version of non-commutative geometry, I guess, and the closest thing in real life that could deserve to be called generalized geometry is Toyota Matrix M-theory. So far I am uncertain whether there is a really good reason to call the Connes construction "geometry" or "Connes manifold", or whether it is just a misleading label that suggests a non-existent unification.

Predictions

From a purely predictive vantage point, Connes seems to predict that the gauge coupling unification is generic for all string vacua that satisfy certain conditions whose character I can't describe independently of Connes' formalism. I would like to know whether the gauge coupling unification in his picture is a non-trivial result or a consequence of a secret SU(5) or other broken GUT structure.

He also seems to derive the existence of the Higgs doublet simultaneously with the derivation of the gauge bosons. I would love to know whether this prediction is really natural - whether the Higgs bosons are naturally "additional components" of the gauge fields - or whether the Higgs doublet is effectively assumed. In other words, I want to know whether Connes could write a similar paper with a different algebra and a different geometry that would predict the Standard Model with different representations - or even any representations - of the Higgs bosons.

New eyes and unification

Finally, Connes' paper may turn out to be nothing more than a translation of the physics of the Standard Model to an unusual language. Even if it is so, it could still be true that the notion of the Connes manifold - which seems to be a truncation of the full stringy manifold below the Calabi-Yau compactification scale, i.e. the truncation down to the Dirac zero modes - could be useful. But if there is no way to connect the Connes manifolds with usual manifolds, then I would be skeptical about the usefulness of the notion of the Connes manifold, too.

You know, the unification of geometry (gravity) and other forces (and matter) in string theory is real. These things are all made out of the same "stuff". The Connes unification seems fake to me. Until you show that the local or other detailed physical properties of the Connes manifold "F" are analogous or physically equivalent to the local properties of the four-dimensional spacetime, your calling both of them "geometry" is just a trick to suggest unification of something that is not really unified.

Imagine that aliens study the terrestrial buildings. They can already draw plans of each floor but they can't capture three-dimensional images. Someone proposes that there is a unification of the horizontal dimensions and the vertical dimension - but he still describes the horizontal directions by continuous numbers while the floors are labeled by integers. Is it unification? I don't think so.

You only unify the dimensions of the skyscraper once you start to build on the three-dimensional continuous geometry - or, speculatively, something that satisfies the same equations of motion. I deliberately used the term "equations of motion" instead of more vague "mathematical constraints" because it is not enough to find some common general mathematical characteristics of two structures in order to claim that you have unified them. In physics, you must show that these two structures both follow from a set of equations that completely determines the dynamics. A unification built on calling two different things by the same name is fake, and it can't imply any predictions.

Half a day later

Although it's been great fun, I feel almost certain now that there is no new physics in the paper. For example, the Higgs fields are effectively added by hand already in equation (2) that contains two copies of the quaternions. One combination of these two quaternionic spaces gives you the core of the SU(2) weak gauge group, after a projection. Another combination gives you a Higgs doublet. Using the words of Nima, the Higgs doublet arises, according to the logic of deconstruction, from the bi-fundamental matter whose equivalent is inserted at the beginning.

One could construct similar algebras with other projections that would give different Higgses - different numbers or different representations. Also, the gauge coupling unification seems to be a consequence of an underlying SU(5)-like structure that is partly explicitly broken by Connes' formalism.

As I mentioned above, the representations for the quarks and leptons and the number of generations were added by hand, too.

Everything is just a derivation of the Standard Model from an unfamiliar set of algebras employed as the starting point - algebras that, however, contain the very same information that we normally use to specify the Standard Model: the gauge group, representations, and the data about the couplings. Someone might think that string theory is doing the same thing - namely that it derives the familiar phenomenological data from some other starting point that has a comparable number of parameters - but it is not. The starting point in string theory is truly unified and follows other fundamental laws (that manifest themselves as conformal symmetry in the perturbative approximation). The unification in string theory and its uniqueness is real; Connes' unification is fake and his construction is extremely far from being unique.

It's been fun but it's good to be finished with it. Also, I am happy that Alain Connes now agrees with us that a unifying theory with fermions in a non-singular bulk must have 10 dimensions (at least mod 8). And because I still believe that he could really move theoretical physics forward, I hope that it will take less time to agree about hundreds of other things that are probably pre-requisites for all serious attempts to unify the fundamental theories in 2006. The dimensionality mod 8 is a nice result but it would probably occupy 3 lines in Polchinski's book if it were mentioned explicitly, and there are many other important things in physics...

Harvard Crimson's fun

It seems that funny students took over the Harvard Crimson, at least until September 11th when the dead serious editors return to their offices, hopefully without any aircraft.



John Harvard, or more precisely the unknown young man from the statue of three lies, enjoys the mild summer, too.

Meanwhile, Robert Rubin decided which board is more valuable for him. He left Ford and kept Harvard. Also, Jack the Ripper is coming to Harvard. Piotr Brzezinski believes that the British conservatives who want to get rid of Cameron are kamikaze fighters. An obscure college from New Jersey has recently topped Harvard in the rankings while Harvard remains the toughest school to get into. There are other interesting things in the Crimson, too.

Wednesday, August 30, 2006

Mohammed AlQuraishi: On a theory of biology

The previous article about theoretical biology was an interview with Franziska Michor. Below, you find a completely unedited article by Mohammed. The links to at least three reactions to the article below are listed here.

Thanks to Lubos for the opportunity to write on this blog. Naturally, many potential topics presented themselves, but I ultimately chose to write about the subject that is dearest to my heart, and that is the emerging hard science of biology. I said emerging because while biology as a scientific discipline has certainly existed for many decades, and in some sense centuries, it has only recently started acquiring status as a genuinely hard science. The next few paragraphs will concern themselves with the challenges and opportunities that face us today as biologists, as we embark on formulating a quantitative theoretical foundation for biology.

I will enumerate three main points, all of which represent both a challenge and an opportunity. The first will deal with a scientific challenge of a theoretical orientation, namely the lack of a theory for biology. The second with the sociological organization of biologists and biology departments at the leading research institutions. And the third will be part science, part sociology, having to do with the focus of current experimental methods and programs on biomedical research as opposed to basic biological research. The challenges are listed according to my own judgment of their importance.



First is the need to develop a unified theory and formalism for biology. No true theory of biology exists today. Instead, the field is characterized by a collection of disparate models, each concerned with a particular subset of the general biological space. Furthermore, each subfield employs its own set of formalisms, ones that are in some cases fundamentally incompatible with the formalisms used in other subfields.

One such example that has received considerable attention in the past several years is the field of genetic regulatory networks, or GRNs. This class of problems deals with the regulatory relationships that exist between genes. The products of certain genes, known as transcription factors, are able to bind to the DNA molecules encoding other genes and regulate their production. Transcription factors that ‘turn off’ genes are known as inhibitors, and ones that ‘turn on’ genes are known as activators. Such relationships are typically encoded as graphs in the computer science sense, with nodes representing genes, and edges representing different types of relationships, for example inhibition or activation. Initial work in this field, such as that pioneered by Harley McAdams and Adam Arkin at Stanford and UC Berkeley, respectively, employed electrical circuits, modeled by ordinary differential equations, as the underlying formalism. More recent work, such as that by Eric Davidson at Caltech and Hamid Bolouri at the Institute for Systems Biology in Washington, makes extensive use of the sequence properties of the genes—specifically the presence and arrangement of DNA motifs, short sequences of nucleotides such as “AGGTA”, that can be used to predict whether a given transcription factor would bind or not, and the resulting logic of a given binding combination, for example if X and Y bind then activate, if X or Z binds then repress, etc.

Yet, GRN models typically lack any grounding in the kinetics of the underlying enzymes, or the structural basis that determines how transcription factors bind to the DNA molecule. Thus, they are fundamentally ad hoc, and so cannot be used to predict the behavior of previously uncharacterized genes and transcription factors. Many other examples exist—biochemical networks make extensive use of kinetic data, also typically formulated as ODEs, but encode no spatial information. Protein folding models do make extensive use of spatial information, and are typically simulated using quantum mechanics-based molecular dynamics simulations or classical rigid-body dynamics. Yet the timescales at which such simulations are made make it entirely impractical, for the foreseeable future, to combine structural data with the other types of models mentioned. Other structural models, such as those used for the morphology of cells or even organisms, completely forgo continuous models and instead employ discrete formalisms, such as cellular automata. Integration in this instance becomes difficult due not only to the vastly different physical scales at which the phenomena are occurring, but also to differences in the very formalism that is representing said phenomena.

The opportunity is clear: Develop a theoretical framework that can coherently combine all the aforementioned disparate models, along with an underlying mathematical formalism that can quantitatively and elegantly capture it. To address this problem one cannot rely on better ‘motif-finding’ algorithms for DNA sequences, or more accurate clustering methods for microarrays. 
The challenge is not building software to process biological data—it is to construct a theory of what biology is.
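To make the ODE formalism mentioned above concrete, here is a minimal sketch - an illustration only, not taken from any of the papers mentioned - of a hypothetical two-gene ‘toggle switch’ in which each gene's product represses the other through a Hill function; all parameter values are invented for the example:

    # A hypothetical two-gene toggle switch: x1, x2 are protein levels,
    # and each gene's promoter is repressed by the other gene's product.
    # alpha = maximal production rate, n = Hill coefficient,
    # delta = degradation rate (all values illustrative).
    import numpy as np
    from scipy.integrate import odeint

    def toggle(y, t, alpha=4.0, n=2.0, delta=1.0):
        x1, x2 = y
        dx1 = alpha / (1.0 + x2**n) - delta * x1  # x2 represses gene 1
        dx2 = alpha / (1.0 + x1**n) - delta * x2  # x1 represses gene 2
        return [dx1, dx2]

    t = np.linspace(0.0, 20.0, 201)
    trajectory = odeint(toggle, [2.0, 0.1], t)
    print(trajectory[-1])  # settles into one of two stable states

Even this toy model illustrates the point: the graph (who represses whom) and the kinetics (the Hill functions and the rates) are logically independent pieces of data, and nothing in the graph formalism itself fixes the kinetics.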

The second challenge that I will address is a sociological one. It can be summarized as follows: Experimentalists think theoreticians don’t do 'real work' because they’re sitting behind computers all day, and theoreticians think experimentalists are charming individuals whose work will ultimately be automated by robotics. Neither picture is true. What makes experimental work difficult is not performing the experiment, but the careful and thoughtful experimental design that is required to obtain the sought results. Nor is the difficulty of theoretical work merely writing software or solving equations, but the insight necessary to abstract from separate experiments a coherent quantitative picture of what is truly taking place. To address this problem, various institutions have attempted to bring biologists, computer scientists, physicists, and others to the same table and address the problem from their individual perspectives. At Stanford, where I am currently pursuing my graduate research, there is BioX. The University of California campuses at Berkeley, San Francisco, and Santa Cruz have QB3. And up in Washington there's the Institute for Systems Biology. On the east coast, Harvard recently opened a Systems Biology department, and MIT is now offering a Computational Systems Biology major. Alas, I think this approach alone will not solve the problem. Collaboration between different departments is a step in the right direction, but ultimately a new generation of scientists will need to be trained as quantitative biologists. Scientists who care about biology as a science, and who are expertly trained in mathematics and computer science, much like their physicist counterparts. Stated differently, we need one brain with two things in it, as opposed to two brains with one thing in each.

Moving away from sociological factors and halfway back to science, the third challenge I would like to outline is an experimental one. Current experimental work, and biology as a whole, suffer from a serious problem—the perception that their existence is only justified as a tool for medicine. Biology is a science, with its own set of fundamental and basic objectives. To require biological research to forgo the foundations and focus on applications is bad for both. I will provide one example to illustrate this point. Significant resources are allocated to the model organisms closest to humans, namely mice and to some extent the fruit fly Drosophila melanogaster. Less is given to simpler eukaryotes like yeast. And even less to prokaryotes like Escherichia coli. But in view of the first challenge above, it is imperative that biologists work on as simple an organism as possible. Initial work in quantum mechanics solved the hydrogen atom, not ununoctium. Stated differently, it is as if someone tasked with learning a programming language for the first time decides to examine the source code of Microsoft Windows for tips--unlikely. Instead, beginning computer scientists are usually tasked to write a simple ‘Hello World’ program. I hate to break it to everyone, but we biologists are still at the ‘Hello World’ stage. One organism in particular, Mycoplasma genitalium, has the potential to serve as the hydrogen atom of biology. It is the simplest known organism, yet its status as a model organism for experimental research is very poor. If we are in the business of constructing a quantitative biology, simple organisms deserve to receive a lot more attention, even if they lack immediate biomedical applications.

I will conclude with a parting thought, and an accompanying question. Biologists are rumored to suffer from 'physics envy', an affliction known to arise from the purportedly inferior mathematical skills of the working biologist with respect to her physicist colleague. It appears however that for at least parts of biology, it is not quite clear that a physical model is the right one. Certainly for constructs such as a 'cell', one can think of objects existing in physical space, enzymes, nucleic acids, large protein complexes, etc, and interacting according to the laws of physics. But other biological constructs don't fit the bill as neatly. Consider the 'genome'. What physical object does it map to? One may argue that the chromosome of any one organism is in fact such a physical object. But evolutionary biologists speak of turnover rates for specific nucleotide positions, and 'ultraconserved' regions existing between distant extant organisms as far apart as humans and yeast! Clearly the term 'conserved region' means little in the context of one specific chromosome, as an individual physical object. Perhaps one can escape this dilemma by arguing for the need to examine things on a longer timescale, not simply to consider one single chromosome as a physical object, but the physical life of many such chromosomes over many years and millennia. Or perhaps not. My question is:

Are we trying to construct a physical theory, albeit one on restricted temporal and spatial scales, like chemistry? Or are we coming up with an entirely new abstraction, much like computer science, where the underlying physics is ultimately inaccessible?

And that, as in the words of one Lubos Motl, is the memo. [LM: Bill O'Reilly will certainly forgive us.]

Mohammed AlQuraishi, visitor #600,000

The GOP has found a new word

An old word, but new to them, of course. It's only the latest Rovian strategy to define the terms according to his own warped ideology:

President* Bush in recent days has recast the global war on terror into a "war against Islamic fascism." Fascism, in fact, seems to be the new buzz word for Republicans in an election season dominated by an unpopular war in Iraq.

Bush used the term earlier this month in talking about the arrest of suspected terrorists in Britain, and spoke of "Islamic fascists" in a later speech in Green Bay, Wisconsin. Spokesman Tony Snow has used variations on the phrase at White House press briefings. Sen. Rick Santorum, R-Pa., in a tough re-election fight, drew parallels on Monday between World War II and the current war against "Islamic fascism," saying they both require fighting a common foe in multiple countries. It's a phrase Santorum has been using for months.

And Defense Secretary Donald H. Rumsfeld on Tuesday took it a step further in a speech to an American Legion convention in Salt Lake City, accusing critics of the administration's Iraq and anti-terrorism policies of trying to appease "a new type of fascism."


These men are obviously confused about the word. Merriam-Webster defines fascism as "a political philosophy, movement, or regime (as that of the Fascisti) that exalts nation and often race above the individual and that stands for a centralized autocratic government headed by a dictatorial leader, severe economic and social regimentation, and forcible suppression of opposition".

A recent definition is that by Robert O. Paxton:

"Fascism may be defined as a form of political behavior marked by obsessive preoccupation with community decline, humiliation, or victim-hood and by compensatory cults of unity, energy, and purity, in which a mass-based party of committed nationalist militants, working in uneasy but effective collaboration with traditional elites, abandons democratic liberties and pursues with redemptive violence and without ethical or legal restraints goals of internal cleansing and external expansion."


So essentially you can't be a fascist unless you have a country and an economy. You can, however, be a religious fundamentalist/extremist without a nation, but you still must be able to finance your revolution.

But the Republican party, with its own hoary alliance with Christian fundamentalists -- and yes, extremists -- together with the billions in US corporate largesse, fits the definition perfectly. And believe it: Bush knows a fascist when he hosts one.

Simply put (for you conservatives still having trouble understanding), the Islamic fundamentalists attack us because they want us out of the Middle East. It's not just about oil or money or power; they reject our commercialism because they fear Islam losing its influence to the siren song of Western fashion, electronic gadgets, cinema, music, and vices. And because they want revenge for what (they perceive) we have done to the region and the religion.

But back to the Fascists. I know; let's ask someone who actually was a Fascist for his definition. Hey, Benito Mussolini:


"Fascism should more appropriately be called corporatism because it is a merger of state and corporate power."


What do you have to say about this, Sinclair Lewis?


"When fascism comes to America, it will be wrapped in the flag and carrying the cross."

Here's more from the AP:


Dennis Ross, a Mideast adviser to both the first Bush and Clinton administrations and now the director of the Washington Institute for Near East Policy, said he would have chosen different words.

"The 'war on terror' has always been a misnomer, because terrorism is an instrument, it's not an ideology. So I would always have preferred it to be called the 'war with radical Islam,' not with Islam but with 'radical Islam,'" Ross said.

Why even mention the religion? "Because that's who they are," Ross said. "Fascism had a certain definition. Whether they meet this or not, one thing is clear: They're radical. They represent a completely radical and intolerant interpretation of Islam."

While "fascism" once referred to the rigid nationalistic one-party dictatorship first instituted in Italy, it has "been used very loosely in all kinds of ways for a long time," said Wayne Fields, a specialist in presidential rhetoric at Washington University in St. Louis.

"Typically, the Bush administration finds its vocabulary someplace in the middle ground of popular culture. It seems to me that they're trying to find something that resonates, without any effort to really define what they mean," Fields said.


Naah, that can't be true, Mr. Fields. (Can it?) So how long has this been going on, Mr. President?


"The real truth of the matter is, as you and I know, that a financial element in the large centers has owned the government of the U.S. since the days of Andrew Jackson."


That was FDR, not GWB. Another US president:


"I hope we shall... crush in its birth the aristocracy of our moneyed corporations, which dare already to challenge our government to a trial of strength and bid defiance to the laws of our country."

Thomas Jefferson said that. In 1816. Once more from the original source:


Stephen J. Wayne, a professor of government at Georgetown University, suggested White House strategists "probably had a focus group and they found the word 'fascist.'

"Most people are against fascists of whatever form. By definition, fascists are bad. If you're going to demonize, you might as well use the toughest words you can," Wayne said.



Remember who the real fascists are as you hear this phrase repeated over and over again in the coming days and weeks.

Tuesday, August 29, 2006

Thomas Thiemann and the Master Constraint

The last hep-th paper tonight was written by Thomas Thiemann and it is called

The title is meant to be a parody of the modest title of the most cited 2005 paper on loop quantum gravity called

that was written by Nicolai, Peeters, and Zamaklar. We have discussed the paper on this blog. Thiemann repeats his bizarre opinion that Nicolai et al. are "outsiders" while he is an "insider" about five times in his paper. Needless to say, I think that his opinion is unrealistic because Nicolai et al. know what they're talking about, unlike Thiemann himself.

Let me try to explain. We will start with some details.

Spin foam fad

In the middle of page 21, Thiemann criticizes Nicolai et al. because they didn't spend enough time with the spin foam models. That's quite a cute criticism because Thiemann dedicates approximately one page, namely page 38, to these models. The section about them contains one displayed equation, namely an ill-defined path integral with the delta-functions (4.50). For comparison: Nicolai and Peeters dedicate 1/3 of their newer paper to the spin foam models.

Harmonic oscillators

One of Thiemann's many amusing constructions appears on page 45. Thiemann confirms that he, apparently with the rest of the loop quantum gravity community and perhaps also the algebraic quantum field theory community, disagrees with the rest of the physics community about how to quantize the harmonic oscillator. Helling and Policastro have shown that if the loop quantum gravity methods are applied to the harmonic oscillator, we don't obtain the usual spectrum "(n+1/2) hbar.omega" that has been derived in a dozen formalisms, tested experimentally, and that has become one of the main textbook results of quantum mechanics. Robert Helling adds some comments and links about this work in the slow comments.




Thiemann says that Helling and Policastro's construction is a technically correct loop quantum gravity calculation but their physical conclusion is "completely wrong". In order to show what this bizarre self-contradictory statement means, he offers a brand new quantization of the harmonic oscillator :-) with the strange sines of "epsilon.x" and sines of "epsilon.p" we've seen in Helling & Policastro but also with roughly three new ad hoc unjustifiable steps that go beyond Helling and Policastro.

Thiemann's result? What you have learned at school about the quantum harmonic oscillator is essentially correct but only up to a level "E_0" and measurement error "delta", while the ground state energy is zero instead of "hbar.omega/2". No kidding. ;-)
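If you want to see the flavor of such polymer-like substitutions numerically, here is a toy sketch - my own illustration, emphatically not the actual LQG construction, which lives on a non-separable Hilbert space - that replaces "p^2" by "sin^2(mu.p)/mu^2" at a finite "mu" and diagonalizes the result in the momentum representation; the units are chosen so that hbar = m = omega = 1 and the textbook ground state energy is 0.5:

    # H = x^2/2 + V(p) in the momentum representation: x = i d/dp, so
    # x^2/2 becomes -(1/2) d^2/dp^2, while V(p) = sin^2(mu p)/(2 mu^2)
    # is the "polymerized" kinetic term (V = p^2/2 for mu = 0).
    import numpy as np

    def ground_energy(mu, L=12.0, n=2001):
        p = np.linspace(-L, L, n)
        h = p[1] - p[0]
        V = p**2 / 2.0 if mu == 0.0 else np.sin(mu * p)**2 / (2.0 * mu**2)
        H = np.diag(1.0 / h**2 + V)  # central-difference second derivative
        H += np.diag(-0.5 / h**2 * np.ones(n - 1), +1)
        H += np.diag(-0.5 / h**2 * np.ones(n - 1), -1)
        return np.linalg.eigvalsh(H)[0]

    print(ground_energy(0.0))  # ~0.500, the textbook value
    print(ground_energy(0.5))  # visibly below 0.5 at finite mu

The numbers themselves are not the point; the point is that the low-lying spectrum depends on the regulating scale "mu", which is exactly the kind of dependence that refuses to go away in the loop quantum gravity treatment.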

Thiemann's landscape confusion

On page 4, he "rationally" addresses the problems of loop quantum gravity by pointing out that the "infinite number of vacua, recently found by Douglas and Denef, show that string theory is not mathematically unique". First of all, this statement is completely off-topic, which shows how difficult it is for Thiemann to rationally focus on a well-defined question. Second, the number of classical solutions of a theory has nothing whatsoever to do with the mathematical uniqueness of the theory. Third, paradoxically, it was exactly Douglas who recently argued, in his paper with Acharya, that the number of the realistic vacua is finite.

The main point: how to cure all anomalies

There are hundreds of other absurdities about loop quantum gravity and its relations to other ideas in physics mentioned in the paper. Some of them have been discussed on this blog.

But an even more incredible part of the paper is what Thiemann considers to be an answer to one of the main criticisms by Nicolai et al., namely the fact that the constraint algebra of loop quantum gravity isn't closed off-shell, which makes it impossible to impose the constraints at the quantum level. On page 35, eighth line from the bottom or so, Thiemann agrees that at the quantum level - as far as the corrections of higher order in hbar go - this is indeed true because there are anomalies in the algebra.

But he argues that it doesn't matter. Let me spell it out again: gauge anomalies don't matter because of his Master Constraint program. Wow. So a reader returns back to page 9 to see how the mysterious M(aster) Constraint program is supposed to cure anomalies if the anomalies are really there. ;-)

Feynman's joke = Thiemann's work

You will have to use the following trick due to Feynman. Feynman once said - and wrote in his Lectures on Physics - that the theory of everything can be written in the following form:

  • M = 0.

Originally, Feynman used either "U" (for "Universality" or "Unworldliness" as Frank Wilczek cleverly calls it) or "A" instead of "M" but we will use "M" to make the letter closer to M-theory as well as to the notation of Thiemann's Master Constraint program. ;-) The only detail that Feynman needed in order to clarify his theory of everything was to explain how to define "M". He defined it like this:

  • M = (F - ma)^2 + integral (div D - rho)^2 + ...

You see, it is a quadratic, positive definite form involving all equations of motion we need. As you can see, "M=0" really means "F=ma", "div D = rho", and so forth. How smart. ;-)

Thiemann, apparently convinced that he is being original, is using the same "M" to impose not all the equations of motion but just the constraints. The constraints are normally a subset of those equations of motion that don't involve any time derivatives; however, in loop quantum gravity, the word stands for all equations of motion - in fact, especially those deduced from the Hamiltonian that generate the time evolution - and Thiemann's biggest discovery is thus exactly equivalent to Feynman's joke. Instead of the constraints

  • C1=0, C2=0, C3=0 ... ,

he simply requires that

  • M = 0

where

  • M = sum_{IJ} C_I K_{IJ} C_J

is constructed from a positive definite matrix "K_{IJ}" on the space of constraints that are visualized as functions on the classical phase space. Using "M" instead of "C_I", he thinks that he can perhaps solve all the problems of loop quantum gravity. What a great idea.

Note that the more general form of "M" can always be rewritten as a sum of squares by diagonalizing the kernel - something that the pigs will never learn but the freshmen should; see the explicit formula below. The content of the diagonal "M" and the general "M" is identical.
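Explicitly, writing the kernel as K = O^T.Lambda.O with an orthogonal O and Lambda = diag(lambda_1, lambda_2, ...), lambda_a > 0, the classical statement is

  • M = sum_{IJ} C_I K_{IJ} C_J = sum_a lambda_a (sum_I O_{aI} C_I)^2 >= 0

and "M = 0" forces every "C_I" to vanish - the whole (classically trivial) content of the trick.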

The difference between Feynman and Thiemann is that Feynman offered his universality function as a joke and he has also explained very well why it can never solve or simplify anything: the very point of his joke was to teach students to avoid vacuous concepts and dumb, unnecessary superconstructions. Thiemann, on the other hand, is dead serious when he says that it is an important step in physics that solves many of his problems. He believes that his idea makes all anomalies harmless.

He wants "M" to annihilate the physical states. The only problem that can happen, he believes, is that "0" can be absent from the spectrum of "M". In that case, he just postulates that the physical states must be eigenstates of "M" with the minimum eigenvalue, see page 11 (point 3). He won't tell you why it's not the second minimum eigenvalue - that can actually be zero - but he thinks it's a good idea. I mentioned the second minimum eigenvalue because it is the eigenvalue of "m^2" that actually gives us massless states in bosonic string theory that admit "p=0" modes. If Thiemann had ever tried to compute any one-loop effect, he would have known that the properly regulated ordering constants are often negative, in sharp contradiction with Thiemann's assumption in the middle of page 10 about the positive definiteness of the unshifted operator "M".

In order to see how incredibly childish Thiemann's trick is, it is useful to look at a particular theory with an anomaly in the algebra of constraints that is analogous to the anomaly pointed out by Nicolai et al. It doesn't really matter whether you take the Virasoro anomaly or a gauge anomaly in gauge theory or anything else. All of these anomalies are less severe than the gravitational ones - so Thiemann's trick should work even better in those familiar contexts - but the basic structure is universal.

Take the Virasoro algebra with a central extension. Instead of postulating that the generators "L_I" annihilate the physical states, you follow Thiemann's recommendation and construct something like

  • M = sum L_{-N} L_{N}

where the sum goes over integers. For free fields describing strings in the flat space, such an "M" is quartic in the oscillators. You can easily see that if you rewrite "M" in the continuum worldsheet variables, it will be an integral whose integrand has severe UV problems because the expression contains things like the integrated square of the delta-function. Thiemann is aware of this problem, as you can see on page 9, so he proposes a "better" kernel which, in my variables, has the form

  • M = sum L_{M} K_{MN} L_{N}

where the matrix "K_{MN}" is chosen in such a way that the UV problems are regulated away. Imagine that you deal with an anomalous Virasoro algebra in string theory. You define "M" by the formula above and require that the physical states are either annihilated by "M", or, if the vanishing eigenvalue is impossible, that they are the eigenstates corresponding to the lowest eigenvalue.
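To see concretely why the original diagonal kernel is hopeless, recall the standard commutator

  • [L_M, L_N] = (M-N) L_{M+N} + (c/12) M (M^2-1) delta_{M+N,0}

so that when you try to normal-order "M", each term with "N > 0" leaves a c-number behind:

  • sum_{N>0} L_N L_{-N} = sum_{N>0} [ L_{-N} L_N + 2N L_0 + (c/12)(N^3-N) ]

and the sum of "(c/12)(N^3-N)" over all positive "N" diverges hopelessly. Whatever kernel "K_{MN}" you insert to suppress the large "N" terms therefore becomes a part of the definition of the theory.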

Can you find the physical spectrum? For a non-anomalous algebra, or an algebra that is related to a non-anomalous one, you might actually be able to reproduce the usual physical spectrum of the old covariant quantization if you take "M" with the original, diagonal kernel regulated properly. It's because there exists a natural non-anomalous extension (with the b,c ghosts) of the whole system and the minimum eigenvalue of "M", if you do it right, will impose the original Virasoro constraints or at least one half of them, in the Gupta-Bleuler fashion.

However, it is not hard to convince yourself that if such an extension - an underlying theory with the exact symmetry - didn't exist, you could never obtain any meaningful result. One of the problems would be that the precise structure of your physical Hilbert space would depend on the kinematical Hilbert space you started with. More importantly and more quantitatively, it would depend on the matrix "K_{IJ}".

That's of course a disaster because the role of "K_{IJ}", as Thiemann admits on page 9, is to act as a UV regulator or a cutoff. Because the nature of the physical Hilbert space is clearly going to depend on this kernel (how could eigenstates of an operator randomly constructed from a kernel be independent of this kernel? At least the quantum corrections to physics will surely be affected), we can say - using the normal language - that the resulting physical observables will be cutoff-dependent. We haven't constructed any meaningful theory that is independent of the regularization. The kernel "K_{IJ}" contains infinitely many undetermined parameters and each of them influences the resulting physics. Be sure that an "algebra" that doesn't close is not a "symmetry", and without a symmetry reason, how could the physical properties of "M" be independent of a kernel that explicitly enters the definition?

The situation is completely equivalent to a randomly picked prescription for how to cut off or otherwise regulate UV divergent integrals, including the linearly divergent integrals that are responsible for anomalies. Of course, if you insert some cutoffs and functions that smooth the expressions out, you may get finite results. But these finite results will be, from a physics viewpoint, complete garbage because they will depend on the regulator which, moreover, will break the general consistency criteria such as unitarity and, of course, also Lorentz symmetry (that was broken in loop quantum gravity from the beginning). You haven't solved any problems whatsoever.

The dependence on "K_{IJ}" is one of several examples showing how unjustifiable Thiemann's statements at the top of page 5 and elsewhere are - the statements that he has eliminated the infinite ambiguities that were also pointed out by Nicolai et al., among others including Jacques Distler and myself.

If someone had proposed this obviously silly idea of Feynman's universality function in the world of high-energy physics, it would have been instantly explained to him or her why the idea is silly, and he or she would be encouraged to learn or re-learn some basic things about UV cutoffs and anomalies. In the world of loop quantum gravity in particular, and discrete approaches to physics in general, however, there is no one who could explain to Thiemann why he is being silly because, as far as I know, there is no one who understands the very basic things about the UV divergences and anomalies in quantum field theory.

There is no one who understands that if you impose a cutoff on a UV divergent, non-renormalizable, anomalous, or otherwise inconsistent theory, you remain very far from finding a finite physical theory. It is still an inconsistent theory with a cutoff. They keep on celebrating these worthless constructions with infinitely many undetermined parameters - constructions in which all real problems are just swept under the rug - as solutions to the key problems of physics.

More concretely, Thiemann, being an important representative of a "serious competitor" to string theory, as he modestly calls all this nonsense at the top of page 3, can sell this obviously unhelpful and silly idea as an important advance in physics. That's an example of what happens if the society implicitly supports affirmative action for the proponents of bad ideas who were skipping their courses on quantum field theory and, as became clear from Thiemann's reply to Helling and Policastro, also the introductory lectures on quantum mechanics.

And that's the memo.

P.S.: I would like to believe that similar blog articles as well as preprints by Nicolai, Peeters, Zamaklar, Helling, Policastro, and others will ultimately lead the complaining Lee Smolin to avoid untrue statements like his recent false assertions in the Wired magazine that the string theorists don't read the papers of others. Incidentally, William Dembski predicts that next time, Lee Smolin will start to create room for Intelligent Design. So far, the most important supporter of Lee's book is Evelyn Fox Keller, a professional feminist science-hater.

Gr-qc papers on Tuesday

Luca Baiotti and Luciano Rezzolla propose a paradigm that could conceivably become a breakthrough in numerical relativity. When you're numerically integrating equations of general relativity, you must tell the computer what coordinates should be used. Even with well-defined initial conditions, the solution is only determined up to coordinate transformations. Which coordinates should you choose? Moreover, physical singularities may form - what should you do with them? Normally, the regions around likely singularities are amputated and a gauge choice is used for the rest. The present authors use different coordinates and keep the singularities, arguing that this setup is good to calculate gravity wave emissions by collapsing stars and perhaps even the quasinormal ringing modes. The usual calculations would collapse much earlier, they argue.

Louis J. Rubbo introduces Bayesian reasoning and, more importantly, more general assumptions into the analysis of the detection of gravity waves. Instead of assuming a particular waveform as most people nowadays do, he uses Bayesian inference to find a more refined Ansatz for the profile. The experimental discussion focuses on LISA. Be sure that I am not irritated by the Bayesian terminology because it looks like a good strategy to me. Bayesian reasoning is often a good framework for choosing a strategy to analyze data and look for fits (and win probabilistic games); it is not a good framework for presenting and defending the final answers. Scientific results are only solid if they're independent of the strategies by which they were found.

Evgeny Sorkin uses numerical methods to study wiggly black strings. Recall that according to a perturbative calculation of Gregory and Laflamme, too long and/or thin uniform black string solutions become unstable - because of a negative mode - and decay either to black holes or to non-uniform strings. Horowitz and Maeda have argued that the black hole can't be the final state. Sorkin probably thinks that their paper is wrong because he doesn't cite it at all. Finally, Sorkin deduces some scaling laws for the "nearly pinched" black strings from the numerical data. An interesting point is that the behavior of the black strings changes when the spacetime dimension is around eleven - this has superficially nothing to do with the calculations of critical dimensions of superstring/M-theory because it is a purely classical result of general relativity. It could be shocking if someone found an explanation of this coincidence.

Daniele Oriti and Tamer Tlas try to refine some questions about causality in the spin foam models. It's a nice effort but I think that all these papers are manifestly wrong. They manually remove the acausal configurations from the path integrals - in order to get rid of the acausal, non-smooth, crumpled behavior of the path integral - which makes the resulting theory non-unitary and unphysical. In simpler language, it is no longer true that the evolution operator from A to B times the evolution operator from B to C gives you the evolution operator from A to C, because the restrictions on the A-C interval are stronger than the union of the restrictions on the two partial intervals. More generally, it is critical for Feynman's path integral to sum over all configurations of the allowed degrees of freedom, not just some politically correct configurations, to get meaningful results. I won't read any of these papers, at least until someone addresses these rather serious issues.
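Recall where the composition law comes from. In the path integral,

  • U(C,A) = U(C,B) U(B,A)

holds precisely because inserting a complete set of intermediate configurations at the time B is the same thing as summing over all unrestricted configurations at that time. Once some configurations are banned on the full A-C interval but not on the subintervals, the two sides no longer agree and unitarity is gone.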

Pablo Laguna studies bounces in loop quantum cosmology. Loop quantum cosmology is obtained by applying certain simplifying rules on the isotropic cosmological solutions. These rules are analogous to the rules that cripple quantum gravity down to loop quantum gravity. But this doesn't mean that loop quantum cosmology follows from loop quantum gravity. In loop quantum cosmology, people keep on celebrating that they have removed the initial singularity. I think that all these statements are completely nonsensical because the only thing they have done was to include a randomly chosen, unphysical ultraviolet cutoff. There is nothing interesting or unusual about their specific cutoff and nothing physical about the Planckian results of such a treatment. This also makes the results of the calculations of the bounce - that are similar to "shallow water" calculations, as Laguna says - unphysical. Indeed, these are shallow waters of quantum gravity. Loop quantum cosmology is even bigger nonsense than loop quantum gravity.

Gregory J. Galloway has continued to investigate his recent impressive results that the black hole horizons in any dimension must have topologies that admit positive scalar curvature - which implies, in 3+1 dimensions, the familiar result that the horizons must have a spherical topology. In the recent paper, he sharpens some statements and rules out e.g. toroidal horizons.

Lorenzo Iorio fights against Ciufolini and Pavlis because of their radical proposals to improve (?) the satellite tests of the Lense-Thirring effect. Recall that the effect, essentially synonymous with "frame-dragging", is a gravitomagnetic "Machian-like" prediction of general relativity in which the rotating Earth drags the orbits of satellites such as LAGEOS I and LAGEOS II (as well as Gravity Probe B); the ocean tides and solid tides of the Earth are among the effects that contaminate the measurement. Iorio argues that all proposals of his two colleagues to improve the setup are incorrect.
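For orientation, here is the size of the effect they are fighting about: the standard weak-field formula for the Lense-Thirring nodal precession gives roughly 31 milliarcseconds per year for LAGEOS-type orbits. A quick check, with Earth's angular momentum and the orbital elements taken as approximate inputs:

```python
# Order-of-magnitude check of the Lense-Thirring nodal precession for a
# LAGEOS-type orbit, using the standard weak-field formula
#   dOmega/dt = 2*G*J / (c^2 * a^3 * (1 - e^2)^(3/2)).
# Earth's angular momentum J and the orbital elements are approximate.
import math

G, c = 6.674e-11, 2.998e8
J_earth = 5.86e33                  # kg m^2/s, approximate
a, e = 12_270e3, 0.0045            # semi-major axis (m), eccentricity

omega = 2 * G * J_earth / (c**2 * a**3 * (1 - e**2) ** 1.5)   # rad/s
mas_per_year = omega * 3.156e7 * (180 / math.pi) * 3600 * 1000
print(f"{mas_per_year:.0f} mas/yr")   # roughly 31 mas/yr
```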

Finally, Peter K.F. Kuhfittig argues that de Sitter 3-branes embedded in a 4+1-dimensional bulk with negative vacuum energy cause the bulk to shrink. If the bulk has a positive vacuum energy, it is either doing nothing or everything - where "everything" means blindly following the inflation on the brane. The bulk is assumed to be more general than the anti de Sitter space of the Randall-Sundrum models and I am a bit skeptical about the physical realism of such a more general setup.

Mohammed AlQuraishi: a portrait of the 600,000th visitor

You could ask: who are the people who visit this blog? Surely it can't be just the set of my e-friends, nice women, plus the e-non-friends and the crackpots and other nasty folks whom all of us know and who e-reproduce on the dumping ground of the blogosphere. Let me avoid a description of what they must be e-doing before they e-reproduce. ;-)



Instead, let us start differently. Anousheh Ansari, a 39-year-old Iranian-American multimillionaire who earned a lot of bucks in the telecommunication industry and who is the chairwoman and co-founder of Prodea Systems Inc., will pay a symbolic fee of a few million dollars to become the world's first female space tourist. I suppose that it is harder to earn the fee than to master everything you need to know to become a space tourist. You see that although there are no female CEOs of Dow Jones companies, Ansari doesn't need any affirmative action, despite being a member of at least three minorities: her birthplace in the Axis of Evil might be the strongest handicap among them. :-)

I don't know whether she reads this blog; the 600,000th unique visitor of the blog is Mohammed AlQuraishi. If you look at his screenshot, you will see that Internet Explorer 7 looks kind of nice, it has tabs, and the rankingblogs.com icon still doesn't work! ;-)



According to the freely available data, he is an interesting person, too. When he was 6 years old, he started to write computer programs. As soon as he entered high school, he founded a company in the Bay Area that created software for wireless and small electronic devices. Having received degrees in IT engineering, biology, and logic, he has set himself a goal that is nothing less than to transform the core of biology into a rigorous member of the family of exact sciences.

Recall that Quraish was the Arab tribe that was the strongest opponent of the prophet Mohammed (PBUH) throughout his life: it was his own tribe. (A similarity with Jesus and the Jews is rather obvious.) If you're Mohammed AlQuraishi, that's quite a combination. ;-) Because Mohammed is the 600,000th unique visitor - uniqueness means that at most 1 visit from an IP address is counted every day - he can also contribute any uncensored posting he wants or transfer this right to someone else.

Fixing the vote (and then fixing that)

Stephen Pizzo writes:

If you can watch this entire video and still use an electronic voting machine, you deserve the government you get. If your state or district has decided to use electronic voting machines this November demand an absentee ballot today. Watch this video. Then join those of us who have decided that since paper was good enough for our Constitution, it's good enough for our vote too.
Oh, and when you're done watching the whole video... pass it along. November is only a few weeks off and the last thing Republicans want to see is either house returned to Democratic control. Because if that happens, hearings happen. And if hearings happen... well, who knows - someone(s) could go to jail. So demand a paper ballot or an absentee ballot in Nov. and leave the cheaters with a pocket full of worthless Diebold electrons.

Here's a partial transcript if you don't have time to watch right now...

Are there computer programs that can be used to secretly fix elections?

Yes.

How do you know that to be the case?

Because in October of 2000, I wrote a prototype for Congressman Tom Feeney [R-FL]...

It would rig an election?

It would flip the vote, 51-49. Whoever you wanted it to go to and whichever race you wanted to win.

And would that program that you designed, be something that elections officials... could detect?

They'd never see it.


Two recent Houston Chronicle editorials detailed the concerns about fraudulent vote processing associated with Hart Intercivic's e-Slates, the DRE voting machines in use in Harris County and throughout the state. First, Stan Merriman wrote:

When the Hart voting systems were acquired in 2001, voters in Harris County thought they were being treated to the "latest and greatest" in voting system technology. This electronic system replaced a punch card system (remember hanging chads?) with the belief that we needed to enter the electronic age in the electoral process while also meeting emerging federal guidelines to simplify the voting process for our disabled citizens. ...

In the 2002 election some strange "vote flipping" incidents occurred that actually resulted in the temporary sequestering of machines reported as malfunctioning. The problem occurred with votes cast for senatorial candidates Ron Kirk and John Cornyn "flipping" to both rival party candidates. Lawyers were dispatched to scratch their heads over the cause and effect. No resolution of the situation was achieved.

This same anomaly occurred in the Kerry/Bush presidential election in 2004 in Harris County. Once again, the matter was dismissed as a "glitch" of no consequence and blamed on improper voter use. ...

In all, 1,218 voting machine complaints were filed in Texas in the 2004 general election with People For The American Way's Election Protection Division. In Harris County, 2,400 voting machine complaints were filed with a national voting advocacy group during that election.

In addition to these complaints, others were filed in Collin, Travis, Bexar and Wichita counties. Complaints included vote "transfers" (Kerry/Bush evidenced the same phenomenon reported in the 2002 and 2004 election in Harris County), lost votes, and machine and memory card failures. For the 2004 election, the Electronic Frontier Foundation and the Verified Voting Foundation received more complaints from Harris County than from any other voting jurisdiction in the nation.


And the Chronicle editorial board wrote:


"If folks can hack the Pentagon," Harris County Democratic Chairman Gerry Birnberg said, "they can certainly hack a machine in Harris County."

County Clerk Beverly Kaufman, a Republican, says such concerns are unfounded. "There's this kind of cavalier attitude on these folks' part that all you've got to do is just bolt on a printer and there it is," said Kaufman, who estimates that it would cost up to $8 million to buy equipment and reprogram the system with the capability to print ballots in three languages. "We're just not at a point here where we're able to do it if we wanted to, which we don't."


Well, we're just going to have to fix this, Bev. And we're going to do so first by replacing you with someone who does.

Sunday, August 27, 2006

Hep-th papers on Monday

Off-topic - anniversary: On Monday night, it is expected that the counter in the sidebar will show 600,000. The first person who sends a screenshot with 599,999 or 600,000 can post an article or give this right to someone else. Also, all people with 5,000+ hep-th citations are granted free access to post on this blog (submitted by e-mail or via new accounts), up to 1 posting per day, so they shouldn't be ashamed if they have something to say.

Let me start with a paper by Radford Neal, a professor of statistics, on the math arXiv:

Statistician comments on anthropic reasoning

It should be interesting for everyone who has been thinking about these issues to see what statisticians would tell us. The author resolves various doomsday paradoxes and other paradoxes of anthropic reasoning by emphasizing that all known facts, not just the existence of intelligent life, should be taken as assumptions, and the corresponding conditional probabilities should be evaluated instead of the unconditional ones.

That's a perfectly principled approach of a statistician and, to some extent, the only non-religious method to decide which things we are allowed to assume and which we aren't. ;-) But it's the same approach that Stephen Hawking had proposed at a conference. David Gross summarized Hawking's idea as the "extreme anthropic principle" because the message to the experimentalists is "don't measure anything else: every new measurement is a new problem for us because we will have to add its results into our increasingly awkward list of conditions".
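To see what is at stake, here is the textbook toy version of the doomsday argument - my own minimal sketch, not Neal's actual analysis. If you condition only on your birth rank r, with the likelihood 1/N for r ≤ N, the posterior over the total number of humans N gets pulled toward early doom; Neal's cure is to condition on everything else you know as well:

```python
# The textbook toy version of the doomsday argument -- my own minimal
# sketch, not Neal's actual analysis. Prior over the total number of
# humans N; likelihood of your birth rank r given N is 1/N for r <= N.
# Conditioning on r alone pulls the posterior toward early doom.
N_max = 10_000
r = 60                      # your birth rank, in billions if you like

prior = {N: 1.0 / N for N in range(1, N_max + 1)}       # a vague 1/N prior
posterior = {N: prior[N] * (1.0 / N if r <= N else 0.0) for N in prior}
Z = sum(posterior.values())
posterior = {N: p / Z for N, p in posterior.items()}

cumulative = 0.0
for N in sorted(posterior):
    cumulative += posterior[N]
    if cumulative >= 0.5:
        print(f"posterior median of N: {N}")   # roughly 2*r: doom soon
        break
```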

Clearly, physics is not just statistics or just botany, just like it is not just philosophy. Physics depends on a subtle balance between the cool and boring statistical data from the experiments and the hot, speculative, potentially far-reaching and philosophically-sounding hypotheses. When the balance is broken, physics deteriorates either into philosophy or into botany. In the purely statistical (=botanical) approach to questions in physics, the main goal of physics - namely to predict new phenomena without assuming them - seems to be lost. I am ready to sacrifice the explanations of some facts - and justify them anthropically - but certainly not all facts. So I don't know how to reconcile the deep statistical thoughts with the rest of physics.

Let's now switch to hep-th papers from Sunday night.

Stephen Hsu proposes a solution of the information loss problem. When a black hole is formed, a baby universe is born and detached from its parent. The baby carries the information so that the evolution, including the data in the baby universe, remains unitary, but the information is lost in the parent universe. This seems to clearly contradict the situation in AdS/CFT where no baby universes exist after the black hole evaporates (and information is not lost), so my reasoning about this new possibility is academic because I won't believe it in the end anyway. ;-)

But even at the academic level at which we're ready to abandon everything we learned about string theory, I feel that the solution shares all the disadvantages of the remnant theory and adds some new ones.

Florian Bauer, Tomas Hallgren, and Gerhart Seidl study the deconstruction of two additional discrete dimensions. They deconstruct not gauge theories but rather a theory of gravity - which involves concepts like massive gravity that are likely to be meaningful at the level of effective field theory only. And their two deconstructed dimensions describe a negatively curved manifold - the Poincaré disk if you wish - which leads to some phenomenological possibilities. The authors study the generation of small fermionic masses as a way to get hierarchies from the large discrete volume.

E. Antonyan, J. A. Harvey, and D. Kutasov study intersecting braneworlds. Recall, you're in a world where all matter lives in the bi-fundamental representations of U(N) groups. In this picture, various Weyl fermions live at different intersections of the type IIB D-branes. If you assume that the Higgs lives at a third intersection, you may use disk instantons to generate the Yukawa couplings and the bare masses if the Higgs has a vev. But that's not what they look at. Instead, they look at a direct generation of mass terms - a pairing of the fermions at two intersections - that results from chiral symmetry breaking. Bulk fields are important for the chiral symmetry to be broken and the effect only occurs for certain dimensions and relative orientations of the D-branes. Chiral symmetry breaking is also studied via AdS/CFT duals, but this is a non-holographic stringy realization of it in these scenarios, although some relations could exist.

Shinji Tsujikawa and M. Sami study the influence of the Gauss-Bonnet term (the Euler density which is quadratic in the Riemann tensor), multiplied by a function of some scalar fields, on the transition between the matter-dominated era and the dark-energy-dominated era of cosmology. I have not quite understood what positive and observable features the term can bring. Nevertheless, it may be a good idea to look at it because the Gauss-Bonnet term is the most typical higher-derivative correction generated in string theory, although their assumed dependence on the scalars seems less justified by a fundamental theory to me.
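For reference, the term in question is the Euler density; schematically, this class of models adds a scalar-dependent coupling to it, and the function f(φ) is where the model-dependence that I find less justified hides:

```latex
% The Gauss-Bonnet (Euler) density and its scalar-coupled term;
% the coupling function f(\phi) is model-dependent.
\mathcal{G} = R^2 - 4\,R_{\mu\nu}R^{\mu\nu}
            + R_{\mu\nu\rho\sigma}R^{\mu\nu\rho\sigma},
\qquad
S \supset \int d^4x \,\sqrt{-g}\; f(\phi)\,\mathcal{G}.
```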

A. P. Balachandran, T. R. Govindarajan, G. Mangano, A. Pinzul, B. A. Qureshi, and S. Vaidya look at the quantum deformation of the Poincaré group. Using some reasoning that could perhaps be more comprehensible to the Bogdanoff brothers than it is to me, they conclude that the Drinfeld twist in the theories they study influences the spins and statistics of particles. The quantum groups are also linked to non-commutative geometry, which I thought were two very different things, and they probably argue that the UV-IR mixing of non-commutative field theories is removed by the twist, unless I misunderstood them. It's too abstract for me and I tend to believe that no physical theory following these strangely deformed rules can exist, so all of this is a game with symmetries of non-existent theories, but I may easily be wrong.

Mithat Unsal and Laurence G. Yaffe investigate some hypotheses that resulted from the attempts to construct the AdS dual of QCD. In the effort to get rid of the unwanted matter of N=4 super-Yang-Mills, our most successful AdS/CFT representative, people have looked at various gauge theories with orientifolds or, equivalently, with matter fields in the tensor representations, claiming that two theories - one of which is similar to N=4 and the other similar to QCD - are equivalent in the limit of many colors. The authors disprove this hypothesis assuming that the time direction is compactified on an S^1 - multiplied by the usual S^3. I think that the authors agree with me that the theories may still become equivalent in the decompactification limit of the S^1, so the result that could be viewed as negative by some AdS/QCD people is not too negative. ;-)

Kazuyuki Furuuchi reviews the AdS/CFT correspondence in the regime of highly curved spacetime or, equivalently, small 't Hooft coupling. That's the opposite of the limit in which AdS/CFT is studied most of the time. Most of the things that are known about this unusual regime are related to Hagedorn-like phase transitions of the strings made out of string bits or, equivalently, phase transitions of the corresponding black holes such as the Hawking-Page transition.

Bindusar Sahoo and Ashoke Sen explicitly and fully evaluate the entropy of dyonic black holes in heterotic string theory by including all four-derivative terms. They confirm all the prejudices from incomplete calculations that have appeared in the literature: you get the right result if you only include the Gauss-Bonnet terms; you also get the right result if you include all the squared curvature terms plus their supersymmetry partners but nothing else; and you also get the right result if you approximate everything by the AdS_3 near-horizon geometry. In the non-supersymmetric cases, only the last simplification is possible, but the authors now give you the full calculation anyway. Previous papers are confirmed by a much more rigorous (and perhaps tedious) calculation. Be prepared for alpha' being equal to 16 in their units. ;-)

O. B. Zaslavskii computes some quantities related to the backreaction on black holes caused by things like their Hawking radiation, but in the context of two-dimensional dilaton gravity. Some previous numerical results are confirmed exactly and seem to be less singular than naively expected. One of the philosophical lessons of these papers is, I believe, that a lineland may differ from a 3+1-dimensional world (or higher-dimensional ones) in qualitative ways.

Matthias R. Gaberdiel and Ingo Runkel analyze, for the first time, the open string version of a special kind of conformal field theory, the c=-2 logarithmic triplet theory. What are these logarithmic theories about? They're relatives of the solvable rational conformal field theories. However, they differ by the logarithmic (as opposed to power-law) character of many of their correlators, which is caused by "bigger" (not highest-weight) representations that appear in the spectrum. Because they want the open string version, they must include the boundaries, which modify some of the previous calculations involving this most elementary logarithmic CFT. Be ready for things like boundary states similar to those that you know from ordinary flat-space string theory.

Yoichi Chizaki and Shigeaki Yahikozawa perform the full modern covariant quantization of strings propagating on a pp-wave limit of a geometry with the NS-NS flux - probably something you would get from "AdS3 x S3 x K3". If you like all the commutators, Fourier expansions, and the BRST operator to be written down explicitly, checked, and re-checked, you will like the paper. The longitudinal X^{-} coordinate - the one whose value is solved for in the light-cone gauge - plays a subtle role in their calculations.

Robert H. Brandenberger, Sugumi Kanno, Jiro Soda, Damien A. Easson, Justin Khoury, Patrick Martineau, Ali Nayeri, and Subodh Patil rewrite some results of a subset of the authors (plus others) about their stringy alternative to inflation from the string frame to the Einstein frame. Recall that their picture uses a rather exotic thermodynamical behavior of string theory near the Hagedorn temperature that, as the authors believe, is able to produce the scale-invariant spectrum much like inflation. Their older papers have used the string frame - the choice of units of distance and the metric tensor in which the Einstein-Hilbert term has a factor of exp(-2φ) in front of it. In the Einstein frame, this φ-dependence is removed, and some of the phases of their novel cosmological model are re-interpreted and put into a new light.
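For readers who have not seen the two frames side by side, the standard Weyl rescaling that does the job in D dimensions is:

```latex
% String frame: the dilaton prefactor sits in front of the curvature;
% the rescaling below absorbs it into the Einstein-frame metric.
S_{\rm string} \supset \int d^D x\,\sqrt{-g_S}\; e^{-2\phi}\, R_S ,
\qquad
g^{E}_{\mu\nu} = e^{-\frac{4\phi}{D-2}}\, g^{S}_{\mu\nu}.
```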

Let me mention that some cosmologists doubt whether the density perturbations that the authors of this interesting proposal seem to produce are physical modes or just pure gauge modes (removable by a coordinate redefinition) that are misinterpreted as physical perturbations. More generally, I am still a bit puzzled whether the geometric intuition is allowed or prohibited in these stringy-dominated configurations. More seriously, a new paper by Kaloper et al. will appear tomorrow, arguing not only that Ali Nayeri et al. predict n=5 instead of n=1, which is unacceptable, but also that their speculations that this problem could be fixed by strong coupling effects violate the null energy condition. Assuming that Kaloper et al. don't have serious errors in their analysis, inflation is likely to remain a unique solution. I hope that our friends - Ali, Cumrun, Robert, et al. - won't cry as much as Neil Turok! ;-)

Jai Grover, Jan B. Gutowski, and Wafic Sabra show you what a modern but classic supergravity paper about fancy topics that remain mostly disconnected from string theory looks like. They focus on five-dimensional gauged supergravities (supergravities with gauge fields: in their case several U(1) multiplets) with 16 supercharges. The spinorial geometry method, whatever this impressive method exactly is, is used to look for solutions and to prove that no solution can preserve exactly 3/4 of the supercharges.

Bhaskar Dutta and Jason Kumar propose a new scenario in which the hidden sector may be useful for creating realistic baryogenesis. Recall that our Universe contains many more baryons than anti-baryons: the result of an imperfectly balanced annihilation in the past (one billion and one baryons against one billion antibaryons). The overall baryon number B had to be created somewhere if it was zero at the beginning.

As Sakharov pointed out, one must violate the C and CP symmetries, depart from thermal equilibrium, and violate the baryon number sometime in the past - otherwise no matter survives in the universe to allow for life. Various mechanisms for how this could have happened exist, as do many astrophysical bounds. These bounds may look unnatural but the authors claim that they are naturally satisfied thanks to their mixed triangle anomaly - which mixes the baryon number with the hidden sector. The setup is naturally realized in intersecting braneworlds of string theory and it predicts some rather detailed properties of exotic quarks produced at the LHC. That's why this paper is the 8,724th proof that the people who suggest that string theory deals with untestable physics are, politely speaking, morons. It is, on the contrary, very likely that this particular class of models will be falsified rather soon. ;-)
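The "billion and one against a billion" arithmetic can be phrased through the measured baryon-to-photon ratio η ≈ 6×10⁻¹⁰ - an observed input here, with O(1) factors ignored:

```python
# "One billion and one baryons against one billion antibaryons": phrase
# the surviving asymmetry via the measured baryon-to-photon ratio,
# eta ~ 6e-10 (an observed input; O(1) factors are ignored here).
eta = 6e-10
pairs = 1.0 / eta    # annihilated pairs per surviving baryon, roughly
print(f"roughly {pairs:.1e} + 1 baryons against {pairs:.1e} antibaryons")
```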

N. Yokomizo, P. Teotonio-Sobrinho, and J. C. A. Barata are interested in three-dimensional theories involving gauge fields and spin degrees of freedom. Their point is that a low-temperature limit makes the dynamics of this system topological. A Hopf algebra is identified, the number of ground states may be counted from topological invariants, and a low-temperature expansion around the topological description is studied. These investigations seem to imply new equivalences between classical spin models.

Hisham Sati looks at the role of E8 gauge theory for the classification of charges in string theory and in relation with the gerbes. Many mathematical structures are argued to be interrelated in this setup, including twisted K-theory, whose twisting class (the Dixmier-Douady class) is identified with the NS-NS 3-form H-field. I guess that Jarah Evslin likes the paper, too.

Quite obviously and impressively, Sati seems to argue that he can derive why the E8 gauge theory is directly relevant for type IIA string backgrounds: the dual Coxeter numbers of E8 magically appear, even though a more general group G is used to create the loop groups in type IIA etc. In some sense, this is a new direct proof of the Diaconescu-Moore-Witten framework describing the integrality properties of the M-theory fluxes. The E8 group is also linked to high-level Wess-Zumino-Witten models.

One of the speculations that naturally results from this reasoning is that the 10D type IIA theory is a boundary theory for another 11D "bulk" theory with a "hypergravity" in it, via a holographic duality. It all sounds very intriguing. The paper is an example of why I believe that papers with fancy mathematical structures are much more likely to have a deep point than papers and diatribes against mathematics in physics.

Journalistic hyenas attack inflation

One week ago, The Sunday Times gave us a rather explicit example that string theory is not the only target of a new jihad declared by a coalition of fringe physicists, scientific zombies, postmodern crackpots, and journalistic hyenas - using some catchy words of the former social-democratic Czech prime minister, Miloš Zeman, a big fan of fancy herbal alcoholic drinks. :-)

A journalist called Jonathan Leake didn't like Alan Guth's opinions about the cyclic universe and related ideas. Most of the broader community of cosmologists, including me, tends to agree with Alan Guth that the cyclic universe seems a much less convincing and promising direction for cosmology than the inflationary framework. Alan Guth may perhaps have some additional personal sentiments in this story.




Nevertheless, at a summer conference, he made a calculation and ended up with the result that Neil Turok's new theory is about as reasonable as the thinking of a monkey. To visualize his result, Alan Guth also showed a cute picture of a monkey. His result is perhaps controversial but most people would probably disagree about the quantitative and presentation issues only. ;-)

Moreover, most people in the field prefer to know whether the Universe ever expanded by a factor of 10^{30} or more, rather than whether Alan Guth ever drew a monkey. Leake's priorities are probably different. Moreover, he did a bad job even as a "social scientist". If he had studied these things more carefully, he would know that there have been tougher tensions involving Linde and Hawking or Linde and Turok.

Now you may ask: is it fine to show a picture of a monkey? No doubt, Neil Turok was going to be offended. Well, what a surprise. It was probably one of Alan Guth's goals to attract Neil Turok's attention. Alan Guth feels strongly about these things and, as far as I can say, he has good reasons. Many other people fully share Alan Guth's opinions even if they don't share all of his emotions.

I think it is absolutely critical for scientists to be allowed to think and say that a theory is crap. In fact, it is one of the main jobs of a scientist to identify theories that are crap and to abandon them. Science can never prove that a theory is correct forever; such things are only possible in the world of religion (and perhaps in science, once we or the future generations complete the theory of everything).

But what science can do is to falsify hypotheses. This is how progress is made. The scientists keep on abandoning hypotheses and keep on studying, in more detail, the most plausible theories that have not been falsified yet - theories that are Not Yet Wrong or Not Even Wrong, if you wish. Quite clearly, most science-haters don't understand this strategy of science. But they're wrong and they're irrelevant, together with thousands of their dumb fans.

Moreover, it is important for the senior scientists to evaluate their peers' theories critically; otherwise all of them would end up working on theories in which neutrinos are octopi swimming in the spin network. Silly journalists would surely like such a situation and would support them in this enterprise, but science as we know it would be over.

Dr. Jonathan Leake is not a cosmologist and he can dislike slides with monkeys, perhaps because of some biological connotations. That's his right and that's his problem. But what I find completely unacceptable is to write a scientifically misleading article about cosmology based on these irrational emotions of the journalist.

The article overhypes the cyclic universe as a revolution in physics painted as analogous to the Einsteinian revolutions. Its author paints Alan Guth in the worst light that is publishable. More importantly, the article is written as a weird attack on inflation - which is a different entity from Alan Guth, despite his pioneering contributions. It invents many childish links between the Catholic Church and inflationary cosmology.

Neil Turok helps the journalist by saying:

  • "The supporters of inflation have become too evangelical. They have no idea why inflation happened but they still believe in it," he declares.

If I were Neil Turok, with all of my respect for him, I would be ashamed of having used the same crap that the crackpots use on a daily basis when they attack anything in physics - for example relativity. If you can't find working arguments that would support your theory and convince others, you start to compare your opponents to generally disliked groups of people, Nazis, or the Inquisition, don't you? It's great to have the compassion of the laymen but it is not a scientific argument.

We don't need to encourage stupid people to invent new comparisons of physics and religion. I have personally seen too much of it.

Science is not a place to cry and to help those who cry. Science is a place where evidence and convincing arguments brutally destroy wrong theories and less convincing arguments, whether or not the feminists find this mechanism sufficiently feminist, in order to make progress. Scientific arguments are what makes Guth's comparison of his colleague with a monkey more powerful than Turok's comparison of the mainstream with the Church.

Inflation happened because an underlying theory such as string theory naturally gives us scalar fields whose potential energy has a non-trivial profile that can admit a slow roll; with these assumptions, the evolution is a derivable fact and it is called inflation. The conditions needed for inflation seem rather generic in an underlying theory - by which we usually mean string theory - unlike the conditions needed for the cyclic universe.

We know that inflation is likely to be correct because it non-trivially solves dozens of problems with the Big Bang cosmology. Equally importantly, we obtained very accurate new WMAP data in March that confirm the basic predictions of inflation and allow us to start to ask the first questions beyond the zeroth approximation (of the scale-invariant spectrum) - questions that can support or rule out classes of specific inflationary models.
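An example of such a question beyond the zeroth approximation: the slow-roll machinery turns a choice of potential into numbers like the spectral tilt. Here is the textbook exercise for a quadratic potential - a standard illustration, nothing specific to WMAP's actual likelihood analysis:

```python
# Textbook slow-roll predictions for the sample potential V = m^2 phi^2 / 2
# -- the standard exercise, not WMAP's actual analysis. Units: M_p = 1.
#   epsilon = (1/2)*(V'/V)^2,  eta = V''/V,
#   n_s = 1 - 6*epsilon + 2*eta,  r = 16*epsilon.
def slow_roll_quadratic(N_efolds):
    phi2 = 4.0 * N_efolds            # phi^2 when N e-folds remain
    eps = 2.0 / phi2                 # equals eta for this potential
    n_s = 1.0 - 6.0 * eps + 2.0 * eps
    r = 16.0 * eps
    return n_s, r

n_s, r = slow_roll_quadratic(60)
print(f"n_s = {n_s:.3f}, tensor ratio r = {r:.3f}")   # ~0.967 and ~0.13
```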

These facts strikingly contradict the text of the dumb article:

  • Guth himself has built his career on it. Recently, however, it has become clear that the theory has major flaws. There is, for example, no widely accepted way for physics to explain how such "inflation" could have happened.

What a pile of Severely Hurt Information Technologies, if I have to avoid the acronyms. We are aware of no major flaws of inflation. In fact, we don't know of any flaws of inflation that would indicate that something is wrong. And one of the main victories of inflation is exactly its ability to explain where the Big Bang got its bang, to use Brian Greene's words from "The Fabric of the Cosmos". What Jonathan Leake wrote is a malicious lie.

Moreover, even if there were a sense in which inflation doesn't explain its very beginning, the same is true for the other scenarios, even though the beginning may be shifted further into the past. To summarize: the lie is an absolutely unjustifiable and atrocious piece of propaganda.

And it is not the only absurdity: the next paragraph argues that inflation contradicts the cosmological constant, which is quite crazy given the fact that the whole mechanism of inflation is an exponential expansion driven by a temporarily higher cosmological constant that eventually rolls to its present value.

It is even more outrageous if you realize that the theory chosen to have "major flaws" according to The Sunday Times is the nearly proven theory of inflation, while other patently wrong theories that have been invalidated by all of the recent research - such as the discrete theories of gravity - are never mentioned negatively in the media, because their authors share the same anti-scientific sentiments with the poor journalists.

The only thing we don't know about inflation is the full picture with all the potentials etc. - something that should ultimately be calculable from a fundamental theory (which probably means string theory) - but we have never known the theory of everything in the past, so this is not the first time. This incomplete knowledge has never prevented us from making progress in the previous examples. The progress called "inflation" is no different.

Inflation is the way to go beyond the standard Big Bang cosmology. Just like in high-energy physics, it is true that we must rely on theoretical reasoning more than ever before because direct data about the high-energy mechanisms that created the CMB spectrum are not available.

And yes, we don't think that the same predictions have been derived from the alternative theories as reliably and convincingly as they have been derived from the inflationary framework - the difference seems large - and many people argue that the papers on the cyclic universe have been plagued by errors. I haven't checked most of these statements myself but I have sociological reasons to believe them.

Cyclic universes are also far from being the only competitors of inflation, if you can't resist talking about alternatives. See e.g. the recent papers by Ali Nayeri, Cumrun Vafa, Robert Brandenberger, et al. These guys also think that the cyclic universe is not a good approach. Their alternative uses some very specific possibilities offered by string theory thermodynamics but it remains controversial, too. You can see that more stringy-sounding titles don't necessarily mean that they are more acceptable to the "establishment". At any rate, you should read the papers by Ali, Cumrun, Robert et al. because they are understudied and could lead to a more realistic alternative solution than the cyclic universe.

What to do with articles that try to increase the heat inside the scientific community?

Of course, an ideal scientist won't be affected by irrational arguments like stupid and untrue articles in the newspapers, or hate mail that he may receive after these stupid articles are published. ;-) But we live in the real world, not in the ideal world. Recently, I have found the number of attempts by dumb journalists to intimidate scientists and to influence what they think about scientific questions - by misusing public opinion - to be absolutely outrageous, especially given the thoroughly unscientific character of the journalists' motives.

Dear Dr. Jonathan Leake, I don't know how I can explain these things so that you would understand them. But as far as cosmology goes, you're just a layman. However, you should try to understand that if you write something in The Sunday Times, it can have quite an effect, so you had better try not to write such an incredible piece of Severely Hurt Information Technologies as this painful article.

Once again, the validity of scientific theories has nothing to do with whether or not you like Alan Guth or his slides, and you should allow the scientists and the scientific public to continue to live in a world where these two things are absolutely disconnected. Writing about cosmology - which has become a full-fledged exact science in the last decade - is something different from writing about wild religious sects' witch-hunts against their heretics, because the rules are very different. If you bring in external emotions, compassion, or support for your favorite horses, it always hurts science. Everyone who thinks that he or she can help science with some emotions or unscientific recommendations and pressures is misguided.

I, for one, like Alan Guth a lot, but even if I didn't, I would still think that the evidence that inflation is the best way to go in cosmology is extremely strong and that the other proposals so far look like convoluted variations that borrow the same basic ideas from inflation but don't quite work. As far as I can say, Guth and Linde are right and deserve the credit for the key pre-Big-Bang cosmological discoveries, while Hawking and Turok are less right. ;-) But even if the truth is different, the last thing science needs is political or emotional pressure from journalists who have no idea what's going on.

Do you want to reduce the impact of Einstein's theories because he was a Jew? Feel free to believe and print whatever you want but your dust in the river will follow Einstein's laws of Brownian motion after you die anyway. Do you want to ban quantum mechanics because its key discoverer, Werner Heisenberg, was a German patriot and the leader of the German nuclear efforts? Screw you. The laws of quantum mechanics are what will eventually kill you and show you who is the king in the city. A similar fate awaits you if you want to deny the QED results of Richard Feynman, the sexist pig, or the quark results of Yuval Ne'eman, the Israeli spy.

And that's the memo.