Sunday, April 30, 2006

Pandora's box



Pandora was, according to Greek mythology, the first woman in the world. Zeus had her created to punish men for Prometheus' theft of the secret of fire. Good job, Zeus.

She had received gifts from all the Gods. How do you say that in Greek? Yes, "all" is "pan", as in "pan-American". Gifts come from "donors", so a girl who got gifts from all the Gods must be Pandora. :-)

The gift from Zeus, the God of Gods, was a box. She was instructed not to open it. However, being a curious woman, she opened it and released sorrow, plague, poverty, crime, and the other misfortunes of mankind. (Well, Adam and Eve did more or less the same thing with the Macintosh.) She closed the box before hope evaporated: hope was the only ingredient left in Pandora's box. Later, she opened the box again and released hope: this act of hers ended a very bad period of human history.




What is the message? The message is that what you get from Pandora's box depends on the timing and other circumstances. Pandora's nuclear box was first opened in the summer of 1945, and although the obvious consequences looked horrific, most historians and others agree that the nuclear bombs saved a lot of lives. Nuclear weapons give people a certain additional power, and that power can lead to clearly bad as well as relatively good outcomes, in comparison with the non-nuclear alternatives.

I think that this was essentially the case with Hiroshima and Nagasaki. The White House would be mad to declare nuclear weapons forever unusable. I've always been amazed by how many nuclear warheads have been produced despite the tiny probability that anyone would want to use a significant portion of them. When I was a kid, we would hear about the disastrous consequences of nuclear weapons all the time: life on Earth would end, and all that stuff.

I no longer believe these alarmist ideas; they look kind of childish and irrational to me. Life on Earth did not end with Hiroshima and Nagasaki. In fact, even life in Japan did not end; on the contrary, Japan started one of its most optimistic historical periods. The higher the moral standards of the main nuclear powers are, the better outcome one can expect from a possible nuclear confrontation.

A necessary condition for the hundreds of thousands of Hiroshimas in the currently existing warheads to destroy life is that they're controlled by a lunatic such as Mahmoud Ahmadinejad or the leaders of Sudan. And one of the tasks for responsible politicians is to guarantee that such a thing won't happen.



Figure 1: In fact, it's a task for both of them.

In the real world, there are many other sources of evil, instability, and numerous threats, and there are easy-to-imagine circumstances in which the nuclear weapons could become helpful for a more civilized party in a certain conflict. Yes, I still prefer a future without nuclear confrontations, but the price to pay for such a future can't be infinite.

John Kenneth Galbraith, 1908 - 2006

Nearly 40 years after writing "The Affluent Society," (economist John Kenneth) Galbraith updated it in 1996 as "The Good Society." In it, he said that his earlier concerns had only worsened: that if anything, America had become even more a "democracy of the fortunate," with the poor increasingly excluded from a fair place at the table.


Galbraith likely was distressed, as much as any of us, by the kudzu-like spread of the most obnoxious and appalling aspects of conservatism across the American political and social landscape.

A major influence on him was the caustic social commentary he found in (Thorstein) Veblen's "Theory of the Leisure Class." Mr. Galbraith called Veblen one of American history's most astute social scientists, but also acknowledged that he tended to be overcritical.

"I've thought to resist this tendency," Mr. Galbraith said, "but in other respects Veblen's influence on me has lasted long. One of my greatest pleasures in my writing has come from the thought that perhaps my work might annoy someone of comfortably pretentious position. Then comes the realization that such people rarely read."


I'll pause for a moment while you acquaint (or re-acquaint) yourself with Veblen. He's worth a post all to himself, but I try to keep it light around here.

Galbraith's seminal work was written the year I was born:


"The Affluent Society" appeared in 1958, making Mr. Galbraith known around the world. In it, he depicted a consumer culture gone wild, rich in goods but poor in the social services that make for community. He argued that America had become so obsessed with overproducing consumer goods that it had increased the perils of both inflation and recession by creating an artificial demand for frivolous or useless products, by encouraging overextension of consumer credit and by emphasizing the private sector at the expense of the public sector. He declared that this obsession with products like the biggest and fastest automobile damaged the quality of life in America by creating "private opulence and public squalor."


That was written almost fifty years ago, and Veblen made the same point 107 years ago. How far we have come.

And the call to arms:

"Let there be a coalition of the concerned," he urged. "The affluent would still be affluent, the comfortable still comfortable, but the poor would be part of the political system."


Rest in peace, Mr. Galbraith.


Saturday, April 29, 2006

The Final Theory: two stars



A short comment: a reader has pointed out that right now, the crackpot book by
  • Mark McCutcheon

has an average rating of 2 stars because of 200 one-star reviews that suddenly appeared on the amazon.com website. The new reviews are a lot of fun: many come from Brian Powell, Jack Sarfatti, Greg Jones, Quantoken, David Tong, me, and many others. Many of the readers have written several reviews - and you can see how they struggled to make their reviews acceptable. ;-)

When we first reported on the strange system, McCutcheon's book had an average of 5 stars and no bad reviews at all. The previous blog article about this story is here.

How to spend 6 billion dollars?

Related: What can you buy for 300 billion dollars?
What is the best way to spend 6 billion dollars?
Two weeks of Kyoto
ILC: the linear collider
One month of war in Iraq
Millions of free PCs for kids
Ten space shuttle flights
Free polls from Pollhost.com
Additional comments: the world pays about 6 billion dollars for two weeks of the Kyoto protocol, which cools down the Earth by 0.00006 degrees or so. The International Linear Collider would have the capacity to measure physics at several TeV more accurately than the LHC, but it is also more expensive: about 6 billion dollars. The U.S. pays 6 billion dollars for one month of its military presence in Iraq. One could buy 60 million computers for kids if the price were $100, as MIT promises. Whenever you launch a space shuttle, you pay about 600 million USD.
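The arithmetic behind these comparisons is trivial; here is a sketch using the round numbers from the text:

```python
# What 6 billion dollars buys, using the round per-unit prices above.
BUDGET = 6e9                      # dollars

computers = BUDGET / 100          # $100 computers for kids
shuttle_flights = BUDGET / 600e6  # ~$600 million per shuttle launch
iraq_months = BUDGET / 6e9        # ~$6 billion per month in Iraq

print(int(computers), int(shuttle_flights), int(iraq_months))
```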




Klaus meets Schwarzenegger



Figure 1: Californian leader Schwarzenegger with his Czech counterpart during a friendly encounter in Sacramento. Arnold has accepted Klaus' invitation to the Czech Republic.

When Czech president Václav Klaus visited Harvard, he complained that the capitalism of the European Union is not the genuine capitalism that he always believed in - the capitalism taught by the Chicago school - but rather a kind of distorted, socialized capitalism, something that could be taught here at Harvard. ;-)

Finally, he could speak to his peers - at the Graduate School of Business at the University of Chicago. The other speakers over there agree with Klaus' opinions. In his speech, he explained that the Velvet Revolution was carried out by the people inside the country: it was not imported. Equally importantly, Americans have a naive understanding of European unification because they don't see the centralized, anti-liberal dimension of the process.

This Week in Felons

-- I've been remiss in following the trials of the two Enron scoundrels Skilling and Lay. You've got many sources who have been doing yeoman's work, and I trust you're already aware that the two men have employed the Sergeant Schultz defense, which I believe will be a losing one.

Update (5/2/06): Houstonist summarizes Lay's bipolarity.

-- Karl Rove is on the verge of indictment for perjury. Will he resign if he is, or continue working to repair the GOP's political fortunes for November? I suppose Fitzo de Mayo sounds as good as Fitzmas.

-- Rush Limbaugh made an arrangement to avoid his drug charge. He smiled broadly for his mugshot, as did Tom DeLay. This appears to be an extraordinarily lax penalty: 'stay clean for a year and a half and we'll drop the charges', essentially. Does it seem as if the state of Florida has really lenient narcotics law enforcement -- or does this only apply to Republican addicts?

-- Finally (for the time being), the Republican lobbying mega-scandal seems to have lately taken a sexual turn. Who could the "one person who holds a powerful intelligence post" be?

Porter Goss, CIA director, whoremonger? What's that likely to mean for national security? If you recall, Goss (in refusing to investigate the Plame leak case) said, "Somebody sends me a blue dress and some DNA, I'll have an investigation." So I suppose we can anticipate a potential case made on the basis of hands-on physical evidence.

You know, photos, call logs, credit card transactions, that sort of thing.

Twenty years after Chernobyl

On Wednesday morning, it's been 20 years since the Chernobyl disaster; see news.google.com. The communist regimes could not pretend that nothing had happened (although in the era before Gorbachev, they might have tried to do so), but they attempted to downplay the impact of the meltdown. At least this is what we used to say for twenty years. You may want to look at how BBC news about the Chernobyl tragedy looked 20 years ago.

Ukraine remembered the event (see the pictures) and Yushchenko wants to attract tourists to Chernobyl. You may see a photo gallery here. Despite the legacy, Ukraine has plans to expand nuclear energy.

Today I think that the communist authorities did more or less exactly what they should have done - for example, trying to avoid irrational panic. It seems that only 56 people were killed directly and about 4,000 indirectly. See here. On the other hand, about 300,000 people were evacuated, which was a reasonable decision, too. And animals are perhaps the best witnesses for my statements: the exclusion zone - now an official national park - has become a haven for wildlife, as National Geographic also explains:
  • Reappeared: Lynx, eagle owl, great white egret, nesting swans, and possibly a bear
  • Introduced: European bison, Przewalski's horse
  • Booming mammals: Badger, beaver, boar, deer, elk, fox, hare, otter, raccoon dog, wolf
  • Booming birds: Aquatic warbler, azure tit, black grouse, black stork, crane, white-tailed eagle (the birds especially like the interior of the sarcophagus)

Ecoterrorists in general and Greenpeace in particular are very wrong whenever they say that the impact of technology on wildlife must always have a negative sign.





In other words, the impact of that event has been exaggerated for many years. Moreover, it is much less likely that a similar tragedy would occur today. Nuclear power has so many advantages that I would argue that even if the probability of a Chernobyl-like disaster in the next 20 years were around 10%, it would still be worthwhile to use nuclear energy.




Some children were born with some defects - but even such defects don't imply the end of everything. On the contrary. A girl from the Chernobyl area, born around 1989, was abandoned by her Soviet parents, was adopted by Americans, and she became the world champion in swimming. Her name? Hint: the Soviet president was Gorbachev and this story has something to do with the atomic nucleus. Yes, her name is Mikhaila Rutherford. ;-)

If you have Google Earth, you may also look at the

  1. amount of radiation
  2. rectangular cubes
  3. new building

Incidentally, if you want three-dimensional modelling software, try

The files created by users are shared at

Who cares about $3 gas? The Texans drafted WHO!?!

Bob McNair better hope Mario Williams is real good, real fast, or there will be blood in the streets of H-Town, and not over trivial matters such as immigration or the legislature failing to fund public education or even the exorbitant price of gasoline.

Osama ain't got nothin' on Casserly if this turns out bad.

Friday, April 28, 2006

Net Neutrality (and why you should care)

Sean-Paul Kelley has been leading the charge in the battle to keep the Internet wild and free despite the whining of corporate titans like AT&T's Ed Whitacre, who would rather make us all pay dearly for the privilege of using his "pipes".

(A personal shoutout to Big Ed: I just upgraded my service with you despite the fact that you signed off on the most massive invasion of privacy in recorded history. If you keep trying to shaft the entire world, you'll force me to drop my DSL like a bad transmission and encourage all ten of my loyal readers to sign up with Time Warner Cable. Capice?)

Little would have come of this had it not been for five Democrats in the House, two of whom sold out extraordinarily cheaply: San Antonio's Charlie Gonzales and Houston's Gene Green. Charles Kuffner, as usual, has the best summary and linkage.

Please go and follow them -- his links, and his suggestions.

Some days I think that the Romans really were onto something

It's been a couple of weeks since I last mocked the Christians, so let's laugh first at what Lisa sent me:

God Hates Shrimp

... and then at one of the archived "Jesus of the Week" features:

You can’t read it at this size, but the little sweetsie cutesie cross-stitched pillow (go here to see it) next to this fine white lady says “Awaiting Christ’s Return.”

Well, turn around and try not to flip off the porch swing! He’s right behind you!

Is there room for two on that bench?


Please take note of the Son of God's perfectly blow-dried, Aqua-Netted coif in the photo linked above. Never in his entire life has Rick Perry had a hair day that good.

And Christ's Commandant has the last word for Shelley Malkin.

Update (4/29): Pete details how the Christian marketing effort has spread into throwback jerseys.

This Land is Mi Tierra

Thanks for the header and the artwork, Houston Press.

You can get it on a T-shirt, and maybe in time for the Brownout on Monday.

Update: And don't miss this week's "Ask a Mexican".

Update II: Does anybody still care what this stupid bastard thinks about anything?

Thursday, April 27, 2006

Yuval Ne'eman died

Because this is a right-wing physics blog, it is necessary to inform you of the saddening news - news I heard from Ari Pakman yesterday - that Yuval Ne'eman (*1925), an eminent Israeli physicist and right-wing politician, died yesterday.



If you're interested, you can read the article about him on Wikipedia and Peter Woit's blog, as well as the text by Yisrael Medad, Ne'eman's political advisor. News summarized by Google is here. In 1961, Ne'eman published a paper with a visionary title
  • Derivation of strong interactions from a gauge invariance
As far as I understand, the symmetry he was talking about was the flavor symmetry which is not really a gauge symmetry. Ne'eman co-authored the book "The Eightfold Way" with Murray Gell-Mann, contributed tremendously to the development of nuclear and subnuclear physics in Israel (which includes the nuclear weapons), and was the president of Tel Aviv University, among many other organizations.

And of course, he has studied QCD, supersymmetry, superstrings, and other things. One month ago, a book with his selected research was released:

Science and fundamental science

Chad Orzel did not like the proposals to build the ILC because they are derived from the assumption that high-energy physics is a more fundamental part of physics than the others - and he disagrees with this assumption. Instead, he argues that technology is what matters and that it does not depend on particle physics. Also, Chad explains that one can have a long career without knowing anything about high-energy physics - which seems a rather lousy method for determining the fundamental value of different things.

There are three main reasons why people stretch their brains and think about difficult things and science. We may describe the corresponding branches of science as follows:
  • recreational mathematics
  • applied science
  • pure science
Recreational mathematics is studied by people to entertain themselves and to show others (and themselves) that they are bright. Chess, in Flash or without it, may be viewed as a part of this category. People pursue this sort of activity because it is fun. Comedians do similar things, although their work requires rather different skills. In this category, entertainment value is probably the main factor that determines importance. People do whatever makes them happy and excited. If someone else does things on their behalf, they prefer those with a higher entertainment value. The invisible hand of freedom and the free market pretty much takes care of this activity.

The rules of chess depend on many historical coincidences. Other civilizations could have millions of other games with different rules and the details really don't matter: what matters is that you have a game that requires you to turn your brain on.

Applied science is studied because scientific insights can lead to economic benefits. They can improve people's lives and health, give them new gadgets, and so forth. The practical applications are the driving factor behind applied science. People, corporations, and scientists pay for applied science because it brings them practical benefits. It is often (but not always) the case that the benefits occur on shorter time scales, and it is possible for many corporations and individuals to provide applied scientists with funding. And if you look around, you will see that many fields of applied science are led by the laboratories of large corporations - such as IBM, drug companies, and others.

Pure science is studied because the human beings have an inherent desire to learn the truth. In our Universe, the truth turns out to be hierarchical in nature. It is composed of a large number of particular statements and insights that can typically be derived from others. For equivalent insights, the derivations can work in both directions. In many other cases, one can only derive A from B but not B from A. The primary axioms, equations, and principles that can be used to derive many others are, by definition, more fundamental. The word "fundamental" means "elementary, related to the foundation or base, forming an essential component or a core of a system, entailing major change".

If you respect the dictionaries, the physics of polymers may be interesting, useful, and important - but it is not too fundamental. If Chad Orzel or anyone else offers a contradictory statement, he or she abuses the language. Among the disciplines of physics, high-energy physics is more fundamental than low-energy physics. Moreover, I think that as long as we talk about pure science, being "fundamental" in this sense is a key component of being important. If we want to learn the scientific truth about the world, we want the most fundamental and accurate truth we can get.

I am not saying that other fields should be less supported. Nor am I proposing a hierarchical structure between the people who chose different specializations. What I am saying is that other fields that avoid fundamental questions about Nature are being chosen as interesting not only because of their pure scientific value but also because of their practical or entertainment value.

You may be trying to figure out what happens with a particular superconductor composed of 150-atom molecules under particular conditions. The number of similar problems may exceed the number of F-theory flux compactifications. How can you decide whether a problem like that - or any other problem in science - is important? As argued above, many different factors decide the answer: entertainment value, practical applications, and the ability to reveal major parts of the general truth. I guess that practical applications will remain the most likely justification for specialized research into a very particular type of superconductor.

People and societies may have different motivations to study different questions of science. If you extend this line of reasoning, you will realize that people can also do many things - and indeed, they do many things - that have no significant relation with science. And they can spend - and indeed, do spend - their money for many things that have nothing to do with science, especially pure science. And it's completely legitimate and many of these things are important or cool.

When you think about the support of science in general, what kind of activity do you really have in mind? I think that pure science is the primary category that we consider. Pure science is the most "scientific" part of science - one that is not motivated by practical applications. As we explained above, pure science has a rather hierarchical structure of insights.

If something belongs to pure science, it does not mean that it won't have any applications in the future. In the 1910s-1930s, radioactivity was abstract science. By various twists and turns, nuclear energy became pretty useful. There are surely many examples of this kind. The criterion that divides science into pure science and applied science is not the uncertain answer to the question whether the research will ever be practically useful: the criterion is whether the hypothetical practical applications are the main driving force behind the research.

Societies may be more interested in pure science or less interested in pure science. The more they are interested in pure science, the more money they are willing to pay for pure science. A part of this money is going to pure science that is only studied as pure science; another part will end up in fields that are partly pure and partly applied.

Chad Orzel thinks that if America saves half a billion dollars on the initial stages of the ILC collider, low-energy physics will get an extra half a billion dollars. I think he is wrong. The less a society cares about pure science - even about the most fundamental questions in pure science, such as those in high-energy physics - the less it is willing to pay for other things without predictable practical applications or entertainment value. Eliminating high-energy experimental physics in the U.S. would be a step towards the suppression of experimental pure science in general.

Iran may nuke Czechia, Italy, Romania

According to Haaretz, Iran has just received a first batch of BM-25 missiles from its ally in the Axis of Evil, namely North Korea. They are able to carry nuclear warheads and attack countries such as the Czech Republic, Italy, and Romania.

Such a conflict is not hard to start. Imagine that sometime in the future, for example on August 22nd, 2006, Iranian troops suddenly attack Romanian oil rigs on their territory. Romania will respond nervously - and the mad president of Iran will have an opportunity to check out his nukes.
The Czech Republic is, together with England, one of two European countries on an Iranian blacklist of countries whose citizens are not allowed to get a 15-day visa for Iran. Some Muslims in the Czech Republic preach that Islamic Shari'a law should be adopted by Czechia.

Diplomatic relations between Czechia and Iran cooled down 8 years ago when Radio Liberty (known in Iran as Radio Tomorrow) started to broadcast anti-government programs in Persian from Prague. See here.

US told to invest in particle physics

PhysicsWeb.org

The National Academy of Sciences has also recommended that the U.S. invest in neutrino experiments and high-precision tests of the Standard Model to stop the center of mass of particle physics from drifting away from the U.S.

New York Times

Dennis Overbye from the New York Times describes the same story: the ILC must be on American soil. See also Physorg.com and Nature.

CERN new tax

Meanwhile, CERN has adopted the digital solidarity principle: 1% of ICT-related transactions must be paid to CERN.

Wednesday, April 26, 2006

Pomeron

Matt Strassler has just described their fascinating work on
with Richard Brower, Chung-I Tan, and Joe Polchinski. Return 40 years into the past. The research that eventually evolves into string theory is proposed as a theory of strong interactions: something that would be known as a failed theory of strong interactions for the following 30 years. Things only start to slowly change after the 1997 discovery by Juan Maldacena and a steady flow of new insights eventually leads to a nearly full revival of the description of strong interactions using a "dual" string theory, albeit this string theory is more complicated than what was envisioned in the late 1960s. QCD can be equivalently described as the old string theory with some modern updates: higher-dimensional and braney updates.

The basic concepts of Regge physics included the Regge trajectory, a linear relation between the maximum spin "J" that a particle of squared mass "m^2" can have and "m^2" itself; the slope - the coefficient "alphaprime" of the linear term "alphaprime times m^2" - is comparable to the inverse squared QCD scale. The dependence of "J" on "m^2" could a priori be a general Taylor expansion, but both experimentally and theoretically, the linear relation was always preferred.
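To make the linear relation concrete, here is a tiny numerical sketch; the trajectory parameters are my own illustrative choices (close to the commonly quoted rho-meson values), not numbers from Matt's talk:

```python
# Linear Regge trajectory: J(m^2) = alpha(0) + alphaprime * m^2.
# alpha0 and alphaprime below are illustrative, roughly the textbook
# rho trajectory (alphaprime in GeV^-2).
def regge_spin(m2, alpha0=0.5, alphaprime=0.9):
    return alpha0 + alphaprime * m2

# The rho meson (mass ~0.775 GeV, spin 1) should sit near the line:
print(regge_spin(0.775 ** 2))
```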

Note that "alphaprime" in "the" string theory that unifies all forces is a much, much smaller area than the inverse squared QCD scale (the cross section of the proton). We are talking about a different setup in AdS/QCD where four-dimensional gravity may be forgotten. This picture is not necessarily inconsistent with the full picture of string theory with gravity, as long as you appreciate the appropriately warped ten-dimensional geometry.

At this moment, you should refresh your memory of chapter 1 of the Green-Schwarz-Witten textbook. There is an interesting limit of scattering in string theory (a limit of the Veneziano amplitude) called the Regge limit: the center-of-mass energy "sqrt(s)" is sent to infinity while the other Mandelstam variable "t" - which is negative in physical scattering - is kept finite. The scattering angle "sqrt(-t/s)" therefore goes to zero.

In this limit, the Veneziano amplitude is dominated by the exchange of intermediate particles of spin "J". Because the indices from the spin must be contracted, the interaction contains "J" derivatives, and it therefore scales like "Energy^J". Because there are two cubic vertices like that in the simple Feynman diagram of the exchange type, the full amplitude goes like "Energy^{2J}=s^J" where the most important value of the spin "J" is the linear function of "t" given by the linear Regge relation above.

The amplitude behaves in the Regge limit like "s^J(t)" where "J(t)" is the appropriate linear Regge relation. You can also write it as "exp(J(t).ln(s))". Because "t=-s.angle^2", you see that the amplitude is Gaussian in the "angle". The width of the Gaussian goes like "1/sqrt(ln(s))" in string units. Correspondingly, the width of the amplitude Fourier-transformed into the transverse position space goes like "sqrt(ln(s))" in string units. That should not be surprising: "sqrt(ln(s))" is exactly the typical transverse size of the string that you obtain by regulating the "integral dsigma x^2" which equals, in terms of the oscillators, "sum (1/n)" whose logarithmic divergence must be regulated. The sum goes like "ln(n_max)" where "n_max" must be chosen proportional to "alphaprime.s" or so.
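The logarithmic growth of the regulated oscillator sum "sum (1/n)" is easy to verify numerically; a trivial sketch of mine, not part of the original argument:

```python
import math

# Regulated transverse size of the string: sum_{n=1}^{N} 1/n grows
# like ln(N) plus the Euler-Mascheroni constant (~0.5772).
def oscillator_sum(n_max):
    return sum(1.0 / n for n in range(1, n_max + 1))

for n_max in (100, 10_000):
    print(n_max, oscillator_sum(n_max), math.log(n_max))
```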

If you scatter two heavy quarkonia (or 7-7 "flavored" open strings in an AdS/CFT context, think about the Polchinski-Strassler N=1* theory) - which is the example you want to consider - the interaction contains a lot of contributions from various particles running in the channel. But the formula for the amplitude can be written as a continuous function of "s,t". So it seems that you are effectively exchanging an object whose angular momentum "J" is continuous.

Whatever this "object" is, you will call it a pomeron.

In perturbative gauge theory, such pomeron exchange is conveniently and traditionally visualized in terms of Feynman diagrams that are proportional to the minimum power of "alpha_{strong}" that is allowed for a given power of "ln(s)" that these diagrams also contain: you want to maximize the powers of "ln(s)" and minimize the power of the coupling constant and keep the leading terms. When you think for a little while, this pomeron exchange leads to the exchange of DNA-like diagrams: the diagrams look like ladder diagrams or DNA.

There are two vertical strands - gluons - stretched in between two horizontal external quarks in the quarkonia scattering states. And you may insert horizontal sticks in between these two gluons, to keep the diagrams planar. If you do so, every new step in the ladder adds a factor of "alpha_{strong}.ln(s)". You can imagine that "ln(s)" comes from the integrals over the loops.
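The way the factors of "alpha_{strong}.ln(s)" from the rungs exponentiate into a power of "s" can be seen in a toy resummation; this is a leading-log caricature of mine, not the full ladder kernel:

```python
import math

# Toy leading-log resummation: if each rung of the ladder contributes a
# factor x = a * ln(s), then summing over the number of rungs gives
#   sum_n x^n / n! = exp(a * ln s) = s^a,
# i.e. the logarithms exponentiate into a power-law shift of the exponent.
def ladder_sum(a, s, n_terms=60):
    x = a * math.log(s)
    return sum(x ** n / math.factorial(n) for n in range(n_terms))

print(ladder_sum(0.3, 100.0), 100.0 ** 0.3)
```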

What is the spin of the particles being exchanged for small values of "t", the so-called intercept (the absolute term in the linear relation)? It is a numerical constant between one and two. Matt essentially confirmed my interpretation that you can imagine QCD to be something in between an open string exchange (whose intercept is one) and a closed string exchange (whose intercept is two). The open string exchange with "J=1" is valid at the weak QCD coupling - it corresponds to a gluon exchange. At strong coupling, you are exchanging closed strings with "J=2".

For large positive values of "t", you are in the deeply unphysical region because the physical scattering requires negative values of "t" (spacelike momentum exchange). But you can still talk about the analytical structure of the scattering amplitude - Mellin-transformed from "(s,t)" to "(s,J)". For large positive "t", you will discover the Regge behavior which agrees with string theory well. Unfortunately, this is the limit of scattering that can't be realized experimentally. Nevertheless, for every value of "t", you find a certain number of effective "particles" that can be exchanged - with spins up to "J" which is linear in "t".

Negative values of "t" can be probed experimentally, and this is where string theory failed drastically in the 1970s: string theory gave a much too soft (exponentially decreasing) behavior of the amplitude at high energies, while the experimental data indicated a much harder (power-law) behavior. So now you can isolate two different classes of phenomena:
  • the naive string theory is OK for large positive "t"
  • the old string theory description of strong interactions fails for negative "t"; the linear Regge relation must break down here
But the old string theory only fails for negative "t" if you don't take all the important properties of that string theory into account. The most important property that was forgotten 35 years ago was the new, fifth dimension. The spectrum of particles - eigenvalues of "J" - is related to the Laplacian but it is not just a four-dimensional Laplacian; it also includes a term in the additional six dimensions, especially the fifth holographic dimension of the anti de Sitter space. And this term can become - and indeed, does become - important.

What is the spectrum of allowed values of "J" of intermediate states that you can exchange at a given value of "t"? Recall that each allowed value of "J" of the intermediate objects generates a pole in the complex "J" plane - or a cut whenever the spectrum of allowed "J" becomes continuous. For large positive "t", the spectrum contains a few (roughly "alphaprime.t") eigenvectors with positive "J"s, and a continuum with "J" being anything below "J=1". For negative values of "t", you only see the continuum spectrum (a cut) for "J" smaller than one.

Don't forget that the value of "J" appears as the exponent of "s" in the amplitude for the Regge scattering. We are talking about something like "s^{1.08}" or "s^{1.3}" - both of these exponents appear in different kinds of experiments and can't be calculated theoretically at this moment.

Matt argues convincingly that the Regge behavior for large positive "t", with many poles plus the cut below "J=1", is universal. The "empty" behavior at large negative "t" where you only see the continuum below "J=1" is also universal. It is only the crossover region around "t=0" that is model-dependent and where the details of the string-theoretical background enter. And they can calculate the spectrum of "J" as a function of "t" in toy models from string theory. They assume that the string-theoretical scattering in the AdS space takes place locally in ten dimensions, and just multiply the corresponding amplitudes by various kinematical and warp factors - the usual Polchinski-Strassler business.

The spectrum of poles and cuts in the "J" plane reduces to the problem of finding the eigenvalues of a Laplacian - essentially to a Schrödinger equation for a particle propagating on a line. You just flip the sign of the energy eigenvalues "E" from the usual quantum mechanical textbooks to obtain the spectrum of possible values of "J". And they can determine a lot of things just from the gravity subsector of string theory - where you exchange particles of spin two (the graviton) plus a small epsilon that arises as a string-theoretical correction.
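This reduction to a one-dimensional Schrödinger problem can be illustrated with a tiny finite-difference eigenvalue solver - a generic sketch of mine, not the authors' code; the potential, the grid, and the final sign flip from "E" to "J" are the moving parts:

```python
import numpy as np

def schrodinger_levels(potential, x_min=-10.0, x_max=10.0, n=1000, k=3):
    """Lowest k eigenvalues of H = -(1/2) d^2/dx^2 + V(x), discretized
    on a uniform grid with vanishing (Dirichlet) boundary conditions."""
    x = np.linspace(x_min, x_max, n)
    dx = x[1] - x[0]
    # Central-difference Laplacian gives a symmetric tridiagonal Hamiltonian.
    diag = 1.0 / dx**2 + potential(x)
    off = -0.5 / dx**2 * np.ones(n - 1)
    H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(H)[:k]

# Sanity check: the harmonic oscillator V = x^2/2 has levels 0.5, 1.5, 2.5, ...
levels = schrodinger_levels(lambda x: 0.5 * x**2)
```

In the Regge problem one would instead feed in the effective potential coming from the warped background and read off the discrete "J" values from the bound-state energies.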

For large positive "t", you obtain a quantum mechanical problem with a locally negative (binding) potential that leads to the discrete states - those that are seen on the Regge trajectory.

When all these things are put together, they can explain a lot about the physics observed at HERA. The calculation is not really a calculation from first principles because they are constantly looking at the HERA experiments to see what they should obtain. But they are not the first physicists who use these dirty tricks: in the past, most physicists were constantly cheating and looking at the experiments most of their time. ;-)

Rae Ann: Alien recycling

By Rae Ann, one of the four winners who have seen the #400,000 figure.

My first grader brought home some interesting EPA publications for school children. While I totally support teaching children to recycle and be mindful of wise use of resources, I think it's a little off to tell them that 'garbage leads to climate change'.
And what's with the little flying saucers and aliens (graphics in the publications)? What do they have to do with climate change and garbage?? One publication does open with the statement, "Space creatures might think the idea of reusing containers is an alien concept but here on Earth it's easy to keep an old jar out of the trash and give it new life." (That is a direct quote and the missing comma is their punctuation error.) Well, how does the government know that aliens don't recycle? Is it because they have left a bunch of their stuff here? Hmm? Sounds like a very prejudiced and discriminatory attitude to me. What is that teaching our kids about aliens??




I'm sure some homeschoolers would use this as another reason not to send their kids to public (government) schools, but I use things like this to open up a dialogue about different points of view and how it's good to analyze what you read and are told. It's a good lesson for the kids to learn that while something has some points of value it might also have questionable content.

Also, TVA uses methane gas from landfills to produce energy...

Tuesday, April 25, 2006

The Czech Fabric of the Cosmos

My friend Olda Klimánek has translated Brian Greene's book "The Fabric of the Cosmos" into Czech - well, I checked his work a bit, reading his translation twice - and the book was just released by Paseka, a Czech publisher, under the boring name "Struktura vesmíru" (The Structure of the Universe). The other candidate titles were just far too poetic.

I think he is a talented writer and translator and there will surely be many aspects in which his translation is gonna be better than my "Elegantní vesmír" (The Elegant Universe).

What I find very entertaining is the different number of pages of this book (in its standard hardcover editions) in various languages:
  • Czech: Struktura vesmíru, 488 pages
  • Polish: Struktura kosmosu, 552 pages
  • English: The Fabric of the Cosmos, 576 pages
  • Portuguese: O tecido do cosmo, 581 pages
  • Italian: La trama del cosmo, 612 pages
  • French: La magie du Cosmos, 666 pages
  • Korean: 우주의 구조, 747 pages
  • German: Der Stoff, aus dem der Kosmos ist, 800 pages

I am not kidding, and as far as I know, Olda's translation is complete. If you need to know, 800/488 = 1.64. ;-) The Czech Elegant Universe was also much shorter than the German one but the ratio was less dramatic.

I like the rigid rules of German but this inflation of the volume is simply off base. The Czech language has similar grammar rules but it avoids articles and has a much freer word order. A slightly more complex system of declension removes many prepositions. And Olda may simply be a more concise translator. :-)

Uncle Al: on the equivalence principle

By Uncle Al who has submitted a #400,000 screenshot.

Does the Equivalence Principle have a parity violation? Weak interactions (e.g., the Weak Interaction) routinely violate parity conservation. Gravitation is the weakest interaction.

Either way, half of contemporary gravitation theory is dead wrong.


Gravitation theory can be written parity-even or parity-odd; spacetime curvature or spacetime torsion. Classical gravitation has Green's function Newton and metric Einstein or affine Weitzenböck and teleparallel Cartan. String theory has otherwise and heterotic subsets. Though their maths are wildly different, testable empirical predictions within a class are exactly identical...


...with one macroscopic disjoint exception: Do identical chemical composition local left and right hands vacuum free fall identically? Parity-even spacetime is blind to geometric parity (chirality simultaneously in all directions). Parity-odd spacetime would manifest as a background pseudoscalar field. The left foot of spacetime would be energetically differently fit by a sock or left shoe compared to a right shoe.


String theory could be marvelously pruned. Does a single crystal solid sphere of space group P3(1)21 quartz (right-handed screw axes) vacuum freefall identically to an otherwise macroscopically identical single crystal solid sphere of space group P3(2)21 quartz (left-handed screw axes)? Both will fall along minimum action paths. In parity-odd spacetime those local paths will be diastereotopic and measurably non-parallel -- a background left foot fit with left and right shoes.

A chiral background pseudoscalar field diverges Big Bang evolution of matter and antimatter. It sources biological homochirality (exclusively left-handed chiral protein amino acids and right-handed chiral sugars). At a 10^(-10) difference/average or smaller level it is consistent with 420+ years of physical observations.

The world has lots of Eötvös balances with 10^(-13) difference/average sensitivity. The proper challenge of geometric gravitation is test mass geometry. More than 2100 tonnes of single crystal quartz are grown annually. Somebody should look.

Monday, April 24, 2006

Frank Wilczek: Fantastic Realities

Technical note: Anyone who was visitor number 400,000 and who submits a URL with a screenshot proving the number today will be allowed to post an article on this blog, up to 6 kilobytes. The reader #400,000 was Rae Ann who had just returned from a trip - what timing. :-) Uncle Al still had the page open (after a reload) when it was showing #400,000, much like Doug McNeil. I have no way to tell who was the first one. The others just reloaded the page and obtained the same number because it was not their first visit today, and it thus generated no increase of the counter. Congratulations to all three.


Yes, I just saw this irresistible book cover at Betsy Devine's blog. The book is called "Fantastic Realities" and Frank Wilczek is apparently using a QCD laser on the cover. The journeys include many of Wilczek's award-winning Reference Frame columns. Have you heard of Wilczek's Reference Frame columns in Physics Today? Let me admit that I have not. ;-)

Because of the highly positive reviews, your humble correspondent has just decided to double the number of copies that Frank Wilczek is going to sell. Right now, yesterday's amazon.com rank is 100,000 and today's rank is 130,000. Look at the promotional web pages of the book, buy the book, and see tomorrow what it does to the rank. Remember that the rank is approximately inversely proportional to the rate at which the book sells.

Update: At 7:00 p.m., the rank was about 11,000, better than 136,000 in the morning. On Wednesday 8:30 a.m., the rank was 9,367, an improvement by a factor of fifteen from the rank 24 hours earlier.

The promotional web pages also reveal that Betsy is proud to be the 4th Betsy found by Google. Congratulations, and I wish her to capture the most important Frank Wilczek blog award, too. ;-)

Bruce Rosen: brain imaging

Bruce Rosen started the colloquium by saying that it is useful to have two degrees - a PhD and an MD - because every time he gives a talk to physicians, he may impress them with physics, and every time he speaks in front of physicists, he may impress them with medicine. And he did.

Although there are many methods to study the anatomy and physiology of the brain - such as EEG and/or flattening the brain by a hammer which is what some of Rosen's students routinely do - Rosen considers NMR to be the epicenter of all these methods. (This is a conservative physics blog, so we still refer to these procedures as NMR and not MRI.) This bias should not be unexpected because Rosen's advisor was Ed Purcell.

Some of the results he showed were obtained by George Bush, who is an extremely smart scientist as well as a psychiatrist, besides being a good expert in B-physics.

Rosen showed a lot of pictures and video sequences revealing how the activity of the brain depends on time in various situations, on the presence of various diseases, on age, and on the precise way the brain is being monitored. Many of these pictures were very detailed, and methods already exist to extract useful data from the pictures and videos that can't be seen by the naked eye.

Human brains are observed at 10 Tesla or so, and a magnetic field of 15 Tesla is the state-of-the-art environment to scan the brains of smaller animals. The frequency used in these experiments is about half a gigahertz. Many tricks have been found to drastically reduce the required amount of drugs that the subject must take before the relevant structures can be seen.
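The "half a gigahertz" is just the proton Larmor frequency "f = gamma B" - a standard NMR number, included here as my own back-of-the-envelope check:

```python
# Proton Larmor frequency f = gamma * B for the field strengths quoted above.
GAMMA_PROTON = 42.577e6  # Hz per Tesla (proton gyromagnetic ratio / 2 pi)

def larmor_frequency(field_tesla):
    """NMR resonance frequency (Hz) of protons in a field of `field_tesla`."""
    return GAMMA_PROTON * field_tesla

f_human = larmor_frequency(10.0)   # ~426 MHz at 10 T (human scanners)
f_animal = larmor_frequency(15.0)  # ~639 MHz at 15 T (small-animal scanners)
```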

Most of the data comes from observations of water, which is the dominant compound in the human body - and not only the human body. It turns out that the blood that carries oxygen and the blood that carries carbon dioxide are diamagnetic and paramagnetic, respectively. That simplifies the NMR analysis considerably.

There's a lot of data in the field and fewer ways to draw the right conclusions and interpretations out of the data.

Other takes on the past week's events

David Sirota on his visit to Deep-In-The-Hearta:

When I boarded the plane this past Friday to head to Austin, Texas, I didn’t know what to expect. As I told some folks while I was down there, usually when I think of Texas I think of George W. Bush - not a great image. But now that I am back from the weekend, I am inspired - there is some real progressive energy down there, both in Austin, and in Texas as a whole. You can see some video here and hear some audio here from the big panel event the Texas Observer held for Hostile Takeover - it featured me, Congressman Lloyd Doggett, Molly Ivins, author Robert Bryce and Texans for Public Justice’s Craig MacDonald.

During the weekend I had a chance to meet progressives like David Van Os, the Democratic nominee for Texas Attorney General, state Rep. Garnet Coleman, state Rep. Elliott Naishtat, and the hard-working members of the Progressive Populist Caucus. The disgust with what the right-wing is doing to Texas is palpable - and I came away from my Lone Star State experience feeling hopeful that that state does not have to stay red forever.


Joy Demark, via Vince Leibowitz, on the SDEC meeting where Boyd Richie was selected interim chair of the Texas Democratic Party, also in Austin this weekend past:

It is painfully obvious that most Republicans are planning to starve public schools out of existence in Texas. They are going about it in a most vicious and systematic manner.

The Texas Young Democrats met and elected officers in Austin this weekend -- that was the third thing happening.

DKS, posting at the new Texas Kos, on the Filibuster for Education (just a week in the rear-view mirror):

Imagine yourself just after sundown, sitting on the grass in front of the steps of the state capitol, and a country lawyer is talking about the state constitution. Soon, he asks a question of you, wanting to know what you think of a particular point, or wanting to know if you have a story about what he's discussing. You find, after a few second's thought, that you do have something to say, and you slowly begin to articulate an idea or experience, or even another question. Something very deep inside you begins to emerge, and you find that what you have to say in response, opens up a spring in others. You feel intensely engaged; learning, teaching, questioning. You are not there to score points, to show how smart or informed you are. You along with the 20 or so others and that country lawyer, are, you suddenly realize, practicing democracy. Then it hits you - Elliot Shapleigh is over there on the grass, listening, as is Maria Luisa Alvarado, a tourist from Pakistan, a capitol guard, a few students, a bag lady, and sundry assorted folks - they're all listening to you. And the warm, breezy dark is like a caress, the bottle of cold water somebody hands you is like the finest bourbon, you can smell newly-mown grass, and you remember that this is the real world you want to make better; protecting this for the future is why your presence is so important to these others, yourself and our posterity.


Vince also has a nice summary of public school financing in Texas, and has been live-blogging the special session. Don't miss it.

OVV in higher dimensions?

Brett McInnes proposes a generalization of the Hartle-Hawking approach to the vacuum selection problem - pioneered by Ooguri, Vafa, and Verlinde (OVV) and described in this blog article - to higher dimensions. McInnes identifies the existence of two possible Lorentzian geometries associated with one Euclidean geometry as the key idea of the OVV paradigm. He argues that the higher-dimensional geometries must have flat compact sections, which is certainly a non-trivial and possibly incorrect statement.

Everything you wanted to know about Langlands

... geometric duality but were afraid to ask can be answered in this 225-page-long paper by Edward Witten and Anton Kapustin. Several previous blog articles have also discussed the Langlands program.

A semi-relevant discussion about related topics occurs at Not Even Wrong.

In the article, one learns that most of the miraculous results in mathematics that go under the name "geometric Langlands program" follow from a small portion of a decoupling limit of a topologically twisted background of string theory. ;-)

In the fast comments under this text, X.Y. argues that the non-trivial Langlands program is the arithmetic one, not the geometric one.

Sunday, April 23, 2006

Translation and related news

Just a technical detail: I've added two utilities to the web pages of individual articles:
  • related news and searches, powered by Google (blue box under each article)
  • translations of the blog articles to German, French, and Spanish, powered by Google (three flags at the top of the articles)

I apologize to the readers from the remaining 142 countries that also visit this website - according to the Neocounter - besides the three countries indicated above: their languages have yet to be included. :-)

Recent comments

Also, "recent comments" were added to the sidebar of the main page. The recent slow comments in the lower Manhattan (skyscraper) area are sorted according to the corresponding article. You may find out which article the comment belongs to if you hover over the timestamp. You can also click it.

There are also ten "recent fast comments" in a scrolling window at the upper portion of the sidebar.

Leonard Susskind Podcast

I am just listening to a podcast with Leonard Susskind. You can find the link somewhere on this page; I will add it here later. Then you click "Podcasts" on the left side, and the second one is Susskind: the 5.57 MB file is 23:55 long. Entertaining, recommended.




See another MP3 with the Godfather of string theory.

Saturday, April 22, 2006

Manic Miner

Manic Miner...

The Manic Miner flash game was removed from the page because it was making a lot of noise. Please click the second "Manic Miner" link.

How many people used to play such things 20 years ago? Links to previous flash games on this blog can be found here.

PageRank algorithm finds physics gems

Several colleagues from Boston University and from Brookhaven have proposed a method to look for influential papers using the same algorithm that Google uses to rank web pages. This algorithm uses the list of web pages (or papers) and the links between them (or citations) as input. The web pages or papers are nodes of a graph and the citations are oriented links. It works as follows:

You have lots of "random walkers". Each of them sits at some paper XY. In each step, each random walker either jumps to a random paper in the full database, with probability "d", or it jumps to a random paper mentioned in the references of the previous paper XY, with probability "1-d". Once the number of random walkers associated with each paper reaches equilibrium (approximately), the algorithm terminates. The number of walkers at each paper gives you the rank.
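The stationary distribution that these walkers converge to can be computed deterministically; here is a toy sketch of mine (not the authors' code), with the teleportation probability "d" exactly as in the description above:

```python
import numpy as np

def paper_rank(citations, n_papers, d=0.15, tol=1e-12):
    """Stationary distribution of the random-walker process described above.

    `citations` maps each paper to its reference list.  With probability d
    a walker jumps to a random paper; with probability 1-d it follows a
    random reference of its current paper.  Papers with an empty reference
    list have nowhere to send their walkers, so those jump randomly too.
    """
    ranks = np.full(n_papers, 1.0 / n_papers)
    while True:
        new = np.full(n_papers, d / n_papers)
        for paper in range(n_papers):
            refs = citations.get(paper, [])
            share = (1.0 - d) * ranks[paper]
            if refs:
                for r in refs:
                    new[r] += share / len(refs)
            else:
                new += share / n_papers
        if np.abs(new - ranks).sum() < tol:
            return new
        ranks = new

# Toy citation graph: papers 1 and 2 cite paper 0; paper 3 cites papers 1 and 2.
ranks = paper_rank({1: [0], 2: [0], 3: [1, 2]}, 4)
```

On this toy graph, the uncited paper 3 ends up last and the doubly-cited paper 0 first; note also that a citation from a paper with a short reference list transfers more rank than one buried in a long bibliography.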

As Alex Wissner-Gross has explained to me, the algorithm is equivalent to a Monte Carlo approach to the diagonalization of the adjacency matrix.

This method differs from the naive, straightforward sum of all citations in two basic respects:
  1. In the PageRank framework, it is better to be cited by influential papers.
  2. In the PageRank framework, it is better to be cited by papers with a small number of references, i.e., to be rather special among these references.
When the colleagues applied this method to a graph of 300,000+ papers, they discovered some truly undercited gems. Two of the four most spectacular ones are discussed below.

The first paper was written by Eugene Wigner, whose statement that particles transform as representations of the Lorentz group has been under severe attack by algebraic quantum field theorists, and by Frederick Seitz, an eminent scientist who was recently under a heavy and immoral attack by anonymous terrorists, fashionable communists, and envirofascists.

The second paper, by Feynman and Gell-Mann, revealed the V-A (vector minus axial vector) structure of the four-fermion weak interactions. The paper is also important as an example of how very good theorists can be right even when renowned experimenters say that the theorists are wrong: the renowned experimenters argued that the weak interactions were S-T (scalar or tensor exchange) but they made an error: their conclusions were based on the last point of their graph, which is why Feynman identified them - in an explosion of anger - as morons as soon as he opened the experimental article in Phys. Rev. ;-)

Via PhysicsWeb.ORG.

Illinois: Particle Accelerator Day

Illinois' governor has declared this Saturday (or Friday?) to be Particle Accelerator Day, and everyone must celebrate. Mr. Blagojevich is trying to attract the future ILC linear collider to his state. Congratulations, Argonne and Fermilab. PhysOrg.com illuminates some ILC attempts of these two facilities here. Meanwhile, on the same day, the celebrations of Earth Day, invented by John McConnell, dominate in Massachusetts. Those who are already fed up with the Earth - and with Google Earth - may try Google Mars.

Via JoAnne.

Back to Austin today

for a little party business, a meeting with a leading progressive blogger (and one of my personal favorites), and some of that Sixth Street magic.

More with pictures to come.

Friday, April 21, 2006

Detlev Buchholz: algebraic quantum field theory

Prof. Detlev Buchholz who is a rather famous researcher in the algebraic quantum field theory community has given the duality seminar today and we had a tête-à-tête discussion yesterday.

He has attempted to convert the string theorists to the belief system of algebraic quantum field theory which is not a trivial task. Algebraic quantum field theory is a newer version of the older approach of axiomatic quantum field theory.



In this approach, the basic mathematical structure is the algebra of bounded operators acting on the Hilbert space. In fact, for every region R, you can find and define a subalgebra of the full algebra of operators, they argue. A goal is to construct - or at least prove the existence of - quantum field theories that do not depend on any classical starting point.

This is a nice goal. Because the string theorists know S-dualities and many other phenomena in field theory and string theory which imply that a quantum theory can have many classical descriptions - more precisely, it can have many classical limits - we are certainly open to the possibility that we will eventually be able to formulate our interesting theories without any direct reference to a classical starting point. Instead, we will be able to derive all possible classical limits of string/M-theory from a purely non-classical starting point.

On the other hand, the particle physics and string theory communities are deeply rooted in experimental physics and we simply do not want to talk about some abstract concepts without having any particular theory that can at least in principle predict the results of experiments and that respects these concepts. In fact, we want to focus on theories of the same kind that are relevant for observational physics.




There is some kind of trade-off going on here: you want to be general, but on the other hand, you want to learn or predict something relevant about the observed physical phenomena and make progress.

Having a classical limit makes things extremely concrete and calculable which is one of the reasons why quantum theories constructed as a quantization of a particular starting point have been so successful, especially whenever they're weakly coupled. There exist extremists who would like to ban any physics that is based on a classical starting point or a background - and Prof. Buchholz is arguably not one of them.

As you can see, this trade-off between generality and relevance is completely analogous to the question of background independence. We would eventually like to have a theory that allows all types of backgrounds we need - but on the other hand, we must know that our assumptions still allow for the interesting theories that are relevant for experiments. And moreover, we must still be able to find new and true insights about these theories.

As you can guess, your humble correspondent thinks that both the loop quantum gravity community as well as the algebraic quantum field theory community are far from the ideal equilibrium: their hypothetical idealized theories are so carefully disconnected from perturbative expansions around the classical limits and from backgrounds that they are also rather safely disconnected from observationally relevant insights.

Prof. Buchholz has introduced us to some basic axioms of algebraic quantum field theory. The more concrete part of the talk was a mathematical proof of the existence of a particular infinite-dimensional family of integrable but interacting two-dimensional theories. Because I am using the word "theory" with a less general meaning than the meaning in algebraic quantum field theory, the term "certain formulae" would seem as a more accurate description to me. My knowledge of these rigorous tools was not sufficient for me to understand in which sense the proof was non-trivial and I cannot comment on this particular issue.

Differences

There are four main groups of topics that are answered and interpreted very differently by the AQFT community and the conventional high-energy physics community:
  • the ultraviolet physics
  • the infrared physics
  • the physics in between
  • the philosophy of physics
Let me discuss them one by one.

Ultraviolet physics

A conventional 21st century particle physicist or string theorist would tell you that the methods of the renormalization group are the key to understanding the actual interrelations between short-distance physics and long-distance physics: the details of short-distance physics only influence long-distance physics through a small number of parameters. These parameters can be determined without exact knowledge of the short-distance physics. Physical phenomena can be organized according to the length scale or the energy scale. The details of short-distance physics, especially the coefficients of very high dimension operators, simply become unimportant or irrelevant at long distances.

This system of thinking allows us to determine what the experiments actually tell us and what they don't. Also, the fact that the laws are organized according to the scale explains why humans could have made gradual progress in the first place. This philosophy also puts the ultraviolet, short-distance divergences in the proper light: these divergences are not real physical inconsistencies because they can be cured by some kind of short-distance physics or a regulator.

In renormalizable theories, one can show that physics at energies well below the cutoff scale - the energy scale above which the theory breaks down - is determined just by a few (finitely many) parameters. If this is the case, a predictive theory exists; otherwise it does not. These principles are very important for us. Even in string theory which eliminates ultraviolet divergences completely and which is valid and exact at arbitrarily short distance scales, we find the renormalization group important. The insights of the renormalization group are also important in condensed matter physics. The renormalization group equations can be related to the dependence on the holographic dimension in the AdS/CFT correspondence.
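As a concrete illustration of this scale-dependence, here is the textbook one-loop running of the QED coupling with a single charged lepton in the loop - a standard formula, not anything specific to the seminar:

```python
import math

ALPHA_0 = 1 / 137.036   # fine-structure constant at the electron mass scale
M_E = 0.000511          # electron mass in GeV
M_Z = 91.1876           # Z boson mass in GeV

def alpha_qed(q, alpha0=ALPHA_0, m=M_E):
    """One-loop running QED coupling with only an electron in the loop:
    alpha(q) = alpha0 / (1 - (alpha0 / 3 pi) * ln(q^2 / m^2))."""
    return alpha0 / (1 - alpha0 / (3 * math.pi) * math.log(q**2 / m**2))

# The coupling grows logarithmically with the energy scale:
a_mz = alpha_qed(M_Z)   # roughly 1/134.5 with only the electron loop
```

Short-distance details enter such low-energy formulas only through the measured value of the single parameter "alpha0", which is the point of the paragraph above.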

Prof. Buchholz has strengthened my feeling that these RG insights play no role in algebraic quantum field theory; they seem to be entirely ignored. And yes, I even think that the concept of the idealized operator algebra is inconsistent with the insights of the renormalization group. The operators in quantum field theory are only finite with respect to a particular renormalization mass scale; they're well-defined after we choose a renormalization scheme. The rigorous operator algebra seems to be an attempt to remove this dependence - i.e. to return to the naive picture where the cutoff is effectively sent to infinity. I feel that it can't work, not even in UV complete theories. Theories in low spacetime dimensions may be an exception.

Note that in conformal field theories, the correlator of two operators of the same dimension "Delta" goes like the distance of the two points to the power of "-2Delta". Composite operators have anomalous dimensions and can't be interpreted as simple products of the "elementary" operators in some algebra. The product of two operators at exactly coinciding points is always a singular object. Such a singular product occurs even if you multiply two operators that are averaged over some regions of space. A choice of a UV cutoff or a renormalization scale is necessary to define products in an algebra. A power of this scale gives the products the right dimensions.
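In the usual notation, the statements above read (textbook conformal field theory, with "C" and "c_k" constants):

```latex
\langle \mathcal{O}(x)\,\mathcal{O}(y)\rangle = \frac{C}{|x-y|^{2\Delta}},
\qquad
\mathcal{O}(x)\,\mathcal{O}(y) \;\sim\; \sum_k \frac{c_k}{|x-y|^{2\Delta-\Delta_k}}\, \mathcal{O}_k(y)
\quad (x \to y),
```

so products of operators blow up as the points coincide, and a cutoff or renormalization scale is needed to absorb the extra powers of distance.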

An important group of questions in algebraic quantum field theory is apparently the choice of the test functions that define how the operators are smeared over finite regions. The researchers seem to worry about various "paradoxes" such as the ability to find negatively-normed states in the perturbative expansion with a sufficiently pathological choice of the test functions.

I find all these constructions completely unphysical. The existence of such pathological constructions does not mean that there is anything wrong with the theories because the pathological test functions don't correspond to anything that can occur in the experiments, not even in principle. In field theory, even if it is UV complete, we should first define a cutoff and we should never consider test functions that are changing drastically at distances shorter than the cutoff. This implies no limitation of predictivity because the cutoff energy scale can be chosen arbitrarily high and the measurable results at finite energies can be shown to be independent of the cutoff as long as the energy cutoff is high enough.

Infrared physics

According to conventional 21st century particle physics as we teach it, the nature of infrared divergences is very different from the character of the ultraviolet ones. They cannot be eliminated by a more careful definition of the laws of physics: the infrared divergences are real. They inform us that we have asked a wrong question. If you collide an electron with a positron and tell them to annihilate, you produce two hard photons but the one-loop diagrams contributing to this production are infrared-divergent.

What do you do with this infinity? You realize that you have asked a wrong question. You should not ask what the cross section is to create two photons from the initial electron-positron pair. You should ask what the cross section is to create two photons plus an arbitrary number of additional soft photons whose energy is so small, below "epsilon", that you can't really detect them.

Once you admit that you can't measure the photons whose energy is below "epsilon", you eliminate the infrared divergences from the loop diagrams, and the part that diverges for "epsilon" approaching zero actually cancels against the cross section where you produce two visible photons plus one additional invisible photon whose energy is below "epsilon". If you don't like this "epsilon", you should realize that its value can be chosen arbitrarily small. If you choose any finite "epsilon", the infrared divergences disappear. The measurable results are independent of "epsilon".
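Schematically (standard Bloch-Nordsieck logic, with "A" a process-dependent positive coefficient), the one-loop virtual correction and the real soft emission below "epsilon" combine as

```latex
\sigma_{2\gamma}^{\rm virt} \;\simeq\; \sigma_0\left(1 - \frac{\alpha}{\pi}\, A\, \ln\frac{E}{\epsilon}\right),
\qquad
\sigma_{2\gamma+\gamma_{\rm soft}<\epsilon} \;\simeq\; \sigma_0\, \frac{\alpha}{\pi}\, A\, \ln\frac{E}{\epsilon},
```

and their sum - the only thing a detector with resolution "epsilon" can measure - is finite and "epsilon"-independent at this order.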

For any finite "epsilon", the infrared problems are absent. In spacetimes of higher dimensionality, these infrared problems do not arise at all. They don't arise even though the higher-dimensional theories are closely related to the four-dimensional ones; the latter can be thought of as compactifications of the former. The infrared divergences in "d=4" don't mean that the internal mathematical structure of the theories in "d=4" and "d=6" is completely different; they just mean that we must carefully choose the questions and the nature of the right questions may depend on the dimension.

At any rate, if you ask physically meaningful questions with a cutoff, these problems become non-problems. Moreover, you will always be allowed to assume that the Feynman diagrams with additional vertices are suppressed relative to the diagrams with fewer vertices: they are suppressed by a power of the fine-structure constant. The perturbative expansion in nice theories such as QED works whenever the coupling constant is small and whenever we ask meaningful questions. Prof. Buchholz seems to disagree. Algebraic quantum field theorists seem to think that the perturbative expansions are lethally ill in some sense, and I have not understood the justification of this feeling.

Both in the case of the ultraviolet as well as the infrared divergences, the fact that the loop diagrams diverge means that you must deal with these divergences properly, and if you do so, the higher-order terms will always be suppressed by powers of the fine-structure constant because all the coefficients will become finite. Tree level diagrams, whenever they are non-zero, dominate over the loop diagrams in all physically meaningful observables.
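The suppression is dramatic in practice. With invented order-one coefficients (the values below are purely illustrative, not actual QED coefficients), each extra loop costs a factor of roughly alpha ~ 1/137:

```python
# Toy perturbative series: once the coefficients are rendered finite,
# the n-th order term scales like alpha^n. The coefficients c_n here
# are made up for illustration; only their order-one size matters.
alpha = 1 / 137.035999
coeffs = [1.0, 0.5, 1.2, 0.8]  # hypothetical finite coefficients
terms = [c * alpha**n for n, c in enumerate(coeffs)]
print(terms)
```

Every successive term is smaller by roughly two orders of magnitude, which is why the tree-level diagrams dominate whenever they are non-zero.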

The algebraic quantum field theory community seems to disagree. Just because a loop contribution is infrared divergent, they assume that the "infinite" higher-order terms dominate over the lower-order terms. They think that the higher-order terms make the theory inconsistent; they make the perturbative expansion break down; and some actual correlators become either zero or infinity. All these conclusions are incorrect, I think. Things can only become zero or infinity if we ask unphysical questions. And we should never do so.

There also exists another "lore" in algebraic quantum field theory that I find irritating: the infraparticle. Unfortunately, with all my respect for Prof. Buchholz, he is associated with this concept. The statement is that one-particle (one-electron) states that are contained within the same superselection sector of the Hilbert space can't form a representation under the Lorentz group, just because of the existence of soft photons that are produced in most realistic processes.

Of course, I disagree with this description. Wigner's classification of one-particle states is an absolutely robust quantum conclusion that works at any coupling and does not depend on the perturbative expansions. Start with any kind of dirty electron you want. Wait for a long enough time. Every photon that can escape to infinity will escape. In the limit, you end up with the electron, a single-particle state, with the smallest possible energy. Put an arbitrarily large box around the electron - one that is still smaller than the distance light can travel during the waiting time. What you end up with is an arbitrarily good approximation of the Minkowski space.

The punch line is that you can isolate "pure" one-electron states in the Minkowski space. For two electrons very far from each other, you can also find, by locality, a two-particle state made out of two "pure" electrons as a tensor product. For nearby particles, the simple tensor product construction won't be accurate because of their interactions but there is a construction of the right Hilbert space whose basis can be chosen to be a Fock space of all particles and their bound states.

We always consider initial states that contain the pure particle states without any kind of "soft cloud" that would raise their energy. Many soft massless particles may be contained in the final state, and we deal with them by introducing the infrared cutoff as explained above. If you add analogous extremely soft particles (below the measurable threshold) to the initial state, you can show that measurable physics won't be affected either. There is no open question here and there is no room for violations of Wigner's unitary action of the Lorentz symmetry on one-particle states.

The question of soft photons can be confusing - but it can also be completely comprehensible and meaningfully answered. In algebraic quantum field theory, the first alternative occurs; in conventional 21st century particle physics, it is the second alternative that we believe to be correct. I prefer the second alternative, too: things should be made clear, especially when they can be. Incidentally, these "soft photon clouds" could also obscure the uniqueness of the vacuum itself.

Cumrun Vafa was surprised that Prof. Buchholz did not include the uniqueness of the vacuum among his axioms - and instead offered slightly confusing comments about the possibility to work with "mixed states" containing "different vacua". I find mixed states constructed from vacua in different superselection sectors very problematic.

Physics in between

There are differences in the understanding of ultraviolet and infrared effects. But there are differences in between, too. Rajesh Gopakumar asked the same question as I did yesterday: is the algebraic quantum field theory formalism powerful enough to include gauge theories - quantum field theories whose importance for 21st century physics can't be overstated? Does the formalism allow gauge-non-invariant operators as intermediate results that are eventually removed by imposing gauge invariance and/or BRST invariance? Is there room for the Faddeev-Popov ghosts? If these things are not allowed and if we must work with gauge-invariant operators only, how could we ever derive the contributions of the Faddeev-Popov ghosts to one-loop quantities such as the beta-function?

I think that the answer that both of us obtained has made it rather clear that the field of algebraic quantum field theory has no idea how to deal with these questions that have become routine in high-energy physics. Similar comments apply to other important phenomena such as gauge anomalies. How would you ever prove, in the context of algebraic quantum field theory, that the Standard Model is anomalous without the leptons? Note that this question can be formulated without any reference to a particular classical Lagrangian: we can describe the theory by its spectrum of spin 1/2 and spin 1 massless particles (let's avoid the Higgs mechanism in this gedanken experiment). The diagrams that calculate the anomaly can look like perturbative diagrams but it is also true that the one-loop contribution is actually exact. There are no further corrections.
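The arithmetic behind the "Standard Model without leptons" example is elementary and can be checked directly. A sketch of the cubic [U(1)_Y]^3 anomaly for one generation, using the standard hypercharge assignments (convention Q = T3 + Y), counting left-handed Weyl fermions with +1 and right-handed ones with -1:

```python
from fractions import Fraction as F

# Each entry: (number of states, hypercharge Y, chirality sign).
quarks = [
    (6, F(1, 6), +1),   # left-handed quark doublet: 3 colors x 2 components
    (3, F(2, 3), -1),   # right-handed up quark: 3 colors
    (3, F(-1, 3), -1),  # right-handed down quark: 3 colors
]
leptons = [
    (2, F(-1, 2), +1),  # left-handed lepton doublet
    (1, F(-1, 1), -1),  # right-handed electron
]

def tr_y3(fermions):
    """Chirality-weighted trace of Y^3, the [U(1)_Y]^3 anomaly coefficient."""
    return sum(sign * n * y**3 for n, y, sign in fermions)

print(tr_y3(quarks))            # quarks alone: -3/4, the theory is anomalous
print(tr_y3(quarks + leptons))  # full generation: 0, the anomaly cancels
```

The quark sector alone contributes -3/4 and the lepton sector +3/4, so removing the leptons makes the theory inconsistent - a non-trivial condition on the spectrum that any adequate framework should be able to reproduce.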

Moreover, the concept of local operators and operators that can be uniquely associated to spacetime regions seems to contradict many insights about dualities. Strong-weak dualities typically interchange elementary excitations with solitons. Elementary excitations are created by local operators but solitons are non-local, extended configurations or solutions. It is clear that at a generic value of the coupling constant, a democracy between the elementary excitations and the solitons arises and these states can't quite be localized. All of them must be slightly extended. And they are mutually non-local. We can estimate the size of the objects in many cases. The rules of the operator algebras seem too strict and too point-like. I feel that many particular developments in gauge theories find no support in algebraic quantum field theory and they are perhaps marginally inconsistent with it, at least in four dimensions and higher:
  • gauge invariance, the contributions of Faddeev-Popov ghosts in loops
  • spontaneous symmetry breaking
  • chiral symmetry breaking and supersymmetry breaking by fermionic bilinears
  • strong-weak duality
  • non-commutative extensions of field theories
  • anomaly cancellation as a non-trivial condition on the spectrum
  • and, of course, the existence of gravity, diffeomorphism symmetry, all of its extensions from string theory, and holography
Differences in philosophy

This leads me to the final category of differences: the differences in philosophy. I find the approach of algebraic quantum field theory dogmatic and disconnected from the actual experiments and from the principles whose importance may be deduced from the experiments. When we are working on theoretical physics, the contact with experiment is the ultimate judge. Because we are often working on questions that are hard to test by direct experiments, the contact cannot be direct.

But we still care about an indirect contact. Our theories must be predictive at least in hypothetical universes that are qualitatively analogous to ours. They must contain qualitatively analogous ingredients that turned out to be important for the description of actual physics around us: unitarity, gauge symmetries, diffeomorphism symmetries, Lorentz invariance, dependence of physics on the scale, the Higgs mechanism, gauge anomaly cancellation, and so forth. I think that the approach to physics in which these ingredients of our picture of the universe are treated as details that are less important than some untested "principles" - such as the existence of a cutoff-independent operator algebra or background independence - is a scientifically flawed approach.

Our experience with string theory teaches us that if we insist on the right principles, those that have actually been proved important by the experiments, we also obtain algebraically beautiful mathematics that fits together, gives us new and fresh ideas how to solve old puzzles, and allows us to answer questions that mathematicians found difficult, by applying our physics intuition.

I am afraid that the approach of algebraic quantum field theory is very different. It starts with some dogmas - and some of them are most likely not satisfied in the real world. As explained above, the approach seems to require that you can define all physical quantities without any reference to energy scales, the ultraviolet cutoff, and the infrared cutoff. It even insists that you must be able to smear the operator distributions with arbitrary test functions, and still expect nice results from your theories.

Many of these assumptions seem incorrect to me. Algebraic quantum field theory respects the principles of quantum mechanics - the principles that are relevant to calculate the "1/n^2" spectrum of the hydrogen atom. But it fails to respect the new subtle features that distinguish quantum field theory from simple quantum mechanics, especially those related to the correct role of cutoffs of both types and to gauge symmetries.

Also, the algebraic quantum field theory approach does not seem to ask the question whether we are asking the right questions or not; whether we are making reasonable assumptions; whether the mathematical structures that satisfy our axioms are interesting. Some axioms are probably legitimate and they are satisfied by the physically relevant and mathematically attractive theories. But these axioms may be too weak. Some other axioms that are added could be, on the contrary, too strong: they may be violated by the physically relevant and mathematically beautiful theories that we normally study.

It is very important to know whether our axioms have the secret power to lead us to new interesting theories that may describe reality and that will be praised as mathematically pretty in the future. It is very important to think whether the insights that seem to be true are true exactly or just approximately. The feedback mechanism is very important. If a direct feedback from the experimenters is impossible, we must rely on the feedback of theoretical experiments. I mean the work of colleagues who investigate particular examples and who can say whether a general framework is interesting - or whether it is too loose or too constraining.

The question which questions in physics are good questions is itself a scientific question, even though it may look vague. This meta-question must be studied without prejudices, by following the scientific method. This is the main reason why I feel that most of the conceptual assumptions of algebraic quantum field theory have been superseded by more concrete and more coherent insights, many of which have already been proved experimentally.

Evolving proton-electron mass ratio?

Update: In 2008, a new experiment with ammonia found no time-dependence in the last 6 billion years.
Klaus Lange has pointed out an article that describes a Dutch experiment performed primarily at the European Southern Observatory - hold your breath, this observatory is located in Chile. They measured the spectrum of molecular hydrogen, which depends on the proton-electron mass ratio "mu".

Note that this ratio is about 1836.15. Twenty years ago, I played with a calculator and noticed that this number can be written as

  • 6 · pi^5 = 1836.12.

This agreement promoted me to the king of all crackpots: with only three characters, namely six, pi, five, I could match around 5 significant figures of the correct result.

Actually, my calculator only had 8 significant figures (with no hidden digits), and I exactly matched the 8 significant figures of 1836.1515 written in the mathematical tables of that time.
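For the record, the numerology is trivial to reproduce; the reference value 1836.15267 below is the modern (CODATA-style) figure for the proton-electron mass ratio:

```python
import math

# The crackpot coincidence: 6 * pi^5 vs. the measured
# proton-electron mass ratio mu.
guess = 6 * math.pi**5   # about 1836.1181
mu = 1836.15267          # modern measured value

print(guess)
print(abs(guess - mu) / mu)  # relative deviation, roughly 2e-5
```

The relative deviation of about 2 parts in 10^5 is exactly the kind of accident one expects from scanning a few short formulas against a handful of significant figures.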

Later I learned that someone else had actually published this "discovery" fifty years earlier, and the agreement got worse with better calculators and better measurements in particle physics.

More seriously, the Dutchmen now claim that the ratio was 1.00002 times higher twelve billion years ago. The New Scientist immediately speculates that this could prove extra dimensions or string theory. I, for one, have absolutely no idea where this statement comes from. I personally believe that these constants have been constant for the last 12 billion years - and moreover, this opinion is completely and naturally compatible with string theory.
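To put the claim in perspective, a fractional change of 2 x 10^-5 spread over twelve billion years corresponds to a minuscule average drift rate (this is simple arithmetic, not a fit to the actual data):

```python
# Average fractional drift rate of mu implied by the claim.
delta = 2e-5      # claimed fractional change (ratio 1.00002)
t = 12e9          # elapsed time in years
rate = delta / t  # roughly 1.7e-15 per year

print(rate)
```

Any proposed mechanism - a rolling scalar field or anything else - would have to produce precisely this glacial pace of evolution without violating the many existing constraints on varying constants.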

But of course, if some additional serious experimenters besides our European colleagues from Chile confirm the result, we will have to give an explanation. We will either have to add a new rolling scalar field analogous to quintessence that couples to the Standard Model nontrivially, or we will have to think about some dramatic non-local cosmological influences on local physics. Both approaches seem rather implausible to me and because I can't think of a third one, my guess is that the result will go away.

Alternatively it may be explained by some conventional physics that I am not able to identify right now - such as strong magnetic fields in the environment.