Thursday, November 30, 2006

Windows Vista for businesses released



The animated counter above resembles Vista's Aero graphical user interface.

It's been five years since Microsoft released Windows XP. Right after 9/11/2001, I bought a computer with XP Home pre-installed, shortly before XP was officially released. It was rather clear that XP would be better than all previous operating systems, and most of us would agree that the expectations were confirmed. It has served well for five years, and the two Service Packs were the only major changes we received in that time.

Recall that Windows started as an application running under MS-DOS: Windows 3.1 reproduced some concepts from the Apple systems within a more flexible framework. The next major step was Windows 95, the first intrinsically Windows-based operating system. Smaller updates of Windows 95 were called Windows 98 and Windows ME, while Windows NT was a more stable branch of the Windows product that never became popular with consumers. Windows 2000 merged the accessibility of the Windows 95 family with the stable kernel of Windows NT. Finally, Windows XP brought a new design and many other things at the end of 2001.

The next edition of Microsoft's operating system was called Longhorn for some time, but the name Windows Vista was chosen instead in 2005. Now it's out for businesses, and consumers will be able to start buying it at the end of January 2007. About 100 million copies of Vista are expected to be sold in 2007, and Microsoft expects to sell about the same number of copies of Office 2007 and of the new server software.



Bill Gates is famously said to have claimed that 640 kB should be enough for everyone. Windows Vista expects 1,000,000 kB (roughly 1 GB of RAM) to run well, but the hardware requirements are not astronomical, and about 60% of the new computers sold in 2007 are expected to run Vista. Vista comes in several sub-editions that are somewhat more complicated than the editions of XP, and it sports a brand new design including various new animations and glassy semi-transparent windows.

Some of the features that Windows Vista offers:
  • Aero UI - the glassy frames of the windows
  • Instant search - based on indices
  • New Explorer
  • Address Space Layout Randomization - kills 99% of remote attacks by randomly relocating system code in memory
  • Security - improved Defender, Firewall, Security Center, Phishing Filter, Parental Control, Windows Update
  • Windows ReadyDrive - a support for hybrid hard disks by Samsung and others
  • Windows ReadyBoost - a mechanism that allows you to use a USB flash drive or memory card as a fast cache (faster than hard drives for random access) which can speed up your computer considerably: the boot time reportedly dropped from 43 seconds to 14 seconds when a 512 MB SD card was added to the system
  • Windows Experience Index - an automatic rating of your hardware
  • Windows SuperFetch - a system that learns when you are likely to use a program so that it loads it to the memory before you click and everything is faster
  • Disk defragmentation - automatic in Vista
  • Sleep - Standby and Hibernation merged and generalized
  • A lot of new software - e.g. many games are ready

See the best independent evaluation of Windows Vista and other Microsoft products on Paul Thurrott's website, winsupersite.com. Everything looks nice and smart, and I hope to see it soon.

Wednesday, November 29, 2006

Massachusetts v. EPA

Original text: Today, the U.S. Supreme Court held one of the strangest hearings in many years. The environmental NGOs have decided that no act is too ridiculous for them, so they have essentially sued the EPA, the U.S. Environmental Protection Agency, for causing global warming.



Here is the transcript - thanks to YS!

The environmental organizations have not been regulated at all, so they have literally spread like mosquitoes, which is why I can't enumerate all of them. But you can guess who the main eco-activist group on this list is. Yes: the main dissatisfied environmental organization is called the
  • Commonwealth of Massachusetts

which is why the case is called Massachusetts v. EPA. That's a rather painful name for all people in Massachusetts with some traces of common sense left. What is the sin of the EPA, and how did this environmental agency suddenly become the main target of the eco-attacks? Well, according to these environmental groups, which include the elected officials of 12 states of the union, the EPA was obliged by the 1990 Clean Air Act to protect the atmosphere against carbon dioxide and failed to do so. You don't exactly have to be a physics PhD to see that this is a complete absurdity.




The Clean Air Act is a law that protects the air against pollutants such as pesticides and smog. Of course, the people who wrote this law were not silly, so they drafted the bill in such a way that it can't be misinterpreted by the first vicious organization that intends to misuse it. If you look at the explanation of what the pollutants are, you will see that they must cause at least one of the following things:

  1. injure health
  2. cause environmental damage
  3. cause property damage

Does CO2 injure health?

Well, if the concentration exceeds many thousands of ppm, it could. In the atmosphere, CO2 represents about 380 ppm (parts per million by volume); this number increases roughly by 2 ppm every year and is expected to reach 560 ppm around 2100, which will mean that the pre-industrial CO2 concentration will have been doubled; see climate sensitivity.
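Just to spell out the arithmetic behind these numbers (a rough linear extrapolation from the usual pre-industrial figure of about 280 ppm; the real growth rate is not exactly constant):
  • doubling the pre-industrial level: 2 x 280 ppm = 560 ppm
  • time needed at 2 ppm per year: (560 - 380) / 2 = 90 years, i.e. roughly the year 2100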

Does 380 ppm of carbon dioxide injure health? Given the fact that we live many years longer than the people 100 years ago who were breathing 300 ppm air, you could guess that 380 ppm of carbon dioxide probably doesn't injure health. Moreover, you could also notice that the typical office concentration of CO2 is between 600 and 800 ppm. Many people keep on spending hours in offices whose CO2 concentration mimics the atmosphere of 2200, assuming that we will continue to use fossil fuels for 200 more years.

If you read the actual Clean Air Act, you will see that by health problems, they really meant well-defined conditions such as cancer in the context of pesticides etc. CO2 can't come anywhere close to what they meant - and wrote. The only reasonable summary is that the existing emissions of this natural gas don't injure health.

Does CO2 cause environmental damage?

The notion of environmental damage is not defined rigorously but it is not completely ill-defined either. By environment, all these lawmakers clearly meant the fauna and the flora, together with the soil, rivers, oceans, and rocks where the organisms live. Are these things harmed by carbon dioxide? Animals generally live in these conditions as happily as the humans, and the previous paragraphs about the human health apply to all the animals I know of, too.

The plants, on the other hand, really love higher concentrations of CO2. They're thriving because carbon dioxide is what they eat. Because the oceans don't disappear and the stones don't break up, the conclusion is that I can't imagine any stretch of imagination that would allow someone to argue, under the pledge of honesty, that the carbon dioxide emissions are causing environmental damage.

Property damage

The question is equally clear if we look at property damage. People have invented many fantasies and scary scenarios about how carbon dioxide could start to damage the civilization, and as far as I know, none of them has been able to survive a couple of observations or more detailed checks. For example, some people have argued that rising sea levels could destroy cities. Despite the huge CO2 emissions, the sea level currently rises by 1-2 millimeters per year or so, and it is pretty clear that this trend is not going to destroy any cities or houses in any foreseeable future, even if it accelerates, for some unknown reason, by a hypothetical factor of five or more.

Other people have argued that the CO2 emissions measurably increase the number or intensity of hurricanes. After the quietest Atlantic hurricane season in 10 years, these speculators prefer not to speak about their hypothesis and wait for "better" years that would provide them with some evidence. There has never been any scientific evidence for this hypothetical link; the available recent data strongly disfavor it and could rule it out soon. I could go on and on and on but the conclusion is clear: the carbon dioxide emissions don't cause any property damage.

In the legal context, it is very clear that the causal relationships must be very sharp and a pseudoscientific speculation about carbon dioxide emissions in New Jersey that may contribute to a collapsing cottage in Indonesia 30 years later just can't be good enough evidence for the lawyers.

Standing doctrine

The previous argument sounds intuitive but we can quote precise sentences from the U.S. Constitution. Its Article III is dedicated to the judiciary. To make things short, you may find the so-called standing doctrine in this article which is a generalization of the presumption of innocence.

It says that a plaintiff in front of the federal courts must show that her injury is "concrete and particularized" as well as "actual or imminent". The founding fathers wrote these wise sentences exactly in order to make it impossible to suppress things like the freedom of speech, or the life and work of companies, with the help of hypothetical accusations.

The carbon dioxide accusations are precisely the type of pseudoarguments that Article III declares invalid because they are neither concrete nor particularized, nor actual or imminent. Instead, they are conjectural, hypothetical, or speculative - and they can also be described as a "generalized grievance" - which are exactly the adjectives that are not allowed. Needless to say, the proposed "remedy" - namely regulation - is not "likely" to fix the alleged problem (global warming) because the Chinese and Indian companies will keep on emitting, which hopefully makes it clear that the EPA has to win.

America, whose citizens like to sue each other, would have already collapsed if the standing doctrine didn't exist. Once again, weird long-term alarmism and unprovable accusations such as those associated with global warming are textbook examples of acts that the founders of the U.S. had to make legally irrelevant if they intended America to survive longer than a couple of years. The global warming alarmists are no special and original problem-makers but exactly the standard problem-makers who were carefully taken into account 230 years ago.

Carbon dioxide clearly can't become a subject of the 1990 Clean Air Act. The lawmakers wanted to be very certain, so they essentially enumerated the compounds that should be regulated. The priority air pollutants are ozone, sulfur dioxide, respirable particulate matter, nitrogen dioxide, carbon monoxide, and lead. They gave a similar explanation of the hazardous air pollutants such as methyl isocyanate. These things clearly have nothing to do with compounds like CO2.

As a kid, I lived in Pilsen which was one of the most polluted towns in the world because it was the most industrial city in Czechoslovakia, one of the most technologically advanced socialist countries. Believe me that we had to learn what pollutants are and what pollutants aren't. Even the true pollutants weren't necessarily deadly but an addition of a few ppm of CO2 into the atmosphere is certainly not a thing to worry about.

Old gas, new ideology, and precedents

One of the characteristic features of the environmental activists is their megalomaniacal messianism. They believe that they have just learned about the most important holy spirit - or gas, if you wish - that decides about life on the Earth, and that these new religious sentiments - or "scientific findings", as they call them - supersede and exceed all previous knowledge of mankind.

Is carbon dioxide a new player in town?

Such a feeling couldn't be further from reality. The truth is that carbon dioxide was one of the first gases that was distinguished from air. In the 17th century, Jan Baptist van Helmont burned charcoal and saw that the ash was much lighter than the original charcoal. He figured out that what happened was a transmutation of his charcoal into an invisible substance that he called "spiritus sylvestre" (wild spirit) or "gas".

Some of the chemical properties of this spirit were understood very quickly and the following 3+ centuries expanded our knowledge about CO2 tremendously. CO2 is an extremely important gas in the cycle of life and biologists, together with some of us, have learned very many things about it. Most of the processes in which CO2 participates are far more important than the effect discussed below.

"But the authors of the 1990 Clean Air Act didn't know about the greenhouse effect," many of the activists will say. Well, that won't be the brightest ones among the activists. The greenhouse effect was discovered by Joseph Fourier in 1824 and quantitatively described by Svante Arrhenius in 1896. Even James Hansen's testimony - with all the predictions that were falsified in the next decade - about the importance of the anthropogenic greenhouse effect came 2 years before the Clean Air Act was written down. No really new important insight for this lawsuit occured since 1990. The only insight about CO2 that has really changed since 1990 is that some people became uncontrollably irrational.

The authors of the Clean Air Act knew not only carbon dioxide but also all of its effects and hypothetical effects that we know today, and they intentionally excluded it from the list of pollutants, for a very good reason. Identifying CO2 as another pollutant is just another way to defend the "phasing out" of the human civilization and of animal life in general.

Antonin Scalia may sometimes confuse the troposphere and the stratosphere, and frankly speaking, I sometimes confuse them as well, but I can't imagine that he doesn't realize that his agreement with the environmental activists would mean the start of a decay of the whole American legal system and of the U.S. society in general.

If CO2 is legally identified as a pollutant because it is a greenhouse gas, what about water vapor? CO2 is the second most important greenhouse gas but water is number one. With the precedent of a monstrous anti-CO2 decision, people could start to sue others for cooking, sweating, breathing, and emitting CO2 or H2O or any other of the new "pollutants" that could, in principle, contribute to an unexpected change of the weather on the opposite side of the planet 50 years later.

In some countries, such as New Zealand, as much as one third of the greenhouse gases, including carbon dioxide and methane, is emitted by cows and sheep (500 liters of CH4 and 6,000 liters of CO2 per day per cow). Do we really want to start exterminating them? Or do we want to argue that a gas is or is not a poison depending on who emits it? Is this what we mean by justice?

I can't imagine that a responsible lawyer would allow EPA to lose and the profoundly unreasonable environmental activists to win. And according to everything I know, a scientist who claims in front of the Supreme Court that CO2 is a pollutant according to the Clean Air Act definitions should be tried for false testimony.

And that's the memo.

Additional popular climate articles on The Reference Frame

Bush and Klaus



Figure 1: Bush and Klaus in Riga before taking a family photo today. You can see a similar picture with the left guy replaced by your humble correspondent. ;-)

The U.S. president has assured his Czech counterpart that he will work on the visa issue that has been annoying for many people including your humble correspondent, despite some expected friction in the U.S. Congress.

More than the allowed 3% of the visa applications from the Czech Republic are rejected - more precisely, the rejection rate was above 9 percent in 2005 - which is the technicality due to which this member of the EU doesn't satisfy the U.S. criteria for visa-free tourism. Americans can visit Czechia without any visas.

You can see that with the number of visas I have needed in the past, the probability that at least one of them would be rejected was about 50 percent. Add collapsing assets due to market turmoil, harassment by left-wing radicals and by aggressive organized crackpots, and many other things, to be sure that no one should have been jealous of my situation. ;-)

I think it is fair to say that it is really the Democrat Party, with its labor market protectionism, that is responsible for the asymmetric harassment of countries like Czechia. If security were the only issue, Czechia, as a solid ally, would probably face no problems.

My additional personal suspicion is that some people in the U.S. also want to avoid a possible inflow of Roma Czechs, which was also the primary reason why visas for Canada were re-introduced for all Czech citizens in the mid 1990s.

Canada decided to screen itself from such an inflow so that the country could go on hypocritically pretending that its people are so nice ;-) and that it is surely someone else who is responsible for any problems that the ethnic minority might be facing.

See other articles about Klaus and Bush on this blog.

He's Still the One

The WaPo, Firedoglake, and Stephen Colbert have extensively covered John Hall, the first professional rock musician -- "rock" being a loose description, IMHO -- elected to Congress, so go click and read.

He was/is the lead singer for Orleans, which had two megahits in the Seventies, "Still the One" and "Dance With Me". My personal connection is that this music was released in 1976, my senior year in high school. It got plenty of play at our prom, and contained some good makeout tracks from what I can recall; you can listen to a blast from the past here.

That's the album cover of "Waking and Dreaming" on the right (Hall, with more hair, stands in the middle); it's obvious that the radical homosexual agenda was even then seeping into American culture.


Hall made two appearances on the Colbert Report; the first was in the recurring "Better Know a District" segment in which Colbert sends up an always-hilarious parody of a serious interview. In his bit with the future Congressman, Colbert produced a set of 'smear flash cards'. Hall drew the "My opponent smokes marijuana" one. After he was elected, Hall returned to sing a National Anthem duet with the host (the video snip is linked above).

Congratulations to John Hall, and thank goodness musicians and their fans finally have representation.

Tuesday, November 28, 2006

Choptuik exponent is Regge saturation exponent: maybe

Take two random articles from this weblog. Two such random articles are expected to have nothing to do with each other. For example, the first one is concerned with a messy classical calculation in GR of the marginal formation of a black hole - essentially some dirty astrophysics - while the second one deals with a special limit of QCD relevant for some messy nuclear physics. Neither of them seems close to the fundamental equations, and they moreover probe very different parts of physics at very different scales. And one of them is classical while the other one is quantum. So they can't be related. Or can they? ;-)

Using the BFKL pomeron exchange in a gauge theory, one can try to calculate some scattering amplitudes in the Regge limit (high energies, small angles). A linear approximation of this calculation breaks down for a rapidity "y" that is related to the ratio of the sizes of certain two three-dimensional momenta by a scaling law
  • exp(y) = (k1 / k2)^{gammaBFKL}
where one can calculate that the exponent is numerically
  • gammaBFKL = 0.409552...
This breakdown essentially calculates the ratio of the sizes of the two nucleons as seen by the pomeron exchange.




Now take a completely different physical system: scalars coupled to gravity. Consider a line in the configuration space of initial conditions of the scalars, parameterized by a single parameter "p", such that for "p=0" the system evolves into an empty Minkowski space in the future, while for large values of "p" a black hole is formed. So there must be a critical value "p0" above which the black hole is formed. How heavy will the black hole be: what is the mass "M"? It depends on how close you get to "p0". Choptuik's relation is
  • M = M0 x (p-p0)^{gammaBH}
That's nice, another critical exponent. Note that both of them were denoted by "gamma". Tonight, a new preprint argues that this is no coincidence: using the AdS5 / CFT4 correspondence, its authors claim that both of these exponents are actually equal. If true, that's a rather fascinating relation between quantum phenomena in a gauge theory living in flat space and a complicated non-linear but classical calculation in general relativity describing the birth of a black hole. I actually believe that the relation could be correct even though the Choptuik exponent for five-dimensional gravity is only known numerically, as
  • gamma = 0.408 +- 2 percent
It would be a good idea to try to calculate this one and perhaps also the other one more accurately or even analytically.
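In case you wonder how such a critical exponent is extracted from numerical data, here is a minimal sketch (the "masses" below are synthetic, generated with an exponent of 0.41 plus 1% noise, standing in for the output of a real collapse code): evolve initial data for several values of "p" slightly above "p0", record the black hole masses, and fit the slope of log(M) against log(p - p0).

    # Minimal sketch: fit a Choptuik-type critical exponent gamma from (p, M) data
    # via the scaling law M = M0 * (p - p0)^gamma, i.e. a straight line in log-log.
    # The masses are synthetic (gamma = 0.41 plus 1% noise), not real simulation output.
    import numpy as np

    p0 = 1.0                                   # critical value of the parameter
    p = p0 + np.logspace(-6, -2, 9)            # slightly supercritical values of p
    rng = np.random.default_rng(0)
    M = 3.7 * (p - p0) ** 0.41 * np.exp(rng.normal(0.0, 0.01, p.size))

    gamma, log_M0 = np.polyfit(np.log(p - p0), np.log(M), 1)
    print(f"fitted exponent gamma = {gamma:.3f}")   # should come out close to 0.41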

Climate debate: Lackner vs Michaels

Today at 1:30 pm, a debate between representatives of the two sides of the climate debate will start in Colorado. The climate moderates and skeptics will be represented by Pat Michaels, who is a senior fellow at the Cato Institute and a past president of the American Association of State Climatologists. He also wrote the geographers' paper of the year in 2003.

The climate radicals and alarmists - I really don't know a more flattering word to describe these people - will be represented by Prof. Klaus Lackner, who started to work on environmentally friendly fuels after his years in high-energy physics at Caltech, SLAC, and Los Alamos.

Monday, November 27, 2006

Florida, Yukon: record cold temperatures

Four days ago, daily cold records were set in most of South Florida.
Update (January 3rd 2008): another record or near-record cold temperature, freezing citrus crops etc. Temporarily relevant link to Google News: Florida record cold
Yesterday, i.e. on Monday, the Yukon territory in Canada saw its coldest November day on record: at -41 degrees Celsius, which also happens to be around -41 degrees Fahrenheit, they could test some kinds of superconductors. Congratulations! ;-)
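In case the near-coincidence of the two temperature scales looks like a typo, here is the arithmetic:
  • F = (9/5) C + 32, so F = C exactly at -40 degrees; -41 °C is -41.8 °F, i.e. "around -41", too.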



In Calgary, Canada, they're just approaching the record chill of their 110-year history. Nanaimo, British Columbia, had a record daily snowfall for November and 56 cm over the weekend, and they are expecting a deep freeze.

In Vancouver, Canada, where they broke the precipitation records two weeks ago, 30 centimeters of snow caused outages, a death, and flight cancellations. As an update, British Columbia, which broke the snowfall record, has seen four people killed by the cold so far.

Victoria, Canada has seen its record two-day snowfall on Monday and now it's time for a big chill.

Juneau, Alaska has broken the record low from 1985.

Oregon and Washington, where they improved the record for monthly rain in November, are preparing for a winter storm and, together with Idaho, a deep freeze. Seattle has broken the cold record for 11/29 by four degrees.



Sliding down icy SW 164th Street in Burien, WA, on Tuesday, before the cold really came to the town...

Monterey, California has near-record and near-freezing conditions on Wednesday, and a real "freeze warning" is in effect for the interior valleys of the San Francisco and Monterey Bay Areas for Thursday morning. Sacramento is forecast to match or beat the 1880 record low of 30 degrees F overnight.




The winter chill will hit Arizona soon, too, and breach the previous record of 19 degrees by 3 degrees. Utah has seen its record low afternoon temperature and received its first snow. Salt Lake City broke its previous record cold temperature by 5 degrees, and Nevada is not far from a record either. Most of the seven fresh Northwestern storm casualties died in Colorado. In Texas, they are learning that temperatures can plummet by 30-40 degrees F in less than one day.

What was happening in the opposite hemisphere yesterday, for example in Antarctica? Employees from New Zealand are responsible for maintaining Scott's sleeping bag in the historic Antarctic hut. The four guys had to shovel 85 tons of snow, which beat the previous record for a snowdrift by 33%.



Kashmir and India will receive heavy snowfall on Monday 12/4. Heavy winter rains will kill 22 people in Pakistan.

When you read these comments, you must be almost certain that the researchers from the Russian Academy of Sciences must be right when they predict a new ice age for the next 60 years. Before you draw this conclusion, notice that here in New England, people only go skiing on artificial snow. ;-) At the end of November, Massachusetts has normal temperatures but Norway and Finland have a kind of summer in November.

The cold weather from the West of North America is going to move eastwards. For example, Oklahoma will record single-digit record low temperatures on Saturday 12/2 and Colorado Springs will see another record low on 12/3, while Central Missouri will receive the largest snowfall in 60 years. On 12/4, Texas will break the record lows by eight degrees! On 12/8, Alabama will break the 1927 record and Tennessee will break the 1977 record. South Carolina will kill the 1968 record. On 12/9, Georgia will follow with the 1917 record.

Some places often get warm and other places often get cold. It has been like that for billions of years even though this is not the conclusion you would make after reading most of the newspapers these days. ;-)

Note that the DPA article under the previous "new ice age" link says that the U.S. National Academy of Sciences reported that it couldn't be said with any certainty what caused the recent warming: the long-term weather patterns remain largely unknown, they correctly write.

The only clear thing is that most of North America, including Manhattan, was covered by a mile-thick glacier in the period that ended 10,000 years ago, they argue. We will see whether the advocates of action against global warming can return the Earth to these natural and nice conditions before the evil capitalist species called homo sapiens (together with corrupt Mother Nature) started to ruin the planet. ;-)

The previous posting about record cold temperatures was focusing on Australia.

Non-metric gravity and renormalizability

Kirill Krasnov has asked me to debunk his paper
Now, I would really be happier if my reading could end up as an announcement of an important breakthrough ;-) but after having read his paper, it seems that he will get exactly what he asked for: a debunking.

Let me start with some general comments about the whole framework.

General strategy

First of all, there are many papers that attempt to rewrite gravity in different variables, combining and recombining them, in order to get a better result. As we will discuss in detail, I think that this whole philosophy is fundamentally flawed. A change of variables is one thing but physics is an entirely different thing. In physics, we want to know what the Hilbert space is - the spectrum of particles - and what the amplitudes (and consequently the cross sections) of their interactions are.

These amplitudes are completely physical and measurable in principle (up to an overall constant phase) and they cannot depend on your nationality, sex, or your choice of variables. If these people actually ever tried to compute at least one physical observable instead of this mumbo-jumbo with their formalism, they would have to see the same thing that everyone else does: the basic qualitative physical insights about quantum gravity are true and independent of any changes of your variables.

The coupling constant of gravity, Newton's constant, determines the length scale or the energy scale where quantum effects (such as loops in the Feynman diagrams) become as important as classical effects. The relative importance of the tree diagrams and the multi-loop diagrams can be estimated by a simple dimensional analysis, and the result doesn't depend on any field redefinitions. For example, the characteristic dimensionless coupling constant goes like "G.E^2" where "E" is the typical energy in your experiments and "G" is Newton's constant. Because it becomes of order one near the Planck scale, a better theory is needed there, as seen from the simple behavior of the "quantum foam" - the violent and topology-changing fluctuations of the metric near the Planck scale.
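Just to make the dimensional analysis explicit - a back-of-the-envelope sketch using nothing but the textbook definition of the Planck scale, not anything from Kirill's paper - the dimensionless combination G E^2 / (hbar c^5) reaches one at the Planck energy, around 10^19 GeV:

    # Back-of-the-envelope check: the dimensionless gravitational coupling
    # G * E^2 / (hbar * c^5) reaches one at the Planck energy E_P = sqrt(hbar*c^5/G).
    hbar = 1.054571817e-34     # J*s
    c = 2.99792458e8           # m/s
    G = 6.674e-11              # m^3 / (kg * s^2)

    E_planck = (hbar * c**5 / G) ** 0.5           # in joules
    E_planck_GeV = E_planck / 1.602176634e-10     # 1 GeV = 1.602...e-10 J
    coupling = G * E_planck**2 / (hbar * c**5)    # equals 1 by construction

    print(f"Planck energy ~ {E_planck_GeV:.2e} GeV, coupling there = {coupling:.1f}")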




Kirill Krasnov seems to deny most of these facts, including the existence of the quantum foam. If there is an argument in that paper supporting such a denial, then I have missed it but it seems absolutely manifest to me that such a conclusion has to be wrong.

The self-dual split

OK, let's try to play Kirill's game anyway. He starts with a tetrad form of general relativity but the precise Lagrangian (unlike its variables) doesn't seem to play any role in the paper so we will ignore it. Then he recalls the first Plebanski self-dual Lagrangian for gravity. Its degrees of freedom are
  • B^a, an SU(2) Lie-algebra-valued two-form
  • Psi^{ab}, a Lagrange-multiplier field that contains the Weyl tensor components on-shell
  • A^a, an SU(2) gauge field.
Plebanski effectively worked in four-dimensional Euclidean spacetime, whose "Lorentz" group, SO(4), is locally isomorphic to "SU(2) x SU(2)": a special feature of a four-dimensional spacetime. The self-dual part and the anti-self-dual part are treated separately. Note that in the Minkowski space, these two sets become complex conjugates of each other, so that they can't be separated as long as you consider Hermitian observables.

The metricity condition for "B^a" is that "B^a B^b" is proportional to the identity matrix "delta^{ab}" in the color space which implies that "B^a" can be related to the self-dual part of a two-form from the one-form formalism. The field "Psi^{ab}" acts as a Lagrange multiplier that imposes the metricity condition for "B^a". When the equations of motion are satisfied, "Psi^{ab}" itself contains components from the Riemann tensor.

The first intrinsically technical one among Kirill's controversial statements is that the gravity action is effectively independent of Newton's constant because this constant can be reabsorbed into different fields. On physical grounds, this can be clearly seen to be wrong because the value of Newton's constant is quite physical. For example, its being large enough implies that we can't jump to the Moon.

Kirill says that Newton's constant only becomes physical when gravity is coupled to matter. That would be true in a classical theory. In a quantum theory, it's just wrong. Newton's constant determines the coefficient of the gravitational Einstein-Hilbert action, not the matter action. If any theory is called a theory of gravity, it should reduce to the classical Einstein's or Newton's laws at long distances. At some shorter distance, this classical approximation must break down. The scale where it breaks down is called the Planck scale, and whoever believes that it never breaks down is a quantum mechanical denialist. Alternatively, you can define the Planck scale as the absolute value of the mass of the lightest massive black hole microstate that appears as a pole in the graviton-graviton scattering amplitude on the second sheet.

All these comments are completely physical - in principle, they're experimentally verifiable - and they don't depend on any field redefinitions as long as the field redefinitions are done properly.

Kirill argues that physics of gravity, when written in the Plebanski variables, is independent of the coupling constant. That seems obviously incorrect.

Witten's twistor approach to the N=4 gauge theory is also an expansion around a self-dual theory that is very analogous to the present case. We could make the same argument about the disappearance of the gauge coupling. Such a conclusion would, of course, be completely misguided. One must carefully remember the different powers of the gauge coupling that must be added to various diagrams. The power of the coupling constant may come from some unusual, topological terms in the action. But whatever their source is, it is clear that this dependence on the coupling constant can't disappear because it is a part of physics.

Whenever you split the variables into the self-dual and anti-self-dual ones, you must be careful because you are generally using a different scaling for these two parts. But their ratio still carries the information about the privileged length scale - the Planck scale - in the gravity case, or about the gauge coupling in the Yang-Mills case. In the Minkowski space, the relative scaling of the self-dual and anti-self-dual parts must really be one as long as they are complex conjugates of each other, which seems necessary for you to be able to construct real actions, whose reality seems to be necessary for unitarity.

Counterterms

Fine. So why does Kirill believe that this field redefinition magic makes the theory more convergent than the standard treatment? He argues that all the counterterms that are generated by UV divergences are of the form
  • function(Psi) B^a B^b: equation (4) in Kirill's paper
or, alternatively, you can redefine "Psi" to "Phi" so that all these counterterms are hidden in a different part of the Lagrangian. Later in the paper, he also speculates that one could also possibly generate a whole new pile of counterterms that have a different form, something that I am not able to confirm or deny.

Back to the "Psi" - "Phi" field redefinition. Whenever more than 10% of a paper is spent by one particular field redefinition, in this case the field redefinition in equation (5), you feel some compassion for the brave person who struggles with these vacuous operations so courageously even though you know in advance what the result will be.

Why does Kirill think that the counterterms he generates are less severe than the usual terms generated in the normal variables? Frankly, I have no idea. As we mentioned above, on-shell, the components of "Psi^{ab}" are nothing else than components of the Riemann tensor. So all the counterterms that are polynomials in "Psi^{ab}" are nothing else than the usual polynomials in the Riemann tensor.

Kirill could have been confused by his unusual choice of dimensions, in which "Psi" is dimensionless, which could have led him to the idea that his counterterms are not higher-dimension operators. But they are. These choices don't change anything about the fact that "Psi" is really a curvature, something that we can actually measure via tidal forces and other effects. And even if you hypothetically did obtain an arbitrary counterterm-functional of the dimensionless "Psi" variables, you would still be in trouble.

In fact, if one does the calculation properly, they should be exactly the same counterterms that one needs in the normal perturbative approach to general relativity, although they are written in different variables. He rewrites the action in terms of "Phi" instead of "Psi" in equation (6). Normally you would have no clue what he could ever gain by these rather straightforward operations. However, below equation (6), he summarizes all the counterterms that are at least cubic in "Phi" as "O(Phi^3)".

That almost looks like he wants to neglect them. Still, this nearly invisible "O(Phi^3)" contains all the infinitely many counterterms - the same counterterms - that are the technical reason why pure gravity can't be perturbatively quantized. Kirill says that the divergences in his theory are "clearly" much milder than in the normal treatment of GR, which is why he calls the theory "quasi-renormalizable". As far as I understand, the adjective "quasi-renormalizable" is a politically correct word for "non-renormalizable": it's a theory with infinitely many types of divergences whose predictive power - beyond the statements about the spectrum and symmetries that you put in - is equal to zero because you need to determine infinitely many parameters.

Kirill's counterterms could only depend on the self-dual part of the curvature (Psi), but it doesn't matter because his equation (7) shows that the metricity condition for B - something that is known exactly, to all orders, in normal GR - receives divergent corrections at all orders. So what he really does is split the higher-derivative corrections to the effective action into the self-dual ones and the others, and treat them separately. I don't know whether it is an interesting manipulation, but what I do know is that it can't change anything about the required counterterms in general relativity when it's done properly.

Later, Kirill only looks at the coefficient of the lowest new power of "Phi", namely "Phi^2", and claims that the coefficient of this new coupling is dimensionless (an overlined "g"). What the adjective "dimensionless" really means is somewhat obscured by Kirill's unusual absorption of Newton's constant into the fields. Had he used the conventional normalization of the action with the overall "1 / G" in front of it, his leading new term would have the same dimension as the familiar "R^2" terms. The coefficient contains a "length^2" besides the usual "1 / G" factor from the Einstein-Hilbert term. Every time you add a "Psi", you must add a "length^2" to the coefficient. In perturbative string theory, this "length^2" is of order alpha'. "Psi" or "Phi" is nothing else than some curvature.

Kirill then studies the running of this "overlined g" coupling constant, which is, as we said, a coefficient of some "R^2" term, and argues that it is asymptotically free. I forgot what the sign is but it is not too important anyway. This discussion is not physically crucial because there are three possible independent "R^2" terms in GR and all of them can be seen to be unimportant. One of them is the Euler density (the Gauss-Bonnet term), which is topological and doesn't influence perturbative physics. The other two depend on the Ricci tensor only and can be removed by field redefinitions: add a multiple of the Ricci tensor and of "R.g_{ab}" to the metric tensor "g_{ab}". Equivalently, you can say that the Ricci tensor is zero on-shell by the tree-level equations, so these two terms don't contribute.
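For completeness, the topological combination mentioned above is the standard four-dimensional Euler (Gauss-Bonnet) density
  • E_4 = Riemann^2 - 4 Ricci^2 + R^2, i.e. R_{abcd} R^{abcd} - 4 R_{ab} R^{ab} + R^2,
whose spacetime integral is a topological invariant in four dimensions; the two remaining independent "R^2" structures can then be chosen as R_{ab} R^{ab} and R^2, which indeed vanish on-shell in pure gravity.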

Kirill speculates that all beta-functions in the theory are negative which would make GR as asymptotically free as QCD. This is essentially the old conjecture about the gravitational UV fixed point but I don't think that Kirill has any new evidence for this conjecture or a new identification of the Hilbert space that describes the UV physics if the conjecture is true. What are the excitations in this regime? Is the hypothetical UV fixed point a chiral theory including just the right-handed gravitons coming from the self-dual fields? This is just my conjecture that is not mentioned there but that at least offers an intriguing possibility. But I have no idea how you could make the left-handed gravitons disappear because they should appear at low energies and low energies are really equivalent to high energies for a single particle by Lorentz invariance. Removing some of the polarizations requires us to break Lorentz invariance.

Healthy one-loop graphs in general relativity

Kirill's test of the "Psi B B" terms was encouraging for him. Indeed, at the one-loop level, there is no problem with general relativity, regardless of your choice of variables. We have recently discussed these issues in the context of the finiteness of supergravity. The real loop problems of pure 4D gravity only start at the two-loop level. The resulting "Riemann^3" terms - in fact, "Weyl^3" terms - would get translated to "Psi^3.B.B" terms and higher in Kirill's language, and because Kirill apparently doesn't say anything about these terms, I think that his paper really says nothing about the divergences in quantum gravity. It's classical general relativity presented in such a way that it looks more complicated than it is.

There are calculations of a one-loop beta-function. I would like to emphasize that these beta-functions are determined by the logarithmic divergences which is why they're completely known just from the classical limit. They don't depend on any underlying theory. And we know what they are.

At the end, Kirill advocates a breakdown of GR and the replacement of dark matter by a MOND-like theory that could perhaps result from his formalism: the link is unclear to me except that the cosmological constant is incorporated into this guess in a way I don't follow.

After the dark matter was directly observed a few months ago, such alternative theories for dark matter have become really suspicious and I think that if the reality is described by a nice alternative without dark matter, its discoverer will have to complete her theory entirely before she is taken seriously by anyone else - simply because such a framework seems extremely unlikely right now.

Summary

I think that the hope that a change of variables will remove a physical inconsistency is logically inconsistent and impossible. The Feynman graphs written in the normal variables parameterize all the possible terms and most of the physics they imply is inevitable.

Throughout history, ultraviolet divergences have only been cured by modifying the UV physics and adding new degrees of freedom, and I am convinced that this was no coincidence.

As the recent twistor uprising has demonstrated, the decomposition into self-dual variables may be a great tool to efficiently calculate some diagrams and to explain why some of them sum to zero. But it doesn't change anything about the results. The results can be obtained otherwise as long as the pictures are equivalent.

I am convinced that until someone in this "field redefinition" community has a result or a non-trivial condition that can be phrased in the normal variables and words that every high-energy physicist is familiar with, such as the metric tensor and the S-matrix, the serious physicists I know will continue to fully ignore this kind of paper because it just seems guaranteed at this moment that there can't be any interesting physics in it.

And that's the memo.

This Week in Irony (and it's only Monday)

One of the companies hired to build a wall to deter illegal immigration is being investigated by the Department of Homeland Security for hiring undocumented workers.

Senator Dan Patrick -- the general in his self-proclaimed army of douchebags (er, conservatives) -- might just be a bleeding rectum. And the other Republicans in Austin might have to curb their fascism (er, alter their strategy).

The oil companies could be -- surprise! -- squeezing production in order to prop up the price of gasoline.

The Bush twins, Jenna and Not Jenna, went buckwild in South America for their 25th birthday celebration. Apparently they did oversee a little family bidness while they were there: their dad purchased a hundred thousand acre property in Paraguay and Jenna took a meeting with the president of the country and the US ambassador. I hope she didn't have to take her clothes off.

NBC and MSNBC decide to call it a civil war. They are not joined by the rest of the corporate media yet. Kofi Annan says it is almost civil war. The Bush administration calls it a "new phase".

Relativistic condensed matter physics: chiral QFT inside a pencil

Relativistic condensed matter physics sounds like a contradiction, doesn't it? If it's condensed, it must be low-energy, and whatever is low-energy must be non-relativistic, if not low-brow, many people would say. ;-)

Andre Geim gave a colloquium at Harvard University in which he convinced us that the opposite is true. One can study relativistic quantum field theory in condensed matter systems.

Andre Geim is an Ig Nobel prize winner for his discovery of a levitating frog; yes, it is now the featured Ig Nobel discovery on Wikipedia. He has also made it to the fifth annual "Scientific American 50" even though he is only scientific but not American, as Charles Marcus has emphasized with a strong nationalist - but still decisively leftist - accent. ;-) Geim was born in Russia and worked in Holland before moving to Manchester, England, where he works now.

OK. What material do you need in order to study chiral quantum field theories within condensed matter physics systems?

Graphene

The material is called graphene and it was thought not to exist until 2004 or so, when Geim et al. made their breakthrough. It is a two-dimensional crystal of carbon arranged in hexagons. This idealized material couldn't be observed in its two-dimensional incarnation for a long time, but it had been observed in all other compactified editions (except for those with 4-10 large dimensions). We classify them according to the number of large dimensions:
  • 3D: graphite - the material inside your pencil - is a lot of layers of graphene
  • 2D: graphene - one layer - observed in 2004
  • 1D: carbon nanotubes may be viewed as graphene with one dimension compactified on a circle
  • 0D: carbon balls may be viewed as graphene compactified on a sphere
Fine. Geim et al. simply used a better technology based on optics that allowed them to get down to one layer of graphite: graphene itself. Whenever you make a discovery, you should check the literature and you will see that some Russian theorists or Japanese experimenters did it 39 years ago. This was the case for nanotubes, which were not discovered in 1991 but in 1952, we were told. Nevertheless, Geim et al. were lucky because they were indeed the first ones to isolate one layer of graphene.




Stability

The existence of purely two-dimensional crystals could have been surprising for many condensed matter physicists. It turns out that the substrate (silicon) is important for stabilizing the two-dimensional shape of graphene. Without this background, graphene undergoes spontaneous compactification.

Relativistic dispersion relations

When you study electrons moving in this two-dimensional crystal, you will see that they have a different effective mass - the cyclotron mass - and the effective speed of light is different, too. In their case, the speed is 1,000 km/s, which is only about 0.3% of the normal speed of light. Nevertheless, you find that the energy of the electrons grows linearly with the momentum:
  • E = PC = MC^2
where PC doesn't stand for political correctness and M,C in the equation above have unusual values. So electrons effectively follow the laws of 2+1-dimensional relativistic quantum field theory.

An entertaining intermezzo: relativity with a different speed of light is analogous to the "relativity of sound" discovered by Jan Fikáček, a philosopher who was a former chairman of Mensa Czechoslovakia and who was always fun company to talk to. He argued that we only substitute the speed of light into Einstein's equations because we rely on vision. People who prefer hearing and music - for example blind people - would end up with another type of special relativity that is based on the speed of sound. One of the striking predictions of Fikáček's theory that makes it so attractive and so falsifiable - as well as quadruply special if not quintuply special - is that blind people can't fly in supersonic airplanes. ;-)

The electrons also have a new type of isospin with two possible values. That's because the hexagonal grid has two carbon atoms in each fundamental cell and the electron can sit on either of them. It is helpful to write the wavefunctions of the electrons on the two types of carbon atoms separately, as two components that depend on the continuous positions x, y. The most elementary transition is between the two types of sites, which means that the Hamiltonian is an off-diagonal 2x2 matrix proportional to
  • p_x sigma_x + p_y sigma_y
Note that this differs from the non-relativistic Pauli equation, where the Hamiltonian doesn't depend on the spin at all. The equation relevant for the electrons in graphene is more analogous to the Dirac equation for a Weyl spinor, although this spin degree of freedom is "emergent". See their preprint. Once you identify this degree of freedom with the normal spin, you will find out that the precession of this spin agrees with the frequency of the rotation in a generic magnetic field.
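Here is a minimal numerical sketch of that statement (my illustration, not something from the talk; the Fermi velocity is just the rough 1,000 km/s figure quoted above): diagonalizing the 2x2 Hamiltonian v (p_x sigma_x + p_y sigma_y) gives eigenvalues E = +- v |p|, i.e. the linear, "massless" dispersion.

    # Minimal sketch: the effective graphene Hamiltonian H(p) = v*(px*sigma_x + py*sigma_y)
    # has eigenvalues E = +- v*|p|, i.e. a linear ("massless Dirac") dispersion relation.
    import numpy as np

    sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
    sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)

    v = 1.0e6                  # effective "speed of light", roughly 1,000 km/s
    hbar = 1.054571817e-34     # J*s

    def energies(px, py):
        """Eigenvalues of the 2x2 Dirac-like Hamiltonian at momentum (px, py)."""
        H = v * (px * sigma_x + py * sigma_y)
        return np.linalg.eigvalsh(H)          # sorted: [-v|p|, +v|p|]

    for k in (1e8, 2e8, 3e8):                 # wavevectors in 1/m
        p = hbar * k
        E_minus, E_plus = energies(p, 0.0)
        print(f"|p| = {p:.2e}  ->  E_+ = {E_plus:.2e} J,  v*|p| = {v * p:.2e} J")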

Somewhat surprisingly, if you study a two-layer "graphene" instead of the single layer, the relativistic dispersion relation becomes non-relativistic again: the energy will scale like the momentum squared again. In this case, the emergent spin rotates with a frequency higher by a factor of two in comparison with the orbital angular momentum.

Resistivity

Geim talked about various quantization rules for the resistivity. He also told us that it was already Mott - a Nobel prize winner who should be confused neither with your humble correspondent nor with the juice - who argued that the resistivity is always bounded from above, and that this law can be derived from the more fundamental fact that
  • the mean path is never shorter than the de Broglie wavelength.
In the recent article about the AdS/QCD correspondence, we mentioned that this fact can be reinterpreted in the dual gravitational variables as the Bekenstein bound. I guess that Geim doesn't realize that this interpretation exists. He considers the inequality to be falsified in all systems where localization and interference play an important role, and I am somewhat confused by this remark.
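In the usual condensed-matter language, this is the Mott-Ioffe-Regel criterion. A sketch of the standard argument (my gloss, not something said at the colloquium):
  • mean free path l >= de Broglie wavelength = 2 pi / k_F, i.e. k_F l >= 2 pi,
and because the sheet conductivity of a two-dimensional metal is of order (e^2/h) k_F l, the resistivity is bounded from above by something of order h/e^2, roughly 26 kiloohms per square.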

He listed the authors of roughly 20 theories to explain these facts using condensed matter methods. Most of these theories end up with a missing factor of pi. Geim showed the conventional picture of a "missing pie", a very sweet one. ;-)

IQHE

If you discover or explain QHE for the first time, you may get a Nobel prize. If you discover a new kind of IQHE, you get an Ig Nobel prize. ;-)

While the usual fractional quantum Hall effect gives you charges that are fractional multiples of the elementary charge with an odd denominator, Geim et al. have discovered different kinds of quantum Hall effects in graphene. One of them gives you Hall conductivities at half-integer multiples of the usual quantum, which follows from the "relativistic" Dirac equation. Others give you something else, and so forth.

Applications

He believes that their work could turn out to be useful for a new generation of laptop batteries. Today, some laptop batteries are made of carbon nanotubes. In 2010, there could be laptop batteries made out of graphene that could hypothetically lead to a superior technology.

Also, carbon belongs to the same group as silicon (remember the Czech mnemonic: "Co Si Gertrudo Snědla? Olovo (Pb)", which means "What did you eat, Gertruda? Lead") and you could in principle imagine better semiconducting circuits than the silicon circuits we use today. Carbon could in principle be purer than silicon. Purity is measured by the contrast between the minimum and maximum resistivity, and this contrast can get very high.

Stay tuned.

Andre Geim says that people should try to play with different materials than graphene because there is too much competition there and you could soon see the condensed matter physicists' fists. ;-) Not all possibilities have been tried because the man-power has not been sufficient so far. Nancy Hopkins didn't attend the colloquium, so no one in the room could throw up because the offending term "man-power" was not replaced by the correct term "person-power".

Sunday, November 26, 2006

Jan Laštovička: the sky is falling

The added focus of this blog on topics related to Czechia is natural. The newest article on RealClimate.ORG is about a paper by a Prague climate scientist, RNDr. Jan Laštovička, and his team. The article in Science is called
  • The sky IS falling.

More precisely, this is how Gavin Schmidt has translated the original boring title to satisfy the intellectual, emotional, and religious demands of his readers. The original boring title was:

But Gavin's title sounds more progressive, doesn't it? ;-) And it is more or less equivalent, isn't it? Let me mention that "Jan" means "John" while "Laštovička" means a "small swallow".



It should really be spelled "Vlaštovička" but the typo is common enough so that it has been incorporated into the surnames. ;-)

More seriously, the paper argues that the cooling of the upper atmosphere (the stratosphere plus the ionosphere above it - i.e. everything that is higher than 20 km or so) is "exactly" as predicted by the greenhouse effect and that it makes various layers of ions (called the "sky" by Gavin) "fall". Many comments about the timescales and the numbers seem to be deliberately vague and non-quantitative. We will look at some aspects of the claims and ignore others: for example, I won't discuss the strange claims about the "increasing geomagnetic activity".




Disclaimer: the sentence "the sky is falling" doesn't mean that there is anything wrong going on and the reader should try not to be confused by this deliberately ambiguous terminology of Dr. Schmidt.

Gavin Schmidt explains that the Czech work is very new and very important because it contains the same statements and material that were already written down by Jarvis et al. in 1998 and by the Nude Socialist in 1999. As far as I can see, we can indeed call the work a "timely review".

Update: I received a reply from RNDr. Laštovička.

He argues that Dr. Schmidt, whom [I] cite, seems to be kind of confused. [They] don't study the stratosphere at all; they're focusing on the atmosphere above 50 km. The team of 5 people is international - from 4 countries. The claims that [their] results reproduce Jarvis et al. have nothing to do with reality either. The main application of [their] work is estimates of the lifetime of satellites. ;-)

Note that one expects the upper parts of the atmosphere to cool down because the tropopause - the boundary between the troposphere and the stratosphere above it - should be moved higher if the troposphere heats up. If it is higher, then it should be cooler.

Note that the climate guys won't agree about the reason for the cooling: there's a complete consensus, except that no one knows what the consensus even says - this kind of consensus used to be called the Emperor's new clothes.

Also, just like the carbon dioxide increases the infrared absorption in the troposphere, its increased concentration in the higher layer, the stratosphere, is - on the contrary - expected to increase the ability of this layer to emit energy and to cool down. Maybe.

I hope that it is not difficult for the reader to understand that the global warming theory actually predicts cooling for most of the volume of the atmosphere. There's really no serious catch here. ;-)

That's a very cute and important contribution to the global warming consensus - or a "timely perspective article", as Gavin Schmidt calls it in the first sentence, making his motivation and the source of his excitement rather easy to see through - except that the stratospheric temperature has been constant at least since 1994, while the previous peaks can be attributed to known natural causes: see this graph.

Yes, in the last 12 years, the years of the highest CO2 concentrations and their highest growth, the predicted and celebrated cooling effect doesn't seem to exist. That's not a problem: if an effect doesn't exist for 12 years, it is just a natural fluke. On the other hand, if the effect did exist, it would have to be included as clear evidence for the Gore theory, and those who would say that it is a natural fluke should be burned at the stake. A progressive scientist must be a politically correct perfectionist and only include effects that lead to results that agree with the consensus and that Al Gore will appreciate.

More seriously, the situation is much like the surface global warming that stopped in 1998. We've been forced to say "Yes, yes, I don't question that the planet has become warmer in the last 100 years" but this likely fact kind of doesn't mean that it would continue to get warmer after we say this sentence: a subtlety that too many people don't seem to realize. Many of us have said this sentence in 1998 but we couldn't repeat it for the period 1998-2006 because there has been no warming since 1998. The Southern hemisphere has seen no warming for the last 25 years.

Moreover, there exists another part of the atmosphere, namely the upper troposphere, where the greenhouse theory predicts a significantly stronger warming than for the surface temperatures - because this is where the greenhouse effect really takes place - while the observations seem to lead to the opposite conclusion, despite some recent corrections of the satellite data. One half of the predictions seems to have been wrong for decades, the other half seems to have started to fail in the last 10 years, and no details seem to fit really well.

Climate and stocks

Imagine that similar questions arise about a decision that is usually made rationally: you want to buy stocks for $50,000. You're told by a woman who agrees with hundreds of other women and with an unsuccessful U.S. presidential candidate that you should buy the stock of Surface Ltd. because its price has been going up for the last century and will continue to do so; that the stock of Upper Troposphere Inc., bundled with the first one, would go up as well; and that the stock of a competing company, Ionosphere Inc., would always go down. The person who sells you the stock tells you that she has some insider information that promotes this simple description of the price dynamics into a law of Nature.

When you actually look at what's going on, you see that the first two stocks went up in most decades of the last 100 years, but one of them was flat and the other went down in the last 10 years, while the competing stock has been flat for 12 years, too. Roughly one half of the predictions the woman was trying to sell you are wrong, and those that seem to be correct are apparently correct for very different reasons than the ones she has been giving.

But at least she can repeat her opinion 8 years later without any modification of her theory and without any improvement of its agreement with reality: she just says some things more vaguely so that one can't see any sharp contradictions. The debate was already over 8 years ago anyway, so there is no longer a reason to try to improve the theories even though 1/2 of their predictions are wrong. Al Gore himself is already a perfect Prophet of the truth and whoever disagrees is an oil industry shill. Well, I think that most rational people wouldn't follow the advice of that woman - or of those women.

Fine. Our example was about a lot of money, namely $50,000, so it was natural for people to try to act rationally. In the case of the climate regulation, we talk about trillions of dollars of investments, something that can swallow most of the GDP growth or turn it into a recession. A trillion is

  • $1,000,000,000,000

As you can see, it is a large number and therefore the rational reasoning should already be suppressed, right? ;-) Mathematics and thinking can't work for such large numbers, can they? It's just like the landscape: if the number 10^{350} appears anywhere in your theory, the enemies of physics argue, it proves that mathematics is inconsistent and not even wrong, and everyone must return to the pre-scientific era and work on the dogmas of the confused 17th century philosophers instead of high-energy physics.

In the context of the weather and the climate, because the amount of money that is at stake is so large, all of us should act like wild animals, like the defenders of the global warming theory that has already been settled even though 1/2 of its predictions are rather clearly falsified by observations.

I was brought up to believe, and I still believe, that natural science is an experimental enterprise. With the data above, is the mainstream research of the climate a science or a religion? Haven't some people lost their mind? We report, you decide...

Holography and climate

Finally, let me describe what theoretical physics would look like if it followed the template of climate science.

In late 1997 or early 1998, Maldacena wouldn't be as accurate as the real Maldacena so he would discover the de Sitter / asymptotically free (DS/AF) correspondence. It's very important that the beta function of the field theory is negative, he would write, and it corresponds to the positive cosmological constant of the de Sitter space.

That's a nice conjecture. In the real case, the 5,000 followups confirmed the real Maldacena's conjecture. In the climate case, roughly 40% of the papers would find discrepancies. Most of the authors who have found discrepancies would be identified as oil industry puppets because they don't support the asymptotic freedom that is so important to match the de Sitter space. The remaining 60% of physicists would announce that 100% of them agree with the consensus. The debate is over. In other words, the physicists wouldn't give up: there has been a consensus about this holography since 1998.

In 2006, someone would write a new paper that would summarize, once again, Maldacena's arguments, but all the arguments that had been contradicted by the oil industry puppets would be restated in a more vague form so that everything still looks marginally fine. A whole industry of papers unconstrained by the results of the previous papers would explode. The only thing these papers would have to share are the assumptions that holography must always have a de Sitter space on one side and an asymptotically free, discrete theory on the other side: the non-zero beta function and the discreteness are so important. No one would ever propose the shameful conjecture, backed by the oil industry, that the beta function should cancel to give the AdS space instead of the dS space. Such a cancellation wouldn't be progressive and it would violate the background independence. But otherwise, anything goes.

Well, I should stop these speculations at this moment, in order to avoid a copyright infringement lawsuit from an author who is dreaming about "physics" done more or less exactly in this way. ;-)

Saturday, November 25, 2006

Tired of turkey already


... and of media reports of shopping. Do they simply regurgitate last year's story so that they don't have to go out to the mall and honestly report the percentage of the parking lots' capacity? As if that's news anyway?!

... How about football? Anybody tired of football yet? Shit, I might have to go shopping just to get away from it.

... who's grown weary of certain relatives they only see once a year?

... and why doesn't anyone serve a freaking vegetable at Thanksgiving dinner? Is green bean casserole as close as it gets? Cornbread dressing, oyster dressing, mashed potatoes, baked potatoes, baked yams, candied yams, squash casserole, dinner rolls, croissants, cranberry sauce, giblet gravy and the nearest I came to a vegetable was a piece of celery the size of my pinkie fingernail and a chive. One. Chive. No wonder everybody falls asleep after feasting on so many carbs.

Boy, I'm tired. And I think I want some sushi for dinner this evening. Or some Vietnamese soup. Maybe a movie. Anybody seen Bobby yet? The reviews are cruel. Those who've written the ones I've read must be all Republicans ...

Friday, November 24, 2006

Havel at Columbia University

Václav Havel, the last Czechoslovak president and the first Czech president, is spending 7 weeks at Columbia University. His local website contains a lot of videos of Bollinger, Clinton, and Havel, and many other things, including the movie called Citizen Havel goes on vacation. Be warned that, except for three funny sentences, Havel prefers to speak Czech. ;-)

It's been more than 15 years since my admiration of Havel became finite and started to be regulated by various considerations and by the realization of his limitations, but I still think that his presence is a clearly positive contribution to Columbia University. Whom do you think of these days when someone says "Columbia University"? Years ago, you would recall that Milton Friedman got his PhD there. And today? While there are great people like Brian Greene and others over there, the university has become a symbol of various fringe America-haters, science-haters, crackpots, and similar groups and movements.




For example, some students have recently destroyed a talk by Jim Gilchrist of the Minuteman Project, an organization of volunteers who try to stop illegal immigration and who have been praised by people as moderate as Arnold Schwarzenegger. As Scott, a decent young Republican, pointed out at the end of the video, the show was a display of Columbia's poor intellectual capabilities: the storm troopers who took over the stage were animals. No conservatives would ever behave the way these liberal Democrats did. Can you imagine right-wingers doing something analogous during Mohammad Khatami's talk at Harvard - a talk that was undoubtedly more controversial?

Needless to say, the leadership of Columbia University is more or less openly proud of this garbage that quite clearly dictates the political atmosphere at that university.

Smetana and My Country: Alaska



The video has nice pictures from Alaska and exceptional background music by Bedřich Smetana, the Czech national composer. Nevertheless, their combination may be viewed as a manifestation of ignorance and insensitivity because the true message of the symphonic poem Vltava (The Moldau) is entirely lost.

In reality, the composition describes the Czech national river, Vltava, from its two springs to its junction with Labe (The Elbe). Vltava is the national river because it is the longest of the rivers whose entire course fits within the Czech lands.

At the beginning, you hear a single flute whose tones quickly rise and fall, oscillating like the waves on a heterotic string. That's nothing else than Teplá Vltava (The Warm Moldau), one of the creeks that finds its zigzag path through the hilly meadows of Šumava (The Bohemian Forest) and licks its grassy shores in Southern Bohemia.

Another flute joins the first one after a few moments: that's Studená Vltava (The Cold Moldau), which has added its forces to Teplá Vltava near Mrtvý Luh (The Dead Lea). Both of them meander between the swamps until the melody culminates in the wide and lofty theme of the children's song "Kočka leze dírou, pes oknem" (The cat is crawling through the hole and the dog through the window, see 0:25-0:48 of this video) that represents the true pillar of the whole symphonic poem. This theme, which has also inspired Samuel Cohen, the author of the Israeli national anthem, is a clear sign that Vltava has spread its molecules into the wide area of Lipenská přehrada (The Lipno Dam), whose construction was apparently predicted by Smetana years before it started.

The river is getting stronger. Around 4:00 we hear some Czech countryside themes as the river probes the celebrating villages south of the capital: yes, it is a wedding and the bride seems to like polka. Vltava continues through the forests, between the proud castles and chateaux standing on the hills. During a moonlight scene, water nymphs are dancing around. Vltava slowly approaches Prague around 9:00. At 10:00, we hear the original C-major theme of the children's song.



At 10:20, the old Vyšehrad Castle appears on the right bank, as you can figure out from the tones of another of Smetana's symphonic poems from the six-part cycle My Country, namely Vyšehrad. In the last minute, the river leaves Prague and slowly joins Labe, which disappears into Germany in the North, and that's the end of the story.

Bedřich Smetana was chosen as a competitive national composer by the Czech intellectual movement even though he (Friedrich Smetana or Frederick Whipping-Cream, if you wish) could only speak German until he was 60 years old or so - and he became deaf soon afterwards. Antonín Dvořák, who was 17 years younger than Smetana, was a real patriot and, while in the U.S., he was genuinely homesick. Nevertheless, the roles had already been assigned. I posted this comment because they were just playing Vltava on a Boston classical music radio station.

Heavy ion physics and AdS/QCD

The Lagrangian of QCD was written down more than 30 years ago and I think it is fair to say that we have known for many decades that it is the correct fundamental description of the strong interactions, which include nuclear physics and heavy ion physics among other subfields. Even the Nobel prize committee has agreed with this assertion for more than two years.

From the viewpoint of a completely idealized theorist, the main questions about the strong interactions have been settled for a very long time. And it has been a good idea for these theorists to re-focus on new physics at shorter distance scales.

But physics is not just about the fundamental Lagrangians, as some of the idealized theorists might think. It is about the understanding of all possible phenomena and about predicting the outcomes of experiments in a wide variety of physical situations. What did people have to do in order to get a better grasp of nuclear physics, heavy ion physics, and similar "messy" fields that exhibit complex behavior whose essence is captured by QCD?

Well, people had to perform a lot of experiments, find phenomenological laws, and justify these laws from the fundamental equations. The physicists had to think about a plethora of possible new approximations, concepts, and ideas about how to visualize what's going on in a more accessible way.

Some of these ideas look like a systematic - and, in principle, arbitrarily accurate - mathematical treatment of the fundamental QCD Lagrangian: perturbative QCD and chiral perturbation theory. Others don't. People had to design models with new emergent objects that they could discern in the mess, and they had to think about new phases in the QCD phase diagram. The quark bag model and the color glass condensate are examples.

Holography




The most far-reaching new technique in the research of strongly coupled gauge theories is undoubtedly the AdS/CFT correspondence. The previous sentence doesn't mean that the AdS/CFT correspondence has suddenly become the only tool to be used in nuclear physics or heavy ion physics. But it does mean that many people have very good reasons to expect that this tool will be the most efficient one to understand questions about the strong force that have been difficult to understand with the previous tools.

Most of the 5,000 or so followups of Maldacena's original paper deal with the N=4 supersymmetric gauge theory in four dimensions. This theory has many theoretical advantages: the maximal amount of supersymmetry constrains its dynamics in such a powerful way that we not only know the exact Lagrangian of the boundary CFT but we also pretty much know how to expand the physics around the opposite point, using the bulk AdS description.

This N=4 theory exhibits S-duality - that is important not only for the Langlands program - and it has many features that allow us to study its properties using some of the simplest backgrounds of string theory. The rules to calculate its on-shell Green's functions using the twistor methods and CSW prescriptions look particularly natural. The theory may even be integrable. I could go on and on and on but the main point is clear: this theory is a priceless theorists' playground.
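For orientation, the standard entries of the dictionary in the N=4 case - textbook relations in the usual conventions, quoted here rather than derived - tie the gauge couplings to the radius of the AdS_5 x S^5 background and to the string coupling,

  \lambda \equiv g_{\rm YM}^2 N = \frac{R^4}{\alpha'^2}, \qquad 4\pi g_s = g_{\rm YM}^2,

so the classical supergravity description in the bulk becomes reliable exactly when both N and the 't Hooft coupling \lambda are large.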

But it is not just the pure theorist's perspective that makes the N=4 gauge theory important. This theory has many features - qualitative and, in some limits, quantitative - that seem to be universal for all gauge theories, or at least all confining or conformal gauge theories. Although a theory as fancy as the N=4 theory could have seemed an unexpected starting point for studying the real world, it is becoming increasingly clear that this theory is a very important zeroth approximation for computing and understanding strongly coupled gauge theories in general.

This relevance of a seemingly abstract theory for the understanding of reality couldn't have been obvious from the beginning. But it seems to be rather well-established today: such a new piece of knowledge is what I personally call progress, because it shows which effects and fields are actually relevant for particular observed phenomena rather than simply confirming someone's preconceptions and fitting them with a random phenomenological model. Also, all sensible physicists agree that for theories like the N=4 theory and probably many others, the holographic correspondence is not only true but is also a very important lesson about the character of gravity and gauge forces in our quantum world.

QCD and holography

With these successes and this theoretical depth in mind, it wasn't unexpected that many people started to apply the AdS/CFT methods to the real QCD: this direction of research is called the AdS/QCD correspondence. It is work in progress but it has already led to surprisingly good results. Of course, just like in other approaches, it is not guaranteed that the successes will keep piling up indefinitely.

The real QCD doesn't have any supersymmetries: the fermions and scalars in the adjoint representations are missing. On the other hand, the real QCD has quark fields in the fundamental representation. This difference is unimportant for some phenomena - dominated by the pure gauge field sector - and important for other phenomena. Whenever the differences are important, it is usually possible to keep on refining the dual bulk description in order to get more accurate results.

I don't think that the real goal here is to find a completely accurate dual description of QCD: in some sense, I tend to think that the QCD Lagrangian as we know it is the only truly complete and accurate description of QCD physics and the best definition of a dual bulk theory is again the QCD Lagrangian. ;-) But the string theory methods are certainly extremely useful to get a better intuition for many situations and to generate new kinds of expansions: for example, at temperatures higher than "Tc", supersymmetry is broken anyway and the N=4 theory is expected to agree with observable phenomena.

Also, the dual stringy methods based on quantum gravity only become useful and gravity-like in the limit of a large 't Hooft coupling, which usually requires a large number of colors N. In QCD, we have N=3, which is pretty close to infinity - in the sense that 1/N = 1/3 is pretty close to zero - and that explains why even the first terms of the 1/N expansion gave results that were more encouraging than expected.
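To put numbers on the counting - standard 't Hooft bookkeeping, not anything specific to a particular AdS/QCD model - the corrections from non-planar diagrams of adjoint fields are suppressed by powers of 1/N^2 at fixed 't Hooft coupling,

  \lambda = g_{\rm YM}^2 N\ \ {\rm fixed}, \qquad \frac{1}{N^2} = \frac{1}{9} \approx 11\%\ \ {\rm for}\ N=3,

while quark loops in the fundamental representation bring in corrections of order N_f/N, which are larger but often still manageable.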

Some blog articles about the AdS/QCD program include:

As I have suggested, the AdS/QCD approach can't be expected to end up with a different rigorous form of the QCD Lagrangian because at a finite coupling, the dual bulk theory is not local and the incorporation of all the necessary corrections would make the very definition of the dual theory very complex.

But because the complex and non-local terms are suppressed by a small parameter such as powers of 1/N, the approach can be expected to describe an increasing ensemble of observations with increasing accuracy. This approach is based on the insight that many theoretical physicists consider the most important result of theoretical physics in the last 10 years, which makes it clear that physicists will continue to pursue it even if they encounter the first hurdles.

When the dust settles, I am pretty sure that the description of many situations based on AdS/QCD will become the favorite one, simply because there are many situations in which the stringy behavior of the strong interactions is undeniable. The correct effective degrees of freedom in different regimes are those dictated by a careful analysis involving string theory, among other things, and there is no way for an informed physicist to deny this fact.

Many RHIC experimenters as well as theorists are really thrilled by this new framework for studying their field. I believe that it is the responsibility of every expert to try to find the most correct answers to the questions that he or she is trying to solve. In this sense, I think that one should expect the heavy ion theorists to try to learn these new things that have already been successful. If they don't, they risk having their field taken over by younger colleagues equipped with more modern, up-to-date tools.

It would be counterproductive and silly for the older heavy ion theorists to be throwing artificial hurdles in front of the new researchers in the heavy ion field who are using tools that their older colleagues could have discovered years ago if they had been more creative. It is very entertaining if some of the old, well-known, but more reactionary members of the heavy ion community offer pictures of Pinocchio in their talks - but being entertaining is not necessarily the same thing as having something to say about physics.



Figure 1: "Say it," the blue politically correct official asked Pinocchio. "Sure, why not? The AdS/CFT correspondence is not the most important insight of the last decade about the behavior of strongly coupled gauge theories," Pinocchio said, and his nose exploded, as it always does when poor Pinocchio doesn't tell the truth.

If someone is more interested in Pinocchio's awards than in the fascinating bounds on the viscosity or on the mean free path - which can never decrease below the de Broglie wavelength, a fact that follows from the Bekenstein bound in the bulk - bounds that are nearly saturated in reality and that can be rigorously calculated from the bulk gravitational theory (and, as far as I know, from nothing else), then he may very well become a subject of Witten's anti-fuel policy. And The Reference Frame will leave the detailed discussion of Pinocchio to Backreaction and Asymptotia, websites that apparently think that Pinocchio is more important for QCD than we do. We prefer the picture, if anything. ;-)
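For the record, the viscosity bound in question is the conjectured Kovtun-Son-Starinets value computed from the bulk gravitational theory,

  \frac{\eta}{s} \ \geq\ \frac{\hbar}{4\pi k_B} \ \approx\ 6.1\times 10^{-13}\ {\rm K\cdot s},

and the quark-gluon plasma seen at RHIC appears to sit within a small factor of this value, far below anything that weakly coupled estimates would suggest.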

For a recent technical but accessible talk about AdS/QCD by Pavel Kovtun, see these files:

See also: