Several recent discussions have convinced me to write an essay about conventions in physics, with a focus on field redefinitions. The main question is:
- Can a new set of conventions or coordinates reveal new physics?
- My answer: if you discover new physics using new degrees of freedom, new coordinates, new conventions, or a new approach equivalent to the old one, it may have been harder and less natural to obtain the same results using the old conventions, the old degrees of freedom, and the old approach: but it would almost never have been impossible.
By physics, I only mean the predictions of phenomena, e.g. "Will the object XY explode if you do UV?" Clearly, not all of the formalism is physics. Physical predictions are the quotient of all predictions of your theory modulo all details that you might have chosen differently. There is always a finite number of sign conventions and similar conventions in every theory. Once you fix them, infinitely many signs and other quantities can be predicted. Below, we will not talk about signs as much as about different coordinates and redundancies.
Classical physics
In classical physics, we have to work with configuration spaces "(x1...xn)" or phase spaces "(x1...xn, p1...pn)". In general, these spaces are manifolds and they admit many kinds of coordinates. We must be careful about the ranges of different coordinates and about all kinds of periodic identifications. But that's it. A description of a mechanical system such as the Solar System in terms of Cartesian coordinates is as good as the description in terms of spherical coordinates, among other examples.
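For instance, the Cartesian and spherical descriptions of a point in space are related by

$$ x = r\sin\theta\cos\phi, \qquad y = r\sin\theta\sin\phi, \qquad z = r\cos\theta, $$

where one must remember the ranges "r ≥ 0", "0 ≤ θ ≤ π", and the periodic identification "φ ~ φ + 2π" - exactly the kind of bookkeeping mentioned above.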
In classical field theory, we can redefine not only the coordinates describing the spacetime itself but also the coordinates describing the infinite-dimensional configuration spaces of the fields. Any parameterization is good as long as it covers the physically interesting regions.
In general relativity, the coordinate reparameterization group is the group of local symmetries of the theory. It should come as no surprise that we use all kinds of coordinates and indeed, the competition among coordinate systems is much fiercer: you can rarely say which coordinates are the best ones.
Coordinate singularities
Some coordinates behave nicely everywhere and the "local" physics - which means "local" either in spacetime or in some configuration space - follows some universal smooth laws that could have been determined by some general methods. But sometimes things can look bad. The metric tensor and other fields may diverge. There are cases in which this divergence is just an artifact of bad coordinates.
If you write the black hole metric in the original Schwarzschild coordinates, it becomes singular near the horizon. Some components of the metric tensor go to zero and others go to infinity. Naively, one could expect that no one knows what happens there. But a classical general relativist knows what happens. She can find better coordinates in which the region near the horizon can be smoothly continued into the interior of the black hole. The horizon actually becomes a perfectly regular and ordinary place in spacetime. You can't even say whether you have crossed it or not.
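To be concrete, in the Schwarzschild coordinates (and units with c = 1), the neutral black hole metric reads

$$ ds^2 = -\left(1 - \frac{2GM}{r}\right)dt^2 + \frac{dr^2}{1 - 2GM/r} + r^2\,d\Omega^2 $$

and it misbehaves at r = 2GM: the "tt" component goes to zero while the "rr" component diverges. In the ingoing Eddington-Finkelstein coordinates, with v = t + r* and dr* = dr / (1 - 2GM/r), the same geometry becomes

$$ ds^2 = -\left(1 - \frac{2GM}{r}\right)dv^2 + 2\,dv\,dr + r^2\,d\Omega^2, $$

whose components are perfectly finite at the horizon.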
A quantum gravity theorist may have some doubts about these statements and she can discover some new - probably tiny - effects that distinguish the interior of the black hole from the exterior and that are probably necessary to solve the information loss paradox. Nevertheless, we still expect that the quantum gravity results will reduce to the classical general relativity results in the appropriate limit. Of course, we may be wrong, but this "black holes don't exist" conclusion would represent a drastic violation of our intuition about the locality of the laws of physics.
Imagine that the classical general relativist couldn't find the smooth coordinates. In that case, she would need to find some new laws describing what happens near the particular kind of singularity that resembles the neutral black hole horizon. The answer to this question could potentially introduce another layer of arbitrariness to the theory. But the existence of better coordinates implies that no such arbitrariness exists near the black hole horizon and classical general relativity is able to predict the results of classical observations of infalling observers, too. It is merely a coordinate singularity.
Various sets of coordinates may suffer from various diseases. For example, a coordinate may be unable to describe physics outside the Solar System. If you insisted that it is a good coordinate, you would have to ask what happens when you reach the end of the world near Pluto or Eris. A whole new set of laws would be needed. For example, if you imagine that there is a gate to Heaven at the end of the world, you would have to measure or calculate how many angels are employed as guards. These angels would become a part of physics. Such angels are not a real part of physics in any of the theories that have been considered important or true in the last 500 years or so because there always exist coordinates that cover all the relevant spaces.
Besides coordinate singularities, classical general relativity also leads to physical singularities, where some invariants, such as the curvature invariants constructed out of the fields, diverge. A better theory is needed. String theory is the only known theory - and arguably the only mathematically possible theory - that can tell you what happens near such physical singularities. In some sense, you can view a physical singularity as a coordinate singularity in some large configuration space in which you used coordinates that are only valid far away from the singularity.
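For the neutral black hole, the contrast between the two kinds of singularities is sharp. The Kretschmann curvature invariant

$$ R_{\mu\nu\rho\sigma} R^{\mu\nu\rho\sigma} = \frac{48\,G^2 M^2}{r^6} $$

(again with c = 1) is perfectly finite at the horizon r = 2GM but diverges at r = 0, so no choice of coordinates can cure the singularity at the center.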
String theory tells you to choose different degrees of freedom - and, let's admit, more degrees of freedom - near the singularity. It is actually important that the number of degrees of freedom is different because otherwise the legitimate physical diseases couldn't be cured. With the better string-theoretical degrees of freedom, physics is again fully determined. String theory makes it unnecessary to add new kinds of angels for every new kind of singularity. The actual physics around all these singularities is encoded in the same physical laws.
Hamiltonians and Lagrangians
Newton found the laws of mechanics in terms of differential equations. Some physicists and mathematicians after him were able to write the same equations in a more concise form, e.g. using Lagrangians and Hamiltonians. Theorists love these newer approaches for many reasons but it is fair to say that classical physics could in principle be done without them.
In quantum physics, we sometimes like the Hamiltonian formalism because the Hamiltonian becomes the operator that defines the time evolution. We could still define quantum mechanics in the Heisenberg picture in which the other operators are time-dependent, and write the appropriate differential equations for these operators without knowing that they can be obtained from a commutator with the Hamiltonian.
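In that picture, an operator "A" without explicit time dependence obeys the Heisenberg equation of motion

$$ \frac{dA(t)}{dt} = \frac{i}{\hbar}\,[H, A(t)], $$

and one could, in principle, postulate the right-hand sides of such equations directly, without ever noticing that all of them come from commutators with a single operator "H".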
However, the fact that we know that the Hamiltonian seems to be behind all these equations in all the important cases changes our psychological perception of the space of possible theories and of the data we need to specify a theory. The Hamiltonian is surely a useful concept in many cases but in a very strict physical sense, it is not necessary.
Similar comments apply to Lagrangians and actions. The laws of classical physics may be derived from the principle of stationary action. In quantum physics, we sum over all classical histories and the weight of each history is determined by the classical action, as Dirac and Feynman taught us. Feynman's approach is another, equivalent method to obtain the same amplitudes, and we use the same laws to compute the probabilities.
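Schematically, the transition amplitude is

$$ \langle x_f, t_f\,|\,x_i, t_i \rangle = \int \mathcal{D}x(t)\; e^{iS[x]/\hbar}, $$

where the integral runs over all histories connecting the initial and final configurations and the classical action "S[x]" determines the phase of each term in the sum.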
In this sense, once again, Feynman's approach gives us a new way to look at reality, one where various things may look more natural or less natural than before and where we think about different analogies from everyday life. Many things are easier to deal with and other things may become more obscure. But the functional approach describes the same physics.
Redundant degrees of freedom
Feynman's approach makes Lorentz invariance in field theory much more manifest than the approaches based on the Hamiltonian. Also, it is priceless whenever we use redundant degrees of freedom. Electromagnetism was the first example of a physical theory in which redundant degrees of freedom were useful.
Historically, people would talk about "E" and "B" at each point because these electromagnetic vectors can be directly measured. But it was realized that the same physics can be encoded in the four-potential "A". Its values are not quite physical. Only "E" and "B" may be measured and there are many choices of "A" that give you the same "E" and "B". In quantum physics, the integral of "A" around a closed loop also matters, modulo a periodicity - as demonstrated by the Aharonov-Bohm effect.
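In formulas, "E" and "B" are encoded in the potentials via

$$ \vec E = -\nabla\phi - \frac{\partial \vec A}{\partial t}, \qquad \vec B = \nabla\times\vec A, $$

and the gauge transformation "φ → φ − ∂λ/∂t", "A → A + ∇λ" with an arbitrary function "λ" leaves both of them unchanged. The gauge-invariant Aharonov-Bohm phase of a charge "q" carried around a closed loop is

$$ \exp\left(\frac{iq}{\hbar}\oint \vec A\cdot d\vec\ell\,\right), $$

which only depends on the magnetic flux through the loop.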
Nevertheless, the electromagnetic potential became very useful even though nominally one of its off-shell components is unphysical and can be removed by a gauge transformation. The non-Abelian generalization of electromagnetism, the Yang-Mills theory, includes some extra Jacobians from the gauge-fixing. The path-integral approach makes the Lorentz-covariant calculations of Yang-Mills theories very convenient because these Jacobians can be accounted for by introducing the Faddeev-Popov ghosts, as Feynman figured out.
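In the Lorenz gauge, for example, the gauge-fixed Lagrangian takes the schematic form (the signs and normalizations depend on conventions)

$$ \mathcal{L} = -\frac{1}{4} F^a_{\mu\nu} F^{a\,\mu\nu} - \frac{1}{2\xi}\left(\partial^\mu A^a_\mu\right)^2 - \bar c^{\,a}\, \partial^\mu D_\mu^{ab}\, c^b, $$

where the last term, built from the anticommuting ghost fields "c" and "c̄" and the covariant derivative "D", reproduces exactly the Jacobian from the gauge-fixing.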
This is a whole new approach that has become so standard that most of us don't even know how to perform the analogous calculations - including the Faddeev-Popov loops - in the Hamiltonian formalism. It would still be possible, although much harder, because the manifest Lorentz symmetry would have to be sacrificed. In this jungle of new unphysical but friendly ghost fields, the physical states are obtained as the BRST cohomology of a nilpotent operator, which can be seen to describe the same physics as the approaches without unphysical degrees of freedom.
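More precisely, the BRST charge "Q" squares to zero, Q² = 0, and the physical spectrum is the cohomology

$$ \mathcal{H}_{\rm phys} = \frac{\ker Q}{\operatorname{im} Q}, $$

i.e. the states annihilated by "Q" modulo the states of the form "Q|χ⟩"; this quotient simultaneously removes the ghosts and the unphysical polarizations of the gauge field.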
The Batalin-Vilkovisky formalism extends the BRST formalism and allows you to include ghosts for ghosts, which is very useful if the local symmetry itself becomes very complicated. If you need to know more about this difficult formalism, ask Dmitry Vaintrob. Still, things could in principle be done without this fancy machinery. However, it would be harder to correctly impose all the conditions such as unitarity and Lorentz symmetry.
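At its core, the Batalin-Vilkovisky recipe pairs every field with an antifield and requires the extended action "S" to solve the classical master equation

$$ (S, S) = 0, $$

where "(·,·)" is the antibracket; this single condition encodes the gauge symmetry, its algebra, and the whole tower of ghosts for ghosts.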
Equivalent CFTs
In string theory, we know a lot of dualities that are "hard": we can't even quite prove all of them in full generality because we don't have the full definition of string theory. Some of them can be proven in particular backgrounds that are defined via the AdS/CFT correspondence or Matrix theory. For example, the U-dualities of M-theory on tori up to the five-torus can be proven from its Matrix-theoretical description.
These string dualities, to deserve the name, normally require at least one of the equivalent descriptions to be strongly coupled. We also know many equivalences on the worldsheet. Free bosonic CFTs are equivalent to fermionic ones - an equivalence that can be extended to interacting theories, different kinds of periodicities, and other cases. Many of the free theories are equivalent to non-linear sigma models on group manifolds (the WZW models), to the minimal models - the canonical examples of well-defined theories that don't rely on a classical Lagrangian - and so forth.
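The simplest worldsheet example is two-dimensional bosonization: a free Dirac fermion is equivalent to a compact free boson, with the schematic dictionary (normalizations and cocycle factors depend on conventions)

$$ \psi_\pm \sim e^{\pm i\phi}, \qquad \bar\psi\,\gamma^\mu\psi \sim \epsilon^{\mu\nu}\,\partial_\nu\phi, $$

so that every correlation function of one description can be translated into the other.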
Moreover, perturbative string theory allows us to rely not only on the RNS description but also on various alternative descriptions in terms of pure spinors, hybrid approaches, ghost pyramids, and others. All these approaches eventually agree on the physical spectrum and the particles' measurable interactions but they differ in their choice of intermediate degrees of freedom, ghosts, and various redundant gauge symmetries. Each of them has some advantages and some disadvantages. All these differences can be described as technical differences, although technical differences may become very important if you actually need to complete a real calculation.
I could add a similar discussion about the unphysical choices we have to make when we regularize divergent expressions in quantum field theory, which can be done in many ways, and why these things are eventually irrelevant for the actual physical predictions due to the spirit of the renormalization group, at least within a certain error margin associated with effective field theories.
At any rate, I think that particle physicists and string theorists feel very strongly about the difference between new physical results - which is what really matters - and changes of formalism. From this viewpoint, one can't ever expect to get really new physics just by rewriting old physics in terms of unusual variables such as the BF theory. One shouldn't expect new physics from a parameterization of the momentum space by unconventional coordinates as they do in doubly special relativity either.
Unless you're lucky enough to guess the complete equations of new physics directly, new physics can only be revealed by identifying new possible principles, constraints, or physical mechanisms. A generic choice of new coordinates or a new selection of unphysical, redundant degrees of freedom is not quite new physics and we shouldn't expect it to tell us something that we didn't know before.
Still, it is very easy to imagine that some of the important future discoveries will sound like mere conceptual breakthroughs in the formalism. But in the string theory case, they must do more than just that in order to be really interesting. They must provide us with a key to easily transcend various approximations - weak coupling or a weak deviation from a superselection sector - that restrict our computational abilities today and that allow controversies about things like the vacuum selection problem to thrive. As we mentioned at the beginning, it is probably not impossible to find the new physical laws without a new, better formalism. But it could turn out to be damn hard.
And that's the memo.