One of the "great" ideas that are being proposed billions of times every day is the idea that the fundamental physical laws of Nature are "discrete". The world is resembling a binary computer - or at least a quantum computer, we're being told very often. "Discrete physics" even has its own USENET newsgroup "sci.physics.discrete" which has fortunately been completely silent since it was created. Various games and "types of atoms" that are supposed to produce spacetime at the Planck scale are even sold as "alternatives to string theory".
I am among those who are convinced that every single proposal based on the idea that "the fundamental entities must be discrete" has so far been a transparent crackpot fantasy. What's wrong with all of them?
Both discrete and continuous mathematics matter
First of all, both discrete and continuous mathematical structures are important for actual reasoning and calculations in physics - and not only in physics. We simply need both categories of tools and theorems. Many people who like to say that only one of them may be fundamental are usually people who don't know the other set of insights well enough - or don't know it at all. And they don't want to learn it. Instead, they want to promote a "theory" implying that it is good if you don't learn it. In other words, they ignore at least one half of the basic math that is needed for physics.
Discrete and continuous concepts are related
Second of all, there are many deep relations between discrete and continuous objects, or between combinatorics and calculus. The Riemann zeta function is a completely continuous function of a complex variable. Nevertheless, it knows more or less everything about the distribution of the prime integers. There are many other examples like that in mathematics, and an even bigger number of them may be found in theoretical physics.
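To show the simplest entry in this dictionary - a standard identity, nothing new - recall that for Re(s) > 1, the Euler product

zeta(s) = sum_{n=1}^{infinity} 1/n^s = prod_{p prime} 1/(1 - p^{-s})

rewrites a perfectly smooth function of a complex variable as a product over all primes; the location of the zeroes of the left-hand side then controls the error term in the prime-counting function pi(x).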
Knots in three dimensions are discrete objects, too; nevertheless, their properties are encoded in correlators of operators in Chern-Simons theory. The Gromov-Witten invariants and other integers associated with manifolds and their topology - integers that seem completely combinatorial in character - may be derived from another continuous theory, namely topological string theory. The integer degeneracies in many contexts are calculated as coefficients of generating functions that are, once again, completely continuous.
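To give one concrete example of such a generating function - again a textbook identity, not anything new: the number p(n) of partitions of the integer n, a purely combinatorial quantity (it also counts the states at level n created by a single tower of bosonic string oscillators), appears as a Taylor coefficient of a smooth function of q:

prod_{k=1}^{infinity} 1/(1 - q^k) = sum_{n=0}^{infinity} p(n) q^n for |q| < 1.

The integers sit inside an analytic function, and it is the analytic function that is the natural object.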
Crackpots are almost always discrete
The proof of Fermat's last theorem - a theorem that has attracted so many crackpots exactly because it looks so simple (and discrete) - is essentially based on geometry and smooth objects, too. Riemann's hypothesis has attracted far fewer crackpots - despite the $1,000,000 award from the Clay Institute. It's because most crackpots are discrete, zero-dimensional objects.
The mathematical insights that allow us to derive many "combinatorial" conclusions from analytical considerations, continuous functions, properties and equations defined over smooth manifolds etc. have been important for at least 200 years. As time goes by, the fundamental concepts that underlie a significant part of our knowledge of mathematics and mathematical physics become more and more continuous in character. The discrete features and integers are increasingly absorbed into continuous structures. The combinatorial concepts are being merged with geometry.
Modern history of continuous dominance
Both the 19th and the 20th century escalated and accelerated this process. The discrete, point-like particles of classical mechanics were largely replaced by continuous fields in the 19th century. Although new discrete processes and objects were observed later, their complete description always turned out to be a continuous one.
The "old" quantum theory with its model of the Hydrogen atom from Niels Bohr became a childish game when the "new" quantum mechanics was constructed. The correct picture that explains the quantization of energies may be defined in terms of a continuous wavefunction. Shockingly enough, the eigenfunctions of a Hermitean operator may form a discrete set. Also, there still existed discrete particles. Nevertheless, it turned out that they were again manifestations of a discrete spectrum of a completely continuously defined operator in quantum field theory where it acts on a completely continuous Hilbert space.
The more deeply we understand something, the more continuous and geometric the fundamental objects underlying our descriptions become. The previous sentence is more or less a tautology for objects: "objects" only look "discrete" when we describe them in a superficial manner, without looking into their structure. The discrete social security number is only a good description of a person for those who are not interested in anything else connected with the person.
String theory continues in the same direction. In quantum field theories, the identities of the elementary particles - and their charges and masses - were discrete in character. There were dozens of types of "matter" in the Standard Model. In string theory, there is only one type of matter, and the discrete choices only emerge as properties of eigenstates of another Hamiltonian. Also, many "discrete" phenomena and features of low-energy physics are provided with a geometric interpretation and realization emerging from string theory; it often involves additional dimensions of space.
Future: more geometric
I find it rather obvious that this process will have to continue if we are to keep making progress in our understanding of the fundamental mathematics underlying the laws of the Universe. All kinds of discreteness will have to be derived from a starting point that is rather continuous. A new kind of fuzziness - non-commutativity - will be introduced into our physical laws while keeping them quantitative and predictive. Try to look for the discrete objects and choices that are used in our current laws of physics, and in string theory in particular; they must be explained by a deeper principle that is inherently continuous and geometric, where the word "geometry" is used in a generalized sense.
Discreteness is always a derived concept if you look carefully enough. Discreteness is emergent if you wish; it can never be quite fundamental.
Even though the previous paragraphs were mostly about the future, I have also emphasized that the same kind of progress has been dominant in mathematics and theoretical physics for at least 200 years. This is why I am so flabbergasted by the huge number of people who seem to be interested in maths and physics but who have not yet realized what the trend has been for two centuries and who find their childish "discrete" theories so sexy.
As far as I can say, none of these theories has anything to do with physics. What do I mean by physics? I mean the actual phenomena that we observe in Nature and their mathematical description, which always happens to be quite unique. Examples? The Hydrogen atom. Other atoms. Chemistry. Newton's laws. The Lorentz invariance, the translational invariance, the rotational invariance, the required gauge symmetries and diffeomorphisms - all of them continuous symmetries. Superconductors. Particle scattering. Radioactive decay. The laws of quantum field theory and general relativity into which we must embed the previous theories to agree with all the observed phenomena. String theory, which is absolutely needed if we want to agree with quantum field theories as well as general relativity within a unified framework.
The "discrete geniuses" don't care about a single among these experimental constraints. Frankly speaking, their theories may have been silly even in the ancient Rome. They resemble Maxwell's pathetic and totally redundant models of aether from the 19th century - whose naivité was truly understood only when Einstein wrote down his special relativity, after appreciating insights by Hendrik Lorentz (who understood that only one electric and only one magnetic vector exists at each point and even in vacuum) - except that Maxwell's model of the luminiferous aether at least could agree with Maxwell's equations - and FitzGerald has actually constructed a working model out of many gears and wheels.
Modern "aether geniuses"
The modern "aether geniuses" don't care about a single physical effect in the list above or many other lists for that matter and they don't even try to construct the models because they probably know very well that they could not work. (The worse among them don't even care about quantum mechanics.) They may propose a theory that the Universe looks like LEGO or a binary computer or a binary quantum computer with a few operations. And they expect others to believe that all of physics will miraculously be reproduced. The Lorentz invariance will suddenly emerge without any reason (or maybe after a few (?) parameters of their LEGO building blocks are adjusted); the atoms will also emerge and they will have the right spectra. Gravity suddenly starts to act according to Einstein's laws. Why not? They have the great idea, don't they? The Universe is just like a simplified model of Commodore 64 - that's exactly what you need to revolutionize all of science.
A technical paragraph: Jacques Distler seems to be confused by my statement that the Lorentz symmetry can never emerge without a good reason, and his example is lattice QCD, apparently in Euclidean spacetime (and he apparently assumes some discrete symmetries that are nearly equivalent to the Euclidean Lorentz symmetry). I am talking about the actual "SO(d-1,1)" Lorentz symmetry in the Minkowski space. If you start with a non-Lorentz-invariant theory, there is a separate dimensional analysis for operators with respect to the scaling of time and the scaling of space. You definitely cannot derive that the IR limit is Lorentz-invariant for a generic non-relativistic field theory, and if Jacques ever did so, it's only because he already assumed the Lorentz invariance - a symmetry between space and time - from the beginning. The best thing that can happen is a theory with different speeds of light for different particles, where a finite number of IR parameters must be adjusted. This is, however, not the case if you start with a generic non-relativistic theory. Then infinitely many fine-tunings are needed for all the operators with arbitrarily many spatial derivatives (equivalently, in the language of lattice theories, for new interactions that are less and less local). Infinitely many miracles are needed to get a Lorentz-invariant theory from a generic non-relativistic interacting theory. You can only get to the point where the number of terms to adjust is finite if you assume that the theory will be a small perturbation of a Lorentz-invariant theory (so that the relativistic dimensional analysis is applicable), and that is simply not the right assumption when you start with a generic non-relativistic theory.
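For readers who want to see the scaling argument written down - a schematic sketch with a single scalar field standing in for the generic theory: nothing in a non-relativistic setup forbids the infinite tower of terms

L = (dphi/dt)^2 - c_1 (grad phi)^2 - c_2 (grad^2 phi)^2 / M^2 - c_3 (grad^3 phi)^2 / M^4 - ...

A Lorentz-invariant IR limit requires not only that c_1 flow to a common value for every field in the theory (so that all the "speeds of light" agree) but also that each of the infinitely many higher coefficients c_2, c_3, ... be suppressed; a generic discrete model has no symmetry that would enforce any of these tunings.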
God is just like Lord Sinclair except that God is a bit lazier, so He created a simpler computer. At any rate, deep mathematics is certainly not needed. Every ordinary person may understand the "theory of everything", they believe. All the sins of physics between 1905 and 2005 - the sins that have made physics counterintuitive and too abstract for the farmers - will be undone, they seem to think.
Can this kind of simplicity be a selection criterion in physics? Certainly not. Nature does not care how much time we need to spend to understand some of Her fancy concepts. Nature is simple, but the simplicity is only revealed once we accept Her rules of the game. The layman's "psychological" feeling of simplicity has nothing whatsoever to do with the kind of "beauty" that has become a good guide in the search for new and better theories. As the Russian physicist Okuň once said, "Prostoy prostoty ne budet." There will no longer be any simple simplicity.
Do they really believe it?
Whenever I see these proposals that Nature must be a simple discrete system and everything else will hopefully work out, it's hard to avoid the obvious question: have they completely lost their minds? In 2005, a usable theory must agree with billions of observations, and indeed, the Standard Model and General Relativity do agree with them. What miracle makes it happen? It's because the Standard Model and General Relativity satisfy many important principles that more or less imply the right physics, at least qualitatively - because the laws of mathematics are pretty strict once these principles are taken seriously. The principles are often almost enough to choose the right track or even the right theory.
Principles of physics matter
The Lorentz invariance of our theories guarantees that millions of properties of the actual collider observations will match with reality. The postulates of quantum mechanics are needed to agree with millions of experiments and many of their common features. Quantum mechanics plus special relativity implies that physics must look much like quantum field theory.
Quantum field theory only allows certain types of fields and a very small number of relevant and marginal interactions. This simple classification of interactions is only possible because of a symmetry between space and time. When you think about the possibilities, there are not too many, and because the principles are apparently right, our theories agree with experiments. A similar comment applies to General Relativity, too. One just can't throw QFT and GR into the garbage bin. It is absolutely critical for any theory that is meant as a fundamental theory to reproduce the successful features of GR and QFT. This can be shown to be the case for many dual descriptions of reality that emerge from string theory, even though some of these facts could seem like fascinating miracles at the beginning; and it can easily be shown not to be the case for all the other "alternatives", especially the "discrete alternatives", that have ever been proposed as of November 2005.
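To be explicit about the counting - standard material, included only to show how restrictive it is: relativistic dimensional analysis in four dimensions assigns mass dimension 1 to scalar and gauge fields and 3/2 to fermions, so the relevant and marginal operators (dimension at most 4) form a short list of the type

phi^2, phi^3, phi^4, psi-bar psi, phi psi-bar psi, (D_mu phi)^2, psi-bar gamma^mu D_mu psi, F_{mu nu} F^{mu nu}.

Everything else is irrelevant at long distances. Note that this counting assigns a single dimension to time and space derivatives alike - i.e. it already presupposes the symmetry between space and time.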
But the "discrete geniuses" won't ever listen. They don't want to hear a 30-second proof that their theories can't work. This is not about any rational debates. The Discrete Universe is one of the postmodern religions. Why are there so many believers? The expansion of the computer industry may offer a clue.
Miracles in a discrete world
The Lorentz invariance of a generic interacting theory can never "emerge" accidentally from a starting point that is not Lorentz-invariant. The only way it can legitimately emerge is if we can prove that it does - for example, because the theory is exactly equivalent to a manifestly Lorentz-invariant theory. This is the case for Matrix theory or the AdS/CFT correspondence, where the Lorentz invariance is not manifest but, in the appropriate large N limits, it appears because of the equivalence with a manifestly Lorentz-invariant description of the same physics.
But assuming that it will happen in a random discrete theory due to some miracle, without having a single rational reason why it should happen or why the discrete model should have anything to do with reality, makes Intelligent Design look like a highly reasonable up-to-date scientific theory in comparison.
It's just completely unreasonable to assume that a random, cheap, and childish discrete toy model will give you the Lorentz invariance, the physics of GR, the right atomic spectra, or the Standard Model. It's not only silly because it is too optimistic; we have dozens of rigorously proved theorems - the Weinberg-Witten theorem constraining emergent gravitons, to pick one example - that show that such things can't work. String theory is able to circumvent many of these theorems because it is a very sophisticated theory that only differs from quantum field theories in very subtle ways. But the naive discrete models simply can't evade these theorems. The theorems were constructed to kill these silly theories, and they did so. I am aware that the first sentence of this paragraph implicitly says that thousands of people who are viewed as "alternative physicists" are completely unreasonable, and indeed, I don't think that their large number makes their opinions more justifiable.
The discrete religion can't be killed
At any rate, we will be hearing these things again and again because they have become a part of the "postmodern" era. The Universe is like an "Intel 8080" microprocessor or a simple type of quantum computer with two or three gates. The Universe is a discrete history of tetrahedra, dodecahedra, or any other Platonic polyhedra you can think of. The Lorentz invariance, gauge symmetries, diffeomorphism symmetries, chiral interactions, or the Higgs mechanism don't matter.
They will surely work out properly if the basic idea - namely the idea that the Universe is a simple computer that a mediocre geek can comprehend (an idea that unfortunately does not show the tiniest glimpse of being realized in Nature) - is accepted and promoted to a new dogma. There will be growing pressure on all of us to treat the "discrete geniuses" as physicists and never mention that we believe they are really morons. This is simply how this postmodern era works.
Once again, I think that the statement that the proponents of Intelligent Design are less reasonable or more religious than the proponents of "discrete theory solves it all" hypotheses is a politically motivated and biased assertion. They're on the same level. Both of them may be described as pseudoscientific garbage that attempts to deny virtually everything we have learned in science (physics or biology) in the last 200 years. Both of them need a huge, rationally unjustifiable belief that seems to contradict everything we know and that is only supported by the dogmas themselves - dogmas that have seen no contact with the observable world whatsoever.
What has provoked me to write this rant?
I was asked what I thought about Seth Lloyd's ideas about quantum gravity. Seth Lloyd is a great expert in quantum computing whom many of us admire. In a semi-popular paper, he argued that the Universe can only hold about 10^{90} bits. Well, today we believe that the number is probably above 10^{100} because of the black holes at the galactic centers: the microwave background no longer dominates the entropy of the Universe. But except for this detail, Lloyd's statement is OK. We call this information "entropy" (up to a factor of ln(2)) - and it is probably the only quantity discussed by Lloyd that can be defined in physics.
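The conversion is the trivial part: a system with entropy S carries

I = S / (k ln(2))

bits of information, so the 10^{90} bits is just the entropy of the matter and radiation in the visible Universe quoted in a different unit.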
But he also talks about an upper bound of 10^{120} "operations" (the term being imported from computer science) that could have been performed in the Universe. I am convinced that no one has ever defined an "operation" of the Universe. (It could be a spacetime path-integral counterpart of the coarse-grained entropy formula, but I remain skeptical that such a definition is possible at all.)
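If I reconstruct the origin of the 10^{120} correctly, it comes from the Margolus-Levitin bound: a quantum state with energy E above the ground state cannot evolve into an orthogonal state faster than once per pi.hbar/(2E), so the total number of "orthogonalizing steps" is at most

N < 2 E t / (pi hbar) ~ 10^{120}

for E the total energy and t the age of the Universe. That inequality is fine as far as it goes; the problem, explained below, is that calling each orthogonal transition an "operation" already presupposes a preferred basis.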
On the contrary, I am convinced that "operations" can only be defined in systems that approximately behave as binary (or other discrete) computers and where we pick a privileged basis of the Hilbert space. One of the important principles of quantum mechanics is that its Hilbert space has no privileged basis in general. The number of "operations" then has no meaning in fundamental physics. The only similar concept that matters in fundamental physics is entropy and its time-derivative, and even these concepts are well-defined only when we define some "macroscopic quantities" that divide the microstates into groups. The increase of the total entropy is guaranteed to exceed the number of "operations" - usually by a lot. Computers generate much more entropy than the information they manipulate.
One of his conclusions is that the average operation must take 10^{-13} seconds, the geometric mean of the Planck time and the age of the Universe. This sounds like complete nonsense to me. 10^{-13} seconds is a huge time in particle physics, and a few microprocessors can surely perform more than 10^{13} operations per second. What is the bound supposed to mean? Are these operations per Planck volume of space? Or in environments with some particular bizarre value of the density?
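The arithmetic behind the geometric mean is of course trivial:

sqrt(5.4 x 10^{-44} s times 4.3 x 10^{17} s) ~ 1.5 x 10^{-13} s

(using the Planck time and the age of the Universe), but no physics tells us why the geometric mean of two timescales should be the duration of anything.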
Whatever you do, your dimensional-analysis estimate of the total, global number of operations per unit time will either be clearly wrong, or it will be clearly higher than any practically realizable number of operations and clearly lower than the entropy times hbar divided by the total energy of the Universe. Pretty big window. ;-) And until someone gives me a quantitative definition (either a measurable one, or a function of operators in the theories we treat seriously) of the "number of operations in the Universe", I really don't care which point in the window someone chooses, because it is not physics.
If someone says that the Universe is a family of angels on the tip of a needle, she can also ask how many sisters a particular angel has. If someone says that the Universe is an Apple Macintosh (why I chose this one will be clear after the following paragraph), he can similarly ask how many keys or operations it has. In both cases, it is unphysical nonsense.
One of the happy punch lines is when Lloyd admits on page 15 that our Universe is not a computer that runs Windows or Linux. ;-)
In January 2005, Prof. Lloyd proposed that the world is a particular quantum computer. Fancy interdisciplinary terminology notwithstanding, it is essentially a paper about the Regge calculus, with all the typical careless exercises. At the end of the paper, it is even suggested that - essentially - among the Standard Model gauge groups, SU(3) x U(1) may act on the wires while SU(2) may act on the gates of the computer. Honestly, I really don't know whether it is a joke or not. (If it is not, it is at least as fascinating a representation of the Standard Model group as the intersecting brane worlds. Of course, my psychological certainty that this can't work exceeds 99.999%.) Frankly, it seems funnier to me than the Bogdanovs' suggestions - especially because Lloyd's ideas refer to a system that we know pretty well: the Standard Model.