Sunday, June 5, 2011

Rebooting the cosmos?

When I was on the faculty at Harvard, Ed Fredkin was among the fun and famous people who came to me and wanted to chat on a periodic basis. He told me that Richard Feynman, who had died some time earlier, used to provide him with vital feedback on his ambitious digital theories of the Universe. So Fredkin found it natural that I would take over Feynman's role.

I enjoyed the meeting with him but I couldn't afford to be distracted on a periodic basis.

Moreover, I've been really overwhelmed by such things since the middle of my undergraduate studies. As a freshman, I could have been open to a curious debate about "discrete vs continuous" but I quickly found out that the "discrete champions" simply get stuck on totally elementary things and simple errors. They were apparently denying empirically established principles that I consider essential to the structure of our knowledge of physics - and they may have been denying all empirical data, after all - and further discussions were just dragging me to the bottom of the sea.

It has always seemed to me that I was nearly the only person on the planet who hadn't lost his mind over these matters. These patently wrong opinions - e.g. that the world has to be fundamentally digital - seem to be everywhere even today; the time people spend with computers is arguably making things even worse. I am sure that most of the best physicists don't think it's right - just like the top physicists of the previous generations - but the top physicists today rarely talk to anyone outside the ivory towers, so you don't really know. Even I am not really sure whether they have ever thought about these issues and what they have found.
Update: However, I just learned about the excellent essay by David Tong for the FQXI "discrete vs continuous" essay contest, where he earned a silver prize even though the gold-prize essay is kind of silly. He explains how all integers in Nature are derived, "emergent", and ultimately approximate and/or ill-defined. I would endorse every letter of his essay.

He starts with a silly quote by Kronecker - about the God who made the integers while everything else is an invention of Man - that's been debunked by nearly all subsequent developments in maths. Tong shows that the integers are derived from more fundamental equations, if they exist at all (e.g. the "n" in the Hydrogen atom's energy levels shows that "God only made the complex numbers and everything else follows from Schrödinger's equation"), and ill-defined in many other contexts (the number of planets, particles, particle species, and even dimensions), and he ends with chiral fermions (which can't be put on a lattice) as another way to prove that we don't live in a Matrix. ;-)

A priori, Tong says, the universe could have been discrete. However, remarkably, a posteriori, we were allowed to accumulate a large number of proofs that it is not.
Oh yeah, I have to include a picture of Steven Spielberg to clarify why a waitress wanted Fredkin's autograph. ;-)

So the result is that pop science has totally taken over the broader perception of physics. And the paradigm that the Universe is fundamentally digital is one of those holy and popular albeit flagrantly wrong concepts. I would have enjoyed talking with Fredkin on a periodic basis, but only if I hadn't been exposed to thousands of deluded people who were not willing or able to understand very simple arguments.

Moreover, I was kind of sure that Fredkin must already have heard from Feynman all the disproofs he would hear from me - every sane physicist has to converge to similar ideas about these matters and be able to produce counterexamples to any of the proposed discrete theories, because they're really dictated by the knowledge of the discipline - so I was afraid I couldn't contribute anything truly "novel", because only positive things are viewed as "novel" by the discrete physicists. The digital Universe has become a religion. An insane one.

Phil Gibbs just posted a link to the videos of the World Science Festival in New York and one of them is called "Rebooting the Cosmos: Is the Universe the Ultimate Computer?". Here is the video:


[Embedded video: live stream from worldsciencefestival at livestream.com]

The 96-minute video above features Ed Fredkin, Seth Lloyd, Fotini Markopoulou, and others. I swear that the video worked yesterday, but the sponsors must have run out of money and it offers audio only today. But whenever you click on "pause", it shows you the current screenshot, haha.

I just completed the translation of the relevant chapter of Brian Greene's book, The Hidden Reality. Nine types of multiverses are nicely discussed in the book, although many things - especially about the interpretation of quantum mechanics - are just plain wrong. And of course, Ed Fredkin is discussed in the context of the simulated (and partly definitive) multiverses.

Just to be sure about Brian Greene's terminology, the simulated multiverse is the set of all universe-like simulations that run on computers and simulated computers within computers, and so on. The definitive multiverse is the set of all theories and mathematical structures with all the data, pretty much equivalent to the meta-structure of Max Tegmark.

From the reading of the book, it's clear that Brian Greene is a digital convert, too. So are all the participants of the "rebooted" discussion above. Now, that's crazy. Is it really impossible for the organizers to also invite someone who actually knows how these things work in the real world, someone who could show that all these visions, however attractive to many people, are just plain wrong?

First, it's clear that the Universe isn't a classical digital computer, because of quantum mechanics. To reproduce the experimentally verified correlations - which are too strong for any local classical explanation - and other features implied by quantum mechanics in a classical language, you would need some awkward non-local theory that would violate Lorentz invariance at the fundamental level.

But Lorentz symmetry may be shown to hold - in fact, the coefficients of would-be Planck-scale-suppressed violations are constrained to be smaller than about 10^{-11} - using the birefringence of light from gamma-ray bursts. You would have to believe that all the tests of special relativity and Lorentz invariance are just giant conspiracies. And it's not just one fine-tuning you need to avoid contradictions with the observations: there are dozens of currently doable measurements of effects (and terms in the effective Lagrangian) where Lorentz violations could show up, but they don't. The maximum speed allowed for every particle species we know, and for every bound state, is the speed of light. For each species, you would need one huge fine-tuning to explain why Einstein's 1905 theory works.
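
Just to illustrate the orders of magnitude - this is a back-of-the-envelope sketch, not an analysis of real data, and the photon energy and source distance below are purely illustrative - here is how a hypothetical linear, Planck-suppressed modification of the photon's speed would translate into an arrival-time delay from a gamma-ray burst:

```python
# Back-of-the-envelope estimate (illustrative numbers only): the arrival-time
# delay of a high-energy photon relative to a low-energy one, assuming a
# *hypothetical* linear, Planck-suppressed modification of the dispersion
# relation, v(E) ~ c * (1 - E / E_Planck).

E_PLANCK_GEV = 1.22e19      # Planck energy in GeV
C = 3.0e8                   # speed of light in m/s
GPC_IN_M = 3.086e25         # one gigaparsec in metres

def lv_time_delay(photon_energy_gev, distance_gpc):
    """Delay (in seconds) accumulated over the whole travel distance."""
    travel_time = distance_gpc * GPC_IN_M / C
    return (photon_energy_gev / E_PLANCK_GEV) * travel_time

# A ~10 GeV photon from a burst roughly 1 gigaparsec away:
print(lv_time_delay(10.0, 1.0))   # ~0.08 seconds - a resolvable timing effect
```

The point is that even a Planck-suppressed effect accumulates over cosmological distances into something measurable; the absence of such energy-dependent delays - and, even more strongly, of vacuum birefringence - is what produces the tight bounds mentioned above.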

It's clearly insane and, as implied by Bell's inequality, GHZM experiments, Hardy's "paradox", and many other setups, this insanity would become even worse if you tried to add quantum mechanics to your computer-like description of the Universe.
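
For readers who want the simplest quantitative version of the Bell-type argument, here is a minimal sketch using the standard textbook CHSH setup (nothing specific to the discussion above): quantum mechanics predicts the singlet-state correlation E(a,b) = -cos(a-b) for spin measurements along angles a and b, and the resulting CHSH combination exceeds the bound of 2 that every local classical (hidden-variable) model must respect.

```python
# Minimal numerical check of the CHSH inequality for the spin-singlet state.
# Any local classical model must obey |S| <= 2; quantum mechanics reaches
# 2*sqrt(2), the Tsirelson bound.
import math

def E(a, b):
    """Quantum correlation of two spins measured along angles a and b."""
    return -math.cos(a - b)

# Measurement angles that maximize the quantum violation
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))   # ~2.828 = 2*sqrt(2) > 2, the classical bound
```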

Our Universe cannot be a digital computer. This is not a perpetually open, unreachable, philosophical mystery. It's one of the things that science became able to address more than a century ago, and the answer is No, it cannot work - at a confidence level far higher than for many other things we usually present as "certain". Our Universe can't even be a discrete quantum computer. Lorentz symmetry is among several continuous symmetries that simply could never naturally "emerge" out of a fundamentally discrete starting point, even if it were a quantum discrete description.

But even if I forget about the body of empirical evidence that makes it clear that a digital computer - especially a classical digital computer - can't be a description of the world around us, I would still have "philosophical" problems with the idea that the digital viewpoint "should" be more fundamental.

Even if I forget about the experiments and the quantum and relativistic structures that followed from them, I just find it obvious that, pretty much by definition, discrete objects are always less fundamental and less complete than the continuous ones. A discrete description of some object or phenomenon is always an approximation. It's what bureaucrats do with people - they represent them by a few numbers. You use it if you only have a limited number of bytes to discuss something; if you want to draw "caricatures" of reality; if you want to approximate complex things by oversimplified models. Atoms or people as dots or triangles.

But discrete fundamental objects couldn't have well-defined interactions that are constrained by physical principles. There's a kind of duality or trade-off here: if your objects are too discrete, there are too many rules you may create for them - a multi-dimensional continuum of their possible interactions, so to say. The non-renormalizability of loop quantum gravity is an example - you may add infinitely many types of interactions to the spin network and the resulting theory will still belong to the same class of hopeless mess. It may sound paradoxical, but only when you study continuous objects and the continuous conditions that constrain them can you reduce the number of solutions to a discrete set of possibilities. That's how you may derive the whole spectrum of atoms from a single equation; it only works because the equation works with continuous objects. If the fundamental objects are discrete, the number of different theories is continuously infinite. If the objects are continuous, you have a chance that the consistency conditions will actually tell you something.

The old Bohr model of the atom was too discrete a description of the hydrogen atom; the electron was just a point mass and the orbits were bureaucratically constrained in an ad hoc way. Clearly, they could have been constrained in other ways: this discrete model was just adjusted to agree with a pattern in the hydrogen spectrum. It had some OK features but it was obviously naive. A better theory, quantum mechanics, used a continuous wave in 3+1 dimensions. The same theory applied to many bodies has to use wave functions in many more dimensions. In some sense, they're more continuous than the 3+1-dimensional waves. Quantum field theory uses an even more continuous description - the wave functional depends on infinitely many degrees of freedom. In some sense, string theory goes even one step further but it can regulate the exploding complications.
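
To make the hydrogen example explicit: the integer n is not inserted into quantum mechanics by hand; it pops out of a continuous differential equation, because only a discrete set of energies admits normalizable bound-state solutions:

\[
\left[-\frac{\hbar^2}{2m}\nabla^2 - \frac{e^2}{4\pi\epsilon_0 r}\right]\psi(\vec r) = E\,\psi(\vec r)
\quad\Longrightarrow\quad
E_n = -\frac{m e^4}{2(4\pi\epsilon_0)^2\hbar^2}\,\frac{1}{n^2} \approx -\frac{13.6\,{\rm eV}}{n^2},
\qquad n = 1,2,3,\dots
\]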

In many situations, the Hilbert space may be produced by a discrete basis, and so on. But the precise energies etc. of an energy eigenstate basis have the values they have only because there is a fundamental continuous explanation. If the discrete description were the last word, the energies of all the states and the interactions of objects could be changed in infinitely many ways.

So there will always be an interplay between discrete and continuous objects in physics. But it's always the case that when your description looks fundamentally discrete at some level, it's because you haven't understood it well enough yet. The same message works in pure mathematics, too. There are interplays between number theory and complex functions. But the complex functions - such as the Riemann zeta function - are the "more fundamental" perspective on these common problems. It becomes even more obvious if some of the continuous parameters - the physics' counterparts of the zeros of the Riemann zeta function - can actually be measured.
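
For concreteness, the dictionary I have in mind is Euler's product formula and its analytic continuation - a continuous function of a complex variable that knows everything about the discrete primes:

\[
\zeta(s) = \sum_{n=1}^{\infty}\frac{1}{n^{s}} = \prod_{p\ {\rm prime}}\frac{1}{1-p^{-s}},
\qquad {\rm Re}(s) > 1,
\]

and the nontrivial zeros of its continuation control the distribution of the primes through the explicit formulae of analytic number theory.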