Some weeks ago, I was studying some papers and talks by Jack Ng. I am going to share with you what I have learned from them, about the wonderful subject of the “quantum spacetime foam”.
Some topics and ideas I will cover: holographic cosmology, MONDian Dark Matter and Dark Energy, plus some other uncommon subjects.
Firstly, let me quote J. A. Wheeler:
“Probed at the smallest scales, spacetime appears to be very complicated”
It is something akin in complexity to a turbulent froth, which Wheeler himself dubbed “spacetime foam”.
The big idea of quantum spacetime and quantum gravity is that spacetime itself could be “foamy” and “emergent” at very small scales, in a way that somehow fits both the classical and the quantum theories. Even the spacetime topology could change at the quantum level! But how large are spacetime fluctuations? How foamy is the quantum spacetime?
A simple argument based on known theories provides interesting answers. On general grounds, we expect some uncertainty $\delta l$ in the measurement of any distance $l$ (we could call this the principle of the microscope: we alter spacetime when we observe it):

$$\delta l \gtrsim l^{1-\alpha}\, l_P^{\alpha}$$

On average, we expect the above bound, some kind of “generalized uncertainty principle” (GUP). There, $l_P$ is the Planck length, defined by

$$l_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6\times 10^{-35}\,\mathrm{m}$$
and $\alpha$ is some parameter related to the theory of quantum gravity we consider. Models of quantum gravity provide different values of $\alpha$, typically $\alpha \sim 1$: for instance, $\alpha = 1$ (the original Wheeler foam), $\alpha = 1/2$ (the random walk model) and $\alpha = 2/3$ (the holographic model). However, it could be possible that the relevant quantum gravity length is not $l_P$ itself! That is, we can build theories with $\delta l \gtrsim l^{1-\alpha}\, l_{QG}^{\alpha}$, where $l_{QG} \geq l_P$. Therefore, the scale where quantum gravity appears can generally be $l_{QG} \geq l_P$, and it could even be closer to the electroweak/quark scale. At least from these simple arguments it could. Of course, there are many other arguments that make such a claim more unlikely, but I want to be open minded at this point.
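As a quick numerical illustration of how strongly the exponent $\alpha$ matters, here is a small sketch of mine (the three values of $\alpha$ are the models quoted above; the benchmark distance $l = 1\,$m is an arbitrary choice):

```python
# Sketch: fluctuation delta_l = l**(1-alpha) * l_P**alpha predicted by the GUP
# for three popular choices of alpha. All lengths in meters.
l_P = 1.616e-35   # Planck length (m)
l = 1.0           # benchmark distance: 1 meter

def delta_l(l, alpha):
    """Fluctuation predicted by the generalized uncertainty bound."""
    return l**(1.0 - alpha) * l_P**alpha

wheeler     = delta_l(l, 1.0)       # alpha = 1   (original Wheeler foam)
holographic = delta_l(l, 2.0/3.0)   # alpha = 2/3 (holographic model)
random_walk = delta_l(l, 0.5)       # alpha = 1/2 (random walk model)

print(f"alpha = 1   : {wheeler:.2e} m")
print(f"alpha = 2/3 : {holographic:.2e} m")
print(f"alpha = 1/2 : {random_walk:.2e} m")
```

Note the hierarchy: a smaller $\alpha$ means much larger (and hence more easily testable) fluctuations.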
Therefore, one of the most general GUPs (Generalized Uncertainty Principles) in Quantum Gravity provides the general bound

$$\delta l \gtrsim l^{1-\alpha}\, l_P^{\alpha}$$
Two accepted theories provide similar bounds.
1) Quantum Mechanics. The Margolus-Levitin theorem states an upper bound for any computational rate in terms of the available energy $E$ as follows:

$$\nu \leq \frac{2E}{\pi\hbar}$$

It is a very well understood bound in the world of Quantum Computing.
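To get a feeling for the Margolus-Levitin bound, we can evaluate it for one kilogram of matter, which reproduces the famous “ultimate laptop” estimate of S. Lloyd (a rough sketch with rounded constants):

```python
import math

# Margolus-Levitin bound: maximum number of operations per second
# for a system of energy E: nu_max = 2*E / (pi*hbar).
hbar = 1.0545718e-34  # reduced Planck constant (J s)
c = 2.998e8           # speed of light (m/s)
M = 1.0               # mass of the "ultimate laptop" (kg)

E = M * c**2                       # total available energy (J)
nu_max = 2 * E / (math.pi * hbar)  # maximum operations per second

print(f"Maximum rate for 1 kg of matter: {nu_max:.2e} ops/s")
```

The result, around $5\times 10^{50}$ operations per second, is an absurdly large number compared with any real computer.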
2) General Relativity. In order to prevent black hole formation, we must impose $l \gtrsim r_s$, i.e., the size of any system should be greater than the Schwarzschild radius or black holes (BH) arise! Here

$$r_s = \frac{2GM}{c^2}$$

from general relativity.
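A quick sanity check of the Schwarzschild radius formula for a couple of familiar masses (standard textbook values):

```python
# Schwarzschild radius r_s = 2*G*M / c**2 for familiar masses.
G = 6.674e-11   # gravitational constant (m^3 kg^-1 s^-2)
c = 2.998e8     # speed of light (m/s)

def schwarzschild_radius(M):
    """Radius below which a mass M collapses into a black hole."""
    return 2 * G * M / c**2

r_sun   = schwarzschild_radius(1.989e30)  # the Sun
r_earth = schwarzschild_radius(5.972e24)  # the Earth

print(f"Sun  : {r_sun:.0f} m")              # about 3 km
print(f"Earth: {r_earth * 1e3:.1f} mm")     # about 9 mm
```

Squeeze the Earth below a centimeter and a black hole arises.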
Therefore, we will see that quantum fluctuations of spacetime and of spacetime topology are inevitable in almost any Quantum Gravity approach. In fact, there are some recent speculations about holographic foam cosmology and MONDian Dark Matter that I will explain to you as well.
In the rest of this post, I will generally use natural units with $\hbar = c = k_B = 1$, and several ideas will emerge:
1) The key role of the critical energy density of the Universe.
2) The seemingly inevitable existence of Dark Matter (DM) and Dark Energy (DE).
3) Non-locality and dark energy in the framework of “infinite statistics”.
4) MOND/Cold Dark Matter duality.
Suppose that $\delta l$ is the accuracy with which we can measure any length $l$.
1st Thought (Gedanken) experiment
Suppose there is a clock of mass $M$, and we use it to “probe” spacetime over distances $l$. Therefore, from the previous arguments, we get:

i) Quantum Mechanics. Following the clock during the time $t \sim l/c$ that light takes to travel the distance $l$, the spread of its wave packet gives

$$\delta l(t) \geq \delta l(0) + \frac{\hbar t}{2M\,\delta l(0)}$$

and thus, neglecting the first term on the right-hand side (and taking $\delta l(0) \sim \delta l$), we get

$$\delta l^2 \gtrsim \frac{\hbar l}{Mc}$$
ii) General Relativity. Since the clock must be larger than its own Schwarzschild radius, we obtain (up to a factor 2)

$$\delta l \gtrsim \frac{GM}{c^2}$$
Now, we combine i) and ii), multiplying the two above results, and it reads

$$\delta l^3 \gtrsim \frac{\hbar l}{Mc}\cdot\frac{GM}{c^2} = l\, l_P^2$$

Therefore, we get the nice result

$$\delta l \gtrsim l^{1/3}\, l_P^{2/3}$$

from the QM+GR combination!
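It is instructive to evaluate this QM+GR bound for the largest distance available, the Hubble radius (a sketch; $R_H \approx 1.3\times 10^{26}\,$m is an approximate present-day value):

```python
# Fluctuation delta_l >= (l * l_P**2)**(1/3) accumulated over a distance l,
# evaluated for l equal to the Hubble radius.
l_P = 1.616e-35   # Planck length (m)
R_H = 1.3e26      # Hubble radius (m), approximate present-day value

delta_l_hubble = (R_H * l_P**2) ** (1.0 / 3.0)
print(f"delta_l over a Hubble radius: {delta_l_hubble:.2e} m")
# Even over cosmic distances, the accumulated fluctuation is only
# of order a femtometer (roughly the size of a proton).
```

This smallness is why detecting the spacetime foam is so hard, even with cosmological baselines.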
2nd Thought (Gedanken) experiment
The Universe has 3 dimensions of space and one single time dimension; that fits observations. The holographic idea states that all the information in a spacetime region can be coded into a 2d surface: the maximum information in any region of spacetime is bounded by the area of that region. That is the holographic principle in the formulation of G. 't Hooft, Susskind, Bousso, and others. The number of degrees of freedom (DOF) is bounded according to the expression

$$N \lesssim \frac{A}{l_P^2}$$
The origin of this holographic principle rests in the physics of BH. The BH entropy is proportional to the horizon area, not to the volume as we would expect from a naive Quantum Field Theory approach. That is, $S \propto A$. More precisely,

$$S_{BH} = \frac{k_B c^3 A}{4G\hbar} = k_B\,\frac{A}{4 l_P^2}$$
Let me point out that I consider this last equation as one of the most beautiful equations of current theoretical physics.
Now, divide a cube of volume $l^3$ into smaller cubes of infinitesimal size $\delta l$. Assume that there is 1 DOF per small cube. Then, the number of DOF is

$$N = \left(\frac{l}{\delta l}\right)^3$$

Then, the holographic principle provides

$$\left(\frac{l}{\delta l}\right)^3 \lesssim \left(\frac{l}{l_P}\right)^2$$

Therefore, the holographic principle provides this holographic GUP:

$$\delta l \gtrsim l^{1/3}\, l_P^{2/3}$$
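A tiny consistency check of the counting above: if $\delta l$ saturates the holographic GUP, the number of cells $(l/\delta l)^3$ equals exactly the holographic maximum $(l/l_P)^2$, at every scale (the comparison is done in log space to avoid overflow):

```python
import math

# Check: with delta_l = l**(1/3) * l_P**(2/3), the cell count (l/delta_l)**3
# exactly saturates the holographic area bound (l/l_P)**2.
l_P = 1.616e-35   # Planck length (m)

for l in (1e-10, 1.0, 1e26):   # atomic, human and cosmic scales
    delta_l = l**(1/3) * l_P**(2/3)
    log_cells = 3 * (math.log10(l) - math.log10(delta_l))  # log10 of (l/delta_l)**3
    log_bound = 2 * (math.log10(l) - math.log10(l_P))      # log10 of (l/l_P)**2
    assert math.isclose(log_cells, log_bound, rel_tol=1e-9)

print("holographic counting saturates the area bound at all scales")
```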
3rd Thought (Gedanken) experiment
Take any GPS and a good clock. How accurate can these clocks be to map out a spacetime volume with radius $L$ during a given time $T$? Note that the clock has a mass $M$.
The answer is quite straightforward. A spacetime “volume” has dimensions $L^3 \times T$. Using the Margolus-Levitin theorem, a clock of energy $Mc^2$ ticks at a rate

$$\nu \leq \frac{2Mc^2}{\pi\hbar}$$

Thus, we have

$$\#\,\mathrm{ops} \leq \frac{2Mc^2}{\pi\hbar}\,T$$

To prevent BH formation, $M \lesssim \frac{Lc^2}{2G}$, and then

$$\#\,\mathrm{ops} \lesssim \frac{c^3\,L\,T}{\hbar G} \sim \frac{L}{l_P}\,\frac{T}{t_P}$$
The requirement of maximal spatial resolution implies that the clocks tick only ONCE, and every cell occupies a volume (taking $T \sim L/c$, the light-crossing time, so that $\#\,\mathrm{ops} \sim L^2/l_P^2$)

$$\frac{L^3}{L^2/l_P^2} = l_P^2\, L$$

And using the geometric mean to average the above spatial separation, taking the cubic root we get

$$\delta l \sim (l_P^2\, L)^{1/3} = L^{1/3}\, l_P^{2/3}$$
That is, we recover the holographic GUP!
Remark: Maximal spatial resolution requires maximal density or packing
Maximal spatial resolution yields the holographic principle for bits, i.e.

$$N \sim \frac{L^2}{l_P^2}$$
However, IF we spread the cells over both space and time, the temporal resolution should be expected to be similar to the spatial resolution, and we should average the spatial separation over the whole spacetime (not only over space!). Then, the result would be

$$\delta l \sim (l_P\, L)^{1/2} = L^{1/2}\, l_P^{1/2}$$

and the same scaling gives the time separation of two successive ticks, that is, $\delta t \sim (t_P\, T)^{1/2}$. The interpretation of this alternative scenario is similar to that of a random walk model! I mean,

$$\delta l \sim (l_P\, l)^{1/2}$$

is the expected spread after $l/l_P$ random steps of size $l_P$. That is, the time to communicate with the neighbouring (closest) cells is larger than the bound provided by holographic arguments.
From experiments, we could compare these two scenarios by studying the coherence of photons received from distant galaxies. The idea is as follows. The spacetime fluctuations could produce a loss of coherence in photon propagation! The size $\delta l$ of these fluctuations translates into a scatter in the phase of the photons,

$$\Delta\phi = 2\pi\,\frac{\delta l}{\lambda}$$

and images should blur whenever $\Delta\phi \sim 2\pi$. In fact, the clean interference pattern measured from the distant quasar PKS1413+135 provides

$$\Delta\phi \lesssim 2\pi$$

It shows that the holographic idea is compatible with the data and, in fact, that our above random walk model is ruled out, since its predicted $\Delta\phi$ is far above the observed bound!
Indeed, it is expected that the Very Large Telescope Interferometer will reach a resolution close to the holographic prediction and, in principle, we could be able to confirm or rule out the holographic idea! Therefore, quantum gravity and the spacetime foam are testable!
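We can redo the PKS1413+135 estimate numerically. I take the distance and wavelength usually quoted for this test ($l \approx 1.2\,$Gpc, $\lambda \approx 1.6\,\mu$m); this is only an order-of-magnitude sketch, since the full analysis of how fluctuations accumulate along the line of sight is subtler:

```python
import math

# Phase scatter Delta_phi = 2*pi*delta_l/lambda accumulated by photons
# from the quasar PKS1413+135, for the random walk and holographic models.
l_P = 1.616e-35     # Planck length (m)
Gpc = 3.086e25      # 1 gigaparsec in meters
l = 1.2 * Gpc       # distance to PKS1413+135 (approximate)
lam = 1.6e-6        # observation wavelength (m)

def delta_phi(alpha):
    """Phase scatter for the GUP model with exponent alpha."""
    delta_l = l**(1.0 - alpha) * l_P**alpha
    return 2 * math.pi * delta_l / lam

phi_rw   = delta_phi(0.5)       # random walk model
phi_holo = delta_phi(2.0/3.0)   # holographic model

print(f"random walk : Delta_phi = {phi_rw:.1e} rad")    # >> 2*pi -> ruled out
print(f"holographic : Delta_phi = {phi_holo:.1e} rad")  # << 2*pi -> still allowed
```

The random walk model predicts a phase scatter of tens of radians, destroying the interference pattern, while the holographic model predicts a scatter many orders of magnitude below $2\pi$.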
Spacetime foam and Cosmology
The Universe can be thought of as some kind of computer. It performs operations, and its software is the known physical laws. In fact, from basic Statistical Mechanics, entropy counts information:

$$I = \frac{S}{k_B \ln 2}$$

Therefore, the number of bits should be proportional to the entropy, and with the holographic bound,

$$N_{bits} \sim \frac{S}{k_B} \sim \frac{R_H^2}{l_P^2}$$
If we know the amount of information in a given volume and the Hubble radius, we can calculate the distance between the closest cells/bits to be $\delta l \sim (R_H\, l_P^2)^{1/3} \sim 10^{-15}\,\mathrm{m}$. It implies that it takes a time $\delta l/c \sim 10^{-23}\,\mathrm{s}$ to communicate those regions.

If we know the energy density and the Hubble radius, we can use the ML theorem to estimate the computational rate; it gives an enormous number of operations, of order $10^{120}$, over the cosmic history.

If we know the information in a given volume (or surface) and the operation rate, it implies that a single bit flips only about once every Hubble time, $t \sim H^{-1}$, according to the above arguments!
(1) Ordinary matter maps out spacetime with accuracy corresponding to a random walk model.
(2) A random walk model is ruled out by current experiments, and then spacetime can be mapped out more finely with holographic ideas! This idea parallels what we know from Astronomy, Astrophysics and Cosmology: there exists unconventional (“dark”) matter/energy acting as a better microscope to map out the spacetime geometry and topology!
(3) The observed critical cosmic density could support the holographic idea. This is an attractive idea that we must test with further experiments and observations.
The spacetime foam idea implies some kind of holographic foamy Cosmology. There are two main quantities, the Hubble constant $H$ and the Hubble radius $R_H = c/H$. Moreover, we have the following interesting quantities:
a) Critical energy density:

$$\rho_c = \frac{3H^2}{8\pi G} \sim \frac{1}{(R_H\, l_P)^2}$$

b) Bits of information in the whole Universe:

$$I \sim \frac{R_H^2}{l_P^2} \sim 10^{122}$$

c) Average energy per bit:

$$E_{bit} \sim \frac{\rho_c R_H^3}{I} \sim \frac{1}{R_H} \sim H$$

d) Dark energy. It acts like some kind of (dynamical?) cosmological constant over very large scales:

$$\rho_{DE} \sim \frac{1}{(R_H\, l_P)^2} \sim 10^{-47}\,\mathrm{GeV}^4$$

Dark energy carries some class of long-wavelength particles (bits?) with a really tiny amount of energy each, since

$$E_{bit} \sim H \sim 10^{-33}\,\mathrm{eV}$$

Moreover, the critical density behaves as

$$\rho_c \sim \frac{H^2}{G}$$
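Plugging numbers into the counting above (approximate present-day values for $R_H$ and $\rho_c$, dropping the order-one factors as in the rest of the post):

```python
# Holographic bit counting for the observable Universe:
#   I_bits ~ (R_H / l_P)**2       bits of information
#   E_bit  ~ rho_c * R_H**3 / I   average energy per bit
l_P = 1.616e-35    # Planck length (m)
R_H = 1.3e26       # Hubble radius (m), approximate
rho_c = 8.5e-10    # critical energy density (J/m^3), approximate
eV = 1.602e-19     # 1 eV in joules

I_bits = (R_H / l_P) ** 2
E_total = rho_c * R_H ** 3   # energy in a Hubble-size box (J), up to O(1) factors
E_bit = E_total / I_bits     # average energy per bit (J)

print(f"bits in the Universe : {I_bits:.1e}")        # ~ 1e122
print(f"energy per bit       : {E_bit / eV:.1e} eV") # absurdly tiny, ~ H
```

The energy per bit comes out around $10^{-34}$–$10^{-33}$ eV, i.e., of order $H$, exactly the long-wavelength quanta quoted above.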
If we define the scale factor to be $a(t)$, we know that matter dilutes as $\rho_M \sim a^{-3}$, and for radiation-like fluids $\rho_R \sim a^{-4}$; neither is enough to account for the recent/present acceleration of the Universe in the cosmic expansion. Therefore, dark energy/the cosmological constant/vacuum energy seems to exist! It has an equation of state like $p = w\rho$ with $w \approx -1$.
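The standard dilution laws can be packaged into a single line, $\rho(a) = \rho_0\, a^{-3(1+w)}$; a quick sketch checking the three cases just quoted:

```python
# Energy density dilution for a cosmic fluid with equation of state p = w*rho:
#   rho(a) = rho_0 * a**(-3 * (1 + w))
def rho(a, w, rho_0=1.0):
    """Energy density at scale factor a for equation-of-state parameter w."""
    return rho_0 * a ** (-3 * (1 + w))

a = 2.0  # the Universe doubles in size
print(f"matter      (w=0)   : diluted by {rho(1, 0) / rho(a, 0):.0f}")        # 8  = a**3
print(f"radiation   (w=1/3) : diluted by {rho(1, 1/3) / rho(a, 1/3):.0f}")    # 16 = a**4
print(f"dark energy (w=-1)  : diluted by {rho(1, -1) / rho(a, -1):.0f}")      # 1 (constant)
```

Only the $w \approx -1$ fluid refuses to dilute, which is why it inevitably comes to dominate the late Universe.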
The point with Dark Energy is the following. Suppose that there are some “dark” interactions between dark energy and dark matter (DE/DM). They could give transitions between accelerated and decelerated phases of the cosmic expansion! In fact, from Cosmology, the Universe has gone through three stages after the inflationary phase:
1st. Radiation dominated stage: $a(t) \propto t^{1/2}$.

2nd. Matter dominated stage: $a(t) \propto t^{2/3}$.

3rd. (Current era) Lambda/Dark energy dominated stage: $a(t) \propto e^{Ht}$.
See you in my next spacetime foam post!