LOG#131. Spacetime foam (I).

Image: Tomás Saraceno, © 2012

Some weeks ago, I was studying some papers and talks by Jack Ng. I am going to share with you what I have learned from them, about the wonderful subject of the “quantum spacetime foam”.

Some topics and ideas I will cover: holographic cosmology, MONDian Dark Matter and Dark Energy, plus some other uncommon subjects.

Firstly, let me quote J. A. Wheeler:

“Probed at the smallest scales, spacetime appears to be very complicated”

Something akin in complexity to a turbulent froth, that Wheeler himself dubbed “spacetime foam”.

The big idea of quantum spacetime and quantum gravity is that spacetime itself could be “foamy-like” and “emergent” on very small scales. It fits both the classical and quantum theories somehow, since

    \[ \mbox{Foaminess}\leftrightarrow \mbox{Quantum fluctuation of geometry} \leftrightarrow \mbox{Spacetime Foam}\]

Even spacetime topology could change at quantum level! But how large are spacetime fluctuations? How foamy-like is the quantum spacetime?

A simple argument based on known theories provides interesting answers. On general grounds, we expect a certain uncertainty \delta l in the measurement of any distance l (we could call this the principle of the microscope: we alter spacetime when we observe it):

    \[ \boxed{\delta l\geq l^{1-\alpha}L_p^{\alpha}}\]

On average, we expect the above bound, some kind of “generalized uncertainty principle” (GUP). There, L_p\sim 10^{-35}m is the Planck length, defined by

    \[ L_p^2=\dfrac{G_N\hbar}{c^3}\]

and \alpha is some parameter related to the theory of quantum gravity we consider. Models of quantum gravity provide different values of \alpha; typically \alpha\sim 1, for which \delta l\geq L_p. However, \alpha>>1 or \alpha<<1 could be possible as well! That is, we can build theories with l_{QG}=l_c\neq L_P, where l_c=\hbar/mc is a Compton-like wavelength. Therefore, the scale where quantum gravity appears could generally satisfy L_p<<l_{QG}\lesssim 10^{-15}m, and it could even be close to the electroweak/quark scale. At least from these simple arguments it could. Of course, there are many other arguments that make such a claim more unlikely, but I want to be open-minded at this point.
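As a quick numerical check of these scales (a minimal sketch; the proton mass below is just an illustrative probe mass, not something fixed by the argument), we can evaluate the Planck length and a Compton-like wavelength \hbar/mc in Python:

    import math

    # Physical constants (SI units, approximate values)
    hbar = 1.054571817e-34   # reduced Planck constant [J s]
    G    = 6.67430e-11       # Newton constant [m^3 kg^-1 s^-2]
    c    = 2.99792458e8      # speed of light [m/s]

    # Planck length: L_p = sqrt(hbar*G/c^3)
    L_p = math.sqrt(hbar * G / c**3)
    print(f"Planck length L_p ~ {L_p:.2e} m")   # ~1.6e-35 m

    # Compton-like wavelength l_c = hbar/(m*c) for an illustrative probe mass (the proton)
    m_proton = 1.67262192e-27                   # proton mass [kg]
    l_c = hbar / (m_proton * c)
    print(f"l_c (proton) ~ {l_c:.2e} m")        # ~2e-16 m, near the quark scale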

Therefore, one of the most general GUPs (Generalized Uncertainty Principles) in Quantum Gravity provides the general bound

    \[ \boxed{\delta L\geq l_c\left(\dfrac{L_p}{l_c}\right)^{\alpha}=l_c^{1-\alpha}L_p^\alpha}\]

Two accepted theories provide similar bounds.

1) Quantum Mechanics. The Margolus-Levitin theorem states an upper bound for any computational rate in terms of the available energy as follows

    \[ \mbox{Computing rate}=\Gamma\leq \dfrac{E}{\hbar}\]

It is a very well understood bound in the world of Quantum Computing.

2) General Relativity. In order to prevent black hole formation, we must impose R\geq R_S, i.e., the size of any system should be greater than the Schwarzschild radius or black holes (BH) arise!

    \[ R\geq R_S\longrightarrow M\leq\dfrac{Rc^2}{2G_N}\]

since

    \[ R_S=\dfrac{2G_NM}{c^2}\]

from general relativity.
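As a quick sanity check of both bounds above (an illustrative sketch: I plug in 1 kg of mass-energy for the Margolus-Levitin rate and the solar mass for the Schwarzschild radius; both inputs are arbitrary choices):

    hbar = 1.054571817e-34   # [J s]
    G    = 6.67430e-11       # [m^3 kg^-1 s^-2]
    c    = 2.99792458e8      # [m/s]

    # 1) Margolus-Levitin-type bound: computing rate <= E/hbar
    E = 1.0 * c**2                         # energy content of 1 kg of mass-energy [J]
    rate_max = E / hbar
    print(f"Max computing rate for 1 kg: ~{rate_max:.1e} ops/s")      # ~8.5e50 ops/s

    # 2) GR bound: the system must be larger than its Schwarzschild radius
    M_sun = 1.989e30                       # solar mass [kg]
    R_s = 2 * G * M_sun / c**2
    print(f"Schwarzschild radius of the Sun: ~{R_s / 1000:.1f} km")   # ~3 km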

Therefore, we will see that quantum fluctuations of spacetime and spacetime topology are inevitable from almost any Quantum Gravity approach. In fact, there are some recent speculations about holographic foam cosmology and MONDian Dark Matter that I will explain to you as well.

In the rest of this post, I will generally use natural units c=\hbar=k_B=1, and several ideas will emerge:

1) The key role of the critical energy density of the Universe.

2) The seemingly inevitable existence of Dark Matter (DM) and Dark Energy (DE).

3) Non-locality and dark energy in the framework of “infinite statistics”.

4) MOND/Cold Dark Matter duality.

Suppose that \delta L is the accuracy with which we can measure any length L.

1st Thought (Gedanken) experiment

Suppose there is some mass M with size/diameter equal to D, and we use it to “probe” distances L>>L_p. Therefore, from the previous arguments, we get:

i) Quantum Mechanics.

    \[ \delta L\underbrace{\left(\dfrac{2L}{c}\right)}_{\mbox{time}}=\delta L+\dfrac{2L}{c}\left(\dfrac{1}{M}\right)\dfrac{\hbar}{2\delta L}\]

since the position uncertainty of the probe spreads quantum mechanically during the light round-trip time 2L/c. Thus, neglecting the first term on the right-hand side, we get

    \[ \boxed{(\delta L)^2\geq \dfrac{\hbar L}{Mc}}\]

ii) General Relativity.

From

    \[ \delta L\geq D\]

and the no-black-hole condition D\geq R_S, we obtain (up to a factor of 2)

    \[ D\geq \dfrac{G_NM}{c^2}\]

and therefore

    \[ \boxed{\delta L\geq \dfrac{G_NM}{c^2}}\]

Now, we combine i) and ii) by multiplying the two results above, which gives

    \[ (\delta L)^3\geq \dfrac{\hbar L}{Mc}\dfrac{G_NM}{c^2}\]

Therefore, we get the nice result

    \[ \boxed{\delta L\geq \sqrt[3]{LL_p^2}}\]

from the QM+GR combination!
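To get a feeling for how tiny these fluctuations are, here is a minimal sketch evaluating \delta L\sim (LL_p^2)^{1/3} for two illustrative distances, a laboratory scale of 1 m and a Hubble-scale distance of \sim 10^{26}m (both values are just assumptions for illustration):

    L_p = 1.616e-35   # Planck length [m]

    def delta_L_holographic(L):
        """Holographic bound on the length fluctuation: (L * L_p^2)^(1/3)."""
        return (L * L_p**2) ** (1.0 / 3.0)

    for L in (1.0, 1e26):   # 1 m and an assumed ~Hubble radius, for illustration
        print(f"L = {L:.1e} m  ->  delta_L >= {delta_L_holographic(L):.1e} m")
    # ~6e-24 m for L = 1 m, and only ~3e-15 m even for L ~ 1e26 m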

 2nd Thought (Gedanken) experiment

The Universe has 3 dimensions of space and one single time dimension. It fits observations. The holographic idea states that all the information in the spacetime can be encoded on a 2d surface. The maximum information in any region of spacetime is bounded by the area of that region. That is the holographic principle in the formulation of G. ’t Hooft, Susskind, Bousso, and others. The number of degrees of freedom (DOF) is bounded according to the expression

    \[ N\leq \dfrac{A}{L_p^2}\]

The origin of this holographic principle rests in the physics of BH. The BH entropy is proportional to the horizon area, not to the volume as we would expect from a naive Quantum Field Theory approach. That is, S_{BH}\sim A. More precisely,

    \[ \boxed{S_{BH}=\dfrac{1}{4}A=\dfrac{k_Bc^3}{4G_N\hbar} A}\]

Let me point out that I consider this last equation as one of the most beautiful equations of current theoretical physics.
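As a concrete illustration of this area law (a sketch using the solar mass as an arbitrary example), the dimensionless entropy S_{BH}/k_B=A/4L_p^2 of a solar-mass black hole is enormous:

    import math

    G, c, hbar = 6.67430e-11, 2.99792458e8, 1.054571817e-34
    L_p2 = hbar * G / c**3                 # Planck length squared [m^2]

    M   = 1.989e30                         # solar mass [kg], an arbitrary illustrative example
    R_s = 2 * G * M / c**2                 # Schwarzschild radius [m]
    A   = 4 * math.pi * R_s**2             # horizon area [m^2]

    S_over_kB = A / (4 * L_p2)             # Bekenstein-Hawking entropy in units of k_B
    print(f"S_BH/k_B ~ {S_over_kB:.1e}")   # ~1e77 for a solar-mass black hole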

Now, divide a given volume V=L^3 into small cubes of size \delta L. Assume that there is 1 DOF per small cube. Then,

    \[ N_{DOF}(L^3)=N_{minicubes}=\left(\dfrac{L}{\delta L}\right)^3\leq \dfrac{L^2}{L_p^2}\]

Then, the holographic principle provides

    \[ \boxed{\delta L\geq L^{1/3}L_p^{2/3}}\]

Therefore, the holographic principle provides this holographic GUP!

3rd Thought (Gedanken) experiment

Take any GPS and a good clock. How accurately can these clocks map out a spacetime region of radius L over a given time T=L/c? Note that the clock has a mass M.

The answer is quite straightforward. A spacetime “volume” has dimensions L^4/c. Using the Margolus-Levitin theorem,

    \[ \dfrac{\mbox{Number of operations}}{\mbox{Time}}=\mbox{Operation rate}\leq \dfrac{E}{\hbar}\]

Thus, the total number of operations n satisfies

    \[ n\leq \dfrac{E}{\hbar}(\mbox{Time})=\dfrac{E}{\hbar}\dfrac{L}{c}=\dfrac{Mc^2}{\hbar}\dfrac{L}{c}=\dfrac{McL}{\hbar}\]

To prevent BH formation, M\leq Lc^2/G_N, and then

    \[ n=N(\mbox{spacetime cells})\leq \dfrac{L^2c^3}{\hbar G_N}=\dfrac{L^2}{L_p^2}\]

The requirement of maximal spatial resolution implies that the clocks tick only ONCE, and every cell occupies a volume

    \[ V\sim \dfrac{L^3}{L^2/L_p^2}=LL_p^2\]

Averaging the spatial separation by taking the cube root of this cell volume, we get

    \[ \bar{L}=\sqrt[3]{V}\sim L^{1/3}L_p^{2/3}\]

That is, we recover the holographic GUP \delta L\geq L^{1/3}L_p^{2/3}!

Remark: Maximal spatial resolution requires maximal density or packing

    \[ \rho \sim \dfrac{3}{8\pi}(LL_p)^{-2}\]

Maximal spatial resolution yields the holographic principle for bits, i.e.

    \[ N_{bits}\sim \left(\dfrac{L}{L_p}\right)^2\]

However, IF we spread the cells over both space and time, the temporal resolution should be similar to the spatial resolution, and we should average the separation over the whole spacetime (not only over space!). Then, the result would be

    \[ \left(\dfrac{L^4}{L^2/L_p^2}\right)^{1/4}=(LL_p)^{1/2}\]

and it gives the time separation of two successive ticks, that is, \delta t\sim (LL_P)^{1/2}. The interpretation of this alternative scenario is similar to that of a random walk model! I mean,

    \[ \delta L\geq L^{1/2}L_p^{1/2}\]

is the fluctuation expected in a random-walk model. That is, the time needed to communicate with the neighbouring (closest) cells is larger than the bound provided by the holographic arguments.
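Here is a small sketch comparing the three cases \alpha=1, \alpha=2/3 (holographic) and \alpha=1/2 (random walk) in the generic bound \delta L\geq L^{1-\alpha}L_p^{\alpha}, for an illustrative Hubble-scale distance L\sim 10^{26}m (an assumed value):

    L_p = 1.616e-35    # Planck length [m]
    L   = 1e26         # assumed distance (~Hubble radius) [m], for illustration only

    models = [(1.0, "alpha = 1"),
              (2.0 / 3.0, "holographic (alpha = 2/3)"),
              (0.5, "random walk (alpha = 1/2)")]

    for alpha, label in models:
        delta_L = L ** (1 - alpha) * L_p ** alpha    # generic bound: delta_L >= L^(1-alpha) * L_p^alpha
        print(f"{label:27s}: delta_L >= {delta_L:.1e} m")
    # alpha = 1   -> ~1.6e-35 m (the Planck length itself)
    # alpha = 2/3 -> ~3e-15 m
    # alpha = 1/2 -> ~4e-5 m, i.e. hugely larger than the holographic case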

Experimentally, we could compare these two scenarios by studying the (loss of) coherence of photons \gamma received from distant galaxies. The idea is as follows. Spacetime fluctuations could induce a loss of coherence in photon propagation! The fluctuation \delta L accumulated over a distance L translates into a phase fluctuation \Delta \phi of the wavefront, namely

    \[ \Delta \phi=\dfrac{2\pi \delta L}{\lambda}\sim 2\pi L^{1-\alpha}\dfrac{L^\alpha_p}{\lambda}\]

In fact, for the distant source PKS1413+135 the holographic model predicts

    \[ \Delta \phi\sim 10^{-9}(2\pi)<<2\pi\]

well within the coherence observed for this source. It shows that the holographic idea works and, in fact, that our above random-walk model is ruled out: the random-walk prediction is \Delta \phi \sim 10(2\pi), which is not small compared with 2\pi at all.

Indeed, it is expected that the Very Large Telescope Interferometer will reach a resolution \Delta \phi\sim (5-10)\cdot 10^{-9}, and, in principle, we could then be able to confirm or rule out the holographic idea! Therefore, quantum gravity and the spacetime foam are testable!
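A minimal sketch of the phase-fluctuation estimate, assuming a distance to PKS1413+135 of roughly 10^{26}m and an observing wavelength of about 1.6 microns (both are illustrative values I picked to reproduce the orders of magnitude quoted above):

    import math

    L_p = 1.616e-35    # Planck length [m]
    L   = 1e26         # assumed distance to PKS1413+135 [m] (illustrative)
    lam = 1.6e-6       # assumed observing wavelength [m] (illustrative)

    def delta_phi(alpha):
        """Cumulative phase fluctuation: 2*pi * L^(1-alpha) * L_p^alpha / lambda."""
        return 2 * math.pi * L ** (1 - alpha) * L_p ** alpha / lam

    print(f"holographic (alpha=2/3): delta_phi ~ {delta_phi(2.0/3.0) / (2*math.pi):.1e} x 2*pi")
    print(f"random walk (alpha=1/2): delta_phi ~ {delta_phi(0.5) / (2*math.pi):.1e} x 2*pi")
    # ~2e-9 x 2*pi for the holographic model vs ~2.5e1 x 2*pi for the random-walk model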

 Spacetime foam and Cosmology

The Universe can be thought of as some kind of computer. It performs operations, and its software is given by the known physical laws. In fact, from basic Statistical Mechanics:

    \[ \mbox{Energy Density}\sim T^4\]

    \[ \mbox{Entropy Density}\sim T^3\]

Therefore, the number of bits should be proportional to

    \[ N_{bits}\sim (n_{op})^{3/4}\leq 10^{92}\]

If we know the amount of information in a given volume and the Hubble radius, we can calculate the distance between the closest cells/bits to be \sim 10^{-3}cm. It implies that it takes about 10^{-14}s to communicate between those cells.

If we know the energy density and the Hubble radius, we can use the Margolus-Levitin theorem to estimate the computational rate; it gives \Gamma\leq 10^{106}/s.

If we know the information in a given volume (or surface) and the operation rate, it follows that a single bit flips once every 10^{-14}s according to the above arguments!
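A small sketch of this counting, assuming N_{bits}\sim 10^{92} bits of ordinary matter spread over a Hubble volume of radius \sim 10^{26}m (illustrative round numbers):

    c      = 2.99792458e8   # speed of light [m/s]
    R_H    = 1e26           # assumed Hubble radius [m] (illustrative)
    N_bits = 1e92           # assumed number of bits of ordinary matter (illustrative)

    # One bit per little cell of volume R_H^3 / N_bits; the cell size is its cube root
    separation = (R_H**3 / N_bits) ** (1.0 / 3.0)
    print(f"bit separation      ~ {separation * 100:.1e} cm")   # ~2e-3 cm
    print(f"light-crossing time ~ {separation / c:.1e} s")      # ~7e-14 s, of the order quoted above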

Remarks:

(1) Ordinary matter maps out spacetime with accuracy corresponding to a random walk model.

(2) A random-walk model is ruled out by current experiments, so spacetime can be mapped out more finely than ordinary matter allows, in line with the holographic ideas! This parallels what we know from Astronomy, Astrophysics and Cosmology: there exists unconventional (“dark”) matter/energy providing a better “microscopy” to map out the spacetime geometry and topology!

(3) The observed critical cosmic density \rho could support the holographic idea. This is an attractive idea that we must test with further experiments and observations.

The spacetime foam idea implies some kind of holographic foamy Cosmology. There are two main quantities, the Hubble constant H and the Hubble radius R_H. Moreover, we have the following interesting quantities

a) Critical energy density:

    \[ \rho_c=\dfrac{3}{8\pi}\left(\dfrac{H}{L_p}\right)^2\sim (R_HL_p)^{-2}\]

b) Bits of information in the whole Universe:

    \[ I\sim \left(\dfrac{R_H}{L_p}\right)^2\]

c) Average energy per bit:

    \[ \dfrac{\rho_c R^3_H}{I}\sim R_H^{-1}\]

d) Dark energy. It acts like some kind of (dynamical?) cosmological constant over very large scales:

    \[ \Lambda \sim 3H^2\]

Dark energy carries some class of long-wavelength particles (bits?) and a really tiny amount of energy per quantum, since

    \[ \dfrac{1}{\nu}\sim R_H:\;\; \mbox{inert!}\]

Moreover, the critical density behaves as

    \[ \rho\sim \dfrac{H^2}{G}\]

If we define the scale factor to be a(t), then \rho\sim H^2/G with H\propto a(t)^{-1} implies a\propto t and an effective equation of state P=-\rho/3, which, like a radiation-dominated fluid, is not enough to account for the recent/present acceleration of the cosmic expansion. Therefore, dark energy/the cosmological constant/vacuum energy seems to exist!!!! It has an equation of state like P_\Lambda \approx -\rho_\Lambda.
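A sketch of these holographic-cosmology numbers, assuming H\approx 70 km/s/Mpc and R_H\approx c/H (illustrative values; the energy per bit is estimated as \hbar c/R_H, up to numerical factors):

    import math

    hbar, G, c = 1.054571817e-34, 6.67430e-11, 2.99792458e8
    L_p = math.sqrt(hbar * G / c**3)            # Planck length [m]

    H   = 70e3 / 3.086e22                       # Hubble constant [1/s], assuming ~70 km/s/Mpc
    R_H = c / H                                 # Hubble radius [m]

    rho_c  = 3 * H**2 / (8 * math.pi * G)       # critical density [kg/m^3]
    I_bits = (R_H / L_p) ** 2                   # holographic bits in the observable Universe
    E_bit  = hbar * c / R_H                     # average energy per bit ~ hbar*c/R_H [J]

    print(f"rho_c ~ {rho_c:.1e} kg/m^3")        # ~9e-27 kg/m^3
    print(f"I     ~ {I_bits:.1e} bits")         # ~7e121, i.e. of order 10^122
    print(f"E/bit ~ {E_bit:.1e} J")             # ~2e-52 J: extremely soft, 'inert' quanta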

The point with Dark Energy is the following. Suppose that there are some “dark” interactions between dark energy and dark matter (DE/DM). It could give transitions between accelerated and decelerated phases of the cosmic expansion! In fact, from Cosmology, the Universe has gone through three stages after the inflationary phase:

1st. Radiation dominated stage: \rho \propto a^{-4}

2nd. Matter dominated stage: \rho\sim a^{-3}

3rd. (Current era) Lambda/Dark energy dominated stage: \rho \sim a^{-2}

 See you in my next spacetime foam post!
