LOG#212. Understanding gravity (II).

Hi, there!

In the previous blog post, we introduced the high-energy stringy part of the 4-point (particle!) tree-level amplitude for graviton scattering in type II superstrings, \mathcal{A}_{stu}:

(1)   \begin{equation*}\mathcal{A}=G_N\dfrac{<12>^4<34>^4}{stu}\dfrac{\Gamma(1-\frac{s}{M^2})\Gamma(1-\frac{t}{M^2})\Gamma(1-\frac{u}{M^2})}{\Gamma(1+\frac{s}{M^2})\Gamma(1+\frac{t}{M^2})\Gamma(1+\frac{u}{M^2})}\end{equation*}
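
As a quick numerical illustration (a toy sketch of mine, not anything official: I only evaluate the ratio of Gamma functions, dropping the G_N<12>^4<34>^4/(stu) prefactor, and I measure s, t, u in units of M^2 for massless kinematics with s+t+u=0), you can see how this stringy form factor approaches 1 at low energies, where pure supergravity is recovered:

    # Numerical look at the stringy Gamma-function factor of the 4-graviton amplitude.
    # Illustrative assumptions: massless external states (s + t + u = 0), Mandelstam
    # variables measured in units of M^2, and NO kinematic prefactor included.
    from math import gamma

    def stringy_factor(s, t):
        """Gamma(1-s)Gamma(1-t)Gamma(1-u) / [Gamma(1+s)Gamma(1+t)Gamma(1+u)], u = -s - t."""
        u = -s - t
        num = gamma(1.0 - s) * gamma(1.0 - t) * gamma(1.0 - u)
        den = gamma(1.0 + s) * gamma(1.0 + t) * gamma(1.0 + u)
        return num / den

    # Far below the string scale (|s|, |t| << 1) the factor tends to 1, so the
    # field-theory (supergravity) amplitude is recovered in that regime.
    for s, t in [(0.01, -0.005), (0.1, -0.04), (0.5, -0.2)]:
        print(f"s={s:5.2f}, t={t:6.3f} -> stringy factor = {stringy_factor(s, t):.6f}")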

This “simple” formula gives no negative probabilities, thanks to the positivity features it enjoys. Why is quantum physics about amplitudes and probabilities? Well, that is a good question. Perhaps it is a bias from the history of Science, and the fact that Bayesian probabilistic methods are not as popular as deterministic and frequentist approaches. The battle between Bayesianism and frequentism is still seen in statistics and experimental science, but they are somehow complementary views… even though Bayesianism is far superior in many (not all!) cases.

Focusing on the subject above, classical physics is a deterministic theory only when formulated in deterministic terms. I mean, Newtonian mechanics and classical mechanics formulated in terms of “forces” under the general principle F=Ma is a deterministic approach, because it states the following: in order to know the position (in the future or the past) of any particle in the Universe, you only have to know its mass and the forces acting on it. Period. The rest is a calculus problem. You can do it by hand or with the aid of computers for bigger and more complex systems (fluids and non-linear systems are in this category). Of course, the loophole is the knowledge of the initial or boundary conditions: you cannot know the exact position of a particle without initial conditions.

However, classical physics admits a formulation that is not manifestly deterministic, developed during the 18th and 19th centuries. It is based on energy (kinetic, potential and others) and the action principle. Any classical theory, under very mild assumptions, can be formulated in terms of a minimal (more precisely, critical) action principle with a certain Hamiltonian/Lagrangian function, so the requirement \delta S=0 is seen as more fundamental. Why is the principle behind S=\int_\gamma L or S=\int_\Gamma H not manifestly deterministic? It is related to the fact that the configurations of the particles are not fixed, but vary over configuration space or phase space. Indeed, this formulation allows us (Feynman was the creator of this approach, following a suggestion by Dirac) to build up and formulate quantum physics in a way much more parallel to classical mechanics. Thus, the Schrödinger path to quantum mechanics (QM), based on the solutions of H\vert\Psi\rangle=E\vert\Psi\rangle, turns into a more Lagrangian or action (functional) formulation with the aid of the action principle. The point is that, in order to understand QM from this viewpoint, you must accept that particles or fields follow every possible configuration or path through space-time. The amplitude over the infinitely many paths is usually written as a sum; really, it is a formal integral over a certain infinite-dimensional space (infinite-dimensional forms are such a thing that no mathematician would try to visualize here! Sorry, my dear mathematicians, infinite-dimensional forms are “real”. Just joking, I am sure you know those things really exist, don’t you?). The amplitude reads

    \[\mathcal{A}_Q=\sum_{\text{paths}} c_\gamma e^{iS/\hbar}\]

or

    \[\mathcal{A}_Q=\int\left[DX\right]  e^{iS/\hbar}\]
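
To make this “sum over paths” a little more tangible, here is a toy numerical sketch of mine (nothing canonical: a free particle, units with \hbar=m=1, a coarse time lattice and a crude grid of intermediate positions, no normalization or measure factor), where the amplitude is literally a finite sum of e^{iS/\hbar} over all piecewise-linear paths between two fixed endpoints:

    # Toy discretized path integral: amplitude K(x0 -> xN) ~ sum over paths of exp(i S / hbar),
    # with S the free-particle action evaluated on each piecewise-linear path.
    # Everything (units hbar = m = 1, lattice and grid sizes) is an arbitrary illustrative choice.
    import itertools
    import numpy as np

    hbar, m = 1.0, 1.0
    x0, xN = 0.0, 1.0                    # fixed endpoints
    n_slices = 4                         # number of time steps
    dt = 0.2                             # time step
    grid = np.linspace(-2.0, 2.0, 15)    # allowed intermediate positions

    def action(path):
        """Discretized free-particle action S = sum over steps of (m/2) * (dx/dt)^2 * dt."""
        dx = np.diff(np.asarray(path))
        return np.sum(0.5 * m * (dx / dt) ** 2 * dt)

    amplitude = 0.0 + 0.0j
    # Sum over every choice of the n_slices - 1 intermediate positions.
    for interior in itertools.product(grid, repeat=n_slices - 1):
        path = (x0,) + interior + (xN,)
        amplitude += np.exp(1j * action(path) / hbar)

    print("Toy amplitude (unnormalized):", amplitude)

Refining the lattice and grid, and supplying the proper measure factor, is what would turn this crude sum into the actual path integral.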

Infinite-dimensional matrices and spaces are “normal” in quantum physics, not so much in classical physics (unless you live with fields instead of particles). Nima’s work in the last years has been devoted to finding a new QFT reformulation where locality and unitarity emerge, as some kind of deformation, from a new mathematical structure, and where new objects arise. The real reason for this need is also related to Feynman graphs. They are complicated to handle in complex theories. Physical theories require renormalizability and finiteness of the results. Regularization of sums and integrals is complicated. When considering gravity, the only truly universal force on every scale, things become nasty at higher-order loop corrections. You cannot calculate every Feynman graph correction by hand. Computers become necessary… even if some new mathematics arises.

The amplituhedron and related objects are not really new. They are variations on projective geometry. Anyway, it stuns everyone to learn that amplituhedra allow you to simplify some calculations that Feynman graphs would make harder. Perhaps this is a motivation of every physicist too: to simplify things. Locality is just a condition that can be seen as a consequence of a constraint on momenta, (\sum_i p_i)^2=0. Unitarity, beyond the Hamiltonian evolution as a phase factor U=\exp(iHt/\hbar), can also be understood as a factorization property of Feynman graphs. Amplituhedra have relatives, named with even cooler names: associahedra, EFT-hedra, positroids, matroids, … But they are almost all in the same category: they involve some type of combinatorial (polytopal) geometry (now, have you recalled that loop quantum gravity and spin networks have similar structures too? Please, don’t hate competitors… make them yours!).

Matrices are usually generalized into tensors or multiforms. However, there are other possible generalizations of matrices based on the notion of linear independence. This drives you to the idea of matroids. Positroids are a certain type of matroid with “positive” properties (of course, no one likes negativity… true?). Lam polytopes, or lamtopes, and positroidrons (no, they are not drones, sorry) are objects currently being studied by some eccentric mathematicians (however, the sample of mathematicians studying these things is small; few people like strange things, and no, I am not referring to the Stranger Things Netflix TV series). Well, how to live with these “new” things? Where did the amplituhedra emerge first? The answer is planar super-Yang-Mills (SYM) theories. Twistors (Penrose was so great creating them!) are mathematical objects encoding spacetime kinematics in SYM (also in superstrings, but that is another story) via:

    \[\overline{\lambda}_a=\dfrac{<a-1\;\; a>\mu_{a+1}+<a+1\;\; a-1>\mu_{a}+<a\;\; a+1>\mu_{a-1}}{<a-1\;\; a><a\;\; a+1>}\]

You can build up a vector

    \[Z_a=\begin{pmatrix}\lambda_a\\ \mu_a\\ y_a\end{pmatrix}\]

where Z_a is equivalent to t_aZ_a. This equivalence relation defines momentum twistor physics!
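
Here is a minimal numerical sketch of mine of what momentum twistors buy you (the data below is random and purely illustrative): pick any cyclically ordered set of twistors Z_a=(\lambda_a,\mu_a), reconstruct \overline{\lambda}_a from the formula above, and momentum conservation \sum_a\lambda_a\overline{\lambda}_a=0 comes out automatically, courtesy of the Schouten identity.

    # Momentum twistors, numerically: for ANY cyclic set of twistors Z_a = (lambda_a, mu_a),
    # the lambda-bar reconstructed from the formula in the text gives null momenta
    # p_a = lambda_a (outer product) lambdabar_a that automatically sum to zero.
    # The random data below is purely illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 6                                  # number of particles (arbitrary choice)
    lam = rng.normal(size=(n, 2))          # holomorphic spinors lambda_a
    mu  = rng.normal(size=(n, 2))          # twistor components mu_a (arbitrary!)

    def ang(a, b):
        """Angle bracket <a b> = lambda_a^1 lambda_b^2 - lambda_a^2 lambda_b^1."""
        return lam[a, 0] * lam[b, 1] - lam[a, 1] * lam[b, 0]

    lambar = np.zeros((n, 2))
    for a in range(n):
        am, ap = (a - 1) % n, (a + 1) % n   # cyclic neighbours a-1, a+1
        lambar[a] = (ang(am, a) * mu[ap] + ang(ap, am) * mu[a] + ang(a, ap) * mu[am]) \
                    / (ang(am, a) * ang(a, ap))

    # Momenta as 2x2 bispinors p_a = lambda_a x lambdabar_a; each is rank 1 (null),
    # and their sum vanishes identically thanks to the Schouten identity.
    momenta = np.einsum('ai,aj->aij', lam, lambar)
    print("sum of momenta:\n", momenta.sum(axis=0))   # ~ zero matrix up to rounding

In other words, unconstrained twistor data automatically describes a closed polygon of null momenta; that is part of why these variables are so convenient in planar SYM.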

In the previous blog post, I remarked that the amplituhedron is related to the volume of a certain infinite-dimensional complex amplitude. Is there a more concrete definition at hand? Well, verbose mode on… The formal definition of the amplituhedron is built on the positive Grassmannian G^+(k,n), the space of k-planes in n dimensions. The hope is that similar objects (or generalizations) can be found to compress hundreds of pages of Feynman graphs into single pages. Computationally, given the fact that we are entering the AI and Big Data era, we need to keep things simple. The basic ingredient is, therefore, a certain k\times n matrix \mathcal{C}_{\alpha a} (a matroid-like object), defined modulo GL(k) transformations:

    \[\mathcal{C}_{\alpha a}=\begin{pmatrix}c_1 & c_2 & \cdots & c_n\end{pmatrix}\]

The positive Grassmannian G^+(k,n), the space of such k\times n matrices (modulo GL(k)) all of whose k\times k minors are positive, is, in essence, the object behind the amplituhedron! Can you explain it to your kids? Well… maybe. The amplituhedron can be seen as a generalization of a beautiful idea concerning High School triangles. Draw a triangle, mark its three vertices, and mark the barycenter (center of mass) inside the triangle. Define

    \[Y=c_1z_1+c_2z_2+c_3z_3\]

where z_i are the triangle vertices, and Y is the center of mass. The center of mass is “positive” in the sense that you can find coefficients c_i, all of them positive, giving you the center of mass! You can go further and generalize this to any convex polygon, whose center of mass reads

    \[Y=c_1z_1+\cdots+c_nz_n\]
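
Before going more abstract, here is a tiny toy script of mine illustrating this convex-combination picture (the pentagon and the random positive coefficients are arbitrary choices): a row of positive numbers c_i, i.e. a point of the “baby” positive Grassmannian G^+(1,n) (positivity of the 1\times 1 minors just means c_i>0), times the external data z_i always lands inside the convex polygon.

    # k = 1 toy version: the positive Grassmannian G+(1,n) is just a row of positive
    # numbers (c_1, ..., c_n), modulo overall scale. Contracting it with the external
    # data z_i (the polygon vertices) always produces a point inside the convex polygon.
    # The pentagon and the random coefficients below are arbitrary illustrative choices.
    import numpy as np

    rng = np.random.default_rng(1)

    # External data: vertices of a convex pentagon in the plane (counter-clockwise).
    Z = np.array([[np.cos(t), np.sin(t)] for t in np.linspace(0, 2 * np.pi, 6)[:-1]])
    n = len(Z)

    def inside_convex_polygon(point, vertices):
        """True if point lies inside the convex polygon (vertices listed counter-clockwise)."""
        for i in range(len(vertices)):
            a, b = vertices[i], vertices[(i + 1) % len(vertices)]
            edge, to_point = b - a, point - a
            if edge[0] * to_point[1] - edge[1] * to_point[0] < 0:   # wrong side of an edge
                return False
        return True

    for _ in range(5):
        c = rng.uniform(0.1, 1.0, size=n)       # a "positive" 1 x n matrix: all entries > 0
        Y = c @ Z / c.sum()                      # Y = c.Z, normalized back to the affine plane
        print(Y, inside_convex_polygon(Y, Z))    # always True for positive c

Replacing the single row by a positive k\times n matrix, and the planar vertices by (k+4)-dimensional external data, is, morally speaking, the step from this polygon to the amplituhedron.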

In matrix form, Y=c\cdot z, where the row of coefficients c_i defines a point of G^+(1,n), while the external data z_i assemble into a positive 3\times n matrix, a point of G^+(3,n). The 2D cells of these objects are “triangles”, but, going more general, you can build up G^+(k,n) spaces together with positive (k+4)\times n external data, tied up to (super)geometry in projective spaces! The amplituhedra are just objects transforming external particle data into polytopal shapes! This trick, brand new, allowed Nima and collaborators to write all-loop integrands and understand them in terms of certain “faces” (polygons/polytopes) of the amplituhedron. The simplest way to understand the amplituhedron is, perhaps, the following statement:

“Polytopal shape = (Positive Grassmannian)(External data)”, or

Generalized polytope amplitude=Amplituhedra times data

where the polytopal shape lives in a certain (not necessarily positive) Grassmannian, just as the external data does, but it turns out that the amplituhedron must have certain positivity properties to be compatible with consistent quantum mechanics. Thus, the amplituhedron generalizes classical triangles, polygons and polyhedra. It can be thought of as a certain apeirotope. It also highlights the need for new particles, since it arose in the context of SYM theories, in particular in the space of k-planes in k+4 dimensions. The volume of the amplituhedron IS the all-loop scattering amplitude. Certainly, it mimics the S-matrix ideas of the 1960s and 70s, but it seems to be something else. Time will tell whether this technique evolves. However, I believe no High School student will learn triangles from positive Grassmannians (yet), lol. The amplituhedron, like every generalized object, is an abstract entity hard to visualize. Let me add that momentum twistor spaces define the kinematics, but the Grassmannian positivity of the amplituhedron encodes the dynamics of the field theory! Furthermore, the shape of the amplituhedron gives you physics: locality and unitarity are related to positivity. Then, think of amplituhedra as certain generalizations of the positive Grassmannian G^+(k,n). Now, we can speculate:

1st. Beyond dualities, no equivalence between the existing gauge-redundant descriptions has been found. Maybe a new approach is necessary in order to understand the origin of dualities.

2nd. The amplituhedron discovery: it implies, maybe, that we can understand representations of unitarity and locality without any evolution in space-time. Spacetime (GR) and quantum mechanics could be an illusion cast by the shapes of higher-dimensional objects. Yes, Plato’s cave returns…

3rd. Gravity and string links. There is an as yet unexplained magic, even in planar SYM, related to the connection between gauge theories and gravity: identities between color orderings, helicities, … Similar strange combinatorics arise in the so-called BCFW formulae. Not just string theory but also QM itself may arise from combinatorial geometry, a heritage of spin networks and the twistor program. It hints at an ultimate description of reality in terms of purely combinatorial (discrete, finite!) objects, even if the objects are infinite-dimensional themselves. Remarkably, the moonshine conjecture and its generalizations seem to be a further source of links in this continuum-discrete relationship.

On the other hand, what kind of physics can we do with the amplituhedron? Projective (complex) geometry, mainly. It is related to positive Grassmannians, as we have seen, and extended to polytopes in certain, likely infinite, dimensions. The Grassmannians work like triangles or “faces” and their interiors. The amplituhedra are like the insides of polygons/polytopes. You can define certain invariant “forms” (differential expressions) related to the singularities (particles) on the boundaries of the amplituhedron. Twistor-like variables encode the symmetry of the Lorentz group in 4d (especially in SYM) but, in principle, could be exported to more general spaces (that is the real hope…). The invariants of the amplituhedron are indeed more fundamental, in the sense that they allow you to derive (under positivity) unitarity and locality. Locality + Unitarity = Positivity. That is the point behind amplituhedra. In fact, Nima has tried to go further with EFT-hedra.

Black holes are the end of reductionism, in the sense that they imply a breakdown of known physics. They also imply a link between IR/UV scales. Black holes, at least the known ones, are astrophysical entities, much bigger than the Planck length. Nevertheless, understanding black holes also means, in the end, understanding short (ultrashort) scales. High energies are ultimately connected to long distances via gravity. Gravity is inevitable. It is universal. The holographic principle states, however, that local field theories fail to capture the essentials of gravity. The maximal number of degrees of freedom in any theory containing gravitational stuff is bounded, but it does NOT scale with the volume, as one would naively expect from QFT. It scales with the AREA of the boundary region: N\sim \exp(A/4G_N) (a back-of-the-envelope estimate of this number follows below). This surprising fact constrains black hole physics and makes field theory descriptions of it difficult. Gravity is, generally, the weakest force; extreme conditions near black holes, near black hole singularities, change this.

During the last 40 years, even more if you take into account early Einstein trials, we have tried to go beyond GR. Einstein gravity signals that it is incomplete, in the sense that it predicts its own breakdown. String theory enters here as the greatest hope. Key words: supersymmetry and positivity have been well known to physicists since the 70s and 80s of the 20th century. Concrete examples of theories where the physics of spacetime and the QM rules are seen to be derived from something more fundamental, from combinatorial objects, are known, even if they are not abundant. Nima’s great discovery of the amplituhedron and its relatives in N=4 SYM theories points in this direction. In the cosmological set-up, cosmological polytopes and generalized associahedra could also be possible. Hidden positivity and the geometry of causality and unitarity let Nima speculate about other entities: EFT-hedra and CFT-hedra, from Effective Field Theories and Conformal Field Theories, respectively. However, these abstract entities are not yet fully understood, perhaps not even fully defined! Yet, related objects WERE indeed imagined before: Gelfand studied, in the 90s, Grassmannians and generalized determinants, which also arise from quantum black holes and quantum information in the form of hyperdeterminants appearing in black hole entropy formulae!
From polytopes, via X-hedra, to amplituhedra and beyond: behind all of them there seems to be a secret that makes up all of time and space… Motivic geometry and Grothendieck’s theory of motives are among the deepest aspects of the links between geometry and number theory that are also getting involved in physics these years.
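
To get a feeling for the scales in the holographic bound quoted above (S=A/4G_N in natural units, so the number of states goes like \exp(A/4G_N)), here is a quick back-of-the-envelope script; the solar-mass black hole is just an illustrative choice and the constants are rounded.

    # Back-of-the-envelope Bekenstein-Hawking entropy: S = k_B c^3 A / (4 G hbar),
    # i.e. one quarter of the horizon area in Planck units. SI constants are rounded;
    # a solar-mass black hole is used only as an illustrative example.
    import math

    G     = 6.674e-11     # m^3 kg^-1 s^-2
    c     = 2.998e8       # m/s
    hbar  = 1.055e-34     # J s
    M_sun = 1.989e30      # kg

    r_s = 2 * G * M_sun / c**2                 # Schwarzschild radius
    A   = 4 * math.pi * r_s**2                 # horizon area
    S_over_kB = c**3 * A / (4 * G * hbar)      # entropy in units of k_B

    print(f"Schwarzschild radius: {r_s:.2e} m")
    print(f"Horizon area:         {A:.2e} m^2")
    print(f"S / k_B ~ 10^{math.log10(S_over_kB):.0f}  (number of states ~ exp(S/k_B))")

Roughly 10^{77} units of entropy for a single solar-mass horizon: that is the kind of number behind the area-scaling statement above.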

In summary:

  • Two pillars of current physics, GR+QFT(SM/QM), run into trouble when we put both together.
  • Eventually, you will put together enough energy to create a black hole, and gravity then wins (usually it is the weakest force, but not anymore). Normal GR breaks down at that point, and we get messy things, like information loss or non-unitarity, in simple naive approaches.
  • QM says amplitudes are the main observables, but what the observables are is another story. I mean, what amplitudes are and how they relate to particles or fields is the real problem. Philosophically, you may hate to trade the deterministic view of your pathetic life and reality for a wonderful non-deterministic view of your possible or unlikely possibilities. Hint: true observables, according to holography, live on the boundary, not in the bulk, when gravity is included.
  • The action principle is still fundamental, but it could also be derived from more fundamental stuff.
  • Not every consistent QG (quantum gravity) theory seems to come from string theory (ST). QG hints that duality is somehow fundamental as a framework. The symmetry behind dualities, and the generalized quantum equivalence principle arising from dualities, is yet to emerge.
  • Specific HEP and Cosmology predictions are necessary for any new approach to fundamental physics. The amplituhedra and other combinatorial structures are, so far, mathematical constructs; we need concrete predictions going further. However, simplifying concepts and computations is also a good thing in the forthcoming AI (aided) era.
  • QFT redux: string theory has gauge+gravity implications. Till now, it is the only known consistent theory of this kind that, despite its success, lacks direct experimental support (does it? Look at the dark stuff, matter and energy). Maybe string theory is already here?
  • Gravity is the weakest force, except in the strong-gravity sector: black hole physics and the beginning/end of the Universe.
  • The fate of the Universe seems sealed by the current, almost de Sitter, phase of positively accelerated expansion of the Universe.
  • Black holes and, more generally speaking, horizon spacetimes emit particles. Global charges are violated by semiclassical black hole physics and black hole evaporation. Caveat: maybe the degrees of freedom are not available to every observer, or there are extra fields carrying the information away in the evaporation phase once the transplanckian regime is reached. However, no solution is yet at hand.
  • Firewalls imply that you cannot save all three of the following principles: 1) the equivalence principle, 2) locality, 3) unitarity. This amazing fact also implies that, in any theory with finite-range corrections to gravity (breaking the equivalence principle) and containing light extended objects, you are forced to give up long-range inflation models. For inflationary theories, this is a problem.
  • We do not know what the observables at the edge of the Universe are, since we live inside it…
  • SUSY and quantum dimensions. By quantum dimensions I mean fermionic degrees of freedom. Spin does exist, so fermionic dimensions exist, as fermion fields! So, maybe SUSY has already been discovered. We know electrons, muons, taus, … I know, you want a proof. There is indeed stuff with classical anticommuting features, like fermions: AB=-BA. High school teachers (still) show cross products to some of you… Aristarchus believed in the heliocentric model despite his critics arguing that, if it were true, stellar parallax should be observed. Indeed, the Greeks estimated the parallax effect, but the idea was dismissed since it would imply that the stars are very far away. Indeed they are. Maybe the problem with supersymmetry is supersymmetry itself.
  • Accidents. A neutron and a proton form a bound system (the deuteron), but two neutrons do not: this is a fine-tuning at the 1% level. The electroweak symmetry breaking scale seems to be fine-tuned. The Moon almost exactly eclipsing the full Sun seems to be a fine-tuning. The low quadrupole of the CMB power spectrum seems to be fine-tuned. The critical density is too close to the actual current dark energy density. The vacuum energy, if the cosmological constant equals it, must be cancelled with a fantastic fine-tuning of about 122 orders of magnitude (61 or so with supersymmetry); a quick numerical estimate follows right after this list.
  • The Multiverse. Some people think the multiverse is inevitable. No actual multiverse theory is available so far. We are limited to observations within our horizon. We cannot know what lies beyond the event horizon of black holes or beyond the almost-dS phase of our Universe. Edge states and their observables are unknown in Quantum Cosmology. Indeed, there is no accepted quantum cosmology theory right now. The analogy with black holes yields: no global picture, an unknown number of universes, an unknown degree of imprecision of the universe entropy, and unknown correlators of the fields at the edge or boundary of the Universe. The multiverse is about unknown unknowns!
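
As promised in the fine-tuning item above, here is a quick order-of-magnitude script (rough, illustrative numbers only) for the cosmological constant mismatch: compare a naive Planck-scale vacuum energy density with the observed dark-energy density.

    # The cosmological constant problem in one division: naive Planck-scale vacuum
    # energy density versus the observed dark-energy density. Rough numbers only,
    # with rho ~ E^4 in natural units and both scales quoted in eV.
    import math

    E_planck_eV = 1.22e28      # Planck energy ~ 1.22 x 10^19 GeV, expressed in eV
    E_lambda_eV = 2.3e-3       # observed dark-energy scale, a few meV, in eV

    rho_naive    = E_planck_eV ** 4     # naive vacuum energy density ~ (Planck scale)^4
    rho_observed = E_lambda_eV ** 4     # observed dark-energy density ~ (meV scale)^4

    # Roughly 120+ orders of magnitude of mismatch (the exact figure depends on conventions).
    print(f"mismatch ~ 10^{math.log10(rho_naive / rho_observed):.0f}")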

Epilogue:

  1. Why is there a macroscopic Universe? The answer is Quantum Mechanics AND Relativity. Whatever lies beyond them must reduce to them. Whatever the ultimate theory is, it surely implies QM + Gravity (GR) and SR.
  2. At long distances, the spins of elementary particles are 0, 1/2, 1, 3/2, 2, … Of these, only spin-3/2 particles are still missing, after the discovery of the Higgs boson (spin zero) and of gravitational waves (spin two).
  3. The central drama of physics: QM+GR are incompatible in extreme cases. Black holes and singularities are the real problem. The end of space-time could mean that QM, too, should be modified (not abandoned!) in cosmological or extreme situations.
  4. The cosmological constant is too small.
  5. The reason the Universe contains big things is the hierarchy “problem”. Gravity is weak, and thus the Earth or elephants are big, while atoms or subatomic particles are small. However, at some point we will probe tiny, heavy objects. There, QM+GR predictions fail. We need something else: Quantum Gravity.
  6. Associahedra vs. permutohedra vs. amplituhedra. There are also cyclohedra (Bott-Taubes polytopes). Permutohedra are n-dimensional generalizations of the hexagon, somehow n-dimensional honeycombs, and they have (n+1)! vertices. There are also the Tamari lattice, the Leech lattice and other cool discrete spaces in higher dimensions with strange shapes and properties. The associahedra are related to the different ways of bracketing a string of n letters x_1,\ldots,x_n (see the little counting script right after this list).
  7. The tension of Dp-branes is known to be:

    \[T_{Dp}=\dfrac{1}{g_s(2\pi)^pl_s^{p+1}}\]
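
And the little counting script promised in item 6 (my own illustration): the n-dimensional permutohedron has (n+1)! vertices, while the vertices of the associahedron count the complete bracketings of a string of n letters, i.e. the Catalan number C_{n-1}.

    # Vertex counts behind the polytopes of item 6: the n-dimensional permutohedron
    # has (n+1)! vertices, while the associahedron vertices count complete bracketings
    # of a string of n letters, i.e. the Catalan number C_{n-1}.
    from math import factorial, comb

    def catalan(k):
        """Catalan number C_k = (2k choose k) / (k + 1)."""
        return comb(2 * k, k) // (k + 1)

    for n in range(2, 8):
        permutohedron_vertices = factorial(n + 1)
        associahedron_vertices = catalan(n - 1)   # bracketings of x_1 ... x_n
        print(f"n={n}: permutohedron {permutohedron_vertices:6d}   associahedron {associahedron_vertices:4d}")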

See you in my next blog post! (A rough one on Quantum Cosmology!)
