LOG#218. Atomic elephants.

Have you ever asked yourself why there are no elephants or humans of atomic size? Have you ever asked why the Earth is so big in comparison with atoms? Why are nuclei about 10^5 times smaller than atoms? I will be discussing these questions in this short blog post…

1st. Why R_{Earth}>>R_{atom}?

Answer: It is related to the relative strength of gravity and electromagnetism, due to the Gauss law! In 3d space (4d spacetime), both the Coulomb and the Newton force laws are inversely proportional to the square of the distance (spherical shells!). Then:

    \[\dfrac{F_E}{F_{a}}=\dfrac{R_E^2}{R_a^2}\]

and from this you get

    \[\dfrac{R_E}{R_a}=\sqrt{\dfrac{K_Ce^2}{G_NM_p^2}}=\left(\dfrac{e}{M_p}\right)\sqrt{\dfrac{K_C}{G_N}}\sim 10^{18}\]
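As a quick sanity check, plugging in rough SI values (e\approx 1.6\times 10^{-19}\,C, M_p\approx 1.7\times 10^{-27}\,kg, K_C\approx 9\times 10^{9}\,N\,m^2/C^2, G_N\approx 6.7\times 10^{-11}\,N\,m^2/kg^2):

    \[\left(\dfrac{e}{M_p}\right)\sqrt{\dfrac{K_C}{G_N}}\approx\left(9.4\times 10^{7}\right)\left(1.2\times 10^{10}\right)\approx 10^{18}\]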

The reason is that the ratio of the elementary charge to the proton mass, times the square root of the ratio of the Coulomb constant to the Newton constant, is huge. The Earth is big because of the values of the proton mass and the electron charge, and because of the ratio of the Coulomb constant to the Newton constant. Thus, you cannot find atomic planets or atoms with the size of the Earth… Don't go crazy and ask to change the values of those constants…

2nd. Why R_{Elephant}>>R_{atom}?

Answer: Similarly, you can equate the gravitational energy density of an elephant with the electric energy density of an atom:

    \[\dfrac{E(g)}{V_{El}}=\dfrac{E(atom)}{V_{atom}}\]

Then, you get

    \[\dfrac{R_{El}}{R_{atom}}=\left(\dfrac{K_Ce^2}{G_NM_p^2}\right)^{1/3}\sim 10^{12}\]

Therefore, elephants or humans are constrained to a few meters…Great!
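Numerically, K_Ce^2/(G_NM_p^2)\approx 1.2\times 10^{36} with the constants above, and taking an atomic size of about 10^{-10}\,m (a rough assumption for this estimate):

    \[\dfrac{R_{El}}{R_{atom}}\approx\left(1.2\times 10^{36}\right)^{1/3}\approx 10^{12}\qquad\Longrightarrow\qquad R_{El}\sim 10^{12}\times 10^{-10}\,m\sim 10^{2}\,m\]

so meter-sized animals are comfortably allowed, while kilometer-sized ones are not.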

3rd. Why R_{atom}=10^{5}R_{nucleus}?

Answer: This one turns out to be the hardest to understand. The secret behind the answer lies in the Yukawa force and the exponential screening (short-range behaviour) of nuclear forces, which confines them to a few proton radii (and, of course, in the coupling constants). Take for instance the strong force case; then

    \[g_s^2\dfrac{\exp(-r/r_0)}{r^2}=\dfrac{K_Ce^2}{r^2}\]
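Here the dimensionless couplings are identified in the usual way (my convention for this rough estimate):

    \[\alpha_s=\dfrac{g_s^2}{\hbar c}\sim 1,\qquad \alpha=\dfrac{K_Ce^2}{\hbar c}\approx\dfrac{1}{137}\]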

then

    \[\dfrac{r_{nucleus}}{r_{atom}}=\sqrt{\dfrac{\alpha_s}{\alpha}}\exp(-r/2r_0)\]

Plugging in \alpha_s\sim 1, \alpha=1/137 and r/2r_0\sim 14 (the exponential has to supply about six orders of magnitude of suppression), you guess that the above ratio is

    \[\dfrac{r_{nucleus}}{r_{atom}}=10^{-5}\]
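Explicitly, the numbers behind that estimate are roughly:

    \[\sqrt{\dfrac{\alpha_s}{\alpha}}\,e^{-r/2r_0}\approx\sqrt{137}\,e^{-14}\approx\left(11.7\right)\left(8.3\times 10^{-7}\right)\approx 10^{-5}\]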

Fantastic! Allons-y!

Proton decay is expected naturally at some point between 10^{45} yrs and 10^{120} yrs from “standard assumptions” about virtual black holes or space-time foam. It is inevitable to get some dimension 5 operator for neutrino masses, in the symbolic form (LH)(LH)/M, at about 10^{10} or 10^{14} GeV (a rough estimate of the resulting neutrino mass is sketched below), and leptoquarks triggering proton decay at about 10^{16} GeV, even without the above quantum gravity general prediction. There are also interesting dimension 6 electric dipole operators for neutrons and other neutral particles at scales of about 30-100 TeV! The LHC is hardly sensitive to these couplings beyond indirect measurements, but future colliders at 100 or 1000 TeV (the Pevatron!) could test the flavor-changing processes of kaon-antikaon systems.

Much more subtle are the issues of the Higgs mass, the baryon and lepton number symmetries, and the sources of CPT violation we have already investigated during the past and the current century. It is a mess nobody fully understands yet. We have understood the kinematics and dynamics of spin 1 and spin 1/2 particles, and we have even shallowly explored spin zero after the 2012 Higgs discovery! Higher spin particles? Spin two is gravity, and it surely exists after the GW observations by LIGO and future observations. There are also spin 3/2 particles, expected from general supersymmetry arguments. Spin 5/2, spin 3 and higher interactions are much more complicated: except for some theories such as hypergravity, Vasiliev higher spin theory, and the higher spin states coming from string/superstring theory, they are really hard to analyze. In fact, string theory can be seen as an effective theory descending from a higher spin theory containing gravity! But the efforts towards a full understanding of higher spin theories are yet to come. We are far from a complete view of them.
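To get a feel for the seesaw scale quoted above, note that the dimension 5 operator generates neutrino masses of order m_\nu\sim v^2/M, with v\approx 246\,GeV the Higgs vacuum expectation value; taking, for instance, M\sim 10^{14}\,GeV:

    \[m_\nu\sim\dfrac{v^2}{M}\approx\dfrac{(246\,GeV)^2}{10^{14}\,GeV}\approx 6\times 10^{-10}\,GeV\approx 0.6\,eV\]

which is the right order of magnitude for the light neutrino mass scale.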

What about the LHC expectations? Colliders are machines whose performance is quantified by a quantity called luminosity. Luminosity is related to elementary cross sections from particle physics: the number of events per unit time is essentially luminosity times cross section…

    \[\dfrac{N_E}{Time}=\mathcal{L}\cdot \sigma\]

For the LHC, \mathcal{L}\sim 10^{34}cm^{-2}s^{-1}, and SM cross sections are measured in barns, 1\,barn=10^{-24}cm^2=10^{-28}m^2. The LHC works at 14 TeV, and 1\,TeV^{-1}\sim 10^{-19}m. Typical electromagnetic (or strong) interactions scale like \sigma_{em}=\alpha/(1\,TeV)^2, and thus the cross section is about \sigma\sim 10^{-36}cm^2=1\,pb, where pb denotes picobarn. Data taking is measured in terms of inverse barns (the integrated luminosity; multiplied by a cross section, it gives the number of events). Independently of whether you calculate amplitudes with Feynman graphs, the amplituhedron or any other gadget, these things are surely universal. Neutrino interactions are related to the Fermi constant, or to the M_W mass, and if you search for some universal principle behind scattering amplitudes, you will find out that, on general grounds, consistent spin one theories provide you with coupling sums of the type \sum g_j=0, and spin two theories with \sum g_jV^\mu_j=0. There are likely issues with loop corrections, but surely the universal laws of physics conspire to be coherent in the end, even with loop corrections. How? That is a puzzle.
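As an illustration of these orders of magnitude, a 1 pb process at the quoted LHC luminosity yields roughly:

    \[\dfrac{N_E}{Time}\approx\left(10^{34}\,cm^{-2}s^{-1}\right)\left(10^{-36}\,cm^{2}\right)=10^{-2}\ \text{events}/s\]

Over a year of running (roughly 10^{7}\,s of effective beam time, i.e. an integrated luminosity of about 10^{41}\,cm^{-2}\approx 100\,fb^{-1}), that is of order 10^{5} events.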

Finally, let me show you how to calculate the time a photon needs to escape from the Sun. The Sun is a dense medium, so the photon interacts with atoms and dense matter in the core before popping out from the exterior shells and arriving at Earth.

Data: the solar density at the core is \rho_c=150\times 10^3\,kg/m^3. The opacity is \kappa\approx 0.2\,m^2/kg.

The mean free path is L=1/(\kappa\rho_c)\approx 3\times 10^{-5}m. The solar radius is about R_\odot=7\times 10^{8}m. A random walk of the photon in N steps covers a net distance d=L\sqrt{N}, so in order to reach d=R_\odot the photon needs N=(R_\odot/L)^2 steps (a number evaluated below). The total distance traveled by the photon is D=NL, so D=R_\odot^2/L. The time it takes the photon to travel those D meters is obtained dividing by the speed of light, so

    \[t_\gamma=\dfrac{R_\odot^2}{Lc}\]

Plugging numbers

    \[t_\gamma=\dfrac{49\cdot 10^{16}}{3\cdot 10^{-5}\cdot 3\cdot 10^{8}}\sim 5\cdot 10^{13}s \sim 1\cdot 10^{6}yrs\]
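For the record, the number of random-walk steps implied by these numbers is huge:

    \[N=\left(\dfrac{R_\odot}{L}\right)^2\approx\left(\dfrac{7\times 10^{8}}{3\times 10^{-5}}\right)^2\approx 5\times 10^{26}\ \text{steps}\]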

So, a photon escapes from the Sun's interior to its surface in about 1 Myr. If you enlarge the opacity, you get a larger number, and if you let the mean free path grow, decreasing the opacity (matter transparency!) due to density effects, you can get escape times of about 10^3-10^5 yrs. Note that if the photon did not interact at all, it would escape from the Sun in a few seconds, like a neutrino!
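Indeed, the free-streaming crossing time would just be:

    \[t_{free}=\dfrac{R_\odot}{c}\approx\dfrac{7\times 10^{8}\,m}{3\times 10^{8}\,m/s}\approx 2\,s\]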

May the atoms be with you!
