Have you ever asked yourself why there are no elephants or humans of atomic size? Have you ever asked why the Earth is so big in comparison with atoms? Why are nuclei about 10000 times smaller than atoms? I will be discussing these questions in this short blog post…
1st. Why is the Earth so big compared with atoms?
Answer: It is related to the relative strength of gravity and electromagnetism, due to the Gauss law! In 3d space (4d spacetime) both the Coulomb and the Newton law are inversely proportional to the square of the distance (spherical shells!). Then:

$$\dfrac{F_C}{F_N}=\dfrac{K_C e^2/r^2}{G_N m_p m_e/r^2}=\dfrac{K_C e^2}{G_N m_p m_e}\approx 2\times 10^{39}$$

and from this (trading a factor $m_e/m_p$) you get

$$\dfrac{R_{planet}}{R_{atom}}\sim\left(\dfrac{K_C e^2}{G_N m_p^2}\right)^{1/2}\sim 10^{18}$$

The reason is that the ratio of the electron charge to the proton mass, times the square root of the ratio of the Coulomb to the Newton constant, is big. The Earth is big because of the smallness of the proton mass relative to the electron charge, and because of the huge ratio of the Coulomb to the Newton constant. Thus, you cannot find atomic-sized planets or atoms the size of the Earth… Do not go crazy and ask to change the values of those constants…
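These estimates can be checked with a few lines of Python (a back-of-the-envelope sketch; SI values rounded):

```python
# Back-of-the-envelope check of the estimates above (SI units, rounded values).
K_C = 8.99e9       # Coulomb constant (N m^2 / C^2)
G_N = 6.67e-11     # Newton constant (N m^2 / kg^2)
e   = 1.602e-19    # elementary charge (C)
m_p = 1.673e-27    # proton mass (kg)
m_e = 9.109e-31    # electron mass (kg)

# The 1/r^2 cancels in the ratio (Gauss law in 3d space):
force_ratio = (K_C * e**2) / (G_N * m_p * m_e)

# Trading a factor m_e/m_p and taking the square root gives the size ratio:
size_estimate = (K_C * e**2 / (G_N * m_p**2))**0.5

print(f"F_C/F_N ~ {force_ratio:.1e}")            # ~ 2e39
print(f"R_planet/R_atom ~ {size_estimate:.1e}")  # ~ 1e18
```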
2nd. Why are elephants and humans only a few meters in size?
Answer: Similarly, you can compute the ratio between the gravitational energy of an elephant and the electric energy of its atoms. For a body of $N$ atoms, with mass $M=N m_p$ and size $L\sim N^{1/3}a_0$:

$$\dfrac{E_{grav}}{E_{elec}}\sim\dfrac{G_N M^2/L}{N K_C e^2/a_0}\sim\dfrac{G_N m_p^2}{K_C e^2}N^{2/3}\sim 10^{-36}N^{2/3}$$

Then, for an elephant-sized body with $N\sim 10^{30}$ atoms, you get

$$\dfrac{E_{grav}}{E_{elec}}\sim 10^{-16}\ll 1$$
Therefore, electromagnetism rather than gravity shapes living matter, and elephants or humans are constrained to a few meters… Great!
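A quick sketch of this estimate in Python, where the 5-tonne elephant mass is an illustrative assumption:

```python
# Gravity vs. electromagnetism inside an elephant-sized body (rough sketch).
G_N, K_C = 6.67e-11, 8.99e9          # Newton and Coulomb constants (SI)
e, m_p, a_0 = 1.602e-19, 1.673e-27, 5.29e-11  # charge, proton mass, Bohr radius

M = 5e3                               # elephant mass in kg (assumed round number)
N = M / m_p                           # number of nucleons, ~ 3e30

# E_grav/E_elec ~ (G m_p^2 / K_C e^2) * N^(2/3), as in the text above
ratio = (G_N * m_p**2 / (K_C * e**2)) * N**(2/3)
print(f"E_grav / E_elec ~ {ratio:.0e}")   # gravity is utterly negligible
```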
3rd. Why are nuclei about 10000 times smaller than atoms?
Answer: This one turns out to be the hardest to understand. The secret behind this answer lies in the Yukawa force and the exponential screening (short-distance) behaviour of nuclear forces, which confines them to a few proton radii (and of course, in the coupling constants). Take for instance the strong force case; then the Yukawa potential

$$V(r)\sim g_s^2\,\dfrac{e^{-m_\pi r}}{r}$$

has a range set by the pion mass, $R_{nucleus}\sim\hbar/(m_\pi c)\approx 1.4\ \mbox{fm}$, while the atomic size is the Bohr radius $R_{atom}\sim a_0=\hbar/(\alpha m_e c)$. Plugging $m_\pi\approx 140\ \mbox{MeV}$, $m_e\approx 0.511\ \mbox{MeV}$, $\alpha\approx 1/137$, you guess that the above ratio is

$$\dfrac{R_{atom}}{R_{nucleus}}\sim\dfrac{m_\pi}{\alpha m_e}\approx 4\times 10^{4}$$
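The plug-in can be reproduced numerically (masses in MeV; lengths scale as $\hbar c$ over mass in natural units):

```python
# Atom-to-nucleus size ratio from the Yukawa range vs. the Bohr radius.
alpha = 1 / 137.036   # fine-structure constant
m_e   = 0.511         # electron mass (MeV)
m_pi  = 139.6         # pion mass (MeV), sets the Yukawa range exp(-m_pi r)/r

# R_atom ~ 1/(alpha m_e), R_nucleus ~ 1/m_pi  (natural units)
size_ratio = m_pi / (alpha * m_e)
print(f"R_atom / R_nucleus ~ {size_ratio:.0f}")  # a few times 10^4
```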
Proton decay is expected naturally at some point around $10^{45}$ yrs or beyond, from “standard assumptions” about virtual black holes or space-time foam. It is inevitable to get some dimension 5 operator for neutrino masses, in the symbolic form $(LH)(LH)/\Lambda$, at about $\Lambda\sim 10^{14}$ or $10^{15}$ GeV, and leptoquarks triggering proton decays at about $10^{15}-10^{16}$ GeV, even without the above quantum gravity general prediction. There are also interesting dimension 6 electric dipole operators for neutrons and other neutral particles at scales of about 30-100 TeV! The LHC is hardly sensitive to these couplings, beyond indirect measurements, but future colliders at 100 or 1000 TeV (the Pevatron!) could test the flavor-changing processes of kaon-antikaon systems.

Much more subtle are the issues of the Higgs mass, the baryon and lepton number symmetries, and the sources of CP (and possibly CPT) violation we have already investigated in the past and current centuries. It is a mess nobody fully understands yet. We have understood the kinematics and dynamics of spin 1 and spin 1/2 particles, and have only shallowly explored the spin-zero sector since the 2012 Higgs discovery! Higher spin particles? Spin two is gravity, and surely it exists, after the GW observations by LIGO and future observations. There are also spin 3/2 particles, expected from general supersymmetry arguments. Spin 5/2 or spin 3 and higher interactions are much more complicated: excepting some theories such as hypergravity, Vasiliev higher spin theory, and the higher spin towers coming from string/superstring theory, they are really hard to analyze. In fact, string theory can be seen as an effective theory of a higher spin theory containing gravity! But the main efforts toward understanding higher spin theories are yet to come. We are far from a complete view of them.
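The quantum-gravity expectation for the proton lifetime can be sketched with the naive dimension-6, Planck-suppressed estimate $\tau\sim M_{Pl}^4/m_p^5$ (a rough order-of-magnitude guess, not a full calculation):

```python
# Naive quantum-gravity estimate of the proton lifetime (natural units):
# a Planck-suppressed dimension-6 operator gives tau ~ M_Planck^4 / m_p^5.
M_Pl = 1.22e19           # Planck mass (GeV)
m_p  = 0.938             # proton mass (GeV)
GeV_inv_to_s = 6.58e-25  # hbar in GeV*s: converts GeV^-1 to seconds
s_to_yr = 1 / 3.15e7     # seconds per year

tau = (M_Pl**4 / m_p**5) * GeV_inv_to_s * s_to_yr
print(f"tau_proton ~ {tau:.0e} yr")   # roughly 10^45 yr
```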
What about the LHC expectations? Colliders are machines whose performance is quantified by a quantity called luminosity. Luminosity is related to the elementary cross sections of particle physics: the number of events per unit time is essentially luminosity times cross section, $dN/dt=\mathcal{L}\sigma$…
For the LHC, $\mathcal{L}\sim 10^{34}\ \mbox{cm}^{-2}\mbox{s}^{-1}$, and SM cross sections are measured in barns ($1\ \mbox{b}=10^{-24}\ \mbox{cm}^2$). The LHC works at $\sqrt{s}=14\ \mbox{TeV}$. Typical electromagnetic (strong) interactions scale like $\sigma\sim\alpha^2/\hat{s}$ ($\alpha_s^2/\hat{s}$), and for the relevant partonic energies the cross section is about $\sigma\sim 1-100\ \mbox{pb}$, where pb denotes picobarn. Data taking is measured in terms of inverse barns (the integrated luminosity; the expected number of events is the cross section times the integrated luminosity). Independently of whether you calculate amplitudes with Feynman graphs, the amplituhedron or any other gadget, these things are surely universal. Neutrino interactions are related to the Fermi constant or the weak boson masses, and if you go searching for some universal principle for scattering amplitudes, you will find out that, on general grounds, consistent spin-one theories provide you polarization sums of the type $\sum_\lambda\epsilon_\mu\epsilon^*_\nu=-\eta_{\mu\nu}+\ldots$, and for spin two $\sum_\lambda\epsilon_{\mu\nu}\epsilon^*_{\rho\sigma}=\frac{1}{2}\left(\eta_{\mu\rho}\eta_{\nu\sigma}+\eta_{\mu\sigma}\eta_{\nu\rho}\right)-\frac{1}{2}\eta_{\mu\nu}\eta_{\rho\sigma}+\ldots$. There are likely issues at loop corrections, but surely the universal laws of physics should conspire to be coherent in the end, even with loop corrections. How? That is a puzzle.
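A toy event-rate estimate, with round numbers assumed for the luminosity and a generic picobarn-scale cross section:

```python
# Event-rate estimate: rate = luminosity * cross section.
L_inst = 1e34             # cm^-2 s^-1 (LHC design luminosity, round number)
pb_to_cm2 = 1e-36         # 1 picobarn in cm^2
sigma = 20 * pb_to_cm2    # a typical pb-scale process (assumed round number)

rate = L_inst * sigma             # events per second
events_per_year = rate * 1e7      # ~1e7 s of beam time per year (assumed)
print(f"{rate:.1e} events/s, ~{events_per_year:.0e} events/yr")
```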
Finally, let me show you how to calculate the time a photon needs to escape from the sun. The sun is a dense medium, so a photon interacts with atoms and dense matter in the core before popping out of the exterior shells and arriving at Earth.
Data: the solar density at the core is $\rho\approx 150\ \mbox{g/cm}^3$. The opacity is $\kappa\approx 1\ \mbox{cm}^2/\mbox{g}$.
The mean free path is $\ell=1/(\kappa\rho)\approx 7\times 10^{-5}\ \mbox{m}$. The solar radius is about $R_\odot\approx 7\times 10^{8}\ \mbox{m}$. A random walk of the photon in N steps yields $d\sim\ell\sqrt{N}$, so if the photon survives, it takes $N\sim(R_\odot/\ell)^2\approx 10^{26}$ steps. Finally, the total distance traveled by the photon is $D=N\ell=R_\odot^2/\ell\approx 7\times 10^{21}\ \mbox{m}$. The time a photon takes to travel those D meters is obtained dividing by the speed of light, so

$$t=\dfrac{D}{c}=\dfrac{R_\odot^2}{\ell c}\approx 2\times 10^{13}\ \mbox{s}\approx 10^6\ \mbox{yr}$$
So, a photon escapes from the sun interior to its surface in about 1 Myr. If you enlarge the opacity you would get a higher number, and if you let the mean free path grow, decreasing the opacity (matter transparency!) due to density effects, you could get escape times of only about $10^4$ yrs. Note that if the photon did not interact, it would escape from the sun in a few seconds, like a neutrino!
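The whole random-walk estimate fits in a short script (same rounded data as above, in SI units):

```python
# Photon random walk out of the Sun, using the numbers quoted above.
kappa = 0.1        # opacity: 1 cm^2/g in SI units (m^2/kg)
rho   = 1.5e5      # core density: 150 g/cm^3 in SI units (kg/m^3)
R_sun = 7e8        # solar radius (m)
c     = 3e8        # speed of light (m/s)

mfp  = 1 / (kappa * rho)     # mean free path, ~ 7e-5 m
N    = (R_sun / mfp)**2      # steps: a random walk covers R ~ mfp * sqrt(N)
D    = N * mfp               # total path length = R_sun^2 / mfp
t_yr = D / c / 3.15e7        # escape time in years

print(f"mean free path {mfp:.1e} m, escape time ~ {t_yr:.1e} yr")
```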
May the atoms be with you!