Cold clouds of atoms, known as Bose-Einstein condensates, will test quantum gravity, enable atom-scale lithography, and prospect for minerals from afar.
Category: particle physics
Forget about online games that promise you a “whole world” to explore. An international team of researchers has generated an entire virtual universe, and made it freely available on the cloud to everyone.
Uchuu (meaning “outer space” in Japanese) is the largest and most realistic simulation of the universe to date. The Uchuu simulation consists of 2.1 trillion particles in a computational cube an unprecedented 9.63 billion light-years to a side. For comparison, that’s about three-quarters the distance between Earth and the most distant observed galaxies. Uchuu reveals the evolution of the universe on a level of both size and detail inconceivable until now.
Uchuu focuses on the large-scale structure of the universe: mysterious halos of dark matter that control not only the formation of galaxies, but also the fate of the entire universe itself. The scale of these structures ranges from the largest galaxy clusters down to the smallest galaxies. Individual stars and planets aren’t resolved, so don’t expect to find any alien civilizations in Uchuu. But one way that Uchuu wins big in comparison to other virtual worlds is the time domain; Uchuu simulates the evolution of matter over almost the entire 13.8 billion year history of the universe from the Big Bang to the present. That is over 30 times longer than the time since animal life first crawled out of the seas on Earth.
Better control over free-falling cold atoms paves the way for new tests of fundamental physics.
Circa 2018
High-energy laser pulses cause electrons to oscillate, giving off gamma rays that produce electrons and positrons.
Circa 2000
A 1940 paper by Gamow and Mario Schoenberg was the first in a subject we now call particle astrophysics. The two authors presciently speculated that neutrinos could play a role in the cooling of massive collapsing stars. They named the neutrino reaction the Urca process, after a well-known Rio de Janeiro casino. This name might seem a strange choice, but not to Gamow, a legendary prankster who once submitted a paper to Nature in which he suggested that the Coriolis force might account for his observation that cows chewed clockwise in the Northern Hemisphere and counterclockwise in the Southern Hemisphere.
In the 1940s Gamow began to attack, with his colleague Ralph Alpher, the problem of the origin of the chemical elements. Their first paper on the subject appeared in a 1948 issue of the Physical Review. At the last minute Gamow, liking the sound of ‘alpha, beta, gamma’, added his old friend Hans Bethe as middle author in absentia (Bethe went along with the joke, but the editors did not). Gamow and Alpher, with Robert Herman, then pursued the idea of an extremely hot neutron-dominated environment. They envisioned the neutrons decaying into protons, electrons and anti-neutrinos and, when the universe had cooled sufficiently, the neutrons and protons assembling heavier nuclei. They even estimated the photon background that would be necessary to account for nuclear abundances, suggesting a residual five-degree background radiation.
We now realize that their scheme was incorrect. The Universe began with roughly equal numbers of protons and neutrons. Collisions with electrons, positrons, neutrinos and anti-neutrinos are more important than neutron decay, and the absence of stable nuclei with mass numbers of five and eight creates a barrier to further fabrication in the early Universe. Nevertheless Alpher, Gamow and Herman's work was the first serious attempt to discuss the observable consequences of a big bang, and the basic framework was correct. Ironically, the term 'Big Bang' was coined by Fred Hoyle, an advocate of a steady-state model of the universe, to make fun of Gamow's efforts.
Using a groundbreaking new technique at the National Institute of Standards and Technology (NIST), an international collaboration led by NIST researchers has revealed previously unrecognized properties of technologically crucial silicon crystals and uncovered new information about an important subatomic particle and a long-theorized fifth force of nature.
By aiming subatomic particles known as neutrons at silicon crystals and monitoring the outcome with exquisite sensitivity, the NIST scientists were able to obtain three extraordinary results: the first measurement of a key neutron property in 20 years using a unique method; the highest-precision measurements of the effects of heat-related vibrations in a silicon crystal; and limits on the strength of a possible “fifth force” beyond standard physics theories.
The researchers report their findings in the journal Science.
Physicists sifting through old particle accelerator data have found evidence of a highly elusive, never-before-seen process: a so-called triangle singularity.
First envisioned by Russian physicist Lev Landau in the 1950s, a triangle singularity refers to a rare subatomic process in which particles exchange identities before flying away from each other. In this scenario, two particles, called kaons, form two corners of the triangle, while the particles they swap form the third corner.
Black holes are regions of space-time where gravity is extremely strong. Scientists originally thought that nothing could escape the boundaries of these massive objects, not even light.
The precise nature of black holes has been challenged ever since Albert Einstein’s general theory of relativity gave rise to the possibility of their existence. Among the most famous findings was English physicist Stephen Hawking’s prediction that some particles are actually emitted at the edge of a black hole.
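Hawking's prediction assigns every black hole a temperature that falls as its mass grows; the standard expression for a non-rotating black hole of mass $M$ is:

```latex
T_{\mathrm{H}} \;=\; \frac{\hbar c^{3}}{8\pi G M k_{\mathrm{B}}}
\;\approx\; 6\times 10^{-8}\,\mathrm{K}\,\left(\frac{M_{\odot}}{M}\right)
```

For a solar-mass black hole this is far below the temperature of the cosmic microwave background, which is one reason Hawking radiation has never been observed directly.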
Physicists have also explored the workings of vacuums. In the early 1970s, as Hawking was describing how light can escape a black hole’s gravitational pull, Canadian physicist William Unruh proposed that a photodetector accelerated fast enough could “see” light in a vacuum.
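The thermal glow Unruh predicted has a temperature that grows linearly with the detector's proper acceleration $a$:

```latex
T_{\mathrm{U}} \;=\; \frac{\hbar a}{2\pi c k_{\mathrm{B}}}
```

Even at Earth-surface gravity ($a \approx 9.8\,\mathrm{m/s^2}$) this works out to only about $4 \times 10^{-20}$ K, which is why "fast enough" means accelerations far beyond anything achievable with macroscopic detectors.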
CERN Courier
Jennifer Ngadiuba and Maurizio Pierini describe how ‘unsupervised’ machine learning could keep watch for signs of new physics at the LHC that have not yet been dreamt up by physicists.
In the 1970s, the robust mathematical framework of the Standard Model (SM) replaced data observation as the dominant starting point for scientific inquiry in particle physics. Decades-long physics programmes were put together based on its predictions. Physicists built complex and highly successful experiments at particle colliders, culminating in the discovery of the Higgs boson at the LHC in 2012.
Along this journey, particle physicists adapted their methods to deal with ever growing data volumes and rates. To handle the large amount of data generated in collisions, they had to optimise real-time selection algorithms, or triggers. The field became an early adopter of artificial intelligence (AI) techniques, especially those falling under the umbrella of “supervised” machine learning. Verifying the SM’s predictions or exposing its shortcomings became the main goal of particle physics. But with the SM now apparently complete, and supervised studies incrementally excluding favoured models of new physics, “unsupervised” learning has the potential to lead the field into the uncharted waters beyond the SM.
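The unsupervised idea can be illustrated with a toy sketch: train only on "background" events, then flag anything the model reconstructs poorly, with no signal hypothesis assumed. Everything below is invented for illustration (the four synthetic features, the PCA-based scorer); real LHC anomaly searches use far richer inputs and typically deep autoencoders rather than linear PCA.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "background" events: 4 correlated features standing in for
# event-level kinematic quantities (purely synthetic).
mixing = np.array([[1.0, 0.8, 0.0, 0.0],
                   [0.0, 1.0, 0.5, 0.0],
                   [0.0, 0.0, 1.0, 0.3],
                   [0.0, 0.0, 0.0, 1.0]])
bg = rng.normal(size=(5000, 4)) @ mixing

# Toy "signal" events drawn from a broader distribution the model never saw.
sig = rng.normal(scale=5.0, size=(50, 4))

def fit_pca(x, k=2):
    """Learn the mean and top-k principal directions of the training sample."""
    mu = x.mean(axis=0)
    _, _, vt = np.linalg.svd(x - mu, full_matrices=False)
    return mu, vt[:k]

def anomaly_score(x, mu, comps):
    """Squared reconstruction error after projecting onto the learned subspace."""
    centered = x - mu
    recon = centered @ comps.T @ comps
    return np.sum((centered - recon) ** 2, axis=1)

# Fit on background only; no labels, no signal model.
mu, comps = fit_pca(bg)
bg_scores = anomaly_score(bg, mu, comps)
sig_scores = anomaly_score(sig, mu, comps)

# Events unlike the training sample reconstruct poorly, so they score higher.
print(np.median(sig_scores) > np.percentile(bg_scores, 99))
```

The design choice mirrors the trigger setting described above: the scorer is cheap to evaluate per event and never needs to know what the anomaly looks like, only what the background looks like.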
Tiny particles from distant galaxies have caused plane accidents, election interference and game glitches.
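The "election interference" refers to a 2003 election in Schaerbeek, Belgium, where one candidate's tally jumped by exactly 4096 votes, the fingerprint of a single flipped bit in memory. A minimal sketch (the starting tally here is hypothetical):

```python
# One cosmic-ray strike can toggle a single bit in memory.  A jump of
# exactly 4096 = 2**12 votes is the signature of bit 12 flipping 0 -> 1.
votes = 505                  # hypothetical original tally
flipped = votes ^ (1 << 12)  # XOR toggles bit 12
print(flipped - votes)       # 4096
```

Because any tally below 4096 has bit 12 clear, the flip can only add votes, which is exactly the anomaly that exposed the error.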
This video was inspired by the RadioLab Podcast “Bit Flip” https://ve42.co/BF — they’re brilliant science storytellers.
A huge thanks to Dr Leif Scheick, Calla Cofield and the JPL Media Relations Team.
Thanks to Col Chris Hadfield. Check out his book: https://chrishadfield.ca/books/