Gordon Moore, the co-founder of Intel who died earlier this year, is famous for forecasting a continuous rise in the density of transistors that we can pack onto semiconductor chips. James McKenzie looks at how “Moore’s law” is still going strong after almost six decades, but warns that further progress is becoming harder and ever more expensive to sustain.
It has been almost 20 years since the field of two-dimensional (2D) materials was established with the discovery of the unique properties of graphene, a single, atomically thin layer of graphite. The significance of graphene and its one-of-a-kind properties was recognized as early as 2010, when the Nobel Prize in Physics was awarded to Andre Geim and Konstantin Novoselov for their work on the material. Graphene had, however, been around for much longer; researchers simply did not realize what it was or how special it was (often it was considered annoying dirt on the nice, clean surfaces of metals [REF]). Some scientists even dismissed the idea that 2D materials could exist in our three-dimensional world.
Today, things are different. 2D materials are among the most exciting subjects of study for researchers across many disciplines, including physics, chemistry and engineering. They are not only interesting from a scientific point of view; they also hold great promise for industrial and technological applications such as touchscreens and batteries.
We are also getting very good at discovering and preparing new 2D materials, and the list of known and available ones is expanding rapidly. Graphene is no longer alone: it now belongs to a large family of 2D materials with different properties and vastly diverse applications, predicted or already realized.
University of Chicago physicists have finally engineered a way to create turbulence in a tank of water, using a ring of jets to blow loops until an isolated “ball” of turbulence forms and persists.
Long a matter of philosophical speculation, the idea of multiple realities has been given new artistic licence by physics.
A novel type of neural network is helping physicists with the daunting challenge of data analysis.
Editor’s note: For a more mainstream assessment of this idea, see this article by Dr. Ethan Siegel.
Sir Roger Penrose, a mathematician and physicist from the University of Oxford who shared the Nobel Prize in physics in 2020, claims our universe has gone through multiple Big Bangs, with another one coming in our future.
Penrose received the Nobel for working out mathematical methods that proved and expanded Albert Einstein’s general theory of relativity, and for his discoveries on black holes, which showed how objects that become too dense undergo gravitational collapse into singularities – points of infinite density.
By measuring inflated helium nuclei, physicists have challenged our best understanding of the force that binds protons and neutrons.
The ability of criticality to explain the sudden emergence of new properties in complex systems has fascinated scientists for decades. When a system is balanced at its “critical point,” small changes in individual units can trigger outsized events, just as falling pebbles can start an avalanche. That abrupt shift in behavior describes the phase changes of water from ice to liquid to gas, but it is also relevant to many other situations, from flocks of starlings on the wing to stock market crashes. In the 1990s the physicist Per Bak and other scientists suggested that the brain might be operating near its own critical point. Ever since, neuroscientists have been searching for evidence of fractal patterns and power laws in the brain’s networks of neurons. What was once a fringe theory has begun to attract more mainstream attention, with researchers now hunting for mechanisms capable of tuning brains toward criticality.
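Per Bak’s canonical toy system for this idea is the sandpile model (the Bak–Tang–Wiesenfeld model), in which grains dropped onto a grid trigger avalanches of every size. The article above describes the idea only in prose; what follows is a minimal Python sketch assuming the standard sandpile rules, with the grid size, drop count and reporting threshold chosen purely for illustration.

```python
# Minimal sketch of the Bak-Tang-Wiesenfeld sandpile, the classic model of
# self-organized criticality. Grains are dropped one at a time; any site
# holding 4 or more grains "topples", shedding one grain to each neighbor,
# which can set off further topplings.
import random

SIZE = 20
grid = [[0] * SIZE for _ in range(SIZE)]

def topple():
    """Relax the grid; return the number of topplings (the avalanche size)."""
    avalanche = 0
    unstable = True
    while unstable:
        unstable = False
        for y in range(SIZE):
            for x in range(SIZE):
                if grid[y][x] >= 4:
                    grid[y][x] -= 4
                    avalanche += 1
                    unstable = True
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        # Grains pushed off the edge are lost, which keeps
                        # the pile from saturating.
                        if 0 <= ny < SIZE and 0 <= nx < SIZE:
                            grid[ny][nx] += 1
    return avalanche

sizes = []
for _ in range(5000):
    y, x = random.randrange(SIZE), random.randrange(SIZE)
    grid[y][x] += 1
    sizes.append(topple())

# Near criticality, avalanche sizes spread over many scales: lots of tiny
# events plus a long tail of large ones, a rough power law.
big = sum(1 for s in sizes if s > 50)
print(f"avalanches > 50 topplings: {big} of {len(sizes)}")
```

Run for long enough, the pile tunes itself toward its critical state with no parameter adjustment, which is exactly the property that made Bak’s suggestion about the brain so intriguing.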
Learn more about the critical brain hypothesis: https://www.quantamagazine.org/a-physical-theory-for-when-th…-20230131/
Basic cellular automata, such as the one-dimensional “elementary” automata and Conway’s two-dimensional Game of Life, appeal to researchers working in mathematics and computer science theory, but they can have practical applications too. Some elementary cellular automata can be used for random number generation, physics simulations and cryptography. Others are as computationally powerful as conventional computing architectures, at least in principle. In a sense, these task-oriented cellular automata are akin to an ant colony, in which the simple actions of individual ants combine to perform larger collective tasks such as digging tunnels or collecting food and carrying it back to the nest. More “advanced” cellular automata, which have more complicated rules (though still based only on neighboring cells), can be used for practical computing tasks such as identifying objects in an image.
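To make the “simple rules, local neighbors” idea concrete, here is a minimal Python sketch of an elementary cellular automaton; it is not code from the article or from any photonic hardware. Rule 30, used below, is the classic example tied to random number generation: each cell’s next state is read off from the bits of the rule number, indexed by the states of the cell and its two immediate neighbors.

```python
# Minimal sketch of a 1D "elementary" cellular automaton. Each of the 256
# possible rules is encoded as an 8-bit number; Rule 30's center column is
# irregular enough to have been used as a pseudorandom source.

def step(cells, rule=30):
    """Advance one generation: a cell's next state depends only on itself
    and its two immediate neighbors (wrap-around boundaries)."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        # The three neighbor bits form an index 0-7 into the rule number.
        idx = (left << 2) | (center << 1) | right
        nxt.append((rule >> idx) & 1)
    return nxt

# Start from a single "on" cell and print a few generations.
width = 31
cells = [0] * width
cells[width // 2] = 1
for _ in range(16):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Run as is, it prints the familiar chaotic Rule 30 triangle growing from a single seed cell; swapping in rule=110 gives an automaton known to be Turing-complete.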
Marandi explains: “While we are fascinated by the type of complex behaviors that we can simulate with a relatively simple photonic hardware, we are really excited about the potential of more advanced photonic cellular automata for practical computing applications.”
Marandi says cellular automata are well suited to photonic computing for a couple of reasons. Because information processing happens at an extremely local level (in cellular automata, cells interact only with their immediate neighbors), they eliminate the need for much of the hardware that makes photonic computing difficult: the various gates, switches and devices otherwise required for moving and storing light-based information. And the high-bandwidth nature of photonic computing means cellular automata can run incredibly fast. In traditional computing, a cellular automaton might be written in a programming language, which is built on a layer of machine language below it, which itself sits atop the binary ones and zeroes that make up digital information; running cellular automata directly in photonic hardware sidesteps those layers of abstraction.
Hundreds of physicists met in London this week for the ninth Future Circular Collider (FCC) Conference.