
US lab sets up new supercomputer to test nuclear stockpile

It is expected to deliver performance up to eight times faster than its predecessor.

The Los Alamos National Laboratory (LANL) is in the final stages of setting up a new supercomputer, dubbed Crossroads, that will allow it to assess the US nuclear stockpile through simulation rather than live testing, a press release said. The system was supplied by Hewlett Packard Enterprise, and installation began in June of this year.

The Department of Energy (DoE) is responsible for ensuring that the US nuclear stockpile can be relied upon if and when it needs to be used. For this purpose, the federal agency does not actually test the warheads but carries out simulations to assess the storage, maintenance, and efficacy of the weapons.

Exascale revolution: Supercomputers unleash a new era in biophysics discovery

In a recently published article featured on the cover of the Biophysical Journal, Dr. Rafael Bernardi, assistant professor of biophysics at the Department of Physics at Auburn University, and Dr. Marcelo Melo, a postdoctoral researcher in Dr. Bernardi’s group, shed light on the transformative capabilities of the next generation of supercomputers in reshaping the landscape of biophysics.

The researchers at Auburn delve into the harmonious fusion of computational modeling and experimental research, providing a perspective on a future in which discoveries are made with unparalleled precision. Rather than being mere observers, today’s biophysicists, with the aid of advanced high-performance computing (HPC), are now trailblazers who can challenge longstanding biological assumptions, illuminate intricate details, and even create new proteins or design novel molecular circuits.

One of the most important aspects discussed in their perspective article is the new ability of computational biophysicists to simulate complex biological systems, ranging from the subatomic scale to whole-cell models, in extraordinary detail.
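
The workhorse behind many of these simulations is molecular dynamics. As a rough illustration of the core idea (a toy sketch, not the Auburn group's actual software, with the force law and parameters chosen purely for demonstration), a minimal velocity-Verlet integrator for two Lennard-Jones particles might look like this:

```python
import numpy as np

# Minimal velocity-Verlet molecular dynamics sketch (illustrative only;
# production biophysics codes apply the same update rule to millions of atoms).
def lennard_jones_force(r_vec, epsilon=1.0, sigma=1.0):
    """Force on particle 1 from particle 2 under a Lennard-Jones potential."""
    r = np.linalg.norm(r_vec)
    # -dV/dr of 4*eps*((sigma/r)^12 - (sigma/r)^6), projected along r_vec
    magnitude = 24 * epsilon * (2 * (sigma / r) ** 12 - (sigma / r) ** 6) / r
    return magnitude * r_vec / r

def velocity_verlet(pos, vel, mass=1.0, dt=1e-3, steps=1000):
    """Integrate two particles; pos and vel have shape (2, 3)."""
    f = lennard_jones_force(pos[0] - pos[1])
    forces = np.array([f, -f])
    for _ in range(steps):
        vel += 0.5 * dt * forces / mass           # half-step velocity update
        pos += dt * vel                           # full-step position update
        f = lennard_jones_force(pos[0] - pos[1])  # recompute forces
        forces = np.array([f, -f])
        vel += 0.5 * dt * forces / mass           # second half-step
    return pos, vel

pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
vel = np.zeros((2, 3))
print(velocity_verlet(pos, vel))
```

Scaling this inner loop to millions of atoms per timestep is precisely where exascale machines earn their keep.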

Physicists develop series of quality control tests for quantum computers

Quantum technologies—and quantum computers in particular—have the potential to shape the development of technology in the future. Scientists believe that quantum computers will help them solve problems that even the fastest supercomputers are unable to handle yet. Large international IT companies and countries like the United States and China have been making significant investments in the development of this technology. But because quantum computers are based on different laws of physics than conventional computers, laptops, and smartphones, they are more susceptible to malfunction.

An interdisciplinary research team led by Professor Jens Eisert, a physicist at Freie Universität Berlin, has now found ways of testing the quality of quantum computers. Their study on the subject was recently published in the scientific journal Nature Communications. These scientific quality control tests incorporate methods from physics, computer science, and mathematics.

Professor Jens Eisert, quantum physicist at Freie Universität Berlin and author of the study, explains the science behind the research: “Quantum computers work on the basis of quantum mechanical laws of physics, in which atoms or ions are used as computational units—or to put it another way—controlled, minuscule physical systems. What is extraordinary about these computers of the future is that at this level, nature functions extremely and radically differently from our everyday experience of the world and how we know and perceive it.”
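
To make those "computational units" concrete: a qubit is a two-level quantum system whose state is a complex superposition of the basis states |0⟩ and |1⟩. A minimal NumPy sketch of the textbook formalism (illustrative only, tied to no particular hardware):

```python
import numpy as np

# Toy model of a single qubit as a controlled two-level quantum system.
ket0 = np.array([1, 0], dtype=complex)  # |0>

# A Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ ket0

# Born rule: measurement probabilities are the squared amplitudes.
probs = np.abs(state) ** 2
print(probs)  # [0.5, 0.5] -- unlike a classical bit, both outcomes coexist

# Simulate 1000 measurements; each collapses the superposition to 0 or 1.
samples = np.random.choice([0, 1], size=1000, p=probs)
print(np.bincount(samples))
```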

Microsoft Wants to Build a Quantum Supercomputer Within a Decade

Since the start of the quantum race, Microsoft has placed its bets on the elusive but potentially game-changing topological qubit. Now the company claims its Hail Mary has paid off, saying it could build a working processor in less than a decade.

Today’s leading quantum computing companies have predominantly focused on qubits—the quantum equivalent of bits—made out of superconducting electronics, trapped ions, or photons. These devices have achieved impressive milestones in recent years, but are hampered by errors that mean a quantum computer able to outperform classical ones still appears some way off.

Microsoft, on the other hand, has long championed topological quantum computing. Rather than encoding information in the states of individual particles, this approach encodes information in the overarching structure of the system. In theory, that should make the devices considerably more tolerant of background noise from the environment, and therefore far less error-prone.
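
As a very loose classical analogy for encoding information in the structure of a system rather than in a single particle, consider a repetition code, where one logical bit is spread across several physical bits and isolated errors are outvoted. Topological qubits realize a far stronger, genuinely quantum version of this idea; the sketch below is only a conceptual aid:

```python
import random

# Classical repetition code: a logical bit stored redundantly across n
# physical bits survives isolated flips via majority vote. (Only an
# analogy; topological protection is a quantum-mechanical effect.)
def encode(bit, n=7):
    return [bit] * n

def apply_noise(bits, p_flip=0.1):
    return [b ^ (random.random() < p_flip) for b in bits]

def decode(bits):
    return int(sum(bits) > len(bits) / 2)  # majority vote

logical = 1
noisy = apply_noise(encode(logical))
print(noisy, "->", decode(noisy))  # usually recovers 1 despite flips
```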

An IBM Quantum Computer Beat a Supercomputer in a Benchmark Test

The teams pitted IBM’s 127-qubit Eagle chip against supercomputers at Lawrence Berkeley National Lab and Purdue University on increasingly complex tasks. With easier calculations, Eagle matched the supercomputers’ results every time, suggesting that even with noise, the quantum computer could generate accurate responses. But where it shone was at scale: on the largest problems, it returned results that are, in theory, far more accurate than what’s possible today with state-of-the-art silicon computer chips.

At the heart is a post-processing technique that reduces noise. Similar to looking at a large painting, the method doesn’t analyze each individual brush stroke; rather, it focuses on small portions of the painting and captures the general “gist” of the artwork.
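
The post-processing in question is widely described as zero-noise extrapolation (an identification assumed here, since the summary above doesn't name it): the same circuit is run at deliberately amplified noise levels, and the measured expectation values are extrapolated back to the zero-noise limit. A minimal sketch of the fitting step, with invented numbers standing in for real measurements:

```python
import numpy as np

# Zero-noise extrapolation (ZNE): measure an observable at several
# artificially amplified noise levels, fit a curve, and extrapolate
# to zero noise. The data below are invented for illustration.
noise_scale = np.array([1.0, 1.5, 2.0, 3.0])      # noise amplification factors
measured    = np.array([0.82, 0.74, 0.67, 0.55])  # noisy expectation values

# Fit a low-order polynomial and evaluate it at scale = 0.
coeffs = np.polyfit(noise_scale, measured, deg=2)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(f"extrapolated zero-noise value: {zero_noise_estimate:.3f}")
```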

The study, published in Nature, isn’t chasing quantum advantage—the point at which quantum computers can solve problems faster than any conventional computer. Rather, it shows that today’s quantum computers, even when imperfect, may become part of scientific research—and perhaps our lives—sooner than expected. In other words, we’ve now entered the realm of quantum utility.

Solving ordinary and partial differential equations using an analog computing system based on ultrasonic metasurfaces

Wave-based analog computing has recently emerged as a promising computing paradigm due to its potential for high computational efficiency and minimal crosstalk. Although low-frequency acoustic analog computing systems exist, their bulky size makes it difficult to integrate them into chips that are compatible with complementary metal-oxide semiconductors (CMOS). This research paper addresses this issue by introducing a compact analog computing system (ACS) that leverages the interactions between ultrasonic waves and metasurfaces to solve ordinary and partial differential equations. The results of our wave propagation simulations, conducted using MATLAB, demonstrate the high accuracy of the ACS in solving such differential equations. Our proposed device has the potential to enhance the prospects of wave-based analog computing systems as the supercomputers of tomorrow.
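
As a digital point of reference for the kind of problem the ACS solves in the analog domain, here is a forward-Euler integration of a simple first-order ODE, checked against its exact solution (a generic numerical sketch in Python, unrelated to the paper's MATLAB wave-propagation code):

```python
import numpy as np

# Reference digital solution of dy/dt = -2y, y(0) = 1, whose exact
# solution is y(t) = exp(-2t). An analog computer evaluates the same
# mapping physically, here via ultrasonic wave/metasurface interactions.
dt, t_end = 1e-3, 2.0
t = np.arange(0.0, t_end + dt, dt)
y = np.empty_like(t)
y[0] = 1.0
for i in range(len(t) - 1):
    y[i + 1] = y[i] + dt * (-2.0 * y[i])  # forward-Euler step

exact = np.exp(-2.0 * t)
print(f"max error: {np.max(np.abs(y - exact)):.2e}")
```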

How Will Quantum Computers Change The World?

Quantum computers are the next step in computation. These devices can harness the peculiarities of quantum mechanics to dramatically boost the power of computers. Not even the most powerful supercomputer can compete with this approach. But the road to delivering on that incredible potential remains long.

Still, in the last few years, big steps have been taken, with simple quantum processors coming online. New breakthroughs have shown solutions to the major challenges in the discipline. The road is still long, but now we can see several opportunities along the way. For The Big Questions, IFLScience’s podcast, we spoke to Professor Winfried Hensinger, Professor of Quantum Technology at the University of Sussex and the Chief Scientific Officer for Universal Quantum, about the impact these devices will have.

This New AI Supercomputer Outperforms NVIDIA! (with CEO Andrew Feldman)

In this video I discuss the new Cerebras supercomputer with Cerebras CEO Andrew Feldman.
Timestamps:
00:00 — Introduction.
02:15 — Why such a HUGE Chip?
02:37 — New AI Supercomputer Explained.
04:06 — Main Architectural Advantage.
05:47 — Software Stack NVIDIA CUDA vs Cerebras.
06:55 — Costs.
07:51 — Key Applications & Customers.
09:48 — Next Generation — WSE3.
10:27 — NVIDIA vs Cerebras Comparison.

Mentioned Papers:
Massively scalable stencil algorithm: https://arxiv.org/abs/2204.03775
https://www.cerebras.net/blog/harnessing-the-power-of-sparsi…-ai-models.
https://www.cerebras.net/press-release/cerebras-wafer-scale-…ge-models/
Programming at Scale:
https://8968533.fs1.hubspotusercontent-na1.net/hubfs/8968533…tScale.pdf.
Massively Distributed Finite-Volume Flux Computation: https://arxiv.org/abs/2304.

Mentioned Video:
New CPU Technology: https://youtu.be/OcoZTDevwHc.


“Quantum Avalanche” — A Phenomenon That May Revolutionize Microelectronics and Supercomputing

New Study Solves Mystery on Insulator-to-Metal Transition

A study explored insulator-to-metal transitions, uncovering discrepancies in the traditional Landau-Zener formula and offering new insights into resistive switching. By using computer simulations, the research highlights the quantum mechanics involved and suggests that electronic and thermal switching can arise simultaneously, with potential applications in microelectronics and neuromorphic computing.

Based on the behavior of their subatomic particles, most materials can be placed into one of two categories.

China Builds Exascale Supercomputer with 19.2 Million Cores

After the U.S. government imposed crippling sanctions against select Chinese high-tech and supercomputer companies through 2019 and 2020, firms like Huawei had to halt chip development; it is impossible to build competitive processors without access to leading-edge nodes. But Jiangnan Computing Lab, which develops Sunway processors, and National Supercomputing Center in Wuxi kept building new supercomputers and recently even submitted results of their latest machine for the Association for Computing Machinery’s Gordon Bell prize.

The new Sunway supercomputer built by the National Supercomputing Center in Wuxi (an entity blacklisted in the U.S.) features approximately 19.2 million cores across 49,230 nodes, reports Supercomputing.org. To put the number into context, Frontier, the world’s highest-performing supercomputer, uses 9,472 nodes and consumes 21 MW of power. Meanwhile, the National Supercomputing Center in Wuxi does not disclose the power consumption of its latest system.
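
A quick back-of-the-envelope check on the quoted figures (illustrative arithmetic only):

```python
# Rough sanity check on the reported Sunway node and core counts.
cores, nodes = 19_200_000, 49_230
print(f"{cores / nodes:.0f} cores per node")          # ~390
print(f"{nodes / 9_472:.1f}x as many nodes as Frontier")  # ~5.2x
```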
