Dec 26, 2023
Posted by Dan Breeden in categories: alien life, computing, quantum physics
Alien civilizations that may use black holes as super quantum computers.
Transistor performs energy-efficient associative learning at room temperature.
An artistic interpretation of brain-like computing. Image by Xiaodong Yan/Northwestern University.
Harvard’s breakthrough in quantum computing features a new logical quantum processor with 48 logical qubits, enabling large-scale algorithm execution on an error-corrected system. This development, led by Mikhail Lukin, represents a major advance towards practical, fault-tolerant quantum computers.
In quantum computing, a quantum bit or “qubit” is one unit of information, just like a binary bit in classical computing. For more than two decades, physicists and engineers have shown the world that quantum computing is, in principle, possible by manipulating quantum particles – be they atoms, ions or photons – to create physical qubits.
But successfully exploiting the weirdness of quantum mechanics for computation is more complicated than simply amassing a large-enough number of physical qubits, which are inherently unstable and prone to collapse out of their quantum states.
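The qubit described above can be pictured with a toy model: a minimal sketch (not any lab's actual hardware interface) that represents a single qubit as two complex amplitudes and applies the Born rule for measurement. The function names here are illustrative, not from any real quantum library.

```python
import random

# Toy model of a single qubit: two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2 (the Born rule).

def is_normalized(alpha: complex, beta: complex, tol: float = 1e-9) -> bool:
    """Check that the two amplitudes form a valid quantum state."""
    return abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1.0) < tol

def measure(alpha: complex, beta: complex) -> int:
    """Simulate one measurement: collapse to 0 or 1 probabilistically."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Equal superposition, like a qubit after a Hadamard gate:
h = 2 ** -0.5
print(is_normalized(h, h))  # True
```

The fragility the article mentions is exactly what this model leaves out: real physical qubits drift out of such states through noise, which is why error correction across many physical qubits is needed to form one stable logical qubit.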
Quantum computing is often hailed as the next frontier of technology, promising to solve some of the most complex and challenging problems in science, engineering, and business. But how close are we to achieving this quantum dream, and what are the limitations of this emerging field?
As IEEE Spectrum shares in its detailed report, some of the leading voices in quantum computing have recently expressed doubts and concerns about the technology’s current state and prospects. They argue that quantum computers are far from being ready for practical use and that their applications are more restricted than commonly assumed.
This Review examines the development of neuromorphic hardware systems based on halide perovskites, considering how devices based on these materials can serve as synapses and neurons, and can be used in neuromorphic computing networks.
The big difference is that all the rendering would be handled by chiplets instead of a big compute chip like on its existing GPUs.
The Sydney team exploited stimulated Brillouin scattering, a technique which involves converting electrical fields into pressure waves in certain insulators, such as optical fibers. In 2011, the researchers reported that Brillouin scattering held potential for high-resolution filtering, and developed new manufacturing techniques to combine a chalcogenide Brillouin waveguide on a silicon chip. In 2023, they managed to combine a photonic filter and modulator on the same type of chip. The combination gives the experimental chip a spectral resolution of 37 megahertz and a wider bandwidth than preceding chips, the team reported in a paper published 20 November in Nature Communications.
“The integration of the modulator with this active waveguide is the key breakthrough here,” says nanophotonics researcher David Marpaung of the University of Twente in the Netherlands. Marpaung worked with the Sydney group a decade ago and now leads his own research group, which is taking a different approach in the quest to achieve wide-band, high-resolution photonic radio sensitivity in a tiny package. Marpaung says that when someone reaches sub-10-MHz spectral resolution across a 100-gigahertz band, they will be able to replace bulkier electronic RF chips in the marketplace. Another advantage of such chips is that they would convert RF signals to optical signals for direct transmission through fiber-optic networks. The winners of that race will be able to reach the huge market of telecom providers and defense manufacturers who need radio receivers capable of reliably navigating complicated radio-frequency (RF) environments.
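Marpaung's target can be put in rough perspective with back-of-envelope arithmetic: the number of frequency channels a filter can distinguish is approximately its bandwidth divided by its spectral resolution. A minimal sketch (the 100 GHz / 10 MHz figures are the stated target, not a measured result for the Sydney chip):

```python
# Rough figure of merit for an RF filter: how many distinct frequency
# channels fit in its band, given its spectral resolution.

def resolvable_channels(bandwidth_hz: float, resolution_hz: float) -> int:
    """Approximate channel count: bandwidth divided by resolution."""
    return int(bandwidth_hz // resolution_hz)

# Marpaung's stated target: sub-10-MHz resolution over a 100 GHz band.
print(resolvable_channels(100e9, 10e6))  # 10000
```

By this crude measure, such a chip would distinguish on the order of ten thousand channels, which is why it could stand in for much bulkier electronic RF front ends.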
“Chalcogenide has a very strong Brillouin effect; it’s very good, but there is still a question of whether this is scalable…it’s still perceived as a lab material,” Marpaung says. The Sydney group had to figure out a new way to fit the chalcogenide waveguides, in a 5-millimeter-square package, onto a standard manufactured silicon chip, which was no easy task. In 2017, the group figured out how to combine chalcogenide with a silicon input/output ring, but it took until this year for anyone to manage the combination with a standard chip.
What just happened? IBM’s concept nanosheet transistor demonstrated nearly double the performance when operated at liquid nitrogen’s boiling point. This achievement is expected to enable several technological advances and could pave the way for nanosheet transistors to replace FinFETs. Even more excitingly, it could lead to the development of a more powerful class of chips.
Liquid nitrogen is widely used throughout the semiconductor manufacturing process to remove heat and create inert environments in critical process areas. However, operating chips at its boiling point of 77 kelvin (−196 °C) has not been practical, because the current generation of nanosheet transistors hasn’t been designed to withstand temperatures that low.
This limitation is unfortunate because it has been theorized that chips could boost their performance in such an environment. Now, this possibility may be realized, as demonstrated by a concept nanosheet transistor IBM presented at the 2023 IEEE International Electron Device Meeting held earlier this month in San Francisco.