Brain-computer interfaces allow direct communication between the brain and external devices, such as computers or prosthetics. As significant investments flow into R&D, cutting-edge companies are gearing up for human trials. These trials aim to showcase and fine-tune the potential of these interfaces to treat conditions such as Parkinson’s disease, epilepsy and depression.
While the immediate use of these technologies is to treat medical conditions, they also have the potential to provide access to vast amounts of information at unprecedented speeds. As it stands today, the field aims not only to aid recovery but also to enhance existing cognitive functions. These goals raise a range of ethical questions.
Can cutting-edge technology transform the way humans learn, remember and evolve?
Trinity and IBM Dublin simulate superdiffusion on a quantum computer, marking a milestone in quantum physics.
Quantum physicists at Trinity have teamed up with IBM Dublin in an innovative project, successfully simulating superdiffusion on a quantum computer. This significant accomplishment is among the initial results of the TCD-IBM predoctoral scholarship program.
Scientists working in connectomics, a research field concerned with reconstructing the neuronal networks of the brain, aim to completely map the millions or billions of neurons found in mammalian brains. In spite of impressive advances in electron microscopy, the key bottleneck for connectomics is the amount of human labor required for data analysis. Researchers at the Max Planck Institute for Brain Research in Frankfurt, Germany, have now developed reconstruction software that allows researchers to fly through brain tissue at unprecedented speed. Together with the startup company scalable minds, they created webKnossos, which turns researchers into brain pilots and yields roughly a 10-fold speedup in connectomics data analysis.
Billions of nerve cells work in parallel inside our brains to achieve behaviours as impressive as hypothesizing, predicting, detecting and thinking. These neurons form a highly complex network in which each nerve cell communicates with about one thousand others. Signals travel along ultrathin cables, called axons, that extend from each neuron to its roughly one thousand “followers.”
It is only thanks to recent developments in electron microscopy that researchers can aim to map these networks in detail. The analysis of such image data, however, remains the key bottleneck in connectomics. Remarkably, human annotators still outperform even the best computer-based analysis methods today. Scientists therefore have to combine human and machine analysis to make sense of the huge image datasets obtained from electron microscopes.
Another concern was the dissipation of electrical power on the Enchilada Trap, which could generate significant heat, leading to increased outgassing from surfaces, a higher risk of electrical breakdown and elevated levels of electrical field noise. To address this issue, production specialists designed new microscopic features to reduce the capacitance of certain electrodes.
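As a rough, illustrative sketch of why lower capacitance means less heat (the formula is the standard dielectric-loss estimate for a capacitor driven by an RF voltage, and the numbers below are placeholders rather than Enchilada Trap specifications): the dissipated power scales linearly with electrode capacitance, P ≈ ω C V_rms² tan δ, so halving the capacitance roughly halves the heat load.

```python
import math

def rf_dissipation_watts(capacitance_f, v_rms, freq_hz, loss_tangent):
    """Dielectric loss in a capacitive electrode driven at an RF frequency:
    P = omega * C * V_rms^2 * tan(delta)."""
    omega = 2 * math.pi * freq_hz
    return omega * capacitance_f * v_rms**2 * loss_tangent

# Placeholder values, not actual trap parameters.
before = rf_dissipation_watts(capacitance_f=20e-12, v_rms=100.0, freq_hz=50e6, loss_tangent=1e-3)
after = rf_dissipation_watts(capacitance_f=10e-12, v_rms=100.0, freq_hz=50e6, loss_tangent=1e-3)
print(f"dissipation before: {before*1e3:.1f} mW, after: {after*1e3:.1f} mW")
```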
“Our team is always looking ahead,” said Sandia’s Zach Meinelt, the lead integrator on the project. “We collaborate with scientists and engineers to learn about the kind of technology, features and performance improvements they will need in the coming years. We then design and fabricate traps to meet those requirements and constantly seek ways to further improve.”
Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.
Graphene nanoribbons have outstanding properties that can be precisely controlled. Researchers from Empa and ETH Zurich, in collaboration with partners from Peking University, the University of Warwick and the Max Planck Institute for Polymer Research, have succeeded in attaching electrodes to individual atomically precise nanoribbons, paving the way for precise characterization of the fascinating ribbons and their possible use in quantum technology.
Quantum technology is promising, but also perplexing. In the coming decades, it is expected to provide us with various technological breakthroughs: smaller and more precise sensors, highly secure communication networks, and powerful computers that can help develop new drugs and materials, control financial markets, and predict the weather much faster than current computing technology ever could.
To achieve this, we need so-called quantum materials: substances that exhibit pronounced quantum physical effects. One such material is graphene. This two-dimensional structural form of carbon has unusual physical properties, such as extraordinarily high tensile strength, thermal and electrical conductivity—as well as certain quantum effects. Restricting the already two-dimensional material even further, for instance, by giving it a ribbon-like shape, gives rise to a range of controllable quantum effects.
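To make the point about restriction concrete, here is a back-of-the-envelope sketch (my own illustration, not the Empa/ETH analysis): confining graphene’s electrons across a ribbon of width w opens an energy gap of order ħ·v_F·π/w, so narrower ribbons behave like tunable semiconductors while wide ones stay nearly gapless.

```python
import math

HBAR = 1.054571817e-34   # J*s
EV = 1.602176634e-19     # J per eV
V_F = 1.0e6              # m/s, approximate Fermi velocity in graphene

def confinement_gap_ev(width_nm: float) -> float:
    """Order-of-magnitude estimate of the gap opened by confining graphene's
    massless Dirac electrons to a ribbon of the given width:
    E ~ hbar * v_F * pi / w  (simplified hard-wall picture, ignores edge type)."""
    width_m = width_nm * 1e-9
    return HBAR * V_F * math.pi / width_m / EV

for w in (1.0, 2.0, 5.0, 20.0):
    print(f"width {w:4.1f} nm -> gap ~ {confinement_gap_ev(w):.2f} eV")
```

Real armchair nanoribbons fall into families whose gaps differ in detail, but the inverse-width trend is the controllable knob the passage refers to.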
“There’s no road map,” said Michael Sipser, a veteran complexity theorist at the Massachusetts Institute of Technology who spent years grappling with the problem in the 1980s. “It’s like you’re going into the wilderness.”
It seems that proving that computational problems are hard to solve is itself a hard task. But why is it so hard? And just how hard is it? Carmosino and other researchers in the subfield of meta-complexity reformulate questions like this as computational problems, propelling the field forward by turning the lens of complexity theory back on itself.
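For a concrete flavor of what such a reformulation can look like (an illustration chosen here, not drawn from the passage): the Minimum Circuit Size Problem asks, given a Boolean function’s full truth table and a size budget s, whether some circuit with at most s gates computes it. A brute-force sketch:

```python
from itertools import product

OPS = (lambda a, b: a & b,   # AND
       lambda a, b: a | b,   # OR
       lambda a, b: a ^ b)   # XOR

def truth_table(f, n):
    """Truth table of f over all 2^n input assignments, as a tuple of 0/1."""
    return tuple(f(bits) for bits in product((0, 1), repeat=n))

def has_circuit_of_size(target, n, s):
    """Brute-force check: does some circuit with at most s two-input gates
    (over AND/OR/XOR) compute the function with truth table `target`?"""
    assignments = list(product((0, 1), repeat=n))
    # Each "wire" is represented by its full column of output values.
    input_wires = [tuple(bits[k] for bits in assignments) for k in range(n)]

    def search(wires, gates_left):
        if target in wires:
            return True
        if gates_left == 0:
            return False
        for i in range(len(wires)):
            for j in range(len(wires)):
                for op in OPS:
                    new = tuple(op(a, b) for a, b in zip(wires[i], wires[j]))
                    if search(wires + [new], gates_left - 1):
                        return True
        return False

    return search(input_wires, s)

# Example: XOR of three bits needs more than one gate, but two suffice.
xor3 = truth_table(lambda bits: bits[0] ^ bits[1] ^ bits[2], 3)
print(has_circuit_of_size(xor3, 3, 1))  # False
print(has_circuit_of_size(xor3, 3, 2))  # True
```

The exhaustive search already hints at the difficulty: the number of candidate circuits explodes with the size budget, and whether this problem is itself NP-hard remains open.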
Computer simulations confirm that the African Superplume causes the unusual deformations and rift-parallel seismic anisotropy detected below the East African Rift System.
Continental rifting involves a combination of stretching and fracturing that penetrates deep within the Earth, explains geophysicist D. Sarah Stamps. The process stretches the lithosphere, Earth’s rigid outer layer. As the lithosphere is stretched, its upper sections deform in a brittle manner, leading to rock fractures and earthquakes.
Stamps studies these processes using computer modeling and GPS data.
Trinity’s quantum physicists, in collaboration with IBM Dublin, have successfully simulated superdiffusion in a system of interacting quantum particles on a quantum computer.
This is a first step toward performing highly challenging quantum transport calculations on quantum hardware and, as the hardware improves over time, such work promises to shed new light on condensed matter physics and materials science.
The work is one of the first outputs of the recently established TCD-IBM predoctoral scholarship program, in which IBM hires Ph.D. students as employees who are co-supervised at Trinity. The paper was published recently in npj Quantum Information.
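As a hedged sketch of the kind of circuit involved (this assumes the Qiskit SDK and is not the authors’ published code): spin transport in the isotropic Heisenberg chain can be pushed forward in time with Trotterized two-qubit rotations, and superdiffusion shows up as a spin profile whose width grows like t^(2/3) rather than the diffusive t^(1/2).

```python
# Minimal sketch, assuming Qiskit is installed; illustrative only.
from qiskit import QuantumCircuit

def heisenberg_trotter_step(n_qubits: int, dt: float) -> QuantumCircuit:
    """One first-order Trotter step of the isotropic Heisenberg (XXX) chain:
    exp(-i*dt*(X_j X_{j+1} + Y_j Y_{j+1} + Z_j Z_{j+1})) applied bond by bond,
    even bonds first, then odd bonds."""
    qc = QuantumCircuit(n_qubits)
    for start in (0, 1):
        for j in range(start, n_qubits - 1, 2):
            qc.rxx(2 * dt, j, j + 1)   # RXX(theta) = exp(-i*theta/2 * XX)
            qc.ryy(2 * dt, j, j + 1)
            qc.rzz(2 * dt, j, j + 1)
    return qc

# Stack steps to reach total evolution time t = n_steps * dt.
n_qubits, dt, n_steps = 8, 0.1, 10
circuit = QuantumCircuit(n_qubits)
step = heisenberg_trotter_step(n_qubits, dt)
for _ in range(n_steps):
    circuit.compose(step, inplace=True)
print(circuit.count_ops())
```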
Not many pure-play quantum computing start-ups have dared to go public. So far, the financial markets have tended to treat the newcomers unsparingly. One exception is IonQ, which, along with D-Wave and Rigetti, reported quarterly earnings last week. Buoyed by hitting key technical and financial goals, IonQ’s stock is up ~400% year-to-date, and CEO Peter Chapman is taking an aggressive stance in the frothy quantum computing landscape, where error correction – not qubit count – has increasingly taken center stage as the key challenge.
This is all occurring at a time when a wide variety of different qubit types are vying for dominance. IBM, Google, and Rigetti are betting on superconducting qubits. IonQ and Quantinuum use trapped ions. Atom Computing and QuEra use neutral atoms. PsiQuantum and Xanadu rely on photonic qubits. Microsoft is exploring topological qubits based on the elusive Majorana particle. And more are in the works.
It’s not that the race to scale up qubit counts has ended. IBM’s Osprey device now has 433 qubits, and the company is scheduled to introduce an 1,100-qubit device (Condor) late this year. Several other quantum computing companies have devices in the 50–100 qubit range. IonQ’s latest QPU, Forte, has 32 qubits. The challenge they all face is that current error rates remain so high that it is impractical to reliably run most applications on the current crop of QPUs.
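A back-of-the-envelope way to see the problem (illustrative numbers, not vendor specifications): if every two-qubit gate fails independently with probability p and a circuit contains G such gates, the chance that nothing goes wrong is roughly (1 - p)^G, which collapses quickly at the gate counts real applications require.

```python
def circuit_success_probability(gate_error: float, num_gates: int) -> float:
    """Crude estimate: probability that no gate fails, assuming independent
    errors and no error correction."""
    return (1.0 - gate_error) ** num_gates

# Assumed ~0.5% two-qubit gate error rate, roughly typical of today's better hardware.
for gates in (100, 1_000, 10_000):
    p = circuit_success_probability(gate_error=5e-3, num_gates=gates)
    print(f"{gates:>6} gates -> success probability ~ {p:.3g}")
```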