
An international research team has shown that phonons, the quantum particles behind material vibrations, can be classified using topology, much like electronic bands in materials. This breakthrough could lead to the development of new materials with unique thermal, electrical, and mechanical properties, enhancing our understanding and manipulation of solid-state physics.

An international group of researchers has found that quantum particles, which play a key role in the vibrations of materials affecting their stability and other characteristics, can be classified through topology. Known as phonons, these particles represent the collective vibrational patterns of atoms within a crystal structure. They create disturbances that spread like waves to nearby atoms. Phonons are crucial for several properties of solids, such as thermal and electrical conductivity, neutron scattering, and quantum states including charge density waves and superconductivity.

The spectrum of phonons—essentially the energy as a function of momentum—and their wave functions, which represent their probability distribution in real space, can be computed using ab initio (first-principles) codes. However, these calculations have so far lacked a unifying principle. For the quantum behavior of electrons, topology—a branch of mathematics—has successfully classified the electronic bands in materials, revealing that materials which appear quite different can in fact share the same underlying topological character.
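As a toy illustration of what such a spectrum looks like, the textbook dispersion relation of a one-dimensional monatomic chain can be evaluated directly. The parameter values below are arbitrary placeholders, not results from any ab initio calculation:

```python
import math

def phonon_dispersion(k, K=1.0, m=1.0, a=1.0):
    """Angular frequency omega(k) of a 1D monatomic chain:
    omega(k) = 2 * sqrt(K/m) * |sin(k*a/2)|, with spring constant K,
    atomic mass m, and lattice spacing a (all in hypothetical units)."""
    return 2.0 * math.sqrt(K / m) * abs(math.sin(k * a / 2.0))

# Sample the acoustic band across the first Brillouin zone, k in [-pi/a, pi/a].
band = [phonon_dispersion(-math.pi + 2 * math.pi * i / 100) for i in range(101)]
```

Real materials have several such bands (three acoustic plus optical branches), and it is the topology of how these bands connect and wind that the classification addresses.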

Strong-field quantum optics is a rapidly emerging research topic that merges elements of non-linear photoemission, rooted in strong-field physics, with the well-established realm of quantum optics. While the distribution of light particles (i.e., photons) has been widely documented for both classical and non-classical light sources, the impact of such distributions on photoemission processes remains poorly understood.
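The photon-number distributions mentioned above take simple closed forms for two textbook sources. The sketch below, with a hypothetical mean photon number, contrasts the Poissonian statistics of an ideal laser (coherent light) with the Bose-Einstein statistics of a thermal source:

```python
import math

def coherent_pn(n, mean):
    """Poissonian photon-number distribution of an ideal laser (coherent state):
    P(n) = exp(-mu) * mu^n / n!"""
    return math.exp(-mean) * mean**n / math.factorial(n)

def thermal_pn(n, mean):
    """Bose-Einstein photon-number distribution of a thermal (chaotic) source:
    P(n) = mu^n / (1 + mu)^(n+1)"""
    return mean**n / (1 + mean)**(n + 1)

mean = 2.0  # hypothetical mean photon number, for illustration only
coherent = [coherent_pn(n, mean) for n in range(30)]
thermal = [thermal_pn(n, mean) for n in range(30)]
```

Both distributions have the same mean here, yet the thermal one is far broader; it is precisely such statistical differences whose imprint on photoemission remains an open question.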

Perturbative expansion is a valuable mathematical technique that is widely used to break down descriptions of complex quantum systems into simpler, more manageable parts. Perhaps most importantly, it has enabled the development of quantum field theory (QFT): a theoretical framework that combines principles from classical, quantum, and relativistic physics, and serves as the foundation of the Standard Model of particle physics.
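A minimal sketch of the idea, using a hypothetical two-level Hamiltonian H = H0 + λV: the exact ground-state energy of the full system is compared against its second-order perturbative expansion. All parameter values are invented for illustration:

```python
import math

# Hypothetical two-level system: unperturbed energies E0 < E1,
# off-diagonal coupling v, and a small expansion parameter lam.
E0, E1, v, lam = 0.0, 1.0, 0.3, 0.1

# Exact ground-state energy from diagonalizing the 2x2 Hamiltonian
# [[E0, lam*v], [lam*v, E1]].
avg, half_gap = (E0 + E1) / 2.0, (E1 - E0) / 2.0
exact = avg - math.sqrt(half_gap**2 + (lam * v)**2)

# Second-order perturbative estimate: E0 + (lam*v)^2 / (E0 - E1).
# The first-order correction vanishes because V is purely off-diagonal.
perturbative = E0 + (lam * v)**2 / (E0 - E1)
```

For small λ the truncated series tracks the exact answer closely; QFT applies the same logic, order by order, to systems with infinitely many degrees of freedom.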

For quantum computers to go from research curiosities to practically useful devices, researchers need to get their errors under control. New research from Microsoft and Quantinuum has now taken a major step in that direction.

Today’s quantum computers are stuck firmly in the “noisy intermediate-scale quantum” (NISQ) era. While companies have had some success stringing large numbers of qubits together, those qubits remain highly susceptible to noise, which can quickly degrade their quantum states. This makes it impossible to carry out computations with enough steps to be practically useful.

While some have claimed that these noisy devices could still be put to practical use, the consensus is that quantum error correction schemes will be vital for the technology to realize its full potential. But error correction is difficult in quantum computers because directly reading the quantum state of a qubit causes it to collapse.
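The standard workaround is to measure parities of groups of qubits rather than the qubits themselves. A classical sketch of the simplest such scheme, the three-bit repetition code, shows how a "syndrome" pinpoints a single error without ever reading the encoded bit directly. This is an illustrative analogy only, not the scheme Microsoft and Quantinuum actually used:

```python
def encode(bit):
    """Encode one logical bit into three physical bits (bit-flip repetition code)."""
    return [bit, bit, bit]

def syndrome(bits):
    """Parity checks on pairs (0,1) and (1,2). In a real quantum code these
    parities are extracted via ancilla qubits, so the encoded state itself
    is never measured directly."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Locate and undo a single bit-flip error from the syndrome alone."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(bits))
    if flip is not None:
        bits[flip] ^= 1
    return bits

noisy = encode(1)
noisy[2] ^= 1  # inject a single bit-flip error
recovered = correct(noisy)
```

Quantum codes generalize this: they must guard against phase errors as well as bit flips, and the parity measurements must themselves be performed without disturbing the logical state.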

Genomics is revolutionizing medicine and science, but current approaches still struggle to capture the breadth of human genetic diversity. Pangenomes that incorporate many people’s DNA could be the answer, and a new project thinks quantum computers will be a key enabler.

When the Human Genome Project published its first reference genome in 2001, it was based on DNA from just a handful of humans. While less than one percent of our DNA varies from person to person, this can still leave important gaps and limit what we can learn from genomic analyses.

That’s why the concept of a pangenome has become increasingly popular. This refers to a collection of genomic sequences from many different people that have been merged to cover a much greater range of human genetic possibilities.
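One naive way to picture this merging, purely illustrative and not the project's actual method, is to pool the k-mers (short overlapping substrings) of several hypothetical sequences, so that variation present in any one individual survives in the combined collection:

```python
def kmers(seq, k=4):
    """All overlapping substrings of length k in a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Hypothetical short sequences from three individuals; the pooled
# pangenome keeps every k-mer seen in any of them.
genomes = ["ACGTACGT", "ACGTTCGT", "ACGAACGT"]
pangenome = set().union(*(kmers(g) for g in genomes))

# A single reference genome would miss the k-mers carrying the
# other individuals' variants.
reference_only = kmers(genomes[0])
novel = pangenome - reference_only
```

Real pangenomes use far richer graph structures than a flat k-mer pool, which is part of why assembling and querying them is so computationally demanding.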

The tension between quantum mechanics and relativity has long been a central split in modern physics, and developing a theory of quantum gravity that reconciles the two remains one of the discipline's great outstanding challenges. But as we collect more data, it sheds more light on potential solutions, even when some of that data delivers negative results.

That happened recently with a review of data collected at IceCube, a neutrino detector located in the Antarctic ice sheet, and compiled by researchers at the University of Texas at Arlington. They looked for signs that gravity could vary even a minuscule amount based on quantum mechanical fluctuations. And, to put it bluntly, they didn’t find any evidence of that happening.

To check for these minuscule fluctuations, they analyzed more than 300,000 neutrinos that IceCube had captured. IceCube is an impressive engineering feat, with thousands of sensors buried throughout roughly a cubic kilometer of Antarctic ice. When a sensor is triggered by one of the roughly hundred trillion neutrinos passing through the detector every second, the data can be checked for signs that the neutrino was affected by perturbations in the local gravity along its path.