Archive for the ‘computing’ category: Page 355

Nov 14, 2021

Physicists develop a device that could provide conclusive evidence for the existence (or not) of non-Abelian anyons

Posted by in categories: computing, particle physics, quantum physics

What kinds of ‘particles’ are allowed by nature? The answer lies in the theory of quantum mechanics, which describes the microscopic world.

In a bid to stretch the boundaries of our understanding of the world, UC Santa Barbara researchers have developed a device that could prove the existence of non-Abelian anyons, a type of particle that has been mathematically predicted to exist in two-dimensional space but has so far not been conclusively observed. The existence of these particles would pave the way toward major advances in topological quantum computing.

In a study that appears in the journal Nature, physicist Andrea Young, his graduate student Sasha Zibrov and their colleagues have taken a leap toward finding conclusive evidence for non-Abelian anyons. Using graphene, an atomically thin material derived from graphite (a form of carbon), they developed an extremely low-defect, highly tunable device in which non-Abelian anyons should be much more accessible. First, a little background: In our three-dimensional universe, elementary particles can be either fermions or bosons: think electrons (fermions) or the Higgs (a boson).
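For context, the distinction between these particle types is usually phrased in terms of exchange statistics; the relations below are the standard textbook ones, not results specific to this study.

```latex
% Swapping two identical particles multiplies the two-particle wavefunction
% by a phase:
\psi(x_2, x_1) = +\,\psi(x_1, x_2) \quad \text{(bosons)}, \qquad
\psi(x_2, x_1) = -\,\psi(x_1, x_2) \quad \text{(fermions)}.
% In two dimensions an arbitrary phase e^{i\theta} is also allowed (Abelian
% anyons); for non-Abelian anyons an exchange instead applies a unitary matrix
% to a set of degenerate states, so the order in which exchanges happen matters.
```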

Nov 13, 2021

AR Pioneer Warns That Metaverse Could Make “Reality Disappear”

Posted by in categories: augmented reality, computing, virtual reality

An innovator in early AR systems has a dire prediction: the metaverse could change the fabric of reality as we know it.

Louis Rosenberg, a computer scientist and developer of the first functional AR system at the Air Force Research Laboratory, penned an op-ed in Big Think this weekend warning that the metaverse — an immersive VR and AR world currently being developed by The Company Formerly Known as Facebook — could create what sounds like a real-life cyberpunk dystopia.

“I am concerned about the legitimate uses of AR by the powerful platform providers that will control the infrastructure,” Rosenberg wrote in the essay.

Nov 13, 2021

Tiny chip provides a big boost in precision optics

Posted by in categories: computing, innovation

“If you want to measure something with very high precision, you almost always use an interferometer, because light makes for a very precise ruler,” says Jaime Cardenas, assistant professor of optics at the University of Rochester.

Now, the Cardenas Lab has created a way to make these optical workhorses even more useful and sensitive. Meiting Song, a Ph.D. student, has for the first time packaged an experimental way of amplifying interferometric signals—without a corresponding increase in extraneous, unwanted input, or “noise”—on a 1 mm by 1 mm integrated photonic chip. The breakthrough, described in Nature Communications, is based on a theory of weak value amplification with waveguides that was developed by Andrew Jordan, a professor of physics at Rochester, and students in his lab.
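For background, the weak value being amplified is the standard textbook quantity shown below; this is the general definition only, not the specific waveguide scheme described in the Nature Communications paper.

```latex
% Weak value of an observable \hat{A} for a system pre-selected in |\psi_i\rangle
% and post-selected in |\psi_f\rangle:
A_w = \frac{\langle \psi_f \,|\, \hat{A} \,|\, \psi_i \rangle}{\langle \psi_f \,|\, \psi_i \rangle}
% As the post-selected state becomes nearly orthogonal to the preparation,
% \langle \psi_f | \psi_i \rangle \to 0, |A_w| can far exceed the eigenvalue
% range of \hat{A} -- this is the sense in which a small signal is amplified.
```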

Nov 13, 2021

Crypto Miners Driving High Demand for AMD CPUs with Big L3 Caches

Posted by in categories: bitcoin, computing, cryptocurrencies, information science

Now that crypto miners and their scalping ilk have succeeded in taking all of our precious GPU stock, it appears they’re now setting their sights on one more thing gamers cherish: the AMD CPU supply. According to a report in the UK’s Bitcoin Press, part of the reason it’s so hard to find a current-gen AMD CPU for sale anywhere is a cryptocurrency named Raptoreum, which mines on the CPU instead of an ASIC or a GPU. Apparently, its mining is sped up significantly by the large L3 cache embedded in CPUs such as AMD’s Ryzen, Epyc, and Threadripper.

Raptoreum was designed as an anti-ASIC currency: its developers wanted to keep the more expensive hardware off their blockchain, believing it lowered profits for everyone. To accomplish this they chose the GhostRider mining algorithm, a combination of the CryptoNight and x16r algorithms, and threw in some unique code to make it heavily randomized, hence its preference for a large L3 cache.

In case you weren’t aware, AMD’s high-end CPUs have more cache than their Intel competitors, making them a hot item for miners of this specific currency. For example, a chip like the Threadripper 3990X has a chonky 256MB of L3 cache, but since that’s a $5,000 CPU, miners are settling for the still-beefy Ryzen chips. A CPU like the Ryzen 9 5900X has a generous 64MB of L3 cache, compared to just 30MB on Intel’s Alder Lake CPUs and just 16MB on Intel’s 11th-gen chips. Several models of AMD CPUs have this much cache too, not just the flagship silicon, including the previous-gen Ryzen 9 3900X. The more affordable models, such as the 5800X, have just 32MB of L3 cache, however.
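To make the cache argument concrete, here is a rough, hypothetical sketch; it is not Raptoreum’s miner or the GhostRider algorithm, just a toy that times pseudo-random reads over working sets sized around the cache figures quoted above, showing how a randomized, cache-bound workload slows once its data spills out of L3.

```python
# Toy illustration only: random reads over buffers of different sizes.
# Absolute numbers vary by machine and are blurred by Python/NumPy overhead;
# the point is the relative drop-off once the working set exceeds L3.
import time

import numpy as np


def random_read_rate(working_set_mb: int, reads: int = 2_000_000) -> float:
    """Return approximate random 8-byte reads per second over a buffer of the given size."""
    n = (working_set_mb * 1024 * 1024) // 8        # number of int64 elements
    buf = np.random.randint(0, 2**31, size=n, dtype=np.int64)
    idx = np.random.randint(0, n, size=reads)      # randomized access pattern
    start = time.perf_counter()
    _ = buf[idx].sum()                             # gather = scattered reads
    elapsed = time.perf_counter() - start
    return reads / elapsed


# Working-set sizes echoing the article: 16/30MB (Intel), 64MB (Ryzen 9), 256MB (Threadripper).
for mb in (16, 30, 64, 256):
    print(f"{mb:4d} MB working set: {random_read_rate(mb) / 1e6:6.1f} M reads/s")
```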

Nov 13, 2021

Video-level computer vision advances business insights

Posted by in categories: business, computing

Determine which video-level computer vision task you need to perform based on the insights you want to gain.

Nov 12, 2021

GPU-based quantum simulation on Google Cloud

Posted by in categories: computing, quantum physics

This instructs qsim to make use of its cuQuantum integration, which provides improved performance on NVIDIA GPUs. If you experience issues with this option, please file an issue on the qsim repository.
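The specific option referred to above is not shown in this excerpt; in qsimcirq it is normally supplied through QSimOptions, where (per the qsim documentation) use_gpu=True selects the GPU backend and gpu_mode=1 selects the cuQuantum (cuStateVec) integration. The snippet below is a minimal sketch of that setup, assuming a CUDA-capable Compute Engine VM with a GPU-enabled build of qsim.

```python
# Minimal sketch: run a small circuit on qsim's GPU backend with the
# cuQuantum (cuStateVec) integration requested via gpu_mode=1.
import cirq
import qsimcirq

qubits = cirq.LineQubit.range(2)
circuit = cirq.Circuit(
    cirq.H(qubits[0]),                 # put qubit 0 into superposition
    cirq.CNOT(qubits[0], qubits[1]),   # entangle the pair
    cirq.measure(*qubits, key="m"),
)

# use_gpu=True runs the GPU simulator; gpu_mode=1 asks for the cuQuantum path.
options = qsimcirq.QSimOptions(use_gpu=True, gpu_mode=1)
simulator = qsimcirq.QSimSimulator(qsim_options=options)

result = simulator.run(circuit, repetitions=1000)
print(result.histogram(key="m"))       # expect roughly even counts of 00 and 11
```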

After you finish, don’t forget to stop or delete your VM on the Compute Instances dashboard to prevent further billing.

Nov 12, 2021

Researchers achieve first quantum simulation of baryons

Posted by in categories: computing, particle physics, quantum physics

A team of researchers led by an Institute for Quantum Computing (IQC) faculty member performed the first-ever simulation of baryons (subatomic particles composed of three quarks) on a quantum computer.

With their results, the team has taken a step towards more complex quantum simulations that will allow scientists to study neutron stars, learn more about the earliest moments of the universe, and realize the revolutionary potential of quantum computers.

Nov 12, 2021

First ever simulation of baryons on a quantum computer

Posted by in categories: computing, quantum physics

The first-ever simulation of baryons on a quantum computer is reported by the University of Waterloo.

Nov 12, 2021

How Removing Cobalt From Batteries Can Make EVs Cheaper

Posted by in categories: computing, mobile phones, sustainability, transportation

Cobalt has been getting a lot of attention lately because it is one of the most expensive materials found in lithium-ion batteries, which power everything from laptops and cell phones to electric vehicles. Cobalt extraction is largely concentrated in the Democratic Republic of Congo, where it is linked to human rights abuses and child labor, while cobalt refinement is almost exclusively done in China, making cobalt part of a tenuous supply chain. These are some of the reasons why battery manufacturers like Samsung and Panasonic and carmakers like Tesla and VW, along with a number of startups, are working to eliminate cobalt from lithium-ion batteries completely.

Nov 12, 2021

These new WD 20TB hard drives could hold your entire Steam collection

Posted by in category: computing

That’s a lot of data.


OptiNAND HDDs store metadata in a flash cache, which promises to deliver better performance, higher capacities, and improved reliability.