Archive for the ‘supercomputing’ category: Page 35

Sep 25, 2022

Bubbles hold clue to improved industrial structures

Posted by in category: supercomputing

Insights into how minute, yet powerful, bubbles form and collapse on underwater surfaces could help make industrial structures such as ship propellers more hardwearing, research suggests.

Supercomputer calculations have revealed details of the growth of so-called nanobubbles, which are tens of thousands of times smaller than a pin head.

The findings could lend valuable insight into damage caused on industrial structures, such as pump components, when these bubbles burst to release tiny but powerful jets of liquid.

Sep 24, 2022

PsiQuantum Has A Goal For Its Million Qubit Photonic Quantum Computer To Outperform Every Supercomputer On The Planet

Posted by in categories: quantum physics, supercomputing

The founders all believed that the traditional method of building a quantum computer of a useful size would take too long. At the company’s inception, the PsiQuantum team established its goal to build a million qubit, fault-tolerant photonic quantum computer. They also believed the only way to create such a machine was to manufacture it in a semiconductor foundry.

Early alerts

PsiQuantum first popped up on my quantum radar about two years ago when it received $150 million in Series C funding which upped total investments in the company to $215 million.

Sep 22, 2022

China Launches World’s Fastest Quantum Computers | China’s Advancement In Quantum Computers #techno

Posted by in categories: government, mathematics, quantum physics, supercomputing

https://www.youtube.com/watch?v=slEceKBmqts

“Techno Jungles”

Sep 12, 2022

Most Powerful Supercomputer — SURPASSES The HUMAN BRAIN (64 EXAFLOPS)

Posted by in categories: biotech/medical, robotics/AI, supercomputing

The most powerful exascale supercomputer is set for release in 2021 and will deliver a total of 64 exaflops, more than six times the performance of the Leonardo supercomputer, which is also due to launch this year.
This is accomplished with the help of a new type of processor technology from Tachyum called “Prodigy,” described as the first universal processor.

This new processor is set to enable general artificial intelligence at the speed of the human brain in real time. It is claimed to be many times faster than the fastest Intel Xeon, Nvidia graphics card, or Apple silicon chip. This new supercomputer is expected to enable simulations of the brain, medicine, and more that were previously thought impossible.

Sep 11, 2022

No knowledge, only intuition!

Posted by in categories: big data, complex systems, computing, innovation, internet, life extension, lifeboat, machine learning, posthumanism, robotics/AI, science, singularity, supercomputing, transhumanism

Article originally published on LINKtoLEADERS under the Portuguese title “Sem saber ler nem escrever!”

In the 80s, “with no knowledge, only intuition”, I discovered the world of computing. I believed computers could do everything, as if they were an electronic God. But when I asked the TIMEX Sinclair 1000 to draw the planet Saturn (I am fascinated by this planet, maybe because it has rings), I only glimpsed a strange message on the black-and-white TV:

0/0

Sep 3, 2022

Quantum Matter Is Being Studied At A Temperature 3 Billion Times Colder Than Deep Space

Posted by in categories: particle physics, quantum physics, space, supercomputing

A team of Japanese and US physicists has pushed thousands of ytterbium atoms to within a billionth of a degree above absolute zero to understand how matter behaves at these extreme temperatures. The approach treats the atoms as fermions, the class of particles, like electrons and protons, that cannot collapse into the so-called fifth state of matter at those extreme temperatures: a Bose-Einstein condensate.

When fermions are cooled down this far, they exhibit quantum properties in a way that we cannot simulate even with the most powerful supercomputers. These extremely cold atoms are placed in a lattice, where they simulate a “Hubbard model,” which is used to study the magnetic and superconducting behavior of materials, in particular the collective motion of electrons through them.

The symmetry of these models is known as the special unitary group, or SU(N), where N depends on the number of possible spin states. In the case of ytterbium, that number is 6. Calculating the behavior of just 12 particles in an SU(6) Hubbard model is already beyond classical computers. However, as reported in Nature Physics, the team used laser cooling to reduce the temperature of 300,000 atoms to a value almost three billion times colder than the temperature of outer space.
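For readers curious about the model itself, the SU(N) Fermi-Hubbard Hamiltonian referenced above is conventionally written in the following standard textbook form (this is the generic model, not an equation taken from the Nature Physics paper):

```latex
H = -t \sum_{\langle i,j \rangle} \sum_{\sigma=1}^{N}
    \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
  + \frac{U}{2} \sum_{i} n_i \left( n_i - 1 \right),
\qquad n_i = \sum_{\sigma=1}^{N} c^{\dagger}_{i\sigma} c_{i\sigma}
```

Here t is the hopping amplitude between neighboring lattice sites, U is the on-site repulsion, and σ runs over the N spin states; for ytterbium-173, with nuclear spin I = 5/2, the number of spin states is 2I + 1 = 6, giving the SU(6) symmetry mentioned above.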

Sep 2, 2022

Revolutionizing image generation through AI: Turning text into images

Posted by in categories: information science, robotics/AI, supercomputing

Creating images from text in seconds—and doing so with a conventional graphics card and without supercomputers? As fanciful as it may sound, this is made possible by the new Stable Diffusion AI model. The underlying algorithm was developed by the Machine Vision & Learning Group led by Prof. Björn Ommer (LMU Munich).

“Even for laypeople not blessed with artistic talent and without special computing know-how, the new model is an effective tool that enables computers to generate images on command. As such, the model removes a barrier to expressing their creativity,” says Ommer. But there are benefits for seasoned artists as well, who can use Stable Diffusion to quickly convert new ideas into a variety of graphic drafts. The researchers are convinced that such AI-based tools will be able to expand the possibilities of creative image generation with paintbrush and Photoshop as fundamentally as computer-based word processing revolutionized writing with pens and typewriters.

In their project, the LMU scientists had the support of the start-up Stability AI, on whose servers the AI model was trained. “This additional computing power and the extra training examples turned our AI model into one of the most powerful image synthesis algorithms,” says the computer scientist.

Aug 31, 2022

Making Computer Chips Act More like Brain Cells

Posted by in categories: biological, chemistry, neuroscience, supercomputing

The human brain is an amazing computing machine. Weighing only three pounds or so, it can process information a thousand times faster than the fastest supercomputer, store a thousand times more information than a powerful laptop, and do it all using no more energy than a 20-watt lightbulb.

Researchers are trying to replicate this success using soft, flexible organic materials that can operate like biological neurons and someday might even be able to interconnect with them. Eventually, soft “neuromorphic” computer chips could be implanted directly into the brain, allowing people to control an artificial arm or a computer monitor simply by thinking about it.

Like real neurons — but unlike conventional computer chips — these new devices can send and receive both chemical and electrical signals. “Your brain works with chemicals, with neurotransmitters like dopamine and serotonin. Our materials are able to interact electrochemically with them,” says Alberto Salleo, a materials scientist at Stanford University who wrote about the potential for organic neuromorphic devices in the 2021 Annual Review of Materials Research.

Aug 30, 2022

ROBE Array could let small companies access popular form of AI

Posted by in categories: information science, robotics/AI, supercomputing

A breakthrough low-memory technique by Rice University computer scientists could put one of the most resource-intensive forms of artificial intelligence—deep-learning recommendation models (DLRM)—within reach of small companies.

DLRM recommendation systems are a popular form of AI that learns to make suggestions users will find relevant. But with top-of-the-line training models requiring more than a hundred terabytes of memory and supercomputer-scale processing, they’ve only been available to a short list of technology giants with deep pockets.

Rice’s “random offset block embedding,” or ROBE Array, could change that. It’s an algorithmic approach for slashing the size of DLRM memory structures called embedding tables, and it will be presented this week at the Conference on Machine Learning and Systems (MLSys 2022) in Santa Clara, California, where it earned Outstanding Paper honors.
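The core idea, replacing a huge embedding table with lookups into one small shared parameter array via hashing, can be sketched in a few lines. This is a toy illustration under our own simplifying assumptions (the multiplicative hash and the array size are arbitrary choices for the example), not the actual ROBE Array implementation:

```python
import numpy as np

class ROBESketch:
    """Toy sketch of a random-offset block embedding.

    All rows share one small parameter array; each row's vector is
    a contiguous block starting at a hashed offset (with wraparound),
    so memory is array_size floats instead of vocab * dim.
    """

    def __init__(self, array_size, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.z = rng.standard_normal(array_size)  # shared parameters
        self.dim = dim
        self.size = array_size

    def lookup(self, row_id):
        # Multiplicative hash picks a pseudo-random offset; a real
        # system would use a proper universal hash family.
        offset = (row_id * 2654435761) % self.size
        idx = (offset + np.arange(self.dim)) % self.size  # circular block
        return self.z[idx]

emb = ROBESketch(array_size=1000, dim=16)
v1 = emb.lookup(7)
v2 = emb.lookup(7)  # same row id always maps to the same vector
```

A full table for a million rows at dimension 16 would hold 16 million parameters; in this sketch every row is served from the same 1,000 shared values, which is the kind of compression that puts DLRM training within reach of modest hardware.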

Aug 28, 2022

Inside Tesla’s Innovative And Homegrown “Dojo” AI Supercomputer

Posted by in categories: military, nuclear weapons, robotics/AI, space travel, supercomputing

How expensive and difficult does hyperscale-class AI training have to be for a maker of self-driving electric cars to take a side excursion to spend how many hundreds of millions of dollars to go off and create its own AI supercomputer from scratch? And how egotistical and sure would the company’s founder have to be to put together a team that could do it?

Like many questions, when you ask these precisely, they tend to answer themselves. And what is clear is that Elon Musk, founder of both SpaceX and Tesla as well as a co-founder of the OpenAI consortium, doesn’t have time – or money – to waste on science projects.
