
NVIDIA introduces QODA, a new platform for hybrid quantum-classical computing, enabling easy programming of integrated CPU, GPU, and QPU systems.


The past decade has seen quantum computing leap out of academic labs and into the mainstream. Efforts to build better quantum computers are proliferating at both startups and large companies. And while it is still unclear how far away we are from achieving quantum advantage on common problems, it is clear that now is the time to build the tools needed to deliver valuable quantum applications.

To start, we need to make progress in our understanding of quantum algorithms. Last year, NVIDIA announced cuQuantum, a software development kit (SDK) for accelerating simulations of quantum computing. Simulating quantum circuits using cuQuantum on GPUs enables algorithms research with performance and scale far beyond what can be achieved on quantum processing units (QPUs) today. This is paving the way for breakthroughs in understanding how to make the most of quantum computers.
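To make the idea of "simulating quantum circuits" concrete, here is a toy statevector simulator in NumPy. This is not cuQuantum's API; it only sketches, under simplified assumptions, the kind of linear-algebra workload that cuQuantum accelerates on GPUs at far larger qubit counts.

```python
import numpy as np

# Toy statevector simulation: a 2-qubit Bell-state circuit.
# The helper functions below are illustrative, not cuQuantum's API.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I2 = np.eye(2)

def apply_gate(state, gate, qubit, n_qubits):
    """Apply a single-qubit gate to `qubit` of an n-qubit statevector."""
    ops = [gate if q == qubit else I2 for q in range(n_qubits)]
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

def apply_cnot(state, control, target, n_qubits):
    """Apply a CNOT by permuting basis amplitudes."""
    new = state.copy()
    for i in range(2 ** n_qubits):
        if (i >> (n_qubits - 1 - control)) & 1:
            j = i ^ (1 << (n_qubits - 1 - target))
            new[i] = state[j]
    return new

# H on qubit 0, then CNOT(0 -> 1) produces the Bell state (|00> + |11>)/sqrt(2).
state = np.zeros(4)
state[0] = 1.0
state = apply_gate(state, H, 0, 2)
state = apply_cnot(state, 0, 1, 2)
probs = np.abs(state) ** 2
print(probs)  # ~[0.5, 0, 0, 0.5]: only |00> and |11> outcomes
```

Building the full 2^n-dimensional operator with Kronecker products, as done here, scales terribly; production simulators apply gates in place, which is exactly the kind of kernel GPUs excel at.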

In addition to improving quantum algorithms, we also need to use QPUs to their fullest potential alongside classical computing resources: CPUs and GPUs. Today, NVIDIA is announcing the launch of Quantum Optimized Device Architecture (QODA), a platform for hybrid quantum-classical computing with the mission of enabling this utility.
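The hybrid workflow QODA targets follows a common variational pattern: a classical optimizer running on CPUs/GPUs proposes circuit parameters, and a QPU (or simulator) returns an expectation value. The sketch below illustrates that loop with a one-qubit analytic stand-in for the QPU; it is a generic illustration of the pattern, not QODA's actual API.

```python
import math

# Hybrid quantum-classical loop: classical gradient descent over a
# parameter theta, with a "QPU" that returns an expectation value.
# The QPU here is an analytic stand-in: <Z> after RY(theta)|0> is
# cos(theta). A real device would estimate this from measurements.

def qpu_expectation(theta):
    return math.cos(theta)

theta, lr = 0.1, 0.4
for _ in range(200):
    # Finite-difference gradient, since the "QPU" is a black box.
    grad = (qpu_expectation(theta + 1e-5) - qpu_expectation(theta - 1e-5)) / 2e-5
    theta -= lr * grad

print(theta, qpu_expectation(theta))  # theta -> ~pi, energy -> ~-1
```

The optimizer drives theta toward pi, where the expectation value reaches its minimum of -1; real variational algorithms (VQE, QAOA) follow this same classical-propose / quantum-evaluate cycle at much larger scale.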

TensorFlow.NET is a library that provides a .NET Standard binding for TensorFlow. It allows .NET developers to design, train, and deploy machine learning models, including neural networks. TensorFlow.NET also lets developers leverage various machine learning models and access the programming resources offered by TensorFlow.

TensorFlow

TensorFlow is an open-source framework for numerical computing developed by Google scientists and engineers. It comprises a set of tools for designing, training, and fine-tuning neural networks. TensorFlow's flexible architecture makes it possible to deploy computations across one or more processors (CPUs) or graphics cards (GPUs) on a personal computer or server without rewriting code.
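At its core, "training" means adjusting weights to reduce a loss via gradients, which TensorFlow (and TensorFlow.NET) automates with automatic differentiation across CPUs and GPUs. The NumPy toy below shows that computation by hand for a single linear neuron; it is a minimal sketch of the idea, not TensorFlow code.

```python
import numpy as np

# Fit a single linear neuron y = w*x + b to the target y = 2x + 1
# using hand-written gradient descent on mean squared error.
# Frameworks like TensorFlow derive these gradients automatically.

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 2.0 * x + 1.0                      # target function to learn

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    pred = w * x + b
    err = pred - y
    # Gradients of mean squared error with respect to w and b.
    w -= lr * 2 * np.mean(err * x)
    b -= lr * 2 * np.mean(err)

print(round(w, 3), round(b, 3))  # converges close to 2.0 and 1.0
```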

Machine learning is transforming all areas of biological science and industry, but is typically limited to a few users and scenarios. A team of researchers at the Max Planck Institute for Terrestrial Microbiology led by Tobias Erb has developed METIS, a modular software system for optimizing biological systems. The research team demonstrates its usability and versatility with a variety of biological examples.

Though engineering of biological systems is indispensable in biotechnology, machine learning has now become useful across all fields of biology. However, applying and improving these algorithms (computational procedures made of lists of instructions) is not easily accessible: researchers are limited not only by programming skills but often also by insufficient experimentally labeled data. At the intersection of computational and experimental work, there is a need for efficient approaches that bridge the gap between machine learning algorithms and their application to biological systems.

Now a team at the Max Planck Institute for Terrestrial Microbiology led by Tobias Erb has succeeded in democratizing machine learning. In a recent publication in Nature Communications, the team, together with collaboration partners from the INRAe Institute in Paris, presented their tool METIS. The application is built with such a versatile and modular architecture that it requires no computational skills and can be applied to different biological systems and with different lab equipment. METIS is short for Machine-learning guided Experimental Trials for Improvement of Systems, and is also named after Μῆτις, the ancient Greek goddess of wisdom and crafts, whose name means "wise counsel."

A Swedish researcher tasked an AI algorithm to write an academic paper about itself. The paper is now undergoing a peer-review process.

Almira Osmanovic Thunstrom has said she "stood in awe" as OpenAI's artificial intelligence algorithm, GPT-3, started generating text for a 500-word thesis about itself, complete with scientific references and citations.

“It looked like any other introduction to a fairly good scientific publication,” she said in an editorial piece published by Scientific American. Thunstrom then asked her adviser at the University of Gothenburg, Steinn Steingrimsson, whether she should take the experiment further and try to complete and submit the paper to a peer-reviewed journal.

But Cabello and others are interested in investigating a lesser-known but equally magical aspect of quantum mechanics: contextuality. Contextuality says that properties of particles, such as their position or polarization, exist only within the context of a measurement. Instead of thinking of particles’ properties as having fixed values, consider them more like words in language, whose meanings can change depending on the context: “Time flies like an arrow. Fruit flies like bananas.”

Although contextuality has lived in nonlocality’s shadow for over 50 years, quantum physicists now consider it more of a hallmark feature of quantum systems than nonlocality is. A single particle, for instance, is a quantum system “in which you cannot even think about nonlocality,” since the particle is only in one location, said Bárbara Amaral, a physicist at the University of São Paulo in Brazil. “So [contextuality] is more general in some sense, and I think this is important to really understand the power of quantum systems and to go deeper into why quantum theory is the way it is.”
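Contextuality can be made concrete with the standard Peres–Mermin "magic square" of two-qubit observables: every row of the square multiplies to +I, the first two columns multiply to +I, but the third column multiplies to -I. No assignment of fixed ±1 values to the nine observables can satisfy all six constraints at once, so their values cannot exist independently of the measurement context. The square itself is textbook material; the NumPy check below is just an illustration.

```python
import numpy as np

# Peres-Mermin magic square: nine two-qubit observables whose row
# products are all +I while the third column product is -I. This rules
# out any context-independent assignment of +/-1 outcomes.

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

square = [
    [np.kron(X, I), np.kron(I, X), np.kron(X, X)],
    [np.kron(I, Y), np.kron(Y, I), np.kron(Y, Y)],
    [np.kron(X, Y), np.kron(Y, X), np.kron(Z, Z)],
]

I4 = np.eye(4)
row_products = [row[0] @ row[1] @ row[2] for row in square]
col_products = [square[0][c] @ square[1][c] @ square[2][c] for c in range(3)]

rows_ok = all(np.allclose(p, I4) for p in row_products)
cols_ok = [np.allclose(col_products[c], I4) for c in range(2)]
last_col_minus = np.allclose(col_products[2], -I4)
print(rows_ok, cols_ok, last_col_minus)  # True [True, True] True
```

Multiplying the ±1 constraints for all rows forces the product of all nine values to be +1, while the column constraints force it to be -1: a contradiction for any fixed assignment, but not for quantum measurements.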

Researchers have also found tantalizing links between contextuality and problems that quantum computers can efficiently solve that ordinary computers cannot; investigating these links could help guide researchers in developing new quantum computing approaches and algorithms.

The US Department of Commerce’s National Institute of Standards and Technology (NIST) has selected the first-ever group of encryption tools that could potentially withstand the attack of a quantum computer.

The four selected encryption algorithms will now reportedly become part of NIST’s post-quantum cryptographic (PQC) standard, which should be finalized in about two years.

More specifically, for general encryption (used for access to secure websites), NIST has selected the CRYSTALS-Kyber algorithm.

The future is now!


Technology continues to move forward at incredible speeds and it seems like every week we learn about a new breakthrough that changes our minds about what is possible.

Researchers in Toronto used a photonic quantum computer chip to solve a sampling problem far beyond the reach of the fastest classical computers and algorithms.

The paper the researchers published reports that the Borealis quantum chip took only 36 microseconds to solve a problem that would take supercomputers running the best known algorithms an estimated 9,000 years.
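The scale of that claim is easier to grasp as a ratio. Taking the paper's 9,000-year estimate at face value and converting it to seconds, the quoted figures imply a speedup on the order of 10^15:

```python
# Rough arithmetic behind the Borealis claim: 36 microseconds versus an
# estimated 9,000 supercomputer-years. The 9,000-year figure is the
# paper's estimate; this just converts it to a speedup factor.

seconds_per_year = 365.25 * 24 * 3600
classical_seconds = 9000 * seconds_per_year   # ~2.8e11 seconds
quantum_seconds = 36e-6

speedup = classical_seconds / quantum_seconds
print(f"{speedup:.1e}")  # roughly 8e15
```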

Neuromorphic photonics/electronics is the future of ultralow-energy intelligent computing and artificial intelligence (AI). In recent years, inspired by the human brain, artificial neuromorphic devices have attracted extensive attention, especially for simulating visual perception and memory storage. Because of their advantages of high bandwidth, strong interference immunity, ultrafast signal transmission, and low energy consumption, neuromorphic photonic devices are expected to realize real-time responses to input data. In addition, photonic synapses enable a non-contact writing strategy, which contributes to the development of wireless communication.

The use of low-dimensional materials provides an opportunity to develop complex brain-like systems and low-power memory-logic computers. For example, large-scale, uniform, and reproducible transition metal dichalcogenides (TMDs) show great potential for miniaturization and low-power biomimetic device applications due to their excellent charge-trapping properties and compatibility with traditional CMOS processes. The von Neumann architecture, with its discrete memory and processor, leads to high power consumption and low efficiency in traditional computing. Therefore, neuromorphic architectures that fuse sensing with memory, or integrate sensing, memory, and processing, can meet the growing demands of big data and AI for low-power, high-performance devices. Artificial synaptic devices are the most important components of neuromorphic systems, and evaluating their performance will help to further apply them in more complex artificial neural networks (ANNs).

Chemical vapor deposition (CVD)-grown TMDs inevitably contain defects or impurities, which give rise to a persistent photoconductivity (PPC) effect. TMD photonic synapses that integrate synaptic properties with optical detection capabilities show great advantages in neuromorphic systems for low-power visual information perception and processing, as well as brain-like memory.
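The synaptic behavior described here can be caricatured in a few lines: light pulses raise the device's conductance, which then relaxes slowly rather than instantly, which is the persistent-photoconductivity signature that lets a photonic synapse "remember" recent inputs. All time constants below are arbitrary illustration values, not measured device parameters.

```python
import math

# Toy model of persistent photoconductivity (PPC) in a photonic synapse:
# each light pulse boosts conductance; between pulses it decays slowly
# toward baseline, so recent stimulation leaves a lingering trace.
# All constants are arbitrary, chosen only to illustrate the dynamics.

def simulate(pulse_times, t_end, dt=0.01, boost=1.0, tau=5.0):
    """Return the conductance trace under light pulses with slow decay."""
    g, trace = 0.0, []
    pulses = set(round(t / dt) for t in pulse_times)
    for step in range(int(t_end / dt)):
        if step in pulses:
            g += boost                      # photo-induced potentiation
        g *= math.exp(-dt / tau)            # slow relaxation (PPC)
        trace.append(g)
    return trace

trace = simulate(pulse_times=[1.0, 1.5, 2.0], t_end=10.0)
# Conductance just after the pulse train exceeds the value ~5 s later,
# yet a clear trace remains long after the light is off because tau is
# long compared with the pulse spacing.
after_train = trace[int(2.1 / 0.01)]
later = trace[int(7.0 / 0.01)]
print(round(after_train, 3), round(later, 3))
```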

Making pizza is not rocket science, but for one actual rocket scientist, it now is. Benson Tsai is a former SpaceX employee who is now using his skills to launch a new venture: Stellar Pizza, a fully automated, mobile pizza delivery service. When a customer places an order on an app, an algorithm decides when to start making the pizza based on how long it will take to get to the delivery address. Inside Edition Digital's Mara Montalbano has more.