
Hyperparameter tuning is important for machine-learning algorithms. Hyperparameters are set before the learning process begins and sit outside the model itself, yet they shape the model's overall performance. Without tuning, a model can produce errors and inaccurate results because its loss function is not properly minimized.

Hyperparameter tuning is about finding the set of hyperparameter values that maximizes the model’s performance, minimizes loss and produces better outputs.
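As a concrete illustration (not part of the article above), here is a minimal sketch of hyperparameter tuning using scikit-learn's GridSearchCV; the dataset, model and parameter grid are illustrative assumptions:

```python
# Minimal sketch: hyperparameter tuning with scikit-learn's GridSearchCV.
# The dataset, model and parameter grid are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Hyperparameters are fixed before training; the search tries each combination
# and scores it with cross-validation.
param_grid = {
    "C": [0.1, 1, 10],        # regularization strength
    "gamma": [0.01, 0.1, 1],  # RBF kernel width
}

search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)

print("Best hyperparameters:", search.best_params_)
print("Best cross-validated accuracy:", search.best_score_)
```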

In recent years, roboticists have been trying to improve how robots interact with different objects found in real-world settings. While some of these efforts have yielded promising results, the manipulation skills of most existing robotic systems still lag behind those of humans.

Fabrics are among the types of objects that have proved most challenging for robots to interact with. The main reasons for this are that pieces of cloth and other fabrics can be stretched, moved and folded in different ways, which can result in complex material dynamics and self-occlusions.

Researchers at Carnegie Mellon University’s Robotics Institute have recently proposed a new computational technique that could allow robots to better understand and handle fabrics. This technique, introduced in a paper set to be presented at the International Conference on Intelligent Robots and Systems and pre-published on arXiv, is based on the use of a tactile sensor and a simple machine-learning algorithm, known as a classifier.
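The article does not include code, but the classifier idea can be sketched. Below is a hypothetical example: a logistic-regression classifier that predicts from tactile feature vectors whether a grasp succeeded. The feature dimensions, data and notion of "success" here are made-up assumptions for illustration, not the paper's setup:

```python
# Hypothetical sketch: a binary classifier on tactile features.
# Feature vectors and labels are synthetic stand-ins, not the paper's data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each grasp yields a 5-D tactile reading; label 1 = correct grasp.
n = 200
correct = rng.normal(loc=1.0, scale=0.3, size=(n, 5))
wrong = rng.normal(loc=0.4, scale=0.3, size=(n, 5))
X = np.vstack([correct, wrong])
y = np.array([1] * n + [0] * n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print("Held-out accuracy:", clf.score(X_test, y_test))
```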

Unstable black holes would require a rewrite of Einstein’s gravitational theory.

An international group of scientists has finally proved that slowly rotating Kerr black holes are stable, according to a report from Quanta Magazine.

In 1963, mathematician Roy Kerr found a solution to Einstein’s equations that accurately described the spacetime around what is now known as a rotating black hole.
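For reference, Kerr's solution is usually written in Boyer-Lindquist coordinates; this is the standard textbook form (with units G = c = 1, M the black hole's mass and a = J/M its spin parameter):

```latex
% Kerr metric in Boyer-Lindquist coordinates (G = c = 1),
% for a black hole of mass M and angular momentum J, with a = J/M.
\[
ds^2 = -\left(1 - \frac{2Mr}{\rho^2}\right) dt^2
       - \frac{4 M a r \sin^2\theta}{\rho^2}\, dt\, d\phi
       + \frac{\rho^2}{\Delta}\, dr^2
       + \rho^2\, d\theta^2
       + \left(r^2 + a^2 + \frac{2 M a^2 r \sin^2\theta}{\rho^2}\right)\sin^2\theta\, d\phi^2,
\]
\[
\text{where } \rho^2 = r^2 + a^2\cos^2\theta, \qquad \Delta = r^2 - 2Mr + a^2.
\]
```

The recent stability result concerns the slowly rotating case, i.e. small a/M.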


The solutions to Einstein’s equations that describe a spinning black hole won’t blow up, even when poked or prodded.

I have been invited to participate in a fairly large event in which some experts and I (allow me to not consider myself one) will discuss Artificial Intelligence and, in particular, the concept of Super Intelligence.

As it turns out, I recently came across this really interesting TED talk by Grady Booch, in perfect timing to prepare my talk.

No matter if you agree or disagree with Mr. Booch’s point of view, it is clear that today we are still living in the era of weak or narrow AI, very far from general AI, and even further from a potential Super Intelligence. Still, Machine Learning presents us with a great opportunity as of today: the opportunity to put algorithms to work together with humans to solve some of our biggest challenges: climate change, poverty, health and well-being, etc.

Near-term quantum computers, meaning quantum computers developed today or in the near future, could help tackle some problems more effectively than classical computers. One potential application is in physics, chemistry and materials science, where these computers could perform quantum simulations and determine the ground states of quantum systems.
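As a toy illustration of what "determining a ground state" means, the ground state of a small Hamiltonian is simply its lowest-energy eigenvector. The sketch below uses ordinary classical linear algebra on an assumed two-qubit Hamiltonian; it is shown only to make the concept concrete, not as the quantum or neural method from the paper:

```python
# Toy illustration: the ground state of a two-qubit Hamiltonian via exact
# diagonalization. Classical linear algebra, shown only to make the notion
# of "ground state" concrete; the Hamiltonian choice is an assumption.
import numpy as np

# Pauli matrices
I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)

def kron(*ops):
    out = np.array([[1.0]])
    for op in ops:
        out = np.kron(out, op)
    return out

# A transverse-field Ising Hamiltonian on two qubits: H = -ZZ - g(XI + IX)
g = 0.5
H = -kron(Z, Z) - g * (kron(X, I) + kron(I, X))

# eigh returns eigenvalues in ascending order; index 0 is the ground state.
energies, states = np.linalg.eigh(H)
print("Ground-state energy:", energies[0])
print("Ground state:", states[:, 0])
```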

Some quantum computers developed over the past few years have proved to be fairly effective at running quantum simulations. However, near-term quantum computing approaches are still limited by existing hardware components and by the adverse effects of background noise.

Researchers at 1QB Information Technologies (1QBit), the University of Waterloo and the Perimeter Institute for Theoretical Physics have recently developed neural error mitigation, a new strategy that could improve ground-state estimates attained using quantum simulations. This strategy, introduced in a paper published in Nature Machine Intelligence, is based on machine-learning algorithms.

Quantum Information Science / Quantum Computing (QIS / QC) continues to make substantial progress into 2023, with commercial applications on the way. Two terms come up often, and they are often interchanged: quantum advantage, where a quantum computer solves a difficult practical problem significantly faster than a classical computer, and quantum supremacy, where a quantum computer solves a test case (not a practical problem) that would take a classical computer such as a supercomputer thousands of years, or that lies beyond classical computing capabilities altogether. Claims of quantum advantage or quantum supremacy can, at times, be challenged by new algorithms on classical computers.

The potential is for hybrid systems combining quantum computers with classical computers such as supercomputers (and perhaps analog computing in the future) that could operate thousands, and potentially millions, of times faster, lending more understanding to intractable challenges and problems. Imagine the possibilities, and the implications for the benefit of Earth’s ecosystems and humankind, with significant impact across dozens of areas of computational science: big data analytics, weather forecasting, aerospace and novel transportation engineering, new energy paradigms such as renewable energy, healthcare and drug discovery, omics (genomics, transcriptomics, proteomics, metabolomics), economics, AI, large-scale simulations, financial services, new materials, optimization challenges… the list is endless.

The stakes in competitive and strategic advantage are so high that top corporations and governments are investing in and working with QIS / QC. (See my Forbes article: Government Deep Tech 2022 Top Funding Focus Explainable AI, Photonics, Quantum; the BDC Deep Tech Fund it covers invested in QC company Xanadu.) For the US, there is the 2018 National Quantum Initiative Act, worth USD $1.2 billion, and the related U.S. Department of Energy program providing USD $625 million over five years for five quantum information research hubs led by national laboratories: Argonne, Brookhaven, Fermi, Lawrence Berkeley and Oak Ridge. In August 2022, the US CHIPS and Science Act provided hundreds of millions in funding as well. Coverage includes: accelerating the discovery of quantum applications; growing a diverse and domestic quantum workforce; and developing critical infrastructure and standardization of cutting-edge R&D.

Color coding makes aerial maps much more easily understood. Through color, we can tell at a glance where there is a road, forest, desert, city, river or lake.

Working with several universities, the U.S. Department of Energy’s (DOE) Argonne National Laboratory has devised a method for creating color-coded graphs of large volumes of data from X-ray analysis. This new tool uses computational data sorting to find clusters related to physical properties, such as an atomic distortion in a material. It should greatly accelerate future research on structural changes on the atomic scale induced by varying temperature.

The research team published their findings in the Proceedings of the National Academy of Sciences in an article titled “Harnessing interpretable and unsupervised machine learning to address big data from modern X-ray diffraction.”
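The article does not show the team's code, but the core idea of unsupervised, color-coded clustering is easy to sketch. Below is a hypothetical example: grouping synthetic intensity-versus-temperature curves with k-means and assigning each cluster a display color, loosely analogous in spirit to sorting X-ray data. The data, curve shapes and cluster count are all assumptions:

```python
# Hypothetical sketch: unsupervised clustering of intensity-vs-temperature
# curves, then color-coding by cluster. Synthetic data; loosely analogous in
# spirit to the paper's approach, not a reproduction of it.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
temps = np.linspace(10, 300, 50)  # illustrative temperature grid (kelvin)

# Two synthetic behaviors: a peak that fades on warming, and a flat background.
fading = np.clip(1.0 - temps / 150.0, 0.0, None)  # order-parameter-like signal
flat = np.full_like(temps, 0.3)                   # temperature-independent background

curves = np.vstack(
    [fading + 0.05 * rng.normal(size=temps.size) for _ in range(100)]
    + [flat + 0.05 * rng.normal(size=temps.size) for _ in range(100)]
)

# Unsupervised clustering sorts the curves without any labels.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(curves)

# Assign each cluster a color, as on a color-coded map.
colors = np.array(["red", "blue"])[labels]
print("Cluster sizes:", np.bincount(labels))
print("First few colors:", colors[:5])
```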