
Researchers Shatter “Impassable Barrier” in Camera Technology

AI and nanotechnology converge in a metalens that rivals traditional optics. The discovery promises smaller, smarter imaging systems. Cameras have become a constant presence in daily life. Over the past two centuries, they have evolved from rare inventions into essential tools used across countless fields.

NVIDIA’s Next-Gen Rubin GPUs Have Reportedly Entered Production; Company Also Secures HBM4 Samples From All Major DRAM Manufacturers

NVIDIA’s next-generation Rubin GPUs have entered production, and the company has also secured samples of HBM4 memory from all major suppliers.

A few weeks ago, NVIDIA’s CEO, Jensen Huang, showcased the next-gen Vera Rubin Superchip for the first time at GTC 2025 in Washington. We got to see two massive GPUs stacked together with the next-generation Vera CPU, and loads of LPDDR memory around the perimeter of the board. The Vera Rubin Superchip will lay the framework for the next wave of AI computing in data centers, and recent reports suggest the production timeline is on track.

Saturday Citations: Black hole flare unprecedented; the strength of memories; bugs on the menu

This week, researchers reported finding a spider megacity in a sulfur cave on the Albania-Greece border, and experts say that you, personally, have to go live there. Economists are growing nervous about the collapse of the trillion-dollar AI bubble. And a new study links physical activity levels with the risk of digestive system cancers.

Additionally, astronomers reported the most massive and distant black hole flare ever observed; researchers determined why some memories are more vivid than others; and scientists are once again exploring farmed insects as a food source—this time, for lengthy interplanetary missions.

Introducing Nested Learning: A new ML paradigm for continual learning

The last decade has seen incredible progress in machine learning (ML), primarily driven by powerful neural network architectures and the algorithms used to train them. However, despite the success of large language models (LLMs), a few fundamental challenges persist, especially around continual learning, the ability for a model to actively acquire new knowledge and skills over time without forgetting old ones.

When it comes to continual learning and self-improvement, the human brain is the gold standard. It adapts through neuroplasticity — the remarkable capacity to change its structure in response to new experiences, memories, and learning. Without this ability, a person is limited to immediate context (like anterograde amnesia). We see a similar limitation in current LLMs: their knowledge is confined to either the immediate context of their input window or the static information that they learn during pre-training.

The simple approach, continually updating a model’s parameters with new data, often leads to “catastrophic forgetting” (CF), where learning new tasks sacrifices proficiency on old tasks. Researchers traditionally combat CF through architectural tweaks or better optimization rules. However, for too long, we have treated the model’s architecture (the network structure) and the optimization algorithm (the training rule) as two separate things, which prevents us from achieving a truly unified, efficient learning system.
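The effect described above is easy to reproduce in miniature. The sketch below (a hypothetical toy, not from the Nested Learning paper) trains a single shared parameter on task A, then naively continues training on task B, and shows that performance on task A degrades:

```python
# Toy demonstration of catastrophic forgetting: a one-parameter model
# y = w * x is trained on task A (true w = 2.0), then sequentially on
# task B (true w = -1.0). The shared weight is overwritten, so error
# on task A grows after learning task B.

def train(w, data, lr=0.1, epochs=100):
    """Plain SGD on squared error for the scalar model y = w * x."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

def loss(w, data):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

inputs = [-1.0, 0.5, 1.0, 2.0]
task_a = [(x, 2.0 * x) for x in inputs]   # task A: w should be 2.0
task_b = [(x, -1.0 * x) for x in inputs]  # task B: w should be -1.0

w = train(0.0, task_a)
loss_a_before = loss(w, task_a)  # near zero: task A is learned

w = train(w, task_b)             # naive sequential update on task B
loss_a_after = loss(w, task_a)   # large: task A has been forgotten

print(f"task A loss before B: {loss_a_before:.4f}, after B: {loss_a_after:.4f}")
```

Real networks share millions of parameters rather than one, but the failure mode is the same: gradients for the new task overwrite the weights that encoded the old one, which is why continual-learning research focuses on protecting or isolating that shared capacity.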

Magnetic materials discovered by AI could reduce rare earth dependence

Researchers at the University of New Hampshire have harnessed artificial intelligence to accelerate the discovery of new functional magnetic materials, creating a searchable database of 67,573 magnetic materials, including 25 previously unrecognized compounds that remain magnetic even at high temperatures.

“By accelerating the discovery of sustainable magnetic materials, we can reduce dependence on rare earth elements, lower the cost of electric vehicles and renewable-energy systems, and strengthen the U.S. manufacturing base,” said Suman Itani, lead author and a doctoral student in physics.

The newly created database, named the Northeast Materials Database, makes it easier to explore the magnetic materials that play a major role in the technology that powers our world: smartphones, power generators, electric vehicles and more. But these magnets rely on rare earth elements that are expensive, imported, and increasingly difficult to obtain, and no new permanent magnet has yet emerged from the many magnetic compounds known to exist.

Self-driving system makes key plastic ingredient using in-house generated H₂O₂

An eco-friendly system capable of producing propylene oxide (PO) without external electricity or sunlight has been developed. PO is a vital raw material used in manufacturing household items such as polyurethane for sofas and mattresses, as well as polyester for textiles and water bottles.

A research team led by Professors Ja Hun Kwak and Ji-Wook Jang from the School of Energy and Chemical Engineering at UNIST, in collaboration with Professor Sung June Cho of Chonnam National University, has successfully created a self-driven PO production system utilizing in-situ generated hydrogen peroxide (H₂O₂).

The research is published in Nature Communications.
