
A new proof by SFI Professor David Wolpert sends a humbling message to would-be super intelligences: you can’t know everything all the time.

The proof starts by mathematically formalizing the way an “inference device,” say, a scientist armed with a supercomputer, fabulous experimental equipment, etc., can have knowledge about the state of the universe around them. Whether that scientist’s knowledge is acquired by observing their universe, controlling it, predicting what will happen next, or inferring what happened in the past, there’s a mathematical structure that restricts that knowledge. The key is that the inference device, their knowledge, and the physical variable that they (may) know something about, are all subsystems of the same universe. That coupling restricts what the device can know. In particular, Wolpert proves that there is always something the inference device cannot predict, something it cannot remember, and something it cannot observe.

“In some ways this formalism can be viewed as many different extensions of [Donald MacKay’s] statement that ‘a prediction concerning the narrator’s future cannot account for the effect of the narrator’s learning that prediction,’” Wolpert explains. “Perhaps the simplest extension is that, when we formalize [inference devices] mathematically, we notice that the same impossibility results that hold for predictions of the future—MacKay’s concern—also hold for memories of the past. Time is an arbitrary variable—it plays no role in terms of differing states of the universe.”
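To give a flavor of how such an impossibility result can be proved, here is a minimal diagonal-argument sketch in simplified notation; the symbols U, Y, and Γ below are illustrative stand-ins, not Wolpert’s exact formal definitions.

```latex
% A simplified diagonalization sketch (illustrative notation, not Wolpert's formalism).
% Let U be the set of possible states (worldlines) of the universe, and model the
% device's conclusion as a function Y : U -> {-1,+1}, its binary answer in world u.
% Consider the question defined by flipping the device's own answer:
\[
  \Gamma(u) = -\,Y(u).
\]
% Answering \Gamma correctly in the device's own world u would require
% Y(u) = \Gamma(u) = -Y(u), which no state u can satisfy. Because the device is a
% subsystem of the very universe it reasons about, at least one such binary question
% always escapes it; applied to future, past, and present variables, this is the
% shape of the limits on prediction, memory, and observation described above.
```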

Read more

1. Blame the American public, which lost serious interest in science in the 1990s; and 2. the US government, whose only real interest now is war, and how to spend money on war.


If you want to crunch the world’s biggest problems, head east. According to a newly published ranking, not only is China home to the world’s two fastest supercomputers, it also has 202 of the world’s fastest 500 such devices—more than any other nation. Meanwhile, America’s fastest device limps into fifth place in the charts, and the nation occupies just 144 of the top 500 slots, making it second according to that metric.

The world’s fastest supercomputer is still TaihuLight, housed at the National Supercomputing Center in Wuxi, China. Capable of performing 93 quadrillion calculations per second, it’s almost three times faster than the second-place Tianhe-2. The Department of Energy’s fifth-placed Titan supercomputer, housed at Oak Ridge National Laboratory, performs 17.6 quadrillion calculations per second—making it less than a fifth as fast as TaihuLight.

China also beats out all comers on total computational resources, commanding 35.4 percent of the computing power in the list, compared with America’s 29.6 percent. The new list clearly and painfully underscores America’s decline as a supercomputing heavyweight. Indeed, this is the weakest representation by the U.S. since the Top500 supercomputers list started ranking the industry 25 years ago.
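As a quick sanity check of the ratios quoted above, here is a short back-of-the-envelope sketch in Python; the Tianhe-2 figure of roughly 33.9 quadrillion calculations per second is not stated in the text and is included as an assumption drawn from published Top500 results.

```python
# Back-of-the-envelope check of the performance ratios cited above.
# All figures are in quadrillions of calculations per second (petaflops).
taihulight = 93.0   # Sunway TaihuLight (stated above)
tianhe2 = 33.9      # Tianhe-2 (assumed from published Top500 results)
titan = 17.6        # Titan at Oak Ridge National Laboratory (stated above)

print(f"TaihuLight vs Tianhe-2: {taihulight / tianhe2:.1f}x")      # ~2.7x, i.e. almost three times faster
print(f"Titan relative to TaihuLight: {titan / taihulight:.2f}")   # ~0.19, i.e. less than a fifth as fast
```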

Using supercomputer modeling, University of Oregon scientists have unveiled a new explanation for the geology underlying recent seismic imaging of magma bodies below Yellowstone National Park.

Yellowstone, a supervolcano famous for explosive eruptions, large calderas and extensive lava flows, has for years attracted the attention of scientists trying to understand the location and size of the magma bodies below it. The last caldera-forming eruption occurred 630,000 years ago; the last large volume of lava surfaced 70,000 years ago.

Crust below the park is heated and softened by continuous infusions of magma that rise from an anomaly called a mantle plume, similar to the source of the magma at Hawaii’s Kilauea volcano. Huge amounts of water that fuel the dramatic geysers and hot springs at Yellowstone cool the crust and prevent it from becoming too hot.

Read more

Since their invention, computers have become faster and faster, as a result of our ability to increase the number of transistors on a processor chip.

Today, your smartphone is millions of times faster than the computers NASA used to put the first man on the moon in 1969. It even outperforms the most famous supercomputers from the 1990s. However, we are approaching the limits of this electronic technology, and now we see an interesting development: light and lasers are taking over from electronics in computers.

Processors can now contain tiny lasers and light detectors, so they can send and receive data through small optical fibres, at speeds far exceeding those of the electrical connections we use now. A few companies are even developing optical processors: chips that use laser light and optical switches, instead of electrical currents and electronic transistors, to do calculations.

Read more

Amazing.



An international team of scientists has developed an algorithm that represents a major step toward simulating neural connections in the entire human brain.

An international group of researchers has taken a decisive step towards creating the technology to simulate brain-scale networks on future exascale supercomputers. The breakthrough, published in Frontiers in Neuroinformatics, allows larger parts of the human brain to be represented using the same amount of computer memory. At the same time, the new algorithm significantly speeds up brain simulations on existing supercomputers.

The human brain is an organ of incredible complexity, composed of 100 billion interconnected nerve cells. However, even with the help of the most powerful supercomputers available, it is currently impossible to simulate the exchange of neuronal signals in networks of this size.

“Since 2014, our software can simulate about one percent of the neurons in the human brain with all their connections,” says Markus Diesmann, Director at the Jülich Institute of Neuroscience and Medicine (INM-6). In order to achieve this impressive feat, the software requires the entire main memory of petascale supercomputers, such as the K computer in Kobe and JUQUEEN in Jülich.
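To see why networks of this size overwhelm even petascale machines, here is a rough back-of-the-envelope memory estimate in Python; the synapse count per neuron and the bytes stored per synapse are illustrative assumptions, not figures from the study.

```python
# Rough estimate of the memory needed just to store a brain-scale set of connections.
# The per-neuron synapse count and bytes-per-synapse values are assumptions for
# illustration, not numbers from the Frontiers in Neuroinformatics paper.
neurons = 100e9             # ~100 billion nerve cells, as stated above
synapses_per_neuron = 1e4   # ~10,000 connections per neuron (assumed)
bytes_per_synapse = 12      # e.g. weight + delay + target index (assumed)

total_bytes = neurons * synapses_per_neuron * bytes_per_synapse
print(f"Memory for the connections alone: ~{total_bytes / 1e15:.0f} petabytes")
# ~12 petabytes under these assumptions, well beyond the main memory of petascale
# machines such as the K computer or JUQUEEN, which is why roughly one percent of
# the brain has been the practical limit so far.
```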

Read more