
How expensive and difficult does hyperscale-class AI training have to be before a maker of self-driving electric cars takes a side excursion, spending hundreds of millions of dollars to build its own AI supercomputer from scratch? And how egotistical and sure of itself would the company's founder have to be to assemble a team that could do it?

Like many questions, when you ask these precisely, they tend to answer themselves. And what is clear is that Elon Musk, founder of both SpaceX and Tesla as well as a co-founder of the OpenAI consortium, doesn’t have time – or money – to waste on science projects.

Just like the superpowers of the world underestimated the amount of computing power it would take to fully simulate a nuclear missile and its explosion, perhaps the makers of self-driving cars are coming to the realization that teaching a car to drive itself in a complex world that is always changing is going to take a lot more supercomputing. And once you reconcile yourself to that, then you start from scratch and build the right machine to do this specific job.

The CHIPS Act of 2022 was signed into law on Aug. 9. It provides tens of billions of dollars in public support for revitalization of domestic semiconductor manufacturing, workforce training, and “leap ahead” wireless technology. Because we outsource most of our device fabrication — including the chips that go into the Navy’s submarines and ships, the Army’s jeeps and tanks, military drones and satellites — our industrial base has become weak and shallow. The first order of business for the CHIPS Act is to address a serious deficit in our domestic production capacity.

Notoriously absent from the language of the bill is any mention of chip security. Consequently, the U.S. is about to make the same mistake with microelectronics that we made with digital networks and software applications: Unless and until the government demands in-device security, our competitors will have an easy time manipulating how chips function and behave. Nowhere is this more dangerous than in our national security infrastructure.

South Korea's new fighter jet is just one of many military advancements the country has made against its arch-rival, North Korea.

Back in July, South Korea conducted a 33-minute flight of its homegrown KF-21 fighter jet for the first time, flaunting its military might and perhaps sending a message to North Korea.


South Korea is pursuing stealth drones that could take out North Korean air defenses as part of a “manned-unmanned teaming system.”

Reminder: The first test was successfully conducted in September 2021.

A hypersonic cruise missile co-created by Raytheon Technologies has passed its second consecutive flight test. This is an important step in the U.S. Department of Defense's plan to field weapons that can travel faster than five times the speed of sound.

In July 2022, a Hypersonic Air-breathing Weapon Concept (HAWC) was put to the test. It has been slightly improved after a successful test in September 2021. The most recent test, in which it was dropped from an airplane and shot past Mach 5, went just as the company’s data models predicted.
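For context, what "faster than five times the speed of sound" means in familiar units depends on local conditions, since the speed of sound varies with altitude and temperature. A minimal sketch, assuming the standard sea-level value of roughly 343 m/s:

```python
# Rough illustration (assumed value): convert a Mach number to km/h.
# The speed of sound varies with altitude and temperature; we use the
# standard sea-level figure of ~343 m/s (15 degrees C) as an assumption.
SPEED_OF_SOUND_M_S = 343.0

def mach_to_kmh(mach: float, speed_of_sound: float = SPEED_OF_SOUND_M_S) -> float:
    """Convert a Mach number to km/h for a given local speed of sound (m/s)."""
    return mach * speed_of_sound * 3.6  # m/s -> km/h

print(round(mach_to_kmh(5)))  # ~6174 km/h at sea-level conditions
```

At altitude, where the air is colder, the local speed of sound is lower, so Mach 5 corresponds to a somewhat slower ground speed than this sea-level estimate.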

Artificial General Intelligence (AGI) is a recent and trending topic of debate among AI researchers and computer scientists. A pressing issue for AI is the alignment problem, and the AI control problem may be the most important task for humanity to solve. AI researchers have offered many suggestions for avoiding the dangers of artificial general intelligence or a digital super-intelligence; among the best-known is a merging scenario with AGI. Elon Musk has suggested that we regulate artificial intelligence and proceed very carefully if humanity collectively decides that creating a digital super-intelligence is the right move. Musk, the founder of many high-tech companies, including Neuralink, which develops implantable brain–machine interfaces, warns that AI is probably the biggest existential threat to humanity. AGI is arguably even more dangerous than nuclear warheads, and nobody would suggest we allow anyone who wants to build nuclear weapons to do so. The question of AGI development, and eventually the creation of a digital super-intelligence, will only become more relevant in the coming years.

Dr. Ben Goertzel, CEO and founder of the SingularityNET Foundation, is one of the world's foremost experts in Artificial General Intelligence. According to him, these reactions will probably look very silly to people a few decades from now, as they go about lives made tremendously easy, happy, and fascinating compared to the reality of 2020 by the wide rollout of advanced AGI systems that handle manufacturing, services, and the other practical jobs humans now spend their time doing. As Musk put it on the Joe Rogan Experience, the merge scenario with AI "is the one that seems like probably the best. If you can't beat it, join it."




The U.S. government is planning to review the environmental effects of operations at one of the nation’s prominent nuclear weapons laboratories, but its notice issued Friday leaves out federal goals to ramp up production of plutonium cores used in the nation’s nuclear arsenal.

The National Nuclear Security Administration said the review—being done to comply with the National Environmental Policy Act—will look at the potential environmental effects of alternatives for operations at Los Alamos National Laboratory for the next 15 years.

That work includes preventing the spread and use of nuclear weapons worldwide and other projects related to national security and global stability, the notice said.

It was clearly about Albert Einstein, although not a lot of people seemed to be aware of the fact. It was a great song for a video, and one of my favorites on this particular album, the others being "Nobody Home", "Closet Chronicles", and "Dust In The Wind".

Point of Know Return was HUGE in 1978! I remember listening to it over and over and over… loved the many instrumental breaks and solos.

The video is just layers and layers of masked images and masked video. Tried for some really cool effects and found some pretty neat ones…‘specially fond of that tree recoil effect from the atom bomb at that cool little note drag!

Anyway, as usual…I hope you find something to enjoy.

Russia's wrongful invasion of Ukraine has led more people to talk about the threat of nuclear war and World War 3. How does the Doomsday Clock relate to all this?

And Lifespan News: https://www.youtube.com/LifespanNews.

Support the great work being done by Lifespan, the team powering Life Noggin: https://www.lifespan.io/life-noggin/

Script: