Archive for the ‘computing’ category: Page 783

Feb 22, 2016

Prosthetics: Amputee James Young unveils hi-tech synthetic arm inspired by Metal Gear Solid

Posted by in categories: biotech/medical, computing, cyborgs, engineering

The job advertisement was highly specific: applicants had to be passionate about computer games and live in the UK. Oh, and they also had to be amputees who were interested in wearing a futuristic prosthetic limb.

James Young knew straight away he had a better shot than most. After losing an arm and a leg in a rail accident in 2012, the 25-year-old Londoner had taught himself to use a video-game controller with one hand and his teeth. “How many amputee gamers can there be?” he asked himself.

In the end, more than 60 people replied to the ad, which was looking for a games-mad amputee to become the recipient of a bespoke high-tech prosthetic arm inspired by Metal Gear Solid, one of the world’s best-selling computer games. Designed and built by a team of 10 experts led by London-based prosthetic sculptor Sophie de Oliveira Barata, the £60,000 carbon-fibre limb is part art project, part engineering marvel.

Feb 22, 2016

Don’t Set Your iPhone Back to 1970, No Matter What

Posted by in categories: computing, mobile phones

The trolls have gone retro.

Feb 22, 2016

IARPA Project Targets Hidden Algorithms of the Brain

Posted by in categories: computing, information science, neuroscience, robotics/AI

Whether in the brain or in code, neural networks are shaping up to be one of the most critical areas of research in both neuroscience and computer science. An increasing amount of attention, funding, and development has been pushed toward technologies that mimic the brain in both hardware and software to create more efficient, high performance systems capable of advanced, fast learning.

One aspect of the push toward more scalable, efficient, and practical neural networks and deep learning frameworks that we have been tracking here at The Next Platform is how such systems might be implemented in research and enterprise over the next ten years. Based on the conversations that make their way into various pieces here, one missing element for those eventual end users is a simpler training process for neural networks, one that makes them practically useful without all of the computational overhead and specialized systems that training requires now. Crucial, then, is whittling down how neural networks are trained and implemented. Not surprisingly, the key answers lie in the brain, and specifically in how the brain “trains” its own network, functions that are still not completely understood, even by top neuroscientists.
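
To make the training-overhead point concrete, here is a minimal sketch, in plain Python with NumPy, of the kind of gradient-descent loop that “training” refers to. It is purely illustrative and not drawn from any of the projects mentioned here; real networks run the same basic loop over millions of parameters and far larger datasets, which is where the cost comes from.

import numpy as np

# Toy example: train a tiny 2-16-1 network to compute XOR by gradient descent.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

lr = 0.5
for step in range(5000):                       # thousands of passes, even for a toy task
    h = np.tanh(X @ W1 + b1)                   # forward pass through the hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    loss = np.mean((p - y) ** 2)

    # Backward pass: push the error back to every weight in the network.
    dp = 2 * (p - y) / len(X) * p * (1 - p)
    dW2 = h.T @ dp
    db2 = dp.sum(axis=0)
    dh = dp @ W2.T * (1 - h ** 2)
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)

    # Gradient-descent update of every parameter.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss after 5000 steps: {loss:.4f}")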

In many senses, neural networks, cognitive hardware and software, and advances in new chip architectures are shaping up to be the next important platform. But there are still some fundamental gaps between what we know about our own brains and what has been developed in software to mimic them, and those gaps are holding research at bay. Accordingly, the Intelligence Advanced Research Projects Activity (IARPA) in the U.S. is getting behind an effort spearheaded by Tai Sing Lee, a computer science professor at Carnegie Mellon University’s Center for the Neural Basis of Cognition, and researchers at Johns Hopkins University, among others, to make new connections between the brain’s neural function and how those same processes might map to neural networks and other computational frameworks. The project is called Machine Intelligence from Cortical Networks (MICrONS).

Feb 22, 2016

IARPA wants to improve human/machine forecasting

Posted by in category: computing

Human and machine forecasting.


The agency’s Hybrid Forecasting Competition is intended to improve how humans and computers interact on geopolitical and geoeconomic analysis.
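
The post does not spell out how the competition will combine the two, but as a rough, hypothetical illustration of what “hybrid” forecasting can mean, the Python sketch below averages a human analyst’s probability estimates with a machine model’s and scores each with the Brier score, a standard accuracy measure in forecasting tournaments. All of the events, probabilities, and the 50/50 weighting here are invented for the example.

# Hypothetical illustration of hybrid (human + machine) forecasting.
def brier_score(forecasts, outcomes):
    # Mean squared error between predicted probabilities and 0/1 outcomes; lower is better.
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

human   = [0.70, 0.20, 0.90, 0.40]   # analyst's probability that each event occurs
machine = [0.55, 0.35, 0.80, 0.60]   # statistical model's probability for the same events
actual  = [1,    0,    1,    0]      # what actually happened (1 = occurred)

weight = 0.5                          # how much to trust the human versus the machine
hybrid = [weight * h + (1 - weight) * m for h, m in zip(human, machine)]

for name, forecast in (("human", human), ("machine", machine), ("hybrid", hybrid)):
    print(f"{name:8s} Brier score: {brier_score(forecast, actual):.3f}")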

Feb 21, 2016

Robot chores: Machines tipped to take 15m Brit jobs in the next ten years

Posted by in categories: computing, employment, habitats, robotics/AI

“No offense, but your robots are ugly”

Robots today (especially for home and caregiver use) will need to improve drastically. We’re still designing robots as if they were a CPU for the home, which frankly freaks some kids out, scares some of the elderly population into thinking it’s too fragile to operate, and my own cat will not come near one. If robotics for home use is ever going to be adopted by the mass of the population, the machines will need to look less like a part off a manufacturer’s assembly line; have a softer, low-noise voice with volume controls for those who are hard of hearing; include modifications for the deaf and blind; be multi-purpose robots that can do two or more types of work inside the home (vacuuming, dusting, cooking, washing dishes, washing clothes, and so on); be uncomplicated to set up and operate; be reliable (not needing repairs all the time and not overheating); be less bulky; and have better sensors to detect stairs and the ability to climb them.


From mowing the lawn to cooking dinner, experts say automatons are set to take over some of our most tedious tasks.

Feb 20, 2016

United Nations CITO: Artificial intelligence will be humanity’s final innovation

Posted by in categories: computing, internet, quantum physics, robotics/AI, security

I hate to break the news to the UN’s CITO, but has she ever heard of “Quantum Technology”? After AI floods onto the scene, the next innovation that I and others are working on is quantum computing, which will make AI, the internet, cybersecurity, devices, platforms, and medical technology more advanced, with incredible performance.


The United Nations Chief Information Technology Officer spoke with TechRepublic about the future of cybersecurity, social media, and how to fix the internet and build global technology for social good.

Artificial intelligence, said United Nations chief information technology officer Atefeh Riazi, might be the last innovation humans create.

Feb 20, 2016

Gaming Chip Is Helping Raise Your Computer’s IQ

Posted by in categories: computing, entertainment, mobile phones, robotics/AI

Using gaming chips to recognize people’s images and the like definitely makes sense, especially as we move further and further into the AI-connected experience.


Facebook, Google and Microsoft are tapping the power of a vintage computer gaming chip to raise your smartphone’s IQ with artificially intelligent programs that recognize faces and voices, translate conversations on the fly and make searches faster and more accurate.
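
For a sense of what that looks like in practice, here is a minimal PyTorch sketch, my own illustration rather than code from any of the companies named, that runs one neural-network layer on the CPU and then, if a CUDA-capable gaming GPU is present, on the GPU. The bulk matrix arithmetic behind face and voice recognition is exactly the kind of work these chips accelerate.

import torch

# A stand-in for one layer of an image- or speech-recognition network.
layer = torch.nn.Linear(4096, 4096)
batch = torch.randn(256, 4096)        # a batch of 256 feature vectors

cpu_out = layer(batch)                # run on the CPU

if torch.cuda.is_available():         # run the identical computation on a gaming GPU
    gpu_out = layer.to("cuda")(batch.to("cuda"))
    print("GPU output shape:", tuple(gpu_out.shape))
else:
    print("No CUDA GPU found; CPU output shape:", tuple(cpu_out.shape))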

Feb 20, 2016

Basic income may be needed to combat robot-induced unemployment, leading AI expert says

Posted by in categories: computing, economics, employment, robotics/AI

I do believe that there will be some expansion of social services to retrain employees for the new positions that are coming, and to help lower-skilled workers retool as well. The larger question, however, is who should pay. Some people say the tech industry should assist governments with retraining, since AI technology created the situation; others say it is solely a government issue. It will be interesting, to say the least, to see how retraining programs and other services are covered.


A leading artificial intelligence (AI) expert believes that societies may have to consider issuing a basic income to all citizens, in order to combat the threat to jobs posed by increased automation in the workplace.

Dr Moshe Vardi, a computer science professor at Rice University in Texas, believes that a basic income may be needed in the future as advances in automation and AI put human workers out of jobs.

Feb 20, 2016

Infographic: Combining Electronics and Photonics Opens Way for Next-Generation Microprocessors

Posted by in categories: computing, electronics, engineering

Integrated circuits traditionally have been a domain reserved for electrons, which course through exquisitely tiny transistors, wires and other microscopic structures where the digital calculations and data processing that underlie so much of modern technology unfold. Increasingly, however, chip designers have been acting on a long-ripening vision of enlisting photons instead of, or in tandem with, electrons in the operation of microprocessors. Photons, for one, can serve as fast-as-light carriers of information between chips, overcoming digital traffic jams that at times put the brakes on electrons.

Recently, DARPA-funded scientists designed and crafted a breakthrough microprocessor that combines many of the best traits of electrons and photons on a single chip. The result is a remarkable and elegant hybrid microtechnology that boggles the mind for the intricate complexity of its sub-Lilliputian architecture. To appreciate the engineering acumen involved in the development of this chip and its tens of millions of resident electronic and photonic components, DARPA has produced an annotated, graphical tour of the new chip’s innards. Check it out, and lose yourself in a world of highways, toll gates and traffic circles populated by some of the physical world’s smallest commuters.

Feb 19, 2016

Scientists say all the world’s data can fit on a DNA hard drive the size of a teaspoon

Posted by in categories: biotech/medical, computing, genetics

Even though it’s looking increasingly likely that humanity will find a way to wipe itself off the face of the Earth, there’s a chance that our creative output may live on. Servers, hard drives, flash drives, and disks will degrade (as will our libraries of paper books, of course), but a group of researchers at the Swiss Federal Institute of Technology have found a way to encode data onto DNA—the very same stuff that all living beings’ genetic information is stored on—that could survive for millennia.

One gram of DNA can potentially hold up to 455 exabytes of data, according to the New Scientist. For reference: There are one billion gigabytes in an exabyte, and 1,000 exabytes in a zettabyte. The cloud computing company EMC estimated that there were 1.8 zettabytes of data in the world in 2011, which means we would need only about 4 grams (about a teaspoon) of DNA to hold everything from Plato through the complete works of Shakespeare to Beyonce’s latest album (not to mention every brunch photo ever posted on Instagram).
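
The arithmetic behind that teaspoon figure is easy to check from the numbers quoted above:

# Back-of-the-envelope check of the figures quoted above.
EXABYTES_PER_GRAM = 455           # claimed DNA storage density
WORLD_DATA_ZETTABYTES = 1.8       # EMC's 2011 estimate of all the data in the world
EXABYTES_PER_ZETTABYTE = 1000

world_data_exabytes = WORLD_DATA_ZETTABYTES * EXABYTES_PER_ZETTABYTE
grams_needed = world_data_exabytes / EXABYTES_PER_GRAM

print(f"{world_data_exabytes:.0f} EB / {EXABYTES_PER_GRAM} EB per gram "
      f"= {grams_needed:.1f} grams of DNA")    # roughly 4 grams, about a teaspoon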

There are four types of molecules that make up DNA, which form pairs. To encode information on DNA, scientists program the pairs into 1s and 0s, the same binary language that encodes digital data. This is not a new concept (scientists at Harvard University encoded a book onto DNA in 2012), but up to now it had been difficult to retrieve the information stored on the DNA.
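
As a toy illustration of that idea, and not the actual scheme used by the Harvard or ETH researchers (which layer error correction on top of the encoding), the Python sketch below maps every two bits of a message onto one of DNA’s four bases and back again.

# Toy bit-to-base mapping: two bits per nucleotide. Real DNA-storage schemes
# add redundancy and error correction on top of an encoding like this.
BIT_PAIR_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BIT_PAIR = {base: bits for bits, base in BIT_PAIR_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BIT_PAIR_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BIT_PAIR[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"Hi")
print(strand)                   # "CAGACGGC"
assert decode(strand) == b"Hi"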
