Archive for the ‘computing’ category: Page 737
Jun 22, 2016
Particle zoo in a quantum computer
Posted by Karen Hurst in categories: computing, particle physics, quantum physics
Excellent story: it highlights how quantum computers may provide a way to overcome obstacles in particle physics, since a quantum computer can simulate certain aspects of elementary particle physics in a well-controlled quantum system.
Physicists in Innsbruck have realized the first quantum simulation of lattice gauge theories, building a bridge between high-energy theory and atomic physics. In the journal Nature, Rainer Blatt’s and Peter Zoller’s research teams describe how they simulated the creation of elementary particle pairs out of the vacuum by using a quantum computer.
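To give a concrete flavor of what simulating “pair creation out of the vacuum” involves, here is a minimal numerical sketch. It is not the lattice gauge theory the Innsbruck team simulated; it is a toy model in which two bosonic modes, standing in for a particle and an antiparticle, are coupled by an interaction that only creates or destroys excitations in pairs, so that time-evolving the empty vacuum state populates both modes. The coupling strength and the Fock-space truncation are arbitrary illustrative choices.

```python
import numpy as np
from scipy.linalg import expm

# Toy "pair creation from the vacuum": two bosonic modes (a stand-in particle
# mode and antiparticle mode) coupled by H = g (a†b† + ab), which creates and
# destroys excitations only in pairs. This is NOT the lattice Schwinger model
# of the Innsbruck experiment, just an illustrative toy.

dim = 10                                      # Fock-space truncation per mode
a   = np.diag(np.sqrt(np.arange(1, dim)), 1)  # single-mode annihilation operator
I   = np.eye(dim)

A = np.kron(a, I)                             # annihilation operator, mode a
B = np.kron(I, a)                             # annihilation operator, mode b
g = 1.0
H = g * (A.conj().T @ B.conj().T + A @ B)     # pair-creation/annihilation coupling

vac = np.zeros(dim * dim); vac[0] = 1.0       # |0,0>, the empty vacuum
n_a = A.conj().T @ A                          # particle-number operator, mode a

for t in np.linspace(0.0, 1.0, 5):
    psi = expm(-1j * H * t) @ vac             # time-evolved state
    print(f"t = {t:.2f}   <n_a> = {np.real(psi.conj() @ n_a @ psi):.3f}")
```

In the real experiment the dynamics were realized with trapped-ion qubits rather than computed classically; the sketch only shows the kind of observable (a particle number growing from zero) that such a simulation tracks.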
Jun 22, 2016
Viewpoint: Hiding a Quantum Cache in Diamonds
Posted by Karen Hurst in categories: computing, internet, nanotechnology, quantum physics
Entanglement purification, a vital enabler for practical quantum networks, has been shown to be feasible with secluded nuclear memories in diamond.
Quantum devices can team up to perform a task collectively, but only if they share that most “spooky” of all quantum phenomena: entanglement. Remote devices have been successfully entangled in order to investigate entanglement itself [1], but the entanglement’s quality is too low for practical applications. The solution, known as entanglement purification [2], has seemed daunting to implement in a real device. Now new research [3] shows that even quite simple quantum components—nanostructures in diamond—have the potential to store and upgrade entanglement. The result relies on hiding information in almost-inaccessible nuclear memories, and may be a key step toward the era of practical quantum networks.
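To make the idea of purification concrete, the sketch below iterates the standard BBPSSW recurrence for Werner states: two noisy entangled pairs of fidelity F are consumed and, with some success probability, yield one pair of higher fidelity. This is the textbook protocol, not the specific nuclear-memory scheme analyzed in the diamond work, and it only yields gains when F exceeds 1/2.

```python
def purify(F):
    """One round of the BBPSSW recurrence on two Werner pairs of fidelity F.

    Returns (new_fidelity, success_probability). Gains require F > 1/2.
    """
    new_f = F**2 + ((1 - F) / 3) ** 2
    p_ok  = F**2 + 2 * F * (1 - F) / 3 + 5 * ((1 - F) / 3) ** 2
    return new_f / p_ok, p_ok

F = 0.70                       # starting fidelity of each raw entangled pair
for rnd in range(1, 5):
    F, p = purify(F)
    print(f"round {rnd}: fidelity = {F:.4f}, success probability = {p:.3f}")
```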
The concept of an interlinked network is absolutely fundamental to conventional technologies. It applies not only to distributed systems like the internet, but also to individual devices like laptops, which contain a hierarchy of interlinked components. For quantum technologies to fulfill their potential, we will want them to have the flexibility and scalability that come from embracing the network paradigm.
Jun 21, 2016
Structure-mapping engine enables computers to reason and learn like humans, including solving moral dilemmas
Posted by Andreas Matt in categories: computing, ethics, neuroscience
Northwestern University’s Ken Forbus is closing the gap between humans and machines.
Using cognitive science theories, Forbus and his collaborators have developed a model that could give computers the ability to reason more like humans and even make moral decisions. Called the structure-mapping engine (SME), the new model is capable of analogical problem solving, including capturing the way humans spontaneously use analogies between situations to solve moral dilemmas.
“In terms of thinking like humans, analogies are where it’s at,” said Forbus, Walter P. Murphy Professor of Electrical Engineering and Computer Science in Northwestern’s McCormick School of Engineering. “Humans use relational statements fluidly to describe things, solve problems, indicate causality, and weigh moral dilemmas.”
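Forbus’s SME is a large, mature system; the sketch below only conveys the core idea of structure mapping, namely aligning the relational structure of a base domain with that of a target domain rather than matching surface features. The example facts (the classic solar-system/atom analogy) and the brute-force search are illustrative assumptions, not SME’s actual representation or algorithm.

```python
from itertools import permutations

# Relational facts as (predicate, arg1, arg2) triples. The classic example:
# mapping the solar system (base) onto the Rutherford atom (target).
base = {
    ("attracts", "sun", "planet"),
    ("revolves_around", "planet", "sun"),
    ("more_massive", "sun", "planet"),
}
target = {
    ("attracts", "nucleus", "electron"),
    ("revolves_around", "electron", "nucleus"),
    ("more_massive", "nucleus", "electron"),
}

def entities(facts):
    return sorted({arg for (_, *args) in facts for arg in args})

def best_mapping(base, target):
    """Brute-force structure alignment: try every injective assignment of base
    entities to target entities and keep the one preserving the most relations."""
    b_ents, t_ents = entities(base), entities(target)
    best, best_score = None, -1
    for chosen in permutations(t_ents, len(b_ents)):
        mapping = dict(zip(b_ents, chosen))
        score = sum((pred, *[mapping[a] for a in args]) in target
                    for (pred, *args) in base)
        if score > best_score:
            best, best_score = mapping, score
    return best, best_score

mapping, score = best_mapping(base, target)
print(mapping, score)   # {'planet': 'electron', 'sun': 'nucleus'} 3
```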
Jun 21, 2016
Using Enzymes to Enhance LEDs
Posted by Karen Hurst in categories: computing, engineering, particle physics, quantum physics, solar power, sustainability
Robert Dunleavy had just started his sophomore year at Lehigh University when he decided he wanted to take part in a research project. He sent an email to Bryan Berger, an assistant professor of chemical and biomolecular engineering, who invited Dunleavy to his lab.
Berger and his colleagues were conducting experiments on tiny semiconductor particles called quantum dots. The optical and electronic properties of QDs make them useful in lasers, light-emitting diodes (LEDs), medical imaging, solar cells, and other applications.
Dunleavy joined Berger’s group and began working with cadmium sulfide (CdS), one of the compounds from which QDs are fabricated. The group’s goal was to find a better way of producing CdS quantum dots, which are currently made with toxic chemicals in an expensive process that requires high pressure and temperature.
Jun 21, 2016
Voice: How To Architect A Cognitive Future For Business
Posted by Klaus Baldauf in categories: biotech/medical, business, computing, education, finance, mobile phones, neuroscience, robotics/AI
Whether it is called AI, machine learning, or a cognitive system such as IBM Watson, a growing cadre of business leaders is embracing this opportunity head-on.
That’s because their consumers are using cognitive applications on a daily basis — through their phones, in their cars, with their doctors, banks, schools, and more. All of this consumer engagement is creating 2.5 quintillion bytes of data every day. And thanks to IT infrastructures designed for cognitive workloads — that can understand, reason, and learn from all this data — organizations and entire industries are transforming and reaping the benefits.
What’s important to remember is that this sci-fi-turned-reality-show of cognitive computing cannot happen without the underlying systems on which the APIs, software, and services run. For this very reason, today’s leading CIOs are thinking differently about their IT infrastructure.
Jun 20, 2016
Viewpoint: Classical Simulation of Quantum Systems?
Posted by Karen Hurst in categories: computing, quantum physics
Nice.
Richard Feynman suggested that it takes a quantum computer to simulate large quantum systems, but a new study shows that a classical computer can work when the system has loss and noise.
The field of quantum computing originated with a question posed by Richard Feynman. He asked whether it was feasible to simulate the behavior of quantum systems using a classical computer, suggesting that a quantum computer would be required instead [1]. Saleh Rahimi-Keshari from the University of Queensland, Australia, and colleagues [2] have now demonstrated that a quantum process believed to require an exponentially large number of steps to simulate on a classical computer can in fact be simulated efficiently if the system in which the process occurs has sufficiently large loss and noise.
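For context on why exact classical simulation was believed to take exponentially many steps: in the boson-sampling setting this result concerns, output probabilities involve matrix permanents, which are #P-hard to compute. The sketch below evaluates a permanent with Ryser’s formula, whose cost already grows roughly as 2^n for an n-by-n matrix; the new result shows that sufficient loss and noise sidestep this cost and allow efficient classical sampling.

```python
from itertools import combinations
import numpy as np

def permanent(A):
    """Matrix permanent via Ryser's formula: O(2^n * n^2) time, exponential in n."""
    n = len(A)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            row_sums = A[:, list(cols)].sum(axis=1)   # sum each row over chosen columns
            total += (-1) ** k * np.prod(row_sums)
    return (-1) ** n * total

# Amplitudes in ideal boson sampling reduce to permanents of submatrices of the
# interferometer's transfer matrix; even modest n makes this expensive.
A = np.random.rand(10, 10)
print(permanent(A))
```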
Jun 20, 2016
New chip design makes parallel programs run many times faster and requires one-tenth the code
Posted by Klaus Baldauf in categories: computing, robotics/AI
Computer chips have stopped getting faster. For the past 10 years, chips’ performance improvements have come from the addition of processing units known as cores.
In theory, a program on a 64-core machine would be 64 times as fast as it would be on a single-core machine. But it rarely works out that way. Most computer programs are sequential, and splitting them up so that chunks of them can run in parallel causes all kinds of complications.
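The shortfall from the ideal 64x is usually estimated with Amdahl’s law: if only a fraction p of a program’s work can run in parallel, the best possible speedup on c cores is 1 / ((1 − p) + p / c). The sketch below is the standard textbook estimate, not an analysis from the MIT paper, and it shows how quickly the sequential remainder dominates.

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's-law bound on speedup when only `parallel_fraction` of the
    work can run in parallel across `cores` cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for p in (0.50, 0.90, 0.99):
    print(f"{p:.0%} parallel, 64 cores -> {amdahl_speedup(p, 64):.1f}x speedup")
# Even at 99% parallel, 64 cores yield roughly 39x, not 64x.
```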
In the May/June issue of the Institute of Electrical and Electronics Engineers’ journal Micro, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) will present a new chip design they call Swarm, which should make parallel programs not only much more efficient but easier to write, too.
Jun 20, 2016
Researchers create organic nanowire synaptic transistors that emulate the working principles of biological synapses
Posted by Bruno Henrique de Souza in categories: computing, nanotechnology, quantum physics
(Phys.org)—A team of researchers at the Pohang University of Science and Technology in Korea has created organic nanowire synaptic transistors that emulate the working principles of biological synapses. As they describe in their paper published in the journal Science Advances, the artificial synapses they have created use much less power than other devices developed thus far, rivaling the power consumption of their biological counterparts.
Scientists are taking multiple paths towards building next-generation computers: some are fixated on finding a material to replace silicon, others are working towards building a quantum machine, while still others are trying to build something much more like the human mind, a hybrid system of sorts with organic artificial parts meant to mimic those found in the brain. In this new effort, the team in Korea has reached a new milestone in creating an artificial synapse, one with very nearly the same power requirements as those inside our skulls.
Until now, artificial synapses have consumed far more power than human synapses, which researchers calculate use on the order of 10 femtojoules each time one fires. The new synapse created by the team requires just 1.23 femtojoules per event, far lower than anything achieved thus far and on a par with its natural rival. Even so, the artificial devices do not yet perform all the same functions, so natural biology is still ahead. There is also the issue of transferring information from one neuron to another: the “wires” used by the human body are still much thinner than the metal kind used by scientists. Still, researchers are gaining ground.
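For a sense of scale, the back-of-the-envelope sketch below converts the per-event energies into total power for a brain-sized number of synapses. The synapse count and average firing rate used here are assumed round numbers for illustration, not figures from the paper.

```python
# Back-of-the-envelope total power for a brain-sized synapse count.
# The count and firing rate below are assumed round numbers, not paper figures.
N_SYNAPSES = 1e15          # assumed order of magnitude for the human brain
RATE_HZ    = 1.0           # assumed average events per synapse per second

for label, energy_fj in (("biological, ~10 fJ/event", 10.0),
                         ("organic nanowire, 1.23 fJ/event", 1.23)):
    power_w = N_SYNAPSES * RATE_HZ * energy_fj * 1e-15   # fJ -> J, then watts
    print(f"{label}: ~{power_w:.2f} W")
```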
Jun 18, 2016
Google’s quantum computer inches nearer after landmark performance breakthrough
Posted by Karen Hurst in categories: computing, government, nanotechnology, particle physics, quantum physics, space
Over 20 years ago, I was interviewed by a group that asked me about the future of technology. I told them that, thanks to advancements such as nanotechnology, technology would definitely go beyond laptops, networks, servers, and the like; we would see even the threads and fibers in our clothing digitized. The interviewers then gave me a look as if I had just walked off the planet Mars. However, I was proven correct. And in the past 10 years I have again told others how and where quantum computing would change our lives forever, and received the same looks and comments.
Lately, people have been publishing articles in which they claim to have spoken with or interviewed QC experts. In many cases they add their own commentary and cherry-pick comments to discredit the efforts of Google, D-Wave, UNSW, MIT, and others, which is very misleading and negatively impacts QC efforts. When I come across such articles, I often point out where and why the authors have misinformed their readers. These articles also set up for failure the people who should be planning for QC in their longer-term strategies: budgets need to be planned and staff brought up to date on QC, because once QC goes live on a larger scale, companies and governments will not have time to catch up. Once hackers, including foreign government hackers, have this technology and you are not QC-enabled, you and your customers are exposed. The QC revolution will be costly, and digital transformation across a large company takes years to complete, so it is best to plan and prepare early this time. QC is not the same as implementing a new cloud, an ERP system, or a new data center, or rationalizing a siloed enterprise environment.
A recent misguided view holds that we are 30 or 50 years away from a scalable quantum chip; that is definitely incorrect. UNSW has shown that scalable QC is achievable, and Google has been working on a scalable QC chip. Lately, RMIT researchers have shared a proven method for tracing particles in the deepest layers of entanglement, which means we can now build QC without the need for analog technology and take full advantage of quantum properties in QC, something that has not been the case until now.