Archive for the ‘computing’ category: Page 756

May 3, 2016

IBM Inches Ahead of Google in Race for Quantum Computing Power

Posted in categories: computing, quantum physics

IBM believes it can demonstrate an experimental chip that will prove the power of quantum computers in just a few years.

May 3, 2016

A Computer Was Programmed to Write a Novel, This Is What It Wrote

Posted in categories: computing, robotics/AI

I had thought my job was safe from automation—a computer couldn’t possibly replicate the complex creativity of human language in writing or piece together a coherent story. I may have been wrong. Authors beware, because an AI-written novel just made it past the first round of screening for a national literary prize in Japan.

The novel this program co-authored is titled The Day A Computer Writes A Novel. It was entered into a writing contest for the Hoshi Shinichi Literary Award. The contest had been open to non-human applicants in prior years; however, this was the first year the award committee actually received submissions from an AI. Out of 1,450 submissions, 11 were at least partially written by a program.

Here’s an excerpt from the novel to give you an idea of what human contestants were up against:

May 3, 2016

Quantum logical operations realized with single photons

Posted in categories: computing, particle physics, quantum physics

More insight into the logical quantum gate for photons demonstrated by the Max Planck Institute of Quantum Optics (MPQ). This gate allows qubits in transmission and processing to be controlled and manipulated more precisely, and brings us closer to a stable quantum computing environment.


MPQ scientists take an important step towards a logical quantum gate for photons.

Scientists from all over the world are working on concepts for future quantum computers and their experimental realization. A quantum computer is commonly envisioned as a network of quantum particles that store, encode and process quantum information. In analogy to a classical computer, a quantum logic gate that assigns output signals to input signals in a deterministic way would be an essential building block. A team led by Dr. Stephan Dürr from the Quantum Dynamics Division of Prof. Gerhard Rempe at the Max Planck Institute of Quantum Optics has now demonstrated in an experiment how an important gate operation — the exchange of the binary bit values 0 and 1 — can be realized with single photons. A first light pulse containing one photon only is stored as an excitation in an ultracold cloud of about 100,000 rubidium atoms.
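The bit-flip operation described here corresponds to the Pauli-X ("quantum NOT") gate of standard quantum computing. A minimal sketch of its action on a single qubit, using NumPy and the usual computational basis — purely illustrative, not a model of the photonic experiment itself:

```python
import numpy as np

# Pauli-X gate: exchanges the basis states |0> and |1>
X = np.array([[0, 1],
              [1, 0]])

ket0 = np.array([1, 0])  # |0>
ket1 = np.array([0, 1])  # |1>

print(X @ ket0)  # -> [0 1], i.e. |1>
print(X @ ket1)  # -> [1 0], i.e. |0>

# Unlike a classical NOT, X also acts coherently on superpositions:
plus = (ket0 + ket1) / np.sqrt(2)  # |+> = (|0> + |1>)/sqrt(2)
print(np.allclose(X @ plus, plus))  # -> True: |+> is an eigenstate of X
```

The last line hints at why a photonic NOT gate is harder than a classical one: it must flip the basis states while preserving superpositions between them.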

May 3, 2016

There’s a new thing called ‘fog computing’ and no, we’re not joking

Posted in category: computing

Enough said; glad folks are seeing the light (no pun intended).


It could be the thing after cloud computing, if Cisco has its way.

May 3, 2016

School’s in session — Nvidia’s driverless system learns

Posted in categories: computing, robotics/AI, transportation

After spending 72 hours watching human drivers, Nvidia’s GPU-based computers were able to construct a road-worthy driverless car system.

May 2, 2016

Raytheon developing technology to make software “immortal”

Posted in categories: computing, engineering, life extension, military

Making software immortal: Raytheon is trying to make it a reality.


CAMBRIDGE, Mass., May 2, 2016 /PRNewswire/ — A team led by Raytheon BBN Technologies is developing methods to make mobile applications viable for up to 100 years, despite changes in hardware, operating system upgrades and supporting services. The U.S. Air Force is sponsoring the four-year, $7.8 million contract under the Defense Advanced Research Projects Agency’s Building Resource Adaptive Software Systems program.

“Mobile apps are pervasive in the military, but frequent operating system upgrades, new devices and changing missions and environments require manual software engineering that is expensive and causes unacceptable delays,” said Partha Pal, principal scientist at Raytheon BBN. “We are developing techniques to eliminate these interruptions by identifying the way these changes affect application functionality and modifying the software.”

May 2, 2016

Bill Gates: No reason to fear AI yet; in fact, it could be your new assistant

Posted in categories: computing, drones, quantum physics, robotics/AI, terrorism

I am so glad to see this from Bill. Until the underpinning technology advances to a mature form of quantum computing, AI is not a threat in non-criminal use. The only danger is when terrorists, drug cartels, and other criminals use AI-driven drones, robots, bots, etc. to attack, burglarize, murder, or spread their terror; and even then it is not AI doing these things on its own.


Munger, Gates on future of AI

Charlie Munger, Berkshire Hathaway vice-chairman, shares his thoughts on American Express, Costco and IBM's future work with artificial intelligence. And Bill Gates explains why it will be a huge help.

May 2, 2016

How AI will make information akin to electricity

Posted in categories: biotech/medical, computing, engineering, government, internet, life extension, mathematics, mobile phones, robotics/AI, wearables

Ask an Information Architect, CDO, or Data Architect (enterprise or otherwise), and they will tell you they have always known that information/data is a basic staple, like electricity; they are glad that folks are finally realizing it. So the same view we apply to utilities as core to our infrastructure and survival should also apply to information. In fact, information in some areas can be even more important than electricity, when you consider that information can launch missiles, cure diseases, make you poor or wealthy, and take down a government or even a country.


What is information? Is it energy, matter, or something completely different? Although we take this word for granted and without much thought in today’s world of fast Internet and digital media, this was not the case in 1948 when Claude Shannon laid the foundations of information theory. His landmark paper interpreted information in purely mathematical terms, a decision that dematerialized information forevermore. Not surprisingly, there are many nowadays who claim — rather unthinkingly — that human consciousness can be expressed as “pure information”, i.e. as something immaterial graced with digital immortality. And yet there is something fundamentally materialistic about information that we often ignore, although it stares us — literally — in the eye: the hardware that makes information happen.
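Shannon's "purely mathematical" interpretation boils down to measuring information in bits via his entropy formula, H = −Σ p·log₂(p). A minimal sketch of that idea (illustrative only; the probabilities here are made up):

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly one bit of information...
print(shannon_entropy([0.5, 0.5]))  # -> 1.0

# ...while a heavily biased coin carries much less, because its
# outcome is largely predictable.
print(shannon_entropy([0.9, 0.1]))  # -> ~0.469
```

Notice that nothing in the formula refers to any physical medium — which is exactly the "dematerialization" the paragraph above describes.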

As users we constantly interact with information via a machine of some kind, such as our laptop, smartphone or wearable. As developers or programmers we code via a computer terminal. As computer or network engineers we often have to wade through the sweltering heat of a server farm, or deal with the material properties of optical fibre or copper in our designs. Hardware and software are the fundamental ingredients of our digital world, both necessary not only in engineering information systems but in interacting with them as well. But this status quo is about to be massively disrupted by Artificial Intelligence.

May 1, 2016

Optical Processing Pioneer wins Project with DARPA

Posted in categories: computing, engineering, information science, mathematics

https://youtube.com/watch?v=EyOuVFQNMLI

Cambridge University spin-out Optalysys has been awarded a $350k grant for a 13-month project from the US Defense Advanced Research Projects Agency (DARPA). The project will see the company advance their research in developing and applying their optical co-processing technology to solving complex mathematical equations. These equations are relevant to large-scale scientific and engineering simulations such as weather prediction and aerodynamics.

The Optalysys technology is extremely energy efficient, using light rather than electricity to perform intensive mathematical calculations. The company aims to provide existing computer systems with massively boosted processing capabilities, with the goal of eventually reaching exaFLOP rates (a billion billion, or 10^18, calculations per second). The technology operates at a fraction of the energy cost of conventional high-performance computers (HPCs) and has the potential to operate orders of magnitude faster.

Apr 30, 2016

Sony Patents Own Contact Lens Camera

Posted in categories: augmented reality, computing, electronics

I forgot Sony in my list of contact lens patents: here is Sony's new contact lens camera patent. So we have Google, Huawei, and Samsung with AR and CPU patents, and Sony with a patent on the camera. Now waiting for announcements from Apple and my favorite, Microsoft.


Sony has joined Google and Samsung in the world of contact lens camera patents; Sony's version also has zoom and aperture control built in.
