
Search-and-replace genome editing without double-strand breaks or donor DNA

Okay, Science time.


  • Article
  • Published: 21 October 2019


Can scientists reverse time with a quantum computer?

The universe is getting messy. Like a glass shattering to pieces or a single wave crashing onto the shore, the universe’s messiness can only move in one direction – toward more chaos and disorder. But scientists think that, at least for a single electron or the simplest quantum computer, they may be able to turn back time, and restore order to chaos. This doesn’t mean we’ll be visiting with dinosaurs or Napoleon any time soon, but for physicists, the idea that time can run backward at all is still a pretty big deal.

Normally, the universe’s trend toward disorder is enshrined in a fundamental law: the second law of thermodynamics. Stated more formally, it says that an isolated system can only move from more ordered to less ordered states, and that the chaos or disorder of a system – its entropy – can never decrease. But an international team of scientists led by researchers at the Moscow Institute of Physics and Technology thinks it may have discovered a loophole.
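For reference, the textbook statement the article is paraphrasing (standard thermodynamics, nothing specific to the Moscow group’s experiment) is Boltzmann’s definition of entropy together with the second law for an isolated system:

```latex
% Boltzmann entropy: S measures the number of accessible microstates \Omega
S = k_B \ln \Omega

% Second law for an isolated system: entropy never decreases over time
\frac{dS}{dt} \ge 0
```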

This New Drone Bill Would Make the Airspace Over People’s Homes Private Property

A new proposal scheduled to be released from the office of Senator Lee (R-Utah) tomorrow would put the airspace up to 200 feet in altitude over private property under the control of the property owner – and would restrict the FAA’s right to regulate airspace below 200 feet in altitude, making any zoning or regulatory decisions the right of the state or tribal entity governing the land.

DRONELIFE received a summary of the bill from Senator Lee’s office. We may not quote from the summary, as the bill is due to be released tomorrow. In essence, the bill seeks to settle two significant legal issues that remain unresolved: 1) establishing the airspace up to 200 feet in altitude above private property as being under the exclusive control of the property owner; and 2) establishing state, tribal and local governments as having exclusive and absolute rights to regulate that airspace.

In summary, the bill would designate the airspace between 200 and 400 feet in altitude for the use of civilian drones – although it wouldn’t prohibit the FAA from allowing drones above 400 feet. The airspace under 200 feet would fall under the jurisdiction of state, local and tribal governments – and the bill would call for a redefinition of “navigable airspace” to make that clear.

For $150,000 you can now order your own Hoverbike

Circa 2018


After first spotting this crazy-looking, motorcycle-styled hoverbike in early 2017, we were skeptical the contraption would ever move beyond being an odd engineering curiosity. However, Russian company Hoversurf has just revealed that its hoverbikes are now ready for production and preorders are open, with delivery scheduled for sometime in 2019.

Ever since the Scorpion hoverbike was revealed, we have seriously questioned its safety: with spinning blades in such crazily close proximity to fleshy legs, it seemed like a device only really suitable for “aspiring amputees”. Nevertheless, Hoversurf has rapidly moved from ambitious prototype to commercial aircraft, first revealing a deal to sell the aircraft to Dubai Police, and then more recently meeting the US Federal Aviation Administration’s requirements to be classified as a legal ultralight vehicle.

The plan to classify the hoverbike as an ultralight vehicle resulted in some minor design tweaks to fulfill the legal requirements of the classification, but this final commercial iteration is still, at its core, the same crazy quadcopter hoverbike.

We need robots to have morals. Could Shakespeare and Austen help?

John Mullan, professor of English literature at University College London, wrote an article in The Guardian titled “We need robots to have morals. Could Shakespeare and Austen help?”.

Using great literature to teach ethics to machines is a dangerous game. The classics are a moral minefield.

When he wrote the stories in I, Robot in the 1940s, Isaac Asimov imagined a world in which robots do all humanity’s tedious or unpleasant jobs for it, but where their powers have to be restrained. They are programmed to obey three laws: a robot may not injure a human being or, through inaction, allow a human being to come to harm; a robot must obey orders given by human beings (except where such orders would conflict with the first law); and a robot must protect its own existence (as long as doing so does not conflict with either of the first two laws).
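Purely as an illustration of the strict precedence built into Asimov’s laws (and not anything from Mullan’s article), the hierarchy can be sketched as an ordered rule check in which a lower-priority law only gets a say once every higher-priority law is satisfied; all names below are hypothetical:

```python
# Toy sketch of the Three Laws as an ordered rule check.
# Everything here is illustrative, not a workable ethics engine.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Action:
    description: str
    injures_human: bool        # doing this would injure a human
    prevents_human_harm: bool  # doing this would keep a human from coming to harm
    obeys_human_order: bool    # a human has ordered this action
    preserves_self: bool       # doing this protects the robot itself

def choose(candidates: List[Action]) -> Optional[Action]:
    """Apply the three laws strictly in order of priority."""
    # First Law: never injure a human; inaction is no excuse, so prefer
    # harm-preventing actions whenever any are available.
    options = [a for a in candidates if not a.injures_human]
    if any(a.prevents_human_harm for a in options):
        options = [a for a in options if a.prevents_human_harm]
    # Second Law: obey human orders, unless that conflicts with the First Law
    # (conflicting candidates were already filtered out above).
    if any(a.obeys_human_order for a in options):
        options = [a for a in options if a.obeys_human_order]
    # Third Law: protect itself, unless that conflicts with the first two laws.
    if any(a.preserves_self for a in options):
        options = [a for a in options if a.preserves_self]
    return options[0] if options else None

# Example: an order that would injure a bystander loses to pulling them to safety.
actions = [
    Action("follow order, injuring bystander", True, False, True, True),
    Action("pull bystander to safety", False, True, False, False),
]
print(choose(actions).description)  # -> "pull bystander to safety"
```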

Researchers invent low-cost alternative to Bitcoin

The cryptocurrency Bitcoin is limited by its astronomical electricity consumption and outsized carbon footprint. A nearly zero-energy alternative sounds too good to be true, but as School of Computer and Communication Sciences (IC) Professor Rachid Guerraoui explains, it all comes down to our understanding of what makes transactions secure.

To explain why the system developed in his Distributed Computing Lab (DCL) represents a paradigm shift in how we think about cryptocurrencies – and about digital trust in general – Guerraoui uses a legal metaphor: all players in this new system are “innocent until proven guilty.”

This is in contrast to the traditional Bitcoin model first described in 2008 by Satoshi Nakamoto, which relies on solving a difficult problem called “consensus” to guarantee the security of transactions. In this model, everyone in a distributed system must agree on the validity of all transactions to prevent malicious players from cheating—for example, by spending the same digital tokens twice (double-spending). In order to prove their honesty and achieve consensus, players must execute complex—and energy-intensive—computing tasks that are then verified by the other players.
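The paragraph above pins Bitcoin’s energy cost on proof-of-work consensus, whose main job is to stop double-spending. As a minimal, hedged sketch of the alternative intuition (order only each payer’s own transfers, rather than reaching global agreement on everything), the toy ledger below rejects a second spend of the same funds using per-sender sequence numbers. The `Ledger` class and its methods are illustrative assumptions, not the actual EPFL/DCL protocol.

```python
# Minimal sketch: catching double-spends by ordering only each sender's own
# transfers (per-sender sequence numbers) instead of running a global,
# energy-hungry consensus like Bitcoin's proof-of-work.
# Illustrative only -- not the EPFL/DCL system.

class Ledger:
    def __init__(self, initial_balances):
        self.balances = dict(initial_balances)
        self.next_seq = {acct: 0 for acct in self.balances}  # expected seq per sender

    def apply(self, sender, receiver, amount, seq):
        """Accept a transfer only if it is the sender's next transfer
        and the sender can afford it; otherwise reject it."""
        if seq != self.next_seq.get(sender, 0):
            return False  # reused or out-of-order slot: a would-be double-spend
        if amount <= 0 or self.balances.get(sender, 0) < amount:
            return False  # insufficient funds
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        self.next_seq[sender] = seq + 1
        return True

ledger = Ledger({"alice": 10, "bob": 0, "carol": 0})
print(ledger.apply("alice", "bob", 10, seq=0))    # True: first spend accepted
print(ledger.apply("alice", "carol", 10, seq=0))  # False: same slot reused, double-spend rejected
```

The point of the sketch is that catching the conflicting spend only requires ordering Alice’s own transfers; no global agreement on a single chain of all transactions, and no energy-intensive puzzle-solving, is needed for that check.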

Your Software Could Have More Rights Than You

Much like US corporations do now.


Debates about rights are frequently framed around the concept of legal personhood. Personhood is granted not just to human beings but also to some non-human entities, such as corporations or governments. Legal entities, aka legal persons, are granted certain privileges and responsibilities by the jurisdictions in which they are recognized, and many such rights are not available to non-person agents. Attempting to secure legal personhood is often seen as a potential pathway to get certain rights and protections for animals [1], fetuses [2], trees and rivers [3], and artificially intelligent (AI) agents [4].

It is commonly believed that a new law or judicial ruling is necessary to grant personhood to a new type of entity. But recent legal literature [5–8] suggests that loopholes in the current law may permit legal personhood to be granted to AI/software without the need to change the law or persuade a court.

For example, L. M. LoPucki [6] points out, citing Shawn Bayern’s work on conferring legal personhood on AI [7, 8], “Professor Shawn Bayern demonstrated that anyone can confer legal personhood on an autonomous computer algorithm merely by putting it in control of a limited liability company (LLC). The algorithm can exercise the rights of the entity, making them effectively rights of the algorithm. The rights of such an algorithmic entity (AE) would include the rights to privacy, to own property, to enter into contracts, to be represented by counsel, to be free from unreasonable search and seizure, to equal protection of the laws, to speak freely, and perhaps even to spend money on political campaigns. Once an algorithm had such rights, Bayern observed, it would also have the power to confer equivalent rights on other algorithms by forming additional entities and putting those algorithms in control of them.” [6] (See Note 1.)