Artificial Intelligence Might Be The Last Thing We Ever Do Says Hawking

Professor Stephen Hawking has warned that humans must understand the implications of artificial intelligence or risk being eliminated by it. The comments came in an opinion piece in The Independent and coincide with the release of the film Transcendence, starring Johnny Depp.

In the film, a scientist's mind is uploaded to a computer system so that it can outlive his body. Freed from the limits of the human body, the uploaded mind amasses an enormous amount of knowledge by connecting to the Internet and ultimately becomes a threat to humanity itself.

In the article, Hawking points out that nothing in physics prevents particles from being arranged in ways that are more intelligent than the human brain. The only reason this has not happened so far is that no one knows how to do it, but that could easily change in the future.

Hawking said: “The potential benefits are huge; everything that civilisation has to offer is a product of human intelligence; we cannot predict what we might achieve when this intelligence is magnified by the tools that AI may provide, but the eradication of war, disease, and poverty would be high on anyone’s list. Success in creating AI would be the biggest event in human history.”

“Unfortunately, it might also be the last, unless we learn how to avoid the risks. 

“There are no fundamental limits to what can be achieved: there is no physical law precluding particles from being organised in ways that perform even more advanced computations than the arrangements of particles in human brains. 

“One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.”

Hawking called on researchers to consider the risks as well as the rewards of this “technological arms race”, so that humanity gains the maximum benefit without putting its supremacy at risk.

Follow Andre Walker on Twitter: @AndreJPWalker
