Stephen Hawking warns of AI havoc

Celebrated physicist Stephen Hawking has redefined the world as we know it, so when he speaks, the world listens. Asked at a recent Reddit AMA about the dangers of advances in AI technology, his response set social media abuzz with anxiety. Hawking warned of the dangers of a possible intelligence explosion: the idea that once we build Artificial Intelligence (AI) at human levels of intelligence, it could then improve itself to the point of exceeding human intelligence, a state famously known as superintelligence. Technological singularity is another term for this scenario. “It’s clearly possible for a something to acquire higher intelligence than its ancestors,” said Hawking at the Reddit AMA. “The line you ask about is where an AI becomes better than humans at AI design, so…


Link to Full Article: Stephen Hawking warns of AI havoc
