Friday, April 27, 2018

Afraid The Robots Will Take Over? Relax, Says Steven Pinker

Intelligence has to be defined relative to goals and the knowledge needed to attain them. In any case, the argument against the doomsday fear-mongering about existing AI extends to more powerful systems: any system that monomaniacally pursues one goal (such as making paperclips) while being oblivious to every other goal (such as not turning human beings into paperclips) is not artificially intelligent: it's artificially stupid, and unlike anything a technologically sophisticated society would ever invent and empower. And scenarios in which the systems take over by themselves commit the fallacy that intelligence implies a will to power, which comes from confusing two traits that just happen to come bundled in Homo sapiens because we are products of Darwinian natural selection.



Article source here: Arts Journal

