New book claims superintelligent AI development is racing toward global catastrophe
(NEW YORK) -- A new book by two artificial intelligence researchers claims that the race to build superintelligent AI could spell doom for humanity. In "If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All," authors Eliezer Yudkowsky and Nate Soares claim...

