Artificial intelligence tools have captured the public’s attention in recent months, but many of the people who helped develop the technology are now warning that far more attention should go toward ensuring it doesn’t bring about the end of human civilization.
A group of more than 350 AI researchers, journalists, and policymakers signed a brief statement saying, “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”