
Experts: AI Could Lead To Human Extinction

USA – Artificial intelligence could pose a “risk of extinction” to humanity on the scale of nuclear war or pandemics, and mitigating that risk should be a “global priority,” according to an open letter signed by AI leaders such as Sam Altman of OpenAI and Geoffrey Hinton, known as the “godfather” of AI.

The one-sentence open letter, issued by the nonprofit Center for AI Safety, is both brief and ominous, without elaborating on how the more than 300 signatories foresee AI developing into an existential threat to humanity.
