Comment by Eliezer Yudkowsky

"[T]he most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die." — from his TIME op-ed (2023)