Anthony Aguirre
Physicist; cofounder of the Future of Life Institute
Tags: ai (2), ai-governance (2), ai-regulation (2), ai-safety (2), existential-risk (2), ai-ethics (1), ai-policy (1), ai-risk (1), democracy (1), future (1)
Should humanity build artificial general intelligence?
Anthony Aguirre strongly disagrees and says:
We don't have to do this. We have human-competitive AI, and there's no need to build AI with which we can't compete. We can build amazing AI tools without building a successor species. The notion that AGI and superintelligence are inevitable is a choice masquerading as fate. By imposing some hard, global limits, we can keep AI's general capability to approximately human level while still reaping the benefits of computers' ability to process data in ways we cannot, and automate tasks none of us wants to do. [...] Humanity must choose to close the Gates to AGI and superintelligence. To keep the future human. (2025) source Unverified
Should humanity ban the development of superintelligence until there is broad scientific consensus that it will be done safely and controllably, and strong public buy-in?
Anthony Aguirre strongly agrees and says:
Time is running out. The only thing likely to stop AI companies barreling toward superintelligence is for there to be widespread realization among society at all its levels that this is not actually what we want. That means building public will and scientific clarity first, and only then moving ahead on anything that would concentrate world-altering power in a machine. This isn't a casual slowdown; it's an affirmative choice about what kind of future we are consenting to. If there isn't strong public buy-in and broad scientific consensus that it can be done safely and controllably, then pressing on would be reckless engineering and reckless governance. The right move is to hold off until those conditions are met, and to make creating those conditions the priority. (2025) source Unverified