"We don't have to do this. We have human-competitive AI, and there's no need to build AI with which we can't compete. We can build amazing AI tools without building a successor species. The notion that AGI and superintelligence are inevitable is a choice masquerading as fate. By imposing some hard, global limits, we can keep AI's general capability to approximately human level while still reaping the benefits of computers' ability to process data in ways we cannot, and automate tasks none of us wants to do. [...] Humanity must choose to close the Gates to AGI and superintelligence. To keep the future human."

— Anthony Aguirre (2025)