Comment by Will MacAskill
Philosopher; effective altruism cofounder; researcher at Forethought Centre for AI Strategy
Another possibility, if there's a large enough intelligence explosion, is that the first project to build AGI organically becomes a de facto world government. This possibility is worth taking seriously, given the stakes and the fact that an intelligence explosion is fairly likely. [...] If there's a large enough intelligence explosion, then AGI would quickly lead to superintelligence ("ASI"). This would give the project such a huge capabilities advantage over the rest of the world that — unless quickly checked by other actors — it could effectively gain a decisive strategic advantage sufficient to achieve complete world domination.
Unverified source (2026)
Policy proposals and claims