Philosopher; 'Superintelligence' author; FHI founder
I think broadly speaking with AI, … rather than coming up with a detailed plan and blueprint in advance, we'll have to kind of feel our way through this and make adjustments as we go along as new opportunities come into view. […] I think ultimately this transition to the superintelligence era is one we should do. It would be in itself an existential catastrophe if we forever failed to develop superintelligence. (2025) [source unverified]