Philosopher; 'Superintelligence' author; FHI founder
Superintelligence could become extremely powerful and be able to shape the future according to its preferences. If humanity had been sane and had its act together globally, the sensible course of action would be to postpone development of superintelligence until we figured out how to do so safely. And then maybe wait another generation or two just to make sure that we hadn't overlooked some flaw in our reasoning. Unfortunately, we do not have the ability to pause. (2014)