Comment by Rodney Brooks

Roboticist; former MIT CSAIL director
My own opinion is that of course this is possible in principle. I would never have started working on Artificial Intelligence if I did not believe that. [...] Even if it is possible, I personally think we are far, far further away from understanding how to build AGI than many other pundits might say.

B. The Singularity. This refers to the idea that eventually an AI-based intelligent entity, with goals and purposes, will be better at AI research than we humans are. Then, with an unending Moore's law mixed in making computers faster and faster, Artificial Intelligence will take off by itself, and, as in speculative physics going through the singularity of a black hole, we have no idea what things will be like on the other side. [...] Even if there is a lot of computer power around, it does not mean we are close to having programs that can do research in Artificial Intelligence and rewrite their own code to get better and better.