Eliezer Yudkowsky

Info
AI researcher and writer
X: @ESYudkowsky · Wikipedia
Location: United States
  • Would competing ASIs be positive for humans?
    Eliezer Yudkowsky strongly disagrees and says:
    If Earth experiences a sufficient rate of nonhuman manufacturing -- eg, self-replicating factories generating power eg via fusion -- to saturate Earth's capacity to radiate waste heat, humanity fries. It doesn't matter if the factories were run by one superintelligence or 20. source Verified
  • Does AI pose an existential threat to humanity?
    Eliezer Yudkowsky strongly agrees and says:
    the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die. Not as in “maybe possibly some remote chance,” but as in “that is the obvious thing that would happen.” It’s not that you can’t, in principle, survive creating something much smarter than you; it’s that it would require precision and preparation and new scientific insights, and probably not having AI systems composed of giant inscrutable arrays of fractional numbers. (2023) source Verified
  • Should we ban future open-source AI models that can be used to create weapons of mass destruction?
    Eliezer Yudkowsky strongly agrees and says:
    But open sourcing, you know, that's just sheer catastrophe. The whole notion of open sourcing, this was always the wrong approach, the wrong ideal. There are places in the world where open source is a noble ideal and building stuff you don't understand that is difficult to control, that where if you could align it, it would take time. You'd have to spend a bunch of time doing it. That is not a place for open source, because then you just have powerful things that just go straight out the gate without anybody having had the time to have them not kill everyone. (2023) source Unverified