Eliezer Yudkowsky

Info
AI researcher and writer
X: @ESYudkowsky · Wikipedia
Location: United States
  • Should we have a universal basic income?
    I'm skeptical that Universal Basic Income can get rid of grinding poverty, since somehow humanity's 100-fold productivity increase (since the days of agriculture) didn't eliminate poverty. (source)
  • Will AGI create abundance?
    Eliezer Yudkowsky strongly disagrees and says:
    Many researchers steeped in these issues, including myself, expect that the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die. Not as in “maybe possibly some remote chance,” but as in “that is the obvious thing that would happen.” It’s not that you can’t, in principle, survive creating something much smarter than you; it’s that it would require precision and preparation and new scientific insights, and probably not having AI systems composed of giant inscrutable arrays of fractional numbers. We are not prepared. We are not on course to be prepared in any reasonable time window. There is no plan. [...] If we actually do this, we are all going to die. (2023) (source)