Masayoshi Son
SoftBank founder and CEO
Tags: ai, ai-ethics, ai-governance, ai-policy, ai-regulation, ai-risk, ai-safety, existential-risk, future
Should humanity build artificial general intelligence?
Masayoshi Son agrees and says:
AGI by definition is at the same level as a human brain; that is AGI, artificial general intelligence. But people have different points of view on the definition of artificial superintelligence. How super? You know, ten times super? A hundred times super? My definition of ASI is 10,000 times smarter than the human brain. That is my definition of ASI, and that is coming in 2035, ten years from today: 10,000 times smarter. That's my prediction. Both, both. We should be looking forward to that. Of course, we also have to be careful; we have to regulate. If such a superpower comes and there is no regulation, it could be super dangerous. (2024) source Unverified