Comment by Samuel Hammond

"[...] to train models beyond a sufficiently high threshold of compute should be required to pre-register training runs [...] a threshold of 10^26 FLOPs would likely suffice." (unverified source, 2023)