Gary Marcus, Professor of Psychology and Neural Science
OpenAI has also said, and I agree, “it’s important that efforts like ours submit to independent audits before releasing new systems”, but to my knowledge they have not yet submitted to such audits. They have also said “at some point, it may be important to get independent review before starting to train future systems”. But again, they have not submitted to any such advance reviews so far. We have to stop letting them set all the rules.

AI is moving incredibly fast, with lots of potential, but also lots of risks. We obviously need government involved. We need the tech companies involved, big and small. But we also need independent scientists. Not just so that we scientists can have a voice, but so that we can participate, directly, in addressing the problems and evaluating solutions. And not just after products are released, but before.

We need tight collaboration between independent scientists and governments in order to hold the companies’ feet to the fire. Allowing independent scientists access to these systems before they are widely released, as part of a clinical-trial-like safety evaluation, is a vital first step. (2023)