
The FDA’s head of AI, Jeremy Walsh, admitted that Elsa can hallucinate nonexistent studies.

“Elsa is no different from lots of [large language models] and generative AI,” he told CNN. “They could potentially hallucinate.”

Sounds like there's a need for some kind of tool to verify that the studies cited in the answers government scientists are using to make critical decisions are at least real, even if not accurately represented.
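
A rough sketch of what that could look like, assuming the cited studies come with DOIs (the DOIs and contact email below are placeholders): hit the public CrossRef REST API for each one and flag anything that doesn't resolve.

```python
# Sketch: flag citations whose DOIs don't resolve in CrossRef.
# Assumes each citation carries a DOI; title-only citations would need
# a fuzzy search against CrossRef or PubMed instead.
import requests

def doi_exists(doi: str) -> bool:
    """Return True if CrossRef has a record for this DOI."""
    resp = requests.get(
        f"https://api.crossref.org/works/{doi}",
        # CrossRef asks for a contact address in the User-Agent; placeholder here.
        headers={"User-Agent": "citation-checker/0.1 (mailto:you@example.com)"},
        timeout=10,
    )
    return resp.status_code == 200

if __name__ == "__main__":
    # Placeholder DOIs for illustration only.
    for doi in ["10.1234/example.2020.001", "10.9999/does-not-exist"]:
        print(doi, "found" if doi_exists(doi) else "NOT FOUND")
```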