"You can't spell 'Gell-Mann Amnesia' without 'LLM'"

· gclv.es

via esham.io:

And yet I see people who should know better say things like, “I asked a conversational AI some questions that I knew the answers to, and its answers were well-written, but they had the kinds of subtle errors that could lead someone badly astray. But then I asked it some questions that I didn’t know the answers to, and it gave me really good, clear information! What a great learning tool!”

This is known as the Gell-Mann Amnesia effect, a term coined by Michael Crichton: you notice a publication's errors on topics you know well, then turn the page and forget that lesson, trusting it on topics you don't.
