We know that ChatGPT can hallucinate content.
But it’s not just hallucinating the information it’s providing… it’s hallucinating the citations for that information.
I’d been writing training scripts that teach educators how to better support at-risk students when the client let me know they’d be using AI to read the voiceover.
“Oh no,” I thought. “That’s going to be a problem.”