ChatGPT hallucinates fake but plausible scientific citations at a staggering rate, study finds

"MacDonald found that a total of 32.3% of the 300 citations generated by ChatGPT were hallucinated. Despite being fabricated, these hallucinated citations were constructed with elements that appeared legitimate — such as real authors who are recognized in their respective fields, properly formatted DOIs, and references to legitimate peer-reviewed journals."

psypost.org/chatgpt-hallucinat

@science @ai

attribution: Madhav-Malhotra-003, CC0, via Wikimedia Commons. Page URL: commons.wikimedia.org/wiki/Fil

@bibliolater @science @ai Somehow I think Perplexity does a much better job of referencing sources.
