ChatGPT hallucinates fake but plausible scientific citations at a staggering rate, study finds

"MacDonald found that a total of 32.3% of the 300 citations generated by ChatGPT were hallucinated. Despite being fabricated, these hallucinated citations were constructed with elements that appeared legitimate โ€” such as real authors who are recognized in their respective fields, properly formatted DOIs, and references to legitimate peer-reviewed journals."

psypost.org/chatgpt-hallucinat

@science @ai

attribution: Madhav-Malhotra-003, CC0, via Wikimedia Commons. Page URL: commons.wikimedia.org/wiki/Fil

@bibliolater @science @ai Yes, this has been my experience with it. ChatGPT hallucinated an edited book under my name; the thing was, the title and topic were quite plausible.


@historianess @science @ai

I suppose the difficulty is that, because the 'hallucinations' are so plausible, verifying them generates extra work.
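
At least for the DOIs, some of that verification can be automated by checking each one against Crossref's public REST API (api.crossref.org), which returns 404 for DOIs it has no record of. A minimal Python sketch, assuming the `requests` package is installed; the DOI below is a hypothetical placeholder, not a real reference:

```python
import requests

def lookup_doi(doi: str, timeout: float = 10.0) -> dict | None:
    """Query Crossref's public REST API; return the work's metadata, or None if unregistered."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=timeout)
    if resp.status_code != 200:
        return None  # DOI not registered with Crossref: a strong sign the citation is fabricated
    return resp.json()["message"]

# Existence alone is not proof: a hallucinated reference can reuse a real DOI
# for an invented title, so compare the returned title against the citation too.
doi = "10.1000/example.doi"  # hypothetical placeholder; substitute a DOI from the list under review
record = lookup_doi(doi)
if record is None:
    print(f"{doi}: not registered -> likely fabricated")
else:
    title = (record.get("title") or ["<no title>"])[0]
    print(f"{doi}: resolves to '{title}' -> check it against the cited title")
```

One caveat: Crossref only indexes DOIs it registers (most journal articles), so a miss there is suggestive rather than conclusive; DOIs from other registrars can be checked via DataCite or the general doi.org resolver instead.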
