ChatGPT hallucinates fake but plausible scientific citations at a staggering rate, study finds

"MacDonald found that a total of 32.3% of the 300 citations generated by ChatGPT were hallucinated. Despite being fabricated, these hallucinated citations were constructed with elements that appeared legitimate — such as real authors who are recognized in their respective fields, properly formatted DOIs, and references to legitimate peer-reviewed journals."

@science @ai

attribution: Madhav-Malhotra-003, CC0, via Wikimedia Commons.



@science @ai

I really feel that it should read "ChatGPT FALSIFIES fake but plausible scientific citations at a staggering rate".

I really hate the use of the words "hallucinate", "hallucination", etc for this. We should be calling these falsehoods. That is, after all, exactly what they are.

But the AI industry doesn't like the negative connotation of "falsehoods", aka lies. They prefer the more whimsical sounding "hallucinations".

@arniepix @science @ai

For some, controlling perceptions may be more important than deciding nomenclature.

@bibliolater @science @ai somehow I think Perplexity does a much better job at referencing sources.

@bibliolater @science @ai I was reading that this morning - fascinating.

I've already caught it out several times - even to the point of connecting with authors via LinkedIn to see if the citations were real or not (they weren't). One to watch...

@bibliolater @science @ai yes, this has been my experience with it. ChatGPT hallucinated an edited book using my name; the thing was, the title and topic were quite plausible.

@historianess @science @ai

I suppose the difficulty is that because of the plausible nature of the 'hallucinations' they generate extra work in order to verify.
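This is exactly why the well-formed DOIs mentioned in the study are so insidious: a citation can pass every surface-level check and still be fabricated. As a minimal sketch (the regex is the heuristic Crossref recommends for modern DOIs; the function name is my own), a format check in Python looks like this — and a True result still tells you nothing about whether the DOI actually resolves:

```python
import re

# Heuristic pattern Crossref recommends for matching modern DOIs
# (assumption: covers the vast majority of DOIs registered since 2000).
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/[-._;()/:A-Za-z0-9]+$", re.IGNORECASE)

def looks_like_doi(s: str) -> bool:
    """Check only that a string is *formatted* like a DOI.

    Hallucinated citations routinely pass this check. Verifying that a
    DOI actually exists requires a lookup, e.g. resolving it at
    https://doi.org/ or querying the Crossref REST API.
    """
    return bool(DOI_PATTERN.match(s.strip()))
```

For example, `looks_like_doi("10.1000/xyz123")` returns True, but a fabricated DOI of the same shape would too — hence the extra verification labour the thread describes.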

@bibliolater "Hallucinated" citations (I prefer the non-anthropomorphic term "BS-generated") are relatively benign. The real toxic sludge is the citations that do in fact exist but have no substantive relation to the text they are supposed to back up. Impossible to spot without painstaking manual labor.

@DetersHenning It looks like, rather than assisting humans, the system is adding to our collective workloads.

Would be interested to see how well Scopus's AI product does in comparison. I haven't spent a lot of time with it, but have not yet seen false citations.

@science @ai

@NearerAndFarther @science @ai

Am I right in thinking it is behind a paywall and not accessible to the general public?

Short answer is yes. You at least need institutional Scopus access, but it does not seem to currently be an additional cost/subscription.

What's kind of weird is that it doesn't always show up. If you go to Scopus, right above the default search bar there is *sometimes* a button for "Scopus AI". It doesn't seem to be browser specific. I'm thinking they are doing a very slow rollout, but can't tell exactly what the conditions of access are.

Qoto Mastodon
