ChatGPT hallucinates fake but plausible scientific citations at a staggering rate, study finds
"MacDonald found that a total of 32.3% of the 300 citations generated by ChatGPT were hallucinated. Despite being fabricated, these hallucinated citations were constructed with elements that appeared legitimate — such as real authors who are recognized in their respective fields, properly formatted DOIs, and references to legitimate peer-reviewed journals."
#AI #ArtificialIntelligence #ChatGPT #LLM #Science #Research #DOI #Fake #Citations #Academia #Academic #Academics @science @ai
#Image attribution: Madhav-Malhotra-003, CC0, via Wikimedia Commons. Page URL: https://commons.wikimedia.org/wiki/File:Artificial_Intelligence_Word_Cloud.png
@bibliolater @science @ai somehow I think Perplexity does a much better job at referencing sources.
@bibliolater @science @ai I was reading that this morning - fascinating.
I've already caught it out several times - even to the point of connecting with authors via LinkedIn to see if the citations were real or not (they weren't). One to watch...
Sounds like a lot of extra work!
@bibliolater @science @ai yes, this has been my experience with it. ChatGPT hallucinated an edited book using my name--the thing was, the title and topic were quite plausible.
I suppose the difficulty is that, because the 'hallucinations' are so plausible, they generate extra work to verify.
@bibliolater "Hallucinated" (I prefer the non-anthropomorphic term BS-generated) citations are relatively benign. The real toxic sludge is the citations that in fact exist but have no substantive relation to the text they are supposed to back up. Impossible to spot without painstaking manual labor.
@DetersHenning It looks like, rather than assisting humans, the system is adding to our collective workloads.
@bibliolater
Would be interested to see how well Scopus's AI product does in comparison. I haven't spent a lot of time with it, but have not yet seen false citations.
@NearerAndFarther @science @ai
Am I right in thinking it is behind a paywall and not accessible to the general public?
@bibliolater
Short answer is yes. You at least need institutional Scopus access, but it does not seem to currently be an additional cost/subscription.
What's kind of weird is that it doesn't always show up. If you go to Scopus, right above the default search bar there is *sometimes* a button for "Scopus AI". It doesn't seem to be browser specific. I'm thinking they are doing a very slow rollout, but can't tell exactly what the conditions of access are.
@NearerAndFarther I found a situation similar to what you described.
@bibliolater
@science @ai
I really feel that should read:
"ChatGPT FALSIFIES fake but plausible scientific citations at a staggering rate"
I really hate the use of the words "hallucinate", "hallucination", etc for this. We should be calling these falsehoods. That is, after all, exactly what they are.
But the AI industry doesn't like the negative connotation of "falsehoods", aka lies. They prefer the more whimsical sounding "hallucinations".