ChatGPT hallucinates fake but plausible scientific citations at a staggering rate, study finds

"MacDonald found that a total of 32.3% of the 300 citations generated by ChatGPT were hallucinated. Despite being fabricated, these hallucinated citations were constructed with elements that appeared legitimate — such as real authors who are recognized in their respective fields, properly formatted DOIs, and references to legitimate peer-reviewed journals."

psypost.org/chatgpt-hallucinat

@science @ai

attribution: Madhav-Malhotra-003, CC0, via Wikimedia Commons. Page URL: commons.wikimedia.org/wiki/Fil

@bibliolater
I'd be interested to see how well Scopus's AI product does in comparison. I haven't spent much time with it, but I have not yet seen it produce false citations.

@science @ai

@NearerAndFarther @science @ai

Am I right in thinking it is behind a paywall and not accessible to the general public?

@bibliolater
Short answer is yes. You at least need institutional Scopus access, but it does not currently seem to require an additional cost or subscription.

What's kind of weird is that it doesn't always show up. If you go to Scopus, right above the default search bar there is *sometimes* a button for "Scopus AI". It doesn't seem to be browser-specific. I'm thinking they are doing a very slow rollout, but I can't tell exactly what the conditions of access are.


@NearerAndFarther I found a situation similar to what you described.
