Hey y'all, I know you know this, but while you definitely shouldn't use GPTs for legal research, also don't rely on GPTs for RESEARCH, PERIOD.

They are neither giving nor TRYING to give you intersubjectively associated and derived facts; they are not even remixing factual CONCEPTS into new forms.

They are modelling human biases out into digestible bullshit with a statistically determined high probability of being swallowed.

That is all.

They don't have to be this way, but, at present, the people making them have no incentive to change them. So. Don't lean on them for fact stuff. It's not what they do.


@Wolven it’s just so wrong on so many levels! Who could really think using an LLM trained on the internet (the universal source of misinformation and bs) could help them write a scientific paper with actual facts?

That, and it hands OpenAI all your unpublished results for whatever they do with their user input/interaction data, a practice that is totally opaque and that many countries have raised concerns about.

Qoto Mastodon