Here's a glitch/bug/massive-fucking-privacy-risk associated with the new #Bing #chatbot feature that I haven't seen documented.

Answers from one query bleed over into another.

Earlier today I asked it about the black power salute that Tommie Smith and John Carlos gave at the 1968 Olympic Games.

Tonight I asked it about headlines referencing chatbot hallucinations.

Look what it did. It returned an answer to a previous query — utterly irrelevant to the one I just asked.

Amazing this is production technology.

@ct_bergstrom Lots of privacy problems with LLMs, but I'm having a hard time understanding why this is one of them given that anyone who knows how to click on the History button can get the same info.
