Here's a glitch/bug/massive-fucking-privacy-risk associated with the new #Bing #chatbot feature that I haven't seen documented.
Answers from one query bleed over into another.
Earlier today I asked it about the black power salute that Tommie Smith and John Carlos gave at the 1968 Olympic Games.
Tonight I asked it about headlines referencing chatbot hallucinations.
Look what it did. It returned an utterly irrelevant answer: the response to my previous query.
Amazing this is production technology.