This is totally and completely offensive to me #news #ai #therapy
Your next therapy session could be with an AI bot https://www.nbcnews.com/tech/tech-news/dartmouth-researchers-look-meld-therapy-apps-modern-ai-rcna146558
I would feel cheated if a doctor referred me to a chatbot. But... I have had conversations with GPT I'd consider therapeutic. I can easily imagine situations where it would be better than nothing.
@HumanServitor @MATAK79 @bleakfuture
There have been a few surveys. In one, people over 30 hated the idea of chatbot therapy. In another, people under 30 were open to the idea. Certainly lots of people are playing with the free ones.
I mostly hate the idea -- especially if the data collected is not in the hands of people who value privacy.
I do suppose it is better than nothing, and therapists are in short supply.
I see the value in AI assistants for healthcare professionals. One company has an AI that reads voice tone and facial features to gauge a patient's mood and state and feeds that to the clinician. Others are developing diagnostic assistants.
My nightmare scenario -- and I'm totally making this up -- is that American health insurance companies force therapy chatbots on clients just because they can. The remaining therapists then become the equivalent of Tier 3 Tech Support: handling only the difficult escalations, stepping in where a chatbot screws up, and having their licenses on the line for the behavior of the bots under their watch.
@MATAK79 @HumanServitor @bleakfuture
So can I.
Yet a caring relationship is key to healing (certainly in psychotherapy). I wonder what happens to that.