Sam Altman Issues Chilling Warning: ChatGPT Is Not A Therapist – Your Deepest Secrets Could Be Used Against You In Court With No Legal Protection

Ezza Ijaz

Users increasingly rely on AI tools to help with their daily workload, and many now turn to ChatGPT for medical, personal, and even professional advice. The chatbot has become a kind of personal assistant, the go-to for everyday problems, which often leads to over-dependence. While seeking therapy from the platform might seem harmless, there is no guarantee that the information shared will remain under wraps, unlike professional help, where confidentiality is maintained. The issue came into sharper focus after Sam Altman, the CEO of OpenAI, warned against relying too heavily on the AI assistant, especially with deeply personal information.

Sam Altman cautions that ChatGPT does not offer therapist-client confidentiality

As AI tools gain more capabilities and a better grasp of emotional context, many people have started relying on chatbots for therapy or emotional support. Unlike traditional therapy, which is protected by doctor-patient confidentiality, AI conversations have no legal framework to safeguard sensitive disclosures. Sam Altman underscored this point himself, sharing his concerns about confiding deeply personal matters in ChatGPT during an appearance on the This Past Weekend w/ Theo Von podcast, as reported by TechCrunch.


During the conversation, Altman acknowledged that because AI tools now display more emotional understanding and can engage in supportive dialogue, they create a sense of privacy. Users should nevertheless not rely on them for therapy or emotional support, because the chatbot does not operate under the same protections as professional mental health care, and until proper regulations exist, AI should not be treated as a substitute for therapy. Voicing his apprehensions about how the chatbot is used, he said:

People talk about the most personal sh** in their lives to ChatGPT. People use it — young people, especially, use it — as a therapist and a life coach; they are having these relationship problems and [asking], ‘What should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT.

Since conversations with these AI tools carry no legal confidentiality, Altman advises caution and warns of the serious consequences users could face in a legal dispute. If, for instance, a person becomes involved in litigation and OpenAI is ordered to produce their conversations, the company would have no legal privilege to invoke and could be compelled to hand over the deeply personal information shared. Altman added that conversations with AI should, in fact, enjoy the same privacy protections, but the technology has evolved so quickly that legal safeguards have not kept pace.
