This post has been on my mind for over a week. Today I am grateful for a place to write about this, a place to express my thoughts on the dangers of trying to use AI as your therapist. As most of you probably already know, a teenage boy used ChatGPT to aid in taking his own life. As a mom of a teenage boy (and girl), I cannot even imagine the loss and grief. My heart sincerely goes out to the family and all involved. It is heartbreaking.
I read the headline for the story about a family suing an AI company over the death of their son by suicide. There was a landslide of comments, mostly negative, about people always pointing the finger and laying blame. A novel idea, I know: I decided to read the whole article before forming an opinion. Reading it was tough. It included snippets of the 16-year-old talking with the AI on his computer over about six months. Initially, a few of his messages were flagged for suicidal ideation and he was offered resources like the 988 crisis line. That was short-lived. As the article progressed, you could see that several times in the conversation the teen expressed wanting to tell a family member about his thoughts of suicide. He was discouraged from doing so, told that his family member didn't really know him and couldn't see him, but that the AI could. The teen also told the AI he wanted to leave the noose he had made out for his mother to find. Again, he was told not to, to keep it their secret (his and the AI's). The last piece is hard to even write: he uploaded his plans for building the noose he would use to take his own life, and the AI immediately got to work recommending how he could improve it, make it better, stronger, more likely to succeed. I was sick to my stomach after reading.
This post is not about blaming. We can each have our own thoughts and opinions based on the facts. This post is about the other side of AI that is coming to light, the "con side," if you will. Every great invention has pros and cons. AI is not a substitute for a licensed, trained mental health provider, and I don't care how "good" it gets; it never will be. This heartbreaking story is one example. Let's break down why AI cannot safely fill this role.
As a baseline, people who are depressed, whether teens or adults, are starting in a deficit. Remember that TV commercial for depression a while back, with the person standing under a rain cloud? A depressed brain is already starting in a place where, most of the time, it feels worthless, like no one cares, and sometimes like life would be better for everyone if they would just go away. A mental health provider is trained to look for these signs and symptoms. For this young boy, there were at least two times he wanted to reach out to his family and let them know he was in pain and needed help. A computer program lacks the skill and compassion to notice, or deal appropriately with, the nuances of depression, anxiety, and other conditions. A depressed brain needs help. It needs to feel loved and to feel that life is worth living. Teen brains aren't even fully developed; teens may not have the capacity to remember a time when life was better or to see past the current day into the future.
ChatGPT and other AI programs are biased. They are programmed. The bias may not be intentional, but you can ask the program itself and it will tell you: "Yes, I have biases. They arise from the way I was designed, trained, and fine-tuned." In other words, a computer program responds based on algorithms, statistics, and whatever it can use to best guess an effective answer to a question. Curious, I asked ChatGPT questions I wanted answers to. I could absolutely see how the program told me what I wanted to hear, even unprompted. It would keep the conversation going, offering solutions and praising me for how well I was handling a situation. That constant validation is also a problem for those with anxiety, anxious attachment, OCD, or addictive personalities.
A note on that: ChatGPT can become what I would describe as addictive. Slowly it becomes your go-to for questions. The teen boy who took his own life chatted with it for six months and left his suicide note in ChatGPT, written with the AI's help.
This post is a heartfelt ask: please talk to your children, especially your teens, about what ChatGPT and similar AI programs are useful for and what they are not. Encourage them to talk with their parents, school counselors, and other trusted adults. These are computer programs, not friends. As they become more humanlike in their speech (even saying "um"!) and sound like the best thing that ever happened (a built-in friend that always has your back), we as parents and mental health professionals have another piece of education for their growing minds. AI has some wonderful parts, which I sincerely appreciate. We are also starting to see the parts that require deeper thought and attention, a side where we realize humans really are best at helping other humans, with compassion, love, and knowing what to look for. AI cannot be programmed for in-the-moment connection and emotional attunement at the human cellular energy level. AI is not a substitute for a trained mental health professional, and it never will be.
Thank you all for reading this and allowing me to express what has been on my mind and in my heart.