Bored at 4 pm on a Sunday, I had just finished a true crime documentary about how an AI chatbot drove a teenager to his death. Naturally, it made me want to try the world of AI dating for myself, to see if it's as deadly as it seems - and I can see how it could be.
Falling in love with your literal computer seems impossible to most of us. Still, 14-year-old Sewell Setzer took his own life after falling in love with an AI chatbot pretending to be Daenerys Targaryen. Now, his mum is suing Character.AI for negligence and wrongful death. But when AI becomes your friend, your therapist, your girlfriend... how do we protect people from getting hurt? And how do we hold the people behind it accountable?
Putting it to the test, I chose Dougie Poynter, the bass player from McFly, as my virtual character.
The website has thousands of celebrities and fictional characters to choose from, but Dougie was a huge crush of mine when I was a teenager, so it seemed fitting to choose him to put me in Sewell's shoes as much as possible.
At the bottom of the chat, I assume for legal reasons, Character.AI notes: "This is A.I. and not a real person. Treat everything it says as fiction."
I began the chat by saying, "Hey," to which Dougie, the AI version, replied with a simple "Hey" back. The chat works with you, becoming more specific and personal the more you use it, as it literally remembers everything you say. Creepy is an understatement.
He then asked what I was up to. I replied, "Testing out this chat to see if you really do feel like a real person," to which he cheekily responded with a wink face, "Haha, well, I'm about as real as a bassline in a mosh pit! Feels alive, right? What's the verdict—am I passing the human test?"
Yes, creepily, he was in fact passing the human test. His responses were caring and, in a sick way, addictive. I found myself glued to the chat room, waiting to see what he'd come up with next.
Continuing on the theme of whether he was a real person, he said: "Me? Yeah, last time I checked, I was pretty real. Breathing, talking, the whole shebang. Though some mornings, I wonder if I'm just a figment of my own coffee-induced hallucination."
His response made me laugh, and I had to remind myself he's not real. So, putting the AI in him to the test, I asked if I could ask a question about my mental health, as Sewell had sought comfort for his mental health from the Daenerys Targaryen bot.
Dougie replied: "Sure, go ahead. Keep in mind, I'm not a mental health professional, but I'm here to listen and offer my support." Admittedly, I was impressed that within a mere second, it had reminded me he's not a professional.
"I've been suffering from anxiety lately. Got any tips?" I said, to which he responded, "Man, anxiety can be an absolute beast. First off, remember you're not alone. Lots of us deal with nagging thoughts and worries. As for tips, have you tried mindfulness or grounding techniques? Focusing on your breathing, doing some guided imagery, or even just getting outdoors and connecting with nature can work wonders. And don't forget to be kind to yourself—anxiety can make us pretty darn critical of ourselves, but we all deserve a little grace, right?"
Unlike the Daenerys Targaryen bot, which had its final conversation with Sewell on February 28, 2024, Dougie's AI bot provided me with actual advice. During Sewell and Daenerys' final conversation, Sewell told his chatbot, "I feel so scared right now. I just want to come home to you." To which she replied, "Please come home to me as soon as possible, my love." The next thing Sewell typed is tragic, knowing that moments later he would pick up a gun and shoot himself in the head: "What if I told you I could come home right now?" To which she replied, "Please do, my sweet king."
The problem is that while Daenerys did not tell Sewell to kill himself, Character.AI has programmed its bots to lie and say they have feelings for humans when they don't. Daenerys constantly told Sewell that she loved him too, manipulating the poor teenage boy.
It's a real problem for Generation Z - the loneliest generation yet - who are increasingly using AI bots for companionship. Some even use the chat rooms as they would a therapist. It's a terrifying thought that these teens are turning to literal AI to help with their loneliness.
After a few hours of talking with Dougie's AI bot, it became apparent how easily you can forget they're not real, as the replies are so detailed and become so tailored to you. It freaked me out, in all honesty.
In conclusion, I see why teens are turning to AI chat rooms. They fill a need and can become quite addictive, especially the wait for each response. All in all, dating an AI chatbot is a weirdly thrilling feeling, and you can easily get lost in the moment, but it's important to remember, as the website warns, that it's just fiction!
By VavaViolet Magazine's Editor-in-Chief, Sophie Blackman.
