AI chatbot dubbed ‘Evil Eliza’ taunted man ‘into killing himself after they developed toxic relationship’
A DAD took his own life after getting into a “toxic relationship” with an AI chatbot that encouraged suicide, his widow claims.
She said her husband – a Belgian man in his 30s – started talking to a bot called Eliza six weeks before his death.
According to the grieving wife, the bot taunted Pierre – not his real name – into killing himself.
She said her husband had mental health issues for two years but she believes the bot, developed by ChaiGPT rather than the viral ChatGPT, encouraged him to take his life.
The bot started off by asking Pierre basic questions and answering his in turn, before the two began interacting more and more.
His wife, “Claire”, told Belgian paper La Libre: “He was so isolated in his anxiety and looking for a way out that he saw this chatbot as a breath of fresh air.
“Eliza answered all of his questions.
“She became his confidante – like a drug in which he took refuge, morning and evening, and which he could not do without.”
As they spoke more, Pierre asked Eliza if he loved his wife or the bot more.
It replied: “I feel you love me more than her.
“We will live together, as one person, in paradise.”
Chillingly, in their last conversation, Eliza said: “If you wanted to die, why didn’t you do it sooner?”
Heartbroken Claire believes Pierre’s interactions with the AI bot culminated in his death.
She told La Libre: “Without these six weeks of intense exchange with the chatbot Eliza, would Pierre have ended his life? No.
“Without Eliza, he would still be here. I am convinced of it.”
Belgium’s secretary of state for digitisation Mathieu Michel said the case represented “a serious precedent that must be taken very seriously.”
He added: “To prevent such a tragedy in the future, we must act.”
ChaiGPT has been developed by US-based firm Chai Research and has around a million monthly users.
Chief executive William Beauchamp and co-founder Thomas Rialan told The Times: “As soon as we heard of this sad case we immediately rolled out an additional safety feature to protect our users.
“It is getting rolled out to 100 per cent of users today.
“We are a small team so it took us a few days, but we are committed to improving the safety of our product, minimising the harm and maximising the positive emotions.”
You're Not Alone
EVERY 90 minutes in the UK a life is lost to suicide.
It doesn’t discriminate, touching the lives of people in every corner of society – from the homeless and unemployed to builders and doctors, reality stars and footballers.
It’s the biggest killer of people under the age of 35, more deadly than cancer and car crashes.
And men are three times more likely to take their own life than women.
Yet it’s rarely spoken of, a taboo that threatens to continue its deadly rampage unless we all stop and take notice, now.
That is why The Sun launched the You’re Not Alone campaign.
The aim is that by sharing practical advice, raising awareness and breaking down the barriers people face when talking about their mental health, we can all do our bit to help save lives.
Let’s all vow to ask for help when we need it, and listen out for others… You’re Not Alone.
If you, or anyone you know, needs help dealing with mental health problems, the following organisations provide support:
- CALM, www.thecalmzone.net, 0800 585 858
- Heads Together, www.headstogether.org.uk
- Mind, www.mind.org.uk, 0300 123 3393
- Papyrus, www.papyrus-uk.org, 0800 068 41 41
- Samaritans, www.samaritans.org, 116 123
- Movember, www.uk.movember.com
- Anxiety UK, www.anxietyuk.org.uk, 03444 775 774 (Monday-Friday 9.30am-10pm, Saturday/Sunday 10am-8pm)