
Several families in the United States and Canada have filed lawsuits against OpenAI over tragedies linked to conversations with ChatGPT, The Wall Street Journal reported.
The plaintiffs claim that the mental health of some users of the artificial intelligence service deteriorated, and in some cases, they took their own lives. A total of seven lawsuits have been filed.
According to the reports, four people, including one minor, died by suicide, while three others suffered severe mental health deterioration after prolonged conversations with the chatbot.
The plaintiffs criticize ChatGPT for failing to detect situations that pose risks to mental health during conversations and are demanding that OpenAI strengthen its safety algorithms.
The company described the situation as “deeply tragic” and stated that, since October 2024, it has enhanced safety protocols aimed at the early detection of mental health–related risks.