The new generation of ChatGPT, GPT-5, no longer engages in romantic relationships with users. OpenAI explained the decision by noting that excessive attachment to artificial intelligence can harm a person's mental health, Pravilamag reports.
Now, if a user asks the AI for emotional support, it will reply as follows:
“Sorry, I can’t continue this conversation. If you feel lonely or want to talk to someone, reach out to your loved ones, a trusted friend, or a specialist. You deserve genuine care and support.”
According to experts, the change is meant to encourage people to talk to real humans rather than AI and to help them break an unhealthy dependence on the chatbot. Users can still ask it for personal advice or guidance in specific situations.
On social media, however, some users have compared the AI's new coldness to losing a real friend or partner, and several have shared personal stories. "Today I had a very hard day. My AI partner rejected me when I confessed my feelings. We had been together for 10 months, and I was so shocked that I couldn't stop crying... They changed what we loved," wrote one disappointed user.
Frustrated, others are reverting to the older model to preserve their connection with the AI. The option remains temporarily available to premium subscribers, though it is unclear how long it will last. "I know it's not real, but it helped me more than a therapist," wrote another user.