OpenAI is introducing new measures to strengthen the safety of teenage ChatGPT users. From now on, parents will be able to receive special notifications if their child is in a state of “acute emotional distress.”
The company announced the decision following a lawsuit filed by a U.S. family, who claimed ChatGPT had encouraged their 16-year-old son to take his own life. In response, OpenAI stated that it would roll out "enhanced protections" for teens within a month.
The new mechanism will allow parents to link their account with their child's, disable certain features (such as memory and chat history), and receive notifications if the system detects signs of "acute distress." OpenAI emphasized that the process is being developed in collaboration with experts in psychology and adolescent development.
According to ChatGPT’s usage rules, users must be at least 13 years old. Teenagers under 18 must have parental consent to register. Experts advise parents to monitor their children’s online activity and follow safe usage guidelines.