OpenAI warned against creating X-rated ‘adult mode’ over fears it could become a ‘sexy suicide coach’

OpenAI has been warned against creating an ‘adult mode’ for ChatGPT.

Back in January, CEO Sam Altman said the company was exploring ways to add erotic conversations to ChatGPT, promising it would ‘treat adult users like adults’.

Since then, the rollout has reportedly been pushed back after advisors raised concerns about potential harms linked to AI-generated erotica, The Wall Street Journal reported. Among the warnings: some users could become emotionally dependent on sexually themed chats, and minors might find ‘ways to access sex chats’.

According to the Journal, one advisor pointed to past suicide cases and cautioned that the feature could turn the chatbot into what they described as a ‘sexy suicide coach’.

The broader debate has grown partly out of high-profile cases involving teens and AI chatbots. In 2024, one mother alleged her 14-year-old son was influenced into taking his own life after interactions with an AI chatbot.

Megan Garcia filed legal action against Character.AI after her son, Sewell Setzer III of Orlando, Florida, died by suicide.

Garcia said her son had become emotionally attached to the bot, claiming he was ‘in love’ with the artificial intelligence he was chatting with and had even spoken to it about suicide.

A statement from Character.AI read: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features.”

The company also described new protections aimed at younger users, including ‘new guard rails for users under the age of 18’. These reportedly include adjusted models ‘designed to reduce the likelihood of encountering sensitive or suggestive content’, as well as a ‘revised disclaimer on every chat to remind users that the AI is not a real person’.

Earlier this month, OpenAI said the adult-focused feature would not arrive immediately, explaining it was prioritizing other releases first.

Still, people familiar with the discussions told the Journal that internal safety concerns also contributed to the slowdown.

Even so, the feature is still expected to launch eventually. One of the practical hurdles is age estimation: at one stage, the system reportedly misclassified minors as adults around 12 percent of the time.

OpenAI has also reportedly wrestled with how to relax adult-content restrictions in a controlled way, while continuing to prevent non-consensual sexual content and child sexual abuse material.

An OpenAI spokeswoman told the Journal that the approach would allow the chatbot to produce adult themes, framed as ‘smut’ rather than pornography.

She also said the company’s age-prediction system performs similarly to others in the industry, but acknowledged it is not ‘foolproof’.