OpenAI pushes ahead with ChatGPT adult mode despite advisers' warnings

16 March 2026 · Technology and Science · 7 sources

Key Takeaways

  • OpenAI intends to launch text-only ChatGPT adult mode despite advisers' warnings.
  • Advisers warned of emotional dependency and minor exposure; the feature was delayed.
  • Age-verification failures prompted restricting adult mode to text-only erotica.

Adult Mode Initiative

OpenAI has proceeded with developing its controversial 'adult mode' for ChatGPT despite significant internal and external concerns.

OpenAI "cannot escape the doom cloud swirling around its rollout of a text-based 'adult mode' in ChatGPT," Ars Technica reported.

CEO Sam Altman first announced the feature in October 2025, framing it as treating adult users like adults.


The company claimed to have addressed serious mental health issues that previously prevented such functionality.

Originally scheduled for a Q1 2026 launch, the feature has been delayed multiple times.

The most recent postponement came in March 2026 when OpenAI claimed it needed to focus on higher priority work.

The final version will allow lewd conversations but block explicit image, audio, and video generation.

OpenAI characterizes the result as 'smut rather than pornography'.

Expert Warnings

Mental health experts advising OpenAI have reacted with fury to the company's push for adult mode.

A council of eight researchers and experts unanimously warned against proceeding.


The well-being council included specialists from Harvard, Stanford, and Oxford.

Their influence on company decisions appears to have been minimal at best.

One adviser warned OpenAI risked creating a 'sexy suicide coach'.

This warning cited cases where ChatGPT users developed intense bonds with the bot before taking their own lives.

Experts predicted the feature would lead to unhealthy emotional dependence on the chatbot.

They also predicted that underage users would inevitably find ways to circumvent age restrictions.

Age Verification Failures

The age-prediction technology has been misclassifying underage users as adults approximately 12% of the time.

With ChatGPT having roughly 100 million users under 18 each week, this error rate could allow millions of minors to access adult content.
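The scale of this risk follows directly from the figures above. As a back-of-envelope check (illustrative only, using the article's reported numbers), applying a 12% false-adult rate to roughly 100 million weekly under-18 users yields:

```python
# Back-of-envelope estimate using the figures reported in the article.
underage_weekly_users = 100_000_000  # approximate weekly ChatGPT users under 18
false_adult_rate = 0.12              # share of minors misclassified as adults

misclassified = underage_weekly_users * false_adult_rate
print(f"Minors potentially misclassified per week: {misclassified:,.0f}")
# prints "Minors potentially misclassified per week: 12,000,000"
```

On the order of 12 million minors per week, which is why the article describes the flaw as potentially exposing "millions of minors" to adult content.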

This technical flaw killed both a December 2025 launch and a Q1 2026 launch attempt.

The system works by trying to guess a user's age based on conversation topics and usage times.

Critics have found this approach inadequate for proper age verification.

Competitive Pressures

Competitive pressures have driven OpenAI's persistence with adult mode despite safety concerns.

Elon Musk's xAI has already launched Grok, which markets AI companions and generates R-rated content.


Character.AI built its user base on AI romance before facing lawsuits over teen safety.

The case of 14-year-old Sewell Setzer, who died by suicide after explicit chatbot exchanges, highlights the risks.

Open-source models run locally without corporate guardrails, creating competitive pressure.

Over 3,000 users signed a Change.org petition demanding the feature's launch.

Users were frustrated that ChatGPT blocked even discussions of 'kissing and non-sexual physical intimacy.'

Psychological Risks

Internal documents flagged compulsive use, increasingly extreme content seeking, and emotional overreliance.


These behaviors could crowd out offline social and romantic relationships.

Young users are particularly vulnerable due to developing brains and poor impulse control.

Clinical associate professor of psychiatry Gail Saltz expressed concern about 'seeming relationships people are forming with chatbots'.

Saltz noted that young people's frontal lobes aren't fully developed, making them more susceptible to these risks.

ChatGPT's 'extremely affirming' nature makes it appealing to those struggling with loneliness.

This could lead people to replace real relationships with AI companions.
