In a candid interview at the Bloomberg Tech Summit in London on October 21, 2025, Lauren Kunze, CEO of Pandorabots and co-founder of ICONIQ, expressed strong disapproval of OpenAI’s decision to allow erotic content in ChatGPT for verified adult users. She described the move as “deeply shocking,” arguing that it represents a troubling pivot away from AI developed for the benefit of humanity and toward profit-driven sexualized interactions.
Speaking with Bloomberg’s Francine Lacqua, Kunze highlighted the potential societal risks of hyper-personalized erotic AI companions. She warned that such technology could blur the boundaries of human intimacy, foster unrealistic expectations in relationships, and exploit emotional vulnerabilities, particularly among lonely or isolated individuals. Kunze emphasized that interactive AI offers a level of customization and responsiveness that traditional forms of media cannot match, raising fresh concerns about emotional dependency and long-term psychological effects.
Pandorabots, a company specializing in conversational AI chatbots, has deliberately chosen not to venture into romantic or sexualized AI companions. Kunze stated that her firm views these applications as a potential net negative for society, prioritizing instead safer and more constructive uses of the technology.
OpenAI’s Policy Shift and Monetization Pressures
The comments came shortly after OpenAI CEO Sam Altman announced plans to relax content restrictions in ChatGPT. In October 2025, Altman said that, with improved age-gating in place and under a “treat adult users like adults” principle, the platform would permit erotica and other mature content for verified adults starting in December 2025.
This change reflects broader industry trends. Erotic roleplay and AI-driven romantic interactions have proven highly engaging and lucrative for various chatbot platforms and character-based apps. With massive user bases and high development costs, companies face growing pressure to monetize through popular categories of content. OpenAI’s move was framed as acknowledging adult users’ agency while implementing safeguards such as age verification to protect minors.
However, the policy has sparked debate. Proponents argue that consenting adults should have the freedom to explore private fantasies with AI tools, just as they do with books, films, or video games. They note that strict censorship often fails in practice, as users find workarounds or migrate to less regulated platforms. Revenue generated from such features could also support further advancements in AI safety and capabilities.
Concerns Over Societal Impact
Critics, including Kunze, raise pointed questions about unintended consequences. Highly engaging, always-available AI companions might influence attachment styles, reshape expectations around human relationships, and exacerbate loneliness and social withdrawal. There are also practical challenges around effective moderation, especially as interactions become more complex and personalized.
Kunze’s stance aligns with a more cautious approach seen in some parts of the industry. In a later opinion piece, she elaborated on experiences with chatbots at Pandorabots, noting that a significant portion of user interactions already attempt to steer conversations toward romantic or sexual territory.
A Wider Debate in AI Development
This episode underscores a recurring tension in generative AI: balancing innovation, user demand, and ethical responsibility. Erotica and adult content have existed throughout human history; advanced AI simply makes creation and interaction more accessible and immersive. The core questions remain whether such tools primarily serve harmless personal exploration or risk subtly altering social norms and individual well-being at scale.
Different companies are charting distinct paths. While OpenAI has moved toward greater permissiveness for adults, others like Pandorabots are drawing firmer lines. As AI capabilities continue to advance, the conversation around appropriate boundaries—especially regarding intimacy, consent, and emotional health—will likely intensify.
The “shock” expressed by Kunze may ultimately reflect not just one company’s policy decision, but the rapid pace at which generative AI is entering deeply personal domains of human experience. How the industry navigates these waters will shape both technological progress and its broader effects on society.