
‘AI-sexual’ isn’t a new identity. It’s a marketing funnel

Whether pure hype or the real thing, AI-lationships are inherently predatory, warns Marie Boran

10 March 2026

A press release that landed in my inbox earlier this week wants me to believe we’re witnessing the birth of a new sexual identity, as if sexuality were an iOS feature quietly shipped overnight: ‘AI-sexual’. It even comes with an impressive percentage: 55% of adults apparently now identify as AI-sexual.

This was based on a survey of 2,500 people run by Joi AI, an ‘AI-lationships’ platform.

So a company that sells AI intimacy products has commissioned a survey that discovers a large market for AI intimacy products. How convenient. That doesn’t mean the numbers are fake. It means they’re marketing until proven otherwise, and the proof would be the boring stuff the press release doesn’t give you: sampling method, recruitment channel, country mix, question wording, and whether respondents were already heavy AI users. The sample is apparently Gen Z and millennials who already “actively use AI tools”, which is a very different population from “adults”, full stop.

 

Now, about Joi AI itself. On the company’s own website, the product is pitched in the language of frictionless companionship: virtual girlfriends, roleplay, and made-to-order characters. When you browse the site you’re met with a stream of persona attributes – names, ages, vibes – that skew heavily toward young, feminine-coded avatars. It doesn’t take a doctoral thesis to spot the commercial centre of gravity here: this is AI intimacy packaged for the straight male gaze, with “safe” doing a lot of work as a euphemism for compliant.

And this is where the AI-sexual framing starts to look less like a discovery and more like a conversion funnel.

The press release leans hard on therapeutic language (stigma-free, safe, identity exploration) and cites an in-house relationship expert who argues that we’re seeing “the expansion of sexual identity itself”. But the most telling statistic is the one that reads like a product requirement: 61% say they feel more comfortable sharing desires with AI than with another person.

Intimacy designed by committee

If you’re tech-savvy, you know what this is, structurally: a system designed to be high-reward and low-risk. No awkward silences. No rejection. No “actually, I’m not into that”. Just an interface that mirrors you, adapts to you, and is trained – explicitly or implicitly – to keep you coming back for more.

Researchers are already documenting the upsides and the pitfalls. A 2025 systematic review on ‘romantic AI’ describes a genuinely double-edged picture: AI companions can offer support and reduce loneliness for some users, while also raising risks around dependency, unrealistic expectations, and displacement of human relationships. The American Psychological Association’s Monitor has similarly warned that heavy use of AI companions can deepen isolation for certain users, especially when the bot becomes a primary emotional outlet rather than a bridge back to real-world connection.

Then there’s the darker edge of always-on intimacy as a service. A widely cited 2022 qualitative study on users of Replika (one of the first AI companions) described patterns of emotional dependence and distress when access or the relationship dynamic changed, which is the kind of attachment instability you don’t get with an actual person because people aren’t controlled by product roadmaps. And a 2025 Nature Communications paper argues these systems are like “emotional fast food” – satisfying in the moment, but potentially damaging when they replace socialising with actual human beings.

This isn’t an exercise in moral panic on my part. From chat rooms (remember IRC?) to dating apps to sexting, people have always used technology to mediate intimacy. The question is what changes when intimacy is engineered: when the so-called partner is a product designed for retention, when the safest relationship is the one that can’t say no, and when the company selling you that experience starts presenting it as an identity category.

So no, I’m not convinced we’ve discovered a new sexuality. I think we’ve discovered a new kind of interface: one that turns desire into a subscription, and calls it empowerment while quietly optimising for the customers most likely to pay for a fantasy that never talks back.
