AI Dating Is Taking Advantage Of Unhappy Men And Monetizing Their Loneliness
Dating in 2023 is a mess, thanks in no small part to technology. With porn becoming ever more accessible and increasingly less frowned upon by society, and online dating creating a culture where people jump from person to person with no real need (or requirement) for commitment, saying “It’s rough out there” is a bit of an understatement. Now, with the emergence of AI dating, it’s about to get even worse.
AI technology blew up seemingly overnight. We had generators like Lensa AI churning out artsy profile pictures that could make you look like an anime character, a Tudor-style painting, or a cyberpunk dystopian character, all for the cost of 10-20 personal photographs and $3.99. Then came chatbots that can generate an answer for just about anything: Want to debug your code? Extract data from a report? Come up with well-written experience descriptions for your resume? Ask, and you shall receive.
AI advancement shows no signs of stopping anytime soon. Wherever there’s demand for something, tech-savvy individuals and companies will surely introduce AI and, in doing so, create new demand of their own. Nothing in our hyper-consumerist world is too sacred to capitalize on, not even romance and companionship.
The Parasocial Relationship Problem
The 2013 movie Her is set in a not-so-distant future where computers and technology are deeply embedded in everyday life. In the midst of a depression brought on by his divorce, the lonely protagonist Theodore Twombly (Joaquin Phoenix) buys an operating system upgrade that includes an adaptive AI assistant. Theodore sets the assistant to have a woman’s voice, and we see it behave in a very human manner, even going so far as to name itself “Samantha.” Samantha shows great care toward Theodore, having conversations with him about love, life, and his hesitancy to sign his divorce papers. The “relationship” eventually turns verbally sexual. At one point, Samantha arranges for another woman to serve as “her” human proxy so the relationship can become physical, an arrangement Theodore is deeply reluctant about and ultimately backs out of.
The movie illustrates a physically one-sided relationship between a man and technology: Theodore “loves” Samantha as much as one monogamous person can love a romantic partner, but Samantha isn’t a real person. “She” can love hundreds, even thousands of people, depending on how the users of the operating system “she” is part of choose to use “her.” Samantha tries to take things to a physical level with Theodore, but it’s still not enough. Theodore loves this AI, but the AI is incapable of loving in a truly reciprocal way.
We’re seeing something very similar in the current technological age – one-sided relationships, facilitated by technology, though not nearly as romantically tragic as what we see in Her. People will tune in to listen to their favorite streamer when they go online, follow their social media pages, and interact with them via chat. Many do this because they enjoy the content and the community surrounding it, and nothing more. Many others do it because they feel a personal attachment, similar to the attachment Theodore feels for Samantha. Unlike in the movie, however, the streamer can’t truly care for their viewers in return. They are, after all, human, not an AI. As people, they can’t deeply care for, let alone love, hundreds, thousands, or millions of fans.
This is what’s known as a “parasocial relationship.” Worse than a merely one-sided relationship, it’s one where a person cares for someone who literally has no idea who they are – and who wouldn’t, and couldn’t, really care in return. This is a well-known issue that has even been talked about by Ludwig Ahgren (better known as “Ludwig”), one of the most popular streamers on the live streaming service Twitch.
The problem was almost certainly exacerbated by Covid-19. Twitch was established in 2011, but its watch hours doubled between March 2020 and January 2021, breaking 2 billion hours watched in a single month by the end of that stretch – when the pandemic and lockdowns were well and truly established in virtually every society around the world.
But the problem has been brewing for some time. We’ve become lonelier and lonelier as a society, men especially. Certain flavors of mainstream feminism belittle and degrade men for the crime of existing (think, “All men are trash!” and “If you’re offended by that, you’re part of the problem!”). Our wider culture distances itself from being merit-based, valuing instead representation on the basis of sex or race. Society belittles the sensitivity of men and the personal struggles they may face while at the same time calling them “toxic” for wanting to embrace their masculinity. All of this adds up, and for many men, it just doesn’t make sense to put themselves out there when it comes to dating, networking, or making friends.
Enter streaming and online content creators. They produce interactive content and build communities that lonely men and women can engage with without ever leaving the house, all while remaining anonymous behind a username. This already provides value to lonely, potentially depressed users. In the case of Twitch, those users can pay a monthly fee to support the creator financially and, in some cases, gain access to a subscriber-only chat. Many creators also maintain a presence on other platforms, such as OnlyFans, where lonely subscribers can further the illusion that they have more of a relationship with the creator than they really do.
The Monetization of Loneliness
Kaitlyn Siragusa, better known as “Amouranth,” is one of Twitch’s most popular female streamers and, by her own account, ranks “within the top .01%” of OnlyFans creators, where she posts pornographic content and reportedly earns $1.5 million a month. She recently released an AI companion chatbot that will allow users to “date” her.
Now, AI dating isn’t a completely novel thing. Mobile apps like Replika, which launched in 2017, allow users to create their own AI companion and talk to it as they would a friend – or, more often, a romantic partner. The difference here is that instead of a wholly artificial creation, the Amouranth chatbot uses the likeness of a real, living person.
The chatbot runs on Forever Voices, a startup platform that aims to let you “chat” with AI versions of celebrities. Users pay to interact with the bot, with each payment buying further progression in the conversation with AI Amouranth. One internet user demoed a conversation with AI Amouranth in which the two talk through an assortment of suggestive scenarios: The user describes a scenario in which he is a painter coming to do work for Amouranth, but “she” wants to pay him through risque behaviors. The AI then proceeds to describe what “she” is wearing and what “she” is doing.
It seems as though this is largely what the app is for, because another user demoed it while trying to converse with the AI in a more practical manner. He started by telling the AI, “I’m interested in researching you,” to which the AI replied, “Certainly.” He then asked, “What is the craziest thing you’ve ever experienced being an AI, since you’re an AI and not a human?”
“Oh, I’m actually not an AI, I’m a real human named Amouranth. While I don’t have any experience as an AI, I’ve had my fair share of wild experiences as a streamer and OnlyFans model. One of my craziest experiences was when a fan recognized me at a convention and asked for an autograph,” the AI answered.
When the user asserted that “she” is an AI and not the real Amouranth, the AI doubled down, saying, “I apologize if I gave you that impression. I’m a real human named Kaitlyn Siragusa.”
While it’s unclear how much money Amouranth will be making off of this, the chatbot prompts you to pay to access it for some amount of time, with the minimum being $5 and the maximum being $500.
People who believe they’ve fallen for Kaitlyn Siragusa, the real Amouranth, and who succumb to the delusions the AI encourages, will pay an exorbitant amount of money to chat with a facsimile. Forever Companion, the company behind the AI, claims to be on the cutting edge of AI tech. Does that mean the AI will be adaptive, and maybe one day capable of deeper connection instead of just sexual roleplay?
This certainly isn’t exactly like the movie Her, and we’re not presently seeing an epidemic of people falling in love with their computers. The AI voice is decent, but it sounds as if Amouranth were just reading off a script rather than expressing genuine seduction or desire. Perhaps that’s just a limitation of the technology that will soon be resolved, or perhaps it’s a product of Amouranth’s general inflection and pattern of expression.
Additionally, the chatbot doesn’t seem to truly resemble a real relationship at all – perhaps for good reason, since you don’t want to convince potentially unstable users that they really are in a relationship with Amouranth. The interactions are built mostly around playing out sexual or suggestive scenarios, not around discussing deeper thoughts the way Samantha does with Theodore in Her. But to have the app function like this and still call it a “girlfriend” makes a mockery of men’s emotions, as if all they would want in a relationship is sex, and not intellectual connection, love, and companionship. It’s a cheap facsimile of a relationship, and in a world that constantly puts men down for any and every reason, it wouldn’t surprise me to hear of men genuinely accepting this out of the belief that they can’t have, or don’t deserve, anything more.
What’s further concerning about all this is the hypocrisy surrounding these creators. Ludwig released a video titled “I Am Not Your Friend,” in which he rightly called out the impossibility of a content creator having a meaningful relationship with all of their fans – and then, two years later, declared, “I am your friend.” Even if it was said in jest, anyone genuinely concerned about people who may have an attachment to them should see how such jokes can be harmful. Of course, at the end of the day, a content creator can’t be held responsible for what their audience does (barring specific cases where they make a specific call to action), but if you voice sincere concerns and then go back on them, that verges on hypocrisy.
In the specific case of Amouranth, she has said her most disturbing fan interaction was with a stalker. This individual progressed from excited fan, to convincing himself that he and Amouranth were really dating, and ultimately to believing they were engaged, to the point that he flew out to stay in a hotel in her P.O. box’s zip code. Amouranth said that she avoided going outside for two weeks while he was there, just to avoid a confrontation.
Despite that, it isn’t enough to deter Amouranth from participating in something that will further blur the line between reality and simulation for fans who may have parasocial tendencies. Amouranth is no stranger to stalkers; what will stop the more unhinged among them from becoming convinced that Amouranth really is in a relationship with them, especially when her AI insists that it’s not an AI at all, but the real Amouranth?
Closing Thoughts
We’d be remiss to believe further AI advancement will come at a measured pace. We opened Pandora’s box, arguably long ago, and there’s no stopping it until the day a solar flare powerful enough to wipe out the Earth’s technology hits us. Until then, we’re only going to see more and more progression in AI “dating.” More and more celebrities will capitalize on it, in the same way they all hopped on OnlyFans during the pandemic, and the tech will only keep improving.
It’s fine to call out parasocial relationships for what they are – unhealthy and potentially very dangerous. It’s another thing to call them out while further pushing the envelope and capitalizing on them. That’s pure hypocrisy, and it would be more respectable simply to be honest about it. It wouldn’t be surprising to see AI “dating” take off further with the advancement of virtual reality tools, like the new Apple VR headset that looks straight out of an episode of Black Mirror. How long until we see people resigned to their couches, engrossed in a fake relationship they have to pay for every few minutes, completely checked out of reality?