A new chatbot-based texting service is coaching users in sexting. Photo / 123RF
The future is robots, and they're teaching us how to flirt.
Robots flirt more or less how you'd expect: awkwardly, employing clichés, direct questions and the occasional emoji to communicate interest.
Sound like the guy you've been talking to on Bumble? Well, that's a good thing as far as an emerging group of tech entrepreneurs is concerned. "Flirttech," if you will, has recently assumed the form of chatbots — computer programs that serve as proxies for romantic partners — that are designed to help woeful daters sext, ghost and develop vocabulary around consent.
"People think sex and dating is supposed to be easy and innate," said Brianna Rader, the founder and chief executive of Juicebox, a sex-education app. "But it's not. It's absolutely a life skill just like all other life skills, but unfortunately we're never formally taught these things."
Hence the need for Slutbot. The chatbot-based texting service, offered through the Juicebox app, is meant to coach users 18 and up in sexting. After confirming that a user is of age, Slutbot designates a safe word. Then the user and the bot begin a "flow," or conversation, which can be "Slow & Gentle" or "Hot & Sexy." There are options within those two categories for sexual orientation and other specific interests.
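For the technically curious, the flow Rader describes maps naturally onto a small state machine: confirm age, set a safe word, then converse. The Python sketch below is purely illustrative; the class, method names and message text are assumptions, not Juicebox's actual implementation.

```python
# Illustrative sketch of a Slutbot-style conversation flow as a small
# state machine. All names here are hypothetical, not Juicebox's code.

class SextingCoachBot:
    FLOWS = ("Slow & Gentle", "Hot & Sexy")

    def __init__(self):
        self.age_confirmed = False
        self.safe_word = None

    def confirm_age(self, is_18_or_older):
        if not is_18_or_older:
            return "Sorry, this service is for adults 18 and up."
        self.age_confirmed = True
        return "Great. Pick a safe word; say it any time and we stop."

    def set_safe_word(self, word):
        self.safe_word = word.lower()
        return f"Choose a flow: {' or '.join(self.FLOWS)}?"

    def reply(self, message):
        # The safe word halts the conversation immediately.
        if self.safe_word and self.safe_word in message.lower():
            return "Okay, stopping here. You're in control."
        return "It sounds like you are looking for some dirty talk. 😉"

bot = SextingCoachBot()
print(bot.confirm_age(True))
print(bot.set_safe_word("pineapple"))
print(bot.reply("hey there"))
print(bot.reply("pineapple"))
```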
To break the ice, Slutbot sends a winky-face emoji and a stilted come-on: "It sounds like you are looking for some dirty talk."
During my own "flows" with Slutbot, I was told that I had "such lovely lips"; that it was "so ready" when we kissed; and that my tongue drove it "wild." Some of the banter is unprintable here, but none of it felt vulgar. The bot was also very conscientious about the relationship between pleasure and consent, asking frank questions such as, "Did you like turning me on?"
"We feel like Slutbot is kind of a safe space," Rader said, noting that you can't embarrass or offend a bot, even with the most forthright expression of desire.
Other apps are less explicitly about sex and romance, but can still be used to cultivate communication in those arenas. Mei, for example, is marketed as a way to improve a user's texting relationship with anyone.
The app monitors and logs every text message and the time of every phone call (but only on Android, the sole platform where it's available as of now). It then uses that information to build a database for analysing inflections in mood and language. The app makes inferences about the personalities of users — and, somewhat alarmingly, of all their friends and contacts, too. (The company said it does not ask for or retain any identifying information, and that it is compliant with EU privacy laws.)
Based on what the app can glean about the user, it acts as a kind of AI assistant, offering in-the-moment advice on texts: "You are more adventurous than this person, respect their cautiousness," for example.
"Machines and computers are great at counting things," said Mei's founder, Es Lee, who previously ran another chatbot-based dating advice service called Crushh. "Why not use the technology that's available to help with something like this?"
The counting Lee is referring to is more of a pattern analysis. He said Mei's algorithm scores each participant on personality traits like "openness" and "artistic interest," then offers a comparison — a "similarity score" — of the two parties who are communicating. It then issues little statements ("You are more emotionally attuned than this contact, don't feel bad if they don't open up") and questions ("It seems like you're more easily stressed than calm under pressure, right?") that pop up at the top of the screen.
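Lee's description suggests each person gets a vector of trait scores, and the "similarity score" compares two such vectors. The Python sketch below uses cosine similarity over made-up trait values; Mei has not published its method, so every name and formula here is an assumption.

```python
# Hypothetical reconstruction of a trait-based "similarity score."
# The trait names, values and the choice of cosine similarity are
# illustrative assumptions, not Mei's published algorithm.
import math

def similarity_score(a, b):
    """Cosine similarity between two trait vectors, scaled to 0-100."""
    traits = sorted(set(a) & set(b))
    dot = sum(a[t] * b[t] for t in traits)
    norm_a = math.sqrt(sum(a[t] ** 2 for t in traits))
    norm_b = math.sqrt(sum(b[t] ** 2 for t in traits))
    return 0.0 if not norm_a or not norm_b else 100 * dot / (norm_a * norm_b)

user = {"openness": 0.8, "artistic_interest": 0.6, "calm_under_pressure": 0.3}
contact = {"openness": 0.5, "artistic_interest": 0.7, "calm_under_pressure": 0.9}
print(f"Similarity score: {similarity_score(user, contact):.0f}")

# A large gap on a single trait could then trigger a prompt like
# "You are more emotionally attuned than this contact."
```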
In theory, Mei could give users insight into questions that plague modern dating: Why isn't my partner texting back? What does this emoji mean? In practice, the potential ways for it to backfire seem limitless. But the idea, Lee said, is to prompt users to think about nuance in their digital communication.
Ghostbot, another app, eschews communication entirely. Instead, it is used to ghost, or quietly dump, aggressive dates on a user's behalf. It is a collaboration between Burner, a temporary phone number app, and Voxable, a company that develops conversational AI. The app is meant to give people greater control, said Greg Cohn, a co-founder and the chief executive of Burner, by letting them opt out of abusive or inappropriate interactions.
"I think that sometimes people don't quite realize the emotional burden that can come with dealing with all that," said Lauren Golembiewski, Voxable's CEO.
The way it works is simple: once a user sets a contact to "ghost," the app automatically responds to that person's texts with curt messages like, "sorry, I'm swamped with work and am socially MIA." The user never has to see their communication again.
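In effect, that's an auto-responder keyed to a block list. Here is a minimal Python sketch under that assumption; the on_incoming_text() hook and the second canned reply are hypothetical, since Burner's real integration is not public.

```python
# Sketch of a Ghostbot-style auto-responder. The hook name and the second
# canned reply are hypothetical; only the first message is quoted from
# the article.
import random

GHOSTED = set()  # contacts the user has set to "ghost"
CANNED_REPLIES = [
    "sorry, I'm swamped with work and am socially MIA.",
    "can't talk right now, things are hectic.",
]

def on_incoming_text(sender, body):
    """Return an automatic reply for ghosted contacts, else None."""
    if sender in GHOSTED:
        # The user never sees the incoming message; the bot answers instead.
        return random.choice(CANNED_REPLIES)
    return None

GHOSTED.add("+15550100")
print(on_incoming_text("+15550100", "hey, you around?"))
```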
Of course, the problem with all of this software, and any digital dating hack, is still the problem with humans. Communication, in dating and otherwise, is subjective. Whether something is offensive, sexy or misleading can be a matter of opinion. And apps that run on AI are certain to reflect some of the perspectives and biases of the programmers who create them.
How are robot dating apps supposed to account for that?
Lee framed this as part of AI's larger project. "The very purpose of building AI is to understand the biases of people," the Mei founder said, adding that it is the responsibility of those creating these algorithms to ensure they are applied in a manner consistent with that goal.
Rader, of Slutbot, acknowledged the possibility of violent or unwelcome language slipping into an algorithm. But, she said, "As a queer woman collaborating with sex educators and erotic fiction writers, we were the right people to think through these concerns."
Regardless, digital dating tools aren't going anywhere. There are many sex-education apps on the market, and the established dating apps are getting in on the action, too. EHarmony is developing an AI feature that nudges users to connect in person after they exchange a certain number of messages on the app. Bumble is planning a feature, set to launch in June, that uses AI to block lewd images. And individual users of dating apps have long been known to create their own chatbots and hacks to swipe through users and spam matches with AI-generated messages. Which is to say, dating is well past the point of disruption.
Written by: Rainesford Stauffer
Photographs by: Amrita Marino