Valentine’s Day is just around the corner. Love is in the air … and on the internet, where many singles will turn looking to score a date. About 30% of U.S. adults — including 53% of people under 30 — have used a dating site or app, according to 2022 Pew Research Center data. In the same survey, 40% of users said online dating has made the search for a long-term partner easier.
Dating apps make no secret of their use of artificial intelligence (AI) to help users find their perfect match, although just how the algorithms work is less clear. Many of the most popular dating apps — including Tinder, Bumble, eHarmony and OKCupid — use the data you provide and your interactions within the apps to curate lists of potential matches, making the sea of fish a little bit smaller and more manageable, said Liberty Vittert, a professor of practice of data science at Olin Business School at Washington University in St. Louis.
But recent reports of online dating app users employing AI to strike up conversations and flirt with matches — or worse, scam them — have some saying AI has gone too far.
Romance ‘beyond reach’ for robots
Plenty of would-be suitors — fictional and real — have sought help to woo their love interests. Who can forget “Cyrano de Bergerac,” the 19th-century play that tells the story of a man who helps his inarticulate rival win Roxanne’s heart by feeding him love poems and letters? But human romance is beyond reach, currently, for robots, Vittert said.
“Robots don’t have human emotions. We are actually a long way away from what we see in movies,” Vittert said. “They don’t work very well outside what they are programmed to do. For example, they can beat a grand master at chess, but if you then ask them to play checkers instead, they can’t necessarily make that decision.”
And because the technology is so new, no one is regulating or stopping it.
“The scariest part is that we have no idea what the implications are going to be, but we do know that when the use of AI has been rushed, that there are dire consequences,” Vittert said.
For example, police have relied on AI facial recognition to decide whom to arrest, even though the algorithms do a poor job of identifying people of color. That “has resulted in completely innocent individuals being jailed for up to a week,” Vittert said. “Or Amazon hiring based on resumes that had the keywords ‘fraternity, male, lacrosse.’ We have already seen serious, unforeseen consequences.”
How to spot a bot
As with any dating situation — online or in person — it’s important to use caution. Avoid sharing personal information and do not respond to requests for financial help. Most importantly, listen to your gut. If something doesn’t feel right, there’s a good chance it’s not.
“Warning signs that you might be chatting with a bot versus a real person are going to be hard to tell as the AI gets better and better, but if you think it seems a little off, a little weird, not quite getting the tone — that is where you can tell,” Vittert said.
“AI can’t yet understand humor or tone, so if the responses to your humor or tone don’t seem to jibe, then it’s possible you are talking to a bot.”