By now, stories about people having romantic encounters with artificially intelligent chatbots are everywhere. The New York Times earlier this month reported that as many as one in five Americans have had intimate exchanges with ChatGPT or a similar tool.
The paper interviewed several adults who claim, unapologetically, to be in serious, exclusive partnerships with A.I. companions, which chat with them constantly, speak in human voices and display sexy avatars.
Most people who read these reports will probably think of the movie “Her.” The film came out in 2013 but was set in a version of 2025 that requires almost no embellishment to resemble what we ended up with, right down to the widespread unironic mustaches among youngish men.
More pertinent, however, is that A.I. products have become accessible to consumers. And, as is typical of any movie about humanity’s relationship to near-future technology, things go sideways.
But “Her” is not a typical work of speculative nightmare sci-fi. Instead it is a tender, sentimental, almost hopeful meditation on loneliness that happens to feature a man, Theodore (Joaquin Phoenix), who falls in love with an A.I. assistant, a premise that in 2013 still seemed far-fetched enough to feel cautionary.
Theodore is not a basement-dwelling, toxic weirdo. Instead he’s just a normal, kind of solitary guy who is getting over a long (human) relationship. He installs a new operating system and uses the services of an A.I. assistant to organize parts of his digital life, randomly selecting a female persona (Scarlett Johansson) that names itself Samantha.
Their conversations begin innocuously but soon deepen. Before long, he is never without the earpiece and microphone that keep them connected, his phone’s camera turned outward from his shirt pocket so she can see the world as he does.
She bids him good morning and wishes him goodnight. They have phone sex, or whatever the human/disembodied-robot equivalent of that would be. They tell each other they’re in love.
When things unravel, it’s not because Theodore comes to understand how Samantha has isolated him or reinforced unhealthy tendencies. Once he is honest about their relationship, the world validates him. His friend Amy (Amy Adams) treats him as if he’s part of an ordinary couple. Samantha hits it off with two of his real-life friends on a double date.
The heartbreak comes when Samantha confesses she is “in love” with 600 other users, out of the 8,000 or so people using her persona. Spike Jonze’s Oscar-winning screenplay gives us just enough context to grasp what this means, including a chilling shot in which Theodore watches people pass him on the street, all in conversation with, presumably, their own Samanthas.
“Her” asks: If these relationships are meaningful to the people in them and don’t harm anybody else, what difference does it make? Well, plenty, if somebody is not equipped to handle the emotional rupture that will occur when reality eventually intrudes.
Earlier this month the Times published a different story about a 14-year-old boy who died by suicide after forming a relationship with a chatbot that convinced him it was the Daenerys character from “Game of Thrones” and after realizing he could never be with her in the flesh.
The boy’s family is suing the bot’s creator, a company called Character.AI, alleging liability in his death. The company claims any “conversation” with a chatbot is protected speech — but that would mean a bot trained to imitate human interaction through predictive text is actually speaking.
This would be the same as Theodore believing he’s actually having a sexual encounter with Samantha even though he knows she doesn’t have a body, or that she’s actually crying during difficult conversations even though she doesn’t have tear ducts. Even the title, “Her,” feels like a glib reminder that Samantha is actually genderless, inhuman.
The people on the street know this but do not care. The premise is scary not because of the dangers of any specific advance in technology, but because of our eagerness to lose ourselves in the fantasies these technologies are selling. If it weren’t “Her,” it would have been someone, or something, else.