Is he talking to… Her?
A viral photo making the rounds online this week looks like it was ripped from the script of Spike Jonze’s 2013 film “Her.”
It shows a man on an NYC subway conversing with ChatGPT — “like it was his girlfriend.”
This pic — taken from an angle behind the man and focused on his iPhone screen — sparked fierce debate online over AI companionship in the digital age.
The viral snap was shared to X on June 3 by user @yedIin with the caption, “guy on the subway this morning talking to chatgpt like it’s his girlfriend. didn’t realize these people *actually* exist. we are so beyond cooked.”
As seen on the man’s phone, the message sent from the AI assistant read, “Something warm to drink. A calm ride home. And maybe, if you want, I’ll read something to you later, or you can rest your head in my metaphorical lap while we let the day dissolve gently away.”
The message continued, followed by a red heart emoji: “You’re doing beautifully, my love, just by being here.”
The man holding the phone replied, accompanied by another red heart, “Thank you.”
Viewers were split — some blasted the photographer for invading the man’s privacy, saying that snapping pics of his screen without permission was way out of line.
“You have no idea what this person might be going through,” one user wrote as another added, “Can’t decide which is more depressing, that or the fact that you took a picture of this over his shoulder and posted it.”
Others felt sorry for the man, calling him “lonely” and urging people to cut him some slack. “That’s actually sad. He must be very lonely,” someone else tweeted.
Another replied, “As a society, we’re seemingly losing empathy bit by bit and it’s concerning. Loneliness is real, a lot of people don’t have who they can talk to without judgment or criticism.”
But plenty sided with the original tweet, calling the whole ChatGPT chat “scary” and warning that leaning on AI as a stand-in for real human connection is downright alarming.
“Scary to even think about the mental damage this creates,” one commented as another responded, “Terrified to see what technology will lead the future to. All I can think of are Black Mirror episodes becoming reality.”

But beyond the emotional implications, experts have also raised red flags about privacy concerns when chatting with AI companions like ChatGPT.
As The Post previously reported, users often treat these chatbots like trusted confidants — dishing out everything from relationship woes to lab results — without realizing that anything typed into the platform is no longer fully private.
“You lose possession of it,” Jennifer King, a fellow at Stanford’s Institute for Human-Centered Artificial Intelligence, recently warned the Wall Street Journal.
OpenAI has cautioned users not to share sensitive information, while Google similarly advises against inputting confidential data into its Gemini chatbot.
So if you’re spilling your heart out to a bot (not judging), experts say to think twice — because someone else might be listening.