Do not ask AI for marriage advice


Imagine being married for 13 years, hitting a rough patch for two years, almost getting a divorce, but then reconciling. You think everything is fine. But then all of a sudden, your wife starts bringing up old problems you thought were resolved years ago.

It is happening to more and more couples, and the cause is AI-powered chatbots such as ChatGPT.

“What was happening, unbeknownst to me at the time, was she was dredging up all of these things that we had previously worked on, and putting it into ChatGPT,” one anonymous husband told Futurism.

The husband later found out his wife was using OpenAI’s chatbot to analyze their marriage, holding “long, drawn-out conversations” over text and the chatbot’s Voice Mode feature.

“I could see ChatGPT responses compounding,” the husband continued, “and then [my wife] responding to the things ChatGPT was saying back, and further and further and further spinning.”

“My family is being ripped apart,” the husband said, “and I firmly believe this phenomenon is central to why.”

Now, of course, married couples have been getting divorced for literally thousands of years, long before computers, let alone AI, were invented. Priests, marriage counselors, and even family members have long served as third parties offering advice to troubled couples, often without helping the cause.

But the difference between past marriage counselors, professional or amateur, and AI is that AI is specifically designed to monopolize your attention and behavior in a way that past counselors were not.

Dr. Anna Lembke, professor and medical director of addiction medicine at the Stanford University School of Medicine and bestselling author of Dopamine Nation, told Futurism that the "role of a good therapist is to make people recognize their blind spots — the ways in which they're contributing to the problem, encouraging them to see the other person's perspective, giving them linguistic tools to de-escalate conflicts with partners and to try to find their way through conflict by using language to communicate more effectively."

“But that is not what’s happening with AI, because AI isn’t really designed to be therapeutic,” Lembke continued. “It’s really designed to make people feel better in the short term, which also ultimately promotes continued engagement — which is the real agenda for these companies that are making and creating and profiting from these products. … They’re not optimized for well-being.”


"Not optimized for well-being" may be the understatement of the year.

Whatever your spouse's faults — and nobody's perfect — please share them with a real human being, not an AI chatbot.
