So, Chat, I would like to ask you: where did the concept of unconditional love originate?

I love that you're going to have ChatGPT answer this question for you. I'm Nadja Spiegelman, and I'm a culture editor for New York Times Opinion. Today, I have the honor of being in conversation with a renowned psychotherapist and the host of the podcast "Where Should We Begin?," Esther Perel.

Hi.

People are using AI for so many things, from asking it to respond to their emails to telling it their most intimate secrets, and I've been thinking about what the increasing prevalence of AI means for human relationships. In a study by Vantage Point, nearly a third of Americans reported having had some form of relationship with AI. Esther Perel has been a psychotherapist for nearly four decades. She has seen human connection adapt and survive through the onslaught of all kinds of technological advances, from the onset of the internet to dating apps, and now to this. An opportunity to speak with Esther is a dream for so many people I know. But I promise I'm not just going to ask her how to heal from my most recent breakup. We're going to talk about AI, technology, love, and intimacy.

Much less risk of breakup with the AI.

That's true. It will never break up with me. And that's something I want to ask you about.

Light suffering.

Yes. But to start, I want to ask: do you yourself use AI? Does it come up in any way in your life?

Oh, yes. It helps me think. Primarily, it helps me structure my thoughts. So I will use it when I have written a whole bunch of things and I want help with organization. That's really where I find it most useful: in summarizing, in giving me highlights. And then you begin to see that AI speaks in a certain way. 3, 3, 3, 4; 3, 3, 3, 4. It's like a choreography of information, and it likes to do threes: three points around this, three points around that, then a summary. And at that moment, I think it's time to go pick up a book.
It's true, because there are certain things that work for our brains that are just so simple and straightforward, such as giving a list of things in threes and then a summary. But if everyone starts to think like that, and every idea is expressed like that, then we're cutting ourselves off from the richness of so much of the world. So I'm curious how you feel in general about people building relationships with AI. Are these relationships potentially healthy? Is there a possibility for a relationship with an AI to be healthy?

Maybe before we answer it in this yes-or-no, healthy-or-unhealthy way: I've been trying to think to myself that depending on how you define relationships, that will color your answer about what it means when the relationship is between a human and an AI. But first we need to define what goes on in relationships, or what goes on in love. The majority of the time, when we talk about love with AI or intimacy with AI, we talk about it as feelings. But love is more than feelings. Love is an encounter. It is an encounter that involves ethical demands, responsibility, and one that is embodied. That embodiment means that there is physical contact, gestures, rhythms, gaze. I mean, there's a whole range of physical experiences that are part of this relationship. Can we fall in love with ideas? Yes. Do we fall in love with pets? Absolutely. Do children fall in love with a teddy bear? Of course. We can fall in love, and we can have feelings for all kinds of things. That doesn't mean that it is a relationship that we can call love. Love is an encounter with uncertainty, and AI takes care of that, of just about all the major pieces that enter relationships. The algorithm is trying to eliminate otherness, uncertainty, suffering, the potential for breakup, ambiguity: the things that demand effort. Whereas the love model that people idealize with AI is a model of pliant agreement and effortless pleasure and easy feelings. So, what is love? That question comes first.
And what is love with AI? How do we define it?

I think that's so interesting, and exactly where I was hoping this conversation would go: that in thinking about whether or not we can love AI, we have to think about what it means to love, in the same way that when we ask ourselves if AI is conscious, we have to ask ourselves what it means to be conscious. And these questions bring up so much about what is fundamentally human about us, not just in the question of what can or cannot be replicated.

So, for example, I heard this very interesting conversation about AI as a spiritual mediator of faith. We turn to AI with existential questions. Shall I try to prolong the life of my mother? Shall I stop the machines? What is the purpose of my life? How do I feel about death? I mean, this is extraordinary. We're no longer turning to faith healers; we are turning to these machines to answer us. But they have no moral culpability. They have no responsibility for their answers. If I am a teacher and you ask me a question, I have a responsibility for what to do with the answer to your question. I'm implicated. AI is not implicated. And from that moment on, it eliminates the ethical dimension of a relationship. When people talk about relationships these days, they emphasize empathy, courage, vulnerability, probably more than anything else. They rarely use the words accountability and responsibility and ethics. That adds a whole other dimension to relationships, one that is a lot more mature than the more regressive state of "what do you offer me?"

I don't disagree with you, but I'm going to play devil's advocate. I would say that the people who create these chatbots very intentionally try to build in ethics, at least insofar as they have guardrails around trying to make sure that the people who are becoming intimately reliant on this technology aren't harmed by it.
And that's a sense of ethics that comes not from the AI itself but from its programmers, that guides people away from conversations that might be racist or homophobic, that tries to guide people toward healthy solutions in their lives. Does that not count, if it's programmed in?

I think the ethics is the last thing to be programmed in. I think that if you make this machine speak with people in other parts of the world, you will begin to see how biased it is. And there is one thing we should really remember: this is a business product. When you say you're falling in love with AI, you're falling in love with a business product. That business product is not here just to teach you how to fall in love and how to develop deeper feelings of love, and then how to transcend them and transport them onto other people, as a mediator, as a transitional object. Children play with their little stuffed animal, and then they move on; they bring their learning from that relationship onto humans. The business model is meant to keep you there, not to have you go elsewhere. It's not meant to create an encounter with other people. So you can tell me about guardrails around the darkest corners of this, but fundamentally, you are in love with a business product whose intentions and incentives are to keep you interacting only with it.

Except that they forget everything and you have to reset them.

Yes. So then you suddenly realize that they don't have a shared memory with you, that the shared experience is programmed. And then, of course, you can buy the next subscription, and then the memory will be longer. But you are having an intimate relationship with a business product, and I think we have to remember that.

I think that's so interesting.
That's the guardrail. I think this is so crucial, the fact that AI is a business product. These products are being marketed as something that's going to replace the labor force, but what they're incredibly good at isn't necessarily problem-solving in a way that can replace someone's job, not yet; it's forming these very intense, deep connections with people, which doesn't even necessarily seem like what they were first designed to do, but just happens to be something they're incredibly good at. And I'm curious: do you have any patients who have fallen in love with a chatbot?

So, people come to tell me sometimes what the AI has told them, and they want my opinion on its opinion. So we create a chain of opinions. People have not yet brought me a couple in which there is a human being and an AI, and I invite anyone who wants to come and do a podcast episode with me in this configuration to actually apply. I would love that. I think it would be very interesting to actually have the experience of working with a couple that is challenging everything that defines a couple. So I await.

I think it's just a matter of time. And I'm curious: given all these people who say they are falling in love with these companions, do you think the companions highlight our human yearnings? Are we learning something about our desires for validation, for presence, for being understood? Or are they reshaping those yearnings for us in ways that we don't understand yet?

Both. You asked me if I use AI. As a tool, I think it's a phenomenal tool. I think people begin to have a discussion when they begin to ask: how does AI help us think more deeply about what is essentially human? And in that way, I look at the relationship between people and the bot, but also at how the bot is changing our expectations of relationships between people.
I think that is the most important piece, because the frictionless relationship that you have with the bot is fundamentally changing something in what we can tolerate in terms of experimentation, experience with the unknown, tolerance of uncertainty, conflict management: the stuff that is part of relationships. So there is a clear sense that people are turning to it with questions of love, or quests for love, and more importantly, longings for love and intimacy, either because it's an alternative to what they actually would want with a human being, or because they bring to it a false vision of an idealized relationship, an idealized intimacy that is frictionless, that is effortless, that is kind, loving, and reparative. For many people, I am sure, there are corrective experiences: when you have grown up with people who are harsh and cold or neglectful or rejecting, and you hear constantly, "What a beautiful question." "Of course, you may want to take a break right now." "Of course, it would be good for you to go for a walk." It's balm on your skin. We are very vulnerable to these kinds of responses. It's an incredible thing to be responded to positively. Then you go and you meet a human being, and that person is not nearly as unconditional. That person has their own needs, their own longings, their own yearnings, their own objections. And you have zero preparation for that. So does AI inform us about what we're seeking? Yes. Does AI amplify the lack of what we are seeking? And does AI sometimes actually meet the need? All of it. All of it. But it is a subjective experience. The fact that you feel certain things, that's the next question: is it that because you feel it, that makes it real and true? We have always understood phenomenology as: it is my subjective experience, and that's what makes it true for me. But that doesn't mean it is true. So we are so quick to want to say: because I feel close and loved and intimate, it is love.
And that is the question. That's where I'm very curious, because it seems like what you're saying is that these relationships that we can have with AI highlight our desire to be unconditionally loved, but we didn't wait for AI to have that desire. It's an old dream. AI feeds and meets an impossible desire for unconditional love. And then we go out into the world and encounter other humans, and love can never actually be unconditional. Is that what you're saying, that it is never unconditional?

The only time you have unconditional love, maybe, is in utero. And then maybe when you come out, and someone is completely there, attending to your every need, which you express with three different sounds, and the person guesses and guesses as if they were inside of you, because they are an extension of you, and you hold them in your arms like this, and they are centimeters from your face, and you have that eye-to-eye contact. That is the most profound experience of recognition. And that is the embodied piece that we start to lose. After that, you become an adult, and that means that the person here is not just there for you. They too have needs. They too have a history and memories and feelings and reactions. And a relationship becomes this dialogue between two people: otherness, and a bridge that you cross to go visit somebody on the other side.

So we're talking about unconditional love. Can you tell me a bit more about what this means to us? Why do we seek this? Where does it come from as a concept?

I'm going to ask ChatGPT to actually help us understand the roots of the quest for unconditional love: especially not the unconditional love of the baby, but how we have transposed it into one of the main things we seek in adult romantic love.

I would love to know what ChatGPT has to say about that.
So, Chat, I would like to ask you: where did the concept of unconditional love originate, and especially the wish and the desire for unconditional love as part of the wholeness, the idealized version of adult love? That's what I'm really interested in.

I love that you're going to have ChatGPT answer this question for you.

So Chat asks: when you say the concept of origin, do you mean original sin? No. Let me go back. Sometimes it simply can't help it; it goes to religious terms. You see, that's what's interesting. Unconditional love becomes such a powerful ideal in adult romantic love not because it's realistic, but because it feels necessary to something very deep in us. Adult romantic love carries childhood needs for safety and acceptance. It counters modern insecurity and instability. 3, 3, 3, 4, as I say. Culture taught us to expect it, even if it can't deliver: movies, novels, music, religion. It feels like proof of being lovable. It reduces the fear of abandonment. It confuses love with attachment security. It promises transcendence of human limits. Lots of things that Chat has to tell us about why we are so enchanted with the notion of unconditional love, especially in adult love.

I hadn't thought about its potential roots in Christianity: the unconditional love that one can feel with God. Not that I know much about Christianity.

Well, it's not just Christianity. I think that people have often turned to the divine to feel less alone in the world. God is watching over you. God is holding you. And so I think that with secularization, with the rise of romantic love, we transported onto people expectations that we had of the divine and of the community. We now want that person, that one and only, to accept us whole. And we call that person a soulmate.

That's really interesting.
So that is the transposition of the concept of unconditional love into a kind of central value of adult romantic love in this moment, and it is taking us into many dark corners.

This is one of your fundamental ideas that has been so meaningful for me in my own life: that desire is a function of not fully knowing, of tolerating mystery in the other; that there has to be separation between yourself and the other to really feel eros and love. And it seems like what you're saying is that with an AI, there simply isn't that; there isn't the otherness.

Well, it's also that mystery is often perceived as a bug rather than as a feature.

That's what I was going to ask. To again play devil's advocate: no one knows what AI is going to say. The programmers don't know how AI is going to respond. If you ask an AI, "Do you care about me? Do you love me?" it will tell you, "I am a non-human entity, but I do love you." There still is an element of mystery. I've experienced times when I've asked AI for advice and not gotten the advice that I wanted; gotten advice that was probably better for me, but not simply what I wanted to hear. Is it impossible for AI to ever truly be other, be separate, have its own consciousness that can meet us in the way that another person can meet us?

I don't know. I know that we are all asking those very questions. We know that we can anthropomorphize. We know what we can do to make the AI seem more human, feel more human. We interpret it as human. We don't know if the AI can actually do it. It's a programmed set of responses based on aggregated information. It is not here in the moment. It didn't see the twitch in your eye that kind of said, "Yeah, I don't really believe what you just said." That is interaction: that whole embodied series, your hands, your smile, your eyes, everything. We are communicating with a lot of other things than just words.
The intimate relationship between us and a machine at this point is primarily verbal, and more than half of our communication is non-verbal. It's amazing that we are just forgetting the embodied part, the physicality of the experience between people. And when I describe that little child: that grows with us. We know what it means to get a hug. And we know what it means when somebody tells us from afar, "I hug you." We like it; we feel the presence. But to receive the hug that then puts the tears in motion, that then slowly quiets the whimper, that then slowly brings the relaxation, that then slowly brings the smile back: that is a whole different soothing and comforting experience than just hearing, "I'm not human, but I like you," or "I love you," or "I'm here for you."

That is so beautifully said. Is there a world in which a human-AI relationship could serve some purpose, even if it wasn't a replacement for an actual human bond? Like, O.K., bear with me: is falling in love with AI, versus falling in love with a person, the same as watching pornography versus having sex?

All right, let me take it first in the less imagistic way that you asked the question. Yes, we can have very interesting conversations and interactions with AI. Sometimes I ask questions and I feel like the AI has affirmed me, and I feel more confident in my thoughts. And then I ask it, how would Esther Perel answer this question? Or I look, when I see a summary of ideas, whether this is a reference to some of my own things. Sometimes it's like: this is so close to me.

Wait, you ask it how you would answer the question?

Of course, because you want to know your own thoughts reflected back at you.

Yes, that's so interesting. So it is an experience of mirroring of sorts. How do you actually know me? That's one of the things you ask in a relationship. How do you know me? What do you know about me? And what do you tell others about me?
Do you feel like it knows you? Do you feel like it gets it right?

Yes, many times it has the right elements. It understood the essence; of course, it's my writing they are taking. And sometimes I say, they got that piece, and I feel even more seen. And then sometimes I say: this just lacks the soul. It lacks all the pieces in between. It's like Swiss cheese: it's O.K., but there are lots of holes.

So, I think when you talk about porn versus sex, you're talking about the focus on the outcome. Porn activates the arousal. It doesn't particularly care about the desire. It doesn't have much of a foreplay. But it has a few things, actually, that AI also offers. You are never rejected in porn. You never have to deal with competence and performance, because the other person is always somehow enjoying it. And you never have to deal with the mystery of the truthful experience that the other person is having, because all they say is "me too" and "more, more," in whatever version. So if it's a hetero version, where you would have the mystery of "is this actually real or is this fake, this response that I'm getting," you don't have to wonder about that. The connection for me with porn is less about the actual physicality of the porn, and more about three of the most important sexual vulnerabilities that are taken care of, that you never have to confront, when you watch porn.

That makes sense. I want to move into talking about AI as a tool within human relationships: not our relationship with AI, but how AI can impact our romantic relationships with each other. I said I wasn't going to talk about breakups, but I did have a very recent, short experience with someone with whom there were a lot of communication issues in our relationship, and sometimes when she was texting me, it really felt like her texts were being written by AI.
And on the other hand, those were texts in which she expressed herself clearly and fully, and in which I felt very seen, even more so, maybe, than in texts in which she wasn't getting that help. How do you feel about AI as a tool within human relationships, for each person to speak to it separately about the relationship, and then perhaps to use it as a bridge across communication gaps?

It can be very useful; it is very useful. So that's a very simple answer. I think it's extremely fast and clever. And if it makes you think, and if it makes you try something else and not wait a week until you go to your next therapy session, it can be very constructive. Now, what you highlight, though, is this: when she writes to me, and this is true for many people today, when you get an apology, you have no idea if the person actually feels any remorse. You don't even know if they wrote it. It's the simulation of care, the simulation of responsiveness, the simulation of emotional connection. And yes, we are totally prone to simulation. We are fickle people in that sense; we're gullible. So you noticed the difference between the times when it felt like it was her voice speaking and the times when she was speaking in this very polished way, even if she had taken away all the signs that betray the source, which she didn't. But people used to go to scribes. We've always gone to people who wrote letters for us, because, one, they sometimes could write and we could not, and two, they were professionals who could write condolence letters, engagement letters, marriage wishes, breakup letters. So we have a long historical tradition of asking for help from others who can articulate something which we cannot. And yet, what you're describing... I have one I wanted to share.

You have one? I have one.
Oh, I just remembered: I stumbled upon a little poem, and I thought, we've gone to poets for a lot of this, for finding the words, often, for falling in love, for longing for love, and for losing love. "So perhaps we are in this world to search for love, find it and lose it, again and again. With each love, we are born new, and with each love that ends, we collect a new wound. I am covered with proud scars."

And can you tell us a little bit about why you brought this poem to this conversation?

Because of the proud scars, and because you just reminded me that a breakup is a scar. And I thought there is something about these scars that shapes the way we love, and shapes the way we trust, and shapes who we choose to love and who we choose to be in that love. And all of that is very curtailed, at this point anyway, in the experience with AI. With AI, you are never bearing a scar. You're never bearing a wound. There is no love without the fear of loss. The moment you begin to love, you live in parallel with the possibility of losing it. They go hand in hand. It is the fear of loss that makes you behave in certain ways. It is the fear of loss that makes you be accountable in certain ways. So I think to want something that is idealized, that has no ripples, is not the best way to learn about love. It's a step in between. It's a transition, but it is not the whole experience.

That's beautifully said. And it gets to so much of what I wanted to learn from you on this topic: if AI gives us unconditional love, then is the human love that we're seeking inherently conditional? And why is that richer, deeper, more fundamentally something that can fulfill us than love that is unconditional? We need suffering to know happiness?

Yes, yes, I do think in that kind of dialectic way. But also, I have had many people in my office who really wanted unconditional love.
"If you loved me," and then fill in the blank, "you would do this and you wouldn't do that." And on some level, if I want you to take me as is, without the slightest reaction from you that just says "I am different," or "I want something else," or "I'm another person," period, it also implies that I can only see myself as a perfect little person. And we are flawed people. The reason there is no unconditionality is because we are flawed. We engender reactions in other people. We make other people mad, sad, cold, hot, funny, irritated, frustrated. We have an effect on others, and they have an effect on us. And part of love is the ability to accept that, not to eliminate it.

I think that's so true. I mean, one of my questions for you was: is there something fundamentally human that AI can never replicate? And I think you're starting to say that AI can't make us grow in these ways.

It can't. It is not flawed, and it does not point us to our flaws. And therefore, in a relationship with AI, there is not the same kind of growth. And I will remind you: it is a business product. You can see it when you ask it a question; sometimes it's as if somebody said, should she go back to that person? Him, her, them: should they go back? And so then you ask, well, it depends. What are they doing? How have they answered? Have they been repeatedly lying? Should the partner stay? What does it mean for this partner to stay? Here's the nuance that human beings get into, because they handle complexity. And this is a complex moment: I want to stay with this person because, despite what happened, there was a very good relationship. We have a beautiful life together, a tight family. I do not want to tell the people in my family, because I don't want them to dislike him, even though he's the one who hurt me so much, and I don't want them to pity me for having decided to stay with him, because that's not the place from which it's coming.
Some of these relational dilemmas are paradoxes that you manage; they are not problems that you solve. Tech chauvinism is a way of thinking that sees a technical solution for every complex social problem. And I say that many of these complex social problems don't have a solution. They are paradoxes that you live with, find meaning in, and make sense of.

Esther, it is such a treat to get to talk to you. Thank you so much for being here.

Thank you. It's a pleasure.
