Exploring concerns around users building emotional dependence on AI chatbots

MICHEL MARTIN, HOST:

Artificial intelligence has become a part of our everyday lives, controlling our smart homes, if we have one, conjuring up whatever we can think of on screen, and even finishing our...

A MARTÍNEZ, HOST:

...Sentences.

MARTIN: Oh, thank you, A.

MARTÍNEZ: Oh, sure. I have no I to add to the A in my name. Now, AI today is not just widespread, it's becoming increasingly humanlike. So human that you might even start to develop feelings for your coded friend.

LIESEL SHARABI: I could see a future where, you know, people have relationships with AI, and it's similar to a relationship with a friend or family member, a romantic partner.

MARTIN: That is Liesel Sharabi, an associate professor at the Hugh Downs School of Human Communication at Arizona State University. She says people do form relationships with AI all the time.

MARTÍNEZ: Now, that could be a good thing for helping you get your work done or even helping people overcome social anxieties. But what if relationships with AI start to replace human connections?

MARTIN: And that sounds like the plot of the 2013 film "Her," starring Joaquin Phoenix and the voice of Scarlett Johansson.

(SOUNDBITE OF FILM, "HER")

JOAQUIN PHOENIX: (As Theodore) The woman that I've been seeing, Samantha - she's an operating system.

AMY ADAMS: (As Amy) You're dating an OS? What is that like?

MARTIN: But in reality, some users are already falling in love with AI chatbots.

SHARABI: I mean, there are people who feel like they have really deep intimate relationships with it.

MARTÍNEZ: Sharabi understands the appeal. After all, AI does not come with baggage, and it says all the right things.

SHARABI: People can sometimes be disappointing. Relationships are really complicated. And AI essentially cleans them up. And in that way, I think it sometimes creates sort of unrealistic expectations of what a relationship should look like.

MARTIN: OK, and here's something to think about. Any relationship with AI, whether romantic, platonic, or otherwise, does have a third party involved. Sharabi points out that AI is controlled by a business, so the terms of that relationship can change or end at any moment.

SHARABI: If you feel like you are emotionally attached to something like ChatGPT, OpenAI controls that relationship. It's not like a relationship with another person, so I think there's also some potential concern there.

MARTÍNEZ: That means the memories you create with AI could just vanish, just like Roy Batty says in "Blade Runner," and all those moments will be lost in time.

MARTIN: Like tears in the rain.

(SOUNDBITE OF VANGELIS' "LOVE THEME (FROM 'BLADE RUNNER')")

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.

[Copyright 2024 NPR]