Content

Dr Kirk interviews Jennetta George about how AI can cure loneliness.

For more information about Jennetta:

https://www.candorai.co/

https://www.instagram.com/__jennetta/

00:00 Thinking about loneliness

04:20 Introducing Jennetta George

07:02 Candor AI & how does AI work?

14:36 The Algorithm & matchmaking

27:16 Her & types of AI

40:52 The future of parasocial relationships

48:21 AI in reparative work

1:00:08 Safeguards & learning how to use AI

Become a member: https://www.youtube.com/channel/UCOUZWV1DRtHtpP2H48S7iiw/join

Become a patron: https://www.patreon.com/PsychologyInSeattle

Email: https://www.psychologyinseattle.com/contact

Website: https://www.psychologyinseattle.com

Merch: https://teespring.com/stores/psychology-in-seattle

Cameo: https://www.cameo.com/kirkhonda

Instagram: https://www.instagram.com/psychologyinseattle/

Facebook Official Page: https://www.facebook.com/PsychologyInSeattle/

TikTok: https://www.tiktok.com/@kirk.honda

November 10, 2023

The Psychology In Seattle Podcast ®

Trigger Warning: This episode may include topics such as assault, trauma, and discrimination. Listeners are encouraged to refrain from listening if necessary and to care for their safety and well-being.

Disclaimer: The content provided is for educational, informational, and entertainment purposes only. Nothing here constitutes personal or professional consultation, therapy, or diagnosis, or creates a counselor-client relationship. Topics discussed may generate differing points of view. If you participate (by being a guest, submitting a question, or commenting), you must do so with the knowledge that we cannot control the reactions or responses of others, who may disagree with you or seem unfair. Your participation on this site is at your own risk, and you accept full responsibility for any liability or harm that may result. Anything you write here may be used for discussion or endorsement of the podcast. Opinions and views expressed by the host and guest hosts are personal views. Although we take precautions and fact-check, statements should not be considered facts, and opinions may change. Opinions posted by participants (such as comments) are not those of the hosts. Readers should not rely on any information found here and should perform due diligence before taking any action. For a more extensive description of factors to consider, please see www.psychologyinseattle.com

Comments

Anonymous

I want this. All of this. 😂

Anonymous

Very late to this comment section, but I had to pause the episode to write this! So many ethical red flags!! When Jennetta mentioned that there was no risk in falling in love with an AI because it's not actually real, I was shocked! She's absolutely brilliant, and I mean no disrespect whatsoever, but I feel like the risk is evident in that very statement: the relationship isn't real! More and more we see people shirking meaningful social interaction with other humans to stay indoors and online, forming complicated parasocial relationships. Not to say that these online relationships aren't real, but there's certainly a difference between watching a Twitch streamer and hanging in their chat versus going outdoors and talking to friends or meeting new people. Is the future everyone with their own personal AI that fulfills their emotional needs so other real humans don't have to?

Developing an AI version of someone's dead relative or past lover to use in therapy for a corrective experience? It seems incredibly high risk, and there are a number of ways something like that could go terribly wrong, especially if trauma is involved. It's mentioned that the client knows it's not their real parent, but do they actually? The danger I see in these relationships is that their authenticity is almost seductive and malignant. I can easily see a world where someone becomes incredibly attached to an AI version of a past lover or dead relative, and suddenly we have a whole new problem of taking that relationship from them again!

AI is incredible technology, but in a world of ever-increasing disconnectedness, I strongly believe it's dangerous. I'm not some tech boomer. I'm 27; I grew up with YouTube and Twitch. I use Discord daily to talk with my friends and play games online. I don't believe AI is going to (or is capable of) entirely solving our problems of disconnection, parenting, education, or whatever else. People are, and always will be, necessary components in that equation in my view. This scares me!