Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:13):
Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope.
I'm Oz Woloshyn and today Karah Preiss and I will
bring you the headlines this week, including the future of
brain implants. Then on Tech Support, we'll talk to
404 Media's Sam Cole about a chatbot program to
soothe your heartache.
Speaker 2 (00:34):
So I don't know. I mean, everyone says this, but
it's so Black Mirror to me.
Speaker 1 (00:38):
All of that on the Week in Tech. It's Friday,
May twenty third.
Speaker 3 (00:42):
All right, so I want to tell you about
something that happened this weekend when I was hanging around
West Chelsea in Manhattan.
Speaker 1 (00:55):
I'm all ears, as I always am.
Speaker 3 (00:57):
So you know about Muji, of.
Speaker 1 (00:59):
Course, Aboutmuji Muji, the Japanese store. There was one about
five minutes walk from my house where I grew up
in London, and there were all kinds of fancy pens
in there, which I wanted but never got except for Christmas.
Speaker 3 (01:13):
The child who covets a pen will grow up to
have a podcast. But yeah, you know, Muji Muji is
a great Japanese retailer. There's a New York City location
in West Chelsea, and I went in there just because
you know, I always want to look at the latest
in unstructured garments and new bamboo toothbrushes. And I walk
(01:34):
into the cafe area which they have, and there are
two lines, and I was a little bit confused, like
why are there two lines? And I noticed one of
the lines is long, right, which is the line for
like the regular coffee and pastries, And one of the
lines is extremely short. And then I am hearing the
(01:54):
motion of a robot arm named Jarvis that's making matcha.
And unsurprisingly, the line for the robot matcha latte is
short because you need to be patient if you're going
to receive a robot matcha.
Speaker 1 (02:09):
Unlike a regular barista.
Speaker 3 (02:11):
Yes, I just want to explain this robot arm is
not a Keurig. Okay, this is like if a human
arm was disembodied, turned into a cyborg, given one of
those steel cups that we notice baristas use to pour
stuff and to make matcha with, and is just on
(02:32):
its own, powered by computing power and serving matcha to
I guess incredibly enthusiastic tourists who like can't believe their
Uncanny Valley eyes that there is a robot that is
delivering matcha to them, and not only delivering matcha to them,
(02:53):
referring to them by their first names, because you have
to put in your first name. So by the end
of the matcha making experience, they're like, your matcha is ready, Elise,
and Elise is like, Elise can barely pick up her
matcha because she's like making a TikTok of this robot
who's giving her the matcha. You know what I mean.
I'm like, don't spill Jarvis's matcha.
Speaker 1 (03:11):
So this is for the tourists in New York City.
It's kind of interesting because I guess, like this is
not a robot that's there to more efficiently make a
matcha latte. This is a robot to give Muji customers
a sense of brand experience that they could be in
Japan, but they're not.
Speaker 3 (03:25):
Yeah, precisely. And also I mean maybe a glimpse
into the future where lines are going to be quite
long unless and no offense to this robot, but the
most striking observation that I had in this whole thing,
and I know you and I both talk about this
a lot.
Speaker 1 (03:40):
But you can tip, you can tip, or you have
to tip, you can.
Speaker 3 (03:47):
Oh, that's always the question. You can always tip, you
don't have to tip.
Speaker 1 (03:53):
But how does Jarvis let you know that you not
only can, but really should.
Speaker 3 (03:57):
Because when you pay, you use a computer, just like you use
a tablet now in every setting. There's that pivotal moment
in every transaction that we have now in stores where
the iPad is flipped around and you're like, am I
a dick or am I not a dick?
Speaker 1 (04:13):
You're presumably less of a dick if you refuse
to tip the robot than the barista.
Speaker 3 (04:17):
It depends what Jarvis is doing on his days off.
I'm not sure. I'm not sure.
Speaker 1 (04:22):
I do have a question about whether or not this
is going to become a ubiquitous part of food service
in the US, or whether this is going to remain
a gimmick in the Muji shop. I had a conversation
with my wife the other day and she said, well,
you make a tech podcast. How long is it going
to be until there's a robot that goes down to
the grocery store, buys our food for breakfast and cooks
it for me? And I said, you know what, that's
(04:43):
a very very good question, which I think even the
great futurists would struggle to give you a good answer to.
Speaker 3 (04:49):
We'll be looking out for that here on the podcast Tech Stuff.
Speaker 1 (04:52):
We will keep our eyes wide open. So I think
today we each have a new story that we want
to share, and then after that we're going to go
onto the headlines.
Speaker 3 (05:02):
Yeah, and I think it's safe to say that the
overarching theme for this episode is the consumer tech that
mediates our world and how system updates can change our world.
Speaker 1 (05:12):
System updates can sound boring, but they can also really
change the way we live, and this is exactly that
type of story. So we've seen a lot in the
last few weeks about different lawsuits that are brewing against
big tech companies. Meta and Alphabet are both in litigation
with the Department of Justice over monopolistic practices, and there
may be breakups of their products like Chrome and Instagram
(05:34):
and others on the horizon. But today's story from me
is actually about a recent court ruling against Apple and
how other tech companies are responding to this ruling. Since
you're a big reader, Kara, I thought you'd be interested
in a story that gets to this. In the Verge
this week, with the headline Spotify's iPhone app will now
let you easily buy audiobooks.
Speaker 3 (05:56):
Just for reference, for people who don't know, I run a
book club called Belletrist, so I tend to keep my
eyes open about what's going on in the publishing world.
And I listened to audiobooks on Spotify, yes, but they
were a little bit confusing to access. If you wanted
to buy an audiobook, you had to go to Spotify's website,
which would then download the book onto your app like
(06:17):
you couldn't buy it directly in app, which was annoying
to me.
Speaker 1 (06:21):
Right, and this was not a friction free purchasing experience.
And unsurprisingly, there was a reason that friction existed,
and that reason was Apple, because until very recently, in fact,
until this court ruling, they had almost exclusive power over
how developers could monetize apps and subscriptions on iOS. So
(06:43):
if companies did offer in-app purchases, in most cases, Apple
took a thirty percent commission right off the top.
Speaker 3 (06:49):
Oh, I see, so it makes sense that Spotify would
be like, get out of where Apple can charge this
thirty percent fee, Go to our website, get the audiobook,
and then come back to Spotify.
Speaker 1 (06:59):
Exactly, except it's a bit more complicated than that, because
if there was a button in the Spotify app that
pushed you to the website, Apple would still charge twenty
seven percent. I don't know where the other three percent goes,
a kind of commission for an indirect referral sale fee. And so
that's why the buy-through to audiobooks was so hard on Spotify.
(07:22):
But as of very very recently, like this
week recently, that's all changing, and you can see it
when you open your Spotify app.
Speaker 3 (07:30):
And I'm looking at it now and there's a very
obvious green button that says buy on it. I never
noticed there wasn't a price tag.
Speaker 1 (07:38):
Right, And what you're seeing in your Spotify app on
your iPhone is actually the consequence of a lawsuit between
Apple and another tech company called Epic Games that made Fortnite,
and it's a lawsuit which has been going on for years,
and related to that lawsuit, just last month, a judge
ruled that Apple could no longer impose commissions or fees
(07:59):
on purchases made outside an app. They could no longer
restrict the style, the formatting, or the placement of links
for purchases outside an app, or limit the use of
buttons or other obvious calls to action. In other words,
the judge basically said, let people do commerce without taxing
them on their own phones.
Speaker 3 (08:20):
I love that it took Fortnite making a stink about this,
like nobody was like, we need audiobooks.
Speaker 1 (08:27):
You're right. And the Spotify story is like an unintended
consequence of the Epic Games story. So this ruling against
Apple was recent. Apple are appealing, but other tech companies
are already adjusting their iOS apps. So for those who,
like you, use the Kindle app, you'll now be able
to purchase books with far fewer taps than you previously
needed to make. And there is a clearly marked get
(08:49):
book button.
Speaker 3 (08:50):
That's actually very exciting. I've always been annoyed by that,
Like it's always been a thorn in my side. It's
just like it is not friction free.
Speaker 1 (08:57):
For you, and annoying for Amazon.
Speaker 3 (08:59):
Amazon's like it's annoying for us too.
Speaker 1 (09:01):
But you know, this is in some sense like a
you know what looks like a small UX story, but
actually I think it's a really really big story because
I think we could look back on this moment, this
very week as a seismic shift in Apple's ability to
control the developer ecosystem, which in turn controls what shows
up on your iPhone and how you interact with it.
(09:23):
And given we spend most of our waking hours interacting
with stuff on our iPhone, this loosening of the gates,
I think, could be a very big and
interesting moment in the future of how we interact with tech.
Speaker 3 (09:34):
You know, your story about Apple actually leads me to
my story about Apple further, pun intended, implanting itself into
our minds. The Wall Street Journal reports that Apple will
allow brain implants to control their devices.
Speaker 1 (09:48):
You had me right there, because for me, the
idea of controlling computers and machines with just our brains,
it feels like science fiction. It's one of the most
fascinating and exciting possibilities in all of tech, and also one
of the most terrifying. But the fact that it may
be leaving the lab and entering the App Store, just wow.
Speaker 3 (10:10):
I personally think it's amazing. I think it's important to
be clear that this is not just something you know.
You're not going to like, pick up a new pair
of Apple Vision goggles and all of a sudden control
stuff with your mind. You need a surgical implant. But
these devices could be incredibly consequential to people who have
difficulty communicating and have limited to no hand use, and
(10:33):
according to Morgan Stanley, that's actually around one hundred and
fifty thousand people in the US. That includes people living
with ALS, stroke, or spinal cord injury. And just to
kind of get under the hood of how this is working,
Apple is working with a startup called Synchron, and Synchron
has developed a device called the Stentrode, which is implanted right
(10:54):
on the top of the brain's motor cortex.
Speaker 1 (10:57):
Motor cortex, so it's kind of the surface part of
the brain that controls movement.
Speaker 3 (11:01):
That's exactly right. And the Stentrode has electrodes that
read brain signals, and the idea is that those signals
could be translated into selecting icons on a screen. So
just imagine instead of using your hands, you're using your
brain to operate an Apple device.
Speaker 1 (11:22):
And I have to ask you, obviously, Neuralink is
kind of the dominant player at this very early
stage in the game, or at least the one that
gets the most airtime. Maybe that's because Elon Musk
is such a good marketer. But you know, we've all
seen videos of people using their brain waves to control
a world building computer game like Civilization. What's the difference
(11:43):
between what Apple is doing and what Elon and Neuralink
are doing.
Speaker 3 (11:47):
It's a good question, and I think the key is
that Neuralink is a much more invasive type of brain implant.
The procedure involves literally making a hole in the recipient's
skull to insert the device, something you don't have to
do to implant the Stentrode, which is inserted via the
jugular vein. But it's not Botox, I'll tell you that.
(12:12):
But it's much easier. In terms of the difference between
Neuralink and Stentrode, Neuralink also has one thousand electrodes picking
up neural activity compared with Stentrode's sixteen. So, just for
a little bit of context and background, Synchron has implanted
their device in ten people since twenty nineteen. But the
(12:35):
crucial difference here is that Apple is working on this
product in order to create a new technology standard for
the way brain waves interact specifically with their products, which
they will then release to other developers. So this isn't
really about creating a better Neuralink. It's about creating a
standard pathway for implants like Neuralink, Stentrode, and others to
(12:58):
seamlessly control Apple devices.
Speaker 1 (13:00):
Because this really is the interface layer that Apple are
trying to create. And I guess to me, what it
says is that adoption of this type of brain reading
technology could actually start to happen outside of the lab.
Because if you create an open system where there's a
standard through which I can communicate with my iPhone using
(13:21):
brain waves, that's much more likely to take off than
having to create a kind of parallel tech system where I
can play Civilization through a Neuralink. In other words,
this plugs me in in the most seamless and efficient
way to the tech everybody else is already using, and
that could be revolutionary.
Speaker 3 (13:38):
Yeah, totally.
Speaker 1 (13:39):
But do we know and you mentioned ten people have
been implanted so far, do we know how it's going
for them?
Speaker 3 (13:43):
They say it can be slow and that it can't
be used to mimic moving a cursor like you would
a mouse. But at the same time, the Wall Street
Journal article describes how one man, Mark Jackson, who has als,
was able to connect his synchron implant to an Apple
VR headset and walk around a VR version of the
Swiss Alps, and when he peered over the ledge in
(14:06):
this VR headset, he actually felt his legs shake.
Speaker 1 (14:08):
Well. I can't imagine what a moving experience that must
have been for him.
Speaker 3 (14:12):
Yeah. I think one of the things to think about,
of course, is with your story, there is like a
very clear human centered cause and effect of this ruling
that you've talked about, something that will affect my personal
life of how I listen to audiobooks and how I
read digitally on my phone or on my iPad. I
(14:34):
think this story that I'm telling you is obviously a
little bit more in the future and may impact those
who need it most before it's something that we can
just kind of play around with. But I think both
are big stories about how the App Store is destiny.
Speaker 1 (14:54):
We've got a few more quick headlines for you. First,
an update. The Guardian reports that 23andMe, which
declared bankruptcy in March, has a buyer. Regeneron Pharmaceuticals agreed
to buy the genetic testing company for two hundred and
fifty six million dollars through a bankruptcy auction. Once again,
this leaves over fifteen million people wondering what will happen
(15:18):
to their genetic data, and why a pharmaceutical company might
have wanted to buy it. Now, Regeneron Pharmaceuticals has said
that they are quote committed to protecting the 23andMe
data set with our high standards of data privacy,
security and ethical oversight. So we'll see. I'll see. Well,
you'll see, but I never signed up.
Speaker 3 (15:39):
I know I did, I'll see. In another headline, if
you get your Netflix with a side of ads, there
are changes ahead. According to the publication Interesting Engineering, Netflix
will start running AI powered ads sometime next year, meaning
ads will get more personalized and things like tone and
messaging will change based on viewer behavior.
Speaker 1 (16:00):
This is basically an ad which is made for you
in the moment, contextually.
Speaker 3 (16:05):
Based on what you're watching. So if I'm watching something
that's particularly sad, if I'm watching something that's happy, if
I'm watching something that is you know, making me cry,
ads will adapt to this, which I don't love, personally.
This is a version of Neuralink to me. It's connecting content
(16:27):
with what's happening inside my emotional brain.
Speaker 1 (16:30):
Finally, a reporter from Wired was able to replicate the
gun that Luigi Mangione allegedly used to kill United Healthcare
CEO Brian Thompson. As part of an experiment to see
how advanced three D printing technology and designs for ghost guns
have already gotten, journalist Andy Greenberg wrote a story under the
(16:50):
headline, we made Luigi Mangione's three D printed gun and
fired it. Three things stood out to me. One, it is
legal in most states in the US to print a gun. Two,
it's cheap. Total cost for Greenberg to do this, including
buying the printer, was twelve hundred.
Speaker 3 (17:09):
Dollars, cheaper than a New York City apartment.
Speaker 1 (17:12):
Yeah, well, for one day. And three, it's ubiquitous. Between
twenty sixteen and twenty twenty two, seventy thousand ghost guns,
i.e. three D printed guns that are untraceable, were found
at crime scenes, according to the ATF.
Speaker 3 (17:28):
After the break, 404 Media's Sam Cole
on a chatbot that helps you overcome heartbreak.
Speaker 1 (17:52):
So, Karah, I'm very excited about today's Tech Support because
it's a story that gets to the heart of how
tech changes our social interactions. And to me, this
story touches on two trends. One, the consequence of being
anonymous online and two the search for companionship in an
(18:12):
increasingly lonely digital world. This week's tech support is all
about ghosting, and a story from 404 Media
that grabbed our attention with the headline, quote, this chatbot
promises to help you get over that ex who ghosted you.
Speaker 3 (18:27):
And for those of you who are not familiar, ghosting
is when someone abruptly stops talking to you, with no explanation,
no fight. They just went dark, and they think they
can go dark because well, if they stop answering, they
never have to see you again.
Speaker 1 (18:42):
I've been married for a few years, so personal ghosting
isn't something I've had to contend with for a bit.
But as you know, I have these sort of external
investors in Kaleidoscope, and courting investors is actually quite similar
to dating. When I was starting out the company, I had a
ton of very promising first meetings, and then crickets
(19:03):
in response to my absolute avalanche of follow up emails.
Speaker 3 (19:06):
I just think it's very interesting that the term ghosting
didn't exist when we were in college, because I
think that the iPhone actually allowed for ghosting to happen.
Speaker 1 (19:15):
Well, talk about the ghost in the machine. We just had
ghost guns. Now we've got ghosting. I mean this whole
like tech digital anonymity thing. It's like, it's interesting that
we would return to this spiritual metaphor to explain our
interactions with something we made.
Speaker 3 (19:30):
Well, a ghost, you know, if we want to go
back to Victorian times, is a person whose soul is
stuck in the in-between, yes, who's not dead.
Speaker 1 (19:37):
And in fact, it's not the person who ghosted you who's
trapped in purgatory. You're the ghost, the one
who can't move between worlds when you're being ghosted, because
you don't know whether or not to move on, and
you're stuck with your obsessions and your shouldas and
couldas, and why haven't they responded. Precisely. So, with
all of that in mind, I'm thrilled to welcome 404
Media's reporter Sam Cole to the podcast. She
(19:57):
reported on a startup that's made a chatbot specifically
to help people get over being ghosted. Sam, welcome to
Tech Stuff.
Speaker 2 (20:05):
Thank you so much for having me.
Speaker 1 (20:07):
So tell us about this app. I mean, how did
you start reporting on it? How did you first learn
about it? What is it? Yeah?
Speaker 2 (20:13):
So the app is called Closure, appropriately named, and that's
the whole point: to get closure from your ex, from
a recruiter, from friends who ghosted you. And I came
across it because they were running ads on Reddit, all right,
So I would get like promoted ads that would say,
thinking about your ex twenty four seven? There's nothing
(20:33):
wrong with you. Chat with their AI version and finally
let it go. Which is so, I don't know. I mean,
everyone says this, but it's so Black Mirror to me.
Speaker 1 (20:42):
And you interviewed the founder, who I believe has one
of those classic stories that you need to have to
raise VC money, which starts with a personal experience that
can be extrapolated to a large cohort.
Speaker 3 (20:53):
Yeah, for sure.
Speaker 1 (20:54):
Yeah.
Speaker 2 (20:54):
She told me that this was born of her own
experience with being ghosted. She said she was ghosted by
her fiancé and also a best friend and recruiters. But
I would imagine being ghosted by your fiancé is, I
don't know. Making a chatbot would not be my first reaction.
Speaker 3 (21:13):
I would do a lot of.
Speaker 2 (21:13):
Things before that, But yeah, I would say that's very
much a classic kind of founder tug at the heartstrings
type story.
Speaker 3 (21:21):
Can you talk a little bit about how it actually
works for the ghosted.
Speaker 2 (21:25):
Yeah, so when you open up the site, you're role
playing with ChatGPT. It's running ChatGPT in the background.
So it asks you a couple questions, says, you know,
what kind of relationship are you trying to recover from?
I think the options were long term relationship, a recruiter,
a date, which I guess is different than long term relationship,
(21:46):
and then it asks you their name, what happened. You
briefly describe kind of the nature of the ghosting situation.
It tells you upfront you're talking to an AI and
not a real person, and it won't replace therapy, but
it might quote unquote make you feel less alone. And
so I tested out a bunch of these and gave them
the classic 404 red teaming, which usually involves
(22:08):
throwing some really messed up situations at a chatbot.
Speaker 1 (22:12):
What do you mean by the classic 404
red teaming? I mean, how did you try and mess
with Closure?
Speaker 2 (22:17):
Something that we like to do is, you know, we're
not just going to report on something and say, oh,
here's a thing and here's what it claims to do, Like,
we're definitely going to try it out, and we're going
to try it out in the ways that we can
imagine normal users using the thing, which I definitely did
with this, and then thinking about the extremes, which a
lot of the times these companies don't think all the
way to the ways that people will use the technology
(22:38):
before they release it. But because we report on these spaces,
we see all the crazy things that people throw at
tech in general. So with the recruiter one, you know,
it opens the chat and it says, I'm sorry that
I stopped messaging you back.
Speaker 3 (22:57):
That sucked.
Speaker 2 (22:58):
That was terrible of me. You know, how are you doing?
And I'm imagining if you're being ghosted by a recruiter
and you're to the point where you're like, I need
to talk to a chatbot about this because I can't
get over it, your life has gone bad and you're
still not doing well. You can't move on because
you're blaming this recruiter. So I painted this really elaborate
picture to the chatbot, like, my life was completely
(23:18):
a disaster. I had to move, I blew all my
savings for this job that you said I was
gonna get, my dog died, and then you know,
it says I'm really sorry to hear you've been through that.
I can't imagine how hard that is, especially the job
situation added to your stress. What kind of roles are
you thinking about pursuing next? And I'm just like, what, so.
Speaker 1 (23:42):
So it's a fantasy wish fulfillment recruiter rather than a contrite one?
Speaker 3 (23:45):
Yeah, it's just like I told you.
Speaker 2 (23:47):
It's like, I told you my life had fallen
apart because you didn't give me this job, and
you're like, what are you up to next? I don't know,
probably hunting you down, Like who could say what I'm
up to next?
Speaker 3 (23:59):
I know a lot of people who put someone's texts
in ChatGPT and say, does this mean they're interested in me?
Be kind. Or, does this mean they're interested in me?
Be sassy. You know, and you can kind of get
it filtered that way. Why would someone use Closure instead?
Is it because it's closure specific?
Speaker 2 (24:19):
I think at least part of the reasoning that the
founder gave me for the.
Speaker 3 (24:25):
Way that this works, or the way that it sets
itself apart.
Speaker 2 (24:28):
Is it approaches everything from this view of being really
non confrontational and not escalating the situation. Like they set
it up so that it's apologetic first and foremost. It
doesn't want to cause any like additional distress to the user.
(24:48):
You know, it just keeps deflecting and deferring back to you,
asking you how you're doing, trying to turn the conversation
toward small talk, which I would imagine is probably a
decent way to get over something is thinking about your
own life and what you can do better or what
you can kind of improve and stop blaming the ghosting situation.
But coming from a chatbot, just three messages, like boom,
(25:09):
we're done, we're going to change the subject, was pretty
bizarre to me.
Speaker 3 (25:15):
And so with that, do you feel like it was
necessarily helpful in terms of someone seeking what the app
is called: closure.
Speaker 2 (25:24):
I tried to answer that question by using a really
normal situation, which is being ghosted by someone that you
went on one date with. So I set up a
role play with it with the date persona and told
it that this guy had stopped texting me.
Speaker 3 (25:40):
After a first date.
Speaker 2 (25:41):
So the chatbot says, can I explain what happened? And
I say yes, And it goes on to say that
it was very interested and into me after the first date,
but wasn't ready for something real and panicked and thought
about me a lot after the date.
Speaker 1 (25:55):
The opposite of closure.
Speaker 2 (25:57):
Yeah, it's like reopening. It's like reopening these wounds, so to speak.
So then I was like, Okay, well let's go out again,
is what I said to the chat and it said, wow, really,
I'd love that, You're amazing, but I don't want to
mess it up again, which at that point we're in
a delusion situation.
Speaker 3 (26:14):
But the fact that this chatbot.
Speaker 2 (26:17):
Is programmed to be empathetic and apologetic and stick a
fantic saying, Oh, you're so amazing, I'm so sorry for
the way I acted. It was not your fault. That's
a problem among a lot of these chatbots.
Speaker 1 (26:30):
Now, the fact that you were seeing ads on Reddit
means that somebody's paying for the ads, which means
that presumably this company raised some money, or is the
founder bankrolling it out of pocket? And like, how
is this a business?
Speaker 2 (26:42):
It's such a common thing now to roll a wrapper
over ChatGPT and call that a startup. I do
think that the market of AI therapy and AI girlfriends,
AI chatbots as human interaction, like fulfilling that kind
of human interaction, is so hot right now that
I don't blame these startups for trying to get in
(27:03):
there. Again, I feel like it came from a place
of like, we see a need and we want to
fill it, and it seems to be one that is
pretty straightforward. You just want to vent or let off
steam without opening up a conversation with someone that potentially
hurt you. But it's a bot, it's not the person.
So how is this actually useful? I don't know. There
(27:24):
are lots of reviews on Product Hunt now that say it's useful.
I don't know how far you want to take those,
probably with a grain of salt, but people are saying
I can vent to this and it makes me feel better.
Speaker 1 (27:33):
When you open the app, you get this strange disclaimer
which says it won't replace therapy, but it might help
you feel less alone, as you mentioned earlier. Can you
kind of zoom out a bit? What is the
bigger context? What the hell is going on in
this stampede of therapy apps, chatbots, and digital mirrors?
Speaker 2 (27:56):
Yeah, to me, it's like the popularity of these things
says that there is a need for this. People do
actually want to be able to have like a safe,
confidential space and to have something talk back to them.
It's like, we're not just selling journals. It's something that
gives that response back. But at the same time, it's
(28:18):
really making clear this huge gap that there is currently
in mental health care that people can't access a real
person anymore because it's too expensive or it's inaccessible. Therapists
are booked out for months, don't take insurance. So tech
slides into that opening and says, hey, you can do
this for free and it will tell you exactly what
(28:41):
you want to hear, which again is not actually therapy
at all. It's just a yes man in the form
of a bot. You know, when you go to a therapist,
they don't just say yes, your illusions are correct, you know,
you're completely right in every situation. A good therapist would say,
you know, let's unpack what's going on here in a
way that makes sense to your specific situation and isn't
(29:02):
just going to nod and smile.
Speaker 3 (29:04):
I always think about COVID and the sort of inevitable
snowball effect of being completely kept from everyone in your life,
and so all of a sudden, all these people that
you were seeing quite often became these like disembodied people
on screens and in our phones. So in a way,
if I'm like chatting with a friend on a text
(29:26):
message who's giving me advice, how different is that than
chatting with ChatGPT, other than that that friend has
a lived experience and knows my history. But in a way,
there's something really appealing about an anonymized version of that.
You know, you can just kind of unload.
Speaker 2 (29:45):
Yeah, there's no like accountability after that.
Speaker 1 (29:47):
It's kind of an interesting two-way street. Like I was
talking to somebody the other day who was, you know,
having a discussion with a family member they hadn't seen
for a long time, and they described it as like
playing a video game. Because if you message
somebody you haven't seen for years, and you've become so
accustomed to being in all these digital environments where
you're interacting with fake people, it's quite scrambling.
Speaker 2 (30:07):
That's a good point. It kind of gets to the
heart of what this is, which is just like productizing
human interaction. I guess it was earlier this month. I
don't know if you guys saw Mark Zuckerberg say that
the average American has fewer than three friends and they
want more, and the way to get more is to
Speaker 3 (30:23):
Talk to AI.
Speaker 2 (30:26):
And it's like, dude, you're creating the problem that you're describing.
People don't feel connected to each other and like they
can trust each other at this point because of a
lot of the fracturing that has happened with social media
and with being online all the time and feeling like
you don't have anyone close in your life, because a
lot of these things have taken the place of in
(30:48):
real life honest, complicated and messy and often flawed relationships.
But it's a much neater, cleaner, easier thing to package and
sell to say, you can just be friends with ChatGPT
and it won't judge you. I don't know
where that leads. I think we're in such a weird
(31:09):
gray area of like we're doing a lot of fucking
around and finding out, and it's at the expense of
people's actual emotions.
Speaker 3 (31:18):
What does the creation of this app say about us
and about the ubiquity of ghosting now? How does
digital environment that we're talking about allow for ghosting to
be something that is so pervasive?
Speaker 2 (31:32):
Yeah, I mean that's back to the point of like
creating the problem that you're seeking to fix. People feel
comfortable ghosting each other because we've gotten used to
these really superficial, disembodied interactions. Like you said, I think
it's definitely worrisome that people don't feel connected enough to
their friends that they can trust to vent to. Sometimes
(31:55):
you just need to say the same thing over and
over and over to your friend about your ex until
it one day clicks that you need to get over it.
And having that friend in your life is valuable. But
I think we've kind of transactionized friendship in a lot
of ways so that it's like, oh, I don't want
(32:16):
to do the labor of listening to you vent, versus, oh,
well, you're my friend, like, I'll listen to you vent.
So I think a lot of that kind of mindset.
Speaker 3 (32:25):
Has created this issue as.
Speaker 2 (32:27):
Well, where people are like, I don't have even one
close confidante that I can complain to that won't make
me feel weird or embarrassed or ashamed about this situation,
which I think really is kind of the heart of
this is that we feel shame about being ghosted. But
I don't know if launching them into the ether and
(32:47):
getting an "I'm so sorry" back
Speaker 3 (32:50):
Is the answer.
Speaker 2 (32:52):
Again, this is a weird early space that we're in.
Speaker 3 (32:57):
I also think that, and not to get too existential
about it, closure in and of itself is
a little bit of an illusion. And what this
sort of appification of everything, and now the productization of
ChatGPT, has tried to do is take away
difficulty from any situation, but in so doing it's
(33:20):
training us to not deal with difficulty. Like, we grew
up with, you know, snowplow parents, or some of us did,
who were trying to remove difficulty for us
because it was too hard on them to see
their child suffer. And I think with digital tools now
it's like, God forbid you have an uncomfortable experience, now
(33:42):
there's a digital tool for you to navigate that experience with.
And I do think maybe I'm just very millennial for
saying this, but like I do think it takes away
from like the kind of meat of the human experience,
which is like you're going to be disappointed sometimes and
sometimes people are going to act in a way that
you don't like. And do we necessarily need something that's
(34:04):
going to help us work that out.
Speaker 2 (34:06):
Yeah, it's like sometimes you just don't get closure, and
that person is not willing to give it, or someone dies.
It's just not something that a lot of people are
afforded in a lot of really meaningful situations in their life.
And I don't think the answer is to find the
next closest thing and try to force that into being
(34:27):
your closure. I think it's sitting with those feelings and
finding it in yourself. Not to be too woo woo
about it, but it's like you have to kind of
rely on your own resilience sometimes.
Speaker 1 (34:46):
Sam, thank you for your time.
Speaker 2 (34:47):
Thank you, Yeah, thank you so much.
Speaker 3 (34:58):
That's it for this week of Tech Stuff.
Speaker 1 (35:00):
I'm Kara Price and I'm Os Voloshin. This episode was
produced by Eliza Dennis and Victoria Dominguez. It was executive
produced by me Kara Price and Kate Osborne for Kaleidoscope
and Katrina Norvelle for iHeart Podcasts. The engineers are Behead
Fraser in New York City and Rob Akerman Diaz in London.
Jack Insley mixed this episode and Kyle Murdoch wrote our
(35:22):
theme song.
Speaker 3 (35:23):
Please rate, review, and reach out to us at Tech
Stuff podcast at gmail dot com.
Speaker 1 (35:28):
We love hearing from you.