Artificial intelligence is becoming part of everyday life. People now ask AI for advice, opinions, and even guidance about personal decisions. But this raises a deeper cultural question: what happens when machines begin to replace human wisdom and spiritual guidance?
In this episode of Into the Desert, we explore the growing role of AI in shaping how people think, decide, and understand the world. If more individuals are turning to artificial intelligence for answers about life and meaning, are we slowly confusing technology with something more profound?
This conversation looks beyond the technology itself and asks a bigger philosophical question: what is reality? From modern AI assistants to ancient thinkers like Plato and Aristotle, the struggle to distinguish between appearance and truth has always been part of human history.
Join us as we explore whether AI is simply a tool—or if our culture is beginning to treat it as something much more.
👍 Don’t forget to like, subscribe, and share if you enjoy thoughtful conversations that encourage discussion rather than division.
Tired of divisive finger-pointing? We need your support. Join us on Patreon https://www.patreon.com/c/IntoTheDesert
Ryan George AI Video: https://www.youtube.com/watch?v=zDYabud11jQ
Purchase your copy of Seeking Truth by Elgin L. Hushbeck Jr. today: https://amzn.to/3UCWnSS
Audible Version: https://www.audible.com/pd/B0B399JRWY/?source_code=AUDFPWS0223189MWT-BK-ACX0-310976&ref=acx_bty_BK_ACX0_310976_rh_us
Narration: Larissa Munz
Image by pngtree.com
#ArtificialIntelligence #AI #FutureOfAI #TechnologyAndSociety #Philosophy #Reality #HumanWisdom #DigitalAge #AIEthics #CriticalThinking #IntoTheDesert
Transcript
00:00One church is using AI for confessions.
00:04Meanwhile, Pope Francis says priests should not be using AI for their homilies.
00:10Is artificial intelligence in the confessional booth?
00:14This isn't just about technology.
00:16It raises deeper issues about the culture.
00:19If people are asking AI for advice and opinions,
00:23are we slowly replacing human and spiritual guidance with machines?
00:30And if that's happening, what does it say about how we really understand reality itself?
00:36What is reality?
00:38Welcome to Into the Desert, exploring the wilderness of ideas.
00:42Don't forget to hit those like and subscribe buttons.
00:44And we need your support.
00:46The algorithms are pushing divisive content while hiding serious discussion.
00:51Please become a fellow adventurer on Patreon and support this channel's mission
00:55of encouraging discussion, not division.
00:57The link is in the description.
01:00So, Larissa, I read a couple news reports recently.
01:04One of them was a church in Switzerland that was using AI for its confessional booth.
01:10And then another was Pope Francis telling priests not to use AI.
01:16So, what's going on here?
01:19Well, I think what we're seeing is something that's part of a larger issue.
01:24On the one hand, religion is a relationship with God.
01:27The purpose of religion is to have that human relationship with God.
01:32So, when you bring in anything artificial, you basically take the humanity out of it and therefore defeat the purpose
01:40of religion.
01:40You can't substitute humanity with artificial anything.
01:48That's why it's called artificial.
01:50You know, you can have an artificial plant or you can have a real plant.
01:54And there is a difference.
01:55There is a massive difference.
01:58I don't care how good the artificial plant looks.
02:00I don't care how much perfume you put on your artificial plant.
02:03It can never be a real plant.
02:07Same thing with AI.
02:09You cannot have anything artificial become real.
02:14I don't care how much it looks and feels and smells like the real thing.
02:19It is, by definition, artificial.
02:23And so, when the Pope is saying that priests shouldn't use AI for homilies,
02:33it's because you can't put something artificial into real life.
02:33You can't take that humanity out of religion because then it's not religion.
02:39It's something else.
02:41Artificial religion.
02:42We'll call it AR, which, I know, is already taken by augmented reality.
02:46But again, this is part of a larger issue, right?
02:54When my children were young, I bought them Echo Dots.
02:58And this was because I wanted them to have music to fall asleep to.
03:02And I would set it to like a classical music that they could just kind of fall asleep to at
03:07night.
03:07And then it would have an alarm clock for them in the morning.
03:09So, I could buy this and it was like on sale.
03:12So, it was cheaper than any sort of other mechanism I could get that would do that function for me.
03:18So, I bought them little Echo Dots.
03:20And within six months, they started talking to it like it was a real thing.
03:26And I had to teach them that, and we call it She Who Cannot Be Named because, you know, you
03:32say her name and then suddenly she's listening and you don't want her to.
03:35So, we'd say, She Who Cannot Be Named is not a real person.
03:39You can't talk to her like a real person.
03:42And it took a long time for them to understand that.
03:45That was a real lesson that they had to learn.
03:47And now I see all of these commercials about, you know, what you can do with AI and technology.
03:54And, you know, even Ryan George last week put out a video about having AI write an email to his
04:00boss and, you know, a great tongue-in-cheek look at how bad it can go.
04:03But this idea that it can replace us as humans is very scary because I think people are confusing what's
04:17real with what's fake.
04:20Because like the fake flower, it looks and it smells like a flower, but it's still a fake flower.
04:26And if people don't understand the difference between reality and technology, I think we're in for a very rude awakening.
04:38Well, yeah, it's not really just a new problem with AI.
04:42You can go back to the fact that our view of reality is shaped by our experiences.
04:47And you go back far enough, you start finding people are writing down experiences that never happened.
04:56And we call them books and novels, stories.
05:01And, you know, if you immerse yourself in, say, Dungeons & Dragons-type stories, mythologies, or fantasy, it
05:12begins to affect how you think.
05:14I mean, it's one of those things that if you do a little bit of it, it's sort of harmless.
05:19But if you read a lot, it can begin to shape who you are.
05:23And you see this in books, and then you see it in movies where, since
05:30it's a visual presentation, we actually see people and begin to confuse the characters in the movies with real
05:40people.
05:40And, you know, you talk to people, they're reading a book, and an important character dies and
05:48they cry.
05:50They feel sadness.
05:53Why?
05:54Because, you know, there's a confusion going on with reality.
05:57And then you move into the TV age, where instead of consuming these alternate realities on an occasional basis,
06:06we started consuming them more frequently.
06:10And then you move into the Internet age and now the cell phones and the total connectivity and, you know,
06:17particularly with VR.
06:18And that blurring that existed somewhat before is now becoming very real.
06:27And, you know, they've done studies on this.
06:29There's a study where they surveyed women's views of love and found a very strong correlation between various opinions about
06:39love and various types of media these women consumed.
06:44And so, you know, it's kind of a chicken and egg problem.
06:48Are they consuming that media because they have those notions of love, or is their view of reality
06:55being shaped by what they see on the screen?
06:59Because they see characters and, you know, it's sort of like the old story about lawyers complaining that they
07:05can't get a guilty conviction because someone was watching Perry Mason, where the guilty person always confesses at
07:12the end of the show.
07:13And since the person didn't confess, they must have been innocent. Whether that's mythical or not, it typifies the view
07:21of reality that people begin to form.
07:25You see things happen in the movies so much that they begin to think they're actually real and it affects
07:33how they look at the world itself.
07:36Well, yeah, and you see this, you know, the nice thing about these kinds of media, right, is they get
07:44to the point where they start showing you a little bit of real life.
07:49So you see movies and TV shows and books where one of the characters is famous and then something bad
07:56happens because someone assumes that they are like their character. For an actor, for example,
08:03they assume that the actor is the character and not a person.
08:08And so if you act like a bad guy, people are going to say, oh, you're a bad guy and
08:12they're going to be scared of you, even if you're the nicest person in the world, because they can't separate
08:18the actor from the character that they play.
08:22Yeah, there's the whole concept that we blur reality and fiction together because we consume so much media.
08:33You have a whole generation now growing up on their cell phones, and you could argue
08:40their reality is more the artificial reality they see on their cell phones than the reality of the people around them.
08:49Their whole view of life is, you know, what they see on TikTok or the
08:57videos they watch.
08:58They don't consume much else; they don't actually go out and talk to people and find
09:04out what's going on.
09:05They're on their phones getting this artificial view of what everybody else is doing.
09:11Yeah, and there's this expectation that if you are posting online, so you're an influencer or a podcaster or
09:23someone with some sort of online presence, what you post online has to be a perfect version of your
09:34reality, of who you are.
09:35Which by definition ends up making it fake and unreal, and you see that in a trend of
09:45videos now showing that these influencers are lying, that these people aren't who they say they are.
09:51And I think it sets up bad expectations because you have, you have people who are literally going through traumatic
09:59facial reconstruction and body reconstruction surgeries in order to look like some filter that you have on your phone.
10:09And when your view of reality is so skewed that you're willing to put your body through extreme pain to
10:19make reality fit what you think reality should be, again, it's very scary.
10:27Yeah, it's not a new problem.
10:30You know, you see this with people who are anorexic or have some other disorder where
10:37their view of reality doesn't match the real world.
10:41Their view of reality is that they're overweight and they need to stop eating.
10:47In reality, they're starving themselves to death.
10:51And this mismatch between reality and their view of reality is, you know, in that case, extremely harmful.
11:00And we, you know, to some extent we all suffer from this.
11:03We all have our views of reality, and we all have opinions about
11:10which TV shows are the best, which politicians we like, who's good, who's bad.
11:16And they all, at their core, go back to a view of reality that has been molded over the
11:26years as we grew up and matured.
11:29But if that reality is now an artificial one created by AI, then, you know, what's that going to be?
11:40Um, what's that feedback loop going to end up generating?
11:44And, you know, some respects life is hard.
11:48You know, one of the summaries of a philosophy of life I heard once was: life
11:55is suffering, and then you die.
11:58Which is not a very rosy way of looking at things, but through human history, it's
12:05actually probably the more accurate view.
12:09Life is very hard and we suffer and it's not easy to negotiate through life.
12:16You know, one of my favorite sayings is: life is what happens while
12:23you're making plans.
12:24And if you have a view of life that's artificially shaped, not grounded in reality, then that difference between your view
12:34and reality will come crashing in at some point.
12:38About 15 or 20 years ago, my ex-husband and I hosted exchange students and were support coordinators.
12:45Part of that responsibility was that when the students would come in at the beginning of the school year
12:52(we dealt with high school students),
12:53we would get them all in a group and talk to them about American culture and
12:58some of the things that are a little bit different.
13:00Like the fact that you can't just expect to hop on a bus and go wherever you want.
13:04The fact that America is so much larger than you could possibly imagine, and going from one state to another
13:15can often take longer than it would take to drive across Europe.
13:19So it was kind of about trying to set expectations. But one of the activities we would do
13:25with them is we would have them sit down and
13:29talk about what they expected of their year here in America.
13:36And one of the things we tried to warn them about is that those of you who have very high
13:42expectations of what life is like in America, for whatever reason, because you watch sitcoms or you've watched a
13:50lot of videos about it or you've read books about it or you've met Americans,
13:55if you come in with a lot of expectations of this is what it's going to be like, your year
14:01will end up being worse than if you had no expectations, than if you just said, you know what, I just want
14:08to see what happens.
14:09I'm hoping for the best, but I just want the experience.
14:12Those people had a much better year, a much better time.
14:17And honestly, we didn't have to really talk to them much throughout the year. But those that had the "well,
14:23it's supposed to be like this" attitude, those were the students we had issues with.
14:28We had to move them out of their houses or we had to have conversations with them about their school
14:33and, you know, expectations.
14:35And it was kind of a hard situation. Basically, it's like the story you were telling a while ago
14:43in one of our other videos about the two people who were getting a bonus: one
14:49person expected the bonus and the other person didn't.
14:52And the one who did was disappointed because it wasn't as much, and the one who didn't expect the bonus
14:57was like thrilled because they got money from nowhere, even though the amount was exactly the same.
15:02So again, expectations of reality will affect your internal happiness.
15:12Yeah, and this goes back to the whole issue of AI. We build our expectations off of what
15:20we see around us.
15:22If we think everybody else is doing fine because that's what appears on, you know, the social media that we
15:31follow, then we're going to look at our own life, which is not doing fine and think that, you know,
15:36there's something wrong.
15:37You know, it can generate anger, it can generate depression, but it's not going to be a good thing.
15:46And so when you look at AI, I side with the Pope here. Now, I'm not Catholic, but I've spoken,
15:54you know, I've done sermons before in churches.
15:57I don't do it very frequently, but when I do, you know, it's something I pray
16:04about quite a bit.
16:05What am I going to talk about?
16:07And, you know, that interaction is where I end up developing what I'm going to talk about.
16:14Well, I think that's coming from God.
16:19Now, if I'm just asking AI to whip out a sermon for me, where's the room for the Holy Spirit
16:26to come in and speak to me?
16:29You know, I'm just following what the AI told me to do, and therefore I'm not following God.
16:35I'm following some artificial computer intelligence someplace that doesn't even have consciousness as we understand it.
16:46So I think the Pope is right. The idea that you would develop your homilies, which is the Catholic
16:56equivalent of a sermon, using AI: yeah, I can see the temptation.
17:03You know, your life is busy, got a lot going on, got to whip out a homily for this week.
17:09And, you know, it's just real tempting to, you know, push that button and have one generated for you that
17:15you can just read.
17:16But there's no room for human interaction or spiritual growth or anything like that.
17:24It's just a computer.
17:27And at what point are we programming the computers, or are the computers programming us?
17:32Well, you bring up an interesting point.
17:34You know, there's always this talk about propaganda.
17:36And anytime anybody says the word propaganda, at least from my experience, it's always said in a negative way.
17:44Propaganda is bad.
17:46Propaganda is evil.
17:47It's used to brainwash people.
17:49And I think maybe by some sort of technical definition that I'm not all up on, that probably is correct.
17:58But I think everything in life is a way of using propaganda.
18:06Like, we use our religion.
18:08We use our culture.
18:11We use our interactions.
18:13When people talk about culture, you have your written rules and you have your
18:17unwritten rules, right?
18:19And your written rules are things like don't steal and don't kill and don't jaywalk.
18:25But then you have the unwritten rules, and for me especially, this was a hard thing.
18:30So in California, you walk down the street and you don't look at your neighbors.
18:33You don't talk to your neighbors.
18:34You just kind of walk down your street and maybe nod, you know, if you have to pass someone next
18:39to you.
18:39Well, you come out here in Wisconsin and there's this unwritten rule of you wave to people.
18:44Even if you don't know them, you're driving along the street and they're walking and you're waving.
18:50And so, again, I feel like that is a type of propaganda, but I also don't think it's bad.
18:57Yeah.
18:58What you said reminds me of a statement from Marcus Aurelius.
19:02Everything we hear is an opinion, not a fact.
19:05And everything we see is a perspective, not reality.
19:08And I think what we as a human society need to do is figure out what the good uses are.
19:16And even in AI, there are good uses, right?
19:20I love the fact that I have AI to correct my grammar and spelling.
19:27I don't always use its suggestions, but I like the ability to see the different options
19:33and choose what voice I want to use in my writing.
19:37But using it for human interaction, you know, the Google commercial where they were saying,
19:44how do I tell my friends I don't want to have dinner with them?
19:48Really?
19:49That's what you need AI for?
19:52I think those are things we need to learn for ourselves.
19:54We need to learn how to do human interactions.
19:57If you're in school and you have to write an opinion about something, an opinion about a book,
20:03an opinion about a movie, an opinion about a class, an opinion about whatever.
20:07Having AI write your opinion, is it really an opinion then?
20:13So I think we need to draw lines to say, this is where we need to use our tools.
20:21This is where we can get our help.
20:23This is how we can have a good society.
20:27And then these are the bad things to do and where we need to have that line in the sand,
20:33so to speak.
20:35So what do you think?
20:37Do you think that AI can be a real human interaction?
20:43Do you think it's useful in our lives?
20:46And what is reality? Is AI part of it, or do we need to make sure that we're always
20:55cognitively aware that AI is artificial, just like an artificial flower?
21:01Tell us what you think in the comments.
21:03We do read your comments and we respond to many.
21:06And if you've made it this far in the video, please consider watching one of our other videos
21:10and supporting this channel on Patreon.
21:12Thank you for watching Into the Desert, exploring the wilderness of ideas.
21:16And to our fellow adventurers, a special thank you for helping to support this channel.