🎙️ Is AI truly the hero of climate innovation—or are we ignoring its hidden environmental cost?
In this sharp and revealing episode of Tangelic Talks, co-hosts Victoria Cornelio and Andres Tamez sit down with Chris Carter, Chairman & CEO of Approyo, to uncover what the surge in AI, cloud systems, and data centers really means for the planet.
With decades of deep expertise across enterprise systems, AI infrastructure, and sustainability technologies, Chris offers a rarely heard insider lens on the tech industry’s energy footprint—and the urgent ethical challenges we’re not prepared for.
This isn’t a hype conversation.
This is a reality check.
🌍 What We Break Down in This Episode
🤖 The true energy + water cost of AI models
💾 Data centers demystified — why most public debates get them wrong
🌡️ The climate footprint of the global AI boom
⚡ Nuclear, gas, solar & more — evaluating real energy solutions
📉 Why most AI projects collapse before delivering value
📈 How AI reshapes global capital, investment patterns & climate finance
🛑 Why human judgment is irreplaceable in AI-driven systems
🧠 The danger of AI hallucinations and manufactured “truth”
📚 What responsible AI validation and governance should look like
🎮 Tech optimism vs hard engineering reality — Chris’s 40-year lessons
👤 Meet the Guest: Chris Carter, Chairman & CEO of Approyo, is a pioneer at the intersection of enterprise technology, cloud systems, and sustainability. With unmatched industry experience, he breaks down the real-world physics, economics, and ethics behind today’s AI revolution.
💬 Join the Discussion: Is AI a climate solution—or a growing climate risk?
👇 Share your thoughts in the comments!
🌱 Support Tangelic’s Mission: Help us amplify clean energy access, climate justice, and community storytelling.
✨ Donate or get involved at TangelicLife.org
🔖 #AI #ClimateTech #DataCenters #EnergyUse #Sustainability #GreenTech #ClimateAction #DigitalEthics #CloudComputing #EnergyTransition #AIFootprint #ResponsibleAI #ClimateInnovation #FutureOfEnergy #Decarbonization #TechForGood #EnergyCrisis #ClimateSolutions #TangelicTalks
📊 AI climate impact, data centers energy use, climate tech podcast, AI sustainability, AI environmental footprint, data center cooling, clean energy transition, tech for climate solutions, AI water consumption, renewable energy and AI, green data centers, climate tech explained, AI ethics and climate, climate justice technology, enterprise AI challenges, AI project failure rate, climate tech innovation, energy crisis tech, future of clean energy, nuclear energy debate, natural gas and climate, solar grid integration, climate solutions podcast, AI misinformation risks, sustainable AI development, cloud computing climate impact, climate tech interview, AI and global power demand, responsible tech leadership, Tangelic Talks
📚 Transcript
00:00And I'm going to say this, and I hope everybody listening to this pings me on my LinkedIn and said,
00:05thank you.
00:05You've got some of the most brain-dead people in Congress trying to make these laws that they have.
00:13I've been in front of congressional folks, and I've talked to them.
00:17I've handed them copies of my book.
00:18I said, you or your people need to read this and learn about what this does and how this does
00:23and what these do and why you're doing it.
00:26Where can humans never be moved from?
00:28Restroom attendant.
00:35I don't want AI to ever replace anything when it comes down to the day-to-day, the real-life
00:47activities that we have where we need social interaction with each other.
00:52So I think your question resonates because it goes into so many different points.
00:57Wow, I can start doing shadow puppets over here.
01:01Sorry about that, all.
01:02You've got to love it when it hits 39 degrees.
01:05We were talking offline.
01:06Global warming had a bad media rep.
01:09It was bad.
01:11Come on, Wisconsin, for that.
01:24Welcome to Tangelic Talks, your go-to podcast from Tangelic, where we dive into the vibrant world of clean energy,
01:31development, sustainability.
01:32And climate change in Africa.
01:35And climate change in Africa, we bring you inspiring stories, insightful discussions, and groundbreaking innovations from the continent making waves
01:41in the global community.
01:42Tune in and join the conversation toward a brighter, greener future.
01:46Let's get started.
01:50All right, welcome, everybody, to Tangelic Talks, a podcast at the intersection of energy, equity, and empowerment with your co-hosts,
01:58Victoria Cornelio and Andres Tamez.
02:00Today's episode, we're digging into one of the biggest questions shaping the future of climate action, how AI is transforming
02:07the climate and sustainability sector, and where we need to be, in my opinion, careful.
02:11But we've got Chris Carter here, chairman and CEO of Approyo, a global technology services company operating at the heart
02:18of large-scale enterprise systems.
02:20And he's going to help us understand if we need to be careful, if we need to be optimistic, you
02:25know, all the things, hopefully.
02:28Chris is widely recognized for using data and emerging technologies to boost efficiency, improve decision-making, unlock better business outcomes,
02:36and he brings a critical perspective on what automation can and can't solve.
02:41Thank you, Chris, for being here.
02:43Hey, thanks for having me, guys.
02:44I'm excited.
02:45This is going to be fun.
02:46Oh, I'm so happy you made it.
02:49Me too.
02:50I have negative 37 degrees outside.
02:52There's our climate.
02:54There's some of our weather.
02:56There's some of the technology changes that are going on right now behind us.
02:59There we go.
03:00See?
03:01Very current.
03:02I like it.
03:03Yes.
03:04I guess, how did you – I am always interested to know how people got into the intersection they're in,
03:09because this is a podcast all about the intersection.
03:12And I guess AI is a new thing, but technology itself, bridging into business, into sustainability, innovation.
03:19How did you find yourself here?
03:21Well, I found myself here a few years – actually, many years ago now, because technology runs in data centers.
03:27And data centers are all about what we need from fuels, from batteries, from air, from sustainability, to the energy
03:37cost.
03:38Yeah.
03:39And I found myself doing a lot and lot and lot of research.
03:43And when I say –
03:43You're the epitome of resources, resource-intensive.
03:47Very resource-intensive.
03:48So I was using my tools, and then everybody started getting into my machine learning, which became AI.
03:54And then I started using my AI tools to get into a lot more research.
03:59And it's interesting, because data centers is what everybody needs in order to keep using their iPhones and their Androids
04:08and their computers and all their chat GPTs and Geminis.
04:11And it's funny, when I started talking with more and more of just, I'll say, everyday people, they didn't know
04:20what we all had to do, what we had to use, what was the power consumptions, what was the –
04:25needing cooling structures.
04:28So I really started to go into detail on that.
04:31And that's where my technology and my intersections – I had so many different intersections coordinating.
04:37I still have every day between my technology and my data center activities and my EV-based activities with solar
04:44power.
04:45And I've got an intersection with technology and what we're doing with cloud computing.
04:50So there's so many different intersections out there to start looking at and start feeling aware of.
04:57Yeah, that's really cool.
04:58I mean, you mentioned that the everyday person doesn't really know, you know, the power consumption, water consumption, all these
05:04things.
05:05Do you reckon that's by design?
05:06Or is it because it's so hard to understand?
05:09Is it a marketing issue that we're not telling people?
05:11Like, why don't we know?
05:13That's a great question.
05:15Because when I bring this up with my parents, my folks will think that, oh, my gosh, you know, you're
05:22using so much power to run these things.
05:24And my dad doesn't even like to use a cell phone.
05:27We finally got him out of the flip phone and got him into an Apple phone several years ago, which
05:31was amazing.
05:33Technology is funny.
05:34You can't get him on an iPhone.
05:36But at the end of the day, I think it's a combination.
05:38People don't want to know.
05:40People don't know.
05:42You get folks that are out there that are so loud about one side of it that they don't bring
05:51the real truth to some of it because that's not their agenda.
05:55And then you have the other side where it's such a benefit.
05:58It's such a huge need that they just want to build and they want to get it up and they
06:03want to do the activities and they want to provide for people.
06:06And I'm a helper.
06:08At the end of the day, I'm a nurturer.
06:09I'm a grower.
06:10I want to help people.
06:11So I want to get things up and going as quickly as possible in the most efficient and effective ways
06:16possible.
06:17I'm not a loud, boisterous person.
06:19I'm not going to sit in a board meeting and pound my fists on the table.
06:23You can't do this.
06:25You can do this.
06:26You must do it.
06:28I leave that for other people who want to be loud and obnoxious.
06:32I'm going to give facts and I'm going to state what's out there.
06:35I'm going to state the obvious and then I'm going to go deeper and deeper.
06:39So I think your question resonates because it goes into so many different points.
06:43Wow, I can start doing shadow puppets over here.
06:47Sorry about that all.
06:49You got to love it when it hits 39 degrees.
06:53Global warming had a bad media rep.
06:55It was bad.
06:57Come on, Wisconsin for that.
07:00But I digress.
07:01So it is a number of different touch points on there.
07:05That if you just change the dynamic one degree, all of a sudden people are going to want to
07:13build hundreds of data centers because they're so important.
07:16All of a sudden they're going to want to use nuclear and natural gas because it is so prevalent and
07:22it's so much safer nowadays and so much better compared to coal fired and so on and oil and gas.
07:28Or then you're going to get people with water activities.
07:31People don't understand that 99 point some percent of the water used inside of a data center is reclaimed water.
07:40It's just, it isn't potable water.
07:42It's reclaimed.
07:42They don't realize.
07:43It's so funny.
07:44I was I was at an event in our local state here.
07:49We've got a number of different data centers that are being proposed and somebody's yelling at the top of their
07:54lungs.
07:54We're going to lose our water table.
07:56And I looked at them and they were literally just in front of me and two people over.
08:02And I whispered to the guy next to me and I goes, he doesn't understand that the water is reclaimed
08:07water.
08:08Does he even know what reclaimed water is?
08:10And he heard me and he spun on his heels.
08:12He goes, what do you mean reclaimed water?
08:14What is reclaimed water?
08:15I don't understand what's going on.
08:16Why doesn't anybody explain this to me?
08:18And the gentleman had to be at least in his upper 70s.
08:23Wow.
08:23So when I go to a lot of these board meetings, I have to present
08:30Ph.D.-level information. But I'm the type of person that likes to bring it down.
08:37I'm a very common individual.
08:39I'm very salt of the earth.
08:41I'm very blue collar.
08:42I was raised that way.
08:43Just so happened.
08:44I went off to college and became educated.
08:47I got lucky.
08:48Yeah.
08:48I want everybody else to understand it the way I do.
08:51And so I stood up and I asked a question.
08:53And I said to the board that was out there listening and there were, I was not affiliated with anybody
09:00pitching this data center.
09:01Okay.
09:02I said, I just want to make one note.
09:05You guys have been talking about water.
09:06I heard about 15, 20 people come up here and talk about water and the water table.
09:10I said, once you put that water in for the first and only time, it's like filling your swimming pool.
09:15Okay.
09:16Now it becomes reclaimed.
09:18The systems that they put in place cool the back of the computers and they go through a reclaiming process.
09:27And so it keeps getting reclaimed within the system.
09:35You're not filling a whole swimming pool with water every single day.
09:35And the cool thing with that is you also get cooling aspects by wind and activities that come through.
09:43Like in Wisconsin or perfect one is Las Vegas.
09:47They have a great facility.
09:50It's called switch.
09:51Okay.
09:52They have airflow coming through and that airflow gets trapped into the filters and the activities.
09:58So it helps with cooling using the water and it gets reclaimed.
10:02So now water droplets do get reclaimed.
10:05And I started to explain some of the stuff and literally at the end of the meeting, I got pulled
10:10aside and people started asking all these different questions about it.
10:15This is why you need to have people have conversations and we need to talk, not yell at each other.
10:21Let's have a basic conversation.
10:24And I said to him, sir, you're probably the same age as my dad.
10:27I said, are you 77?
10:29He goes, well, yes, I am, young man.
10:31I said, well, thank you for calling me a young man.
10:33But let me explain to you what this does.
10:36And we literally had the same conversation between he and I.
10:40Nobody argued.
10:41Nobody raised their voice.
10:43And he walked away going, I never knew that.
10:46Yeah.
10:47And that's the thing.
10:48When you help educate people, and I don't care what you're doing.
10:51I don't care if it's stuff that's going on in the world today, if it's the data centers or if
10:55it's other.
10:55When you have normal conversations and you simply educate each other, people are very smart in one area.
11:03It may not be in another.
11:04I'm horrible in HR.
11:06You wouldn't get me doing any HR.
11:08So I'm going to have a conversation with great HR.
11:10But when it comes to technology, I'm pretty darn good at it.
11:14So I like to help people with that.
11:15That's really cool.
11:16And you sound very much like a tech optimist.
11:19But also not one that is, well, part of my language, but up your ass.
11:25You know, you're very much like the innovation is exciting because, right?
11:30Yes.
11:31Do you understand why people are scared, though?
11:33Oh, gosh, yes.
11:34Oh, yes.
11:36Hell, I'm still afraid of AI.
11:38I think because, to be honest, I have a philosophy that we have not even gone.
11:44I'm a baseball guy.
11:45So I love baseball.
11:47Don't like the Willie Peralta trade, but that's a different story for a different day.
11:51I think that from an AI perspective, we have not even gone into the clubhouse and changed
11:57into a uniform to start batting practice yet.
12:01And it's so fresh and so new.
12:03I cannot wait for the Terminator because I think the Terminator is going to be
12:07here in about 20 years.
12:09Probably 10.
12:10I think one of the things is a big misunderstanding of the issues because AI, well, right now,
12:15at least, AI is not the scary thing, right?
12:18And the scary thing isn't even the data centers, even though they have their big issues.
12:23They really, really do.
12:25It is the fact that it's the arms race.
12:28It is the, on a global scale, we have to beat China.
12:34Beat them to what?
12:35We don't know.
12:35But we have to beat China.
12:37And on the national sense, you have every single company just implementing so much with
12:47complete disregard because they need to stay at the cutting edge.
12:50Yes.
12:50Because if they're not on the cutting edge, then they will fall behind.
12:54But that is going to generate so much e-waste and so much electricity just boomed.
13:02And they don't even know what is cutting edge.
13:05They literally have been...
13:06Absolutely.
13:06That's why that MIT, there was an MIT document that came out probably six months ago now where
13:13it said that 93% of all AI projects, enterprise AI projects fail.
13:18And I read through that thing, soup to nuts.
13:20And I looked at them and I said, well, I know exactly why these fail.
13:24You've got somebody over here that said, I need to do AI.
13:28And they said, find me an AI project.
13:31And you had somebody over here that says, oh, I play with AI.
13:34I can do an AI project.
13:35What do you want it to be?
13:36And then you read through and they're like, well, wait a second.
13:39Why are you using Canva for finance?
13:42Why are you using this?
13:44Why are you using...
13:45I'm like, guys, gals, come on.
13:50Who is...
13:51Who is running the ship at this point?
13:54Well, you definitely don't want the CEOs running the ship.
13:57I have so many resources put into what essentially seems like a, like, like stock bait, like, like market bait.
14:06Right.
14:07And it is, that is the danger.
14:10AI is a cool technology.
14:12Now, at the point it is right now, what the CEOs want it to be is not what it is.
14:17It's just, it's just not.
14:19But, like, it is, because, like, I have a, I have a friend.
14:22He works, like, for the power grid.
14:25And they, they, they, they're like, here's AI.
14:28Use it.
14:29And it's like, you're giving me AI for a problem that does not exist.
14:32I have proprietary tools that are right 100% of the time.
14:36They've been right 100% of the time for the last, like, 20 years I've been here.
14:41Right.
14:41Like, why, why am I going to ask something that could possibly give me the wrong answer or hallucinate?
14:47Like, why are you, why are we implementing this here?
14:50It doesn't make any sense.
14:51I tell people if they want to use AI, great.
14:55If you're going to use it for writing emails and all that kind of stuff, use Gemini, use ChatGPT, whatever
15:01the case may be.
15:01If you're going to use Funky, Pictures, Nano, Banana.
15:04But if you're going to use it in the real world, in ERP and in businesses, here's what you got
15:09to do.
15:10Decide what you're going to use it for.
15:11If you're going to use it to write code and you're going to be, like, I'm a code monkey.
15:15I call myself a code monkey because that's how I started.
15:17I would write code all the time.
15:19Now, back 40 years ago when I started, that would have been great to have.
15:22But what you need to have is you need to use either Snowflake or Perplexity or SAP Joule or any
15:29of these tools that can validate your code and help you write code.
15:33And then you work together on it.
15:36You hear that, Microsoft?
15:37You hear that?
15:39Yeah, Microsoft.
15:40Your Copilot is hit or miss at times with code.
15:43Well, Windows 11 is broken on every level.
15:47The new update broke notes.
15:50Like, the notepad broke.
15:52Yes.
15:53That's crazy.
15:54It's a vibe-coded operating system.
15:56How do you not validate that?
15:57I almost swore.
15:58How do you not validate that stuff before you even send it out to the world?
16:02Just because you want to put it out there?
16:04Make sure, validate.
16:05That's why we mandate our staff.
16:09If you're on the code team, you must validate your code through.
16:13We've got two different systems that we use, and we validate our code.
16:16If you're not as part of the code team, then you've got other tools you can play with, and you
16:21can utilize and help you from a sales and marketing, an HR, a finance, whatever the case may be.
16:25And my new one, Antigravity from Google, you want to talk about something that is going to start revolutionizing what
16:36we do and how we build and how we code and how we build apps and activities?
16:41Google has got their S-H-I-T together.
16:47People are saying in the last several months, we're in an AI bubble.
16:52Oh, Google's not in an AI bubble.
16:54Between Gemini kicking the crap out of ChatGPT these days, and Antigravity, and that tool set, and that dashboard, and
17:03what you can build inside that.
17:04The market itself does seem to be a bubble, but the winners are already starting to come out.
17:10Some of them will survive, right?
17:12And it'll be really good.
17:13Like, OpenAI is crashing and sinking, apparently, according to some whistleblowers.
17:21But you have people going to Gemini, right?
17:26And that was a big fear that OpenAI had because the move from OpenAI to Gemini was so fast.
17:33Yes.
17:33And this is after they purchased all of the capacity of RAM for the next, like, what was it, two
17:41years?
17:41The next few years is all going to them.
17:43Yes.
17:44Which has huge ramifications environmentally.
17:47It has huge implications for consumer electronics and for consumer rights.
17:51It was just an article, or not an article, I'm sorry.
17:54Well, actually, it was an email, but it was also on CNBC just yesterday alone.
17:59Kelly Evans on the Midday Show talked about that on how the price of this laptop, your iMac, your cell
18:06phone, all these things literally get ready for a 30% bump because all of that got taken away.
18:13And it's just one of those things where just overnight because – but you made a great point, and I'm
18:21going to go back to this point.
18:22You made a reference point in regards to how fast it took to go from ChatGPT to Gemini.
18:28Well, you know what could be even faster is all of us going from Gemini to the Chinese-based AI
18:36tools.
18:36They can do it without NVIDIA chips.
18:40They're doing it with us, and they've got a whole process that's broken down now.
18:44What they do is they don't boil the ocean.
18:47They boil subsets of the ocean of data.
18:51So instead of, let's say, 650 trillion data points, it's only 60 trillion.
19:01And it's incredible.
19:02There's – oh, I apologize.
19:05I forget the magazine that I was reading through.
19:07And they literally had graph charts next to each other.
19:11United States cost over here.
19:13Chinese cost was over here.
19:15And it's because they've got a completely different model.
19:19That's because in the United States mind, they're in the space race.
19:23But in China, they're just doing what they do, right?
19:27They're not trying to – being at that cutting edge that they don't barely understand is very expensive.
19:35Very, very, very expensive.
19:36We're in the bleeding edge.
19:37They're in the cutting edge.
19:39Yeah.
19:39What would you mean?
19:40Yeah, yeah.
19:40But they're catching up.
19:42It's incredible how fast they're catching up over there.
19:45And now there's – one of the companies in China actually has a Chinese subsidiary and a U.S. subsidiary.
19:51And the U.S. subsidiary is growing as fast as the Chinese subsidiary because all of the folks over here
19:57see that and look at it and play with it.
19:59And they're like, wow, let's jump into this.
20:03Oh, my gosh.
20:04Do you see AI shaping where capital flows?
20:07Like, for example, improving risk modeling, forecasting, performance tracking, you know, all the good things people talk about that rely
20:15on data centers and all the bad that we're talking about.
20:17Oh, yes.
20:18It's part of the ongoing evolution.
20:22And it absolutely is and will.
20:24And it will get better and better.
20:27You know, in everything, there's good, bad, and ugly.
20:31And you got to make sure that you weed out and make sure from your standpoint what you're doing and
20:36what you need.
20:37And I hope others do that as well.
20:38Don't take everything for face value.
20:41You know, utilize the tools for those activities back and forth.
20:44But absolutely, like I stated earlier, we haven't even gotten in the dugout to put on our uniform to go
20:50hit batting practice yet with AI.
20:52So, you know, if we're lucky enough.
20:54Go ahead.
20:55I'm sorry.
20:55No, go ahead.
20:56No, no, no.
20:57I was just going to say if we're lucky enough next year, we're going to have our uniform on and
21:00start throwing pitches, warming up our arms.
21:02And everything is going to keep growing and evolving.
21:06It's fun to see.
21:07We're on the new industrial revolution.
21:09You guys are way too young.
21:10I started out on a Commodore VIC-20.
21:14I started out on a Commodore VIC-20, an Apple IIe, Atari 800.
21:19I can still remember when I had a basic Atari as a video game system.
21:23My thumb still hurts from the little button.
21:27And nowadays, look at what we can do with all these things.
21:29And it's almost, let's see, I got those when I was 15, 14, 15 years old.
21:35And now I'm 56, almost 57.
21:37So, you know, it's 40 plus years.
21:40And now we're in a new revolution, a new industrial revolution.
21:43And these are cyclical.
21:45And things got good.
21:47Things got crappy, then good.
21:50And look at what happened.
21:53Yeah.
21:53Yeah.
21:54Is that where your optimism comes from?
21:56The fact that you've seen so many years of this, of technology and innovation sort of reshaping how society acts,
22:02basically?
22:03That's a great question because I've lived through this one.
22:06I wasn't through the industrial revolution.
22:08I wasn't old enough.
22:10But I've been in the beginning of the technology revolution.
22:12And I've seen us from going from computers that look like they were the size of my trunk, laptops that
22:19were the size of my, this entire table that I'm on right now.
22:22I used to have to use two floppy drives, one for the application and then one for the operating system.
22:28And I've seen these activities.
22:30And I've seen how people like using it.
22:33And they like to grow and they like to educate themselves.
22:36When you get people that are excited for these new tools, like the three of us, we're excited to use
22:41these.
22:42We're excited to learn about them.
22:44We're excited to be able to be a helper and grow the ecosystem as a part of this.
22:50And if you're scared of it, you know, that's not a good thing.
22:54Don't be afraid, but become mindful and knowledgeable and help others.
22:59That's why I love to help.
23:00That's my goal now that I'm old enough, more mature and starting to retire, is I'm trying to help others
23:06to be better with the tools, be better with life, be better and guide them up ladders.
23:12And knowledge of success and activities around that.
23:15So, yeah, definitely.
23:17That's so interesting because that's something.
23:19Is this a different kind of scare to when people thought computers were going to take over?
23:23Or is that just our fear as humans?
23:25Like everything's going to take over.
23:28My biggest fear was 1999, December 31st, 1999.
23:32I was sitting in a data center in Atlanta, Georgia, and I was scared crapless and going, what are we
23:39going to do?
23:39And for the most part, we all thought we had fixed everything and things were going to be fine.
23:44But the software itself was all, nobody thought of the software and building it for the age.
23:52And that was a scary night.
23:54And I didn't get much sleep, lots of coffee.
23:56And we all came out fine with that.
23:59What I do get worried about is now with some of the things that you can build, design, and do
24:05with these apps and with the use of AI, there's always bad actors in the world.
24:10I worry about somebody hacking any grid, electrical power grid, whatever grid may be.
24:18I worry about somebody hacking any internet grids.
24:20I worry about somebody going off the deep end and saying, you know what, I'm going to figure out a
24:25way through the dark web to get to missile sites or to military sites.
24:30Those are the things.
24:31But for the most part, and I'm going to knock on wood, we've got some darn,
24:37thank you, darn good people that are on the forefront.
24:40It's a lot of different countries around the globe that are very protective of those pieces.
24:46So I'm always concerned that way.
24:49I'm so conflicted about your optimism because I'm optimistic.
24:57I run local AI models, right?
25:01I love them.
25:02Yeah, and I enjoy them.
25:04My problem is it's not just – like the level of bad acting from the – like people look at
25:17the AI, people talk about the AI, and I think that's the big problem because the AI isn't the problem.
25:22The problem is the people.
25:23People are the problem.
25:25And even the CEO of Palantir himself has admitted, you know, we're probably not – us, the CEOs, are probably
25:32not the people that should have the regulatory where – like – or like the say on where AI goes.
25:41But they do.
25:42And you have stuff like – one of the most recent things, one of the Trump kind of schizo tweets
25:52of like eliminating all of the state – basically eliminating –
25:58All the states to be able to –
26:01Yes.
26:02Or to regulate where –
26:04It's a federal, not state level because if the federal is different than the state, then you have all this
26:09differentiation.
26:11And to be honest, I wish there was a federal regulation.
26:15There's not.
26:16We still go back to the old internet – what is it?
26:20701 or whatever that is in regards to the internet privacy laws.
26:25That is so old and archaic.
26:27It's like people – and then you get – and I'm going to say this.
26:31And I hope everybody listening to this pings me on my LinkedIn and says, thank you.
26:35You've got some of the most brain-dead people in Congress trying to make these laws, and they have –
26:42no – I've been in front of congressional folks.
26:45And I've talked to them.
26:47I've handed them copies of my book.
26:48I said, you or your people need to read this and learn about what this does and how this does
26:53and what these do and why you're doing it.
26:55Because you're voting on this for mankind because what the United States does, everybody else tends to follow because we
27:04are supposed to be the leaders of this.
27:06And I actually came out of Davos yesterday.
27:08The United States is the leader in AI.
27:11If you look at the types of companies, China is very close second and you've got others, blah, blah.
27:16But if you were going to lead from the front and we're going to be true leaders, our leadership needs
27:23to be to protect everybody with these activities.
27:27And so with some of the people – I'm not going to say her name, but let's just say she's
27:32made millions and millions of dollars out of stock trades.
27:35And people follow her stock trades religiously.
27:38What an imbecile.
27:40What an imbecile.
27:42She has zero idea.
27:44And she talks in circles when she talks about it.
27:47And there's a reason why I have no hair.
27:49I've literally done this talking to some of them.
27:52And it scares me because they look at what we're doing.
27:57One side of their mouth says, oh, it's great for everybody.
28:00It's great for humankind. Out of the other side of their mouth,
28:02all they want to do is wait for the next stock trade.
28:05They're not there to protect us.
28:06No, like it's very interesting because you have like – you look at, for example, Jensen Huang.
28:13And you look at, for example, Alex Karp.
28:16And you look at –
28:17Alex is a great guy.
28:20Open AI.
28:21Oh, Sam Altman.
28:22Sam Altman.
28:23Yes.
28:24And I understand where they're coming from.
28:30But then I look around at the world.
28:32And you have that one example, a very funny example with Lisa Su, the CEO of AMD.
28:38And amazing woman.
28:41One of the women who I most respect.
28:43But everybody's sort of become blind for the shareholders when it comes to the AI craze.
28:48I was just going to make a reference of where you were going with that.
28:51They're beholden to their shareholders, or to the shareholders they'll have once they IPO.
28:56And a lot of the things is – a big problem is like they ask her, like, is this one
29:02of those things where we implement first and find an application for it later?
29:06And you could see her pause.
29:08And she's like, of course not.
29:10Of course not.
29:11We already have so many applications for it.
29:13It's like, come on.
29:14Come on.
29:17That's not the case.
29:19And you see it in the way they behave, right?
29:22Like, why are people laying off employees?
29:24It's not because the AI is actually replacing them.
29:27It's because shares – like, the stock value goes up because shareholders are kind of ignorant.
29:33And they really don't understand.
29:36Yeah, they think, like, the operation of the company is going down.
29:40But the value is going up because there is a disconnect between what is happening in the real world and
29:47what people imagine is happening and what they want to happen, which is that, oh, my God, we need way
29:54less employees.
29:55This is awesome.
29:56Well, it's not that they necessarily need way less employees.
29:59They do need less.
30:00But what they need is bigger, better, faster, stronger employees who can use these tools.
30:06And that's where – between the university systems, I have been very fortunate.
30:11Last year, I spoke at over 50 different universities and colleges around the United States, Canada, some in Europe.
30:18And I tell these young people all the time, you can't just start like I started 45 years ago.
30:26You can't just be a code monkey.
30:28You can't just come in and hope that you're going to learn everything anymore.
30:31Because even though that's a great thing for young people is to just learn that way, you've got to be
30:37able to use these tools.
30:39You've got to come in knowing what these tools are.
30:43You need to know Snowflake, Perplexity, Claude, Gemini.
30:47You need to know these because people are going to ask you that.
30:49And I have this philosophy.
30:51It's the bigger, better, stronger, faster human being that's going to create bigger, better, faster, stronger, more efficient companies.
31:01And if we as a human, we are the final say.
31:05No machine will ever have what we have up here with our brain and our eyes to be able to
31:11see.
31:12No machine will ever have empathy.
31:14We have to have these tools and traits as humans, keep them, nurture them to be able to use these
31:21tools.
31:21And as we use these tools, grow and help others to grow and use these tools.
31:27Like I keep saying to when I, anywhere I go, help push people up the ladders of success with knowledge.
31:35Because we own that.
31:37If we can help with some empathy and pushing people up, that's where we're going to, we're going to keep
31:43that lead.
31:43There's a problem.
31:45And my problem is empathy.
31:46And a lot of medium, a lot of the AI startups are just that.
31:52AI startups, their whole point is to, they just, it's a pump and dump essentially, right?
31:57A very good analogy.
32:00And so I have a friend, for example, right?
32:03He worked at NVIDIA.
32:05And at this point, he's like, he's approaching 40.
32:10And he's like, this is too much of a young man's game.
32:13It's too intense to work at NVIDIA.
32:14It is just too, too intense.
32:17It is, it's just not realistic for me to be here.
32:21I have to leave.
32:22But the difference is NVIDIA is one of those few companies in the tech space that actually values their employees.
32:28So he got a lot of value during his 15 years.
32:31And NVIDIA doesn't fire employees en masse.
32:35Other companies do.
32:36And so he's, he is leaving and he is, he is leaving and tired and he's only 40 and he's
32:44in the best of the situations.
32:45Imagine like a lot, a lot of these smaller companies that don't really care about human capital at all, right?
32:50You have a crisis for people that work in computers.
32:54They're, I think they're, they're twice as likely to be unemployed as art majors at this point.
33:00Like people that work in computer science.
33:02I don't know about that.
33:04As much as I love my art collection, I don't know about that.
33:07See, I'm going to challenge you on this one though.
33:09I'm going to be 57 this July.
33:12I still work 18 plus hour days because I love what I do.
33:16I love learning the technologies.
33:18I love growing the company.
33:20I will be up for my European team.
33:23I'll be up for my Asian team.
33:25I'll be up for all of them, and I will literally do code wars with some
33:32of my staff.
33:32Okay, guys and gals, let's do it.
33:35Let's jump into a code war.
33:36We need to each build an app for X, Y, or Z.
33:40Let's go.
33:42Let's have some fun.
33:43And I, but I'm a little bit different on that regard as well.
33:48My children, my wife, they all know that I love tech and I love being a part of it.
33:53I've always either worked for somebody in a very small degree at SAP or I've run my own companies.
33:59So it gives me that latitude and I just love to be a part of it.
34:05Now I've gotten tired.
34:07I retired for two years at one point in time and just kind of walked away and just did nothing
34:13but sales.
34:14But I came running right back.
34:16I wanted to be part of technology.
34:18I love being part of it, and so I am a different animal, I guess.
34:24I do think that, that, yeah, exactly.
34:27I love, I love you and your enthusiasm and that you really enjoy it.
34:32A lot of people were told to go into IT because it would be a stable career.
34:37They're passionate about technology.
34:43Don't get me wrong, right?
34:44But they wanted a nine to five.
34:47It's not their joy.
34:48It's not their love.
34:49It's not their joy.
34:49I love what I do.
34:51For them, it was a job.
34:53I love this.
34:54This isn't, for me, this isn't a job.
34:56This is my life.
34:57And I get where people are at with that.
35:00Oh, you and I have got the same types of friends.
35:02I've got friends all over Silicon Valley that have just said, you know what?
35:06Screw SAP.
35:08Screw VMware.
35:09Screw this EMC.
35:11I am done.
35:11I'm going to take my stocks.
35:12I'm going to walk away for a while.
35:14And then they come back.
35:15You know, I've got one that left VMware when they got the, when they got purchased by Michael Dell and
35:21all of them.
35:22Took a second, opened up a coffee shop and loves, he's in there every morning at six o'clock in
35:28the morning.
35:29And then he's got staff and then he goes and sits.
35:31And he's a great guy.
35:33Don't fault him at all for it.
35:35Hell, someday I'm going to be the guy sitting on his bench drinking coffee every morning at 6 a.m.
35:40So I have zero problems with people doing that at all.
35:45Let them enjoy their lives.
35:46That's what, that's what God gave us the ability to do.
35:49If you don't love it, find something else.
35:52He's got a great stock buyout from NVIDIA.
35:55Oh, yeah.
35:56Well, that's the thing he said is like, well, the good thing is I'm rich.
36:00So, like, yeah.
36:04As Rick James once said, I'm rich, bitch.
36:06Yeah, well, you know, like, that's one of the good things about those really big legacy companies and like the,
36:13some startups, not all of them.
36:16Oh, no, not all of them, yeah.
36:17Yeah, definitely not all of them.
36:19But they give you a good compensation package that includes stock.
36:22And if it includes stock, then I think that includes your enthusiasm because now you're part way, you actually have
36:30ownership.
36:30Yeah, you get that puppy vested after three years and you make sure that you stay vested, enjoy what you're
36:36doing, work your arse off.
36:38Anybody can work their arse off for three years to get a vested stock.
36:41And then by that time, that three years, you knock on wood and you hope that it's one of Silicon
36:46Valley's darlings.
36:48You go IPO and Andreessen Horowitz or somebody else has reshuffled the deck.
36:54And now all of a sudden, they say that NVIDIA has created more millionaires in the last 24 months than
37:02any other company on the planet, if I remember correctly.
37:05Oh, yeah, I believe that.
37:07I haven't seen the statistic, but I 100% believe it because if you work at NVIDIA and you have
37:15any amount of stock, if you've been working there for any amount of time, it's going to be worth a
37:20pretty penny.
37:21It's going to be worth a pretty penny.
37:22I guess moving away from NVIDIA into the smaller scale, what do organizations that feel overwhelmed by tech, what could
37:31they do to make it a responsible use of the tool?
37:34Like you're talking about now, you know, what can they learn from the NVIDIA example?
37:39Well, NVIDIA is a, let's call it a purple unicorn.
37:42You know, a lot of things fell in place.
37:45Jensen did a lot of really good things.
37:46They were not that successful up front.
37:49They were very low.
37:51And then all of a sudden, this whole AI thing came into play and then the stocks started taking off.
37:55And then it went on this rocket ship ride.
37:58It was wonderful to see a company succeed.
38:00They were losing server space to, for the first time in a really long time, to AMD, right?
38:07And so the AI boom came and then boom, all the way to the top, which is, which as someone
38:14who enjoys consumer electronics a lot is like a pain.
38:18And as someone who is not as, who really likes AI, but is not, not, not really excited about the
38:26industry.
38:28I'm like, I'm like, I'm so conflicted.
38:31Everything, every, everything is so conflicting nowadays.
38:34It's the problem.
38:36Yeah, it's, it's kind of positive.
38:38Everything, life has got positives and negatives.
38:40The silver lining, and you mentioned it earlier.
38:42And I mention that on this podcast anytime I can, I mention it to Victoria every time I
38:47can: at least this data centers thing –
38:50Well, I'm not American, so I'm not paying for it.
38:52And it's bringing nuclear back into the picture, which is a huge silver lining for
39:01me.
39:01Clean nuclear, more efficient.
39:04They've done so many things to make sure it's more safe.
39:08It's amazing what they've learned in 20 years since the Japanese nuclear plant went offline because of the tsunami.
39:18It is growing.
39:20And the second thing that I really like, not only nuclear, natural gas.
39:24The United States has more natural gas than all the Arab world has of oil in it.
39:30And it's incredible what you can do with clean, natural gas.
39:36It's outstanding.
39:37So between those two right there, get those regulatory permits ready and get invested in those.
39:43Hell, natural gas went up 50% because of this polar vortex thing that's going through the United States right
39:50now.
39:50Hell, which means it only went up from $3 to $5 per cubic, but it's.
39:56It's something.
39:58Now that's still 2X, right?
40:00For a family living on paycheck to paycheck.
40:02That's a huge hit.
40:03Yeah.
40:03And once we get solar figured out, the solar panels that we have right now are horrible.
40:11And I work, I've got a side project called Charging Bunny.
40:14And all it is is solar panels used to charge batteries for pumping into EVs.
40:21And we never, ever take anything from the grid.
40:24We shoot it back to the grid at 12:59:59 every night.
40:31It gets sent to the grid.
40:33So it helps the grid, but it never, it's one directional, not bi-directional.
40:38And once we start getting more and more of the cleaner, better solar panels out there that are consolidated, that
40:45can absorb more, we start putting those up around all over the world.
40:49It's going to be amazing what we'll be able to do with that beautiful ball of sun up there.
40:54Even on frigid days like this, you still have massive.
40:58If I open this curtain, you would be blinded right now.
41:01I have my sunglasses for my drive.
41:03It is gorgeous out.
41:05It doesn't have anything to do with the wind or the temperature.
41:08And wind is horrible.
41:09You're killing millions of birds every year and animals.
41:13It's horrible on the oil and gas exposure because building those things, transporting them, getting them set up, and then
41:20running them, people think, that's great.
41:23No, if anything, instead of off the coast of these, use the wave.
41:27During our previous, literally our previous recording today, we talked about how people underestimate how much of an environmental cost
41:37there is to just logistics and moving things from point A to point B.
41:40Yes.
41:40Yes.
41:41But isn't that a place where AI is kind of hailed as a new solution?
41:46Where it's like, well, because we can't really measure the skills of these things and it's so complex, like large
41:52systems, energy supply chains.
41:54Isn't that where AI is like a pro?
41:57Yes.
41:57That's where you use AI to start helping out with your logistics and your strategy on how you're going to
42:02move things, where you're going to move things.
42:03What should I build?
42:04How should I build it?
42:05What types of activity am I going to need around it?
42:08And that's a wonderful way to use it.
42:11Yeah.
42:11It's a natural application to a real issue that we have on scope three emissions.
42:16Yep.
42:17Exactly.
42:17Isn't it like, wasn't there also a thing with like the air, the wind energy that the blades aren't recyclable?
42:26It's really not cost-effective to move them once they stop working.
42:30So they just stay there.
42:31Just literally leave it there.
42:34Jesus.
42:36Oh yeah, it's, it's, it's, we were making reference to two of them before we got on.
42:41It's the Al Gore syndrome.
42:42I have a PowerPoint and I was the vice president of the United States.
42:46Listen to my words.
42:50I think it's one of those conversations that is just tying it all the way back to the start.
42:57There's a lot of stuff we're talking about because we're clued in, but this isn't a conversation that you can
43:03just have at the pub.
43:04Right.
43:05You're expecting people to have all this information, to know what you're talking about, to know who the people we're
43:10naming is.
43:10It makes the conversation harder to have, even outside of jargon, just engagement with this topic.
43:17So what is one thing that if someone wanted to get into this and sort of start learning more, where
43:23do you start?
43:24Because I don't remember how we got here.
43:26Start on Google or start with your favorite.
43:29So have your favorite AI tool interview you and ask questions back and forth about your favorite environmental topic or
43:39activities that you feel strongly about.
43:39And it will help you understand what is fact and what is fiction.
43:44Specifically, ask it.
43:45Please ask me three questions about my wind.
43:49I have...
43:50Hey, Gemini, I think that wind energy production is a fantastic use of environmentally friendly activity.
43:58Please interview me and let's have a conversation with three questions about wind turbine technology and how it affects us.
44:08Or just...
44:09I love being interviewed by my BFF.
44:14Gemini, ask me two questions about this activity.
44:18Let's have a conversation about this.
44:19And I go back and forth with it.
44:21My wife thinks it's so funny because I'll read the question out loud or I have the voice app doing...
44:26It's like, who are you talking to in your office?
44:29Oh, Gemini, don't worry about it.
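For readers following along from the blog: the interview pattern Chris describes is easy to script. A minimal Python sketch — here `ask_model` and the canned replies are stand-ins, not any particular vendor's API; in real use you would wire `ask_model` to a chat session with Gemini, Claude, or a local model, and `answer_fn` to `input()`.

```python
def interview_prompt(topic: str, num_questions: int = 3) -> str:
    # The opening move: ask the model to interview YOU, one question
    # at a time, about a topic you want to stress-test.
    return (
        f"I think {topic} is worth understanding properly. "
        f"Please interview me: ask me {num_questions} questions, one at a time, "
        "and after each answer, tell me what I got right and what is fiction."
    )

def run_interview(topic, answer_fn, ask_model, num_questions=3):
    # ask_model: sends one message to your chat session, returns the reply.
    # answer_fn: supplies the human's answer to each question.
    transcript = [("user", interview_prompt(topic, num_questions))]
    transcript.append(("model", ask_model(transcript[-1][1])))
    for _ in range(num_questions):
        answer = answer_fn(transcript[-1][1])
        transcript.append(("user", answer))
        transcript.append(("model", ask_model(answer)))
    return transcript

# Offline demo with canned replies so the sketch runs without an API key.
canned = iter([
    "Q1: How much of the grid does wind supply today?",
    "Q2: What happens to turbine blades at end of life?",
    "Q3: How does wind compare with solar on lifecycle emissions?",
])
log = run_interview(
    "wind turbine technology",
    answer_fn=lambda question: "Here's my best guess...",
    ask_model=lambda message: next(canned, "Thanks - that wraps up our interview."),
)
for role, text in log:
    print(f"{role}: {text}")
```

The back-and-forth Chris has out loud with Gemini is exactly this loop, with a human voice on one side.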
44:30Okay, but well, here's another worry.
44:34Because you don't understand what the technology is, right?
44:38A lot of people don't.
44:39And they get dragged into speaking with AI and having back and forth with AI.
44:45And it sort of distorts their worldview in a very weird way.
44:50Yeah, AI psychosis is something I'm really scared of.
44:54Yeah, yeah.
44:55That's because...
44:57Let's be honest.
44:58You can ask it questions and people take the first answer that comes to it.
45:03I always ask AI, Gemini, at some point in time, where did you get your facts from?
45:09Is this 100% accurate with the data in the known world?
45:14Have you made up any of this?
45:16If you continue to ask those questions with it, ask them the question, did you make any
45:21of this data up?
45:22That is a great question.
45:24That is a great tool.
45:25Absolutely.
45:26And again, it's what you were saying, knowing how to use the tool, right?
45:29Because if you don't know that you have to do these things, that you have to keep pressing
45:33it, you know, do all these things, you would just ask it, write me an email, and it writes
45:38you the email, and that's it.
45:39AI is a good tool, but it's not a good tool like a calculator that gives a fixed output.
45:45Correct.
45:45It's too designed to talk to you, which means it'll make many mistakes.
45:52And I've had to correct it before, and it's like, oh, yeah, you're right.
45:56Yes, I've never taken the first output of anything it gives me.
46:00I may go deep with it, and then I will circle back, and I will ask questions, comments, make
46:05comments.
46:07Hey, Gemini, I have a concern about this statement you made on the third paragraph about X.
46:13Can you please prove to me that this is accurate?
46:16Yeah.
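That "never take the first output" habit can also be made mechanical. A sketch, with a stub standing in for a real chat API so it runs offline — the probe wording follows the questions Chris puts to Gemini; `send` is whatever posts a message to your actual chat session.

```python
# Follow-up probes to run after any draft answer, mirroring the
# questions asked in the conversation above.
VERIFICATION_PROBES = [
    "Where did you get your facts from? List your sources.",
    "Is this 100% accurate with the data in the known world?",
    "Did you make any of this data up? Flag anything inferred or guessed.",
]

def verified_answer(question, send):
    # Ask once, then append every probe so the model audits itself;
    # the human reads the whole exchange, not just the first reply.
    exchange = [("user", question), ("model", send(question))]
    for probe in VERIFICATION_PROBES:
        exchange.append(("user", probe))
        exchange.append(("model", send(probe)))
    return exchange

# Offline demo with a stub standing in for a real chat API.
stub_replies = iter([
    "Draft answer about data center water use.",
    "Sources: two industry reports.",
    "Mostly accurate, with caveats.",
    "One figure was extrapolated, not measured.",
])
audit = verified_answer(
    "How much water does a large data center use per day?",
    send=lambda message: next(stub_replies),
)
for role, text in audit:
    print(f"{role}: {text}")
```

The point of keeping the full exchange is the same one made in the episode: the self-audit is for the human to judge, not for the model to grade itself.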
46:19I mean, okay, so this is one of the things, because to me, I'm a very people-focused person.
46:26I would rather get a wrong answer from you, and that's okay for me, because, like, you're
46:31a person, like, that's fine, right?
46:34But there's so many things, and automization specifically, that AI allows in scale, where
46:40humans are being replaced.
46:42We were talking about it earlier.
46:43What is the one place, if there is just the one place, where humans should never be fully
46:49removed from the loop, even as systems become more automated, more reliant, that, you know,
46:54they stop having psychosis and telling you the wrong idea?
46:57Like, where can humans never be moved from?
47:00Restroom attendant?
47:03Joke.
47:04I got you both laughing.
47:05Good.
47:06Good, good.
47:07I don't want AI to ever replace anything when it comes down to the day-to-day, the real-life
47:19activities that we have, where we need social interaction with each other.
47:24Right, because the conversation you're having with Gemini...
47:27That's a conversation between me and my autonomous tools.
47:31But social interaction.
47:33I will never have an AI BFF.
47:36I will never fall in love like Joaquin Phoenix did in that movie with a voice, Samantha, in Her.
47:43That was a great movie.
47:45And people started to think, oh, okay.
47:47That would never happen.
47:48That is not going to happen to me.
47:51Social interaction is so important for us as human beings to be able to thrive and to
47:57grow and to do the things that we need to do.
48:00So I don't want that to ever replace.
48:03Second of all, when we start getting to the point with the, either the Tesla robots or the
48:09Boston Dynamics robots, the dogs and all those, when we start getting to that
48:15point, what scares me on that one is, will it be too much of an interaction?
48:21I don't want that to be my human connection there, either.
48:25It was a really good book that got turned into a movie.
48:29Everybody was living in the game, and they're challenging the game.
48:34Free Guy?
48:35No, not Free Guy.
48:37I like that movie.
48:37That's Ryan Reynolds.
48:38This was a book and a movie wrapped around...
48:43Ready, Player One?
48:43That's it.
48:44Player One.
48:45I really liked it, but that also...
48:48Look at how the only interaction you had was inside the game, inside your headset.
48:54That's why everyone was freaking out about the VR thingies and the Apple glasses.
48:59But at the end of the day, what's really interesting is the Metaverse failed.
49:05Spectacularly.
49:06And I'm kind of glad it did.
49:07And I thought it was really funny.
49:10And it burned through a lot of money because it really didn't understand what social interaction
49:16was, right?
49:18I don't mind it for video games and playing games in there and building games.
49:22But when you start using it as a social tool, social life, I should say, that's where Meta
49:30failed on that.
49:31I like other Metaverses that are out there that are strictly video games and activities in there.
49:36You're not living in it?
49:38Yeah, it's an emerging.
49:39The Metaverse was just VR chat.
49:42Yes.
49:43The Metaverse was VR chat with less features, essentially.
49:46Yes.
49:46Yes.
49:48Great analogy.
49:49Oh, I love that.
49:51I wanted to circle back to data centers because that's where AI works from, right?
49:57All this data.
49:58And we need the data for the AI to be better, of course.
50:03But the thing with data is, and it's the ethical issue I have, a lot of it is stolen.
50:12A lot of it is...
50:13Oh, you mean from like writers and...
50:16Well, so for example, I know that the fact that I'm on StreamYard, I have accepted the
50:21terms and conditions of it.
50:22And there's probably something in there that says they can use my likeness to learn how to create
50:26an avatar, right?
50:27Those are things that we are signing up to unknowingly.
50:31I don't count that as stolen.
50:32I mean it more the actor that has access to the tool can then use the tool in ways that
50:39are misrepresenting or that are harmful to society.
50:43But the data itself, a lot of times is under-representative of communities that maybe don't get a voice or
50:51people that aren't part of the conversations.
50:53All these risks that it poses.
50:55How do we know data is reliable?
50:58What makes data reliable?
50:58Because we're saying, oh, it's going to make the AI better if it's reliable, but who?
51:03Great question because right there, how do we know the data is reliable?
51:06Because you look at ChatGPT.
51:09People upload terabytes, petabytes of their data to figure out information, but
51:16they don't realize their competitor did the same thing.
51:18And now all of a sudden you've mixed your data with your competitor's data.
51:22How do you know whose is whose?
51:24What is what?
51:25That's why when I talk to people, I'm like, don't just upload your data in an Excel spreadsheet
51:30willy-nilly to ChatGPT.
51:32Everybody else did that in the beginning.
51:34Now you're screwed.
51:35Now your data is their data and they legally own it.
51:39Exactly.
51:39And that's one of their terms and conditions.
51:41If you upload anything, we own the data.
51:43It sits on our own server.
51:45So you're absolutely right on that.
51:47And that's what I worry about.
51:48How do we know it's reliable?
51:49There's also who is, what data is being put up.
51:53So, you know, there's a lot of issues or concerns, I must say, on surveillance and how
51:58are we training security AIs and models and algorithms like that, but even AI-generated
52:05imagery, you know, where is it getting the pictures from?
52:08All these things that are, how do we make it a responsible use of the tool?
52:14Nano Banana probably has about two dozen photos of this good-looking head, bald head, up in
52:20it right now.
52:20And I, me and my team, we have some great pictures of me riding rockets, holding on tight.
52:27I'm riding different, yeah, it's so much fun sometimes.
52:32But I chose to put, just like I do with my iCloud.
52:36Now remember, iCloud and Google Cloud, and you upload your photos, that now becomes part
52:41of public domain as well, because we signed off on the terms and conditions on those, just
52:45as I'm doing with those activities.
52:47So, at the end of the day, if we're giving our access away, we do that with Facebook.
52:53You upload any photo, any article, you do anything on the Facebooks, you know, the Facebook.
53:00You can tell how old I am.
53:03With the Facebook.
53:04Boom.
53:05There it is.
53:06End-user license agreements need a radical, radical update when it comes to legislation.
53:15Well, you're going to have to legislate it, because every one of those companies wants
53:18the data to use it, to sell it, to make, it's a line item.
53:22How much money did we get from selling our data to these third-party companies?
53:27So, you're absolutely right.
53:28A big thing, not recently, but during the beginning of the AI boom to the mid-section,
53:35was Adobe suddenly saying, okay, now all of the stuff that you put on our cloud, that's
53:46my professional workspace, where I do creative things, we're going to feed into AI datasets,
53:54and it will belong to us, which
53:58implies that it now belongs to them, and it also implies that they're going to use it
54:02for AI training, right?
54:04Which is a huge overstep, I feel.
54:07And that's why a lot of people stepped back from that, and that's why I believe they had
54:10to, within a week, they had to say, okay, my bad, we made a mistake.
54:15Take it back.
54:18But, they're very zealous, right?
54:19Like, you have, like, many AI companies very openly admitting, it's like, well, you know,
54:24we might have fed it information that wasn't, you know, public, you know?
54:29But, you know, that's the price we pay for progress.
54:31And it's just like, well, you should pay a price for that, because that's not okay.
54:37And downloading the entire Library of Congress, free.
54:44Downloading the entire New York Times, not so free.
54:48No, not so free, but they do.
54:50And one of, another big issue, because they scrape everything, right?
54:55And so, you are violating something there, right?
55:01Something human, right?
55:02And something that, you know, something non-corporate, which a lot of these people just don't understand, right?
55:09And it's, one, it's unethical.
55:12And second of all, right, at some point, and here's an issue I have with how things are going.
55:19They sort of ran out of data to feed the AI.
55:24And a lot, because of the way they scrape the internet, it is being fed a lot of AI content.
55:29And AI inherently becomes, like, sort of degenerates when it's fed AI during its training, right?
55:38And so, that's going to be a problem that needs to be mitigated.
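The degradation described here is what researchers call model collapse, and the mechanism shows up even in a toy setting: "train" each generation only on samples produced by the previous one, and the learned distribution keeps narrowing. A deliberately simplified Gaussian illustration — not any production training loop:

```python
import random
import statistics

def next_generation(mu, sigma, n, rng):
    # 'Train' generation k+1 on n samples generated by generation k:
    # draw from N(mu, sigma), then refit mean and spread to the draws.
    samples = [rng.gauss(mu, sigma) for _ in range(n)]
    return statistics.mean(samples), statistics.pstdev(samples)

rng = random.Random(42)       # fixed seed so the run is repeatable
mu, sigma = 0.0, 1.0          # generation 0: the "real" data
spread = [sigma]
for _ in range(100):          # each pass sees only the previous pass's output
    mu, sigma = next_generation(mu, sigma, n=10, rng=rng)
    spread.append(sigma)

# Each small refit slightly underestimates the variance, and with no fresh
# real data the errors compound: the distribution narrows toward a point.
print(f"spread after 100 generations: {spread[-1]:.2e} (started at {spread[0]:.1f})")
```

The small sample size exaggerates the effect, but the direction is the point: without fresh human-made data mixed back in, diversity drains out of the model generation by generation.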
55:41Like, there's so much about the ethics and just the overzealousness.
55:46Like, again, it's about being at that bleeding edge, right?
55:51It's that, that's the problem here.
55:52It's just not, it's just not sustainable.
55:55It's really not.
55:57And it, it is, it feels like it's detrimental to everyone.
56:00And what do you think? Like, let's make it simple.
56:05Like, the EULA, right?
56:07Like, should they be regulated?
56:09Should they just have a complete overhaul on how we regulate them?
56:12Because they don't make any sense.
56:14I agree.
56:14They should, they should have done that a long time ago, though.
56:16I agree.
56:17It's really interesting to me how optimistic you are and also how down to earth you are.
56:23I really appreciate it because I, the fact that you have all the information and you're
56:29both excited and aware of the concerns is, it's very comforting.
56:33Well, thank you.
56:34I appreciate it.
56:35You know, I try to be just a common sense individual.
56:38And, and it comes out very naturally.
56:40Oh, thank you.
56:41I guess I wanted to close off with, in five to 10 years, how do you think AI will fundamentally
56:48change how climate work specifically is done or area, any work in the sustainability sector?
56:56Oh, I think it's going to be integrated quite a bit.
56:58You're starting to see things get integrated, you know.
57:01In a good or bad way.
57:02I think it'll be in a good way because I think that we have, truly people have good intentions.
57:08Maybe I could be pie in the sky attitude, but I truly feel that people have good intentions
57:15to better themselves, to better other human beings and to better the world all together.
57:21And for me, in 20 years, I'll be, oh, I'll be in my seventies.
57:27Um, I really think that it will be a, it'll still be a wonderful world.
57:33I still feel that we're going to have great people.
57:35We're going to have great discussions.
57:37Some people are going to be loud.
57:38Some people are not so loud.
57:39But at the end of the day, I think that we all want to do good for this planet.
57:44We all live on it.
57:45No matter what your color, your creed, your religion, your sexuality, it doesn't matter.
57:49We all are one planet.
57:51And it's the only planet we know of that sustains life.
57:54And we are one life all together.
57:56So I really hope that everybody takes it and understands the betterment that we can all
58:01do to make this one planet so much better.
58:04I love that.
58:04Thank you so much, Chris, for being on Tangelic Talks.
58:07Everybody, you can find some resources and a bit more about this conversation on the blog
58:12section at TangelicLife.org.
58:14And we'll catch you guys on the next one.
58:16Bye.
58:18Bye.
58:48We rise, it starts with you