Transcript
00:00Explain this deal for us, because there are some names in here that, you know, certainly household names, but also complicated relationships that these companies already have, not necessarily with each other.
00:10Yeah, I mean, look, the ecosystem that we had so far, Microsoft with OpenAI, Google partnering with Anthropic, and Amazon partnering with Anthropic, that seems to be changing right here.
00:24And you saw Microsoft adopting Anthropic, and, you know, Microsoft, a big investor in OpenAI, actually incorporating Anthropic in their Office 365 suite.
00:37And now we also know Oracle has this $300 billion deal with OpenAI.
00:42So it doesn't surprise me that, you know, Amazon and OpenAI are partnering because OpenAI has the best models.
00:49But when you look at their token consumption, Google's token consumption is higher.
00:55So they are looking for more capacity through all the different avenues they can.
01:01Why does OpenAI need Amazon for NVIDIA capacity, when NVIDIA is an investor in OpenAI?
01:10Because OpenAI cannot use NVIDIA directly.
01:14It needs a cloud provider.
01:16You fine-tune the LLMs on enterprise data.
01:20Where does the enterprise data reside?
01:22It resides with the three cloud providers: Amazon, Microsoft, and Google.
01:28And we know...
01:29But why not use Azure from Microsoft?
01:31Because Azure doesn't have as much capacity, and, as I said, Microsoft is also now using Anthropic.
01:39Okay.
01:40Google is also...
01:42Talk about frenemies, Carol.
01:44I never feel so stupid when we're talking about this stuff.
01:47So help me understand, because it does feel like this increasingly tangled web, but it may not feel that way to you.
01:53And I am...
01:54I'm just...
01:54All I see is money, money, money, spend, spend, spend.
01:57And I keep thinking about ROI, return on investment.
01:59We have talked with you about how, especially with the hyperscalers, we do see that ROI.
02:04Yeah.
02:04I don't know, though.
02:06Is it still continuing?
02:09What about...
02:10I don't know.
02:10Is there so much more growth to come?
02:12And those data centers, are all of them maxed out, and that's why we need to continue building?
02:17Yes.
02:18And I'll frame it this way.
02:20This earnings season, we learned that Microsoft's $93 billion Azure business grew 39%, and they were capacity-constrained.
02:28They said they would have grown over 40%.
02:31That's crazy, right?
02:32So just think about how much AI has added to that Azure growth.
02:36It's close to 20%.
02:38The baseline cloud growth rate is 20%.
02:40That's what you see in Amazon's number: a $130 billion business growing at 20%.
02:46Why is Azure growing 39%?
02:49Because of that 19% to 20% incremental growth that they are getting from AI.
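The arithmetic behind that claim can be sketched as a simple decomposition, using only the figures cited in the conversation (these are back-of-the-envelope numbers, not audited company data):

```python
# Illustrative decomposition of Azure's growth rate, using the
# figures cited in the conversation (not audited numbers).

azure_revenue_b = 93          # Azure annualized revenue, $ billions
azure_growth = 0.39           # reported Azure growth rate
baseline_cloud_growth = 0.20  # baseline non-AI cloud growth (e.g., AWS at ~20%)

# Incremental growth attributable to AI workloads
ai_incremental = azure_growth - baseline_cloud_growth

print(f"AI-driven incremental growth: {ai_incremental:.0%}")
print(f"Implied AI-driven revenue add: ${azure_revenue_b * ai_incremental:.1f}B")
```

Run as written, this attributes roughly 19 points of Azure's 39% growth to AI, on the assumption that the non-AI business would otherwise track the industry's ~20% baseline.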
02:54But could it be, Mandeep, that we're seeing growth now, and then all of a sudden people are going to be like,
02:59I don't know that I'm actually going to get the payoff from this ongoing longer term.
03:05Could that possibly play out?
03:07I mean, the two metrics I remind investors of are, one, the infrastructure growth that we have seen for the hyperscale cloud guys,
03:15and the other one is token consumption.
03:17If people are using AI, that will get reflected in token consumption.
03:23And tokens are nothing but your use of AI, whether it's in a coding agent or a customer service use case.
03:30And so that is a metric to keep an eye on.
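To make token consumption concrete as a usage metric, here is a minimal sketch of how consumed tokens translate into spend. The per-million-token prices below are hypothetical placeholders for illustration only; no rates are quoted in the conversation:

```python
# Hypothetical illustration: token consumption as a usage/spend metric.
# Prices are placeholder assumptions, not actual vendor rates.

PRICE_PER_MILLION_INPUT = 2.50    # $ per 1M input tokens (assumed)
PRICE_PER_MILLION_OUTPUT = 10.00  # $ per 1M output tokens (assumed)

def monthly_token_cost(input_tokens_m: float, output_tokens_m: float) -> float:
    """Estimate monthly spend in dollars from tokens consumed (in millions)."""
    return (input_tokens_m * PRICE_PER_MILLION_INPUT
            + output_tokens_m * PRICE_PER_MILLION_OUTPUT)

# e.g., a coding-agent workload: 500M input / 100M output tokens in a month
print(f"${monthly_token_cost(500, 100):,.2f}")  # $2,250.00
```

The point of the metric is that if enterprises keep paying these per-token bills as consumption grows, that is direct evidence the AI usage is real rather than experimental.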
03:33But we've talked with you that at some point people are going to have to pay for this stuff, right?
03:37And that maybe we won't use it as much as we might be right now using it for, you know,
03:43give me a picture of a horse or tell me what to do with the stuff in my fridge.
03:46Like, because there is a cost to this.
03:48Yes.
03:48Which is why we see a subscription model for pretty much all of this.
03:54None of it is an ad-driven model.
03:56Why?
03:57Because traditionally, Google search was inexpensive when it comes to rolling out to users,
04:02and then you could show an ad and make up the cost.
04:04The business model for AI is very different.
04:07Right.
04:07It requires a lot more compute, and you can only monetize it best using a subscription model,
04:13which is what we see in a lot of these products.
04:15So we're all going to be paying subscription for this.
04:19Everybody.
04:20Businesses.
04:20Going forward forever.
04:22Even consumers.
04:23Even consumers.
04:23ChatGPT subscriptions, they cost up to $200 per month.
04:27So there is no free lunch here.
04:31Go.
04:32I could talk an hour on this.
04:33Okay, you're just back.
04:34From three countries in six days.
04:37We only have a minute left.
04:39Where'd you go?
04:40What'd you learn?
04:40Well, Hong Kong, Singapore, and Tokyo.
04:43And all three are excited about AI.
04:46The nuance is in how they are looking at AI deployments.
04:50In the case of Hong Kong and China, AI optimizations, compute optimizations are front and center.
04:57They want to use open source.
04:58They want the cheapest form of AI.
05:01Whereas Singapore and Tokyo are more, you know, using Western LLMs.
05:05And look, everyone is excited about finding out what the use cases are.
05:10They want to build local data centers to have that AI capacity.
05:14And this build out is global.
05:15I mean, it's definitely something that we are seeing across the board with all countries.
05:21That Blackwell chip, NVIDIA, are they expecting that they're going to have access to it at some point?
05:25I think everyone wants allocation.
05:27In the case of, you know, certain countries, the sovereign part is not that clear yet.
05:33But Singapore government, for example, is very advanced.
05:35So in their case, you could see them being front and center when it comes to deploying their own AI.