00:00 Cerebrex priced this IPO at $185 a share, above the top end of the marketed range. I'm looking at
00:07 the Bloomberg terminal. Shares are indicated to open at $350 a share. Your reaction to that? Pretty good day. How do you
00:19 feel? I mean, did you see this coming? There's some mechanics to it, right? But, you know, I know you,
00:24 I know you like to, let's be honest, you like to talk about what's happening in AI. The big focus
00:29 on inference.
00:30 Cerebrex is placing it. But this is the biggest IPO of the year so far. You know, what does it
00:35 mean to you and your employees and your backers? It's the biggest tech, one of the biggest tech IPOs in
00:42 history, and it's the biggest semi IPO in history. We couldn't be more proud. This is the culmination of a
00:52 decade of work, of countless late nights and long weekends.
00:58 We are enormously proud and excited, and ready to get back to work and start working on the next
01:07 great thing. Again, the market is indicating shares to open at $350 a share. Let's see where we're at
01:15 when trading starts. But you're being priced in as a major player in this field now. What evidence would you
01:23 point to that you are, in fact, a major player? A lot of focus is on the concentration in the
01:29 relationship with OpenAI. Tell me about some of the other frontier labs or other names that you have some
01:37 concrete talks with.
01:38 Sure. In the last four months, we announced a deal with OpenAI that's north of $20 billion for 750
01:46 megawatts of compute. We also announced a major engagement with AWS, where our equipment would be deployed in their data
01:57 centers.
01:58 I think those are obviously the largest, but there are dozens of others in what used to be
02:06 a big deal, the $10 to $50 million category.
02:10 And so there's just an extraordinary demand right now for fast inference. We're the fastest, not by a little bit,
02:17 but by more than an order of magnitude.
02:19 You know, we're 15 times faster than the next nearest competitor. And as AI has become useful, everybody wants it
02:26 to be fast. Nobody wants to wait.
02:29 On that, you know, 21x on performance, and also a lot of emphasis on the dollars per
02:34 token, right? That is the metric that the field cares about.
02:39 But that's on paper. You know, you just talked about the AWS relationship as an engagement.
02:44 When does the word "engagement" end and real revenue start to show up for you from those relationships?
02:51 Sure. We signed a binding term sheet with AWS, as described in our S-1, and we're working through the master
02:58 agreement.
02:59 I think in dealing with organizations of that size, it takes a little time to dot all the i's and
03:07 cross the t's.
03:07 But we are extremely confident that they will be an enormous channel for us and a partner in delivering our
03:16 technology to large and medium-sized enterprises around the world.
03:22 I think they are one of the preferred cloud providers for just about every enterprise on Earth.
03:28 And so an opportunity to have your solution embedded in their offering, as part of their Bedrock offering, that's a
03:38 huge win for us.
03:41 We're live on Bloomberg Television and Radio. This is a Bloomberg Tech takeover.
03:45 And we're speaking to Andrew Feldman, the CEO of Cerebrex, whose company just priced its IPO at $185 a share
03:51 and is indicated right now,
03:52 I'm looking at the Bloomberg terminal, maybe to start trading at $350 a share.
03:57 Andrew, I think you know this about me. Whenever we have a big moment to have a conversation like this,
04:02 I always go to the Bloomberg Tech audience.
04:04 What do they want to know? And actually, the first question is, what was your attitude toward retail investors?
04:13 Why did you not do more for the retail investor in this big moment for your company?
04:20 We were more than 25x oversubscribed. There were a lot of hard decisions that needed to be made.
04:27 And nobody got what they wanted. And we did our best. That's about all you can do when there's 25x
04:35 more demand for your stock at the institutional level
04:38 and at the retail level than there are shares to be sold. So we're really proud of the way we
04:45 chose to do it.
04:45 We thought we did it with integrity, and I think we are very comfortable with the final allocation.
04:54 Full stack, fully vertically integrated. And what I mean by that is you literally build the supercomputer top to tail,
05:00 right?
05:01 So NVIDIA will do the tray, not just the GPU, but then Dell or Supermicro will assemble it. Dell's margins: low teens.
05:08 NVIDIA's margins: mid-70s.
05:11 Your margins: 40%, 41%.
05:14 And what I'm trying to help the audience understand is why that strategy of owning everything top to tail is
05:21 going to pay off in the long run.
05:22 And I would say the other question I get for you is, what's the future outlook for margins, based on
05:27 your plans?
05:29 Well, I think a couple of things. I think we obviously have opportunities to improve our cost structure.
05:37 You know, we did half a billion last year in sales. That means we put 250 million into the supply
05:44 chain.
05:44 Obviously, that's not an efficient spot. As we grow, we will have more leverage in the supply chain, and our
05:51 cost of goods will come down.
05:52 I think we have an opportunity to increase prices.
05:59 I think the demand for fast inference is overwhelming this minute.
06:04 And so I think in the long run, we'll be really proud of our gross margins and where they will
06:10 sort of wash out as we hit scale.
06:14 I've got a question for you from a terminal client via IB. Thank you for the question.
06:18 Everyone's tuned in right now, Andrew. They want the detail.
06:22 And so one way that people look at it is that, you know, this is custom silicon, but Cerebrex isn't
06:28 a chip per se.
06:29 It's wafer-scale, right?
06:31 Is there a reason why you can't just sell that versus the whole server?
06:37 Yeah, that's a good question.
06:42 For the entire 70-year history of the computer industry, every previous effort to build a chip of
06:48 this size had failed.
06:49 So for your audience: this chip is the size of a dinner plate, while traditional chips are the size of
06:56 a postage stamp.
06:58 And not only had everybody failed until we succeeded, but several have tried to copy us and have since failed as
07:05 well.
07:06 Part of what we were able to do was use not
07:13 just the chip, but also our expertise in packaging and in system design, to solve some of the problems and
07:20 build a truly compelling solution.
07:22 You don't just get 15 or 18 or 20 times faster than the competition because you built a good chip.
07:29 That's one of a collection of different things that enable that sort of performance.
07:34 You can build a great chip, and the system vendor, the ODM, can nibble away at your performance by not
07:43 delivering the right amount of power or the right amount of I/O.
07:47 There's a reason why NVIDIA then sought to control the I/O with NVLink: they didn't want others to
07:55 nibble away at their performance.
07:58 And so by building the system, we were able to optimize all parts.
08:03 I mean, you could ask the same thing about Porsche.
08:05 Why don't you sell just engines?
08:08 It turns out that a 911 is a beautiful car because of the engine and everything else they put
08:16 in it.
08:16 Andrew, the team's telling me we've got to go, but I've got to ask you: $5.5 billion, what do
08:22 you use the proceeds for?
08:24 And actually, how flexible can you now be in allocating capacity to new customers?
08:28 We just have 30 seconds.
08:30 Oh, I think we will use it to increase capacity.
08:34 We are excited to bring many new customers on board.
08:38 There's tremendous demand for what we're doing.
08:41 I think we can be aggressive on that front.