Nicholas Harris, CEO of Lightmatter, sat down with Forbes to discuss building AI chip technology relying on light rather than electrical signals. Harris also discussed why the end of Moore's Law and the AI boom created the perfect moment for photonics, enabling massive acceleration in AI model training and increased data center efficiency.

0:00 Introduction
0:25 The Rise of Optical Computing and Lightmatter's Origin Story
2:57 Why AI's Breakneck Pace Requires a New Technological Leap
4:20 Can Photonics Unlock Massive AI Supercomputers?
7:08 Grad School to Deep Tech Commercialization
10:42 The AI Energy Crisis and Scaling Compute with Efficiency
14:51 What Unconstrained Compute Makes Possible
18:54 Is It a Bubble, and the Sci-Fi Future of Data Centers

Transcript
00:00Hi, I'm Katherine Schwab, an assistant managing editor here at Forbes covering technology,
00:07and I'm stoked to be here today with Nick Harris. He is the CEO of Lightmatter, which is building
00:13AI chip tech that relies on light rather than electrical signals, which is how chips
00:19typically work. So Nick, thanks for joining me. Yeah, happy to be here. Excited to tell the story.
00:25Yeah. So let's just start by talking about why light? What advantages do you have if you're
00:33trying to use light to send information versus traditional electrical signals?
00:37Yeah, I think that gets into a little bit of the origin story for how the company got started.
00:42Basically, if you look at how computers have scaled, how they've gotten better, faster,
00:48more efficient, cheaper, more ubiquitous over the past 50 plus years, it's come from these
00:54concepts of Moore's law and Dennard scaling. Transistors would get smaller and more efficient
00:59every 18 months, roughly doubling on both of those counts. That broke down in maybe 2005. And
01:07so around that time, people started to lift their heads up and look around and try to find
01:12new sets of physics that might be able to augment how computers work and continue to drive scaling
01:18in cost, energy efficiency and speed. Now there are a lot of choices. People have looked at things
01:23like carbon nanotubes. They've looked at different materials for the transistor, which is the switch
01:28that's used in computing. But one of the areas was light. And that's what I work on at Lightmatter,
01:36and what I worked on at MIT during my doctorate and postdoc there with my co-founder. And the reason light's
01:43interesting is that it has a fundamentally different set of physics associated with it when you compare
01:48it to transistors running on electrical signals. First of all, if you think about the
01:53way the world is wired, the continents are connected using light. So the internet's running over optical
01:59fibers that are in the ocean. Cities are connected together using light. So there's optical fibers buried
02:05in the ground. And that's carrying the data. So what's special about the light? Well, it turns out you
02:12can't send signals across really vast distances using electrical signals because they just kind of
02:18dissipate. The signal dies and it just goes away. With light, you're able to transport data over extreme
02:24distances. And not only that, you're able to send data at very high rates. One of the things that we
02:31just did is we announced an 800 gigabit per second optical fiber. Just to put that in perspective,
02:38your house has one gigabit per second internet connection. Each one of our optical fibers can
02:43power 800 houses. So light is pretty incredible when it comes to transmitting information. And, you
02:51know, that's some of the special properties. And we can talk about what we do at Lightmatter in the
02:55context and all that too. Yeah, no, for sure. So I guess before we get into more of the nitty gritty
03:02details here, can you help us understand why does this matter right now in this moment? Like what are
03:09the pressures on compute right now, given the AI boom that's happening, that make this approach really
03:19interesting for this moment? Yeah. When you look at computing in general, it's always felt like a
03:25breakneck pace. People have had to release new fabrication technologies. You know, think about TSMC with N7,
03:32N5, N3. There's this relentless train of progress for technology, specifically the fundamentals of
03:39technology, like the computing elements. Now, that rate was really fast. The cadence was about 18 months.
03:45With AI, that 18-month improvement cadence was reduced to about three and a half months.
03:50So you're looking for doublings in performance and cost every three and a half months.
03:54When that kicked off, you had to switch to new technologies. Typically, the semiconductor industry,
04:02which is the field that we work in, is very resistant to change. They want to squeeze every
04:07drop they can out of what they have before they make any giant leaps to new technology. When you ask
04:12for a 5x improvement in the rate of progress, something new had to come. And that's why light's
04:18moment is here. So tell me a little bit more about the breakthrough that you announced earlier
04:24this year in terms of that fiber cable and the 8x improvement.
04:31Yeah. So at Lightmatter, we build photonic chips that link together
04:35GPUs and switches to build these giant AI supercomputers. There are a couple things that you want to get
04:42right when you're building out these connections between these GPUs. If you want to build a million GPU
04:47supercomputer, one of them is that you need very efficient utilization of each piece of glass,
04:53each optical fiber that you're connecting to a chip. The other one is you want lots of connections
04:58to the chip and lots of bandwidth, hundreds of terabits per second of bandwidth. And just to put that in
05:04context, your house is one gigabit. A terabit is a thousand gigabits. So I'm saying a hundred thousand
05:10houses worth of bandwidth or beyond. So what we announced earlier this year was a technology where
05:17two chips could communicate using a single optical fiber, sending 16 colors of light,
05:23eight in each direction. So eight are going from one side to the other and eight are coming back.
05:28And so both chips are talking with a single piece of glass, a single optical fiber.
05:33And then we scaled that with our M1000 platform to have 1,024 lanes of this. So that chip
05:40is running at 114 terabit per second. So 114,000 houses worth of bandwidth. It's like a city.
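As a sanity check, the house comparisons in this answer hang together arithmetically. A minimal sketch, with one caveat: the roughly 111 Gb/s per-lane rate is inferred from the quoted totals (114 Tb/s over 1,024 lanes), not a figure stated in the interview.

```python
# Back-of-the-envelope check of the bandwidth figures quoted above.
# Assumption: a "house" is the 1 Gb/s home connection used in the interview.

GBPS = 1e9   # bits per second in a gigabit
TBPS = 1e12  # bits per second in a terabit

# 800 Gb/s fiber vs. a 1 Gb/s home connection
fiber_gbps = 800
house_gbps = 1
print(fiber_gbps // house_gbps)        # 800 houses per fiber

# M1000: 1,024 lanes totaling 114 Tb/s implies roughly 111 Gb/s per lane
lanes = 1024
chip_tbps = 114
per_lane_gbps = chip_tbps * TBPS / lanes / GBPS
print(round(per_lane_gbps, 1))         # ~111.3 Gb/s per lane (inferred)

# Aggregate chip bandwidth expressed in "houses"
print(int(chip_tbps * TBPS / (house_gbps * GBPS)))   # 114000 houses
```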
05:48I appreciate the houses comparison because for us non-technical people, it's hard to wrap your mind
05:55around the scale and the complexity that we're talking about. But at the core, these are really
06:01like the engines that are building the AI models that people are using every day. And there is this
06:10huge demand for more and more and more compute. What have you seen in terms of demand for this product
06:19since you launched it? Well, earlier this year, NVIDIA announced that they were working on optical
06:24technologies to connect to their GPUs. They're trying to increase the size of the domain in which
06:31GPUs can communicate at high speed. We've been working on this problem long before NVIDIA started
06:37getting into the game, but we're really happy to see them announce that they're doing it. And the reason
06:42is the entire ecosystem now knows that it's time to switch to photonics. The level of interest is
06:49excitingly exhausting, I would say. Lots of customers, lots of demos. And we've had the
06:56opportunity to really showcase our hardware in public settings. And I think people are just incredibly
07:02excited about the tech now and eight and a half years in, it's time. Yeah. I mean, that's a long time
07:10for you to be working on bringing this to market. Will you talk to me a little bit about those early
07:16years? What convinced you to start a company when you were still working on this in an academic setting?
07:26So I'll go a little bit earlier than that. So I was an R&D engineer at Micron in 2009. And I was
07:33working on transistors and the device physics. How do you scale the transistors? How do you make them
07:38better? And what I saw at the time was that the roadmap for making transistors better was kind of
07:44at the end. I talked about Moore's Law and Denard Scaling. I could see that was kind of over. And so
07:50I went to graduate school at MIT to figure out how to augment computers. Maybe you can't replace
07:56everything about the computer with light, but what pieces might you be able to replace that could drive
08:01efficiency, drive speed, drive cost down? And so that's kind of what I worked on at MIT.
08:07And while I was there, the deep learning revolution kind of kicked off. And you could feel it. This was
08:142015, you know, was when you could really start to feel it. Everybody at the school was kind of
08:20looking at how they could leverage this deep learning to control systems that were so complicated that
08:26they were kind of incomprehensible to people. And so we were learning to build these AI models. And at the
08:32same time, I was building optical computers for quantum computing in the lab. And I saw that those
08:39machines that I was building using silicon photonics, which is a technology that you build in standard
08:44electronics fabs, but that uses light to transmit information, I saw that that could actually be
08:49used to accelerate AI, both in how chips communicate, but also potentially how they do calculations themselves.
08:56And so I think a lot of things aligned when we were starting the company, we saw the end of scaling for
09:03transistors, we saw the AI boom start to happen. And we saw that the commercial foundries that could,
09:09you know, think about companies like TSMC, GlobalFoundries and Tower, commercial companies that build
09:14the wafers that run the world's electronic supply chain, they all started to look at building silicon
09:19photonics. So we had the convergence of those three things. And that was the kind of fertile ground that
09:25we came from.
09:26But it's taken quite a while to get here. What were some of those early challenges? I mean,
09:33did investors buy it right away?
09:35So at least in our story, I'll tell you how we experienced it.
09:41We went to this MIT competition called the 100K, which is MIT's top entrepreneurship competition.
09:48If you win, you get $100,000 to start your enterprise. We did that, we won the 100,000 there.
09:54We went to Harvard and won this President's Challenge. And, you know, they had like 75,000
09:59or something like this. And after that, I think investors knew that we were onto something, at least
10:04the way we were telling the story and the opportunity was fundamentally true. And so investors really wanted
10:10to come in and back the company at that early stage. When you think about the challenges
10:15that we've run into, well, we've had to invent a lot of how all of this technology works.
10:21From the lasers to the way that you control the devices to robust manufacturing techniques,
10:29we've had to innovate across all of these fronts. It's been a very long journey
10:33to get here. A very deep technology stack. And we've built quite a moat around us.
10:39All right. So when it comes to efficiency, because that seems like one of the core
10:47things you're offering here is more efficient data centers. What are the challenges that data centers are
10:56facing right now? I mean, we've seen folks like Elon Musk build massive supercomputers extremely quickly,
11:06like there is so much building happening in this space. What are those core blockers for efficiency
11:13right now? And why do you see what you're doing as a solution? And how do you then convince
11:19these massive companies to take a bet on this new technology? Today, the rate of growth in the amount
11:27of AI hardware that's deployed is just crazy. In Texas right now, there's 27 gigawatts of AI
11:34compute coming online. Texas as a state consumes about 100 gigawatts of power. So about a quarter of the power
11:41of one of the biggest states in the United States is going to powering these models.
11:46So you think about that. There are a lot of fundamental pieces of the infrastructure that
11:52are going to be constrained from a supply chain perspective. Transformers just generating the
11:57power. People are looking at nuclear reactors to try to power this stuff. Man, wouldn't it be nice if
12:02you could squeeze a lot more compute out of those data centers? With what we're doing, we can take a
12:07one gigawatt data center and make it feel like a four gigawatt data center from a performance perspective.
12:13Think about taking a foundation model from Google or name any of the frontier model labs and releasing
12:20that three times faster. So instead of every six months you get a new model, how about every two
12:24months? Think about the impact that has on the AI race and the compound acceleration of training models
12:32and intelligence in general. So that's what we can do. And the way that we do it in particular
12:37is by allowing these very fast groups of GPUs in these domains where they can talk quickly
12:44to get even bigger. Right now you can only get about 72 GPUs to talk at high speed.
12:50As soon as your workload is bigger than that group of 72 GPUs, you notice a huge drop off in performance.
12:56And so what AI scientists do is they map the problem onto groups of 72 GPUs. They don't want to do that.
13:03Yeah. That's not the goal. They would like to have a thousand GPUs or more that are all talking at
13:10very, very high speed. They really feel in these high bandwidth domains, it all feels like one giant
13:15chip. But as soon as you spill out, it feels like a giant drop off in performance. And so what we're
13:21doing is we're enabling the size of what's called a scale-up domain to get very large, 10,000, 100,000,
13:28a million chips that are all talking with incredibly high speed and driving that time to train and
13:35that one gigawatt to feel like four gigawatts of compute. So how about that 27 gigawatts actually
13:42looks like a hundred? Well, okay. There's a lot, there's a lot to talk about off of what you just
13:49said, but one obviously is energy. I mean, that's a colossal amount of energy. It's mind blowing. And the
13:56U.S. grid is just not going to have enough energy at the pace that we are growing.
14:02There's really going to be a major shortfall here. So it sounds like what you're building
14:06could really help with that energy efficiency problem. Absolutely. It could. And I hope that we
14:12can build fast enough to not have a shortfall happen. I haven't looked at the numbers super
14:17closely myself, but I think there's a chance that you can lay the tracks and keep the powertrain going,
14:23keep that growth rate going. But energy production is an important piece of the puzzle now, especially
14:29in this AI era. And you think about 27 gigawatts, a quarter of Texas, how long until we're at the
14:35United States power consumption? Probably not very long. So there's going to need to be a lot of
14:41innovation on that front. And the more efficiency you can squeeze out of every GPU, TPU, XPU, the better.
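The power arithmetic behind this stretch of the conversation is easy to reproduce. A rough sketch: the 4x factor is Harris's own earlier claim that one gigawatt of data center can "feel like" four gigawatts of compute, not an independent measurement.

```python
# Rough arithmetic behind the Texas power figures discussed above.

texas_gw = 100   # approximate statewide consumption, as cited
ai_gw = 27       # AI compute coming online in Texas

share = ai_gw / texas_gw
print(round(share, 2))    # 0.27 -- about a quarter of the state's power

# Claimed multiplier: photonic interconnect makes 1 GW of data center
# "feel like" 4 GW of compute.
effective_gw = ai_gw * 4
print(effective_gw)       # 108 -- "27 gigawatts actually looks like a hundred"
```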
14:49So, the 72 cap, I didn't know that. I think that's really, really interesting. And yeah,
14:59it's like mind blowing to think about if the systems that have been built now were made with these
15:07constraints that you're describing, like what could be built if those constraints
15:15weren't there. I mean, I know this isn't your area of expertise per se, but I'm curious,
15:21like what kinds of systems could be built when you have so much more ability to compute?
15:32Yeah. So here's a couple of things that you could do with our photonic networking technology,
15:37by the way, our chips are called Passage. We also build lasers that power them, called Guide. With that
15:43technology, you could build much bigger frontier AI models, very, very large hundred trillion
15:49parameter and beyond. And the reason is that you can lace together thousands of GPUs. Each GPU has a
15:55certain amount of memory, but you're multiplying that memory across the number of GPUs. So the more
16:00of them you have talking in an ultra high bandwidth domain, the bigger the AI model you can serve at a
16:05high speed. So imagine, you know, Gemini nine, you know, something like that. That's, that's what you
16:11can enable with the technology and beyond. And then if you look at, there's a new field, which is reasoning
16:16models and deep research. I think a lot of people are using deep research and I think multiple companies
16:21have named it the same thing. That kind of model is spinning up a ton of GPUs and they're all running work in
16:28parallel and you can actually accelerate the time to answer. Right now, sometimes you kick off the deep
16:34research program and it takes 10, 12 minutes to give you an answer. How about one second? How about
16:40instantaneous? So that's what's possible. I think we can massively increase the frontier
16:47performance, bigger parameter counts, which means more intelligence. And we can also speed up that
16:53interaction. So you can do really advanced things really quickly. What in your mind would be like
17:00the holy grail of photonics in terms of a data center that was built kind of from the ground up
17:07using this technology at every layer? What would that look like? Today, these data centers look
17:14like a bunch of mini brains that are all computing in parallel. You ask a copy of the question to all of
17:20these mini brains, they're all 72 GPUs computing. What if you had a million GPUs that all acted like
17:27a single giant brain? That's kind of the dream that we're looking at. And a million is just
17:32the beginning. We'll be at a million very soon. I like to think about the scale of this power consumption
17:38growing. It's just doubling at an obscene rate. There's going to be 10 million, 100 million, billion
17:44GPU data centers, and that can all be linked using photonics acting as a single computer.
17:51Imagine what that AI model will behave like and what it'll be capable of.
17:55Yeah, it's a little scary. If I'm honest, it's like pretty scary to think about.
18:02I mean, a bunch of companies are racing to develop like superhuman intelligence or intelligence that
18:08like actually could mimic a human brain. But it does, it certainly feels like we're pretty far away from
18:14that. And it sounds like we need, we need these kinds of hardware improvements to be able to get
18:23to the kind of scale that you're talking about. Though, the way you reacted, I'm thinking maybe you,
18:28maybe you disagree with me about how far away this stuff is.
18:32I mean, I'd say we're using AI everywhere at Lightmatter. And I could only dream of having the tools
18:40that we have now when I was in graduate school and researching the stuff that would become
18:45Lightmatter. Man, it's an incredible efficiency booster. I think these are very powerful tools.
18:53Yeah, okay. I have to ask because it's a question everyone is talking about right now. Is this a
19:02bubble? The big infrastructure build-out is kind of a signal, along with the
19:09sky-high stock market prices. From your vantage point, are we in a bubble right now?
19:15From my vantage point, I see no slowing in demand, or interest in next gen technologies or build outs
19:25at all. I think there's a lot of talk in the press about some of the, you know, commitments
19:30on data center build outs, and how this will be funded. I think that they'll find a way to get that
19:35stuff funded. Fundamentally, the technology is delivering what they thought it would deliver.
19:42And that's crazy because, you know, this has been going on since 2017 when Google invented the
19:48transformer and the whole world basically made a bet that these AI models would keep getting better at
19:54this incredible rate. One of the possible things that could have happened is it could have slowed
19:58down. We could have had this incredible trajectory that just kind of tapered off. It's not tapering off.
20:04It just keeps getting better. So that's profoundly interesting. And I think that will drive
20:10huge economic value. And it is. So I don't think it's a bubble. I'm very pleasantly shocked at how
20:19well these models keep scaling. Interesting. There also has been, you know,
20:25a lot of talk as people do start to talk about the limitations of LLMs and the transformer
20:31architecture and think about, you know, for instance, moving around in the physical world,
20:36we don't have nearly as much data to help AI models like have spatial reasoning, things like that.
20:42So I think we've got a ways to go in terms of making something that's like truly artificially
20:50intelligent. Yeah. I mean, I think robotics is an exciting field and, you know, watching all these
20:55startups, including Tesla, which is not a startup, building these machines and the rate of progress
21:00there also seems very high. So generally things are looking very futuristic and exciting to me.
21:08I'm curious for you and your company in this moment, what is your moonshot vision for Lightmatter?
21:18Yeah. Moonshot vision for Lightmatter. So I think ultimately we're on this trajectory where we're
21:25trying to replace some of the core components of computers. Right now it's about how they communicate.
21:30And we're going to replace how that's done today, which is with electrical wires, with light.
21:35And that's the journey we're on now. We're also working to create new sources of light that are
21:40incredibly bright and that power crazy amounts of bandwidth. Think about a petabit per second,
21:45which is a thousand terabits. It's a million houses worth of bandwidth. Okay. So on that journey.
21:52I do need that house metaphor. So I appreciate it.
21:54Yeah, absolutely. I see us getting to a point where computers are no longer constrained by
22:02their ability to communicate with each other and their ability to access memory. And at that point,
22:08I think that there'll be a need for a fundamentally new type of computer, in the calculations that underlie it.
22:16Earlier this year, we published a paper in Nature where we showed that you could actually run state-of-the-art AI
22:22models on a computer that was using light to do calculations rather than using electrical signals
22:28to do calculations. Now, I think this is 10 plus years away, but I see this incredible journey at
22:34Lightmatter towards making fundamental improvements in computers and driving that rate of progress.
22:43I do have a crazy prediction, which I think will end up being true.
22:46I think that Mars will end up being a data center.
22:51And I think it's- Whoa.
22:52Right? Okay. Say more, please.
22:56Yes. You're already hearing people talking about, even Google, talking about building data centers in
23:01space. Eventually, you can imagine shipping huge numbers of these XPUs, these GPUs, TPUs to a planet like
23:10Mars and covering the surface with these billion GPU supercomputers. We're going to keep scaling the
23:17amount of power we're using. We're going to keep scaling the amount of land. You're going to want
23:20to pick a spot that's not like a national park to put that computer. And you think about SpaceX and what
23:27they're doing. They're building towards a future where that's possible. So I think the future is very
23:34exciting and there's a lot of elements of sci-fi that are probably going to come true.
23:37Wow. Mars as a data center. That's very, like, Death Star vibes. And also, I feel like Elon
23:47wants to go live on Mars, not cover it with a data center. So there might be some conflicting
23:54incentives there, shall we say.
23:56Yeah. I don't know if I'd want to live on Mars. I don't know if that would be very- It's not that,
24:02you know, it's orange.
24:03Yeah. It doesn't seem like that nice of a place to live. Maybe he'd just rather visit and then
24:10create his own empire there. His computer empire, robots and data centers. Nick, thank you so much.
24:18This has been so interesting. What a fun, very sci-fi note to end on. Thank you so much for joining me.
24:26Absolutely.
24:33Thank you so much for visiting my mind.
24:35Thank you very much.