Transcript
00:00Six years at Google, before that, Department of Homeland Security, before that you were Director of Critical Infrastructure, Cybersecurity at the White House, before that, more time at Homeland Security, before that, the U.S. Army.
00:11We're going to talk a little bit about your background later and the culture that you're trying to build at Google Cloud, but I want to start with a story, something that happened from a cybersecurity perspective, a hack or an outage, where it happened and you just thought, okay, this is bad.
00:30Yeah, I think, well, I'll start with some recent ones, and then it can always happen.
00:37It can be top secret, too.
00:38Oh.
00:39You can feel free to break some news there.
00:43I will not break news.
00:46You know, I think, well, probably one of the most challenging ones that I dealt with was actually quite a while ago in the government, which was the Office of Personnel Management.
00:58and where over a period of time we eventually discovered that another country had taken millions of very sensitive documents and fingerprint records,
01:16the materials that government employees used to apply for their security clearances, and they had taken millions of those documents.
01:23And it was one of those things that sort of unfolded like a car accident where the news just keeps getting worse and worse and worse.
01:32And I think the important thing that I also want to highlight for this is sometimes a lesson that people take away from these things is, well,
01:40you just should, like, not try to find out the bad things.
01:44Because that's what happened: as we were improving the security and improving the detection, that's when we started to find the bad things.
01:54And so, you know, it's a long story in terms of how different people were held accountable, which I have opinions about.
02:01But, you know, in the end, it was an organization dealing with what we call legacy software and legacy infrastructure, which just means old.
02:13And so, you know, the government dealing with some of the oldest infrastructure.
02:16And so an organization that was trying to do the right thing, that was doing the right thing,
02:20and rolling out better detection, rolling out better capabilities, and finding out, you know,
02:24more and more horrific of what was lost and eventually having to go public and deal with it.
02:28And so I think that one has always stuck with me precisely for that reason.
02:34Because in the end, you know, as most security professionals will know,
02:40it's the security team that gets held accountable when something happens, which is a whole issue.
02:47And, yes, it's true, the security leaders need to be held accountable.
02:52The ultimate owners of an agency or business need to be held accountable.
02:56But to disincentivize the very sort of security work that allows you to have that observability
03:02and to be able to see what is actually happening is a trend that is concerning.
03:10And so that was probably my most like, oh, this is really happening.
03:15There's a few of those.
03:16But that one really stuck with me because of the accountability, the deep desire to hold people accountable,
03:25which ended up leading people to reconsider whether they should be investing as much in security because it went badly.
03:34So we're going to come back to just in the last couple of minutes your time in government.
03:38But before we do that, I want to hit on a little bit of what you're doing at Google Cloud and just set up the context for us.
03:45I mean, everybody here is familiar with Google Cloud and the products and services it offers.
03:49But, I mean, in the context of what you do, billions of users, millions of websites globally.
03:55What's the biggest misunderstanding that people have about the scale and complexity
03:59and what it takes to actually keep this thing up and running?
04:01I don't know if it's a misunderstanding or just that it's sort of hard to contemplate.
04:08And so, for example, we have billions of users, and I'll touch on all of Google, that interact with all of our various services.
04:18We have services deployed in over 200 countries.
04:21In order to do that, we have 7.75 million miles of cables, subsea and terrestrial, just to support that alone,
04:32let alone all of the energy and all the other infrastructure that goes into that.
04:36We process over 20 petabytes of data every single day.
04:42And these numbers are so big, it's almost hard to contemplate.
04:46So that's an equivalent of 2 million gigabytes a minute.
04:50Wow.
04:51Like, it wasn't even until I came to Google that I was like, oh, wow, this is, like, enormous.
04:58And then, of course, it's our responsibility to figure out how to protect all of that.
05:02We were talking about phishing resistance earlier, which we have invested a lot in.
05:08Gmail alone automatically blocks 120 million phishing attempts every day.
05:14Like, that's awesome.
05:15And things still get through.
05:19No.
05:20We're pretty good at it.
05:21Okay.
05:23I have this really persistent one that I'll talk about later.
05:26They keep changing it.
05:27It's really annoying.
05:29Yeah.
05:30So, and there's a lot of that.
05:31Just, you know, just thinking about the scale of the network and the infrastructure that was built to support, you know, what we now call workspace and Gmail and YouTube and all of these things.
05:41And what we now offer as cloud.
05:42It's immense.
05:43It's mind-boggling to the extent that they were able to engineer something that works like this with low latency and low cost.
05:53Well, and the thing that just occurred to me is when they were building this out, like when Gmail became available to us, I think, you know, consumers of a certain age, like around 40, you know, you're like, wait a second.
06:06We can have free email that has unlimited storage.
06:10But think, and, you know, a lot of us have that same email address that we've had for close to 20 years at this point.
06:16But the threats against that have evolved so much over that period of time.
06:25So it's not as if that product, or any Google product or any Google Cloud product, was built with what could be attempted right now in mind.
06:37So how do you, it's not like you're backfilling, but how do you make sure that the legacy services are actually protected?
06:46Well, we don't think in terms of legacy in that way.
06:50So one of the things that attracted me to come to Google was the scale and impact that you can have with that sort of reach.
06:59And the second thing was their continued recognition of the importance of security and reliability.
07:06And building that into the culture and investing in those things and recognizing that you can, and in many cases they innovated.
07:15And I'm saying they because I wasn't there when this happened, so I can't claim credit for it.
07:21But, you know, thinking about how do you deliver high quality services at a global scale with low latency and low cost,
07:34but be able to continue to ensure on a regular basis that security and reliability is embedded in everything that you do.
07:40And so a lot of times you'll hear people talk about how you have to trade off speed for security or speed for reliability, and I don't think that's true.
07:49It does mean that you have to change the way that an organization prioritizes and invests, and it's not something, you know, that you can just do once.
07:57You have to continually invest in that, and you have to think about the incentives in your organization, and that's one of the things that I think Google has done really well.
08:06And not just doing it for themselves, you know, being able to then externalize that through various mechanisms.
08:12So let's talk a little bit about the evolution of threats and attacks and agentic AI, for example.
08:18And this has been brought up a couple times, but the recent example about a month ago with Anthropic,
the company said they disrupted a highly sophisticated AI-led espionage campaign that used their Claude tool.
08:28Here's what Anthropic said: the attackers used AI's agentic capabilities to an unprecedented degree, using AI not just as an advisor, but to execute the cyberattacks themselves.
08:40Was this like a crossing of the Rubicon?
08:44I'd probably, like, future folks can say whether that was it.
08:48I guess what I would say is, generally, I think of AI, or where we're at with AI right now, in generative AI, as an inflection point for security, for sure.
09:00And a lot of people have talked about it already, so maybe we won't rehash some of those things.
09:04But the idea that the advantage could get to the defender, I think, is a real thing.
09:11Now, is it also going to the offense?
09:15Of course.
09:16But when you think about the amount of noise that is generated in security and what AI is good at, right,
09:28it's being able to distill the signal from that noise.
09:30And previous panelists talked about being able to have analysts that can focus on those higher-level tasks
09:36and using agents to filter out all of that signal and noise.
09:42Of course, that doesn't mean it's easy to do right.
09:45But I do think it's that moment.
09:48If we can harness it as a community, not just as individual companies, but as a community.
09:54And we're doing a lot of work in that space.
09:58But, yes, for sure.
09:59Whether what happened with Claude was a Rubicon, you know, I think there's a lot of things that are going on.
10:05And I do think it's, you know, is the community going to come together to embrace what is available or potentially available
10:15and be able to leverage that for good against those who would leverage it for bad?
10:21Who has the advantage right now?
10:23Well, I always think the offense has the advantage.
10:27But, you know, I come from Wisconsin where we believe in a strong defense.
10:30And we do win championships every now and then with a good defense.
10:34But I see scenarios where it's changing, right, where you see, you know, AI-powered security capabilities,
10:45whatever domain, security operations, right, where they are able to both detect, process, and mitigate issues.
10:54And at a rate of, you know, billions of these.
11:00And so it's, again, the scale is mind-boggling.
11:04But that we have to do it right.
11:07And we have to do it in a way that's explainable.
11:10And we have to do it in a way that's standardized.
11:11And that is open for all.
11:16And so when we're making decisions or the agents are making decisions on our behalf,
11:22we can – they're being guided properly and in a way that is generally accepted and understood by the community.
11:29I promise we'd come back to your time in public service.
11:32And I think it's fair to say your entire career up until 2020 when you started working at Google was in the public sector.
11:40White House, DHS, Army.
11:42You probably work around a lot of people who have never worked in the public sector and have worked in tech their whole lives.
11:48Culturally, what do you bring and what do they bring to you?
11:51I think – well, in one way, I remember all the time in government, everybody would come to me and say,
11:58oh, the private sector is so much more efficient and you guys are so slow.
12:01And I was like, no, that's not, like, totally true, guys.
12:05You actually have the experience.
12:06Yeah, the private sector is not some, like, magical land where everything, like, happens instantaneously and there's no bureaucracy.
12:12That was sort of shocking for me.
12:14And that's fine.
12:16Bureaucracy is good to some extent.
12:19But I think, you know, some of the things that I bring is – I would say for Google – this is maybe a little bit more unique to Google.
12:30Google is a very engineering-first company, right?
12:33And in thinking about how do you design a solution that, you know, that scales, that makes, you know, humans' life easier,
12:46ideally trying to engineer humans out of the process as much as possible.
12:52But I will say sometimes that means that we tend to look at everything through that lens.
12:58And so I think a lot of times what I bring is, one, a deeper understanding of governments and where regulators and governments might be coming from
13:10and they're thinking of risk and the risk that they're trying to manage and how they see different companies working together.
13:19And then also bringing a – you know, there's ways to bring stakeholders together.
13:25And then combining that with the desire and capability of engineers to say, okay, how can I engineer better solutions?
13:35I think it comes to a really unique sort of set of outcomes when we can do that together.
13:41So I'm going to break the rules here because we're going to run out of time.
13:43But I have to ask you, because of your time in government, because of what you did at the Army, what you saw in the White House,
13:51what, from a cybersecurity perspective, keeps you up at night?
14:02You know, I think – I'm going to say it's a mix between cybersecurity and reliability
14:07because I think a lot of the most impactful things may not necessarily be from a malicious perspective,
14:17as was talked about earlier.
14:20And I think the thing that keeps me up at night the most is we're now dealing with such a complex ecosystem
14:29with so many different players and so many different dependencies, right?
14:33And there's a dependency somewhere in the system,
14:40you could take that as either technical or, you know, just a relationship, that is not understood well enough,
14:47and you have these cascading consequences.
14:50And I'm thinking a little bit broader than just our company.
14:53This is, like, one of the most common things that happens with these big outages.
14:58I mean, I was thinking about it from, like, a national security perspective.
15:00Yeah, so where you have – so one of the things I did in Homeland Security
15:05was work on something that we call national critical functions.
15:08And so it's what are those functions that the country, that the U.S. in particular, needs in order to function.
15:14And many of those are based on trust.
15:17And they're based on an oftentimes unknown set of participants.
15:22Some are well-known, some are not, and a lot of dependencies within those systems.
15:26And that's changing a lot, which I think is, from a technology perspective,
15:30you have different people in the market providing those services.
15:33And that's what I mean by one of those dependencies, you know, the reliability of the financial system,
15:41the reliability of markets.
15:42And whether that's an actor maliciously exploiting those or through some unintentional scenario
15:50and bringing those down, and especially in a low-trust environment where people then are now getting into a scenario
15:57where I cannot trust the financial system or I cannot trust for the delivery of my electricity.
16:02And you combine those two things, and where an actor, a malicious actor,
16:06could even exploit something they had nothing to do with,
16:09but they can create a perception that something happened.
16:13Those are the nightmare scenarios to me.
16:15I can't end on that because you said nightmare.
16:20What will help us sleep tonight?
16:24What prevents it?
16:27Like, that all sounds pretty bad.
16:29And I read the headline about the CME last week, and, you know, we saw what happened with one accidental thing.
16:34What gives you confidence that we're not going to enter a scenario like this?
16:38Well, everything that we've talked about all day, right?
16:40All you heard is all the different companies, whether those are cloud providers, financial institutions,
16:47telco institutions, tech companies, people are aware of this, right?
16:53And they're working together.
16:54So I know, you know, Google and tons of different companies that are part of this.
16:57We've put out things like the Secure AI Framework and are investing together, Amazon, Google, PayPal, Microsoft.
17:04People are investing in solutions and are recognizing this and working together.
17:09And it's going beyond information sharing.
17:10People are investing actual dollars into these things to improve the ecosystem.
17:15That's how I feel better at night.
17:19And I also know we have a lot of people who are working every day to make sure.
17:23And I do think we're very resilient as individual companies and as a country so far.
17:30And so you just need to keep ensuring that we are open, transparent, and continue to work together.
17:37It sounds a little bit lame, but in the end, that is what it is.
17:40We have the responsibility as providers in these ecosystems, as providers of critical functions, wherever you are in that value chain.
17:49We have a responsibility to be regularly engaged and regularly improving.
17:55And I think people are doing that.
17:57Okay.
17:57I think we can end there.
17:58Okay.
17:59Okay.
17:59Thank you so much, Jeanette Manfra, Vice President, Global Risk and Compliance over at Google Cloud.