00:00You've got a new book out, you've left Meta, but you're still advocating for the benefits of social media.
00:05Tell us more about your book.
00:07Well, my book, in essence, is about something which I think has not been focused on enough
00:12in the sort of public debate around the online world,
00:16which is the growing fragmentation and breaking apart of the internet itself.
00:21And the internet is, people don't think about it,
00:24because you send a photo to the other side of the world, you share a post,
00:27you know, we just take for granted the seamless openness of the internet
00:31in every millisecond of our lives.
00:33But it's actually sort of slightly put together by sort of sticky tape in the background.
00:37It grew up by accident.
00:39And because you've now got this great collision between America and China,
00:44both fighting for dominance in the age of AI,
00:47and because you've got governments all regulating the same thing, the internet, in different ways,
00:53the thing that we like about it, which is that it's so open,
00:56and so easy to communicate, and that it sort of demolishes geography,
01:00is, I think, in real danger.
01:03And so what my book is about, and it touches on many things,
01:06including my time at Meta and so on,
01:08but it's really, in essence, about the view I've arrived at,
01:12which is that if we want to safeguard the openness of the internet,
01:16the three major techno-democracies in the world,
01:19the US, India, and Europe, in that sort of order of descending importance,
01:24need to get together and agree explicitly some new sort of guardrails
01:29to guarantee the internet for the future in this time when, of course,
01:36it's changing dramatically because of AI anyway.
01:39Well, should we break down how to do that safeguarding?
01:41Because the future feels like AI, but we've already got social media,
01:44and you, of course, were at Meta.
01:46There's a lot of concern about the ad-based business,
01:49and really doom-scrolling, how addictive the likes of Facebook and Instagram are.
01:54Are you not worried about that?
01:56Yeah.
01:56Look, I think anyone who has kids, I have kids,
01:59anyone who has kids in particular, of course worries constantly
02:02about what are your kids experiencing online and so on.
02:05Is it addictive?
02:05Should we make sure that kids don't have phones in school?
02:10For what it's worth, I personally think kids shouldn't be on their phones at school,
02:14I'd much rather they listen to their teacher than their phone.
02:17Read a book.
02:18What?
02:18Read a book.
02:18Yeah, totally.
02:19And I'm like any other person.
02:20I used to love reading books, I still do,
02:22but I'm sometimes constantly distracted by what's on my phone.
02:25So it's not just kids; all of us are constantly glued to these screens.
02:29And it's not just social media, it's news as well.
02:33Bloomberg's on your little screen as well all the time.
02:34You like people to watch it all the time.
02:36So, you know, so many people, whether it's social media companies
02:40or news companies like Bloomberg, are competing for people's attention
02:44by way of that little screen that we all have in our clammy hands all day.
02:49So, yeah, I definitely think that's something we need to worry about.
02:53And I don't think it's just driven by an ad-based business model.
02:59If anything, having ads, which Meta does and Google and these other companies,
03:06actually means that, you know, the companies have a commercial incentive
03:10not to constantly recommend, you know, vile or hateful content
03:15because advertisers don't like their ads for their plastic ducks or their cars
03:19next to hateful content.
03:20So I think sometimes the commercial logic of an ad-based model
03:25actually points in a somewhat different direction to what a lot of the critics assume.
03:30But that all being said, yes, of course we need to worry about that.
03:33And it's one of the reasons, and I explain this in the book,
03:36why I think so much more could be done to make sure that we know
03:40exactly how old kids are when they're online,
03:43but also give people, all of us, just much more control
03:47over how we use the apps and services that we do every day online.
03:53Well, where you said about the commercial argument, keeping safety there,
03:58why wouldn't Meta have third-party audits of its algorithms?
04:02Wouldn't that make sense?
04:04Well, there isn't a thing called "the algorithm";
04:06there isn't just one algorithm.
04:08I'm not honestly sure what an audit of algorithms would look like.
04:14I definitely, however, think, and I explain that in some detail in the book,
04:18that the companies, of course, should be constantly forced to be more transparent,
04:22particularly, for instance, in an age of AI,
04:24how are they assembling the AI, these very powerful AI models that they're developing?
04:30What kind of training data are they using?
04:33Give people, you know, give people much, much more visibility under the bonnet,
04:38under the hood of exactly how these foundation models are built in the first place.
04:43Give people far greater control over how they interact with AI bots and so on.
04:50So I definitely think whether it's transparency, whether it's user control,
04:54whether it's much stricter, watertight age verification across apps
05:00so that you know what age users, particularly teens, are when they're using apps.
05:05I think all of these things can and must happen, for sure.
05:08So you don't think that voluntary commitments from companies are enough?
05:11Because we've had the former UK Prime Minister, Rishi Sunak,
05:14saying that actually to try and build up oversight would take too long over AI companies,
05:19that you should just have companies signing up to these voluntary commitments themselves.
05:24I don't think it's one thing or the other.
05:26The companies have got their responsibility, but they're companies.
05:29They're companies competing with each other.
05:30They're not elected by anybody.
05:32They're not philosopher kings.
05:34They're not charities.
05:36They're in a ferocious sort of existential contest with each other.
05:40So, of course, governments should step in where, particularly in the democratic world,
05:45people by democratic choice want certain limits or constraints imposed upon the private companies.
05:51I think this, at the moment, the debate, particularly around regulation of AI,
05:57is swinging from one extreme to the other.
06:00So you have one lot of people who say AI is a threat to humanity.
06:04It needs to be stopped or regulated, sort of more or less out of existence now.
06:08Then you've obviously got the Trump administration
06:11and now all the leading figures from the big tech companies saying all regulation is a bad thing.
06:16Neither extreme makes any sense at all.
06:19Of course you need some regulation.
06:21Of course you need some guardrails.
06:24And over time, that is exactly, I suspect, what will happen.
06:26Well, let's take the Brussels approach.
06:28Brussels looking at changing its EU AI Act and data protection law.
06:32Is that the right approach?
06:34Look, I think Brussels have got themselves into a terrible muddle on AI,
06:38which is that they initiated the legislative process on AI
06:42before ChatGPT was even invented.
06:46So what happened was, instead of pushing the pause button,
06:49and I remember talking to EU officials at the time and saying,
06:52look, the technology you're trying to regulate has now completely changed out of all recognition.
06:57Just stop.
06:58You know, there's no first prize in being the first entity to regulate.
07:01It's much better to regulate in an effective way than just be the first to do so.
07:06But instead, they charged ahead, they tried to retrofit at speed a bunch of additional provisions
07:15to this pre-ChatGPT law.
07:18They did it like on the hoof, without really fully understanding what generative AI really implies.
07:24And now you've got this dog's dinner of an AI law from the EU,
07:28and, at least if you read some reports in the newspapers,
07:32wholly unsurprisingly and entirely predictably, as I predicted at the time,
07:36they're basically now trying to get themselves off a hook of their own making.
07:40It would be much more sensible for the EU just to pause how they're regulating AI,
07:46work out what they want, what are the things you're trying to fix,
07:50what are the externalities to this technology that you're trying to fix.
07:53Is it the protection of kids? Is it the protection of vulnerable adults?
07:58Is it greater transparency, as we have just discussed,
08:02on how these models are created in the first place?
08:05Is it catastrophic, biochemical or weaponized risk?
08:10Be clear about the risks you're trying to mitigate.
08:14Then work with others, because this technology doesn't know borders.
08:18So particularly try and work, if you can, with the American administration,
08:22work with the Indians, work with other democracies around the world,
08:25rather than what they've done, which is they've lurched forward
08:28with an imperfect piece of legislation based on sweeping assertions about risk
08:33that were generated, honestly, as much by newspaper headlines at the time
08:37as by cool analysis.
08:39And, of course, they're now in this terrible predicament
08:41that if they go ahead with the law they've got on the statute book,
08:44it's only going to damage European AI companies
08:46as they try and compete against the much bigger competitors
08:50they have in China and the United States.
08:52Well, have you told them that?
08:53Yes.
08:53Have you been speaking to EU commissioners since you left Meta?
08:56I was in Brussels last week.
08:58No, no, I've been very outspoken on this.
09:00But, look, I was a member of the European Parliament myself for five years.
09:04I worked in Brussels for ten years.
09:07The problem with the Brussels sausage-making machine
09:09is that once it kind of gets going,
09:11it's a bit like that Mastermind thing.
09:13You know, I've started, so I'll finish.
09:17But I feel if you read the Draghi report,
09:21if you read the Letta report,
09:22if you look at a lot of the conclusions
09:24that the European leaders themselves are arriving at
09:27in their own meetings,
09:29this chronic lack of competitiveness in the digital sphere in Europe
09:34vis-à-vis China and America
09:36is, in my view, a really existential threat
09:39for the European continent.
09:41We are perilously close to becoming a museum continent,
09:44if we're not careful.
09:46And our kids and our grandkids,
09:48much as we're now all using American technology in Europe,
09:50will only continue to use American and Chinese technology.
09:54I actually believe it is important
09:55that Europe should stand on its own two feet,
09:57not just in terms of defence, which is clearly the case,
10:00but also technologically.
10:01We've got great gifts.
10:02We've got smart entrepreneurs.
10:04We've got good universities.
10:05We've got good start-up companies.
10:08But the moment they grow in scale
10:10and start getting any momentum,
10:13they can only grow by taking American money
10:16and going and living in California.
10:18We have to learn how to develop our own success in Europe
10:21if we want to stand on our own two feet.
10:24And part of that, to your question,
10:26is to kind of unravel the mess
10:28that they've got themselves into legislatively
10:31under the previous European Commission
10:34and the previous European Parliament.
10:35So, Nick, I've got to ask you while I've got you.
10:38We've got a budget coming up here in the UK.
10:39You were part of an austerity government.
10:41Is austerity the answer for Rachel Reeves?
10:44No, but candour is definitely the answer.
10:48I think the problem for the Labour Party
10:50is that they basically spent the last ten,
10:53well, decade and a half
10:55kind of kidding themselves about the choices they face.
10:58You know, after Labour lost the election in 2010,
11:01I remember very well in that half a decade period
11:04when I was Deputy Prime Minister
11:06and we were basically trying to fix the terrible damage
11:08that was done to the UK economy
11:10because of the financial crash.
11:12Instead of just accepting that whoever was in power
11:14would have to take some pretty difficult decisions,
11:17some tough decisions,
11:18because the British economy had suffered
11:19this terrible heart attack,
11:21instead Gordon Brown and all the rest of them said,
11:23oh, it's outrageous, how dare you save any money,
11:25you shouldn't make any cuts and so on.
11:27Even though they'd actually passed legislation
11:29back in 2010
11:30to halve the deficit more quickly
11:32than the coalition government did.
11:34So they kind of lulled themselves into thinking
11:37that government's just a sort of set of easy choices.
11:40And so, of course, now they've got this huge majority,
11:42so they've got no excuse not to stick to their manifesto.
11:44I couldn't stick to my manifesto,
11:46but I hadn't won the election.
11:47I didn't have a parliamentary majority.
11:49I was the leader of a junior party.
11:50They've got a huge parliamentary majority.
11:52They've got no political excuse.
11:55And instead, they've got this majority
11:56full of MPs who kind of, I think,
11:59thought that just because they think
12:01they're nicer than the Conservatives,
12:02that the moment they got into power,
12:03there'd be no difficult choices to make.
12:05Then they refused to make the decision
12:07to make savings in welfare,
12:09which are desperately needed.
12:10So now you end up in this ridiculous position
12:13where they're going to have to, apparently,
12:15break their very, very clear
12:17front-page manifesto commitments
12:18on income tax,
12:20because they've basically been kidding themselves
12:22for the decade and a half
12:24about the difficult predicament
12:25that this country is in.
12:26So she should break those promises or not?
12:29No, of course she shouldn't break those promises.
12:31So it'll be spending cuts, austerity?
12:33Look, I've been out of politics for ages.
12:36I'm not going to start writing budgets
12:37in this television studio for Rachel Reeves.
12:39But clearly, economically,
12:42the government needs to avail itself
12:44of both spending reductions
12:46and some tax increases.
12:49My point, simply, to you,
12:50is that this is just, you know,
12:52the chickens coming home to roost.
12:54of, well,
12:55a decade and a half of the Labour government
12:57simply refusing to accept
12:59that if you are, particularly now as we are,
13:02ludicrously out of the European Union,
13:04a floating, you know, currency,
13:06our massive deficits are funded, basically,
13:08by the generosity of strangers.
13:10We have this ballooning welfare budget.
13:12Productivity is persistently low.
13:14So, you know, you just have to take
13:16some pretty difficult decisions.
13:18And I think they ducked those
13:20for the last decade and a half,
13:21pretended that they'd fetishised austerity,
13:24even though they'd actually legislated
13:25for it in 2010.
13:27And now I think all of this
13:28is now coming to a head
13:30at a time when, of course,
13:31the government is politically,
13:33remarkably, given how big a majority it's got,
13:35really very weak.
13:37Just finally, Sir Nick,
13:38there's a job vacancy at the BBC.
13:39Do you fancy it?
13:40No, I've got plenty of other plans.
13:43Thank you very much.
13:44Including what?
13:45Well, that's for another interview
13:46on another day.
13:48All right.
13:48A pleasure to have you back.
13:49So, Nick.