India Today Group Chairman and Editor-in-Chief Aroon Purie, who was part of a roundtable to discuss the 'red line' between AI efficiency and human ethics at the World Governments Summit in Dubai, said AI is just an engine, but the driver is a human, who also has the brakes.

Transcript
00:00:00Good morning, everybody. I think we'll slowly get started. There are a few participants, as if we didn't have enough, who are finishing up their other sessions, so we might have a few others coming in shortly.
00:00:15I'll quickly introduce myself, Ethan Stephanopoulos, the moderator of this event. I think it goes without saying, if you look around the table, we have probably the largest roundtable session.
00:00:25As my good friend over here, Mike Allen, joked, we are the Baskin-Robbins of roundtables. We have 31 voices and an hour and a half, so we're going to try to be as dynamic as possible.
00:00:39Before I get into some quick housekeeping items to kick this off and use the time as efficiently as possible, I'd like to first introduce His Excellency Syed Al-Attar, the chair of the UAE Government Media Office,
00:00:53who will just say a couple of quick words on behalf of the Prime Minister's office.
00:00:57Thank you, Ethan. Thank you very much, guys. I'm really excited to be part of this discussion.
00:01:04Everybody would love to see where media and storytelling is going.
00:01:09Everybody is talking about the future and AI and how things will evolve from here and going forward.
00:01:16And communication and storytelling has been one of our most important tools; I call it the software of the human being.
00:01:26We are wired through stories.
00:01:28We acquire our values, our convictions, and our beliefs through stories.
00:01:34And that's why storytelling is a very powerful thing.
00:01:38Humankind has been telling stories for the last four or five thousand years,
00:01:46on the walls of caves, through rituals, through songs, through everything.
00:01:51They say that those who tell the stories are the most powerful.
00:01:55Those who tell the stories rule the world.
00:01:58So we are in a room full of very powerful people.
00:02:04Stories have been a cornerstone of humankind and of our awareness.
00:02:11While preparing for this session, I came across some numbers:
00:02:17even in religions, about 40% of the Old Testament is stories and narratives,
00:02:24about 60% of the New Testament, and even in the Quran it's about 30%, with 70 different stories.
00:02:30So it's all about stories.
00:02:32Stories are the most important aspect of communication.
00:02:36And they say a human being can remember facts 22 times better if they are told through stories.
00:02:45So storytelling is a very powerful tool for governments, organizations, and corporations.
00:02:51And it is also how we design our communities going forward.
00:02:59And in this session, maybe we are not looking for definite answers,
00:03:04but we are definitely looking for some questions that we need to ask ourselves.
00:03:10Questions like: are we more informed or disinformed in the future?
00:03:16The body of knowledge that human beings have is huge.
00:03:19I came across this number: in the last 25 years, the data that humankind has generated
00:03:26is more than 40 million times what we generated in the previous 2,000 years.
00:03:30So the body of knowledge is huge.
00:03:33But at the same time, we have lots of reports, from the World Economic Forum and from Edelman's Trust Barometer,
00:03:39saying that most people, 65% of people, are misinformed or disinformed, or feel that
00:03:48there are foreign agents implanting false information and false storytelling.
00:03:55So we don't know: are we more informed or disinformed?
00:03:58Are we more connected?
00:03:59Or are we disconnected, insular, and fragmented?
00:04:0665% of humanity is on social networks right now.
00:04:09But we are growing our own filter bubbles.
00:04:14We are growing our own echo chambers.
00:04:16And social media nowadays confirms our own biases.
00:04:23So each one of us is living in his own bubble.
00:04:28So we don't know if we are more connected or more fragmented.
00:04:32Are we more empathetic or more polarized going forward?
00:04:36Because when you see things like what happened in Gaza, for example,
00:04:40and the world coming together with thousands of protests around the world,
00:04:44you feel we are more empathetic.
00:04:46But when you see the hatred, when you see the reward for polarized content online,
00:04:53and when you see the sectarian wars, like in Iraq or in Syria,
00:04:58which were instigated through social media, through storytelling and communication,
00:05:02you feel that we are more polarized, with more hatred.
00:05:06Charlie Kirk was at a university debating some facts,
00:05:10and he was shot dead because of the hatred
00:05:14and the polarization that have been growing on social media.
00:05:18And you also ask yourself: what about our well-being, with all this communication?
00:05:27You would think that with more knowledge we are more conscious about our well-being.
00:05:32But then you read reports about the lack of focus in the new generation,
00:05:38the shrinking attention span, the 30% of teenage girls
00:05:42who think about suicide because of social media, the 40% of Generation Z
00:05:46who have depression because of social media.
00:05:48So you ask yourself: are we going into a world with more well-being for humanity,
00:05:54or are we damaging our well-being?
00:05:57Are we in control of the storytelling of the future, or are we losing control?
00:06:02Because the numbers say that 75% of the stories generated right now are generated by AI and machines.
00:06:09So are we controlling our future narrative or not?
00:06:14So these are the kinds of questions that I would love to hear some answers to today
00:06:18and would love to explore.
00:06:20Usually I'm a very optimistic guy,
00:06:23and I believe that the future will mainly depend on
00:06:26the values that we encode, enforce, and reward through the algorithms.
00:06:32Thank you very much and I hope we have a great interaction today.
00:06:36Thank you, Your Excellency.
00:06:39So very quickly, just to get into all of this, again, we have a very large group, 90 minutes.
00:06:46Any one of the topics that were presented to you, we could spend an hour and a half on individually.
00:06:51But in order to try to keep this as valuable and dynamic as possible,
00:06:57a couple quick housekeeping items.
00:07:00Please note, we're not going to be doing introductions, obviously,
00:07:02but the first time you do speak, just introduce your name so people have a little bit of context of who you are and who you're with.
00:07:09I would ask that you try to be as succinct as possible, within about one minute,
00:07:13so we can get some conversation, dialogue, and debate going in terms of active participation.
00:07:18And then, to help me navigate a large group, turn on your microphone by pressing the right button so it turns red.
00:07:28I'll be able to see it and I'll start to cue people.
00:07:30So if you want to jump in on somebody's point or add on to it, I'm going to keep an eye out and come to you as somebody finishes.
00:07:38That way I know that your mic is on when you press it, and I'll come to you when it's time.
00:07:44So to pick up from what His Excellency mentioned, I think it's a good starting point.
00:07:49Really, two themes I thought we'd cover here today.
00:07:51The first part and kind of the first half of it is talking about the content creation side
00:07:55and obviously the advent of AI, the role that it plays, and in some cases the loss of control.
00:08:02And then the second half I want to talk about the business side.
00:08:04We have a table of leaders in the media landscape, content creators from the advertising space.
00:08:12So we have a whole range of perspectives in front of us.
00:08:15So I welcome everybody to jump in.
00:08:17But to start: we're not going to ask the question, are we using AI?
00:08:24I think that is a foregone conclusion.
00:08:26Rather, where I'd like to begin, and ask the group to start discussing, is: where, at least on the creation side,
00:08:33do you maintain the red line? Where does the human fingerprint become essential, whether in a newsroom or elsewhere?
00:08:42Where do you see the protection of that human fingerprint while capturing, obviously, AI's efficiency?
00:08:49Well, thank you very much.
00:08:56I think AI should help people work faster, not replace the human being.
00:09:02So the values, the cultural ideas, should remain.
00:09:06So I think, with cultural understanding, we should get the best results out of AI; we have to control AI,
00:09:17not let it control us.
00:09:19This is my point of view on that.
00:09:21Mina Al-Oraibi, Editor-in-Chief of The National here.
00:09:26I think the human element, in addition to the values and ethics, is incredibly important.
00:09:33Two elements.
00:09:34One is the news gathering on the ground.
00:09:37AI is great in terms of news gathering for verification, crowdsourcing, sifting through it.
00:09:41But essentially, you need humans on the ground to go get the human interest stories and speak to people and have that direct relationship.
00:09:47And that's something I think we should protect.
00:09:50And then second, on opinion.
00:09:51It's really important to know who holds the opinion that is being espoused.
00:09:55And I think increasingly we see already AI-generated videos, opinions, and so forth, and you're not really sure who's behind it.
00:10:02Whereas actually the authenticity and transparency of who's holding an opinion that is being circulated is important.
00:10:08Kali.
00:10:09Hi, I'm Kali Puri.
00:10:11I head up the India Today Group, which is a media network in India across platforms.
00:10:16I want to add to Mina's point.
00:10:18I think that one of the places where we feel we can do a certain amount of augmentation is in the desk roles where we are rewriting the stories.
00:10:27But we absolutely cannot, at this moment in time, replace the human connect that reporters bring to the newsroom by telling the story with empathy.
00:10:37So one of the rules that we have as we are looking to put in a lot of AI in our work processes is a system called the AI sandwich.
00:10:46So you have human intent in the beginning, you have AI in the middle to help you augment, and then you have the human decision at the end, which is the final call.
00:10:58Hello.
00:10:59Santiago Garcia with Ithaca Films.
00:11:02I think the key element is authorship and who claims authorship over the work of artificial intelligence.
00:11:11Going all the way back to the Renaissance, we've seen artists claim authorship over work done by their workshops and assistants.
00:11:23And at the end, the authorship comes back to the person making the decision willingly to claim the work for themselves.
00:11:32And if governance moves towards the regulation of authorship with AI, I think a lot of the ethics will be clarified, and people will take moral and legal responsibility for what they're endorsing.
00:11:50Hi, Fahim Amat from G42, an AI champion from UAE.
00:11:57In the last one and a half years that we've been working, and since I have the kitchen-sink view from the development side, I want to add to what Mina spoke about.
00:12:09We constantly talk about what AI can do in the art and form of storytelling and communications and marketing.
00:12:17But what we really need to start focusing on is how can we be relevant in a society that's already AI native?
00:12:24And where does the social contract stand?
00:12:28How do we look at the concepts of trust?
00:12:32Where is it coming from?
00:12:34Because amongst us humans are now AI and agentic AI that's taking decisions on our behalf and putting out stories.
00:12:45There's one example I just want to share just to get our thinking right.
00:12:51We're working with some very innovative content creators from LA, France, and here at MBZUAI, who are looking at the next stage of what content could be.
00:13:02So imagine you're on a 13-hour flight to Japan, and you tell your in-flight system:
00:13:12I want to watch a heist movie for six hours, give it to me in three parts.
00:13:18I want actor A, B and C.
00:13:21And in this, I want to learn on how to negotiate within the Japanese culture.
00:13:27And by the time your flight takes off, your content is curated.
00:13:31That's where this is potentially heading.
00:13:35So I just want us to keep that in mind as we carry on our discussions.
00:13:42I want to throw this out, to build on that, for those who represent publishers and newsrooms using AI to augment some of their journalism.
00:13:51Where is the red line, in terms of working with your journalists, for where the use of AI comes in and where the human element starts and continues?
00:14:03I'm going to go to Miles first.
00:14:07Will you forgive me?
00:14:09I just wanted to kind of follow up on this point.
00:14:11My name is Miles Fisher.
00:14:13I'm an entrepreneur, created some of these visual AI companies.
00:14:16And I think, from a creator's standpoint, you also run into the risk of a kind of narcissistic mirror.
00:14:23You know, for the most part, yes, they can draw up these three characters they want to see.
00:14:27But I think the character they most want to see is themselves.
00:14:30The iPhone was brought to market in 2007.
00:14:34It was really the iPhone 4, in 2010, that put a camera pointing towards ourselves.
00:14:39And that's when content exploded.
00:14:42And I think AI, you know, allows us to gaze into a better mirror at least for a little while.
00:14:49You know, these companies are giving well-timed teases of what an AI future might look like.
00:14:55They're nudging us.
00:14:56This is maybe a better version of yourself.
00:15:00Years ago, as I built my company, to showcase the technology, we used it to morph me into Tom Cruise.
00:15:09It got billions of views.
00:15:11And I just think, you know, once we see our face on Tom Cruise in Mission Impossible or ourselves as a character in The Simpsons or on the soccer pitch, what fantasies remain?
00:15:22How many more ways can we see ourselves?
00:15:24I'll just close.
00:15:26The computer, you know, first beat a chess grandmaster in 1989.
00:15:30And yet today chess has never been more popular.
00:15:34So what it reveals to me is that when it comes to activities we take pleasure in, perhaps younger people don't really care if machines did it better.
00:15:43As with these kinds of AI, you know, we take pleasure from seeing ourselves.
00:15:48And you take us out of the picture, we start to lose interest.
00:15:53It can be narcissistic, it can reduce empathy, and yet it can completely simulate our fantasies.
00:16:07I'm Aroon Purie.
00:16:08I'm chairman of the India Today Group.
00:16:10We're also a multimedia group.
00:16:12There's no doubt that AI, for me, is just an engine.
00:16:17But the driver is the human, who also has the brakes.
00:16:22I think the key, in this world of abundant information, is interpretation, meaning, and ethics; that is what humans have to bring to the table.
00:16:34And I think this is where institutions are important,
00:16:37institutions that have been built on the old values of journalism, of trust and credibility.
00:16:44I think that, in the end, is what has to anchor the communication, as far as journalism is concerned.
00:16:55I'm Fabrice Fries from Agence France-Presse.
00:16:59And to your question, the answer is very simple and clear.
00:17:03We do not produce or distribute AI-generated content.
00:17:08So we encourage our journalists to use AI tools to prepare, to research, to organize.
00:17:15But any output must be verified by a human.
00:17:20This position is also true for images.
00:17:23It's also true for fact-checking.
00:17:24We are very strong in that field.
00:17:27But we cannot do a fact-check only based on an AI tool.
00:17:32This world might change, of course.
00:17:35But today, we consider that there are too many hallucinations.
00:17:39And our credibility is our biggest asset.
00:17:42So, so far, this is the rule.
00:17:44Hello, I'm Luis Mandoki.
00:17:49I'm a film director from Mexico.
00:17:51I've done most of my career in Hollywood.
00:17:54And I come to this table as a filmmaker, but also as someone who works every day with a language that now organizes the world, the audiovisual language.
00:18:08And I think there is also a myth we need to correct, which is AI is going to take away jobs.
00:18:15From the film industry, we see the opposite.
00:18:18AI does not eliminate people.
00:18:21It eliminates poorly designed tasks and maximizes human ability.
00:18:28The strategic question is not how many people do we lay off.
00:18:33It is how many people do we train to work with AI.
00:18:38Countries that understand that AI is the infrastructure and that energy is the bottleneck
00:18:46and that human talent remains the central asset will not lose sovereignty.
00:18:53Just to follow up with what the gentleman was just saying.
00:19:00First, my name is Lew Dickey.
00:19:01I'm chairman of Modern Luxury Media.
00:19:03We have 80 publications along with digital and social and events across the United States.
00:19:08And the way we view AI is we're able to create digital employees.
00:19:13And those employees are effectively assistants.
00:19:18And we're able to now expand the capabilities of our existing edit staff and even sales staff.
00:19:25And so, when you think about it, there are a lot of mundane, rote tasks that get done in journalism every day.
00:19:35And we're service journalism.
00:19:36We're not breaking news and we're not opinion.
00:19:41And so it's really ideal for our company to empower our existing employee base with these digital assistants, these digital employees.
00:19:52And because we're a distributed business, it also enables us, through a top-edit process, to achieve a more consistent brand voice across all of our publications.
00:20:03So we've been big proponents of it, are investing heavily in it, and have found it to be very productive and very constructive.
00:20:13Hi, I'm Margery Kraus.
00:20:15I'm the founder and executive chair of APCO, which is a consulting firm.
00:20:19So I'm seeing this from maybe a different side, a little bit.
00:20:23But I want to just build on what you said.
00:20:25We started an AI lab eight years ago.
00:20:29So we've been into this for a while and we have mandatory training on AI because we see AI as your assistant also.
00:20:38And somebody said that it's not that we're going to lose jobs to AI.
00:20:43We've told our people, you're not going to lose your job because of AI.
00:20:46You're going to lose it if you don't know how to use AI and you don't know how to use it ethically and correctly.
00:20:52And so I think this idea of enhancing what we produce, by cutting out the tasks that take a long time and spending the time being smart, efficient, and strategic, is really important.
00:21:09And I think it has freed up a lot of creative talent that otherwise would be doing basic research tasks.
00:21:18Some of that basic research, as long as you check for hallucinations, is valuable.
00:21:24I also wanted to say one other thing: when we look at AI, we look at it from the point of view of where the firewall is between us and what's out there publicly.
00:21:37Because we have clients with confidentiality concerns.
00:21:40But I think it's really important to educate enough people that they realize that when you're using the more pervasive AI tools, you're putting everything in the public domain.
00:21:52And that also causes information and misinformation to spread.
00:21:57And so you have to be really careful about how you engage, I guess I'd say.
00:22:05Troy.
00:22:06Good morning.
00:22:07I'm Troy.
00:22:08I'm the CEO of Omnicom Advertising.
00:22:11We look after two thirds of the world's top 100 brands.
00:22:14And in 1995, I remember the cover of Wired magazine.
00:22:18I still have that on my wall.
00:22:20It predicted the death of advertising.
00:22:22And I think we've proven otherwise since then.
00:22:24So we've survived.
00:22:26And I think we will survive again through AI because I see it as a partner.
00:22:29I think we have to see what the value of AI is.
00:22:32So, Aroon, to your point: for us, it's there to help recognize patterns.
00:22:38It doesn't necessarily break patterns.
00:22:40And so where the driver in us is ultimately there is to be able to ask better questions, to be able to push us to new places, and to be able to have that as an enabler as our partner.
00:22:50So it is a co-creator, as you sort of suggested.
00:22:53So for us, I think obviously the topic is convenient.
00:22:57It's easy to kind of go down the path of loss of jobs and everything else and automation and whatnot.
00:23:02But I very much see it as the beginning of a new economy that enables us to get to solutions quicker.
00:23:09Yes, obviously, the process will run faster.
00:23:12But I think those who ask better questions will win in this world.
00:23:15And I think the value of being distinct is incredibly important.
00:23:19As algorithms get us to very similar places, those who really value distinction will stand out from a commercial standpoint, from a human standpoint, from a purpose standpoint.
00:23:29So that's our responsibility.
00:23:32Lauren.
00:23:34So, I'm Lauren Selig.
00:23:36I run a film production company.
00:23:38We've made movies like Hacksaw Ridge and Lone Survivor and American Made and Top Gun.
00:23:43But I also run a fund.
00:23:44And my background was not in filmmaking.
00:23:46It was in technology.
00:23:47I ran acquisitions at Microsoft on the Windows team.
00:23:50So those two things at one point when I first started them were very separated.
00:23:54And now they have to work together.
00:23:56And I think all of us at the table, as leaders and as creators, have an opportunity at a global scale to change how we see our lives, how we see storytelling, how we see the news, how we see opportunities, how we see
00:24:08cultural divides and how cultures come together.
00:24:11And I think we have to be very careful about how we do that.
00:24:15And even in reverse engineering stories and advertising, we can do that as well.
00:24:19We've invested in companies where I can tell you exactly how much money you make based on how much you spend in marketing and where.
00:24:25And so why would you not reverse engineer a piece of content for your audience?
00:24:30Probably saves time and money, but also is an indicator of where culture and where people are thinking what they're doing.
00:24:35You know, on a positive note to throw it out there, I sit on the board of the XPRIZE where we use very big gamification to solve world problems.
00:24:43We have several billion dollars worth of prizes out there for carbon initiatives and for reversal of aging.
00:24:49But we just launched a project for a five million dollar prize for whoever comes up with the best AI piece of content showing how you can make the world a better place.
00:24:57So why would we not want to take that opportunity for all of us here and globally to reach out and use our voices in that way and use our technology in that way?
00:25:07And use our reach and our passions and our capabilities and our talent creatively for a betterment of our future, which is, I think, what the World Government Summit speaks to and what it should always be about.
00:25:19So thank you for having us. Appreciate it.
00:25:21Pedro.
00:25:23Hello. My name is Pedro David. I'm the chairman of Euronews.
00:25:27I share that view. I mean, you're not going to stop AI, the same way Don Quixote wasn't going to stop the windmills, right?
00:25:34So this is coming to us. We have to live with it.
00:25:36I'd like to bring three or four points and maybe some disagreement with some of what's said before.
00:25:43For big media organizations like ourselves, costs matter, and AI will be a huge enabler of significant cost reductions.
00:25:52And in an industry that struggles to monetize its content, because of how the internet works and how content is disseminated,
00:26:05AI will be a tool and an ally. It will allow us to cut jobs. We will have fewer people.
00:26:12We will also have people doing different things. But I think this is an opportunity.
00:26:18We run an organization that broadcasts in 19 languages.
00:26:22You can't imagine how many jobs I'm going to cut. I know I sound horrible here.
00:26:27But we will cut a lot of jobs held by people who were doing translation, who are not journalists.
00:26:32They were just doing translation. And if we try to pretend there will be no impact on society,
00:26:39that there will not be people doing things that no longer exist, starting with taxi drivers and going all the way
00:26:45to a number of other positions in this world, we are fooling the people, and we should not.
00:26:51So I think there is an important point here is, yes, this is a very important tool. It can allow us to focus on quality.
00:26:57We will free up people to do more editorial because at the end of the day, people want to see unique content, whether it is AI driven,
00:27:05whether it uses AI or whether it is 100% human, what people want is for us to find the next big story that no one has yet published.
00:27:18Two more points I want to mention on AI. The first is bias.
00:27:24How are we ensuring, and how will we control, that people have access to and know what is behind the algorithms,
00:27:37and understand what is behind the thinking? Because today we kind of know, right?
00:27:42When we look at the media landscape, we kind of know what bias each organization has.
00:27:47And so how technology is going to deal with this is something that is really critical for us to understand.
00:27:53And finally, the power of brands. In a world, in news in particular, where almost everyone can be a journalist
00:28:02with an iPhone, covering an event live, the power of brands continues to be significant.
00:28:08We see that every time something major, every time something significant happens in the world,
00:28:15our audience comes back straight running to the big names in the game
00:28:21because they know that at the end of the day, we will tell the truth, we have that credibility.
00:28:26AI will also enable a very interesting creation of new media houses,
00:28:34but ultimately the power of brands, if they do their work properly and protect that brand,
00:28:39will continue to be a very important part of the game. Thanks.
00:28:43Hi, I'm the founder of Inc 42. We are a new age tech media company.
00:28:50And in our 11 years of existence in India, we have realized, especially when you talk about AI,
00:28:56we automate a lot of processes, but our role as an institution is more important than ever.
00:29:01We work with all the state and central governments and we have often found governments and institutions reaching out to us just to verify data,
00:29:09especially when content creators are putting a lot of data out.
00:29:13What is the importance of verification, factual credibility?
00:29:16Like Aroon sir said, credibility is most important in these times, especially from a journalism POV.
00:29:21And I think, you know, as we grow as an economy, as an institution,
00:29:26credibility will be the benchmark or one of the most important pillars in the new age world we exist.
00:29:32Mike.
00:29:35I'm Mike Allen, co-founder of Axios, a media website based in DC.
00:29:41We cover AI, government, national security, reality.
00:29:46Your Excellency, I have another reason that you can be an optimist.
00:29:50So we are an extremely AI-forward company.
00:29:53Our CEO, Jim VandeHei, over here built an app on the bus the other day.
00:29:58But this gathering, this conversation at the World Government Summit,
00:30:04the live events that many of our organizations do around the world,
00:30:08the expertise that we see from your journalists is even more important,
00:30:14more valuable, in an AI world.
00:30:17In a world of AI slop, that human expertise, human connection becomes more valuable than ever.
00:30:24So you can be an optimist.
00:30:29Mina.
00:30:31Sorry.
00:30:32Okay.
00:30:33I think you can hear me.
00:30:34I just wanted to say, since you started the question with the publications, from the newsroom:
00:30:39I think for us there is a difference between journalism and content creation,
00:30:46and journalism is very different in terms of how we use AI.
00:30:50And I think our red line is fact versus fiction.
00:30:53We could easily have AI recreate, I don't know, the moment that a particular president was extracted from his bedroom
00:31:03and flown to another country.
00:31:05AI can fantasize about what that was like, but you have to have actually seen it, to know, to have spoken to the sources.
00:31:10So I think for us the line is fact and fiction, and we also don't put out anything written that is AI generated.
00:31:17There are things that we do in terms of AI summaries that are actually quite popular, and people want that option.
00:31:22But because we're transparent and we label it properly, it makes the difference.
00:31:26So for us, fact and fiction is the dividing line, and entertaining is different from informing.
00:31:31And for us what's really important is how societies will be informed.
00:31:34And I think that's completely different from the fantasies and from putting yourself first, the narcissism.
00:31:38I agree.
00:31:39I think there's joy in that, but I think what really counts is how we're informing societies.
00:31:43Hi, I'm Matt Kaminski.
00:31:46I'm the editorial chairman of the Middle East Broadcasting Networks.
00:31:49We're an Arabic-first media platform trying to connect Washington with this part of the world.
00:31:55But I'm also a senior advisor at an AI startup that benchmarks how business adopts AI.
00:32:01And I just want to encourage our industry to take a look at other industries and how they are thinking about it.
00:32:09And having led efforts inside newsrooms over the last ten years to adopt AI, I think in the newsroom we don't think creatively enough.
00:32:18We don't think ambitiously enough.
00:32:20We're very conservative for the reasons that you mentioned.
00:32:23We have to be careful that the facts are right.
00:32:25But that is preventing us from really kind of trying to imagine how to turbocharge our journalism using AI.
00:32:32I think we've seen that data can be an incredibly powerful force in creating work that we couldn't do 20 years ago.
00:32:43And this actually predated AI when we started using data a lot in newsrooms.
00:32:46With AI available, we can do so much more.
00:32:50We can do so much more to, obviously, as Peter was saying, improve efficiencies around the way that this stuff is produced.
00:33:03But think bigger.
00:33:04Think bigger.
00:33:05How can this stuff be used to sort of reach more people?
00:33:07How can it be used to produce the kind of content we could not imagine doing five years ago?
00:33:13And if you look at, I'll say one last thing.
00:33:14If you look at the banking sector, they are experimenting every day.
00:33:17They're spending billions of dollars to really overhaul all their processes.
00:33:22And I just pick up a little too much resistance within the media sector to really kind of opening ourselves up to the possibilities of this.
00:33:30Yeah, I think at one point I'm going to go to Kali.
00:33:34There's this perception that the media industry, more often than not, is on its heels rather than on its front foot, in a defensive posture versus an offensive posture.
00:33:42Kali.
00:33:43I want to second the minister on being very optimistic, and what David said: when there's a big event, you find that the audience comes running back to credible news brands.
00:33:56But I'm going to throw something into the room which changes that a little bit, going back to what Suleiman said, right?
00:34:03He said that we need to look at social contracts maybe differently in an AI native society.
00:34:10That got me thinking that digital first natives don't think about privacy, dating, love, relationships in the same way as let's say baby boomers, right?
00:34:23What if the AI natives don't see trust in the same way as we do, right?
00:34:32What if a bit of lying and a bit of hallucination and a bit of cunning is okay by them?
00:34:38They say, oh, that person is still trustworthy enough.
00:34:44They just don't see trust the same way as us.
00:34:44And I think that then raises very big existential questions for us on credibility and trust.
00:34:51And I don't know how to solve that.
00:34:55I'm just throwing it in the room because all of you got me thinking about it slightly differently.
00:35:02Please.
00:35:03Thank you all.
00:35:06I forgot to just say who I am.
00:35:08I've been known as an advertising founder in Dubai for the last 30 years.
00:35:12Maybe press your, so we can have the mic on there.
00:35:15I'm Samy Amuka.
00:35:16I've been known as an advertising CEO and founder for the last 30 years in Dubai.
00:35:22But, you know, to answer your question about the red line when it comes to AI.
00:35:29I think, you know, we have to always keep the honesty and ethics, and teach ourselves how to keep the honesty and ethics.
00:35:38We should know when we should use AI.
00:35:41Number two, we should know who owns the content.
00:35:45And number three, who is responsible for it?
00:35:48Oh, sorry.
00:35:49Who is responsible for the content?
00:35:51Who is behind it?
00:35:53So it's the ethics and the honesty: as publishers in this part of the world, or all over the world, I think we should know when we should use AI.
00:36:02And while keeping that, at the same time, we have to build bridges, not fences, for the creativity of the content.
00:36:10I definitely think that we are in the age of authenticity.
00:36:14I feel like the consumer society, as you mentioned, the value of trust is only going to continue to rise.
00:36:23I think it started with politics probably about 10 years ago.
00:36:27And we started to see that movement.
00:36:29It wasn't necessarily what was being said.
00:36:32It was how it was being said.
00:36:33That was the sort of first wave of it.
00:36:36But I think now with AI and the way in which content is going to be created, it's going to be very important for companies and for individuals, governments in particular as well, to really stand behind their level of how much they can be trusted for the communication that they're putting out there.
00:36:53Where does it come from?
00:36:54What place does it come from?
00:36:55And again, the level of authenticity in terms of the delivery of that message as well.
00:37:00So I think we've been in that era, but I think it's only intensifying right now for the next decade or so.
00:37:06I'm going to ask one more kind of the theme here around where we're at keeps coming back up, credibility and trust.
00:37:14So one kind of final question to the group before we move on to the business side is what efforts can you take individually within your organizations, whether it's on the information side, on the entertainment side, to maintain that credibility and trust?
00:37:30Because it's something that takes a long time to build and something very easy to lose.
00:37:34And so how do you think about doing that on a day-to-day basis?
00:37:38And is it requiring labeling something as being AI assisted?
00:37:42Does that help build trust?
00:37:44How are you thinking about that in your respective domains?
00:37:47Please.
00:37:48I don't know.
00:37:49Thanks.
00:37:50Nick Raymond.
00:37:51I work for Oxford Analytica in Bristol.
00:37:52It was obviously acquired by Dow Jones recently.
00:37:53I think the trust point has been pretty well made.
00:37:54The key to it is actually just being really transparent about exactly what you're using AI for.
00:38:07That trust part, the gentleman over here nailed it really well.
00:38:12And it comes back into something we talk about a lot, and forgive me, but actually, I think we're going to go back to Aristotle here.
00:38:25Aristotle, and I don't want to talk philosophy, was a guy who communicated for a living.
00:38:30And if he didn't do it well, he couldn't get paid.
00:38:32And he said there were three elements.
00:38:34It was about ethos, pathos, and logos.
00:38:37Ethos is where you go to a certain sort of moral standard, perhaps a beat, someone who you trusted.
00:38:43Pathos is about the emotional connection, which, if we go back to the 90s and 2000s,
00:38:48was something the government wasn't terribly good at.
00:38:50It was all evidence-based.
00:38:51It wasn't really connecting.
00:38:54And then logos, that hard facts part of it.
00:38:57I mean, hard facts in particular are probably key, given the question you read at the start,
00:39:02which is, in an atomized information environment, where there's often little trust,
00:39:07you're not just speaking to one community, but different communities.
00:39:11That trust part of it is absolutely critical.
00:39:14Do I follow that?
00:39:19I didn't really want to follow Aristotle.
00:39:22But just to wrap on a couple of points.
00:39:24One, I just wanted to plus-one Mike Allen's point about optimism.
00:39:29And I would take an optimistic position on this.
00:39:32I think Axios and Jim, you've done amazing work in AI.
00:39:36And it's a must-read for me.
00:39:38And I would suggest for the rest of you as well.
00:39:40I'm an optimist for two reasons.
00:39:42One, our business structure is almost factory and industrial-like.
00:39:47We have a very large news gathering operation.
00:39:51We're a news gatherer.
00:39:52We supply many of you with our content.
00:39:54And so for us to do that efficiently, the productivity of AI is really an asset.
00:39:59It allows us to grow our ability to cover more stories, to deduce more insight from data,
00:40:05to scan thousands of documents or millions of pages when somebody decides to drop those.
00:40:10And so our ability to find what's new actually gets better as a result of AI.
00:40:16The second thing is that I think the market value of truth and reliable,
00:40:20good branded truth is going to go up in a misinformation society.
00:40:25And obviously Reuters, we have been pursuing truthful journalism for 175 years now.
00:40:32And we have the standards and we think this is a great moment for the work that we do,
00:40:36the great work that AFP does and those who spend their days trying to gather the first version of the truth.
00:40:41Yeah, Tyler Brulet from Monocle magazine.
00:40:46I guess one thing just to your challenge about, yeah, trust.
00:40:50I guess one thing I sort of, when I think we bring in young journalists to our organization,
00:40:54Mina just was sort of picking up on your first point.
00:40:57Reminding young journalists when they arrive in Muscat, when they arrive in Bangkok,
00:41:02to get off their screen, to look up, to challenge what is around them.
00:41:08Because I think one of the great challenges, I sort of question sometimes our business of journalism,
00:41:12is it still attractive?
00:41:14But take a step back from that, do we give young journalists the tools to report still?
00:41:18And the status in a world of storytelling versus journalism.
00:41:23If I may pick on that.
00:41:26I think if I speak to my young girls, none of you guys would be relevant to them.
00:41:33Their source of truth would be on TikTok.
00:41:36That's the reality.
00:41:37That's the generation that we're walking into.
00:41:40Where influencers, different platforms, are the source of truth.
00:41:44Now where they get it from, how they curate it.
00:41:47When we speak about trust, I think we give too much credit to ourselves, given our own human biases.
00:41:56We are at the heart of pretty much all the screw-ups that are out there.
00:42:01And if you speak to the next generation, look at the UN, look at all the organizations,
00:42:07look at the wars that are going on, look at how journalism is happening.
00:42:12"Should they trust us?" is a question we should hold up as a mirror.
00:42:17I think with the advancement of technology and AI,
00:42:22there is a potential where there's going to be an uberization and a democratization of the source of truth.
00:42:30Where it wouldn't be the old traditional names, but perhaps a blockchain that validates what is real news and what is validated.
00:42:41So just want to leave that thought.
00:42:44One quick point.
00:42:45I think we can only have three mics on at any given time.
00:42:48So that's causing a little bit of a problem for people.
00:42:50I'm going to end this part of the session with Miles, and then we'll jump to the second section.
00:42:55We are in the age of authenticity.
00:42:57How do we trust people in real life?
00:42:59It's people that do what they say they're going to do, that are consistent, and show up regularly.
00:43:05Now, Mike Allen is going to, you know, be awake at 3:50 a.m. every morning and deliver consistent news.
00:43:12The newspapers used to arrive on our door.
00:43:15And yet, I 100% agree with your daughters.
00:43:18You know, one of the great successes in media over the last 10 years has been podcasts.
00:43:23I've been told by young people that "podcast" is now almost a cringe term.
00:43:28It's like web series 10 years ago.
00:43:30They're just shows now.
00:43:32But there's not a lot of technology and people talk for three hours, unfiltered.
00:43:36Mr. Patel has built an extraordinary media empire.
00:43:39People talking over and over and over every day.
00:43:42Incredible trust.
00:43:43Incredible trust of podcasters.
00:43:45No filters.
00:43:46Because all of us can publish brilliant words augmented by technology.
00:43:51But can you do it with your eyes up and be present over and over?
00:43:56So, for young people, if you want me to trust you, don't use filters.
00:44:02In fact, being too perfect signals that AI was used.
00:44:06I think your daughters', you know, radar is probably much more finely tuned than most of ours.
00:44:12And so, you know, even just talking regularly builds trust in this weird moment of time that we're in right now.
00:44:21And I identify with this person because, frankly, I feel like they're no BS.
00:44:25They're not augmented.
00:44:27Hello, my name is Ricardo Karam.
00:44:38I'm a TV host.
00:44:42I live in Lebanon, so I come from a place where the air has been filled with words and where silences have buried truth.
00:44:54AI did not invent that, but AI can magnify that.
00:45:00In countries like mine, narratives don't fail quietly.
00:45:07They fracture societies.
00:45:10So, having said that, I feel I'm less concerned with how powerful tools are.
00:45:17I'm more concerned with how accountable we remain.
00:45:22I think that innovation without empathy produces content, but does not produce connection.
00:45:31So, we all care about global outreach, definitely.
00:45:35But global reach is not success if the context is lost.
00:45:41I just want to step back for a minute, because I think AI is just the latest iteration of a cycle we've been in for maybe like three decades.
00:45:54And we came from a media world globally, and certainly at the national level, where there was a very small number of companies that had the scale to reach the population.
00:46:06The internet obviously changed that.
00:46:09And AI is just the next advance.
00:46:12So, for a while, anyone could put up a website.
00:46:15All of a sudden, these companies that were the legacy communications companies who could set the narrative had to operate in a world, not of monopoly, but of competition and almost extreme competition.
00:46:29AI is just the next iteration.
00:46:31A small company like ours can publish in every language if we want and reach the entire globe.
00:46:38So, the trust issue is the key.
00:46:42But there's two sides to the trust issue.
00:46:44Do we trust the audience in a world of extreme competition to decide what they want to rely on?
00:46:51And whether we want to or not, that's what we have to do in the new iteration.
00:46:55So, full of risk, but also full of opportunity, I think.
00:47:00I just wanted to maybe put a marker down for, it's a combination of things that were said.
00:47:08But we've been focusing on the output of AI.
00:47:11And I wanted to just remind us about input as well.
00:47:14Because my concerns are that, well, as a woman, that a lot of the algorithms and things that work are not developed in equal ways.
00:47:27And so, you want to make sure that the content that goes in is also reflective of society.
00:47:33Otherwise, what comes out is going to further reinforce biases that already exist.
00:47:39And I think also the preservation of global culture is really important.
00:47:44And while there are a lot of ways that I think these tools do bring people together, I think that we have to be very careful.
00:47:51We don't lose the distinctive nature of various cultures around the world.
00:47:55I have offices in 35 countries, and they're all different.
00:48:01And protecting and preserving what's unique about those countries is an important part of the storytelling and of, I think, creating the global understanding.
00:48:11And so, I think when we talk about authenticity, we have to really think about not only what comes out,
00:48:18but how we organize about what goes in so that we protect certain things that are really important.
00:48:26Luke, we'll go with you then, Your Excellency.
00:48:28Luke Courtier, Monocle Magazine.
00:48:31I think there are probably two parts that are quite interesting.
00:48:34One is obviously being rigorous about the utility of AI.
00:48:38But I think the other one is not underestimating the sheer irrationality of ourselves as consumers,
00:48:45in that I might happily scroll TikTok all night, but I'll also read, I'll listen to Tortoise Media,
00:48:51I'll read The Guardian, I'll read The FT.
00:48:53And I think there's a great danger there, to your point on the audiences and trusting the audiences:
00:49:02do we give ourselves too much power?
00:49:04That lovely film on the plane, that three-hour crime thriller where I learn to negotiate in Japanese, I don't want to watch that.
00:49:12I don't want to watch what I've created, I don't think I'm as good as really, really good filmmakers and artists and editors.
00:49:20I think we have to allow this sort of intangible thing that is the irrationality and the emotional nature of the human condition to thrive.
00:49:29And I do believe that AI can help and facilitate that.
00:49:32But that's where we have to be incredibly rigorous on utility.
00:49:36And that's not a defensive stance, that's a very progressive stance.
00:49:40Yeah, I'd like to have some comments about, you know, when it comes to truth, authenticity and facts.
00:49:48Now if we are looking at the sources of these truths, you know, it's very diverse.
00:49:53Today, more than 50 million videos are created daily on TikTok and YouTube.
00:49:58We have 1.3 billion snaps every day also.
00:50:02And the young people also, they trust also the content creators they are following also.
00:50:07And they consider them as a source of good information, good factual information.
00:50:11And we cannot discount the fact that we now have AI influencers also.
00:50:19And the market for those AI influencers is growing to reach around 30 billion by 2030.
00:50:25So it is a very complicated question when we are asking about the truth and authenticity of the story that we are receiving.
00:50:32And if we look at the young generation, what kind of sources of information are they using?
00:50:37You wouldn't find them, you know, really keen on reading, you know, traditional media.
00:50:43And that's creating a lot of challenge.
00:50:45And many countries now, they have started to move.
00:50:48I read that in Finland, they are now starting to educate their kids, as early as the age of three,
00:50:55to distinguish between good information and bad information, and where they can get their information from.
00:51:01In Australia, they have also come up with new rules and new laws to prevent kids and teenagers from using social media.
00:51:15Yesterday, actually, we had a very important speech by the Prime Minister of Spain,
00:51:19where he announced that they will start prosecuting the platforms like YouTube and TikTok and, you know, other platforms for the misinformation and the disinformation and manipulation of data and manipulation of algorithm also.
00:51:35And he said they cannot hide behind the algorithm, because we will press charges, criminal charges, against those platforms.
00:51:43So things are changing in a very rapid way. Last year, and I will end with this, on another subject, Mark Zuckerberg announced something really interesting.
00:51:54He said, you know, that social media platforms are spending around 250 billion on AI development.
00:52:01He announced that going forward, anybody with any product can just upload his product and the budget,
00:52:08and the AI will create the whole campaign for him, the whole script, the whole videos and KVs,
00:52:13and it will target the audience and achieve the target for him, the conversion rate that he's looking for.
00:52:19So we are looking for a paradigm shift when it comes to if he can do this for a product, he can do this for a cause or for a value or for anything else.
00:52:29So these facts are food for thought for us, also about the young generation, because countries need to really think about protecting their future generations from AI.
00:52:47So I'd like to use the last third of the hour and a half we have together to talk about the business side of our industry.
00:52:54We've talked a lot about the use of AI and the tools in our organizations and how we use it individually, etc.
00:53:01But, I mean, I think we're starting to recognize that AI has essentially devalued commodity content down to about zero.
00:53:11We recognize that the creator economy is changing the way in the relationship with consumers.
00:53:17And the reality is big tech is taking 80 to 90 cents of every digital dollar.
00:53:22So I'm curious how many of you are looking at your business models and looking how you diversify your business models to create long-term sustainability.
00:53:31Because without the economic engine, none of us would be in the business of making the content and building relationship with audiences.
00:53:39Jim.
00:53:42If you'll humor me, I'm going to start with a sermon and then I'll get to your question.
00:53:48I do think we have to be honest with ourselves in terms of AI and how we're informing readers right now.
00:53:55The most important thing we can do as content creators, as media companies, is do a much better job of explaining to people what's coming and the velocity that it's coming at.
00:54:05I can speak with authority on US media. We started Politico, then we started Axios, and I look around at US publications and there's very little coverage of it.
00:54:16And anyone who's played with this technology in the last month knows the revolution is coming at warp speed.
00:54:23And the most important thing that we could do is do a much better job of explaining how this is going to affect people.
00:54:28And then since most of us are in positions of power, leveling with our employees.
00:54:34To Kaminski's point, this idea of newsrooms being slow to AI: if we're being honest with ourselves, we were very slow to the digital transformation.
00:54:44But sitting our employees down and explaining to them, you don't have months, you don't have years.
00:54:49If you're not using this technology 20% of the day to 10x your output within the next year, in all likelihood you do not have a job with any of us or anywhere.
00:54:58And that's a hard thing to say, but it is the most honest thing that we can say to people and then give them the tools and give them the training to do it.
00:55:06On a hopeful note on the business side, and this will capture how much I think these businesses are going to change.
00:55:11I think the hardest thing to solve in America is local news.
00:55:15And we're in about 34 cities, I guess we're about to go into seven more.
00:55:19I believe we can get to 1000 because of AI with humans on the ground.
00:55:24Here's why. We've broken down a media company down to every single component that gets produced.
00:55:30Other than the journalism, every single piece of that can be automated today.
00:55:36And certainly by the end of the year we think with human level efficacy, almost everything.
00:55:41That means data visualization, headline selection, social distribution, the work of a BDA, the work of creating an ad, the work of selling an ad, the work of placing an ad, the work of monitoring an ad,
00:55:51and then interacting with the buyer.
00:55:54What that means is there's a whole lot of jobs that would have existed in the old era that just aren't going to exist.
00:56:00And we have to level with people on that.
00:56:02But the flip side is, if we're right, and we could be in hundreds or 1000 cities, that could be 10 times the journalists that we have today.
00:56:10And so I would caution all of us just to think honestly about how did we respond to the internet?
00:56:17Where were we slow?
00:56:19Where did we surrender our money, our freedom to the platforms?
00:56:25And then think about how we could be as ruthless as any other business in terms of turning these into businesses so we can keep producing awesome content.
00:56:34To go a little bit further then: you covered a lot of ground on the savings opportunities and how each of these organizations is going to fundamentally change.
00:56:44And talking about surrendering to big tech, how do we look at the future of monetization?
00:56:49While many of these big tech players are taking, again, 90 cents of every dollar, how is everybody thinking about driving value of their content in that relationship with their audience?
00:57:00I'll throw it to you and then I'll go to Peter.
00:57:01Listen, I think if you provide a distinct product to a distinct audience and it's based on trust, that content's probably worth more money going forward.
00:57:10Because everything else is going to be commoditized.
00:57:13So if you have a relationship based on expertise or a relationship based on personality or trust, that's going to have huge, huge, huge value.
00:57:22And if everybody has access to the exact same brain through AI, it's the stuff that sits above and beyond it that makes you and makes us as humans distinctive.
00:57:31Therefore, that has value.
00:57:34Now, if we're being brutally honest, I don't know that most journalists produce that.
00:57:38There's a lot of generalists.
00:57:40There's a lot of people who are sort of happy to be spectators.
00:57:43But if you're like an expert, and if you can provide news and nuance that a machine can't, and if you can provide personality or entertainment that helps people think provocatively and differently, I think you're going to have a ton of value.
00:57:56I just think it'll look a lot different than it does today.
00:58:00Well, it's clear that people from the journalistic and media sector speak better than the private equity financial guys, because you put it in a much more beautiful way.
00:58:09But I 100% agree with how you phrase it.
00:58:14On big tech, it's our fault.
00:58:16I said this last year here, I'll say it again, and I'll say it everywhere.
00:58:20We keep on allowing it. You said we're in a position of power, but maybe we're not in such a position of power, because we let Google and the others play us around as if we were little toys, being thrown up and down every time they change the algorithm.
00:58:33On average, everyone went down 20-30% on page views last year.
00:58:38If you start going and asking around the real numbers, everyone had a huge cut.
00:58:42And it was not because suddenly our content is worse.
00:58:44It's because they keep on playing with the algorithm.
00:58:46And the states let them. And the huge discussion we had this last year, we're not going to go into that again, whether we should have the European Commission or the federal government or the Prime Minister of Spain, for that matter,
00:58:56governing and regulating on this, because that's a whole can of worms that we potentially don't want to open.
00:59:03But we are the owners of this content.
00:59:08And we do not, I mean, we don't have an articulated position.
00:59:12We don't have the capacity to organize ourselves and say, guys, enough.
00:59:17And so we're the ones to blame because we have that position of power.
00:59:21And we don't fight enough because, of course, we all get some money from Google at the end of the day.
00:59:24And so we are afraid of throwing that back at their faces.
00:59:27But we should.
00:59:28And I don't know if there's anyone else.
00:59:30If there's anyone from Google here.
00:59:32But it's clear that we have to bring this discussion and we shouldn't be afraid of doing it.
00:59:37On the other hand, we also have to be honest about something.
00:59:41I am quite recent into this sector.
00:59:45But when I compare it to other sectors that we invest, there's a lot of costs.
00:59:50There's a lot of spending that is unnecessary.
00:59:53There's a lot of things that we keep on spending money on where the only response we get is,
01:00:01oh, it's because that's how it's always been done.
01:00:04I mean, you can't say this in Euronews.
01:00:06Every time everyone tells me, oh, it's how it's been done, I feel like firing that person right away.
01:00:12Right?
01:00:13Honestly.
01:00:14Because you cannot live in a sector that is under such difficulties and pretend that we're going to continue doing things as we've been doing in the past.
01:00:22On the other hand, I 100% agree with you.
01:00:24If you focus our people into creating original, unique content, this will drive the audience.
01:00:33This will drive productivity.
01:00:36We lost 2% of our digital audience last year.
01:00:39But we increased output by more than 30.
01:00:42So we recovered from the Google effect by creating more and better content.
01:00:46And so there's ways to fight this.
01:00:48You don't have to take it as a lost cause.
01:00:51And we turned around Euronews in the last three years from a huge loss-making, almost bankrupt company into a profitable operation.
01:00:57Now we need to make it sustainable.
01:00:59But it is possible to do it.
01:01:00And even with a legacy organization.
01:01:03So definitely, we have to go back to the fundamentals: we have a crucial role to play in our societies, in our democratic societies.
01:01:16We have to keep power accountable.
01:01:18We have to keep telling the stories.
01:01:20We have to keep generating this link, and this link is needed.
01:01:26Of course, people speaking directly, the podcasters, and all the content creators, of course they also have a role to play in the space of free media.
01:01:36But there is also value on having people who have knowledge, who try not to be biased, who are not playing in the political arena, who come and narrate the facts.
01:01:49Because at the end of the day, if Trump goes and gets Maduro, that's a fact.
01:01:54And the journalists don't necessarily have to go and agree with Trump or disagree with Trump.
01:02:00They can, as citizens, but they should do that in their space.
01:02:03As journalists, they have to go back to the facts and hold the truth.
01:02:08And that needs to still continue to be reported.
01:02:11And when you do that, the audience recognizes it.
01:02:13And so, there's, I mean, we're all content creators.
01:02:18We can create content for a niche.
01:02:20We can create content for everyone.
01:02:22We can do it with a bias.
01:02:24We can do it with neutrality.
01:02:25There's space for all of us.
01:02:27But at the end of the day, when we go and try to do something else, which is politics, and we believe that the profession goes from journalism to politics, and we take that active role, and our journalists take that active role, they need to understand the consequences of that.
01:02:41And it's fine if they're ready to take those consequences, but as organizations, we need to decide whether we want to take that and whether we want to be the space for that or not.
01:02:50Well, I'll tell you.
01:02:51I'm going to go to Neil.
01:02:52But we don't have somebody here from Google.
01:02:54But I think you have a good point that we need to mention to the organizers of the World Government Summit.
01:02:59Next year, a very dynamic conversation with future media.
01:03:03We should invite members from Big Tech.
01:03:06Neil.
01:03:07I think I forgot to introduce myself last time.
01:03:11Neil Patel, co-founder of Tucker Carlson Network.
01:03:14You know, on the business question, I think the truth is if you have commoditized media content, you can't fight Big Tech.
01:03:22They're going to take all your revenue, and you're pretty much dead in the water.
01:03:26I think if you have distinct content, if you reach people with that content, if you build a loyal audience, if they have trust in you, then there's almost never been a better time to be in media.
01:03:42You have all the traditional revenue lines that media has used for quite a while, subscription and advertising.
01:03:48But once you have that audience in place, you know, you can launch your own brands.
01:03:53We're a company that's less than two years old.
01:03:56We have three brands we've launched.
01:03:57Our largest brand is bigger than the media company, and it's 12 months old.
01:04:02So it's about building that audience and building that loyalty and standing, finding your lane.
01:04:11I think we probably embody that more than most.
01:04:14But I think it's interesting.
01:04:17There's just two paths.
01:04:18One is a dead end.
01:04:20It will be AI, and it will be these big tech platforms.
01:04:24But the other one is full of opportunity.
01:04:27Well, just a reflection over everything that we've been hearing today.
01:04:37I think it all comes back to authorship.
01:04:41The way to monetize content today comes back to trust, the trust that you can have in a human being, a person who has built a career and a credibility.
01:04:52It's greater than the trust that you can have in machines.
01:04:56We've all seen how data gets manipulated by bots, numbers get manipulated by bots all the time, trying to create a fake trust, a fake credibility, and trying to manipulate people into believing that there is an echo chamber and a lot of validation from fake information or fake stories.
01:05:18And the way you can combat that and the way you can monetize that is the same way the star system has worked over the years in Hollywood and in the media and the different media platforms.
01:05:31And it's how you validate and how you build a star system that generates credibility and trust in the people in your organizations or in our films, and how those people are authenticated and create accountability for what they stand behind.
01:05:51So it all comes back to how you regulate this certification of authenticity, that the person that is speaking to you is actually the person that is claiming authorship over the content that they're spreading.
01:06:12You asked about opportunities.
01:06:14I would like to sound as optimistic as my colleague here.
01:06:18I will be much more pessimistic, because I just read in the Reuters Institute digital report that the media is facing an existential threat.
01:06:31Existential.
01:06:32It's a very strong word.
01:06:33And we know why this is.
01:06:35It's because AI agents offer answers instead of links.
01:06:42And we can't finance our efforts in AI if we don't have media clients who are strong.
01:06:58And so the elephant in the room, I believe, is the remuneration of IP, of intellectual property.
01:06:58And we are the victims, all of us, of theft at an industrial scale.
01:07:05And you ask why are there not platforms in the room.
01:07:11I think we should be honest about it.
01:07:14The Google, Facebook and others have disappeared from our space.
01:07:20They no longer participate in a news conference.
01:07:23They disinvest.
01:07:24They want to disengage because it's a source of nuisance.
01:07:28It's as simple as that.
01:07:30Hi, I'm Emma.
01:07:34I have the very lucky job of running the Observer newspaper, which is the world's oldest Sunday newspaper.
01:07:41And Luke, thank you so much for mentioning Tortoise Media, because it was Tortoise that acquired the Observer from the Guardian Media Group last year.
01:07:49And so we have a huge challenge to turn around what is a print only title on a Sunday to make it fit for a modern paying customer base.
01:07:59And so, Pedro, I have the same strength of feeling as you when you mentioned this is just the way it's done.
01:08:05If you buy a newspaper business now and you come up against a kind of production team who have always done it in a certain way,
01:08:12there's a feeling that things must be different and we need to do it now.
01:08:17Our challenge this year is to turn the Observer around, make it profitable and ensure that it is producing the right journalism for the right audiences.
01:08:27And so we're really optimistic about building on what we've done with Tortoise, whose ethos is much more about sense-making and storylines.
01:08:37And so making sure, mostly through podcasts, that we're telling the right stories to our unique audience, which is younger and skews female, and bringing the Observer and Tortoise together.
01:08:48And so to Mike's point earlier, we're really enthusiastic about opening up our newsroom every single week.
01:08:54We invite our subscribers into our newsroom to see our journalism being made.
01:09:00And so we think there are huge opportunities with AI in terms of efficiencies in news gathering and our processes.
01:09:07But the ability of bringing people together with our journalists and our customers, we feel it's massively important for us this year.
01:09:15I was just thinking about, if we convene the same panel in '27 and '28 and '29, how the narrative would change and how different it would be.
01:09:27And it's a little bit of fear, denial, rage, acceptance.
01:09:30And when we think about, we talked about legacy media or traditional media and how slow it was to embrace digital.
01:09:37Keep in mind, digital, all it really did was it changed distribution.
01:09:42AI, this is like multiple orders of magnitude different than just changing distribution.
01:09:51And I'm afraid we're making the same mistake all over again, which is, again, late adoption.
01:09:56And somebody was talking about the banking industry and how it's moving very, very quickly.
01:10:00It's not just the banking, it's pharma, it's everything else you can imagine.
01:10:04And I'm afraid that I hope we're not relegated to the past and also in terms of this slow adoption of something that, as I say, is multiple orders of magnitude in terms of the bedrock fundamental change it's going to create in our business.
01:10:20And I think the other thing in listening is we're talking about AI as if it's some static thing.
01:10:25AI is moving so fast.
01:10:30And if you talk to people, whether it's ethicists, well, I had a nice conversation with a woman the other night from one of the spinoffs of Google.
01:10:38And we talked about how much money is getting invested in this.
01:10:42And we say, well, Aristotle, it can't do empathetic AI.
01:10:46Well, it will be able to do empathetic AI.
01:10:49And it will be able to do, in other words, whatever you can imagine.
01:10:53This is even prior to super intelligence because then you can't imagine it.
01:10:56But whatever you can imagine, whatever voice in terms of entertainment or news or localism or anything you can possibly imagine,
01:11:05your horizon should be broad because it will be able to do it.
01:11:08And it's moving at breakneck pace.
01:11:11And so I think we would all be wise to embrace it.
01:11:16And I also think there's sort of a gravitational pull.
01:11:18The audience, our consumers, will ultimately make the determination.
01:11:23And if we think we can, you know, in essence sort of be the arbiters of truth in media, legacy media was able to do that because it controlled distribution.
01:11:33Digital democratized distribution.
01:11:36So it wasn't able to do that anymore.
01:11:39This, as I say, is like three-dimensional chess in terms of what it's going to be able to do.
01:11:43So ultimately the consumer, just as they were with the digital transformation, will be the true arbiters, you know, in terms of what they accept and don't accept.
01:11:51But I would just sort of caution everyone that I think we would be wise to embrace it and realize the full capability of it, because it will turn back on us if we don't.
01:12:04So, you're asked what's working.
01:12:07This is going to sound crazy.
01:12:08Ladies and gentlemen, print.
01:12:11We grew, Luke, at double digits last year.
01:12:16The money is still on paper, which is remarkable.
01:12:18And Google is not taking 90% of that.
01:12:21The second thing, Miles: the cafe.
01:12:23This is also, I think, our point of bringing people together.
01:12:26It's great having conferences.
01:12:28But bringing people together, almost like in the world of local news, where people used to go into all sorts of local newspapers, bringing them a story tip.
01:12:36That also works for us in a cafe environment as well.
01:12:39So the two things, interesting, physical retail, physical cafe experiences for people, where we can directly distribute.
01:12:47And you're directly distributing a piece of print.
01:12:49And of course, there's radio and digital and many other things.
01:12:52But it's still the magazine and books.
01:12:56So, whether you're print or digital, I think the one thing that we should start looking at very quickly is a whole new model of monetization based on tokenization.
01:13:11That is going to be your new revenue stream that you should really focus on.
01:13:16So whoever is the creator of any IP in whichever format, how you assign value to those tokens is what the next game is going to be.
01:13:27So if that's not part of your commercial model, then I think we'll be caught napping yet again by the tech companies.
01:13:35And to your point, absolutely, I think AI should not be seen as yet another bolt-on like the internet or mobile.
01:13:44Think of AI as the operating system of your business and perhaps our human evolution as we take this further.
01:13:53And the last point I wanted to build on what Marjorie mentioned.
01:14:00There is an innate frame bias of the English language in all things that we consume, whether it's news or storytelling through Hollywood.
01:14:12In the UAE, we created the Arabic large language model called JAIS, which has now empowered 350 million Arabic speakers to participate in the age of intelligence.
01:14:25And that's when we stumbled upon this very simple idea that we still have another 9 billion people as audiences for commercialization and engagement that has not been looked at yet by the world.
01:14:42So we've now worked on Nanda and developed a large language model for Hindi.
01:14:48We're looking at Swahili.
01:14:50We're looking at Bahasa, Farsi.
01:14:53That's going to add another 4 billion people.
01:14:57So there's a whole new audience acquisition that will happen with the democratization of technology that could help to the monetization conversation.
01:15:07As a journalist, I'll state the obvious thing that runs through this conversation.
01:15:13It seems there's a kind of tension between scale and niche.
01:15:16We're talking about models and scale is kind of scary.
01:15:20It doesn't seem to have a great future.
01:15:24It is the one that's facing the greatest existential threat.
01:15:27Whereas niche and something that Mike Allen taught me when he brought me to Politico 12 years ago,
01:15:32if you think of your audiences as people, you have a very clear idea of who you are for,
01:15:37and you are building these communities with which you have an intimate relationship,
01:15:43that opens up all kinds of possibilities.
01:15:46And if, moreover, you don't think from a high level, ah, the holy religion of journalism,
01:15:51but about what you are doing with those people to bring them to you, to make them loyal to you,
01:15:57to make them come back to you, that opens up an incredible range of both editorial and business opportunities.
01:16:07So in our final 10 minutes, I want to do two things, as a good moderator should do,
01:16:12is make sure everybody gets to participate.
01:16:14So I want to invite those who have yet to have a chance to say something to jump in.
01:16:18But, Lou, it's almost as if you read my mind.
01:16:21What I'd like to do is do somewhat of a big prediction.
01:16:26If we were to imagine ourselves all sitting at this table five years out from today,
01:16:31we're gathered again.
01:16:33How do you see the world?
01:16:35One big prediction of the reality of media.
01:16:38Again, don't have to give too much other context other than we're just placing bets.
01:16:44Yes.
01:16:48Kamal Hachandani from Haute Living Media Group.
01:16:50One of the things in the monetization that we saw was connecting people more.
01:16:55So the experiential side of it, you know, we tripled our business as we saw the digital footprint and traffic go down.
01:17:02So there's huge value there: influence used to be measured at scale, but now it's moved to trust.
01:17:08So I've been feeling that we've been moving more from a creator economy to a credibility economy.
01:17:13And getting people and the brands together has been effective in us monetizing it more.
01:17:19My bet is that the consumer won't care where the content comes from.
01:17:25They'll care how good it is.
01:17:27And I think we all believe that made by humans is going to be a movement that's going to save us, much like CSR.
01:17:34At the end of the day, people are cheap.
01:17:36And invariably, they just want what will most entertain them.
01:17:40So the only value that's going to keep us all afloat is by being distinct.
01:17:45I think five years from now, like, it'll be so unrecognizable.
01:17:54It'd be akin to someone at the advent of the printing press looking at an iPhone.
01:17:59I think we'll be on a platform we can't imagine.
01:18:01I think people will interact with content in ways that are unfathomable.
01:18:05I think it'll be fully integrated, fully personalized somehow.
01:18:09I think our content, whether it's through tokens or subscription, it'll somehow feed into that.
01:18:15I don't think it means we don't have a place in the universe.
01:18:18I just think that universe is going to be wild.
01:18:21I think five years from now, we will either have overlooked, or have come round to, the incredible capital
01:18:29of what it means to connect in-world with like-minded individuals.
01:18:34I think if you look at the growth of something like Strava and some of the technology that's about that,
01:18:38it's about bringing me and you together doing a thing we like and connecting in person together,
01:18:43I think that capital will be the most prominent thing.
01:18:47And how and where we get there, we won't mind as long as we get there faster and more efficiently
01:18:52and whether or not that's through a media channel that has a point of view that we connect
01:18:57and facilitates our coming together in person to a tool that simply allows me to go,
01:19:01do you know what, this guy's in this room and he likes the same kind of shirt that I do, so let's go and chat now.
01:19:07I think that commodity is going to be vital and that will drive our revenues and it plays a bit to niche as well.
01:19:14My answer.
01:19:16If I had to place a bet, I would say that people in the future are going to be placing more bets.
01:19:20This online gambling about anything is a growing phenomenon, and while it's slightly off topic from what we're discussing,
01:19:30where now even Disney is running major operations where you can gamble on anything, it could help solve the fake news problem.
01:19:37This online gambling marketplace needs a shared set of facts to determine the outcome, who the winner is, and it's growing.
01:19:44And so people care when their money is on the line, and it could help provide a definitive source of, quote unquote, unbiased facts.
01:19:54Sir.
01:19:55You know, I agree totally with His Excellency Mr. Letter about the Spanish role model of changing the law over the last eight years, from yesterday's speech.
01:20:06I think in five years, if the decision makers in each country don't take care of the policies and the flexibility,
01:20:15while at the same time enforcing the law to minimize the mistakes of AI misuse, the world will be crazy, in my point of view.
01:20:26So I think there should be some guidelines from decision makers to make sure that this crazy AI misuse will not affect populations all over the world,
01:20:39and especially the new generations.
01:20:43I think in five years from now, well, it's happening already, but it will increase: you'll be catering to an audience of one, just one person each.
01:20:52And another disparate trend would be creating communities, which you will then monetize through experiential in whichever way you can.
01:21:02I think there's a distinct difference, or there will be a distinct difference, between five years from now and ten years from now.
01:21:10And that is in five years, we're still going to be in a turmoil and in a transition period by using these new technologies.
01:21:19I think, just a simple question, why would a company want to produce a hundred times more tomatoes if people are going to be fired,
01:21:28and then they're going to have probably ten percent of the market that they have today?
01:21:33Unless tomatoes become pretty much free, and we extend certain golden bridges into the re-education of people that would allow them to have a lot more time,
01:21:45a lot more free time, and that free time at the same time will create a lot more necessity to watch and consume content.
01:21:54So content companies that have distinguished themselves and have generated the credibility and the eyeballs to them,
01:22:01they're going to be way more valuable.
01:22:03So I think the tables are going to be flipped in a way with all of these big tech companies,
01:22:08and content and IP is going to generate more value than today.
01:22:13Now ten years from now, that algorithm might completely change.
01:22:17And ten years from now, we might be looking at a society that has resolved most of the problems,
01:22:24thanks to these new technologies, and have democratized things like food and basic necessities.
01:22:31But at the same time, I mean, if we don't self-destruct before then, I think two lines are going to evolve.
01:22:41One of which will be this use and overuse of technologies, and the other will be placing value on human craftsmanship.
01:22:51What humans do, for example, I mean, if you look at other industries, cars used to be made piece by piece,
01:23:03and today cars that are made piece by piece are way more valuable than cars that are mass-produced.
01:23:09And similar things will happen to content in the future.
01:23:13AI will generate cheaper, simple, less effective, less quality content,
01:23:19and humans are still going to create unique and disruptive content.
01:23:24That, I believe, is going to happen.
01:23:27If I place a bet, five years from now, I believe all of us will be five years younger than today.
01:23:37Thanks to progress in the pharmaceutical area.
01:23:41And I would imagine, you know, each one of us would have his own AI agent,
01:23:45where everything will come through that agent, you know, whether it's news or marketing or products
01:23:50or, you know, helping us with our tasks also.
01:23:53In five years, I hope to see a world where using AI comes with the same awareness as driving a car with a driving license.
01:24:07Like, everyone using it knows the responsibility of using that strong engine, knows the rules,
01:24:15and acts with that responsibility when driving the story forward.
01:24:20Did you want that?
01:24:23Well, I was just going to add, I think I may be repeating your point,
01:24:27but actually in five years, to be a little precise, I don't think we'll have websites.
01:24:30I don't think we'll have apps.
01:24:32I think we'll all have our own individual agent that will scan the world to build our own news experience.
01:24:38And if you ask, well, how will we get traffic in that world, I think the AI agents will be talking to each other
01:24:43and helping each other discover the best place to solve the problem.
01:24:48Can I just add something that's a little bit maybe on the other side of this?
01:24:52So I think that five years from now, we're going to be craving a lot more relationships.
01:24:58We're all talking about what technology is going to do to us, but we're human beings.
01:25:03And I think if we can minimize the tasks of our daily life, we're going to have more time.
01:25:08And that time is for experience, and that experience is through connections.
01:25:12And so I want to come back to your optimism at the beginning, that I hope that five years from now,
01:25:18that relationships will matter more than ever.
01:25:21I think that's a perfect way to end in that I'd like to first thank His Excellency
01:25:27and the World Government Summit for convening all of us to have this conversation.
01:25:31And hopefully we can gather together in five years and see where we are.
01:25:36And, you know, we had 31 voices here, an hour and a half conversation,
01:25:40and I feel like we actually had a fairly healthy dialogue and discussion.
01:25:44I want to thank you all.