In today’s information battlefield, outside forces are waging influence campaigns using proxies, propaganda, and falsehoods to sway public opinion.

This tactic is known as Foreign Information Manipulation and Interference (FIMI).

How does FIMI threaten democracies around the world? What’s happening in the Philippines?

Prof. Rachel Khan of the University of the Philippines breaks it down.

This is Part 1 of a three-part special series uncovering how foreign actors meddle in Philippine politics through disinformation.

Transcript
00:00Hello, my name is Howie Severino.
00:03Is our information space being infiltrated by foreign manipulators?
00:08Welcome to our new podcast series on foreign information manipulation and interference.
00:14For our first guest, we have Dr. Rachel Khan,
00:16Professor of Journalism at the University of the Philippines in Diliman
00:20and co-founder of Tsek.ph.
00:30So, you decided to start teaching fact-checking as kind of an antidote to disinformation
00:40or as a response to disinformation in 2017,
00:43kind of as a reaction to what you observed in 2016 during the elections.
00:49Of course, it became kind of a tactic in the Philippines,
00:53but also in the U.S. at that time.
00:56Did it influence the outcome of elections, do you think?
01:00I think, in a way, it influenced the outcome
01:04because there was a lot of information that was outside the normal channels of media
01:12and the usual trusted media.
01:16And people were starting to pick this up.
01:19You could see Facebook pages being created solely to bring out misinformation or disinformation.
01:28And it became so rampant.
01:32And because people were not aware that it could be done,
01:35they still had trust in whatever they could read on the internet.
01:39That created sort of like, you know, it just spiraled into a lot of disinformation
01:48that was being shared by ordinary citizens who could not detect that this was false information.
01:54And so part of what we needed to do was to create an awareness that this exists.
02:01These efforts to counter disinformation, what kind of impact have they had, in your opinion?
02:06I think its greatest impact is to create that awareness.
02:11In the 2025 Digital News Report, which was finalized and completed by my colleague, Professor Yvonne Chua,
02:20it was found that 90% of Filipinos are now aware of disinformation.
02:26Something that they were not aware of back in 2018.
02:31See, so that is major for us; they saw it as a problem.
02:38Something that for us is already an accomplishment
02:42because they already recognize the existence of disinformation.
02:45And because of that, they're also wary about the information they consume.
02:51And that wariness is a first step to combating this big problem.
02:57So I think that alone, you know, that people now have that sense of needing to fact-check
03:05before they start sharing, I think that has sort of lessened the dangers of disinformation
03:11to still a small degree, no?
03:14But still, you know, I think with people becoming more aware
03:19and people becoming more savvy in checking information,
03:24maybe, you know, this whole-of-society approach can actually work against disinformation.
03:32And this 90% awareness, I mean, it's significant, no?
03:36And you did mention that it has to be a whole-of-society approach.
03:40However, from, you know, your paper,
03:45it didn't seem like there was a government counterpart to all of the civil society efforts,
03:52unlike what's happening in other countries, especially in Europe and even in Asia, no?
03:56That there's strong government collaboration with civil society.
04:00Why is it that in the Philippines,
04:01our government doesn't seem to be very active?
04:04And would you actually be open to cooperating with government?
04:09Or what are the dynamics there, Prof?
04:12Well, I think the main problem is
04:16we want to keep the constitutional guarantee of free expression and the free press.
04:23And the danger of having government involved is, you know,
04:31that danger that they would be the ones defining what is true and what is false,
04:39which might impede our own free press.
04:43And so that's why there's this careful approach.
04:48And that's why it's also private sector-led in the same way that media in this country is private sector-led.
04:54Some of the models that we've seen actually have a strong government collaboration with civil society,
05:01despite the risks that you mentioned, the risk to freedom of speech and all that.
05:05And one model that's been mentioned is the model in Taiwan,
05:11where they have a rapid response mechanism based on a group set up by academics like yourself,
05:17journalists, et cetera.
05:19There's like civil society groups that monitor falsehoods, no?
05:23Obviously, we know the danger they face, no?
05:27Because they're closer to China than we are, no?
05:31Yeah.
05:33But have those been effective?
05:35Well, I think for Taiwan, it has been.
05:38And I think because their citizens and their government are more or less on the same page.
05:42I'm not sure if that will work here.
05:45I was a guest speaker at the Presidential Communications Office Information Summit.
05:53And I suggested to them that they have a fact-checking unit per agency
05:58because what they can do is actually, for their own sake,
06:02if somebody is spreading false information about a government policy of that agency,
06:08they should have a unit in their press office that will correct that wrong information, right?
06:16Because the policy is there, but sometimes people misinterpret it deliberately.
06:21So I want to talk about FIMI now, no?
06:26Maybe you can tell us a little bit about the origins of that term,
06:31because it might not be familiar to everybody,
06:33and then how it's relevant now to our region.
06:39Okay, so FIMI is Foreign Information Manipulation and Interference.
06:46So it basically means that there are external influences or influence operators outside the country.
06:56It could be a state operation, or it could be individuals or private citizens that are not part of the state.
07:05Basically, it's external sources of disinformation and that it's coordinated.
07:13So it's a campaign.
07:14It's different from an influence operator who's, let's say, a blogger.
07:18So that blogger can have their own causes of disinformation.
07:25But this one is bigger because it's a campaign among either several bloggers outside the country
07:34or really institutions that either produce inauthentic, coordinated disinformation
07:44or have influencers, individual influencers that want to manipulate the local public sphere.
07:53Okay, we've been talking about disinformation previously known as fake news for more than a decade now, no?
08:01But fake news is a deliberate way of creating distrust with the news media.
08:11It's a Trump coined term.
08:13Why? Because he wanted to discredit legacy media.
08:17So, did Trump invent it or just popularize it?
08:20Well, he popularized it basically because he was attacking legacy media, right?
08:26And then it was picked up here by our politicians, who also wanted to create distrust,
08:32so that the influencers they have more control over would become more popular than traditional media.
08:41Yeah, it's a tactic.
08:43That's why fake news is an oxymoron, technically, right?
08:47Because news, when you define news, at least when I define news in class,
08:51it means the truthful, the accurate reporting of events, issues, and ideas.
08:57See?
08:58But putting the word fake before it, you're manipulating that traditional definition of news.
09:05Unfortunately, that's the term that gets used more.
09:08Unfortunately, yes.
09:10Disinformation is a long word.
09:11But then, right.
09:12But then it's also the influencers who are using that term.
09:17Why?
09:17Because it benefits them.
09:19Because they will become the authority rather than the media, right?
09:25So, that's why it's popular.
09:27In a way, it's a chicken and egg thing.
09:30Of course, disinformation has evolved, no?
09:32So, these days, to what extent is disinformation FIMI?
09:38That's hard to say to what extent, like to put a percentage or number on it.
09:44It's very difficult.
09:45Is it greater now than before?
09:47With the start of social media, I think it started growing also.
09:50They saw a platform in which, or several platforms in which, they could now influence domestic politics.
10:01Not just in this country, but in several.
10:04So, you have those with the capital and those with the machinery and technology that can now influence domestic politics in smaller, more developing economies.
10:17And I think that's why FIMI is something that is dangerous, because you don't really need that much capital.
10:28You don't even need that many people to, you know, conduct disinformation.
10:36So, yes, it's been growing, I think, and it's taking on different hues depending on what issues are, I think, hot at the moment.
10:51Yeah, that's also why I asked to what extent, or whether it's growing, et cetera.
10:56It's because we've had many, many conversations about disinformation for, you know, the past decade.
11:05But it seems it's only now that the foreign aspect of it is being discussed, no?
11:09Is it because, well, is it because the scale of it has increased, or has it always been there?
11:15So, are there now efforts pinpointing it as foreign, or are we realizing now, with more research, et cetera, that a lot of the disinformation from the very start was actually FIMI?
11:29Right.
11:30I think it's because of that greater awareness.
11:33In the same way that there was very little awareness about disinformation in the early days, and now there's a lot of awareness.
11:43So, now we're focusing on who the actors behind the disinformation are.
11:47And that's why awareness of it is growing among those combating disinformation.
11:54And I think that's why it's being talked about.
11:57It's being talked about openly so that people, again, to create an awareness that it's beyond our shores, that the problem doesn't stop, you know, in just the Philippines alone.
12:11And also because of, I think, the European Union had started really talking about it because they saw a big influx of foreign influence operators during the Ukraine war.
12:25And so, because of their own findings, they started, you know, we started also studying it locally, which was not really a focus of disinformation studies before, at least in Asia.
12:38And FIMI itself was kind of coined by the EU, right?
12:43Yes.
12:44So, it was coined by them, but the knowledge of how these operations work was also mapped out by them.
12:52It helped the local studies to analyze our own problems here.
12:56So, I think that's why there's a greater awareness, and also why we're only talking about it now, even if it probably existed as far back as 2016 or even much earlier.
13:08Yeah, because they were talking about the actors, no?
13:10So, let's talk about that a little bit.
13:12There aren't many actors here, I mean, in terms of the really, really big players.
13:16I mean, in Europe, you know, they got really concerned because Ukraine was kind of, there's a domino effect there.
13:24I mean, and Russia has these ambitions, so they're getting territory from Ukraine and, you know, which country is next, et cetera.
13:32So, they're all like rearming, et cetera.
13:34And then they're examining the tactics of the Russians here because Russia supposedly is using disinformation as kind of a war doctrine, right?
13:45It's like an official part of their war machine, right?
13:48So, there's the hardware and the tanks and all of that, but then in cyberspace, it's like they're just as aggressive, no?
13:55In our part of the world, I mean, let's not pussyfoot around the elephant in the room, right?
14:03We're all thinking about China, right?
14:06In a way, they're the Russia of Asia.
14:08What evidence is emerging that they are engaging in FIMI?
14:13Well, let's start with just doing a really minor content analysis on the online news agencies of China, like China News Daily.
14:26It's in Xinhua, right?
14:27Xinhua.
14:28It's a big one.
14:28So, if you just read the news every time there's a conflict in the West Philippine Sea, you would already read that they have a totally different and often fictitious account of what happened, right?
14:46So, I mean, you don't really have to go that far to see if they're doing it in their official media sites, then there's a natural assumption that it's also being done in the unofficial sites or their unofficial influencers.
15:03So, Prof, you mentioned the official outlets; of course, those are obvious, they're openly the government's, right?
15:12But when we say unofficial, how do we know that they're actually affiliated, if they're unofficial?
15:18Technically, it's hard to say if they're affiliated or they're just parroting or echoing something of the same nature.
15:28But, for example, on TikTok, we find that there are TikTok accounts that have no names, that just have D-X-Y-Z, you know?
15:42And there are a lot of those that have the same opinion or even have the same message.
15:48So, that already can make you think whether it's inauthentic accounts that have been created precisely to distribute this information.
16:01And actually, TikTok itself has admitted that they've taken down several sites.
16:06I was in one of their ethics meetings about half a year ago, and I asked them about this, and they said,
16:15yes, we've detected them, and these will be taken down.
16:18And really, when I looked for them again, they were no longer there.
16:22So, at least, you know, despite being a Chinese-owned company, when it was obvious disinformation, they did remove it.
16:35But, of course, you know, you take down one, another is created.
16:39Okay, so I guess we can differentiate what is coming from local troll farms and FIMI by simply looking at where the messages are being repeated.
16:51So, for example, the troll farms, when we see, for example, a journalist is being attacked on Facebook.
16:57It's the same message, but with the Facebook users, you know, you can more or less see their names.
17:10So, it could be authentic, paid trolls.
17:16With FIMI, there's the same message, but it's very hard to detect what countries they're coming from.
17:29So, also, many times, at least on TikTok and Facebook, the names, probably because they're in Chinese characters,
17:40sometimes come out as strings of consonants, you know, because the platform doesn't pick up the original.
17:48So, instead of seeing the characters, you see...
17:52But that's enough to conclude.
17:54That's enough to conclude that, at least from a basic non-technology way of tracking,
18:03you could say that these are coming from outside.
18:06The thing about FIMI, you know that they're coming from outside, but you don't know whether it's operated
18:13or the source is the state, an outside state, or a local paying an outside provider.
18:22That's the thing that it's very hard to distinguish.
18:25Because if I was a local politician and I hired a company in India, you'd have, you know,
18:34you'd have these trolls coming from the outside.
18:40That can be traced that it's coming from the outside, but who's behind it, it's hard to say.
18:46That's where I think you will really need a collaboration with the platforms to trace the source.
18:55Is it coming from a private entity or a state entity?
19:00That's where you need the technology that's beyond a normal fact-checker like myself.
19:05That's an example of information laundering, where you can't really trace the source,
19:09because it has passed through so many channels.
19:11Correct.
19:12And then at the end, the information has a kind of believability.
19:17Because you don't really know it came from that source.
19:20Yeah, except for sometimes it's easy enough because the script is the same.
19:24Right?
19:25Then, at least in that sense, you don't really need to be an expert.
19:30So, what's China's agenda when they, you know, when they're engaging in FIMI?
19:40Do they want to influence our elections?
19:42They want to make sure the next president is on their side?
19:45I mean, what exactly is it?
19:48I think the goal is simply their interest in the West Philippine Sea, right?
19:55And any politician who will support it will have their backing.
20:00I think it's really as simple, at least, I think it's as simple as that.
20:06Okay, which also makes me curious.
20:08Very often, their response and their official response,
20:12and then what's repeated in the supposed unofficial networks,
20:18often sounds unbelievable.
20:21I mean, sometimes we see video and what they say is like the opposite of what we saw.
20:28Right, right.
20:30So, I'm wondering, this seems so clumsy considering there's such a big power,
20:35or do they think it's going to be effective?
20:37I don't think the Filipinos are being targeted at that point.
20:40They are targeting their own domestic market.
20:43And their domestic market have not seen what we've seen.
20:45Right.
20:46Their domestic market does not have access because...
20:49Censorship.
20:49Because of censorship, they don't even have Facebook in China, right?
20:55So, what they say officially is really for the domestic market
20:59so that their support for their government is intact.
21:05So, it might sound ridiculous to us, but to them,
21:12they don't care what we think, basically.
21:15And whatever they say is the reality.
21:17It's the reality in their country.
21:20Because they don't have any alternative information.
21:23Right, right.
21:23Okay.
21:25All right.
21:25And of course, it goes back to political systems, right?
21:29So, I mean, when we...
21:31It's beyond us.
21:32Yeah.
21:32Well, we're lucky in a sense because we're able to read what they say,
21:37but at the same time, we can compare it to what we actually see.
21:41Correct.
21:41I mean, there's different sources of information.
21:44That's the advantage of having a free press and freedom of being able to access the different platforms.
21:51Let's dwell on the fact that TikTok is a Chinese-owned company.
21:55I mean, there have been calls to actually restrict TikTok for that reason.
22:02And I think in certain countries, it might even be banned.
22:08And other countries, if you have a government mobile phone, you cannot have the TikTok app, etc.
22:14There have been calls to do the same thing here.
22:15We interviewed one of the spokespersons of the military.
22:19They discourage, or even restrict, their personnel from having TikTok on their official mobile phones.
22:29I mean, that would be an obvious kind of unofficial channel because if it's Chinese-owned,
22:36I mean, there are suspicions that that's how they get data from Filipinos.
22:41I mean, you can restrict, for example, a Filipino general from using TikTok,
22:47which he probably wouldn't do anyway, but the prohibition wouldn't cover his children, for example.
22:53I mean, you can actually track the general through his family, right?
22:57So, if any of that's true, even as a suspicion, wouldn't that be a huge risk?
23:05Yes.
23:06Especially in the Philippines, where TikTok has grown probably faster than in almost any other society.
23:10True.
23:11So that suspicion remains, although it's being denied by everybody I've talked to in that company.
23:19Which shouldn't surprise us, right?
23:21Which shouldn't surprise us.
23:23So it's very hard to say, or very hard to accuse them of deliberately spying for the country that owns them.
23:34But at the same time, you also have the gadgets themselves.
23:38I'd also be wary of gadgets, you know.
23:41Made in China.
23:42Right.
23:43Most Apple phones.
23:44Because there could be implants there that you don't know about.
23:47So we could really have that suspicious mind and think everything that is internet-based or can have a geopolitical implant can be used against us.
24:09So at the same time, what is more obvious is what we see on the platforms: the news, the people, the influencers who are parroting the lines of official Chinese media.
24:26I think for that one, at least, you can say in black and white that they are, you know, really producing disinformation.
24:36Yeah.
24:36Okay.
24:36That's the content.
24:37Let me just go back to TikTok, because it's so popular with everybody.
24:42I mean, what do you advise your Gen X, or sorry, Gen Z students, no?
24:47Many of them have TikTok, use TikTok.
24:51Should they be, you know, cautious about it?
24:54Or what do you tell them about this?
25:00I mean, they know that you're a scholar of disinformation, no?
25:00Right.
25:00But it's still a platform and like any platform, it can be used for good or bad.
25:06So it's not like I have any government secrets to hide, right?
25:13I would, I think out of my normally suspicious mind, tell anybody who's connected with government, or anyone who has information to protect,
25:25to not engage, you know, to be careful of what gadgets they buy and where it was made and what platforms are on it.
25:34But an ordinary citizen, technically, unless, you know, you have a zero-day kind of scenario in your head,
25:44I don't see any problem at the moment.
25:49People should have that common sense.
25:51If you don't have that common sense.
25:53So at least that's an awareness, eh?
25:56That if you can detect something using your common sense, then you also make the job of the fact checkers easier.
26:05Because we can already focus on the harder-to-detect technology-based disinformation, right?
26:13Like deep fakes.
26:15Yeah, yeah.
26:15Okay, well, speaking of which, no?
26:17Because, you know, we're also monitoring the midterm elections.
26:22And in the months leading up to the midterm elections, there was a lot of talk about deep fakes and how they're kind of the next generation of disinformation
26:30and how it's going to have a huge impact, you know?
26:33So we actually had a campaign here.
26:36We went to universities.
26:38We talked about it.
26:38I'm sure you guys did also, et cetera.
26:41And so, you know, we ourselves were victims of deep fakes.
26:44Like, most of our anchors, myself included, I was selling dietary supplements.
26:48And Jessica was selling, like, weight loss drugs and whatever, Ivan Mayrina, et cetera, et cetera.
26:55And this was as early as last year.
26:58So early on, we were, like, kind of educating people.
27:01We had a self-interest, of course, but we thought this is kind of maybe a sign of things to come, right?
27:09Especially when it comes to elections.
27:11When it came to the campaign itself, the political campaigns, our observation, maybe it was superficial, was that we didn't note a lot of deep fakes used for politics.
27:24I mean, in our case, we were used to sell products, right?
27:27No politics.
27:29It would have been easy to create deep fakes of, like, political leaders saying things they didn't say, right?
27:37But we didn't notice much of that.
27:42So to us, given that 90% awareness, maybe, maybe we're maturing, in a way, and the obvious ways of faking videos or information weren't deployed as much because they weren't deemed to be effective.
28:04What about you?
28:05Did you have the same observation?
28:06Yeah, Tsek.ph came out with a study that the recent elections had shallow fakes, not deep fakes.
28:14Okay, that's interesting.
28:15Meaning that it could still be detected with just one observation.
28:22You didn't need high-tech technology to, you know, to look at the photographs.
28:28So our speculation was right.
28:31Okay.
28:32So that's good news.
28:33That's good news.
28:34Yeah, but doesn't mean that we shouldn't be alert for the future because as technology improves and then they make it so available to the public, you know, so you don't need to be a computer expert anymore.
28:49So, yes, that can happen.
28:51So, so far, it's all been detectable.
28:54It hasn't been that level yet.
28:56Are we gaining ground, in terms of that 90% awareness? And what needs to happen with this awareness?
29:04Well, I think what should be happening is if you are going to use the same technology to fight disinformation, then it is for the positive, like to what we call pre-bunking, like to give out the correct information or to have a rapid response when you are being attacked by disinformation.
29:28And basically, that's using the same technology that the disinformers use, but to a positive and a good, in a good way.
29:38It's a lot of technology.
29:39So, and again, any technology or any tool you can use for the good or for the bad.
29:45So, you know, in the same way, for example, AI, we use AI precisely to detect disinformation, but it could also be used to create disinformation, right?
29:55So, it's really a matter of who is using it and for what purpose.
29:59So, I think those who want to stay clean in the political sphere, but at the same time not be naive, should use it in the proper way.
30:11We're already familiar with money laundering, but what about information laundering?
30:19And how does it affect us?
30:23How is it spread?
30:26Those are the topics we'll tackle next in this special series on the online invasion.