Several recent flashpoints in Malaysia have shown how quickly online outrage can spiral beyond the digital sphere. A new report on ‘Far-Right Extremism and Tech Accountability in Malaysia’ takes a closer look at how far-right narratives are spreading in the country’s digital space, and what role tech platforms may be playing. How do online narratives escalate into real-world consequences, and what can be done to stop that cycle? On this episode of #ConsiderThis, Melisa Idris speaks with Suyin Chia, Research Director of INITIATE.MY, an organisation seeking to promote tolerance and prevent violence, and Aziff Azuddin, Research Director of IMAN Research, a think tank focused on security, peacebuilding, and community development.
Transcript
00:00Music
00:10Hello and good evening. I'm Melisa Idris. Welcome to Consider This.
00:14This is the show where we want you to consider and then reconsider what you know of the news of
00:18the day.
00:19Several recent flashpoints in Malaysia have shown how quickly online rage can spiral beyond the digital sphere.
00:27We have a new report out on far-right extremism and tech accountability in Malaysia
00:32which takes a closer look at how far-right narratives may be spreading across the country's digital space
00:38and what role tech platforms may be playing in that.
00:42So tonight on the show we will explore the question,
00:45how do online narratives escalate into real-world consequences and what can be done to stop that cycle?
00:53So joining me for this very important conversation, I have two authors of the report.
00:58We have Suyin Chia who is Director of Initiate.my which is an organisation seeking to promote tolerance and prevent
01:04violence.
01:05We also have Aziff Azuddin who is Research Director of IMAN Research
01:09which is a think tank that focuses on security, peace building and community development.
01:14Both of you, welcome to the show. Good of you to be here with me.
01:18I want to get the definitions right first as we start the conversation.
01:23In the report, you use the term far-right extremism.
01:27What exactly do you mean when you refer to that phrase in general, Aziff?
01:32Yeah, so if you're talking about it in a Malaysian context,
01:35politically and ideologically, we have left-wing, centre and right.
01:40If you think about what right-wing means in the political definition,
01:44we have maybe political parties or groups that align themselves with more conservative ideas
01:48about society and government, right?
01:51Maybe relating to tradition, relating to slightly more conservative views on how religion or society should operate.
01:59But when you talk about the far-right, and I think the distinction is quite necessary here,
02:04the far-right brings those conservative ideas to the more extreme level.
02:10And when we talk about the extreme side of things, this then maybe spills over into violence,
02:16the possibility of violence, the acceptance of violence, and into accepting ideas of second-class citizenship, right?
02:23Certain people deserve fewer rights than others, perhaps to the point where oppression is seen as the natural
order of things.
02:30So when we talk about the far-right in the Malaysian context, this refers to groups of people,
02:36or maybe an ideology shared among a community or certain individuals, built on this separation between
02:44who is better than whom, whether it's divided by race or religion.
02:49Right. So you're right. I think it's so important that we make that distinction
02:54between conservative narratives and extreme narratives.
02:59I'm quite curious as to what you think determines whether a society or a population is vulnerable
03:07to some of these extremist narratives.
03:10Yeah. So I think when we look at the political ideology spectrum again, right,
03:14if you talk about the left or maybe liberal ideas, and I don't like to use the word liberal so much
03:19because it's always blended together with progressive, and liberal, what does that mean?
03:25But I think along the spectrum, if you're on the left side of it, there is a certain openness
03:30to diversity, pluralism, so ideas that I think progressives can get behind.
03:35But if you go on the right side, right, then we have conservative ideas, ideas about tradition, structure.
03:41And I think if you go on the right side, it is much safer mentally and emotionally
03:46to kind of sit yourself in conservatism.
03:50But this is where the problem lies, right?
03:52So when you have a political landscape or if you have an economic landscape that feels uncertain,
03:58so we're talking maybe about lack of a job, lack of a vibrant job market,
04:03being unable to deal with inequality between, let's say, Penang and Kedah.
04:08If you work in Penang, you get better wages.
04:10But if you live in Kedah, you get lower wages.
04:11There is a resentment that comes up from that, or rather an anxiety
04:15that spills over into resentment.
04:19So these are the conditions, material conditions, whether socioeconomic
04:22or cultural, that make a population much more vulnerable to far-right ideas
04:30because the far-right then offers a very attractive and simplistic view of the world
04:35where your problems can be solved if you adopt an ideology in which you are better than the other person.
04:42In the order of things, you do deserve to be better than the other person.
04:46And these are kind of like the political structures that you can operate in,
04:49where you are better, where you can get better opportunities.
04:52Alright, Suyin, I'm going to bring you into this conversation.
04:55Why did Initiate think it was important to put this out?
04:58Because I understand you've been working on this report for quite a while now.
05:03And it's come out, I think, at a time where we're feeling the heat as a country.
05:09We're feeling the heightened tensions.
05:11So the question I asked Aziff was what makes a population kind of vulnerable to far-right extremist narratives.
05:19With Initiate putting this out, do you consider Malaysia quite vulnerable to these types of narratives?
05:27Simply put, I think yes, in very specific ways.
05:32The reason why Initiate is interested in this subject is not just because of the narratives for their own sake,
05:38but also because of their implications for, one, vulnerable populations, and also populations who are vulnerable to such narratives,
05:46as Aziff has mentioned.
05:47And that implication has very much to do with public security and public safety, right?
05:52So it's a sense of how safe are we as a community and as an inter-ethnic community as well,
06:01right?
06:01So these types of tensions that are being raised and popularized by far-right extremist groups lead to racial polarization.
06:09They lead to inter-ethnic tensions.
06:12But also more importantly, I think on a structural level, it also undermines institutional and democratic processes, right?
06:20Some of these narratives, what are they?
06:23The incitement to violence, you can trace a direct line to, well, we don't believe in the rule of law,
06:30right?
06:31So we need to take matters into our own hands because the current democratic processes do not serve to protect
06:38our interests,
06:39whether it's economic interests, whether it's ethnic or religious or race-based interests.
06:45That's the institutional distrust.
06:47It's the institutional distrust.
06:49So you would say that, you know, we cannot rely on the justice process.
06:53We cannot rely on the law enforcement process.
06:56We cannot rely on the government of the day to provide and to protect our interests.
07:02And therefore, we will take matters into our own hands or we will support non-state actors that seem to
07:07be, you know, fighting for our cause, as it were.
07:13What I liked about the report was that it was very methodical in the way you broke down the
07:19process of how a single flashpoint escalates across, you know, different platforms and then into
07:29real-life harm.
07:30Maybe we can talk a little bit about that, Aziff. The report mentioned event-driven spikes, which I think
07:37is so important because timing matters in all of this.
07:40So tell me what you've noticed when you look at all the different case studies and in terms of the
07:45process of how it happens, what did you observe?
07:48Yeah, so I think we have to tie this back to the digital platforms, whether it's Meta, whether it's TikTok,
07:55or whether it's X.
07:57Typically, these things would start from a single flashpoint, which is maybe somebody posting about something.
08:01And I think the emotional content of a specific post determines whether or not something goes viral, right?
08:08So we used the KK Mart example, which is what was used in the report, right?
08:14Where it was a single image with the kalimah Allah on it, right?
08:20The word Allah on it.
08:22And I think the visual and the accompanying caption provided enough emotional charge for people to feel strongly
08:30enough to share it.
08:31And I think this is where we need to also then talk about virality.
08:36Virality also means that whether a certain thing goes viral also depends on who carries the message.
08:43So because the individual who initially posted the post had quite a strong following, and this individual was renowned in
08:52the community that he was in, it got widely shared.
08:55And the moment it spreads, it's gone viral for a reason, right?
08:59It just goes everywhere to the point where the six degrees of separation thing becomes entirely true.
09:06Within 24 hours or within 48 hours, that single post reached a single politician who himself had the public
09:13platform and the reach to make this into a national-level issue.
09:16So I think sometimes when we look at certain issues and how they blew up, it's incredibly unpredictable because it
09:23really depends on the content, and it depends on who is amplifying it and how far their links reach.
09:27Is it really unpredictable, Aziff?
09:29Because some of these flashpoints are so similar, it's like you can almost predict it will happen.
09:35So I think the question is, are these escalations organic in nature or are they being helped along? Are they
09:42manufactured? Are they coordinated by certain actors?
09:47So I would say that the post, the content, the intent of it is certainly manufactured.
09:55But I think, just like anyone making any post, you don't know how far it will go, right?
09:59Because then it depends on who your followers are, whether there is somebody influential among them.
10:04And I think that's the unpredictable part because you never know who it reaches.
10:07But if you're talking about the pattern of how certain things get viral, this is where we use actor network
10:13theory, right?
10:13Where we look at who are the actors that are spreading this, right?
10:16Who has more followers, who has more pull?
10:19Does this individual also have political mileage?
10:21Is that a person who also has public mileage, right?
10:23So it's really about the individuals that are sharing it.
10:28But also, and I think this is where Suyin can also talk about it a bit more,
10:31what is it within the post and the engagement that the post gets that allows the algorithm to push it
10:36further?
10:37There are, I think, leapfrogging elements to this, right?
10:40Like you said, it almost seems as if it's very predictable.
10:43But what makes one post more sensational or amplified than another?
10:55Sometimes there is this black box of algorithms on different platforms that I think we have yet to unpack
and figure out,
10:56you know, what kind of messaging and stories and issues get pushed to your front page, right?
11:04So while Aziff talks a lot about, you know, how it's very important to identify who's carrying the message,
11:09I think on the other side is how the platforms carry this message beyond the initial network
11:17of people carrying it.
11:19And this is being distributed to the masses through, you know, algorithm amplification, right?
11:28But I think we also need to talk about the role of other actors that come in, you know,
11:34we don't want to use the words pile on, but it does feel like it's a pile-on effect.
11:38Once something trends on social media, you will see, you know, for example,
11:43online media and traditional media will pick up on the story.
11:46So even if the story is being reported in a very neutral and objective manner,
11:52it still raises the issue into the public consciousness.
11:57And so now there's more attention and eyeballs to the issue, which then re-legitimizes the issue, right?
12:04That there is an actual grievance to be explored and to be taken for further action.
12:10And I think that amplification from, you know, having the network of individuals to the platform,
12:18to media, and then therefore now to the masses is where you see this, you know, these issues happen.
12:24And, you know, the de-escalation to it oftentimes that we talk about is,
12:29of course, we need to talk more about content moderation.
12:31But we also, in the report, we've also speak about how just moderating some contents
12:36and taking it down post-fact is insufficient.
12:39Because once an issue is being taken up and amplified into the public consciousness,
12:44you can remove it from the platforms, but the issue and the narrative remains.
12:50There is that saying that everything on the internet is permanent.
12:53Indeed.
12:55And you're explaining how it gets amplified, with the media being
13:00one of the layers of the amplification and legitimization of it.
13:04I'm very cognizant and aware of the responsibility,
13:08of the role that we sometimes play, even inadvertently, in amplifying or legitimizing a message or a concern, right?
13:16Let's talk about moderation, because all these platforms, they always say we have safety tools in place,
13:23we've got moderation systems and standards.
13:28Why do you argue that that's not enough?
13:31Well, there are several large reasons for this, but I'll give you three that are relevant to the Malaysian context.
13:36I think where platforms today are trending in their content moderation policies is towards large-scale AI-supported automated content
13:49moderation.
13:50And they are divesting from human moderators, who are incredibly necessary to be able to pick up things like nuance
13:58and language and subtext.
14:01AI moderation is very good at moderating at scale, so large data, but a lot of hateful narratives, hate speech,
14:14incitement to violence,
14:16can be very linguistically specific, in ways that current AI models today are just unable to pick up.
14:24So that's one of the reasons.
14:25Sorry, just to get you to explain that, so in the Malaysian context, what would that look like, like slang?
14:31So slang, language, spelling as well.
14:36If you look at the Malaysian public and netizens, they are incredibly creative in the way they spell things to kind of
14:42bypass content moderation, etc.
14:47Also coming up with new language.
14:49Every day, there's a new hateful word to describe a particular race group, or a particular ethnic group, or a
14:56particular, you know, vulnerable group, and you wouldn't know that it's hateful language unless you are part of that cultural
15:04context.
15:05Yeah, absolutely.
15:06The second part also is that we are now seeing, globally, tech companies regressing in their content moderation policies.
15:16What we argue in the report is that tech companies themselves are not neutral actors in the messaging framework, as
15:27it were, right?
15:28Tech platforms are governed by boards of directors who have very explicit profit-based, but also political and ideologically based
15:36agendas.
15:37And if you've seen just in the recent 24 months, you know, there's been severe rollbacks on trust and safety
15:45policies in a lot of these major platforms.
15:48They've, you know, they've removed fact-checking.
15:50They've removed community notes that provide some form of fact-checking.
15:55You know, there have, again, been mass layoffs of human moderators, which means that institutional memory and institutional understanding of cultural context,
16:05language context, is now kind of out the door.
16:08There have also been rollbacks, and therefore more support and more allowance for inflammatory speech.
16:18And quite insidiously, right, it's all under the quote-unquote label of promoting free speech.
16:24And so, you know, what we argue is content moderation can only go as far as the tech platforms themselves
16:32are willing and able to provide more safety-focused designs in their content moderation frameworks.
16:41And that's just not a reality that we see right now.
16:43Did you have one more? You said three, right?
16:46Yes, and the third one very much is that we don't have the incentives, right, to be developing content moderation policies
17:00as well as frameworks right now.
17:03And that challenge is also because Malaysia is a very small economic context, and it's very difficult for us
17:10to have, how would you call it,
17:14leverage with big tech platforms to say, you know, these are our challenges.
17:20You know, these are the kinds of narratives that we are tracking and the issues that we're tracking.
17:24And therefore, we require, you know, more specific moderation for very specific issues.
17:31The way it works now for content moderation is, and we've heard this in feedback from a lot of the
17:38tech platforms that are willing to speak to us,
17:41when they moderate their content, it's on a per-content basis: is this content inflammatory, is this content hateful? But they
17:50don't see the ecosystem of narratives and content and actors to say,
17:58well, the accumulation or the sum of this narrative is problematic, right?
18:04And so it's very difficult to bridge that gap when we say there are explicit extremist narratives that are
18:14trending online that you need to pay attention to.
18:16That's a gap where tech platforms are currently, you know, unable to meet us.
18:20I can only imagine how nuanced some of the hate speech is, right?
18:26All the narratives, very coded and all the different spellings.
18:30I sometimes read it and go, what do you mean by that?
18:33I can imagine, if it's AI content moderation, how are they ever going to piece together the meaning of
18:39it?
18:40Yeah, because I'm just thinking about how human creativity far outstrips any policy, right?
18:46So if you're thinking about how, for example, Gen Zers create new terms or repurpose old terms to have new
18:54meanings almost every week, almost every other day, right?
18:59If you have people sitting in kind of like a policy room deciding what words are red flags, what words
19:05should be considered hateful,
19:09and these are not individuals who are plugged into the culture, then you have a lot of things slipping through
19:13the cracks.
19:13I'm just thinking a very salient example from two years ago was when a business, sorry, when an up-and-coming fast
19:21food company used the term type C.
19:25That would not have been flagged by any moderation AI, or even by a person who doesn't know the context, as
19:32meaning something hateful, right?
19:34It means something discriminatory.
19:35But now it's entered the lexicon of Malaysians.
19:38Malaysians just use the word type and then a letter after that so openly, right?
19:43And that's, I think, how fast culture moves.
19:46It moves faster than policy.
19:48Okay, so we've talked about the problem.
19:51You clearly have been tracking some of these cases.
19:53You know how it does escalate.
19:56Are there things we can do about it?
19:59Are there policy interventions, personal advocacy, personal education that we can use as our resistance to this?
20:11Yeah.
20:12So why don't you begin and talk to me about one of the recommendations in the report, which is recognizing far
20:20-right extremism as a domestic extremist threat.
20:23And this is under the National Action Plan on Preventing and Countering Violent Extremism.
20:27Why is this important?
20:28What would it change in practice?
20:30So what we've been doing at Initiate is tracking, through data collection and evidence, the incidents that
20:38are instigated by far-right extremists and far-right extremist narratives,
20:43and their impact and influence on the ground.
20:49How, again, how it undermines democratic institutions and how it threatens the safety and freedom of vulnerable communities or individuals.
20:58We think that it's important to have it officially recognized as a domestic threat because if you're able to say
21:07it and name it and put it within a framework, then you can begin to be more proactive in countering
21:19it.
21:20Right now, our modus operandi is very reactive: an inflammatory flashpoint happens, violence has taken place, inter-ethnic tensions
21:31have arisen and, you know, polarization continues to grow.
21:35Then we talk about how to address it.
21:37But I think if we are able to add it into the current framework, which is what we're recommending, that is,
21:42to add it into the MyPCVE framework, then you can begin to say, OK, these are the issues.
21:48These are the consequences.
21:50How do we now then work on early warning, on prevention education, right?
21:56How do you build frameworks of, you know, digital and information ecosystem types of education so that you can educate,
22:08you know, different communities on how to be more resilient online and able to identify issues?
22:16But also, you can then build in legal and law enforcement processes on the back of this and say, how
22:22do we actually deal with such issues as they come?
22:26I would love to see that.
22:28I mean, you're right.
22:30A more structured and thoughtful approach, right?
22:34Aziff, we have a few minutes left and I want to ask both of you, in answer to my
22:38question earlier, what message you have for our audience tonight.
22:44You know, what can we do as private citizens in resisting, in fighting, in facing some of these online hate
22:53and violent extremist narratives that we're confronted with?
22:58Yeah, I mean, I always go back to the basics when I think about this,
23:02because essentially a lot of our discourse now is mediated online.
23:06And I think the problem with it being online is also that you don't see the other person on the
23:09other side.
23:10So the development of empathy, or at least the immediate consequences of what you say,
23:18you don't directly experience it, you don't directly see it, right?
23:21And I think this is where, as a private citizen, my recommendation, or at least
23:27what I would love more people to do, is to actually go out and interact with more people.
23:32And I think, because of the way we structure our lives online, not interacting with
23:37each other directly, but mediated through mobile phones, mediated through WhatsApp, through social
23:45media, we kind of lose that touch.
23:47And I think there are several studies that were done during the COVID-19 pandemic as to how
23:54youths being under lockdown also deteriorated their social skills, which means that you then have a generation
23:59of youths who are now maybe a bit more blunt when it comes to their social cues, right?
24:03It's not that they are unable to develop empathy, but the way they communicate is perhaps less about thinking
24:10about the other person, as though they are speaking to a chatbot, right?
24:14Yeah.
24:15And I think that then translates to how we should talk to people.
24:18By engaging face to face with other people, with different communities that you are
24:24uncomfortable with, or with strangers, you start to learn, you need to learn, how to talk to other people.
24:31You need to learn the nuances of how to engage with people socially.
24:35And I think we've lost a lot of that.
24:37Oh my gosh, we need to learn how to be a society.
24:39We need to learn how to be a society again.
24:41Yeah, yeah.
24:41So, Suyin, a last message from you.
24:44It's a big one, isn't it?
24:46It's quite a big one.
24:47I think I'll talk more about your life as a netizen, right?
24:52Your online life as well.
24:53And for today's generation, your physical, offline life is as real as the one online,
25:01right?
25:01And so, it's also about how do you become a good citizen and a good neighbor in the digital space
25:07that you are in, right?
25:09And my two recommendations would be just never read the comment section and never contribute to it, right?
25:15Half the problems of the world would probably be solved by that.
25:19But it's also about thinking more critically, not just about the content that you're consuming, but how did the content
25:26arrive in your feed, right?
25:29What's being sold to you, what the algorithm is pushing, what type of narrative, and why?
25:35Why is this content on my feed at this time, right?
25:38And so, being a more informed, you know, digital citizen, having better digital hygiene in terms of understanding how the
25:46broadcasting messaging network works on your platform of choice, as it were.
25:52And also understanding that, you know, a lot of the content and a lot of the commenters on these contents
25:58online may not necessarily be real.
26:02Or they may not be acting in good faith, right?
26:04There are also actors who are engaging for the sake of engagement, because you can farm engagement for profit.
26:11You can be inflammatory for profit.
26:15So, I think if you really look at your online life through the lens of, like, how is somebody else profiting from my
26:21emotional connection and investment in this piece of content, I think we'll all be much better for it.
26:27Absolutely.
26:28Both of you, thank you so much.
26:29This report was fascinating to read.
26:32Thank you for the hard work that you put into putting this out.
26:35Thank you for having us.
26:36That's all the time we have for you on this episode of Consider This.
26:39I'm Melisa Idris signing off for the evening.
26:41Thank you so much for watching.
26:42Good night.
26:42Good night.
26:44Good night.