00:00That person should be banned from all platforms, thrown into jail.
00:03It was less like a shot, like fire, then it went out on TikTok.
00:08I don't know if it's a Singaporean attitude, like, don't know, don't care.
00:11They don't even know the full story, and then they say things that affect my family.
00:16Cyberbullying, sexual content, content inciting racial or religious tension, and violent content.
00:23These were the most common types of harmful content on designated social media services,
00:28based on the annual online safety poll done by the Ministry of Digital Development and Information.
00:34A series of measures has already been introduced to address online harms,
00:38such as the Online Criminal Harms Act,
00:41the Online Safety (Miscellaneous Amendments) Act 2022,
00:44and the Protection from Harassment Act 2014.
00:48Various public education efforts are also ongoing.
00:51With the growing prevalence of online harms,
00:53we wanted to find out if people thought that these efforts were sufficient
00:56in encouraging a safer online environment for all.
01:00AsiaOne recently did a survey on online harms,
01:02which gathered 1,208 responses.
01:06Almost half of the respondents recounted either experiencing online harm,
01:10or personally knowing someone who has.
01:12Of the people who have experienced online harm,
01:1560% reported having been impacted to a large extent,
01:18either emotionally or psychologically.
01:21When asked under what circumstances respondents would take legal action
01:25against a perpetrator of online harms,
01:2786% highlighted that they would do so at least under some circumstances.
01:32The remaining 14% of respondents cited costs,
01:34complex legal processes,
01:36and emotional strain as the main reasons for not taking legal action.
01:40Proposed efforts by the government to protect its citizens,
01:43which include new legislation, as well as the setting up of a new agency,
01:47have been well received, with 84% of respondents indicating that these plans
01:51will help to deter potential perpetrators, at least to some extent.
01:55Out of all the new measures, the most well received was allowing victims of online harm,
01:59who have filed a complaint with the agency,
02:02to apply for the disclosure of a perpetrator's user information for specified purposes.
02:07To expand on our findings, we took to the streets to ask whether Singaporeans felt the need
02:11for better protection against online harms, and whether the new measures
02:14set to be introduced would be beneficial.
02:16Yes, I have multiple friends who have experienced this.
02:20They have experienced anonymous accounts contacting them,
02:22and harassing them, slut-shaming them.
02:25Yes, her.
02:26Yes, it's me.
02:28I have someone that I knew that has experienced it before.
02:32Like, they were sending very inappropriate comments about like,
02:35for example, their outfits, body type-wise, and so on.
02:40For me, not personally.
02:41I don't think any of us really have experienced any online harms.
02:45I think by the comments, like, they say things that weren't true on my end,
02:51and then they say things that kind of like, affect my family, so yeah.
02:55She felt really like, exposed, because all the photos that she usually posts
03:01are not like, inappropriate.
03:03It's like tank tops, but like, casual wear everyday.
03:07She didn't expect it to be in the eyes of something sexual at the same time.
03:11So, I think she felt very like,
03:14ooh.
03:15Self-esteem really went down, and her happiness wasn't really there anymore.
03:20So, and she also doesn't talk in detail to me.
03:22She always keeps it to herself.
03:23So, it was quite impactful to her to say this.
03:26Previously, they did, they tried, but the platform didn't really do like, much about it.
03:34So, the most, it's like, the account gets reported, but it doesn't really get banned, banned.
03:39So, that particular person decided to like, target others instead.
03:44I feel like, there's not much I can do.
03:48As much as I want to report it, there's not much follow-up, and
03:51at that point, it's mostly damage control, so.
03:54I feel like I won't really report it, because, to be fair, some of the platforms,
04:00what's the most they can really do?
04:01Like, for example, like my friend said, remove a comment.
04:04That's the most they'll do.
04:05It really needs that human aspect to it to look through and go through, because,
04:12it's a lot of grey areas, you never know what's really hurtful, and what's really just for jokes.
04:20For a safer one, filtering of comments.
04:22Filtering out explicit contents for, especially the younger generation.
04:28Definitely, like, you do not know how much a person goes through, because you're not the person.
04:34But the poor person who has experienced what you thought was maybe a joke, or for fun,
04:40had a lot of trauma after that.
04:42Of course, I think they should face consequences.
04:45Yeah, for sure, I feel that people are getting very comfortable online.
04:49I think it would still happen.
04:50So, I think it's like, it's not that big of an impact yet,
04:53that it's able to stop the harms that are happening online.
04:57Considering my friends are still going through this, it's not, it's definitely not enough.
05:04It's a, I feel it's a good initiative, that more can be done for the internet,
05:09and to set some boundaries.
05:11Because right now on the internet, everything is kind of grey.
05:15You can kind of do whatever you want.
05:18But, with this initiative, I think there's lines being drawn that cannot be crossed.
05:24You know what I mean?
05:25It might be half-half, because as much as they're trying to stop this,
05:30people might not really want to open up to the agency as well,
05:33and they will want to face it themselves.
05:36So, I would say it's just really 50-50, because they really have to be strong enough
05:40to talk about this to the agency or the people itself.
05:43I 100% think it would be good to help victims that are experiencing this kind of things,
05:50than online platforms not doing their jobs properly.
05:55I would say so, I would say so actually.
05:57Because if you have a, if you have a specialised part of the government to help tackle all this,
06:05and you have specialists who do this kind of stuff, I feel like they will be more understood.
06:11So, I feel like it's a great move.
05:11Of the people we interviewed, 9 out of 10 respondents have not reported,
06:17or would not report, the online harms they have faced
06:19to the respective social media platforms, as they found it ineffective.
06:23The main reasons respondents felt that reporting online harms to the relevant platforms was futile
06:29included not receiving follow-ups after reports were made,
06:33platforms removing only some of the harmful comments, and offending accounts not being banned.
06:38Respondents also said that social media platforms should continue filtering harmful content
06:43and explicit comments to foster a safer online space.
06:46Some suggested not just relying on the platforms, but instead,
06:50working on educating the public on what sort of behaviour constitutes online harm,
06:54and the steps to take should they be subjected to online harms.
06:58All of the respondents agreed that those responsible for online harms
07:01should face consequences, and that a temporary ban from social media platforms is not enough.
07:07Slightly more than half of the respondents felt that the current measures in Singapore are sufficient,
07:11but building on and strengthening our laws as technology continues to develop should be our focus.
07:17Respondents also reacted positively towards the proposed new measures by the government,
07:22and the establishment of a new agency to enhance online safety,
07:26adding that providing more timely and effective relief,
07:29and having an agency dedicated to assisting victims of online harms,
07:33are good initiatives to make the internet a safer place.
07:36So, what do you think?
07:42You can help us