00:00The spread of misinformation is the biggest risk to humanity over the next two years, according to the World Economic Forum.
00:07So, what does this mean? And how do we, as busy internet users, safeguard ourselves from the deluge of misleading, manipulative and just plain wrong information online?
00:19Most of the information out there, especially on social media, hasn't been reviewed or held to editorial standards before it's published.
00:28Social media networks and search engines weren't created for accuracy or to help civil society.
00:34They are big global businesses set up to make as much money as possible.
00:38And they do this by showing users lots and lots of paid ads.
00:43So, to them, it doesn't matter if the information is factually correct or balanced or ethical.
00:49It just has to engage you so you keep coming back and see more ads.
00:54The first step to fact checking on the go is to check the source of the information.
00:59When something piques your interest on your feed, check who's sharing it and where the post or link originated.
01:05Is the creator identified, reliable and trustworthy?
01:09Are they pushing an agenda or trying to scam you?
01:12Is it attached to a reputable website?
01:15Misinformation thrives on emotion, especially negative emotions like fear and anger.
01:22So, try to step back from that and view any information or media you come across critically.
01:28Check website addresses too, and look at the About, Contact and Privacy sections to see who's behind the information.
01:36On social media, avoid sharing or reacting to posts from anonymous accounts: you don't know their background, so you can't make a judgment about what they're saying.
01:46Click on account bios to check.
01:49Google them if you're not sure.
01:51When you're trying to check whether something is true, look for subject matter experts.
01:55It's no use getting advice from a physicist, say, about celiac disease or asking a social scientist about the changing climate.
02:04Finally, look for the consensus of evidence.
02:07So, a lone voice claiming something when 97% of other people in the field disagree may not be reliable.
02:15Find sources you can trust, like government agencies, not necessarily politicians with a political agenda, but departments and that kind of thing.
02:24Reputable news outlets, subject matter experts at universities.
02:28All these sources review information before it's published and they hold it to standards of accuracy and balance and ethics.
02:37Once you've identified the source of the information and determined that it does come from a legitimate group or person, it can help to then do some lateral reading.
02:46Check other expert sources before you respond and make sure you're getting all the facts before you make a decision on the information you're looking at.
02:54For example, do you remember seeing this image during the catastrophic fires in California?
03:00It was shared by several Hollywood A-listers, including reportedly Isabella Rossellini and Robert Redford,
03:06and it seemed to convey perfectly the horror of the fires that had devastated the Hollywood Hills just days after the 2025 Golden Globes on January 5.
03:16But the image was a fake.
03:19Using reverse image search tools, including TinEye, RevEye, and Google's search by image function,
03:25users can quickly see that this image was most likely made using artificial intelligence and wasn't a real photo.
03:34Artificial intelligence is ushering in a new era of concern.
03:38Deepfake videos have been around for years, but with AI it's become much easier to make these sophisticated fake videos.
03:46These kinds of videos could undermine democracy and sway elections by convincing voters that candidates have done or said something that never really happened.
03:56This is going so well. It's going exactly how I'd always dreamed.
04:03It's pretty simple. Misinformation can actually be incredibly dangerous.
04:08We know that it can affect public health. It can affect responses to disasters like bushfires and floods.
04:16It can affect elections and civic engagement, trust in institutions, democracy as a whole and public policy.
04:25And of course, it can affect individuals in a really big way.
04:30What it comes down to is we need facts and reliable information to function properly as a society.
04:37All throughout this video, we've been speaking to media literacy legend, Saffron Howden.
04:42You've heard the way she speaks and you've seen some of her mannerisms.
04:47It may not surprise you to know, however, that Saffron is not actually fluent in every world language.
04:59No, we used freely available AI technology to make this video.
05:08Pay attention to the eyes in the AI generated video and the voice.
05:12In both cases, there's something slightly unnatural about it.
05:16Now I can give it any script I want and my clone will say what I want it to say.
05:24If you can, reduce the speed of the suspected deepfake; that way it's easier to spot any abnormalities popping up in the frames.
05:32If the video depicts a celebrity or politician, you can do a little bit of research to see whether that video and what's being said in it has been reported widely by news organizations.
05:42But in most cases, it pays to just take a minute to examine what you're looking at before you respond to it.
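The slow-down tip above can be sketched in a few lines of Python (a hypothetical helper, not a tool named in the video; real players and editors like VLC or ffmpeg offer playback-speed controls that do this for you). The idea is simply that playing a clip at half speed doubles how long each frame stays on screen, giving your eye more time to catch frame-level artifacts.

```python
def slow_down(timestamps, factor=0.5):
    """Rescale frame timestamps so a clip plays at `factor` of normal speed.

    At factor=0.5 (half speed), each frame is displayed for twice as long,
    which makes frame-by-frame deepfake glitches easier to spot.
    """
    if not 0 < factor <= 1:
        raise ValueError("factor must be in (0, 1]")
    return [t / factor for t in timestamps]

# A 25 fps clip has frames at 0.00, 0.04, 0.08 ... seconds.
original = [i / 25 for i in range(5)]
slowed = slow_down(original, factor=0.5)  # frames now 0.08 s apart
```

This is only an illustration of the principle; in practice you would use your video player's speed setting rather than writing code.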
05:48Remember, anyone with a digital device and an internet connection can now publish information.
05:56So we need to be really careful with what we're taking as fact or true.