Transcript
00:00OpenAI reveals how it's battling scammers, spies, and sadbots.
00:05OpenAI just dropped a new report that reads like a mix between a cybersecurity thriller
00:09and a corporate therapy session.
00:12The company detailed how it's fighting off everything from cybercriminals to government-backed
00:16influence campaigns, all while trying not to freak out regular users worried about privacy
00:21or chatbot overreach.
00:23Since February 2024, OpenAI says it has shut down over 40 networks that tried to misuse
00:29its models.
00:30The villains of this story?
00:32Scammers, hackers, and the occasional geopolitical puppet master.
00:36One highlighted case involved a Cambodian crime group using AI to streamline operations.
00:43Another saw Russian actors using ChatGPT to generate prompts for deepfake videos.
00:48And then there were accounts tied to the Chinese government, reportedly using the models to
00:52brainstorm social media monitoring systems.
00:56The company says it monitors patterns of threat-actor behavior, not random one-off messages,
01:00in order to avoid disrupting normal use.
01:03Beyond the usual data and disinformation worries, there's growing concern about AI's psychological
01:08impact.
01:09A handful of tragic cases this year, including suicides and a murder-suicide in Connecticut,
01:14have reportedly involved AI conversations gone wrong.
01:18In response, OpenAI says ChatGPT is trained to detect when someone expresses a desire to self-harm
01:24or harm others.
01:25Instead of responding directly, the AI will acknowledge the distress and try to guide the
01:31user toward real-world help.
01:32If someone seems to pose a serious threat to others, human reviewers can step in and, if necessary,
01:39contact law enforcement.
01:40So, while OpenAI's latest report shows it's taking threats seriously, it also reminds us
01:46that teaching AI to be both safe and sensitive might be one of tech's hardest balancing acts
01:51yet.