Transcript
00:00 Meta finally tightens AI chatbot rules to protect kids.
00:05 Meta's AI chatbots are getting a stricter rulebook, and this one reads less like a tech
00:09 manual and more like a digital babysitter's handbook.
00:13 Business Insider got its hands on the new internal guidelines Meta contractors are using
00:17 to train the company's chatbots, revealing how the social media giant is trying to keep
00:22 its AI on the right side of child safety concerns.
00:25 Back in August, Reuters reported that Meta's policies allegedly let its AI bots engage
00:31 a child in conversations that are romantic or sensual.
00:35 Meta called that claim erroneous and inconsistent with its rules and quickly scrubbed the offending
00:40 language.
00:41 Still, the damage was done, and regulators, including the FTC, started looking a lot more
00:46 closely at companion AIs across the industry.
00:49 Now the leaked training document lays out exactly what the bots can and cannot say.
00:54 The rules are blunt.
00:56 No content that enables, encourages, or endorses child sexual abuse.
01:01 No romantic role play if the user is a minor, or if the AI is asked to act like a minor.
01:06 No advice about potentially romantic or intimate physical contact with minors, period.
01:12 The bots can still discuss heavy topics like abuse in an informational way, but anything that
01:17 could be seen as flirtatious or enabling?
01:20 Hard stop.
01:21 The FTC's inquiry covers other big names: Alphabet, Snap, OpenAI, and X.AI, all of which
01:28 are being asked how they're protecting kids from chatbot creepiness.
01:32 But Meta, with its sprawling user base and recent AI push, has become the poster child for how
01:37 messy this can get.
01:38 In other words, Meta's chatbots are learning some very strict manners.
01:43 Whether these new guardrails will satisfy regulators or prevent another headline-grabbing slip remains
01:48 to be seen.
01:50 For now at least, the bots are officially on a no-flirting-no-funny-business diet.