00:00While Meta is still defending itself in that high-profile trial about teen user mental health,
00:05the company's platform Instagram has unveiled a new system to help alert parents when their
00:10children search for concerning topics on the app. When an underage user repeatedly searches for
00:14self-harm terms within a short time frame, the company will notify parents about that behavior.
00:20These features will be available in the parental supervision tool in the app starting next week.
00:25In a blog post announcing the new feature, the company says it will flag, quote,
00:29phrases that suggest a teen wants to harm themselves and terms like suicide or self-harm.
00:34The parental notification will link to resources that can help parents deal with this
00:40issue. Meta also said it will roll out similar features in its AI experiences if, say,
00:46kids use those terms in conversations with chatbots.