00:00So YouTube just walked into a courtroom and said we're not addictive. We're not even social media.
00:05If you don't know what's going on right now, there's a huge trial happening where a 20-year-old
00:09woman is suing YouTube and Instagram saying these apps were basically digital slot machines that
00:14hooked her as a kid and wrecked her mental health. And YouTube's lawyers are like, whoa, whoa, whoa,
00:19we're not Instagram or TikTok, okay? We're basically Netflix with a comment section. People
00:23come here to learn, cook, fix their sink, or watch Ariana Grande, not doom scroll. They even said
00:28their algorithm isn't trying to rewrite your brain, it's just asking what you like. Meanwhile,
00:33the plaintiff's lawyer pulled out internal Google Docs where employees literally called their features
00:38slot machines. Awkward. So what's Instagram's defense? Well, they say her mental health issues
00:43come from her family problems, not the apps. And that her therapy notes, which were entered as
00:48evidence, didn't mention social media addiction at all. Now this case is the first of thousands. If
00:54she wins, courts could force apps to redesign features like infinite scroll or pay out massive
00:59damages. If she loses, tech companies could get a huge shield from future cases just like this.
01:05Now generally, critics say that these apps are engineered to keep kids hooked. And the supporters
01:10say there's zero proof they cause any addiction and millions use them just fine. So with all that being
01:15said, here's the big question. If an app is designed to keep you engaged, but you still choose to use
01:21it,
01:21who's actually responsible? The platform, the user, or both? Drop your answer in the comments,
01:27check out our website, and follow us here for more.