00:00 A landmark U.S. court ruling this week has found Google and Meta liable for the dangerous design of their platforms, citing a failure to warn users of addiction risk.
00:12 In a major shift, the jury focused not on content, but on the algorithms themselves.
00:18 How will this ruling rewrite the rules for big tech?
00:22 Oh, I think it sets a precedent, very much so.
00:25 I mean, people always ask, well, how can we pass laws that keep up with technology?
00:30 And the bottom line is we don't need new laws.
00:32 This is a simple negligence case.
00:33 Did you design your product in a way that harms individuals?
00:37 And now that that's been established, the facts that came out at this trial will be picked up by other attorneys: if they haven't already gotten their hands on them in other litigation, they certainly will now.
00:46 So you can assume it's going to lead to more litigation, and probably a lot more settlements.
00:51 Well, that distinction matters because Section 230, which is a federal law, basically says: listen, the content that's put on social media is not the responsibility of Meta or any of these companies.
01:03 But what is their responsibility is the manner and method by which they design their algorithms to show you that content.
01:11 And that is a unilateral choice they make in the design of their products.
01:16 And that's why they were found liable here and did not have immunity under Section 230.
01:20 I think this incentivizes legislators to be on the side of the people and figure out other ways to regulate tech companies, to ensure they don't do further harm to children or, quite frankly, to society itself.
01:32 I think this is a good idea.