00:00 Mr. Zuckerberg, let me start with you.
00:01 Did I hear you say in your opening statement
00:04 that there's no link between mental health
00:06 and social media use?
00:09 Senator, what I said is I think it's important to look
00:11 at the science.
00:12 I know people widely talk about this
00:14 as if that is something that's already been proven.
00:17 And I think that the bulk of the scientific evidence
00:19 does not support that.
00:20 Well, really, let me just remind you
00:23 of some of the science from your own company.
00:25 Instagram studied the effect of your platform on teenagers.
00:30 Let me just read you some quotes from The Wall Street Journal's
00:32 report on this.
00:33 Company researchers found that Instagram
00:35 is harmful for a sizable percentage of teenagers,
00:38 most notably teenage girls.
00:40 Here's a quote from your own study.
00:42 "We make body image issues worse for one in three teen girls."
00:47 Here's another quote.
00:48 "Teens blamed Instagram"-- this is your study--
00:50 "for increases in the rate of anxiety and depression.
00:54 This reaction was unprompted and consistent across all groups."
00:58 That's your study.
01:00 Senator, we try to understand the feedback
01:03 and how people feel about the services.
01:05 We can improve--
01:06 Wait a minute.
01:06 Your own study says that you make life worse for one
01:11 in three teenage girls.
01:12 You increase anxiety and depression.
01:14 That's what it says.
01:15 And you're here testifying to us in public that there's no link.
01:18 You've been doing this for years.
01:20 For years, you've been coming in public
01:21 and testifying under oath that there's absolutely no link.
01:24 Your product is wonderful.
01:26 The science is nascent.
01:27 Full speed ahead.
01:28 While internally, you know full well your product
01:31 is a disaster for teenagers.
01:34 Senator, that's not true.
01:35 And yet you keep right on doing what you're doing.
01:37 Right?
01:37 That's not true.
01:38 That's not true.
01:38 Let me show you some other facts I
01:40 know that you're familiar with.
01:42 You can carry the data points if you want.
01:43 That's not a question.
01:44 That's not a question.
01:45 Those are facts, Mr. Zuckerberg.
01:47 That's not a question.
01:48 Those aren't facts.
01:48 Let me show you some more facts.
01:50 Here are some-- here's some information
01:52 from a whistleblower who came before the Senate,
01:54 testified under oath in public.
01:55 He worked for you.
01:56 He's a senior executive.
01:58 Here's what he showed he found when he studied your products.
02:02 So for example, this is girls between the ages
02:05 of 13 and 15 years old.
02:06 37% of them reported that they had
02:10 been exposed to nudity on the platform,
02:12 unwanted, in the last seven days.
02:14 24% said that they had experienced
02:18 unwanted sexual advances.
02:19 They've been propositioned in the last seven days.
02:22 17% said they had encountered self-harm content pushed
02:26 at them in the last seven days.
02:29 Now, I know you're familiar with these stats
02:31 because he sent you an email where he lined it all out.
02:34 I mean, we've got a copy of it right here.
02:38 My question is, who did you fire for this?
02:42 Who got fired because of that?
02:44 Senator, we study all this because it's important
02:46 and we want to improve our services.
02:47 Well, you just told me a second ago
02:48 that you studied it, but there was no linkage.
02:51 Who did you fire?
02:52 I said you mischaracterized it.
02:54 37% of teenage girls between 13 and 15
02:58 were exposed to unwanted nudity in a week on Instagram.
03:01 You knew about it.
03:02 Who did you fire?
03:03 Senator, this is why we're building all these--
03:05 Who did you fire?
03:07 Senator, I don't think that that's--
03:09 Who did you fire?
03:10 I'm not going to answer that.
03:12 Because you didn't fire anybody, right?
03:15 You didn't take any significant action.
03:16 I don't think it's appropriate to talk
03:20 about individual HR decisions--
03:21 Do you know who's sitting behind you?
03:23 You've got families from across the nation whose children
03:27 are either severely harmed or gone,
03:30 and you don't think it's appropriate to talk
03:32 about steps that you took?
03:33 The fact that you didn't fire a single person?
03:35 Let me ask you this.
03:36 Let me ask you this.
03:37 Have you compensated any of the victims?
03:39 Sorry?
03:39 Have you compensated any of the victims?
03:42 These girls, have you compensated them?
03:44 I don't believe so.
03:46 Why not?
03:49 Don't you think they deserve some compensation
03:51 for what your platform has done?
03:53 Help with counseling services?
03:54 Help with dealing with the issues
03:57 that your service has caused?
03:58 Our job is to make sure that we build tools
04:01 to help keep people safe.
04:02 Are you going to compensate them?
04:04 Senator, our job, and what we take seriously,
04:06 is making sure that we build industry-leading tools
04:09 to find harmful content--
04:10 To make money.
04:11 --take it off the services--
04:12 To make money.
04:13 --and to build tools that empower parents--
04:15 So you didn't take any action.
04:16 You didn't take any action.
04:17 You didn't fire anybody.
04:18 You haven't compensated a single victim.
04:20 Let me ask you this.
04:21 Let me ask you this.
04:22 There's families of victims here today.
04:23 Have you apologized to the victims?
04:27 Would you like to do so now?
04:29 They're here.
04:29 You're on national television.
04:31 Would you like now to apologize to the victims who
04:33 have been harmed by your product?
04:35 Show them the pictures.
04:37 Would you like to apologize for what you've
04:38 done to these good people?
04:40 [APPLAUSE]
04:43 I'm sorry for the situation that we're in.
04:45 It's really terrible.
04:46 And no one should have to go through the things
04:48 that your families have suffered.
04:51 And this is why we invested so much in our own team,
04:54 doing industry-leading efforts to make sure
04:57 that no one has to go through the types of things
05:00 that your families have had to suffer.
05:02 You know, why, Mr. Zuckerberg, why should your company not
05:11 be sued for this?
05:13 Why is it that you can claim-- you
05:15 hide behind a liability shield.
05:17 You can't be held accountable.
05:19 Shouldn't you be held accountable personally?
05:20 Will you take personal responsibility?
05:22 Senator, I think I've already answered this.
05:25 I mean, this is--
05:25 Well, try us again.
05:26 Will you take personal responsibility?
05:29 Senator, I view my job and the job of our company
05:32 as building the best tools that we can
05:34 to keep our community safe.
05:36 Well, you're failing at that.
05:37 Well, Senator, we're doing an industry-leading effort.
05:39 We build AI tools that--
05:40 Oh, nonsense.
05:41 Your product is killing people.
05:42 Will you personally commit to compensating the victims?
05:44 You're a billionaire.
05:45 Will you commit to compensating the victims?
05:47 Will you set up a compensation fund with your money?
05:49 Senator, I think these are--
05:50 With your money.
05:51 Senator, these are complicated issues.
05:52 No, that's not a complicated question, though.
05:54 That's a yes or no.
05:55 Will you set up a victims' compensation fund
05:57 with your money, the money you made on these families
06:00 sitting behind you, yes or no?
06:02 Senator, I don't think that that's--
06:04 my job is to make sure we build good tools.
06:06 My job is to make sure that--
06:08 Your job is to be responsible for what your company has done.
06:11 You've made billions of dollars on the people sitting
06:14 behind you here.
06:15 You've done nothing to help them.
06:16 You've done nothing to compensate them.
06:19 You've done nothing to put it right.
06:20 You could do so here today, and you should.
06:22 You should, Mr. Zuckerberg.