On Wednesday, the House Energy & Commerce Committee held a legislative hearing on a “Proposal to Sunset Section 230 of the Communications Decency Act.”

Transcript
00:00:00 Good morning. The subcommittee will come to order. The Chair recognizes himself for an opening statement.
00:00:04 Good morning and welcome to today's legislative hearing to discuss sunsetting Section 230 of the
00:00:11 Communications Decency Act. Since 1996, Section 230 protections have allowed the U.S.
00:00:17 tech industry to flourish. This legal framework emboldened Americans to pioneer creating internet
00:00:25 and social media platforms for innovation, user content, and social media interaction. Its intent
00:00:32 was to provide online platforms immunity from liability for content posted by third-party users.
00:00:38 But as the internet exploded in growth, it also increased challenges that were not contemplated
00:00:43 when the law passed in 1996. Section 230 must be reformed. As we heard in our last hearing on this
00:00:50 topic, the current online ecosystem is flawed. Many of these platforms are rife with content such as
00:00:57 online sex trafficking, narcotics, child pornography, and other illicit crimes. In
00:01:03 response, big tech platforms hide behind Section 230's broad immunity. In that process, courts have
00:01:10 rewarded their destructive behavior. We need to reform Section 230 to hold platforms accountable
00:01:16 for the role they play in facilitating and enabling harmful behavior. But in doing so,
00:01:22 Congress must be thoughtful and deliberative. There is no silver bullet to fix this issue.
00:01:26 Some argue that amending or repealing Section 230 violates the First Amendment rights of those
00:01:33 platforms to host the content they so choose. Yet no industry has complete protection from
00:01:39 all liability for harm it causes. Newspapers, broadcasters, fundamental mediums that exemplify
00:01:45 our First Amendment rights are subject to publisher liability or can be sued for defamation.
00:01:49 Over the past several Congresses, there have been numerous proposals to hold big tech accountable
00:01:56 for when it acts as a publisher in moderating content on its platforms but to no avail.
00:02:01 Which is why today we are reviewing a discussion draft that will sunset Section 230 of the
00:02:06 Communications Act of 1934 effective December 31, 2025. I hope this legislation will bring
00:02:14 people together, including those who support, oppose, or are interested to carefully discuss
00:02:20 Section 230 reforms. One thing is certain, big tech's behavior has brought Republicans and
00:02:25 Democrats together on a commitment to find a long-term solution to reform Section 230.
00:02:30 Congress has a monumental task ahead, though we must reform the law in a way that will protect
00:02:36 innovation, promote free speech, and allow big tech to moderate indecent and illegal content
00:02:42 on its platforms and be accountable to the American people.
00:02:46 I look forward to our discussion today and to working with my colleagues on a broader discussion
00:02:51 about purposeful reforms of Section 230. It's up to Congress, not the courts, to reform Section 230.
00:02:57 And changes to this law are long overdue. And with that, I will yield back the balance of my time.
00:03:04 And before I recognize the Ranking Member of the Subcommittee, the gentlelady
00:03:07 from the 7th District of California, I just want to mention to our witnesses that we
00:03:11 do have three subcommittees of Energy and Commerce running today. Health starts at 10:30,
00:03:16 so you're going to see members leaving the committee and coming back. It's not
00:03:20 that they don't want to hear your testimony, but they have to be back and forth in committee.
00:03:24 So I just want to mention that before we get started. And also, if I just take a point of
00:03:28 personal privilege, recognize our Chair's birthday today. Well, happy birthday.
00:03:40 And with that, I now recognize the Ranking Member of the Subcommittee, the gentlelady from the
00:03:44 7th District of California, for her opening statement. Thank you very much, Mr. Chairman.
00:03:49 Of the many critical issues within the subcommittee's jurisdictions, few are more
00:03:54 consequential than Section 230. In both positive and negative ways, Section 230 of the Communications
00:04:00 Decency Act has shaped our online ecosystem. Regularly referred to as the 26 words that
00:04:07 created the internet, Section 230 established vital protections that allowed the internet to
00:04:13 flourish in its early days. Without 230, it's unlikely we'd have the vibrant internet ecosystem
00:04:20 we all enjoy. And yet, despite this, it's clear that Section 230 as it exists today isn't working.
00:04:27 The status quo simply is not viable. As with any powerful tool, the boundaries of Section
00:04:34 230 have proven difficult to delineate, making it susceptible to regular misuse. The broad shield
00:04:42 it offers can serve as a haven for harmful content, disinformation, and online harassment.
00:04:49 This has raised significant concerns about the balance between protecting freedom of expression
00:04:55 and ensuring accountability for the platforms that host this content. These concerns aren't
00:05:02 abstract. From documented attempts to interfere with our elections to the harm this content is
00:05:08 inflicting on America's young people, the unchecked immunity of 230 has consequences.
00:05:15 And we know many online platforms aren't simply hosting this content.
00:05:21 They're actively amplifying it to reach more viewers. Most large social media platforms
00:05:26 are designed from the ground up to promote constant engagement rather than healthy interactions.
00:05:34 That means pushing harmful and misleading information on some of the most vulnerable
00:05:39 users with ruthless effectiveness. Young women are constantly subjected to unhealthy or untested
00:05:47 diets. Suicidal material is foisted on those seeking mental health care. And recent elections
00:05:54 show the ongoing danger of targeted disinformation. So it should be clear to all that the role of Section
00:06:02 230 needs immediate scrutiny because as it exists today, it is just not working.
00:06:08 To date, congressional efforts to make needed reforms have come up short, but we can't give up.
00:06:16 The stakes are just too high. But it's also for that reason that we must be intentional,
00:06:23 thoughtful and deliberative in our attempts to update Section 230. Any reforms we implement
00:06:29 should create a meaningful incentive for online platforms to own the outcomes of the products
00:06:35 they're designing in a way they don't currently. It's well understood that many of these platforms
00:06:41 are knowingly amplifying harmful content. There can and should be consequences for that.
00:06:49 Until the fundamental dynamic changes, we can't expect to achieve the safer online experience
00:06:56 we all want. And we need reforms that allow for better enforcement of civil rights laws
00:07:02 to prevent some of the most upsetting discriminatory behavior online. But while
00:07:08 there's widespread bipartisan consensus in Congress that 230 needs to be modernized,
00:07:14 the process for getting there remains unclear. That's why I believe the process is just as
00:07:20 important as the product. We need a thoughtful process that allows for nuanced conversations
00:07:26 on difficult issues. I'm ready to begin that work. And with that, I yield back the balance of my time.
00:07:33 Thank you very much. The gentlelady yields back the balance of her time. The chair now recognizes
00:07:38 the gentlelady from Washington, the chair of the full Committee on Energy and Commerce, for five
00:07:43 minutes for her opening statement. Good morning and thank you, Chairman Latta. Ranking Member
00:07:49 Pallone and I recently unveiled bipartisan draft legislation to sunset Section 230 of the
00:07:55 Communications Decency Act. As written, Section 230 was originally intended to protect internet
00:08:03 service providers from being held liable for content posted by a third-party user or for
00:08:09 removing truly horrific or illegal content. The intent was to make the internet a safe place
00:08:15 and allow companies to remove harmful content in good faith without being held liable for doing so.
00:08:21 However, the internet has changed dramatically since then. Over five billion people around the
00:08:28 world use social media, with the average person spending more than two hours a day on social media.
00:08:35 The internet has become vital for people to connect, work, find information, and make a living.
00:08:40 Big tech is exploiting this to profit off us and use the information we share to develop
00:08:48 addictive algorithms that push content onto our feeds. At the same time, they refuse to
00:08:54 strengthen their platform's protections against predators, drug dealers, sex traffickers,
00:09:00 extortioners, and cyber bullies. Our children are the ones paying the greatest price. They are
00:09:07 developing addictive and dangerous habits, often at the expense of their mental health.
00:09:11 Big tech has failed to uphold American values and be good stewards of the content they host.
00:09:19 It's been nearly three decades since Section 230 was enacted. And the reality is many of these
00:09:26 companies didn't even exist when the law was written, and we could not comprehend the full
00:09:32 effect of the internet's capabilities. It is past time for Congress to reevaluate Section 230.
00:09:39 In recent years, U.S. courts have expanded the meaning of what Congress originally intended
00:09:46 for this law, interpreting Section 230 in a way that gives big tech companies nearly unlimited
00:09:54 immunity from legal consequences. These blanket protections have resulted in tech firms
00:10:01 operating without transparency or accountability for how they manage their platforms and harm users.
00:10:07 This means that a social media company, for example, can't easily be held responsible if it
00:10:14 promotes, amplifies, or makes money from selling drugs, illegal weapons, or other illicit content
00:10:22 through their posts. As more and more companies integrate generative artificial intelligence
00:10:29 technologies into their platforms, these harms will only get worse, and AI will redefine what
00:10:37 it means to be a publisher, potentially creating new legal challenges for companies.
00:10:41 As long as the status quo prevails, big tech has no incentive to change the way they operate,
00:10:49 and they will continue putting profits ahead of the mental health of our society and youth.
00:10:54 Reforming Section 230 and holding big tech accountable has been a long-time priority
00:11:01 of mine and Ranking Member Pallone. Last Congress, we both introduced our own legislation to reform
00:11:07 the decades-old law. Unfortunately, tech companies did not engage with us in meaningful ways
00:11:14 and offered no solutions or reforms. Ultimately, no solutions or reforms were made, and big tech
00:11:20 is satisfied with the status quo, so much so that they've become masters at deception, distraction,
00:11:27 and hiding behind others in order to keep Section 230 unchanged. That's why we're taking
00:11:34 bipartisan action now. Our discussion draft will bring Congress and stakeholders to the table
00:11:39 to work in good faith to create a solution that ensures accountability, protects innovation and
00:11:45 free speech, and requires companies to be good stewards of their platforms. Let me be clear,
00:11:51 our goal is not for Section 230 to disappear, but the reality is that nearly 25 bills to amend
00:12:00 Section 230 have been introduced over the last two Congresses. Many of these were good-faith
00:12:05 attempts to reform the law, and big tech lobbied to kill them every time. These companies left us
00:12:11 with no other option. By enacting this legislation, we will force Congress to act. It is long past
00:12:18 time to hold these companies accountable. The shield of Section 230 should be there to protect
00:12:25 the American people, not big tech. I'm hopeful that this legislation is the start of an opportunity
00:12:32 to work in a bipartisan way to achieve that goal. It is vital that we develop solutions to restore
00:12:38 people's free speech, identity, and safety online, while also continuing to encourage innovation.
00:12:47 That's the American way. I look forward to hearing from the witnesses today, and I yield back.
00:12:51 Thank you. The gentlelady yields back. The Chair now recognizes the gentleman from New Jersey,
00:12:56 the Ranking Member of the full committee, for five minutes for an opening statement.
00:13:00 Thank you, Mr. Chairman. Today, we continue the committee's work of holding big tech
00:13:04 accountable by discussing draft legislation that Chair Rogers and I circulated that would sunset
00:13:10 Section 230 of the Communications Decency Act at the end of 2025. While I believe that Section
00:13:17 230 has outlived its usefulness and has played an outsized role in creating today's
00:13:24 profits-over-people internet, a sunset gives us time to have a serious conversation about what concepts are worth
00:13:29 keeping. Section 230 was codified nearly 30 years ago as a good Samaritan statute designed to allow
00:13:36 websites to restrict harmful content. While it was intended to be just one part of the Communications
00:13:43 Decency Act, it was almost immediately left to exist on its own when most of that act was deemed
00:13:48 to be unconstitutional. Section 230 was written when the internet largely consisted of simple
00:13:53 websites and electronic bulletin boards. Today, the internet is dominated by powerful trillion-dollar
00:13:59 companies. Many of these companies have made their fortunes using sophisticated engagement
00:14:04 and recommendation algorithms and artificial intelligence to harvest and manipulate our
00:14:09 speech and our data. All of that is in an effort to maximize the time we spend on their platforms
00:14:15 and to sell advertising. Unfortunately, these platforms are now not working for the American
00:14:22 people, especially for our children. But that shouldn't surprise us. These companies aren't
00:14:26 required to operate in the public interest like broadcasters, nor do they have robust editorial
00:14:31 standards like newspapers. They are not a regulated industry like so many other important sectors of
00:14:37 our economy. And the vast majority are publicly traded companies with a singular duty under
00:14:42 corporate law to maximize value for their shareholders by increasing their profits.
00:14:47 As a result, they face constant pressure to grow their user base, which these days means hooking
00:14:52 children and teens. They introduce addictive features to keep us watching and clicking.
00:14:58 They exploit our data to develop granular profiles on each of us to sell advertising
00:15:03 and then provoke our emotions to monetize our engagement. And with Section 230 operating as
00:15:09 a shield to liability when people are harmed, making money remains the primary factor driving
00:15:14 decisions. As a result, provocative videos glorify suicide and eating disorders, dangerous viral
00:15:21 challenges, horrific child abuse images, merciless bullying and harassment, graphic violence, and
00:15:27 other pervasive and targeted harmful content is being fed nonstop to children and adults alike.
00:15:33 Just this week, a popular event and ticketing platform was found to have been promoting
00:15:38 illegal opioid sales to people searching for addiction recovery gatherings. So, frankly,
00:15:44 I think we deserve better than all this. And the fact that Section 230 has operated as a near
00:15:48 complete immunity shield for social media companies is due to decades of judicial opinions trying to
00:15:54 parse its ambiguities and contradictions. Judges have attempted to apply it to technologies and
00:16:00 business models that could not have been envisioned when it was drafted. And the courts
00:16:04 have expanded on Congress's original intent and have created blanket protections for big tech
00:16:09 that has resulted in these companies operating without any transparency or accountability.
00:16:14 I don't believe that anyone could come before us now and credibly argue that we should draft
00:16:19 230 or Section 230 the same today. But despite all of this, some courts have started to scrutinize
00:16:25 the limits of Section 230 more closely. Moreover, major search engines have recently begun to
00:16:30 substitute their own AI content over search results directing users to third-party sites.
00:16:36 Not only does this demonstrate an intentional step outside of the shelter of Section 230's
00:16:41 liability shield and raise significant questions about its future relevance,
00:16:46 but it also upsets settled assumptions about the economies of content creators
00:16:50 and the reach of user speech. It's only a matter of time before more courts chip away at Section
00:16:56 230 or the Supreme Court or technological progress upends it entirely. And state legislatures are
00:17:02 growing impatient, increasingly passing bills seeking to introduce liability to tech platforms.
00:17:07 But for now, we're left with the status quo, a patchwork where, more often than not,
00:17:12 bad Samaritans receive broad protection from a statute intended to promote decency on the
00:17:17 Internet. So Congress should not wait for the courts. We should act. Our bipartisan
00:17:21 draft legislation would require big tech and others to work with Congress over the next 18 months
00:17:26 to develop and enact a new legal framework that works for the Internet of today.
00:17:30 I believe we can work together to develop a framework that restores the Internet's intended
00:17:34 purpose of free expression, prosperity, and innovation. And I want to finally say,
00:17:40 Mr. Chairman, I reject big tech's constant scare tactics about reforming Section 230. Reform will
00:17:46 not break the Internet or hurt free speech. The First Amendment, not Section 230, is the basis
00:17:51 for our nation's free speech protections, and those protections will remain in place regardless
00:17:57 of what happens to Section 230. We cannot allow big tech to continue to enjoy liability protections
00:18:02 that no other industry receives. And I yield back, Mr. Chairman.
00:18:06 Thank you very much. The gentleman yields back. This concludes member opening statements. The
00:18:12 Chair reminds members that pursuant to the Committee rules, all member opening statements
00:18:16 will be made part of the record. And at this time, we also want to thank our witnesses for being here
00:18:22 before the subcommittee today. We greatly appreciate your testimony. Our witnesses will
00:18:27 have five minutes to provide an opening statement, which will be followed by a round of questions
00:18:31 from our members. The witnesses that are appearing before us today are Ms. Carrie Goldberg,
00:18:37 founding attorney at C.A. Goldberg, PLLC; Mr. Mark Berkman, the CEO of the Organization for Social
00:18:46 Media Safety; and Ms. Kate Tummarello, the executive director at Engine. I would like to
00:18:54 note for our witnesses that the timer light on the table will turn yellow when you have one minute
00:18:59 remaining and will turn red when your time has expired. Ms. Goldberg, you are recognized for
00:19:04 five minutes for your opening statement. Once again, thanks for being with us.
00:19:07 Thank you. And good morning, Chair Latta and full Committee
00:19:15 Chair McMorris Rodgers. Happy birthday to you
00:19:20 and ranking members Matsui and Pallone, distinguished members of the House
00:19:24 Committee on Energy and Commerce. Thank you so much for inviting me to testify today.
00:19:29 My name is Carrie Goldberg, and I'm the owner of a national law firm where we litigate for families
00:19:35 who've been destroyed by big tech. I stand for the belief that our courts are the great equalizer.
00:19:42 In 1791, Congress passed the Seventh Amendment, which gives Americans the right to a civil jury
00:19:49 trial. This gives us the constitutional right to hold accountable those who injure us. And I stand
00:19:57 against the idea that some bad actors, whether human or corporate, are too important to face
00:20:04 their victims eye to eye, to be examined by a jury, or to pay for their harms. That the Seventh
00:20:11 Amendment just doesn't apply. Companies that mint money off of the backs of the masses can't later
00:20:17 claim when their products hurt those people in predictable or even known ways that it's not their
00:20:24 fault and that they're just a passive publisher. And I want to tell you about some of the cases that
00:20:30 I work on every day. A child whose murder was live posted and her family is harassed daily by people
00:20:40 who defile her murder images, and after her death, her Instagram profile went from 2,000
00:20:49 followers to almost 200,000. Instagram refuses to give her estate the power to control
00:20:56 the account. I represent children who were matched with predators on a video streaming app,
00:21:03 including 11-year-old A.M., who was used by a man in his 30s for sex and became his sex slave
00:21:10 for three entire years, a man who also made her go back on the app Omegle to recruit more kids.
00:21:18 I'm the originating attorney in a case against Snap where our client's children were matched
00:21:23 with drug dealers who sold them fentanyl and killed them. This case has 90 families in it,
00:21:30 and all 90 are mourning the deaths of their kids. One parent is even here today. I represent
00:21:37 multiple victims of the serial rapist Dr. Stephen Matthews in Denver, who was using Hinge and Tinder
00:21:45 as a catalog to find more victims, and Tinder knew about it. I represent a young nurse,
00:21:52 or the family of a young nurse, from Brooklyn who was murdered on a Tinder first date by a felon
00:21:58 who was erroneously released from prison. I represent a man who was impersonated on a gay
00:22:04 dating app, and over 1,000 strangers came to his home in person to rape him while the platform
00:22:12 stood by and watched. I also represent a child with profound autism who was matched on a dating
00:22:20 app that aggressively markets to children and then was raped by four men over four consecutive days.
00:22:26 And finally, I represent over 30 families whose children purchased a suicide kit online and died
00:22:34 the most horrific deaths imaginable, and for 24 of those families, it was an online retailer called
00:22:42 Amazon that sold the chemical, aware that there was no household use for it. I do not represent
00:22:51 people who were harmed by mere content moderation decisions. In every single one of my cases,
00:22:57 the platform knowingly caused the conduct, released a dangerous product into the stream of commerce,
00:23:03 sold something deadly, turned a blind eye, or in some cases, all four. I sue them for their
00:23:10 product defects, just like you would if a community was poisoned by contaminated groundwater, or weed
00:23:17 killers caused cancer, or an emergency exit door fell off of an airplane mid-flight. Yet in virtually
00:23:26 every one of my cases, the online company says that it was just a mere form of speech, that it's
00:23:33 immune from liability, and just a passive publisher. Now over the last 10 years, I've lost more cases
00:23:41 because of Section 230 than anybody that I know, but I've also won a few. Congress passed Section
00:23:48 230 in 1996 as an accident in its mission to end online porn, but now it's used by internet oligarchs
00:24:00 to shirk the courts and avoid the victims. They make money off their victims, they pay lobbyists,
00:24:09 and they amass just an unknown amount of power. Meanwhile, the equivalent of a plane full of
00:24:16 children is crashing into mountains every single day while we just wait and watch. In 1997,
00:24:23 the court in a famous case said, "The internet is a rapidly developing technology.
00:24:30 Today's problems may soon be obsolete, while tomorrow's challenges are yet unknowable." In
00:24:36 this environment, Congress is likely to have reasons and opportunities to revisit the balance
00:24:41 struck in the CDA. Yet here we are 27 years later, but only Congress can fix this emergency. Congress
00:24:51 created Section 230, Congress can fix it, and I'm here to support the sunsetting of Section 230
00:24:56 to restore balance. Thank you. Thank you. Mr. Berkman, you're recognized for five minutes.
00:25:03 Thank you very much for being with us today. Thank you. Good morning. Chair Latta, Ranking
00:25:08 Member Matsui, thank you for the opportunity to testify. As a former staffer myself, I also want
00:25:15 to thank all the hardworking staff sitting back there today, so thank you. My name is Mark Berkman.
00:25:20 I'm the CEO of the Organization for Social Media Safety, the first and leading consumer protection
00:25:27 organization focused exclusively on social media. Thank you to Chair Rogers, Ranking Member Pallone,
00:25:35 and the full membership of the committee for your strong bipartisan efforts and comprehensive
00:25:43 approach towards protecting families from the dangers associated with social media use.
00:25:49 Harms either caused or exacerbated by social media are indeed severe and they are pervasive.
00:25:56 In our own study with over 14,000 teens, we have found that a whopping 53 percent self-report using
00:26:05 social media for more than five hours a day. That is a lot of time, and in that time, our children
00:26:14 are being exposed to a range of threats. Here are some study findings. A breathtaking 46 percent of
00:26:22 teens self-report being a victim of cyberbullying. Cyberbullying victims are about two and a half
00:26:29 times more likely to attempt suicide. Last year, the FBI reported 12,600 sextortion victims,
00:26:38 at least 20 of whom died by suicide. 43 percent of young adults had seen self-harm content on
00:26:45 Instagram. 33 percent consequently replicated such self-harm. The more time adolescents spend
00:26:52 on social media, the more likely they are to be exposed to drug-related content and to experiment
00:26:58 with substance use. New TikTok accounts set up by a supposed 13-year-old were recommended self-harm
00:27:05 and eating disorder content within minutes. These studies, among many, many others, indicate real
00:27:14 ongoing harm from social media. We do not need to wait for more research. The conclusions are clear.
00:27:22 Social media executives have themselves readily acknowledged the safety concerns in these very
00:27:29 halls. Meta's Mark Zuckerberg acknowledged that Meta didn't do enough to prevent their platforms
00:27:36 from being used for harm, including a failure on data privacy. Shou Zi Chew of TikTok stated that the
00:27:43 security, privacy, and content manipulation concerns raised about TikTok apply to the other
00:27:50 social media companies as well. Snapchat said that while they've set up proactive detection
00:27:56 measures to get ahead of what drug dealers are doing, those drug dealers are constantly evading
00:28:02 Snapchat's tactics, not just on Snapchat, but on every platform. Despite this very clear consensus
00:28:11 among its own leaders that severe safety risks to adolescents pervade the industry, big social
00:28:18 is not taking sufficient action. And so the harms continue, the mortality count mounts.
00:28:26 That is why we must stand in support of Chair Rogers and Ranking Member Pallone's Section 230
00:28:32 Sunset Discussion Draft. Section 230(c) has directly facilitated these harms by gutting
00:28:38 our carefully developed tort law jurisprudence for this industry. We have removed the traditional
00:28:45 public policy mechanism that forces all other companies to appropriately consider public safety
00:28:52 along with their profit motives when making business decisions. The results speak for
00:28:58 themselves. But we cannot fully understand the failures of Section 230 without a focus
00:29:04 on its tragically forgotten provision, Section 230(d). Congress required that internet providers,
00:29:12 including today's social media platforms, provide users upon signing up for the service
00:29:18 with a list of commercially available safety software providers. The clear legislative intent
00:29:24 of Congress was to provide the civil liability immunity provisions of Section 230(c) only in
00:29:31 conjunction with the understanding that a robust safety software industry would help ensure
00:29:38 critical safety support to families and children. Tragically, the social media industry has
00:29:51 consistently defied the clear mandate of Section 230(d), and unfortunately Congress could not have
00:29:57 envisioned that today's social media platforms would need to provide some minimal level of
00:30:02 assistance to third-party safety software providers for their products to effectively
00:30:02 function. With this essential pillar of Section 230 long forgotten and ignored, we have seen
00:30:09 millions of children unnecessarily harmed. That is why along with the other essential pieces of
00:30:16 legislation that this committee is considering, like APRA, COPPA 2.0, and KOSA, Congress must
00:30:23 pass Sammy's Law to correct this imbalance and give caregivers the choice of using safety software
00:30:29 to protect their children. As we consider comprehensive reform, the committee should
00:30:34 move forward with sunsetting Section 230 to bring a reluctant, unwilling social media industry to the
00:30:40 table. If the industry wants a more tailored policy framework in place, let them finally engage in
00:30:47 meaningful dialogue and compromise. Given the current public health catastrophe, the daily
00:30:53 deaths, the growing harms, we need to sunset this broken system today and get to work to protect
00:31:00 America's families. Thank you. Thank you. Ms. Tummarello, you are recognized for five minutes
00:31:06 for your opening statement. Chairs McMorris Rodgers and Latta, Ranking Members Pallone and Matsui,
00:31:13 members of the subcommittee, thank you for the invitation to testify before you today.
00:31:16 My name is Kate Tummarello and I am the Executive Director of Engine, a non-profit that works with
00:31:22 thousands of startups across the country to advocate for pro-startup, pro-innovation policies.
00:31:28 Sunsetting Section 230, especially in a little over 18 months without consensus around an
00:31:33 alternative framework, risks leaving Internet platforms, especially those run by startups,
00:31:38 open to ruinous litigation, which ultimately risks leaving Internet users without places to gather
00:31:43 online. That user perspective is so important to these conversations. I am incredibly grateful for
00:31:49 the people like Carrie's clients who are willing to share their stories about the harms that can
00:31:53 arise when people connect online. It undoubtedly helps us to understand what's at stake here.
00:31:58 But it's also important to hear about the benefits of online communities for user expression,
00:32:03 a perspective that isn't represented here today. So as an Internet user myself, I'd like to talk
00:32:09 briefly about a time I relied on an online community that likely couldn't exist as it
00:32:13 does without Section 230. In the summer of 2022, I needed medical care when my pregnancy
00:32:21 tragically ended at 22 weeks due to a critical health problem. While I had an amazing support
00:32:27 system of loved ones and medical professionals, I didn't know anyone personally who had navigated
00:32:32 such a devastating loss. I was able to turn to the support of online pregnancy loss communities,
00:32:39 sometimes on Facebook groups or through Instagram DMs, but often on small platforms that provide
00:32:44 resources and discussion forums for pregnant women. I leaned on those communities to navigate
00:32:49 not only surviving the emotional trauma, but also some practical considerations, like how do I get
00:32:56 my body to stop producing breast milk because it hasn't realized that my pregnancy didn't end with
00:33:01 a healthy baby. Since it was the summer of 2022, and there was a rapidly shifting legal landscape
00:33:07 around reproductive health care, I saw in real time these communities shrink as women expressed
00:33:13 fear not only about seeking care, but even about talking about seeking care online. And we've since
00:33:20 seen states propose making it illegal to offer support to people seeking reproductive health
00:33:25 care, including operating an internet service that facilitates user sharing information.
00:33:30 Currently, Section 230 is what would prevent that small platform for pregnant women that I used
00:33:36 from having to endure expensive and time-consuming litigation anytime one person wants to see
00:33:42 another person's content about reproductive health removed from the internet. My personal story is
00:33:47 about pregnancy loss, but you can substitute in any other controversial topic, religious beliefs,
00:33:53 fertility treatment, hunting gear, the Me Too movement, political organizing, etc., and see the
00:33:58 same consequence. If an internet platform could be sued or could be even threatened with a lawsuit
00:34:04 over the content created and shared by its users, the platform will have an incredibly hard time
00:34:09 justifying hosting that content or anything that comes close. Not only does that put those platforms
00:34:15 in the very expensive and time-consuming position of having to find and remove lawful user speech
00:34:20 they might want to host, it means dramatically fewer places on the internet where people can
00:34:25 have these kinds of difficult but necessary, and for me, life-saving conversations. Sunsetting
00:34:31 Section 230 would harm the diverse ecosystem of internet platforms and the users that rely on
00:34:35 them. Section 230 is a faster, more efficient way to reach an inevitable legal conclusion,
00:34:41 that internet platforms shouldn't be punished in court for the speech and conduct of their users
00:34:45 the platforms can't logically be expected to know about, and that means litigants can't use the
00:34:50 threat of drawn-out and expensive legal battles to pressure internet platforms into removing speech
00:34:54 the litigants don't like. Section 230 has enabled user expression far beyond the platforms run by
00:35:01 big tech companies. That includes everything from the non-profit run Wikipedia, to libraries,
00:35:06 to educational institutions, to internet infrastructure companies, to individuals running
00:35:11 Mastodon servers or community listservs or bloggers with comment sections. And it works
00:35:16 for startups in Engine's network, like those that build local communities through events,
00:35:20 create safer dating experiences, facilitate conversation about current events, support
00:35:24 educators, help small businesses find customers, and much more. We know that startups with limited
00:35:29 budgets and small teams invest proportionally more in content moderation than their larger
00:35:33 counterparts. They have to. They need their corners of the internet to remain safe, healthy,
00:35:38 and relevant if they want to grow. But they don't have the thousands of content moderators that large
00:35:43 tech platforms employ, and they can't always buy or build content detection and removal technologies,
00:35:48 neither of which is a silver bullet option. Startups are also least equipped to handle the
00:35:52 costs of litigation. The average seed stage startup has about $55,000 per month to cover
00:35:58 all of its expenses. Contrast that with the cost of defending against a lawsuit, which can cost
00:36:04 hundreds of thousands of dollars, even if the startup were to ultimately prevail. It would
00:36:09 always be the best option for a startup's bottom line to just avoid the lawsuit altogether,
00:36:15 even if that means removing user content it would otherwise host to the detriment of its users.
00:36:19 Sunsetting Section 230 won't lead to the outcome that the leaders of this committee say
00:36:24 they want, an internet where free expression, prosperity, and innovation can flourish.
00:36:28 Section 230 is and has been critical to those goals, and it's essential for the competitiveness
00:36:32 of U.S. startups. Instead of sunsetting Section 230 in the hopes of an elusive replacement,
00:36:37 we must be clear-eyed about what we can realistically accomplish and what we risk
00:36:41 in terms of trade-offs to expression, prosperity, and innovation. Thank you for the opportunity to
00:36:45 testify, and I look forward to answering your questions. Thank you all for your opening
00:36:50 statements today, and that will conclude, and we will now start with members' questions to our
00:36:55 witnesses. Ms. Tummarello, if I could start my questions with you. You know, you state in your
00:37:01 testimony about we want to make sure that the small and the medium-sized companies out there
00:37:06 and the startups can be out there and have the protection of Section 230, and also I'm sure you
00:37:12 agree that there's some companies out there that can better protect themselves. At what point should
00:37:18 we consider a company to be big enough to take more responsibility for their platforms? Thank
00:37:24 you for the question, Chair. I think, you know, like I said in my testimony, we know that startups
00:37:29 invest proportionally more in content moderation to keep their users safe. They have to. It's a
00:37:33 business necessity for them if they want to grow. Generally, I think we're very hesitant to put a
00:37:39 cap on what constitutes a startup. It's really hard to measure. You can have a lot of users with a
00:37:44 very small team. You can have a lot of users without having a lot of profit. I'm not sure
00:37:49 there's a great metric that says once you hit this point, you should be expected to immediately
00:37:55 find and remove any problematic content, especially when we know companies of all sizes,
00:38:00 but especially startups, are already investing in finding and removing harmful content. So I don't
00:38:04 think there's kind of a clear threshold where once you cross it, you should have the resources to
00:38:10 perfectly moderate content and prevent users from sharing harmful content every time.
00:38:16 And that's part of our problem that we're going to have as we look going forward is
00:38:20 establishing those, you know, those guideposts out there, the guardrails, is that where we're
00:38:25 going to be in between who's that startup, who's that small company, who's that medium,
00:38:30 because again, we want to make sure that they can flourish out there in the economy. Ms. Goldberg,
00:38:36 do you believe small tech companies should be carved out from Section 230 reform legislation
00:38:41 or this sunset bill and why or why not? I don't believe that there should be any
00:38:46 exceptions for small startup companies. Some of the internet's most malicious websites are small.
00:38:53 Omegle was run by one person and it accommodated 60 million unique visitors a month, matching
00:39:00 adults with children for sexual online streaming. Sanctioned suicide is run by two people. It's a
00:39:08 pro-suicide platform that instructs people on how to die. It's visited by children and has
00:39:16 single-handedly increased child suicide in this country. There's absolutely no reason that small
00:39:24 companies should get some sort of carve-out. 99% of the businesses
00:39:32 in this country are small businesses. No other industry gets some blanket immunity from
00:39:38 litigation. Instead, small businesses, I happen to own one, we guard ourselves against litigation by
00:39:45 being responsible and not harming people. Okay, well thank you. Mr. Berkman, many of the harms
00:39:52 that you raise in your testimony are against the terms and services of tech companies.
00:39:56 How will reforming Section 230 encourage platforms to be better stewards of their platforms and
00:40:02 create better accountability? Yeah, so, sorry. So the issue now is that with blanket immunity
00:40:12 from any sort of liability, we have a severe imbalance in how the social media industry
00:40:20 is making their decisions and they are not weighing public safety sufficiently. And so,
00:40:27 with reform and adding in that liability, we're going to see a calculus that all other businesses,
00:40:34 as Ms. Goldberg just mentioned, undertake when they make their decisions and that's how we increase
00:40:40 public safety in this industry. Well, let me ask you the same question. You know, if we're looking
00:40:47 at reforming, do we keep Section 230 or does 230 expire at the sunset, if we get a sunset through
00:40:56 in December 31st of next year? I really think given the extreme amount of harms, we're really
00:41:05 facing a public health catastrophe, especially for adolescents, that reworking the entire system,
00:41:12 using a comprehensive package that I know the committee is working on and considering now,
00:41:20 I believe is essential to alleviate the harm that we're seeing, mitigate the harm that we're seeing.
00:41:26 Thank you. I yield back the balance of my time and recognize the ranking member of the subcommittee
00:41:32 for five minutes for questions. Thank you very much, Mr. Chairman. As a grandparent, I can tell
00:41:38 you one of the things that concerns me the most is the treacherous online environment America's
00:41:43 young people are forced to navigate. From cyber bullying to content encouraging disordered eating
00:41:49 or self-harm, it's simply naive not to acknowledge the connection between the rise in social media
00:41:55 and the harm for America's youth. Mr. Berkman, can you describe the strategies online platforms use
00:42:02 to target and push harmful content on children? Absolutely. So there's a range of features out
00:42:09 there that we find incredibly problematic. There are features that are meant to keep children
00:42:18 using platforms. That's how these platforms make their money. So the thumbs up, the likes,
00:42:24 the shares, the view counts, all of that is geared towards getting children to come back and watch
00:42:30 again and again. The algorithms are designed off of what gets high engagement. And unfortunately,
00:42:37 it's that dangerous salient content that often gets that high level of engagement. So sexually
00:42:44 explicit material, extreme violence, that's what human beings tend to watch and that's what the
00:42:51 social media platforms use to increase engagement via their algorithms, among many other features,
00:43:01 by the way. As we consider reforming Section 230, how should we understand its limitations in cases
00:43:08 where platforms are knowingly amplifying harmful content? In any other industry, if you are
00:43:15 knowingly causing harm, you are going to be subject to liability. And so again, we have this
00:43:22 very severe imbalance here in business decisions. And as Ms. Goldberg mentioned, the jurisprudence
00:43:30 of Section 230 has really gone off the rails and included a range of business decisions,
00:43:37 feature design, product design that really should never have been included in the concept of 230
00:43:44 and certainly was never the original intent. Okay, thank you. Ms. Tummarello, I want to take a moment
00:43:53 to thank you for sharing your story. It was very important. And that took an immense amount of
00:43:58 courage, I understand that. Without women like you stepping up to share your stories,
00:44:04 lawmakers wouldn't fully appreciate what's at stake here. In a post-Dobbs environment, more and
00:44:11 more women are being forced to turn to online communities for seeking reproductive care and
00:44:17 advice. Ms. Tummarello, given your own personal and professional experience with pregnancy loss,
00:44:24 what effect do you believe a full sunset of Section 230 without a viable replacement
00:44:29 could have on a woman's ability to find communities for reproductive care online?
00:44:34 Thank you so much for the question, Congresswoman. This happened to me two years ago, but I have
00:44:40 remained an active member of these online communities since then in the hopes of kind
00:44:43 of paying it forward because those women helped me so much. And again, every day that I log on to
00:44:50 the forums or check a Facebook group, I see a woman express fear about being able to seek the
00:44:54 care she needs. I worry deeply that not only will those women not have the platform
00:45:01 to express that fear, they won't have the platform to find the support and information they need.
00:45:05 And I think as, you know, this is just the perfect example of the kind of content that,
00:45:11 you know, women who suffer from pregnancy loss, I don't think anyone's trying to harm them, right?
00:45:18 I don't think anyone's intending to repeal 230 to get at women like me, but I do really
00:45:23 worry that's the unintended consequence here. And I don't want the women who are dealing with this
00:45:27 today and in the future to not have the resources I had and have the community that I was able to
00:45:31 lean on because, again, for me, it was life-saving. Okay, well, thank you very much.
00:45:36 In California and other regions of the country, judges are increasingly interested in hearing
00:45:41 cases concerning the design of platforms themselves rather than suits about specific
00:45:46 content. These cases represent a novel approach for addressing the harms of social media in a
00:45:52 way that isn't immediately dismissed because of Section 230. Ms. Goldberg, while I'm glad these
00:45:58 cases are increasingly viable, what are the legal limitations of this approach?
00:46:02 Thank you. My firm pioneered the product liability theory in 2017, and in the Second Circuit,
00:46:14 the court said that even using a product liability approach where we said these features
00:46:21 are defective, it still was subject to Section 230 immunity. Now, thankfully, in the Ninth Circuit,
00:46:28 the courts are saying, "Well, if you don't sue for a publication problem, then it's not vulnerable
00:46:37 to Section 230." But I think it's important to remember that when we're talking about removing
00:46:45 Section 230 immunity, it doesn't create a pathway where these companies just suddenly are liable.
00:46:53 Again, Ms. Tummarello's heartbreaking situation, it doesn't mean that suddenly the First Amendment
00:47:01 doesn't apply to those platforms and they suddenly have to remove all the content.
00:47:07 There still would have to be somebody who's injured from a platform decision.
00:47:11 I think it's important that we not confuse- Excuse me, the gentlelady's time has expired.
00:47:16 Oh, sorry. I'm sorry. I'll ask a couple of other questions. I'll ask.
00:47:22 Okay. Thank you. Thank you.
00:47:24 The chair recognizes the gentleman from Pennsylvania for five minutes for questions.
00:47:28 Thank you, Chairman Latta and Ranking Member Matsui for holding this hearing on Section 230.
00:47:34 Thank you for our witnesses for giving their time and your compelling testimony.
00:47:38 As I stated in our hearing last month, as a doctor, as a father, as a grandparent,
00:47:45 I understand how important our children's well-being is, particularly their mental health.
00:47:51 This is particularly true as more and more children have access to smartphones, the internet,
00:47:57 and specifically the content that these devices bring to them. We need to make sure that they
00:48:03 are not interacting with harmful or inappropriate content. Section 230 is only exacerbating this
00:48:10 problem. We here in Congress need to find a solution to this problem that Section 230 poses.
00:48:17 I'm glad that we can walk through the potential reforms and the solutions here today.
00:48:21 Mr. Berkman, what are the good parts of Section 230? How can we reform the text so that we get
00:48:28 back to the original text and the original intent of Section 230?
00:48:34 I appreciate the question and I'll start my response by highlighting Section 230(d). It is a
00:48:44 long-forgotten and tragically ignored critical component of the original concept of Section 230.
00:48:52 Section 230(d) requires internet companies, including social media platforms, to give
00:49:01 all users notice upon signing up of all commercially available safety software
00:49:08 providers. The immunity provisions of Section 230(c) were put in on the back of the understanding
00:49:17 that there would be a robust third-party safety software industry out there protecting users,
00:49:24 particularly adolescent users. Is that robust industry available now?
00:49:28 There is an industry out there now that is effective. Unfortunately, what could not have
00:49:34 been envisioned in 1996 was that for the safety software companies to work, the social media
00:49:40 platforms have to provide a level of minimal cooperation. They have to provide data access.
00:49:46 Does that cooperation exist?
00:49:47 It exists among some platforms and not others.
00:49:52 You used the word robust. Is it robust?
00:49:54 There are strong companies out there. I would not call the industry robust.
00:50:00 I think we share those concerns. Ms. Goldberg, what, if any, criminal activity does Section 230
00:50:07 inadvertently continue to facilitate and how would reforming the law stop that illegal activity?
00:50:14 Well, Section 230 has basically given the tech industry a pass to allow the trafficking of
00:50:24 children on its platform, the trafficking of drugs, matching children with predators,
00:50:30 the sale of suicide chemicals, inciting violence. And even though there is supposedly a carve-out
00:50:40 for federal crimes, our DOJ never holds our tech industry responsible for crimes that happen on
00:50:47 the platforms. So with the removal of Section 230, we empower the people who are injured
00:50:53 to take action where the government doesn't.
00:50:56 Do you feel that the nefarious acts that you just listed, do you feel that those continue to
00:51:03 exacerbate the problems with mental illness that we see in children and young adults?
00:51:08 100 percent, especially because these platforms not only turn a blind eye to the bad things that
00:51:14 happen, but they often promote them. My final question is for all the witnesses.
00:51:18 There is debate happening right now that if Congress is going to amend Section 230,
00:51:23 whether it should do so on a carve-out or carve-in basis, whether to pursue a more
00:51:29 comprehensive approach. Can you speak to the pros and cons of each of these approaches?
00:51:34 And I'll start with you, Mr. Berkman. I would say we want to see comprehensive,
00:51:43 significant reform. And that's why we're really supportive of Chair Rodgers and Ranking Member
00:51:48 Pallone's discussion draft here. It needs to be sunset and reworked. So the pros of doing
00:51:56 carve-ins means that the social media industry is at the table and providing real compromise
00:52:03 that is reasonable. The pro of starting over again means that we're potentially going to get
00:52:09 a very robust system that works. Either way, with the discussion draft, we need them at the table,
00:52:14 having the discussion as well. Ms. Tummarello, from your perspective and from your personal
00:52:21 experience, how would you weigh in on this? The gentleman only has five
00:52:26 seconds left, so if you could make a real quick statement. Okay. I would say generally,
00:52:30 Engine is wary of kind of carve-outs, carve-ins. We think we want a framework that works for the
00:52:35 whole internet. Startups want to grow. They need to be able to know they're not going to have to
00:52:38 rework their entire business model when they hit some arbitrary threshold. But as my story
00:52:42 illustrated, as a user, free expression across the internet depends on 230. And so picking and
00:52:47 choosing who gets 230, I don't think ends with more free expression and more innovation.
00:52:51 Mr. Chairman, my time has expired and I yield back. The gentleman's time has expired and the
00:52:55 Chair now recognizes the Ranking Member of the full committee, the gentleman from New Jersey,
00:52:59 for five minutes for questions. Thank you, Mr. Chairman. Just yesterday, it was reported,
00:53:04 and I mentioned in my opening, that a popular event ticketing platform has routinely allowed
00:53:09 users to post messages selling drugs, fake social security numbers, fraudulent online reviews,
00:53:16 and other illicit things. But the platform didn't just allow these messages to exist on the platform.
00:53:21 Using its algorithm, it pushed these posts to vulnerable users. And for instance,
00:53:26 the platform actively placed posts offering illegal access to prescription drugs like oxycodone
00:53:33 right next to genuine events intended to support users struggling with addiction and substance
00:53:40 abuse. So again, I think it's pretty outrageous. And we've seen things like this over and over
00:53:45 again. And I don't think these platforms, they're not going to clean up their act or make their
00:53:49 platforms safer without major changes to the law. So, Mr. Berkman, sometimes we hear from people
00:53:55 that they don't understand the harm we're attempting to address. Is it surprising to
00:54:00 you that an online platform would engage in this type of activity? Absolutely not. We've been seeing
00:54:07 that specific harm for years. DEA Administrator Milgram has said for multiple years now that
00:54:16 all the major platforms have issues with illicit drug trafficking. There are also issues that are
00:54:22 well known with human trafficking, trading of CSAM. All platforms operating in this
00:54:27 space have clear constructive notice that this is a danger and it is harming many, many Americans,
00:54:35 particularly children. Well, thank you. Now I'm going to ask each of you, but just say yes or no,
00:54:40 because I have other questions for Ms. Goldberg. Do you believe that any company should be able to
00:54:45 use Section 230 to escape liability for harms caused by the type of conduct I just described?
00:54:51 Ms. Goldberg? No. And Mr. Berkman? No. And Ms. Tummarello? 230 doesn't protect companies that
00:54:59 do commit federal crimes, including the ones you described. Okay. Now, I understand that these
00:55:03 platforms allow for speech expression and association that has changed the landscape
00:55:09 for the exchange of ideas. But in my opinion, free speech isn't the business model.
00:55:13 They're selling advertising and making money. And unlike most other industries in America,
00:55:18 Section 230 allows platforms to make business decisions without having to give a second
00:55:23 thought to whether and how the platform could be used for destructive purposes. So,
00:55:28 let me ask Ms. Goldberg, and then I'll go to Mr. Berkman again. Ms. Goldberg, it seems like some
00:55:33 plaintiffs are beginning to have success in cases against big tech companies on product liability
00:55:39 claims. You did actually mention some of your cases. But do you think there's still a need for
00:55:44 Congress to sunset Section 230 even with that? 100 percent there's still a need. What we're
00:55:52 finding is that courts don't know what to do. Their decisions are inconsistent. We still have
00:55:59 the Ninth Circuit applying the Second Circuit law, and they always say that they're looking
00:56:05 for Congress for clarity. Okay. So, Mr. Berkman, your testimony details the platforms are knowingly
00:56:12 causing harm to children and without fundamental reforms to Section 230, can we expect that
00:56:19 platforms are going to do anything more than the bare minimum, even if we place other regulatory
00:56:24 requirements on them? We can expect continued harm without additional regulatory requirements,
00:56:32 including a significant reform of 230, passage of Sammy's Law, and other legislation that the
00:56:39 committee is considering comprehensively. They have known about these harms for years. They have
00:56:44 provided talking points in these halls to the press about the actions they're taking, yet the
00:56:51 harms continue and casualties rise. Well, thank you. I mean, you know, I don't get too many people
00:56:59 that come here or say to me or Chair Rodgers that, you know, Section 230 shouldn't be changed.
00:57:07 There might be somebody out there that will say they love the status quo, but they don't articulate
00:57:14 that very often. But many of them will say, well, let's just make the changes. We don't need to
00:57:22 sunset it. But obviously you disagree. You think that doing the sunset is important. Well, I think
00:57:29 we do want to get it right here. I have a long-standing, deep appreciation for the legislative
00:57:38 process and the process in Congress. And that means getting all the stakeholders meaningfully
00:57:42 at the table because we want to get the legislation right. And what I really appreciate about the
00:57:48 strategy of your discussion draft in the sun setting is that the social media platforms have
00:57:55 become so reliant on Section 230 in their dangerous business decisions that providing a sunset will
00:58:03 meaningfully get them to the table and get us legislation that works. All right. Thank you so
00:58:10 much. Thank you, Mr. Chairman. Thank you. The gentleman's time has expired and yields back.
00:58:14 The chair now recognizes the gentlelady from Washington, the chair of the full Committee of
00:58:19 Energy and Commerce for five minutes for questions. Thank you, Mr. Chairman. Ms. Tummarello, last
00:58:25 Congress, I had a draft legislative proposal to reform Section 230 that included a threshold to
00:58:30 ensure the reforms would only apply to the large tech companies and would not affect small businesses
00:58:36 or startups. However, small and medium sized companies still oppose reforms to Section 230
00:58:42 and did not engage in any meaningful conversations. If this bill passes and Section 230 is at risk of
00:58:50 sun setting, will these businesses engage in the process and support meaningful reforms to Section
00:58:56 230 to put control back in the hands of Americans? Thank you for the question, chair. I should note
00:59:02 Engine is always happy to engage and did engage last Congress. Can't speak for any specific
00:59:06 companies, of course. I think everyone wants the Internet to work better
00:59:12 for everyone. And so companies we work with are always happy to engage to find ways to make that
00:59:16 work and would just hope that those conversations start from a recognition about what's really at
00:59:22 stake when we talk about innovation and expression online. But a sunset risks, especially at the end
00:59:28 of next year, when there's still so much disagreement, even among members of Congress
00:59:32 over what the Internet should look like. A sunset risks leaving our startups, but also, again, just
00:59:36 Internet users generally vulnerable after 2025. Thank you. I can appreciate that. The intent
00:59:44 of this bill is to put a clock on making reforms to Section 230 before it sunsets. With the various
00:59:49 proposals out there, there's no shortage of options. And I know that I believe that we can
00:59:53 come together and that we must start now. What substantial reforms as a follow up should Congress
00:59:59 consider in modifying the liability protections in Section 230? I think Congresswoman Matsui put it
01:00:07 really interestingly when we talk about modernizing Section 230. And to me, not to put words in her
01:00:12 mouth, of course, but to me, that means keeping the original intent, which was to incentivize
01:00:17 good faith attempts at keeping corners of the Internet safe and healthy and relevant.
01:00:21 Right. Absent 230, it's and to Ms. Goldberg's point, absent 230, it's not that the platforms
01:00:27 would be held liable for the speech. It's that the platforms could very easily be pressured into
01:00:32 removing speech people don't like. And again, to my personal story, that scares me when we talk
01:00:37 about controversial or vulnerable populations online. So I think anything that maintains the
01:00:42 framework that allows a platform to quickly and easily get a lawsuit over user content dismissed,
01:00:48 that kind of framework needs to continue as Congress thinks about 230. Thank you. Ms. Goldberg
01:00:53 or Mr. Berkman, do you have anything to add? I'll add that the problem with Section 230 and any
01:01:00 accusation against a platform is that they say that everything is speech, that basically if a user
01:01:07 has a profile or contributes anything, no matter how much the platform does to develop it to
01:01:13 promote content or use its algorithms or generative AI, they say if any, if a user had any
01:01:21 content whatsoever involved, then anything that stems from it should be immune from Section 230.
01:01:29 The other thing is that when we're talking about content removal, like Ms. Tummarello is talking
01:01:36 about, there still has to be a cause of action. So a platform simply succumbing to
01:01:41 political pressure, or to fear of hosting controversial content,
01:01:49 doesn't just create a tort that anybody can sue under. So I don't actually understand
01:01:54 the basis for the concern that suddenly the gateways of litigation are going to open. Okay.
01:02:01 Okay. Mr. Berkman? I wholeheartedly agree with Ms. Goldberg. I think the fear here is overwrought.
01:02:08 Tort law jurisprudence has been around for hundreds of years. All other businesses work
01:02:14 by it to bring a suit. You need a meritorious case on its face. And on the flip side, the concern
01:02:23 about aggressively and unnecessarily removing content, that would indicate that they have the ability
01:02:30 to remove all of the illegal, uncontroversially dangerous content that's on there now, if they
01:02:37 could start picking and choosing anything that might be in the gray area.
01:02:42 The focus here is removing what it would be a tort, what is causing severe harm,
01:02:48 ensuring that they're doing that in a way that is not negligent, grossly negligent or reckless.
01:02:58 Okay. Yes. One of the main concerns with reforming section 230 is around the potential
01:03:05 of incentivizing frivolous lawsuits from trial lawyers. In the time remaining, would you just
01:03:12 speak a little, Ms. Goldberg, as a follow-up, would you speak to, you know, if we reformed or
01:03:18 if it went entirely away, how do you believe the landscape would change? Rule 3.1 of the Model
01:03:26 Rules of Professional Conduct forbids the filing of frivolous lawsuits. Okay. Lawyers who file them will be sanctioned.
01:03:32 Okay. Okay. Yeah. More to come. I yield back. Thank you. The gentlelady yields back. The
01:03:39 chair now recognizes the gentleman from Florida's 9th district for five minutes for questions.
01:03:44 Thank you, Chairman. The year was 1996. We marveled at the groundbreaking special effects
01:03:52 of Independence Day. Weezer released the indie cult classic album Pinkerton, a personal favorite,
01:03:58 and Section 230 was born out of a concern to protect the nation's internet service providers
01:04:04 that came out of Stratton Oakmont, Stratton Oakmont v. Prodigy Services. Remember those
01:04:09 guys, right? Yeah. It's a New York case that found that internet service providers
01:04:16 could be found liable for content posted on their websites. And so 230 protected ISPs from liability
01:04:22 from content posted by third parties. Since then, internet companies have become a powerhouse of
01:04:28 innovation and our economy. They also developed algorithms that amplify posts so they aren't
01:04:36 always passive participants in their platforms. And the harms we know, sadly, are many,
01:04:41 enabling sexual assault, bullying, identity theft, addiction, anorexia, and more. And so we see a
01:04:47 sunset bill that's been filed by the chairwoman and our ranking member that is a bold move and
01:04:53 maybe the road we're headed down. I believe the key is to adopt comprehensive principles,
01:04:58 a duty to protect identity and personal information, preventing crime and libel, and especially
01:05:05 stopping exploitation of our kids. Our major tech companies are some of the most innovative
01:05:10 companies in America. My challenge to them is work with us to develop these principles and the
01:05:16 tools to enforce them. Protecting our fellow Americans and improving trust in popular platforms
01:05:22 is good business and it's the right thing to do. Locally in central Florida, we saw an innocent
01:05:28 UCF student, Alex Bouguet, whose identity was stolen in an account that was created to then
01:05:35 criticize and make racist comments towards a Georgia state legislator. This action wrecked
01:05:41 his life. He was fired from his job, he almost got kicked out of school, and we saw it become
01:05:48 a huge issue in central Florida. Ms. Goldberg, we just filed the SHIELD Act last week, which would
01:05:55 create a duty to take down posts where someone's identity was stolen, whether it's directly or by
01:06:03 use of a social media account. Right now, if a bill like this didn't pass, what recourse would
01:06:10 Mr. Bouguet or others have? Well, right now, without the SHIELD Act that you're proposing,
01:06:16 the man in Florida would have no rights to go after the platform that was knowingly publishing
01:06:25 the personal content. The same thing happened to my client, Matthew Herrick, where somebody was
01:06:30 impersonating him on a dating app and sent over a thousand men to his home, thinking that he wanted
01:06:39 them to fulfill his rape fantasies. He was thrown out of court. Well, thank you. You know, we see
01:06:45 from subject matter jurisdiction to negligence and other legal concepts, this common principle
01:06:51 that affirmative acts can give rise to duties and liabilities under law. And so my second question
01:06:57 to you is, where is the line from being a passive platform, simply hosting a virtual town square,
01:07:03 to being an active participant in a platform and really availing themselves of duties?
01:07:09 Every single platform that gets sued will say that they're just a forum for speech, that they're
01:07:16 just a conduit. So the issue is that until they're accused of doing something wrong,
01:07:29 they're going to just, I mean, they're going to deny that they're active, no matter what.
01:07:35 Mr. Berkman, we know algorithms are some of that proactivity. What are some of the other
01:07:40 proactive ways social media platforms and internet providers can become more active participants
01:07:48 than merely just hosting a platform? It's a really basic one that's not as techy as algorithms.
01:07:53 We see this all the time, where there's a severe case like the one in Georgia. It's getting worse
01:07:58 with deepfakes out there as well. And the impacted victim, the target, makes multiple reports to
01:08:06 the platforms and they ignore it, don't respond. Basic notice, also a basic feature of tort law.
01:08:14 So actual notice and ignoring it. Other features like Snapchat's Quick Add feature that links
01:08:21 children to dangerous predators and drug dealers. All these features in there are causing harm and
01:08:30 not necessarily connected to free flowing content that we saw in 1996 that had no algorithms or
01:08:37 features keeping people on the sites. Thank you. And I think the key is we see common principles,
01:08:42 both negligence and defamation and others, whether it's newspapers, whether it's duties for
01:08:48 different businesses that really can guide us through should we not see a wholesale sunset.
01:08:53 With that, I yield back. Thank you. The gentleman's time has expired. He yields back. The
01:08:57 chair now recognizes the gentleman from Florida's 12th district for five minutes for questions.
01:09:02 Thank you, Mr. Chairman. I appreciate it. As a conservative Republican, I generally err on the
01:09:07 side of industry self-regulation, in which competition and reasonable self-governance
01:09:13 set management standards. This is really typically the best way to incorporate
01:09:19 business needs as well as consumer expectations. In my opinion, the heavy hand of government
01:09:25 should be a last resort. Ms. Goldberg, over the last several years, Congress has discussed with
01:09:35 online platforms the concerns of their failure to be good stewards of their platforms in light
01:09:42 of Section 230. Has industry made any reasonable changes or offered any meaningful solutions in
01:09:51 addressing people's concerns? And do you think it's necessary to pass a Section 230 repeal
01:09:58 for social media companies to meaningfully engage on this particular issue? Thank you for this
01:10:05 question. And I think it's important to remember that Section 230 is a regulation. I feel that in
01:10:13 the 10 years that I've been litigating against tech companies, there has been no meaningful reform
01:10:19 in their day-to-day operations. If anything, the products have become more sophisticated
01:10:25 with algorithms and generative AI, and there's more pronounced advertising, data mining,
01:10:32 and targeting at children. I feel, if anything, the absolute urgency of reform is now.
01:10:39 Thank you. For you, again, you mentioned the kids, and this is what this question
01:10:45 refers to. Section 230 has been a one-size-fits-all approach to liability protections,
01:10:51 regardless of the harm caused and the individuals that are harmed. As this committee has developed
01:10:58 privacy legislation, we have incorporated a higher degree of protection for children because of their
01:11:04 unique vulnerabilities online. I'd like each witness, if you can, please, each witness to
01:11:12 answer this particular question. In whatever policy takes shape as a replacement to current
01:11:20 Section 230 protections, do you think that we should consider different liability protections
01:11:27 for social media companies when they engage with children than those made for adult consumers,
01:11:34 and why? We'll start with Ms. Goldberg, please. Thank you. I do think that we have to treat
01:11:41 companies that target children differently. They're looking at children as a mass market,
01:11:48 and the sooner that they can get a child on their platform, they can control how much time
01:11:54 and attention that child devotes to their platform, and they might have a customer for life.
01:12:00 The harm that we see to children who don't have the ability or the knowledge to cope with
01:12:06 emergencies, who maybe are being blackmailed and afraid to tell their parents, is extreme.
01:12:11 Thank you. Mr. Berkman, please. My answer would be yes. Millions of children are being harmed by
01:12:21 social media, millions, and the data is clear. It is a public health catastrophe for our children,
01:12:28 a full range of dangers. I do think that we need to change the calculus through liability for
01:12:38 platforms that allow children on. That includes allowing them on by their terms and allowing them
01:12:44 on de facto, as in not doing any sort of verification. My answer is yes. In terms of that
01:12:53 regulatory package there, again, it really needs to include bills like Sammy's Law,
01:13:02 which would significantly increase protection for children. If a social media
01:13:09 platform is allowing children on, that needs to be something in the mix as well. COPPA 2.0, APRA,
01:13:16 those provisions on data collection for children as well need to be in the mix.
01:13:21 Thank you very much. I appreciate that. So, Ms. Tummarello, same question.
01:13:28 Thank you for the question. I think generally when we think about startups, a relatively small
01:13:33 number of startups we work with are truly aimed at children, know that they're dealing with
01:13:37 children. I think it generally makes sense, like COPPA, to have a different framework for dealing
01:13:41 with children when that's who you know your users are. What we worry about is this kind of bleeding
01:13:47 into general audience platforms that have no way of knowing that they're dealing with children.
01:13:51 And to the point on age verification, we're especially concerned about startups being
01:13:54 forced to collect additional information from users. Imagine signing up for a new service you've
01:13:59 never heard of before and being asked for your driver's license. It might put you off from
01:14:03 using that service, and so that would really harm startup growth. But to the point of your
01:14:06 question, when we're talking about children's targeted platforms that know they're dealing
01:14:10 with children, generally different rules of the road make sense.
01:14:13 Thank you very much. I appreciate it. I yield back.
01:14:15 The gentleman's time has expired and yields back. The chair now recognizes the gentleman
01:14:20 from California's 29th district. You have five minutes for questions.
01:14:24 Thank you, Chairman Latta and Ranking Member Matsui, for holding this hearing and bringing
01:14:29 us together, and I appreciate the witnesses' opinions and sharing their expertise and
01:14:34 full view of the public about today's issue. I'm glad that we're discussing a path forward
01:14:38 to holding online platforms accountable in the form of the chair and ranking member's proposal
01:14:44 to sunset Section 230 today. I hope that we go beyond just discussion and actually take some
01:14:50 action. In my time in Congress, I've participated in multiple hearings where we've heard repeatedly
01:14:56 from CEOs of large online platforms that they take our concerns very seriously and are working
01:15:02 hard to address them. I don't believe them. What has consistently followed is that these
01:15:08 companies, often worth billions of dollars and traversing an atmosphere of a trillion dollars
01:15:13 or more annually, have failed to meet the moment and address the societal harms that are proliferating
01:15:20 on their platforms. As we've heard from our witnesses' testimony today, there are very
01:15:26 real risks to public health in allowing things to continue as they've been. There are also risks
01:15:32 to our democracy, as authoritarian adversaries abroad have repeatedly demonstrated that they
01:15:38 are willing and able to fill our online spaces with false information designed to push their
01:15:45 interests and undermine our institutions. While I wish we could better depend on American companies
01:15:51 to help combat these issues, the reality is that outrageous and harmful content helps drive their
01:15:59 profit margins. That's the online platforms. I'll also highlight, as I have in previous hearings,
01:16:07 that the problem of harmful mis- and disinformation online is even worse for users who speak Spanish
01:16:13 and other languages outside of English as a result of platforms not making adequate investments to
01:16:20 protect them. I have a question for Ms. Goldberg. In your testimony, you referenced the sophistication
01:16:27 of technology we're dealing with now as compared to when Section 230 was created. Especially,
01:16:33 you mentioned that the Internet's ability to overturn elections, spur genocides, and coordinate
01:16:40 government takeovers, among other large-scale societal harms. In Congress, it can often take
01:16:46 a long time before we take a second crack at getting something right. Ms. Goldberg, as we
01:16:51 consider how to implement a regulatory framework that can effectively deal with these threats,
01:16:56 how do you make certain we are writing policy that can keep pace with technological innovation,
01:17:03 and what blind spots should we be looking out for?
01:17:05 Thank you for this question. I think it's a glaring travesty that social media companies
01:17:16 invest so much into English-speaking content moderation at the expense of other languages. I
01:17:23 think it's like 87 percent of content moderation is in English on Facebook, and that was revealed
01:17:29 by whistleblower Frances Haugen. When we're thinking about Section 230, we really need to be
01:17:36 thinking about how to modify it. I think instead of drafting laws that are specific to technology, which is always
01:17:43 going to change, we have to be thinking about harms and the duty that platforms have instead
01:17:53 of the technology. Thank you. Public shaming has happened in these committees where we have CEOs
01:17:59 in front of us, and they give us all kinds of somewhat apologetic answers or what have you.
01:18:05 Mr. Berkman, do you have any thoughts on how we can ensure that platforms make
01:18:11 more equitable investments in moderating harmful content in languages other than English?
01:18:17 Yeah, I really appreciate that question, and I also appreciate the sentiment that
01:18:22 you're hearing from the industry, and they're telling you that they're working very hard.
01:18:26 They probably also told you that they're using proactive detection measures. They're using
01:18:31 industry-leading techniques. The problem is that millions of children are being harmed,
01:18:37 and many are dying on social media. The question to equity goes to sufficient trust and safety
01:18:45 processes and staffing. When we're talking about supporting innovation here for smaller businesses,
01:18:52 we have to realize that every other industry in this country is subject to our historical
01:19:01 tort law. That innovation happens within that context because it requires the balancing of
01:19:08 the profit motive and safety. In terms of ensuring that trust and safety staff and AI algorithms can
01:19:17 keep pace with evolving risks in other languages, that is a basic trust and safety operation,
01:19:22 and the failure to properly staff that is negligent or even reckless. Amending, reforming
01:19:31 230 and the liability protections here is essential.
01:19:36 Thank you for your testimony and your answers to my questions. My time has expired.
01:19:40 I yield back.
01:19:40 The gentleman's time has expired and he yields back. The chair now recognizes the gentleman
01:19:44 from Michigan's 5th District for five minutes for questions.
01:19:47 Thank you, Mr. Chairman, and thanks to the panel for being here.
01:19:50 Section 230 has allowed the internet ecosystem to grow and thrive in the United States, for sure,
01:19:56 but after three decades, it's time that we evaluate whether the current model
01:20:01 is still benefiting our constituents and businesses. As technology has evolved,
01:20:06 the problems consumers face online have evolved and expanded, and we've talked about that today.
01:20:14 Illegal activity and harmful content seem to be rampant on big tech platforms, especially
01:20:20 impacting the mental health, safety, and security of our children and teens.
01:20:25 We've also heard many stories from our constituents who have had their content censored
01:20:31 or taken down or flagged without real reason or recourse, and that's a problem.
01:20:36 But we can't throw the baby out with the bathwater. We need to create an environment
01:20:41 that allows the U.S. to continue leading in innovation, one that works for consumers and
01:20:47 businesses alike and protects children and increases the freedom of expression online,
01:20:52 all of that being a good thing.
01:20:56 Ms. Tummarello, in your testimony, you discuss why sunsetting Section 230 is the wrong approach.
01:21:03 I also have some of the same concerns. In your opinion, what is the right approach, and do you
01:21:11 think the status quo is sustainable, especially when we see so many harms occurring on these
01:21:16 platforms? Thank you for the question, Congressman. Obviously, Congress is very interested in talking
01:21:22 about an alternative framework, so clearly the status quo isn't working for the members of
01:21:26 Congress who get to write the laws. I think any time we're talking about 230 reform, sunset is
01:21:33 kind of a very blunt tool and appreciated as a negotiating tactic. And it's supposed to be.
01:21:38 Yeah, yes, but as a law, it's quite a way to make a change. I think we would love to see Congress
01:21:44 engage in kind of nuanced conversation that starts with an understanding of the way that the
01:21:50 internet and content moderation online works for all platforms. A lot of the conversation today is
01:21:54 focused on big tech, which is understandable. They have a wide reach, but Engine works with
01:21:59 thousands of startups across the country and every district represented here, and those are the
01:22:03 companies that really need 230. So until we can start kind of from a place where that's the
01:22:08 constituency we're worried about protecting, I worry that a sunset brings people to the table,
01:22:14 but not in time to get something done before the end of next year.
01:22:18 Okay, well, time will tell, I guess, with the sunset, but hopefully the table will be full of
01:22:25 people aggressively trying to work toward the solution. Mr. Berkman, I appreciate you mentioning my bill,
01:22:31 COPPA 2.0. As you note in your testimony today, Section 230 reforms would obviously address content,
01:22:38 but to fully safeguard young people online, why is it also important to increase privacy
01:22:44 protection specifically for children and teens?
01:22:47 Yeah, well, first of all, I think 230 reform, as we've talked about, the jurisprudence has gone
01:22:53 far beyond what I think 230 originally intended. So that is design features, that is marketing to
01:23:01 children through content, and other harms that are happening through social media that
01:23:07 weren't within the original intent of just posting on an old school website. Privacy is particularly
01:23:15 concerning because of the amount of information that children are sharing, including significant
01:23:21 information, and that can be used for a range of harmful purposes, from manipulative advertising
01:23:29 to exploitation and extortion. So we really support COPPA 2.0, especially the eraser part of
01:23:39 that and the ability to delete existing information that young people are posting and then regretting
01:23:46 and being harmed by later on in life. Yeah, and parental involvement as well.
01:23:53 Yes. Thank you. Ms. Goldberg,
01:23:56 as I said in my opening, technology is always changing, and new challenges come with it. It's
01:24:02 happening faster all the time. During our last Section 230 hearing, we heard from three professors
01:24:08 that Section 230 should not apply to generative AI. Do you agree? And why?
01:24:14 I do agree that Section 230 should not apply, and if we read it the way it was intended,
01:24:20 generative AI is generated by the platforms, and Section 230 was not intended to immunize
01:24:29 platforms for their own content. I'll have to ask further questions about
01:24:37 chatbots and ChatGPT specifically. We'll submit that for the record. I yield back.
01:24:47 The gentleman from Michigan yields, and now I recognize Representative Fletcher from Texas 7.
01:24:55 Five minutes. Thank you so much, Mr. Chairman, and thanks to Chairman Latta and Ranking Member
01:25:00 Matsui for holding today's hearing and for our witnesses for testifying today.
01:25:04 I just want to start my questions with a response to Ms. Tummarello. Thank you for sharing your
01:25:14 perspective and your personal story about your miscarriage and the importance of access to
01:25:19 information and support. I'm very sorry for your loss. There is certainly a need for access to
01:25:27 medically accurate, real-time information about pregnancy, pregnancy loss, and reproductive
01:25:34 health care more broadly. I would submit, however, for this committee's consideration that the answer
01:25:40 lies not in Section 230, but in passing the Women's Health Protection Act, which has been
01:25:45 referred to this committee for consideration so that women in the United States can have access
01:25:50 to the full range of reproductive health care and accurate information about it. I appreciate your
01:25:57 response to Ms. Matsui's question, but I disagree that people aren't trying to hurt pregnant women
01:26:03 and women experiencing pregnancy loss. That is exactly what legislators in my home state of Texas
01:26:09 and others are doing, where extreme legislators are criminalizing pregnancy. They are preventing
01:26:17 access to medically necessary miscarriage management and access to medications like
01:26:23 mifepristone that are used in miscarriage management. When women who are having miscarriages
01:26:28 are going to emergency rooms and being told to wait outside, they are being told to come back
01:26:35 when they're sicker, when they have sepsis, when they are on the verge of death. So,
01:26:40 the other thing we see is that they're even empowering random strangers, giving them
01:26:46 standing to sue people, giving random strangers standing to sue people who may have been pregnant
01:26:54 and anyone who helps them, anyone including their doctors in states where abortion is illegal like
01:27:01 mine. So, your testimony that over the last couple of years you have seen the fear in these discussion
01:27:07 groups, I think is incredibly powerful and really important for this committee to understand and I
01:27:13 thank you for sharing it. In my view, the answer does not lie in Section 230, however. It lies in
01:27:18 protecting the health, dignity, and freedom of all women in the United States and we do that by
01:27:26 passing the Women's Health Protection Act and I hope this committee will take that up. It is in
01:27:30 our jurisdiction. The last Congress passed it twice and it's time that we do it again.
01:27:35 With the time I have left, I do want to focus on some of the Section 230 issues and Ms. Goldberg,
01:27:41 I really want to direct my question to you and give you the rest of my time to answer it around
01:27:47 some of the litigation questions because I too am a lawyer. I understand very clearly what you're
01:27:52 talking about when you talk about some of the procedural challenges, but I'm hoping you can
01:27:57 explain it for the record and for those watching because like you, I absolutely fundamentally
01:28:02 believe that our legal system and our ability to seek justice in the courts and accountability in
01:28:09 the courts is essential to the functioning of our society and I would appreciate it if you could take
01:28:14 the time that I have left, about two minutes, and explain very generally how Section 230 operates
01:28:20 today as an immunity from suit as opposed to say a defense or an affirmative defense and how that
01:28:28 impacts the discovery process and other things, what it prevents you from being able to do that
01:28:33 you might expect in another kind of case. I think that would be really helpful.
01:28:36 Thank you and I couldn't agree more with you about the Women's Health Protection Act.
01:28:43 Section 230 is not sexy. It's a procedural act that basically defendant corporations use
01:28:51 at the earliest stage possible in a motion to dismiss. We file a pleading telling a product
01:28:57 exactly how our client was injured through its features and then they file a motion saying,
01:29:03 "We're just a forum. You're suing us for speech," and then a judge decides it. What happens is that
01:29:08 the cases get thrown out at the earliest stage without the opportunity for discovery. So we never
01:29:15 know exactly how much notice the platform had about the harm, how many other similar incidents there
01:29:21 were. They don't have to give up any information and that's what's so fundamental to our civil
01:29:26 justice system is that it's all about sharing and exposing the information of bad acts. So these
01:29:34 companies really get to continue to hurt people in the same way and really benefit from this
01:29:42 informational imbalance where they know how much they're hurting people and how many similar
01:29:49 incidents there are, but victims have no idea that there have been a thousand people that purchased
01:29:56 the same suicide product before them. Well, thank you so much. I'm running out of time,
01:30:02 so I thank you for your answer and explaining how this is used and I think it's something for us to
01:30:08 consider as we look to the Section 230 reform. So with that, thank you and I yield back.
01:30:12 The gentlelady yields back and I recognize myself for five minutes of questions. I believe all my
01:30:18 colleagues on this committee agree we want the internet to remain a relatively free and open
01:30:23 place. Since 1996, Section 230 has operated under a light touch regulatory framework allowing
01:30:31 companies and online providers to moderate content under a liability shield. Today,
01:30:38 our internet and its regulatory framework is under attack. The American public gets very little
01:30:43 insight into decision-making processes when content is moderated and users have little recourse when
01:30:50 they're censored or restricted. Recently, Americans experienced a great deal of online policing from
01:30:57 big tech during the last presidential election. And for example, users saw platforms like Twitter
01:31:02 and Facebook immediately cut stories from being shared or talked about by the users on their
01:31:08 platforms at the request of our own government. It's Congress's job to ensure that big tech
01:31:14 companies are not obstructing the flow of information to benefit a political agenda
01:31:19 and ensure a free and competitive news market. It's our job to promote transparency and truth.
01:31:26 As a member of the Select Committee on China and the Speaker's AI Task Force, I have major
01:31:31 concerns with the risks our internet ecosystem faces from the Chinese Communist Party and other
01:31:37 adversarial nations as well. Our younger generation has never been more targeted by foreign propaganda,
01:31:44 illicit online activity, misinformation, and mental health harms than they are right now
01:31:50 without critical reforms to Section 230. Mr. Berkman, I was recently at a conference where
01:31:56 some major players in the generative AI space were speaking. They were all very hesitant to discuss
01:32:03 what data their algorithms were trained on, but they were very clear that they didn't want to
01:32:08 be held liable for the output of those algorithms. If we clarify that Section 230 protections do not
01:32:15 apply to generative AI outputs, would that incentivize these platforms to invest in
01:32:21 higher quality data for developing AI, and perhaps be more transparent?
01:32:25 Yes. Again, the immunity provisions right now that have been in place since 1996
01:32:35 create a severe imbalance in the business decision-making that every other industry
01:32:44 is subject to. So that balance between profit and safety. And there is a danger in AI now from our
01:32:52 perspective. The dangers you discussed, we've seen AI on social media platforms recommending
01:32:58 to 13-year-olds that they should engage in adult relationships, really concerning coming out of AI.
01:33:07 Clearly dangerous stuff. I agree. Thank you for that testimony. Ms. Goldberg,
01:33:12 in keeping with this AI topic, how do you think that holding generative AI firms liable for the
01:33:17 outputs would affect their behavior? Well, I think that the pressure of litigation would motivate
01:33:26 anybody who's in the business of generative AI to be developing safer products and to be considering
01:33:32 the predictable ways that they could harm. I agree with you. Thank you. Ms. Tummarello,
01:33:38 I believe Section 230 currently also makes it possible for new competitors to enter the market
01:33:45 and attract investment. You have some first-hand experience working with thousands of startups
01:33:50 across the country, and you understand the importance of Section 230 for these companies.
01:33:54 Can you tell me what's going to happen to the small guys if Section 230 sunsets?
01:33:59 Yeah, absolutely. Thank you for the question, Congressman. We are truly concerned that small
01:34:04 platforms will not just have to deal with litigation as they currently stand, but it will
01:34:09 be much harder to launch a small platform. We've heard investors say to us, and we have some data
01:34:14 on this that I'm happy to follow up with, that when they're looking at investing in companies,
01:34:18 they invest in companies and startups where the money will go to the product. The money will go
01:34:22 to adding user value. If the money has to go to a legal defense fund, if the money has to go to a
01:34:26 defense attorney, they're not interested in investing. And they've cited current intermediary
01:34:31 liability frameworks as something that gives them confidence to invest in startups that host user
01:34:35 content. So we're really concerned that not only will the startups we know today have a tougher
01:34:39 time existing, we're really very concerned about the next generation of startups that host user
01:34:43 content that will have trouble getting off the ground. So we should get in there and do something
01:34:46 about 230, I guess. So, Mr. Berkman, I appreciate the attention you've placed on social media harm
01:34:52 towards youth throughout this testimony. You lay out in your testimony that sunsetting Section 230 is the
01:34:59 correct response. Can you explain how it will not unintentionally also silence, you know, the free
01:35:06 speech on the internet? And even controversial. You know, so there's controversial that should be
01:35:13 allowed and some that shouldn't. I'm sorry, could you repeat that last? Yeah, so I just wondered,
01:35:18 you know, how does it affect the free speech online? We have hundreds of years of established
01:35:28 tort law jurisprudence. And what we're talking about here is negligent, gross negligent, and
01:35:35 reckless business decisions on the part of the social media platforms. And so we believe the
01:35:45 fears are really overblown in terms of impact on free speech that is happening over social media.
01:35:53 Well, thank you very much for that. Our time has exceeded and I will yield back and
01:35:57 we will call on Ms. Dingell, please, for a question.
01:36:03 Thank you, Mr. Chairman. And I want to thank the committee for holding this legislative hearing
01:36:08 today to discuss how Congress can properly address the harms present on the internet today.
01:36:13 And thank you to the witnesses for testifying. Every day we are impacted by the decisions that
01:36:19 tech companies make in deciding what content on their platform should be promoted, recommended,
01:36:25 monetized, and more. Whether it's cyber bullying, mental health issues, explicit threats, or the
01:36:32 spread of false information, a platform's business decisions can cause tangible harm.
01:36:39 Currently, Section 230 of the Communications Decency Act essentially provides tech companies
01:36:46 with a legal safe harbor for all user content on their platforms. Courts have interpreted Section
01:36:52 230 to grant tech companies broad immunity, allowing them to evade accountability for what
01:36:58 occurs on their platforms. Congress does need to reexamine Section 230 and that's what we're doing
01:37:04 here today. The internet has changed dramatically over the past several decades, yet Section 230
01:37:11 has remained virtually unchanged for nearly 30 years, except for a 2018 law that exempted sex
01:37:19 trafficking content from 230's reach. Section 230 deserves real scrutiny and we must strike a
01:37:26 balance. Preserving free expression while ensuring companies and platforms are accountable to their
01:37:32 users, especially vulnerable populations like our children, for the decisions they make. From how
01:37:40 they design their platforms to how they monetize the content on them. We need to hold tech companies
01:37:46 accountable when they fall short. So let me start with Ms. Goldberg. Yes or no, are companies
01:37:54 currently incentivized to promote provocative, even potentially harmful content to increase user
01:38:00 engagement? Absolutely. Harmful content is hugely lucrative. It increases engagement and time on the
01:38:11 app and then these companies can sell more ads. Oftentimes they sell ads specific to the worst
01:38:17 content. So then, without Section 230's liability protection, would these companies be incentivized
01:38:25 to do so? Yes, the removal of Section 230 would incentivize companies to make sure that people
01:38:31 don't get injured because those injured people could hold them liable. We want that. Okay,
01:38:38 Mr. Berkman, should tech companies be held accountable for the content they promote,
01:38:42 recommend, or amplify on their platforms that causes offline harm? In cases where you can show
01:38:52 negligence or gross negligence. And again, amplification is a business decision on the
01:39:00 part of the platform. It is not a level playing field, virtual town hall amongst its users.
01:39:09 The platform has decided to amplify harmful content for revenue. So, Mr. Berkman, how then
01:39:17 does Congress hold these companies accountable and also incentivize companies to implement
01:39:23 responsible algorithms and platform designs? The range of comprehensive reforms that is being
01:39:30 considered by this committee is a major essential step, first of all. Significant reform, reworking
01:39:39 of Section 230 as we're discussing today is a big piece of that. And again, I'll come back to
01:39:46 Sammy's Law, where that is essential protection for the family in the home when the algorithms fail.
01:39:54 So, Ms. Goldberg, I'm not going to have much more time. So, could you provide further insight into
01:39:59 the liability gaps that have emerged around Section 230 and how Congress can address them? Yes. I mean,
01:40:07 the main liability gap is that these companies say that everything is content. And that includes,
01:40:13 you know, dating apps. It includes algorithms, situations where these platforms are incredibly
01:40:21 complex with geofencing, generative AI, data harvesting, photography. And yet, you know,
01:40:29 they say that they're just a forum for speech, that they're not products, they're services.
01:40:34 We have companies that say that their own terms of service and the contracts don't apply to them
01:40:40 because users don't rely on them. They look for every single way to get out of court that they
01:40:46 can. The liability gaps are what we're all just, I mean, it's a hole that we're all falling into.
01:40:51 Thank you. I'm out of time, but this is a very important hearing and we must
01:40:56 update our current law. Thank you, Mr. Chairman, and I yield back.
01:41:00 Thank you very much. The gentlelady yields back and we recognize Mr. Carter of Georgia
01:41:05 for five minutes. Thank you, Mr. Chairman. Thank all of you for being here. We really
01:41:09 appreciate it. Mr. Berkman, I want to thank you especially for your help with my office
01:41:15 with Sammy's Law. I'm the lead sponsor on that legislation and it's good legislation and we
01:41:22 couldn't have gotten to the point we're at now without your help and your office's help. So,
01:41:26 thank you for that and I hope we see that move through the committee process. It needs to,
01:41:31 very much so. You know, you've heard it said so many times here before already, you know,
01:41:36 30 years ago, where were we at and how much we've evolved over that, the internet's evolved over
01:41:42 that. I would suggest to you that the internet's one of the greatest inventions of all time and
01:41:48 we've witnessed that, but we know we need to do something. We know that we've got to address this
01:41:56 situation as it exists now because 230, 30 years ago is not as relevant now as it was then and
01:42:04 practically not relevant at all. I've got a vested interest in this. I'm a father of three,
01:42:09 a grandfather of seven with the eighth one on the way. I want to make sure we get this right. I want
01:42:14 to make sure we get it right for my grandchildren and part of the problem and part of the blame,
01:42:19 I believe, is with section 230 because it's kind of set a free-for-all on the internet.
01:42:25 Now, don't get me wrong, I don't want to stifle innovation. That's why this is such, you know,
01:42:32 this is not easy and this is important. We need to get it right because we don't want to stifle
01:42:38 innovation. We need to continue to have innovation, but at the same time, there's got to be a sweet
01:42:44 spot there and we got to find it. Ms. Tummarello, I want to start with you. You talk about the
01:42:51 content moderation or lack thereof that could happen if section 230 were sunset. Either companies
01:42:58 may leave up most of the content or they may over moderate the content. Why don't companies know? I
01:43:05 mean, surely they can guesstimate what they're going to do. They ought to be able to know what
01:43:12 they're going to do. Thank you for the question, Congressman. I think this gets back to 230 as
01:43:18 kind of a legal shortcut to the inevitable legal conclusion, which is that in the vast majority of
01:43:22 cases, startups especially, but all internet platforms don't have knowledge about every piece
01:43:28 of content every user shares. And while, yes, 230 is almost 30 years old, the internet has only
01:43:34 grown in scale and scope since then. And so there's even more content to try to figure out.
01:43:37 Courts would likely at the end of the day hold a lot of, kind of in the cases we're thinking about,
01:43:43 they would often hold platforms as liable as they would hold bookstores who don't have to
01:43:47 be responsible for every page and every book on their shelves. And so to avoid having distributor
01:43:54 liability, platforms might bury their head in the sands and say, we don't have knowledge of this.
01:43:58 We didn't know about this harmful content. We're not looking for it. We're not finding it. We don't
01:44:01 know about it. We can't be held liable in court. I think that cuts against a lot of what this
01:44:05 committee is looking to do in creating a safer, healthier internet. And so that's one of the
01:44:08 consequences of sunsetting section 230. To your point, the other side is some platforms who are
01:44:13 really invested in getting this right might over remove and we might see lawful productive
01:44:17 expression taken down online. And that's also a concern. Okay. Mr. Berkman, in your testimony,
01:44:21 you directly quote several social media executives acknowledging the harms caused by their platforms.
01:44:28 And we frequently hear that platforms don't have the capacity or the manpower to effectively protect
01:44:34 the users, their users. What's it going to take? What's it going to take for the platforms to rise
01:44:40 to the occasion? Yeah. Well, first of all, thank you so much for your leadership on Sammy's Law,
01:44:46 along with the other bipartisan co-leads. And that therein lies the answer is Congress needs
01:44:54 to act and force them to because we are seeing the prioritization of profits over safety.
01:45:01 And when we're talking about innovation, one point I want to make here that's really critical is
01:45:08 if I want to go innovate and create a new type of car there, I am subject to normal tort law.
01:45:14 If my business isn't capitalized sufficiently and I can't afford to put in seat belts
01:45:20 and a car seat holder, I can't make that car and I can't sell it. And so if companies are going on
01:45:26 the market and they're not able to be safe for consumers, especially children, that is an issue.
01:45:33 And that's why Congress needs to really rework the regulatory framework here. Good. I've only
01:45:39 got 30 seconds left. Ms. Goldberg, real quickly, Section C2 shields the platforms from liability
01:45:45 for material that is considered to be obscene, lewd, filthy, excessively violent,
01:45:50 and otherwise objectionable. Do you think otherwise objectionable is too broad?
01:45:55 Well, can you give me any examples how this has been used in court, otherwise objectionable?
01:46:03 I mean, the thing is that C2 is basically canceled out by C1
01:46:08 because everything in C2 is content and C1 says that you can't sue a platform for content.
01:46:13 So it's for decoration. You know, I remember Justice Stewart, I believe it was him who said,
01:46:23 "Don't ask me to define pornography, but I know it when I see it." I mean, otherwise objectionable.
01:46:29 Anyway, I'm out of time. And thank you, Mr. Chairman. And I yield back. Thank you all
01:46:35 again for being here. I think the gentleman yields back. And now we recognize the gentlelady
01:46:39 from New Hampshire for at least five minutes. Thank you very much, Mr. Chairman. I want to
01:46:45 thank our subcommittee leader, Chair Latta, and Ranking Member Matsui for holding this very,
01:46:50 very important hearing about protecting our families. As the founder and co-chair of the
01:46:55 Bipartisan Task Force to End Sexual Violence, I'm particularly concerned about reports of
01:47:02 online dating apps being used to commit sexual assaults and how Section 230 has prevented these
01:47:08 survivors from seeking justice. I recognize that Section 230 is the bedrock of our modern-day
01:47:14 internet, but Congress has the responsibility, and I think you're hearing this in strong bipartisan
01:47:21 terms from this committee, to ensure that these legal protections function as Congress
01:47:27 originally intended. We did not intend a wide-open, Wild West internet. The protections that Section
01:47:35 230 provides online platforms should not extend to bad actors and particularly online predators.
01:47:43 Mr. Berkman, I know you're a strong supporter of ending Section 230. If Section 230 sunsets,
01:47:49 how do you see this change benefiting children and young people as they navigate the online world?
01:47:57 Thank you, Representative. I appreciate the question. Again, we're seeing harms exploding on
01:48:03 social media that are significantly impacting children. You mentioned sexual predation. The FBI
01:48:09 has reported a significant increase in sextortion. Online enticement of minors has
01:48:16 increased over 300 percent, according to NCMEC. And that's just the tip of the
01:48:24 iceberg of a range of these harms. Now, reworking Section 230 so that there is liability on clear,
01:48:33 bad-faith business decisions on the part of the social media platforms, which is happening across
01:48:38 the industry today, is essential to protect American families, particularly our children.
01:48:44 Thank you so much, and I agree. We need to take action to protect children online, as well as
01:48:52 young people and adults. It's a really painful part of our society right now, and we owe that
01:48:58 obligation. Ms. Goldberg, in your testimony, you mentioned that Section 230 has grown to be a near
01:49:04 absolute liability shield for tech companies. Will sunsetting Section 230 better protect the
01:49:10 American public? Sunsetting Section 230 will better protect the American public. I mean, just,
01:49:16 you know, I thank you so much for your work on sexual violence, and some of the worst actors that
01:49:22 we see at my firm are the dating apps, you know, and they're still saying that they're entitled to
01:49:27 immunity because they're just passive publishing forums. Since when is an app that matches people,
01:49:36 you know, that advertises on TikTok and Instagram to children, that geolocates people,
01:49:43 organizes data, promotes algorithms, sells ads? Since when are those just passive, you know,
01:49:50 platforms? And what's interesting is that we now have, you know, like, Section 230 was for
01:49:56 CompuServe and AOL and these obvious publication forums, not for real life encounters like the
01:50:06 dating apps are creating. So, this is for either one of you. As we work to place more
01:50:13 accountability on internet platforms so that they better protect individuals, including children and
01:50:19 our families, and promote safe spaces online, what recommendations do you have for Congress
01:50:24 to ensure that these platforms best serve the American public? I'm happy to answer because
01:50:32 I have some ideas. I think the most important thing is we have to look at the wrongdoers.
01:50:38 So, you know, there's companies that are just in the business to be malicious.
01:50:43 And then there's companies- Evil intent.
01:50:46 That just have ill intent. You know, companies that just create deep, you know, that their whole
01:50:51 product is creating deepfakes or that, you know, entire websites promoting suicide. And then there's
01:50:58 companies that know that bad actions are happening, like dating apps that accommodate serial sex
01:51:05 predators and just don't care. And so, you know, we have to like kind of be looking at the level
01:51:12 of culpability in these cases. And I would jump in there too and speak specifically about the
01:51:20 smaller businesses, the startups that we're talking about that are not sufficiently capitalized
01:51:26 to protect especially child users. So, the mantra in Silicon Valley is move fast and break things.
01:51:32 And unfortunately, those things are children. And so, we have an app like YOLO, a small app out
01:51:39 there that allowed anonymous chatting amongst teens on platforms like Snapchat. Carson Bride,
01:51:47 16, was cyberbullied over that app and died by suicide. And that's the consequence. And it's
01:51:54 very clear that that app had no concern with safety. Thank you. I really am so grateful for
01:52:03 this hearing and for the action that this committee will take. And I yield back.
01:52:08 The gentlelady yields back. The Chair now recognizes the gentleman from Texas, Mr. Pfluger, for five
01:52:11 minutes. Thank you, Mr. Chairman. I agree. It's a good hearing. And I'll kind of, Ms. Goldberg,
01:52:16 pick up where you just left off there. I know that there was a motion to dismiss in the case
01:52:21 that you were working, Neville et al. versus Snap. And that was defeated in the Los Angeles
01:52:29 court. If you remember, this committee about a year ago had a roundtable. And we heard from
01:52:35 Amy Neville about the tragic poisoning and death of her son. And what I want to
01:52:42 talk with you about is you've also litigated the Omegle case. I hope I'm saying that correctly.
01:52:48 It ultimately led to the website being shut down for connecting children with sexual predators.
01:52:51 So in your first-hand experience, how have big tech companies hidden behind or abused Section
01:52:56 230 to protect them from the liability regarding either drug sales, child sex abuse material,
01:53:02 human trafficking, any of those things? Tech companies just say that it's all content,
01:53:08 that it's always the user's fault. If it's a child who was injured, then they blame it on
01:53:12 the parents for not supervising. Even in cases like Snap, where the product itself prevents
01:53:20 parents from overseeing the content that their kids use. And I will correct you that we actually
01:53:25 won the motion to dismiss against Snap. And we are able to move forward to show liability and
01:53:33 that their product was matching children with the drug dealers that hurt them.
01:53:40 Congratulations. Thank you. Thank you. They're appealing. They're appealing, of course.
01:53:45 Yeah, of course they will. I'll go to Ms. Tummarello. When it comes to the sunsetting of 230,
01:53:51 I mean, just maybe talk me through what a small company should be prepared for timeline-wise. I
01:53:56 mean, how long would they need to prepare for a change like that? What are some of the implications
01:54:00 that you see? Thank you for the question, Congressman. So like I said in my testimony,
01:54:04 we know that startups already invest proportionally more in content moderation. They absolutely have
01:54:09 to if they want to see user growth. The startups in Engine's network are all trying
01:54:14 to be responsible, good actors. Content moderation is incredibly time-consuming and expensive.
01:54:20 A startup scale, they have to start hiring. I mean, Facebook talks about the tens of thousands
01:54:24 of content moderators they have. Google very famously has spent a hundred million dollars and more on
01:54:30 their copyright detection software on YouTube. So these are all things that are realistically out
01:54:36 of reach for startups. I think if we were to see Congress pass a sunset, startups would spend
01:54:42 the next 18 months frantically trying to make sure they knew what every user was doing. It might mean
01:54:46 hosting less content, which again, hosting less bad content would be good for the ecosystem, but
01:54:51 hosting less content overall would be bad for startups and bad for their users.
01:54:54 Thank you for that. You know, I think the goal of a hearing like this is to find that balance
01:55:00 between safety and security to understand the implications and appreciate all of the inputs
01:55:06 here. In the last hearing that we had, I talked with Dr. Stenger about how our adversaries may
01:55:11 exploit Section 230 to conduct influence campaigns, to recruit and promote terrorism, to sell illicit
01:55:17 drugs, a myriad of other things. So I'll go back to you, Ms. Goldberg. I'm sorry, Mr. Berkman.
01:55:21 But Dr. Stenger argued that by removing C1, that the immunity shield companies would behave
01:55:28 differently and would thus benefit our national security in these aspects. In your opinion,
01:55:34 what effect would this have both on national security and on the freedom of speech of
01:55:39 platforms? Well, we would still have situations where, you know, if somebody's injured because
01:55:45 of malicious actors infiltrating and impersonating other people on the platforms, then the injured
01:55:53 people without Section 230 would be able to actually hold the platforms liable and there
01:55:57 would be pressure on the platforms to not let horrible things happen there.
01:56:04 Mr. Berkman, I will go to you. I mean, what is the, how do you describe the environment right now?
01:56:09 Some would say this is the Wild West. Some would say that companies are operating with impunity.
01:56:15 You know, I mean, how do you describe the environment right now? Unmitigated catastrophe.
01:56:21 Okay. Particularly when it comes to children. The harms that we're seeing, so we work
01:56:29 in K through 12 schools across the United States and what we're seeing on the ground level. So we
01:56:36 went over the stats and the testimony in terms of the harms. What we're seeing on the ground level
01:56:41 is horrifying. We had a fifth grader come up to us the other week who was considering suicide
01:56:48 because everyone on a social media platform was telling her to kill herself. We had another child
01:56:54 addicted to watching execution style videos on social media. You see algorithms being specifically
01:56:59 directed at these children, at users with content that they, you know, haven't seen before, but
01:57:06 it is obviously very worrisome. That particular example was not an algorithm. It was a platform
01:57:12 that allows grouping in servers. And so it was a lack of sufficient safety protocol on that one.
01:57:19 Algorithms are certainly causing a lot of other issues, but it's not the only cause of the massive
01:57:25 amount of harm. Thank you. I thank the witnesses for being here. Yield back.
01:57:29 Gentleman yields back. Chair now recognizes Ms. Kelly of Illinois for at least five minutes,
01:57:39 if she can find her microphone. Thank you so much and thanks for holding this hearing. As I said
01:57:45 last month, and we had the last hearing on section 230 of the Communications Decency Act of 1996,
01:57:52 I'm glad that Democrats and Republicans agree that Congress should be reevaluating section 230,
01:57:58 given the growth of the internet and its changing landscapes. However, as with everything we do,
01:58:04 the details matter, especially in this context. And I apologize if I ask something because I'm in
01:58:09 two different hearings. So, Mr. Berkman, as I'm sure you know, May is Mental Health Awareness
01:58:14 Month. So I was particularly moved by your written testimony section regarding how social media is
01:58:20 harming children and causing negative mental health outcomes. May you briefly discuss some of
01:58:25 these harms and if some of the safety initiatives or measures that social media platforms have
01:58:30 launched have really addressed these harms? Yes, I appreciate the question, Congresswoman.
01:58:37 We really would recommend that members read the Surgeon General's Advisory from May 2023
01:58:45 on the impact that social media is having on adolescent mental health. It is a really
01:58:53 exceptional overview of the various studies that are out there showing significant correlations.
01:59:00 And just summarizing that a bit, there are studies showing significant correlations
01:59:06 with social media use, particularly excessive social media use, so three or five plus hours a
01:59:13 day with anxiety, depression, self-harm, including suicide. There are correlations between social
01:59:21 media use, adolescent youth and eating disorders, ADHD, substance use, substance abuse. The list is
01:59:32 very long. Educational impacts, negative educational impacts as well. So if there is a negative mental
01:59:40 health outcome out there for adolescents, there's almost certainly a good amount of research showing
01:59:45 a correlation between social media use and that outcome. So in terms of what the platforms have
01:59:50 done, we would say not a lot. We would say not a lot. A few have put in some level of time
01:59:59 restriction because our theories as to why social media use is causing that mental health impact
02:00:04 is dependent upon time of use. The restrictions that the platforms tend to put in benefit the
02:00:11 platforms. It's still a significant amount of time. Content moderation in our view has not
02:00:18 changed a lot. We're looking from the outside, so they don't put out a lot of information for us to
02:00:23 study on that. They have not changed much in terms of the features that the platforms employ that we
02:00:31 believe are impacting mental health negatively. That's because those features drive engagement
02:00:36 and revenue. Thank you so much for your response. I feel like a lot of attention
02:00:43 around calls to reform section 230 stem from perceived abuses by big tech and large social
02:00:49 media platforms not doing enough to moderate user speech that some find offensive or even dangerous
02:00:56 or actions of platforms themselves like to amplify or monetize such speech. While I share general
02:01:03 concern that the largest social media companies could likely do more to police their platforms,
02:01:08 I expect internet companies large and small engage in conduct that harms users. After all,
02:01:14 it is sometimes the smallest fringe platforms that can cause harm disproportionate to their size.
02:01:20 Is it Ms. Tummarello? Okay. I would like you to explain what specific components of Section 230
02:01:28 benefit the thousands of startups and small businesses your organization works with across
02:01:34 the country? How do we ensure that bad actors don't benefit from those protections? To be clear,
02:01:41 I ask because, should we set a deadline for Section 230 to sunset, I'm interested in which
02:01:46 components ought to be maintained in any new
02:01:51 Section 230 proposal? Thank you for the question. To be clear, Engine is not advocating for a small
02:01:58 business carve out. We definitely agree that some of the smallest players can do some of the most
02:02:01 harm. And so that's when we're thinking about startups in 230, we're not asking for special
02:02:06 protections for startups. We're asking for an ecosystem that kind of works for everyone,
02:02:10 including startups. And I think the core piece of section 230 that is both controversial, but also,
02:02:15 I would argue necessary, is the liability limitations for lawsuits and to the points
02:02:21 about tort law, even threats of lawsuits that protect internet platforms that host user content,
02:02:28 even if a startup were to get sued and win in court, or use their insurance to reach a
02:02:35 settlement or you name it, it will always be the fastest, cheapest option to settle and settle
02:02:41 usually means removing the speech. And so really, section 230 is the thing that lets users speak
02:02:46 online and it lets platforms create places where they can speak. Thank you so much. And I yield
02:02:50 back. The lady yields back. Chair recognizes Mr. Allen for five minutes. Thank you, Mr. Chairman.
02:02:57 And I want to thank our witnesses for being here today. Talk about this important subject that
02:03:03 discussion has been going on for some time. Certainly our Constitution guarantees the right
02:03:10 to own property and the right to protect it. And we're the only nation that actually ensures that
02:03:16 in our Constitution. And so we're talking about, you know, this social media and those,
02:03:26 those kind of things that are using people's personal property to enrich themselves.
02:03:33 And so, Ms. Tummarello, in your testimony, you argued that sunsetting Section 230 risks leaving
02:03:41 internet platforms, especially those run by startups, open to substantial litigation,
02:03:48 which ultimately risks leaving internet users without places to gather online.
02:03:54 I hear your argument that startups need to focus on innovation rather than litigation. But on the
02:03:59 other hand, sympathetic to arguments that section 230 enables large internet platforms to amplify
02:04:07 and monetize harmful content. Should this committee consider sunsetting section 230 for large
02:04:16 internet platforms only? Thank you for the question, Congressman.
02:04:21 You know, it's not my job or interest to defend large tech companies. And so I won't speak for
02:04:26 kind of what a sunset would mean for them or from an engine perspective. We certainly have
02:04:30 startups that use large platforms to reach consumers and users and their activity could
02:04:35 be impacted. But I will just note the personal story I told in my oral testimony was about how
02:04:40 I was able to rely on things like Facebook groups, private Facebook groups to access needed support
02:04:46 and information. And so I think from a user perspective, sunsetting section 230 only for
02:04:50 large companies runs the same risk as sunsetting 230 for everyone and that will harm user expression.
02:04:56 I'm interested to see the limits of Section 230 as it applies to internet innovations. For
02:05:03 example, we're seeing disruptors and innovators emerge in the search space, such as Perplexity,
02:05:09 which uses generative AI technologies to produce consolidated responses to search queries.
02:05:18 Ms. Tummarello, should AI-generated responses to users' search queries be protected?
02:05:23 That is an emerging field not only of technology, but also legal interpretation. And so I'm
02:05:29 hesitant to take a position. I definitely think there are compelling arguments that it shouldn't be
02:05:36 covered by 230. I will say generative AI tends to be talked about in the context of ChatGPT
02:05:42 and Midjourney and things that kind of create things that replace human creations. But we have
02:05:49 a lot of startups using generative AI for like chatbot responses for hotel guests that tell you
02:05:54 when the pool is open. And so I worry that any conversations about generative AI focused on some
02:05:59 of the edge cases would impact the whole ecosystem. Let me ask you about this. What about if sources
02:06:03 for components of the AI-generated response are provided so it is clear the response comprises
02:06:08 third-party content? Should Section 230 shield these outputs? I'm sorry, could you repeat the
02:06:15 question? What about if sources for components of the AI generated response are provided so that
02:06:21 it's clear the response comprises third party content, which is clear, should section 230
02:06:27 shield these outputs? I imagine policymakers want to incentivize transparency around AI. And so
02:06:32 certainly would want to incentivize platforms acknowledging and disclosing
02:06:38 when something is AI-created. Ms. Goldberg, in your written testimony, you noted that the financial
02:06:43 pressures typically imposed by consumer safety standards are almost nonexistent.
02:06:48 Consequently, online companies have no incentive to prevent injuries, intervene when harm is
02:06:54 underway, invest in infrastructures and staffing to moderate harm, or innovate for safer products.
02:07:01 Some have proposed that sunsetting section 230 would incentivize
02:07:05 internet firms to prioritize amplifying professional content like news content,
02:07:11 because they would have greater confidence that it is legally sound. News content must go through an
02:07:17 editorial process prior to publication, after all. Could this sunset be beneficial for consumers
02:07:24 since outputs might be safer? The sunset would definitely prioritize consumer safety.
02:07:31 You know, I think that as I listened to Ms. Tummarello talk about the pressures that
02:07:37 startups have, and the burden that they face if they have to spend a lot of money on content
02:07:44 moderation, I just have no sympathy for that. If you are in the business of creating
02:07:50 a product or a service that is supposed to attract lots and lots of people and you don't have an
02:07:56 infrastructure that can responsibly protect those people who need guardrails, then
02:08:02 that's where the innovation needs to be, right? Exactly. Well, I am out of time. I have additional
02:08:07 questions which I will submit to you for the record. And with that, Mr. Chairman, I yield back.
02:08:12 The gentleman yields back. The chair recognizes Ms. Clarke for five minutes.
02:08:17 Well, thank you very much, Mr. Chairman. And I thank our Ranking Member Matsui for holding
02:08:23 this very important hearing. I thank our expert witnesses for joining us today
02:08:29 to examine the proposal to sunset Section 230 of the Communications Decency Act. Originally,
02:08:37 we enacted Section 230 to regulate obscenity and indecency online when
02:08:47 the Internet was still in its infancy. Section 230 has transformed into an all encompassing shield
02:08:53 used to protect big tech firms from accountability for the harms caused by their platforms and
02:08:59 moderation policies. It's just so evident. This is due at least in part to overly broad
02:09:06 interpretations of the law by federal courts as well as flawed incentive structures.
02:09:11 Pairing this overly expansive interpretation of section 230 with most companies online
02:09:17 monetization policies, where engagement drives data collection, which in turn drives revenue
02:09:24 via ad dollars has resulted in an Internet ecosystem that incentivizes the creation and
02:09:32 rapid dissemination of increasingly outrageous or extreme content without considering its veracity
02:09:38 or potential for real harm. In short, for many big tech firms, we are the product. Collection of our
02:09:45 personal data is their big money maker and the promotion of harmful content, even disinformation,
02:09:51 has become part of the business model. This status quo cannot stand and the proliferation
02:09:57 of generative AI tools only underscores the urgency and the need to create new incentive
02:10:04 structures for big tech platform providers and how they operate. AI generated content can now be
02:10:11 created and spread across the globe in a matter of minutes. We cannot sit back and just hope that a
02:10:17 decades old regulatory regime is equipped to deal with the harms created by rapidly advancing
02:10:23 technology today and in the future. We must take care to ensure that any future legislation allows
02:10:29 for big tech to be held accountable for prioritizing profits over the well-being of the American public
02:10:35 and we must do so in a way that will not stifle innovation or place unrealistic regulatory hurdles
02:10:42 on new market entrants. So having said that, by now we are all likely at least somewhat familiar
02:10:49 with the public facing generative AI tools of today, like AI chat bots and content creation
02:10:57 tools. What may not be as well understood among the public is the role AI can play behind
02:11:04 the scenes in terms of things like engagement algorithms on social media and other automated
02:11:10 decision systems related to consumers' access to education, vocational training, employment,
02:11:16 essential utilities, financial services, health care, housing, and more. So my question is first
02:11:22 directed to Ms. Goldberg but to all of our witnesses and you're welcome to respond. Ms.
02:11:28 Goldberg, given your experience working with those harmed by online platforms, how has section 230
02:11:34 been applied thus far in the brave new world of generative AI and its ever-expanding list of new
02:11:41 use cases and is section 230 starting to lose relevance as more firms roll out new AI tools
02:11:49 likely not protected by section 230? I wish I had more time to answer those great questions.
02:11:55 But I think the advent of generative AI is going to elicit harms that we can't even comprehend
02:12:04 and already we have products like Snap that have AI that's aimed at children, chat bots that
02:12:11 that can, you know, induce children to share their deepest, darkest secrets, and we don't know how
02:12:16 they're going to use that, how, you know, they might be blackmailing kids with it, or inducing
02:12:22 them to commit suicide. Section 230 right now is going to be used by all these companies as a
02:12:30 reason to not be held liable. Do you want to respond? I have 44 seconds. I'll go quick. We're
02:12:40 really early in the rollout here and we've already seen extremely concerning examples like
02:12:48 Ms. Goldberg just mentioned, the Snapchat AI bot was sharing harmful content with children, and I
02:12:59 think this is the tip of the iceberg and I'm really appreciative that this committee is
02:13:05 prioritizing this concern. Did you want to respond? Yeah very quickly just want to add a lot
02:13:11 of the startups in our network are actually using AI to find and remove harmful content and to be
02:13:16 clear I don't think any startup in our network thinks that it's a burden to have to host and
02:13:19 moderate content in a way that helps users. Like I said several times they actually proportionally
02:13:23 invest more than larger companies. I think the concern where it would be a burden is if a startup
02:13:28 had to worry about perfectly moderating content every single time a user uploaded a photo, a
02:13:33 comment, a review, you name it in real time or risk giving rise to liability under a lawsuit.
02:13:38 Very well thank you Mr. Chairman I yield back. The gentlelady yields back. The chair recognizes
02:13:43 the gentleman from Idaho, Mr. Fulcher, for five minutes. Thank you, Mr. Chairman. To the panel,
02:13:47 thank you for being here and as some have said we bounce in and out of committees so forgive the
02:13:52 repeat if there is one, but I did have a chance to go through your testimony and hear a good part of
02:14:00 your responses so thank you for your participation. I've got a question for Ms. Goldberg.
02:14:03 When social media companies flag or remove content, is there any reporting requirement
02:14:12 whatsoever for that? Do they do that under current rules? Under current law, there's
02:14:18 no requirement that social media companies report to anybody what content they remove. Because,
02:14:24 you know, frankly, I suspect there's a fair amount of that that happens. So if that were to
02:14:29 happen, with your understanding of current rules, would a plaintiff be able to obtain information
02:14:35 from that platform, if they sued or by any other means, under the current rules?
02:14:43 Under current rules, for a plaintiff to sue there would have to be
02:14:46 something that they're suing about, a cause of action and a way that they've been harmed, and
02:14:52 then to get the information they would have to subpoena the platform. Which doesn't sound
02:14:58 like a simple or easy process. It's not simple or easy, and the biggest deterrent to
02:15:04 litigation is how expensive, cumbersome, and invasive litigation is for plaintiffs. Okay, so I'm,
02:15:12 admittedly, especially after hearing the testimony today, I am concerned over the degree
02:15:19 of protection social media companies have from liability. There are benefits to that
02:15:26 shield, I'm sure, but if 230 is sunsetted and there's not reform, do you believe that the,
02:15:38 well, what happens next, maybe that's a better way to ask it. If it's sunsetted under the current rules,
02:15:44 then what happens next with the current liability framework? Can the system handle that?
02:15:55 Absolutely, because the removal of Section 230 does not create liability. It just means that
02:16:01 somebody who's been terribly injured can plead and accuse a company of being responsible for
02:16:07 that injury. They still have to go to court they have to prove their case and that can take years.
02:16:12 There's not going to be some sort of mythical rush to the courthouse by millions of
02:16:21 people because I mean there has to be an injury. Thank you for that. I'm gonna go to Mr. Berkman
02:16:27 and just maybe get your input on this a little bit. I'm not exactly a
02:16:36 search engine hound but like a lot of people I'll purchase something online periodically and whatnot
02:16:44 and this may not be right up your area of expertise but my guess is you're gonna have a
02:16:49 take on it. It seems like in the last I don't know year all I have to do is think about something
02:16:57 and the next time I punch on a search engine I get an advertisement or something that's related to
02:17:03 that. Now maybe that's just me or maybe that's unique but I heavily suspect there's some AI
02:17:10 involved with whatever wherever I've been or whatever I've talked about or whatever I've looked
02:17:15 at. Do you see a relationship between the development of artificial intelligence and
02:17:24 the amount of data that's collected versus a world before artificial intelligence?
02:17:34 I appreciate the question. We actually get your preface question all the time: is social
02:17:43 media listening to me? How do they know? Why am I getting this ad for shoes when I was just talking
02:17:51 about shoes? We get that all the time. To be clear, the platforms all consistently say that
02:17:57 they're not eavesdropping, but then the answer is, if they're not eavesdropping, they're collecting
02:18:04 a significant amount of data on their users to advertise, and I don't know that AI has
02:18:14 increased that amount of data it might have always been there but it certainly has increased the
02:18:18 capability to analyze the data and refine the advertising and sometimes that refinement is
02:18:26 manipulative and dangerous. Thank you for that and I suspect you're absolutely right.
02:18:33 Mr. Chairman I yield back. The gentleman yields back. The chair recognizes the gentleman from
02:18:36 Texas for five minutes. Mr. Chairman thank you very much and I'm happy that we're really having
02:18:41 this hearing on legislation to sunset section 230 of the Communications Decency Act. This is an
02:18:48 opportunity to get Congress focused on how best to reform big tech's immunity over the next 18 months,
02:18:55 and I know that a lot of people that work in this particular area have been you know talking about
02:19:02 this and that it's a really big deal. One of the things that I do here in Congress is that I co-chair
02:19:08 and I founded the Congressional Voting Rights Caucus to help safeguard our right to vote and
02:19:14 as many of you may remember last Congress the House passed H.R. 1 the For the People Act and
02:19:20 this legislation contained provisions aimed at preventing deceptive practices in our federal
02:19:27 elections and ahead of the 2024 elections I've also spearheaded efforts alongside my colleagues
02:19:33 to hold big tech accountable and reports have recently revealed a concerning trend again
02:19:41 a reduction in the workforce dedicated to combating harmful content on social media platforms
02:19:48 while increasingly turning to artificial intelligence and other automated systems to
02:19:54 get the job done. And one of the things that worries me is that in this era that we live in,
02:20:01 when we're all worried about democracy, right now we're facing unprecedented threats from the
02:20:07 proliferation of harmful content that seems to manipulate and try to influence in a bad way
02:20:14 certain populations during the elections. And I wanted to ask Ms. Tummarello, with this backdrop
02:20:22 in mind and understanding that you are skeptical of efforts to reform Section 230 and don't
02:20:28 represent large social media companies what are the best levers that we have to
02:20:35 completely address the spread of voter suppression content online and are you aware of any startups
02:20:42 in the political participation space that are protecting voting rights? Thank you so much for
02:20:47 the question, Congressman, and to be clear, Engine is not opposed to efforts to reform 230.
02:20:51 We're specifically concerned about the proposal to sunset 230 absent an alternative framework. I think
02:20:56 we have a lot of startups in our network that are focused on civic engagement,
02:21:00 and those platforms, honestly more than anyone else in our network, desperately need Section 230.
02:21:05 There are platforms that encourage students, college students let's say, to analyze in real
02:21:11 time, with critical thinking skills, an article about a current event. You could easily imagine
02:21:16 an article about a current event saying something, you know, pertaining to a member of Congress, maybe,
02:21:21 and somebody commenting on that article, something unflattering to that member of Congress, maybe
02:21:25 something defamatory. That startup needs Section 230 to make sure they're not going to
02:21:29 be sued for the commenter's defamatory statement. I think that's kind of the startup solution that
02:21:36 we can think of broadly in our network: combating voting rights misinformation, but also
02:21:41 just encouraging civic engagement generally, right, is the ability for people to have these tough,
02:21:46 often controversial conversations, and Section 230 is what enables the startups in our network to
02:21:50 create places for those conversations online. Right, right, exactly. Ms. Goldberg, I know that
02:21:54 we've talked a lot about realigning the incentives of big tech companies to better serve the public.
02:21:59 Do you have any thoughts on why it's crucial for Congress to clarify that
02:22:05 Section 230 does not serve as a shield to federal and state civil rights claims, particularly in
02:22:11 instances of discrimination in areas like employment, lending, or even housing? Absolutely,
02:22:18 I think we can all agree that discriminating, you know, like for housing, employment, I mean, those
02:22:25 are not traditional publishing functions, and yet platforms do it all the time and they
02:22:32 still plead Section 230. Section 230 was never intended to be a shield for discrimination,
02:22:39 and so whatever reform we get, we absolutely need it to include language that recognizes
02:22:49 civil rights. Yeah, what would that kind of look like? And just, I know that
02:22:53 you don't have a lot of time left, but just, what would that look and feel like to the public?
02:22:58 Well, we need any reform to recognize the difference between content and conduct. So
02:23:05 discrimination, or just sending ads to only certain types of people, that is
02:23:12 conduct of the platform; it's not content that somebody's posting. And so the
02:23:18 biggest thing that we have to be looking at is the person's right to sue for the platform's own
02:23:22 conduct. Yeah, that makes sense. Thank you, that's interesting. I yield back. Thank you, Mr. Chairman.
02:23:27 The gentleman yields back. The chair recognizes the gentleman for five minutes.
02:23:32 thank you Mr. Chairman thank you to the witnesses for being here today
02:23:35 one thing that excites me about this bill is that we'll be setting the table for a new internet
02:23:43 ecosystem more than likely developed under the leadership of a new president next year
02:23:49 and I want to start with you Mr. Berkman let's assume we make this sunset provision law it's a
02:23:57 new congress next year and we have a new president what kind of replacement would both protect the
02:24:02 internet as we know it and at the same time expand the rights of individuals to express views that
02:24:08 often get conservatives kicked off of left-wing companies like Facebook? Yeah, well,
02:24:15 first, I think providing real liability for negligence and recklessness on the part of the
02:24:25 platform will work to improve the business decision-making so that they consider the harms in a
02:24:34 real way. And if that's done correctly in this process, it should not impact content; it should
02:24:41 not impact the expression of political views. We do believe that's a red herring in these arguments,
02:24:49 because we are talking about harm inflicted on people, and as Ms. Goldberg mentioned, to bring
02:24:55 a case you need to demonstrate damage, harm, injury. So that's one piece, and then, I know I've mentioned
02:25:02 this a few times, but the forgotten section of 230, Section 230(d), ensuring that there's access
02:25:12 to third-party safety software, and we have Sammy's Law in front of the Energy and Commerce Committee
02:25:20 for consideration now, and that is an essential component as well. Okay, thank you, sir.
02:25:26 Ms. Tummarello, I know you're concerned about small businesses dealing with lawsuits under a new
02:25:34 internet landscape I've been an independent pharmacy owner for over 30 years so I understand
02:25:45 what would you want to see if we did rewrite Section 230? Thank you for the question, Congressman.
02:25:45 I think the critical piece of section 230 for small internet platforms run by startups
02:25:50 is the piece that immunizes them from liability for user speech we've talked a lot about tort law
02:25:56 here and I think it's worth noting um that startups kind of have to worry not just about lawsuits but
02:26:02 about threats of lawsuits. We see demand letters in all kinds of contexts, including the Americans
02:26:08 with Disabilities Act, intellectual property. We see startups getting demand letters because, you know,
02:26:13 you don't have to actually file suit, you're not going to court yet, you're just sending a
02:26:15 demand letter saying we could sue you, and the startup is going to pay to get the demand letter
02:26:21 to go away, and paying and taking down user speech is bad for users and it's bad for startups. So I
02:26:26 think, in addition to the existing tort law around things like defamation, which
02:26:32 would bring lawsuits absent 230, there's also the entire patchwork of state laws that could
02:26:37 be created to bring lawsuits absent 230. But there's also just the threat that a small
02:26:43 business faces; they're not usually the most legally sophisticated, they usually have outside
02:26:46 counsel that they pay up front, and it's very expensive for them, and them dealing with even
02:26:52 a demand letter, which doesn't involve a court yet, could be ruinous. I understand, it's kind
02:26:57 of like if the IRS goes after you and you make less than $75,000, you're going to pay the fine because you
02:27:02 can't afford to pay an attorney. Mr. Berkman, I'm going to go back to you. Sometimes comments
02:27:09 are turned off for certain posts, where on the same site comments are allowed on most posts.
02:27:15 Should preventing comments on select posts open a site up to a suit?
02:27:19 You mean a platform turning off comments themselves? I think it would be a
02:27:30 highly contextual circumstance. You would again have to prove injury and harm for turning off
02:27:37 comments on a platform or for a particular user. So
02:27:42 it would depend, and that's something that Congress could weigh in on. You know, I think a
02:27:50 real deep review of the actual cases there and looking at where the harms are coming from and
02:27:57 why the decisions are being made I guess it's all based on the determination and definition of harm
02:28:03 so all right with that thank you for being here and I yield back Mr. Chairman.
02:28:08 The gentleman yields back. The chair now recognizes himself for five minutes. Ms. Goldberg, you are an
02:28:16 attorney, is that right? That's right. How about you, Mr. Berkman? Recovering attorney. A recovering?
02:28:21 I'm still barred. How about you, Ms. Tummarello? I am not an attorney. Okay, and that's good. I'm
02:28:27 glad that all three of you are here irrespective of your occupations. But Ms. Goldberg for you
02:28:33 how long have you been a lawyer? Um since 2007. You're just a young whippersnapper. So
02:28:38 my question for you is are you optimistic as an attorney because you've you've seen enough cases
02:28:45 it sounds like or are you pessimistic of the chances of 230 actually going away?
02:28:50 It's up to you. Well, now, I'm asking the questions here. I'm begging the courts
02:29:01 to recognize my clients' injuries, and they've told me for the last 10 years that
02:29:09 they're looking to you to give them that right. I'm optimistic insofar as you all are.
02:29:15 You have the power here. You all created section 230. The experiment is over and you can take it
02:29:22 away. So you're looking to the courts to make that decision but you'd rather us supersede that
02:29:27 process. I will always be looking to the courts first, but when I get cases like Herrick v. Grindr,
02:29:33 where I sued a platform while it was standing by watching thousands of men come to my client's home,
02:29:41 I have to be looking elsewhere too. I mean, they threw that case out of court.
02:29:47 Okay, Ms. Tummarello, I'm gonna jump over to you. Section 230 has become blanket
02:29:54 immunity. Now, there was some discussion between you and, I think, Congressman Veasey that you
02:30:00 were kind of skeptical about that it would be changed. Was that skepticism about it being
02:30:05 changed, or being effectual if it was? Oh, I think what I said to Congressman Veasey is we're not
02:30:10 opposed to reforming section 230. We are eager to have conversations about how to use policy to make
02:30:15 the internet a safer, better, healthier place for everyone. What we are concerned about is
02:30:19 sunsetting, because, one, the end of 2025 is quickly coming up, but also there's been so much
02:30:26 discord between even members of Congress generally about what should replace 230 that we worry we
02:30:31 would get caught in this tug of war where startups and their online communities of users are actually
02:30:36 the ones kind of losing out. So it's safe to say that you would not be in favor of us just
02:30:40 wiping away 230 completely and not replacing it with something? I think that'd be
02:30:45 very dangerous for the startup ecosystem. Right I got you. Mr. Berkman you talked about an
02:30:49 unmitigated disaster. You talked about children across the nation that were harmed by some of
02:30:54 this stuff, and I didn't know the cases. I came in a little later, so I didn't get
02:31:00 this, I was at another committee. So when you talk about children across the nation, do schools
02:31:06 get involved in those? Do you interface with schools? Yeah we work with K through 12 schools
02:31:12 across the country, and the harms that we see on campuses are incredibly horrifying. We're
02:31:19 seeing, and this corresponds with nationally representative surveys, about 46 percent of
02:31:24 fifth through twelfth graders self-report being cyberbullied. That almost triples the risk of
02:31:30 suicide. We are seeing sexting at the sixth grade level. We are seeing 85 percent of fifth
02:31:36 graders exposed to real-life violence over social media. A lot and I could keep going and we don't
02:31:42 have time, but it's a long list of harms. Well, for one of my colleagues' edification, I filed a bill back
02:31:49 after COVID that said all the money that was left over from COVID, after a school shooting,
02:31:54 I had a school shooting in my district in 2018, some of that money should be taken to actually
02:31:59 hire a counselor, not about grades, not about those kinds of things, but a counselor who would
02:32:06 go into, a social media counselor, who would monitor the kids in that school. If that's,
02:32:13 would y'all consider that? I'll go to the attorney here first. Ms. Goldberg, would you consider that
02:32:16 a violation of privacy if we were trying to monitor to try to deflect some of these problems
02:32:21 or prevent them, I should say. Hiring a counselor to help with mental health seems like it would be
02:32:27 good. If it's a bad counselor, then who's violating privacy is another story. I want to add one thing
02:32:33 to what I said earlier about the interaction between courts and legislation. We're here
02:32:40 today to help people in the future and to deter bad actions. When we go to court it's because
02:32:48 something terrible has already happened and we're trying to get justice for a family, but that
02:32:53 doesn't mean we can't also be looking ahead. Right which is very good we should be. Well um I
02:32:59 appreciate y'all being here and I'm not going to take any more time because I know we have Mr.
02:33:03 Obernolte, so I'm going to yield back, and the gentleman from California is recognized.
02:33:07 Thank you Mr. Chairman and thank you to our witnesses on what is a really important topic
02:33:14 to me personally. I think we're all on the same team here really at the end of the day right we've
02:33:20 got these problematic situations that have arisen on social media that I think everyone can agree
02:33:26 is not healthy and should not be permitted and needs to be stopped but then we have this idea
02:33:32 which is in tension with that, which is that we don't want to chill free speech. So we're here
02:33:39 talking about repealing section 230 to create more liability on social media platforms and this is
02:33:46 where I start to have a problem, because it seems like the premise of repealing 230 is that
02:33:51 the world would be a better place if we all just sued each other more often, and I reject that
02:33:57 premise. I'm not sure that this solves the problem, because if you are a social media company, I mean,
02:34:04 and not to view them uncharitably but they're in business to make money right and people go to those
02:34:11 social media platforms because they enjoy the content there. And so, I mean, I think it's pejorative
02:34:15 to say, as Ms. Goldberg you did, that they're companies that mint money off the backs of the
02:34:21 masses. I mean that ignores the fact that we've got millions and millions of Americans that
02:34:26 enjoy using social media platforms. So you know we've got to navigate this space. So let me ask
02:34:32 you this and I'll start with Miss Tumarillo. Why is increasing liability the solution to
02:34:40 this problem? Why shouldn't we as a legislative body as someone that has authority over this space
02:34:45 as we do in this committee, why should we not just pass laws to limit the problematic behaviors? I
02:34:52 mean for example allowing people to recruit children for sex right we everyone should be
02:34:58 able to agree that's not something that should be allowed to occur. We can write laws that prevent
02:35:02 that, rather than exposing more liability and relying on the indirect route of the threat of
02:35:10 being sued. Thank you so much for the question, Congressman. I think that gets at kind of
02:35:14 the exact tension that comes up when we talk about Section 230, right? Section 230 doesn't
02:35:18 immunize platforms that commit federal crimes. That is already the truth, and so many of the
02:35:24 horrible things that we hear about happening online are things that are either illegal, and
02:35:30 the people who create and share that content can be held liable themselves, not necessitating
02:35:37 the platform being held liable, or they're what's called lawful-but-awful content, which
02:35:42 is First Amendment protected, usually. And so I think we should be very clear about which we're talking about
02:35:46 when we're talking about online harms. If we're talking about content that is illegal and can be
02:35:51 prosecuted, I mean, then it's a question of law enforcement needing better education and resources
02:35:56 and collaboration to go after criminals who are using the internet. I think that's something
02:36:00 startups in our network would be happy to talk about, but I worry that sunsetting Section 230 would end
02:36:06 up chilling free expression, because ultimately the platforms who are least equipped to deal with even
02:36:10 threats of litigation will just take user speech down and I don't know that that's to the betterment
02:36:15 of the internet overall. Right, yeah, that's the problem that we're having. Okay, so
02:36:20 we're kind of agreeing that there are cases where content is clearly unlawful and we
02:36:27 can solve this problem in other ways than just expanding liability. So let's talk about the kind
02:36:33 of the more edge cases and Mr. Berkman you brought up a few of them in your testimony. You talked
02:36:38 about eating disorder content that's had some really harmful effects on children. You brought
02:36:45 up the fact that 46 percent of teens report being cyber bullied. So let's talk about cyber bullying
02:36:51 right because this is a huge problem. So you know a teenager posts on social media I thought your
02:36:58 shirt was awful yesterday and you know that the teen that wore the shirt says hey I'm being cyber
02:37:04 bullied you know I feel very uncomfortable and you know the other teen said I wasn't trying to hurt
02:37:09 your feelings, I just said your shirt is awful. So how do we navigate that, right,
02:37:13 because because this is something we can all agree cyber bullying is terrible but you know at what
02:37:18 point does it infringe on free speech? Yeah, well, all of the speech we're talking about, regardless
02:37:24 of whatever happens with Section 230, is protected under the First Amendment, first of all.
02:37:29 But okay so let me ask you specifically in a perfect world what is the social media
02:37:34 what is the the social media company's obligation with respect to that content?
02:37:42 They should be enforcing their terms of service in a responsible manner, with a sufficient amount
02:37:47 of trust and safety staff. And so the issue is that we see severe cyberbullying cases across
02:37:54 the country where content continues to stay up, and if only it were just 'I don't like your shirt';
02:37:58 usually it's 'kill yourself, I hate you.' Okay, so you say the person that's told to kill themselves says I'm
02:38:04 being bullied and a social media company at that point should take the content down. Yep once there
02:38:08 is actual notice of cyber bullying that violates a platform's terms of service they need to take
02:38:16 that down when it's a severe risk to that child. Nate Bronstein in Chicago was being
02:38:23 cyberbullied by hundreds of other teens throughout the metropolitan area with kill-yourself messages and
02:38:30 others. The content kept expanding throughout the platform it was being circulated on. Right, okay,
02:38:36 well, I mean, again, this is something that we can solve a different way than just expanding liability.
02:38:41 If we wanted to make a law that says if someone asks you to take content down you have to take
02:38:45 it down if it falls into these categories, we could do that. We don't have to get rid of all of Section
02:38:50 230. I'm out of time, I want to be respectful, but you know, this is the basic problem that
02:38:55 we're having and I want to thank you for your testimony. I yield back. The gentleman yields back
02:38:58 The chair now recognizes the gentlelady from Iowa, Ms. Miller-Meeks, for five minutes.
02:39:05 Thank you Chairman Weber and Ranking Member Matsui for holding this hearing today and I want to also
02:39:11 thank our witnesses for testifying before the subcommittee. As a member of Congress it's our
02:39:16 duty to ensure that our laws strike, as I think you've heard, the right balance between fostering
02:39:22 open unbiased discourse and protecting the vulnerable especially our children from the
02:39:27 dangers that lurk online. We must scrutinize whether the current section 230 framework adequately
02:39:32 safeguards our children from exploitation and exposure to harmful content while still preserving
02:39:38 the robust and free exchange of ideas that's essential to our democracy. Mr. Berkman you spoke
02:39:43 about studies that suggest a strong link between social media use and negative mental health
02:39:48 outcomes for adolescents I think you mentioned this briefly how can companies be incentivized
02:39:53 to prioritize mental health and well-being in their platform design and content moderation policies?
02:40:00 Well, the main reform there is ensuring that there's some level of liability for negligent and
02:40:06 reckless design in the platforms, and for insufficient parental safety controls as well. I want to
02:40:13 thank you for your co-leadership of Sammy's Law, too. The platforms that are disallowing parents
02:40:20 the choice of being able to use safety software, to us, that is a reckless decision, and so putting in
02:40:27 some level of liability there is going to change that business calculus and is going to protect
02:40:32 millions of children from these harms. Doesn't that also penalize those companies that have not
02:40:39 done that, especially small businesses and entrepreneurial businesses that are just
02:40:43 coming into the marketplace? Yeah, well, first of all, on the
02:40:49 Sammy's Law piece, we're talking about a negligible amount of effort, as you know. In terms of
02:40:56 ensuring that design is not negligent or reckless, that should be the minimum requirement for being
02:41:03 able to come into the industry. If you're not sufficiently capitalized and you offer your
02:41:08 platform up for children and you don't have the resources to make sure it's safe, that's an issue,
02:41:14 and that's what current tort law protects consumers from in all other industries.
02:41:18 And it seems like we may have some laws that do offer protection. Ms. Goldberg, you highlighted a
02:41:25 number of cases where social media platforms failed to protect users from exploitation and harassment.
02:41:31 Are there examples where tech companies have effectively worked within the bounds of Section
02:41:35 230 to protect consumers, and what proactive steps do you believe tech companies should be legally
02:41:42 required to take to identify and remove content related to sexual exploitation, blackmail, and harassment?
02:41:48 The positive stories don't make their way to my office. I hear about, you know, the most tragic
02:41:56 situations that happen through online platforms, and if somebody's just coming to me because
02:42:02 content didn't get moderated correctly, there's nothing to hold a platform
02:42:09 accountable; platforms can moderate however they want to. I think it would be beneficial,
02:42:15 when platforms have done things in a proactive way, to highlight those as well,
02:42:21 because it gives us a full spectrum of what we can and cannot do. We have a tendency to overregulate.
02:42:26 Ms. Tummarello, you stated in your testimony that startups with limited budgets and small teams
02:42:31 invest proportionally more in content moderation because they need their corners of the internet
02:42:38 to remain safe, healthy, and relevant if they want to see the user growth that they need to
02:42:44 survive. How might the potential repeal of Section 230 affect the willingness of entrepreneurs to
02:42:49 launch new platforms and services, considering the increased risk of litigation? What specific types
02:42:55 of legal challenges and associated costs do you anticipate startups would face? Thank you for the
02:42:59 question, and to note, when I talk about startups, I'm not talking specifically about
02:43:04 startups that offer platforms aimed at children. I think the startups in our network that are
02:43:08 working with populations that they know are children are very responsible and are building
02:43:12 in lots of guardrails because they know they have to. We're talking about general-audience
02:43:16 platforms that don't ask for driver's licenses when you sign up, because they don't want to have
02:43:20 that data on you; that feels creepy to them and they don't want to have to collect it, so they
02:43:24 don't necessarily know the ages of their users. And they are the ones, right? For every larger
02:43:30 platform complaint, we have a startup in our network that's trying to disrupt that space,
02:43:33 that's trying to provide a safe online dating experience. We have a startup founded by a victim
02:43:39 of sexual assault who's using her experience to make online dating safer. She needs Section 230.
02:43:44 Examples like that are in my testimony, so I won't go through all of them,
02:43:49 but without Section 230, if 230 were to be repealed or sunset, especially at the end of next year,
02:43:54 those companies would have to think very differently about hosting user content,
02:43:58 which means less innovation and a less diverse discourse online, because there'd be fewer places
02:44:03 to go and less expression from users. And then, just very quickly: would repealing Section 230
02:44:12 benefit or preferentially treat larger companies that currently, whether on their own or at the behest
02:44:20 of government, censor content? I absolutely think repealing Section 230 would give larger companies
02:44:26 a leg up. It would build a regulatory and legal moat around larger companies, who know that they
02:44:31 can invest hundreds of millions of dollars in content moderation, hire tens of thousands of
02:44:36 content moderators, and use their legal defense funds to survive in court, whereas a
02:44:43 startup can't do any of those things, and I worry that it would create a very unfair advantage for
02:44:48 large companies. Thank you all very much, and I yield back. The lady yields back. Seeing that there
02:44:54 are no further members wishing to be recognized, I would like to thank Ms. Goldberg, Mr. Berkman,
02:44:59 and Ms. Tummarello, our witnesses, for being here today. I ask unanimous consent to insert in the record
02:45:04 the documents included on the staff hearing documents list. Without objection, that will be
02:45:09 the order. Without objection, so ordered. I remind members that they have 10 business days to submit
02:45:16 questions for the record, and I ask the witnesses to respond to those questions as promptly as you
02:45:20 can. Members should submit their questions by the close of business on Wednesday, June 5th.
02:45:25 Without objection, this subcommittee is adjourned.