00:03From the most trusted journalists at Comedy Central, it's America's only source for news.
00:10This is The Daily Show with your host, Jon Stewart.
00:40Thank you very much.
00:45The opening of the show is starting to turn into, like, a wrestling open.
00:49Let's do some satire!
00:53Welcome to The Daily Show. My name is Jon Stewart. Great show for you tonight.
00:55Later on, I will be speaking with civil liberties attorney Cindy Cohn.
01:00She's going to be doing it.
01:06We're going to discuss her new book about digital privacy.
01:09And then she will show me how to post a close friend's story.
01:18I recently joined Instagram and I don't know how to do it.
01:24But as many of you know, it's the one-month anniversary of the Iran war.
01:28And as we all know, one month is the escalated threat anniversary.
01:32If the Strait of Hormuz is not immediately open for business, we will conclude by blowing up and completely obliterating
01:39all of their electric-generating plants, oil wells, and Kharg Island.
01:44You better open that strait or we're going to blow you up.
01:52More than we already blew you up. We're going to double blow you up.
01:55We're going to blow you up.
01:58That's where we are at in our war.
02:12I believe we've entered what General Patton used to refer to as the human centipede portion of the war.
02:21I urge you not to look that up.
02:24And I understand why the world would like the Strait of Hormuz open.
02:30The conflict is disrupting global supply chains for other surprising and essential products.
02:36Grain, nuts, oil, saffron, dates.
02:39Paddles and pickleballs.
02:46Yeah, uh, I'm not sure how that snuck into the essential products story.
02:52But I guess we all make sacrifices in wartime.
03:07My dearest Eliza.
03:10It's been over a month since I've played, well, not tennis, kind of like tennis.
03:18It's more like, like baby tennis.
03:22It's like, well, it's bigger than ping pong.
03:24You really have to try it, Eliza.
03:26It's so fun.
03:33Acting. Acting.
03:35That was acting.
03:36Uh, what else is being blocked?
03:39It's getting harder to get food that comes with pistachios.
03:42You can't make Dubai chocolate bars.
03:48Oh, no.
03:50What will our influencers stand in line to pretend to eat?
03:55Dubai chocolate, my God, that's been an American staple for tens of days.
04:01I can't believe how the news has to frame world events to try and make Americans care.
04:07The whole region is being flattened.
04:11Innocent people are dying.
04:13Their food and fuel are in total chaos.
04:15And our news is like, if this goes on any longer, say goodbye to your stuffed crust pizza.
04:26Are Americans losing anything else?
04:29It turns out there is another key material being affected by the war in Iran, fertilizer.
04:33There could be helium shortages.
04:35Yes, the gas that's used in party balloons.
04:41Helium is a fundamental gas used in the production of advanced chip technology.
04:45You don't have to dumb it down to make us care. Oh, this war could even be bad for your promposals.
04:52Like, come on!
04:55For more on how the closing of the Strait of Hormuz is affecting Americans, we go live to
04:59Ronny Chieng, Jordan Klepper, and Michael Kosta, who are on the ground right now in the Strait of Hormuz.
05:06Uh, gentlemen, thanks for joining us.
05:08Uh, you know what I'm gonna do?
05:09Ronny.
05:10Uh, Ronny Chieng.
05:12Ronny, I'm gonna start with you.
05:14Uh, uh, what-what's the latest?
05:16Well, Jon, the latest is the Iranians think they can just close the Strait of Hormuz and stop us from eating pistachios.
05:24Well, I got a message for them from America.
05:30They can come and get these nuts out of my mouth!
05:33That's right.
05:37That's right, Jon.
05:38That's right, Jon, and I also have a message.
05:41I just want to add that if the Ayatollah thinks he's gonna stop Americans from getting
05:44our supply of helium, well, then I've got one thing to say.
05:52You pussy-pitches can come get it out of my mouth.
05:59That's amazing.
06:01Uh, and, uh, Jordan Klepper.
06:03Uh, yes, Jon.
06:08Uh, what do you have?
06:11Is there anything-is there anything else that the Strait is affecting?
06:14Uh, yes, uh, fertilizer, uh, shortage.
06:18Uh, back to you, Jon.
06:23Is there any message that you want to send about the fertilizer shortage?
06:28Oh, yeah, Jon, sure.
06:30Uh, big one.
06:31If you think you're gonna take our fertilizer, well, you're gonna have to come and take it from my hands, Jon.
06:42But-but-but your hand?
06:44What the f***?
06:45Eat it!
06:47Yeah, Jordan, I got no problem sending a message, all right?
06:51Hey, yo, these nuts don't rhyme!
06:55I'm not taking a bite.
06:56I-I just killed 10 million brain cells for America.
06:59Take a bite.
07:00No, I don't want to take a bite, okay?
07:01I mean, Jordan, the other guys are making the Ayatollah take things from their mouth.
07:11Does feel like you might not be as patriotic.
07:15I-I-I don't know what putting these things in our mouths proves, Jon.
07:20Yeah, it proves that we have the resolve to do what we have to.
07:24We die for the cause.
07:26They had luxury foods and party balloons!
07:30This is literally processed shit!
07:33I mean, this-this strait is disrupting baklava.
07:36Can I have baklava?
07:37Let me do baklava!
07:37Just eat it, you baby!
07:39No!
07:40I don't want to eat!
07:41If you don't, Iran wins.
07:44Yeah.
07:46Yeah.
07:49You want Iran to win or Donald Trump?
07:53Are those my only choices?
07:55No, no!
07:57C'mon!
07:57C'mon!
07:58Okay, okay!
07:59Oh!
08:01Oh!
08:03Oh!
08:05You're happy now?
08:07You're happy?
08:07Come and get it, Iran!
08:09Take the shit out of my mouth!
08:10Oh, yuck.
08:12Oh, yuck.
08:14Disgusting.
08:15You know what?
08:15You know what?
08:16In a way, this is a metaphor.
08:19Because as Americans in this quagmire with no end in sight,
08:23we are all eating shit.
08:29Yeah, but you did it for real.
08:36Yeah, it's very disgusting.
08:38Thank you, guys.
08:39It's Ronny Chieng, Michael Kosta, and Jordan Klepper, everybody.
08:43Good report.
08:44Good report.
08:45Now, meanwhile, and I thank them all three
08:49for that brave reporting that they've done.
08:53Meanwhile, back on the home front,
08:54this weekend offered stark images
08:57articulating our nation's deep political divide.
09:00As 8 to 9 million people took to the streets
09:03for the No Kings protest.
09:05Yeah, they did it.
09:07While a slightly smaller group of, let's call them heritage Americans,
09:15took to the Reagan meeting room 1A on the second floor
09:19for their Yas Kings rally.
09:21If you want an object lesson on what a strong commander-in-chief looks like,
09:28take a look at the current resident of the Oval Office, President Donald J. Trump.
09:32What he's been able to do in one year is truly remarkable.
09:35Great president, greatest president of my lifetime.
09:37That guy is an energizer bunny.
09:39We'll never get another president like Donald Trump.
09:41Never.
09:41And that's why it's important that we do everything that we can
09:44to try to get him re-elected.
09:50I'm sorry, that's Reverend Franklin Graham saying,
09:54praise the Lord and f*** the Constitution.
09:58Now, look, far be it for me to question the wisdom of this year's CPAC convention,
10:04But clearly, the vibes this year were slightly off.
10:08How many of you would like to see impeachment hearings?
10:13No.
10:14That was the wrong answer.
10:22That's CPAC!
10:26That's a rookie crowd work mistake, Slap.
10:28See, they cheered for Trump's impeachment,
10:29not realizing that you were calling for the counter.
10:31But that's on you.
10:32Crowds are like dogs.
10:33They react to tone.
10:35You can get them excited about anything.
10:36Who wants to go to the vet?
10:37Who wants to get the balls cut off?
10:38Yes, you do it.
10:40Oh, you do it, James.
10:42See, in Trump land,
10:44old Donald didn't let something as trivial as a war
10:47keep him from doing his rounds
10:49at the golf course
10:50and at a Saudi investment meeting in Miami
10:52because, God forbid,
10:54during a war,
10:55he let the precipice of World War III
10:57yuck his yum in any way.
11:00That was Trump's weekend. And I find it so astounding that this nuclear-armed man-baby doesn't seem to have any understanding of the confusion and anxiety that his ill-planned adventure in Iran is causing this country. He's just trucking along like it's any old episode of The Apprentice.
11:20Here is Trump last night on Air Force One. Watch him try and focus for more than two sentences on the war he started before veering off into what really matters.
11:30It truly is regime change. And regime change is an imperative, but I think we have it automatically. I did something today. We just got these in from the architects. A lot of people are talking about how beautiful the ballroom is. For 150 years, they've wanted to build a ballroom at the White House. Here's another view. This is coming from right opposite the Treasury Building. Here's a view on the south with the porch. This is a view of it from the north, and there'll be Corinthian, which is considered the best, most beautiful by far.
12:14I know what you're thinking. How do you get something that size on a plane?
12:25But I guess he's allowed. It's his emotional support ballroom picture.
12:31Now, you may say, what prompted this presentation about the ballroom? Well, it turns out this incredibly long-winded, visual-aid-assisted timeshare presentation was spurred by a critical New York Times article that had the gall to suggest that sometimes Trump's plans don't hold up to scrutiny.
12:52The Times had architects analyze plans for the ballroom. They pointed out design flaws, like staircases that seemed to lead nowhere.
13:03Oh, okay, woke New York Times. When M.C. Escher does that, it's art. Oh, but you put it on an actual building, and suddenly it's, this doesn't make any sense, and I appear to be trapped.
13:25For the ballroom, Trump will pull an all-nighter for a point-by-point rebuttal.
13:32For the war, he literally doesn't have the focus to answer one question about the dire consequences of his actions on his favorite network.
13:42I think it is alarming that we have not been able to see or hear from any of the Iranian people. And I think there is some general worry about them. Do you have any insight as to how they are doing? Do they have drinking water? Do they have food?
14:00Wow, straightforward question. It asked the president to put himself in the shoes of those we're purportedly trying to liberate and the suffering they may be going through.
14:10The Iranian people are hurting, sir. Do they have food?
14:14Right, I do. It's upsetting. I do. But first, you remember when we had lunch years ago in the base of Trump Tower, when it was a brand-new building?
14:26So the point is, the Iranians might not have food, but you remember you and I, about 12, 13 years ago, had a club sandwich? Remember that?
14:36It was a long time ago, yes.
14:38A long time ago, and you haven't changed. You have not changed. Now, I'm not allowed to say this. It's the end of my political career, but you may be even better looking, okay? So I don't know what you're doing. But I will not say that.
14:49Fox hair and makeup has a lot to do with that.
14:51I will not say that because that will end my political career. You know, you're not allowed to say a woman's beautiful anymore.
14:56You know, it's funny, Dana. Your question about the suffering of the Iranian people has somehow made me horny. I don't know why. I guess you asked me, do they have food, do they have water, and the whole time I'm thinking, like, what a piece of ass. What a delightful piece of ass you are. You remember that, Dana? Remember how I made you uncomfortable at lunch? You remember? I do that. I do that a lot to women. I make them uncomfortable. What was your question? I don't remember your question.
15:28The war, other than the occasional tweet, doesn't seem to occupy any space in Trump's brain. And by the way, not just for TV. On Friday, he addressed a room full of Saudi investors, who you would think might be very concerned about the bombings in their neighborhood. But he wants to let them know we don't have to talk about that at all.
15:49I am asked to take a few questions, and unlike other politicians, they would like the questions screened. I don't ask for screening of the questions. You can ask me anything you want. You can talk sex. You can do whatever the hell you want.
16:20We can ask about sex.
16:28Okay.
16:29Okay.
16:40Whose name is that? Is that your f***ing name? Whose name is that? Or you meant, like, general sex shit, like Loveline shit. Sorry.
16:55You know, honestly, his leering behavior is less commander-in-chief at war and more grandpa who's lost his filter in public. Instead of assuaging a nervous nation, he's just embarrassing the whole family at dinner going, hey, do you see our waitress is a busty one? Huh?
17:16Just like your grandma was.
17:21But like any good monarchy, Trump's loyal henchmen assure you that it's really the opposite of everything you're seeing.
17:29He has encyclopedic, molecular knowledge. I was on the airplane with him, and we were sitting across the table from each other. We started talking about Syria, and he got a placemat, and he turned it on its back, and then he took a Sharpie, and he drew a perfect map of the Mid-East, and then he put the troop strength of every country on every border on that map.
18:01Has anyone thought about filming that? Let us see that, because that's not what we see. What we see is a president four weeks into a war he has yet to fully explain, with objectives he has yet to fully define, only displaying molecular knowledge, in a cabinet meeting, of his own pen preferences.
18:20So I came here, and they have thousand-dollar pens. And you know, you hand pens out. You're signing, and you hand them out. You're handing them to all these people. Beautiful pen, ballpoint, thousand. It was gold, silver. I'm handing out to kids that don't even know what they're doing. What is this, Mommy? So I take it out, and I saw it, and there's no ink. And I got all you people looking, and you say, there must be something wrong with pen right here. This pen is an interesting example. It's the same thing. Here's what... I called the guy. I said, I'd like to use your pen, but I can't have a grey thing with a big S on it saying Sharpie. He said... why can't you make a... he said, what can you do? He said, I'll paint it black, sir, if you like, in gold. Almost real gold.
19:04That's a cabinet meeting during a f***ing war. Don't tell us what to use to draw the map. Just draw the f***ing map.
19:13You know, all we keep hearing from this administration is why the American people have to sacrifice for Trump's vision of America's greatness. That these temporary disruptions are just part of the process, and why can't we be patriots? We have to be patient. We have to suck it up. Whether it's high gas prices or whimsical tariff inflation or draconian ICE raids or temporary Bill of Rights suspensions, it's on us to understand. But Trump gets to be just the same old, ain't-I-a-stinker, utterly self-absorbed, remember-when-I-used-to-want-to-f***-hot-girls twat self. Can you imagine any other president, let alone a wartime president, being this f***ing indulgent?
19:56And there were a thousand dollars apiece. Beautiful pen, ballpoint. A thousand. It was gold, silver, gorgeous. But I'm handing out to kids that don't even know what they're doing. What is this, Mommy? It's kids. They're getting a pen for a thousand dollars. They have no idea what it is.
20:14Now, to be fair to Trump, pen technology was, at FDR's time, quite primitive.
20:21But Trump is the guy you want to be your king. That's who you want.
20:26It's important that we do everything that we can to try to get him re-elected.
20:30Really? This is the guy you'll break the Constitution for? The American monarchy begins with this guy? This is the line we face in this country: no kings versus kings. But a word of caution. Generally, with monarchies, the first guy is the best guy. It's the guy that's so good, it makes the people want a king. A Charlemagne. An Alexander. A Ralph.
21:10But pretty soon, corruption and inbreeding take their toll on the monarchy and turn your king into this guy. Old Charles II. Yeah, that's his real head. He liked cheese. It was his favorite food.
21:36So go ahead. But just understand, we're starting a lot closer to Charles than Charlemagne.
21:45When we come back, Cindy Cohn will be joining me in the studio.
22:01Welcome back to The Daily Show. My guest tonight, she's the executive director of the Electronic Frontier Foundation and author of the new book Privacy's Defender: My 30-Year Fight Against Digital Surveillance. Please welcome to the program Cindy Cohn.
22:28Very, very nice to see you. I want to point out, for the people at home who can't see, she is wearing a "Let's Sue the Government" t-shirt.
22:41We have great designers at EFF. They made this special.
22:44What is EFF, and what is that?
22:47The Electronic Frontier Foundation is the world's biggest and oldest digital rights organization. We were founded in 1990. Our basic goal is to make sure that when you go online, your rights go with you. So we do things around free speech, around privacy, and around trying to build an internet that stands with you rather than against you.
23:05rather than against you
23:06how did you know
23:09it's so prescient
23:10the idea
23:10I'm thinking back to 1990
23:13and the internet was
23:14you had a little AOL disc
23:17yeah
23:17maybe 95 or 96
23:19but whatever that
23:20right
23:20it might have been
23:21it was just like
23:22alt binary
23:24message boards
23:25yeah
23:25how did you all
23:28have the prescience
23:29to see that
23:30this could be turned
23:31into a tool of
23:33surveillance
23:34and of
23:36government repression
23:37Yeah, I mean, the founders of the EFF, John Perry Barlow, John Gilmore, and Mitch Kapor, were involved in some early FBI missteps around the internet.
23:49And they... they worked with the FBI? Or they were, like...
23:51No, they were, like, questioned by the FBI. And there were a lot of raids going on, and... there was... they weren't raided, but they were asked a bunch of questions. And they realized that the government didn't really understand how the internet was working either. At least the FBI didn't. And they thought, we need to start an organization that's gonna think about how our civil liberties work in the digital age. And a couple of them are pretty geeky, and they realized that the internet we were building was not very secure. It didn't have a lot of privacy or security in it. And the first fight we did was to try to free up a technology called encryption that now is the reason that we have the privacy and security we have online. Most of it.
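[Editor's note: the encryption fight Cohn describes was about the right to use strong cryptography at all. As a minimal illustrative sketch — a toy XOR one-time pad in Python, not a production cipher, and not any system EFF litigated over; real tools use vetted algorithms such as AES — the core idea is that a secret key turns readable data into noise and back:]

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the matching key byte; applying the
    # same key a second time restores the original data.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet me at the bridge"
key = secrets.token_bytes(len(message))  # random key as long as the message

ciphertext = xor_cipher(message, key)    # unreadable without the key
recovered = xor_cipher(ciphertext, key)  # same operation, same key

assert recovered == message
```

With a truly random, never-reused key as long as the message, this scheme (the one-time pad) is information-theoretically secure; practical encryption trades that property for reusable keys and vetted ciphers.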
24:36This is your first lawsuit. And you joined this because you're... you're a lawyer and you're interested in this space? Or you had met these guys? How did your involvement...
24:45I met them kind of by chance. Honestly, they showed up at a party at my house. Some of them. But I was...
24:53This is all sounding very... where did this take place?
24:56San Francisco.
24:58Now I'm understanding. There's a tub of Kool-Aid, somebody put something... it got a little messy.
25:04No, I was a human rights lawyer. I had done human rights, I've been in Geneva, and I came back to San Francisco to try to start a regular law practice. And I met some of these guys kind of, really, by chance. And they were already online. You know, this was 1990, 1991. They were online at this very early internet, before we had the World Wide Web. But they were doing really cool things. They were collaborating across distance. They were having these long chats. Stuff that we take for granted now, but it was pretty magical back then.
25:37It's really interesting, too, that they, and you, decided... because I remember that time as a gold rush. So the '90s, as it started, this was when, like, every internet startup... anybody that you knew that was even adjacent, if they had a word processor, they were a startup, and they wanted VC funding, and they were gonna, you know, make a ton of money. And you and this group decide, we're not going in that direction. We're gonna go in the protecting-citizens'-rights direction, where the real money is. Why would you...
26:10My stock options are so great.
26:13Yeah. I mean, look, I think that there are people who, you know... plenty of people in the early internet were interested in making money. I don't have any judgment about that. But we were thinking about how could this be a technology that supports people. How it could be a technology of freedom. How it could give people, you know, more power against the powerful forces in their lives. And they were, again, very forward-looking, and they created an organization to try to do that. I don't know where it comes from, for me. I've always just wanted to make the world better. And then I found, you know... I'm not a technical person, but I kind of landed in this place, in this time, and was able to do it.
26:52In '90... and then, so, you're working over this sort of decade, and you get this great encryption case, and you make it so that people have a right to encrypt their data and these kinds of things. But then 9/11 happens, and that really supercharges the surveillance state.
27:08Yeah. After 9/11, the Patriot Act... both the Patriot Act, which they did above board, and a lot of stuff that we later discovered, as a country, they were doing secretly.
27:18This metadata, this hoovering up of all of our data and putting it through process...
27:23Yeah. Tapping into the internet backbone, collecting metadata, collecting telephone records. Some of this you might be able to read in the Patriot Act, but most of it was really under secret interpretations and things like that. But they had really flipped from the idea that, you know, you needed probable cause, you needed a warrant, you needed individualized suspicion, to doing mass surveillance first and figuring out what they need second. And that has only increased as the technology has increased.
27:54Was it that they had decided to treat the virtual world as separate from the physical world? In other words, the rules of, like, the Fourth Amendment might not apply to your digital life in the way that it would apply to your house?
28:09Yeah. Although they were using some earlier stuff that's in the non-digital world. This idea that metadata isn't protected, but the content...
28:18Of metadata, meaning, uh, like, the date and time and location and the other things that go along with your digital life?
28:26Correct. But they were using an analogy from envelopes and letters in, you know, the actual physical world. But in the digital world, the metadata is much more powerful. It can tell you a lot. There's a very famous quote, you know, from a member of the NSA saying, we kill people with metadata. Right? Because when you are able to collect it at scale and analyze it, you can do a lot more with it than kind of the old-school envelopes.
28:53And, you know, I don't remember anybody saying, we kill people with envelopes.
28:57It hasn't happened. And I know it hasn't happened. But so this is one of the things: they took this distinction, that is actually an offline distinction, that may have made some sense... I'm not sure it did. But in the online world, it really has eroded our privacy, and it's being used in ways now that are increasingly visible.
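[Editor's note: Cohn's point that metadata collected at scale reveals far more than envelopes ever did can be illustrated with a toy Python sketch. The records and names below are entirely hypothetical, invented purely for illustration: with no call content at all, timestamps and contact pairs alone expose a pattern.]

```python
from collections import Counter
from datetime import datetime

# Hypothetical call metadata: (caller, callee, timestamp) -- no content at all.
records = [
    ("alice", "clinic",   datetime(2024, 5, 1, 9, 0)),
    ("alice", "clinic",   datetime(2024, 5, 8, 9, 0)),
    ("alice", "clinic",   datetime(2024, 5, 15, 9, 0)),
    ("alice", "pizzeria", datetime(2024, 5, 3, 19, 30)),
]

# Who does "alice" contact most often? Pure metadata analysis.
contacts = Counter(callee for caller, callee, _ in records if caller == "alice")
top_contact, call_count = contacts.most_common(1)[0]

# A call to the same number at the same weekday and hour is not a message,
# but the recurring pattern strongly suggests a standing appointment.
schedule = {(t.weekday(), t.hour) for _, callee, t in records if callee == top_contact}

print(top_contact, call_count, schedule)
```

The point of the sketch is that the inference requires no wiretap: frequency and timing alone, aggregated, are revealing.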
29:14Right. I mean, there's testimony in Oregon just last month from an ICE agent saying they used a Google-like thing to identify where people were, that they ended up doing an ICE raid.
29:25So that's... and the company Palantir is very proud of the fact that they're sending metadata to accomplish those things. That these large tech companies are working with the government.
29:40So now, is your fight... are you fighting two behemoths? Are you fighting the government and also the tech companies?
29:49Absolutely. I mean, we kind of always were, right? If you think about the post-9/11 surveillance, they didn't come to us and ask us for our telephone records, right? They went to the telephone companies. That's where they got it. So they've always gone through the companies that provide us services in order to get information. But now the number one purchaser of information from data brokers is the government. And, you know, FBI Director Kash Patel just said to Congress, we're going to keep doing it. So it was something they were doing a little of, and now they're really supercharging it and doing it at a level that's much greater than it was before. But you're right, we don't really have the luxury of a choice about whether we care. They work hand-in-hand, and kind of always have.
30:34How receptive have the courts been to the arguments that people's privacy is part and parcel to their security?
30:47I mean, good and bad. I think we've got a Supreme Court that's somewhat hostile to the idea that privacy is a harm at all, but we see a lot of lower courts being courageous. I mean, in many other issues, we're seeing the same dynamic, where we've got a lot of lower courts who are worried about people's privacy, both in the commercial context and in the law enforcement context. We've been chipping away at this post-9/11 surveillance. We've made a lot of progress over time. Not kind of in one fell swoop, but chipping away and chipping away. And Congress has stepped in and done a little. They need to do a lot more, and there's a big...
31:33we're
31:34continuing
31:34the fight
31:34we have
31:35to
31:35continue
31:35the fight
31:36but
31:36we've
31:36also
31:36come
31:37quite
31:37a ways
31:38from
31:38where
31:38we
31:38were
31:38right
31:39after
31:399-11
31:39where
31:40all
31:40this
31:40stuff
31:40was
31:40secret
31:41and
31:41massive
31:42What do you want people to know? How are we culpable in the dismantling of our own security?
31:48And by that I mean, I don't understand at all. I just click "sure."
31:54Like, whenever it always comes on, like, you know, "Is it okay if we try..." and I'm just like, I just want to buy shoes.
32:01Like, so I just click and I do things, and I'm just... I don't know what I've given up.
32:07Yeah, I really think that's blaming the victim. You're the victim here.
32:11This is cooked. This is not set up so you have a real choice.
32:15I don't... thank you.
32:17So we need to fix this. But we need to fix it with law, with policy, maybe with better technologies.
32:22But, like, I think that the idea that individuals should bear responsibility for this, it's pretty far down on the list of things, because it's not like we have other options, right?
32:33I mean, we do for some things. I think it's great if you want to use an end-to-end encrypted tool like Signal rather than, you know, a tool that's not.
32:42There are tools that can help you, but I don't think that you should be blamed, because the game is rigged against you on a lot of these privacy things.
32:50And, you know, the original... I think one of the original sins of the internet is everything is these little clickwraps.
32:57Where I think somebody did a study: it would take you 37 years to read all the terms of service that you agree to in an ordinary internet experience.
33:05That's crazy, because I'm halfway through.
33:08It's not fair, and it's not right. We have to fix this with something other than personal responsibility.
33:13So who... who bears that responsibility, then? Who is... if I'm the victim, who is the party that I'm going after?
33:23Am I going after Palantir? Am I going after... you know, they just had this big lawsuit where I think Meta was held responsible for certain...
33:33So they get held responsible for that? Or is it that... what am I trying to prevent?
33:39Am I trying to prevent the government from getting access to all of this data that should be proprietary for me?
33:48Am I trying to keep Meta and X and Instagram and all those other mega-corporations, or Palantir, from being able to... or even AI?
33:58Look, here's what's crazy. So AI basically runs on... it's basically strip-mining the totality of human existence.
34:08It basically takes everything that we have ever accomplished through chemistry and poetry and art and music, and it's fed into it, and it gets to take everything that we are, almost our essence.
34:23And then if you ask OpenAI, "Well, what are you doing with it?" they go, "That's proprietary."
34:30Well, so there's a few things there, and there's a lot going on in that question. But...
34:35I'm the victim. I'm sorry.
34:36You are the victim. And the first thing that you need to say is, "I want a comprehensive privacy law."
34:41I want a law that limits when... what these companies can track us. Just limits it.
34:50We can do this. We don't need a surveillance business model.
34:55...from the mountain with stone tablets, it says the only way to make money on the internet is by spying on everyone.
35:00So that's the good news. We can do better.
35:02To be fair, wearing those Meta glasses, he was recording because he wanted to know who people were.
35:07But I think that there's that. I also think... so, a comprehensive privacy law that says when information is collected by you... from you... for one purpose, it can't be used for another purpose without a lot more than we... than a clickwrap here, and a very, very high standard.
35:24And one of the things that can't be done is just handed to the government without a warrant.
35:28Who fights that? The government or the tech company?
35:31They both do. We got to take them both on. But I think we... we have to.
35:34Otherwise, we're... you know, we can lose now, or we can fight and lose later. I think we need to fight.
35:39Why is this technology so difficult for the government to wrap its arms around?
35:44You know, there were very clearly guardrails put into place for radio and television and the airwaves, and it was considered a public good, and it was a certain utility, and there were a lot of responsibilities associated with it.
35:56Why has this been so much more difficult to wrap our arms around? Is it because it's faster?
36:02No, it's because it's not... I mean, the airwaves are regulated because there was a minimal amount of them. That's why cable doesn't have nearly the regulation that the broadcast...
36:11Oh, bless your heart. That is correct.
36:13Yeah, I mean... so... but the internet wasn't... isn't... it isn't limited. It doesn't have limited, you know, broadcast airwaves or things like that. So it could let everybody speak. I think that's a good thing.
36:25And I especially think that, with the current regulators we have in power, we should be very careful about arguing for a new regulatory structure over the internet.
36:36I think that... I'd like to see a comprehensive privacy law. There's plenty of laws that we could pass. But I'm not sure granting... you know, creating a second Brendan Carr for the internet is good for the internet.
36:49Right, right. Well, it's hard, because, you know, what you hope is that regulation is written... you know, the difficulty sometimes with government is they write regulation, and then the courts, who are textualists...
37:02Like, for instance, there's these immigration cases that are now rolling through the courts, and the idea for immigration is the phrase "arrive on America's shores."
37:13And they're all arguing about, does that mean you're first in line at the port of entry? Does that mean if you're fifth in line, is that okay?
37:18And it's really a failure of the imagination of Congress to write these laws. But we have to guard against these false choices.
37:27I would think of... I don't want to regulate harm reduction, because I'm worried that the government may not be what I like, if they misuse it.
37:40Well, I mean, I think that's right, fully. I don't think it's a regulate-or-not kind of choice. But I think you do need to think about what you're regulating, and why, and who you're empowering.
37:51And a lot of the things that we see coming out of Congress right now that are regulation would either entrench the current tech giants and make it impossible for somebody else to start a different kind of social network...
38:03Talk about that. What do you mean by that?
38:05I mean, some of the stuff that OpenAI said they wanted to do for AI, that was regulatory, basically would mean that we could never have civic AI. We could never have, you know, community kinds of...
38:17They would monopolize the space, because they would have the power to build in the loopholes, or fight regulation, or do any of those things.
38:26But don't they really have... I mean, doesn't that exist now?
38:28Well, it exists now, but we could make it worse, right? We don't double down on that.
38:32So here's an interesting one. I'm curious what you would say about... should we hold social media companies liable for harms?
38:42Or do you consider that... where does that fit on your "shouldn't restrict free speech" versus "should protect Americans from harm"?
38:50I mean, I think that one of the worries that I have on the regulatory... on the legal side... is that the Republican definition of harm and the Democratic definition of harm are very, very different, right?
39:02The Republicans don't think any LGBTQ content... they think that harms kids, right?
39:06And so we're seeing a couple of laws being floated in Congress now that will hold the companies responsible for harms. And what that's going to mean is they're not going to let any of that content on it.
39:16Because if somebody claims they were harmed because their kids saw a trans person, they're going to be able to sue the company.
39:21So I worry about... until we get really clear on what we mean by harms, and shared agreement...
39:29Right now, I think you really have to ask yourself, why are the Republicans supportive of this, and how are they defining this?
39:36And some Democrats should know better, in my opinion, because I think they're getting played.
39:42I think that, for... you know, that's a kind of different question than your social media question.
39:49The individual liability questions... I think that's a hard one too, sometimes, because I think that it's easy to blame technology for what are kind of complicated human problems. I'm not saying it doesn't have a role.
40:01The reason why I say this is because, in my mind, Meta, Instagram, Twitter are not free speech. They're algorithmically driven speech.
40:11It's processed, like, in the way Doritos is food.
40:14Like, so it strikes me as, like, that's actually not... because free speech doesn't have a hierarchy. There is not blue checks on free speech.
40:24There's not ways to... that... you know, their business model does not monetize hostility or other things.
40:35Like, I don't... I don't consider those social media companies free speech. I consider it monetized, incentivized, algorithmic clusterf**ks that exist to create...
40:47I mean, I think this might be a difference where we disagree, because I really think that you decide who gets to go on your show every night. You decide what goes on the front of your show.
40:57Newspapers decide what goes on the headline, and they do it based on how to try to sell more newspapers.
41:04I'm not defending everything... I don't like these companies, don't get me wrong. I want to undercut their entire business model.
41:10But I think that when your strategy for doing that is encouraging censorship... there's a lot of people who don't get to speak. They don't get to be on your show. They don't get to be on the New York Times. But they get to speak.
41:23Here's the flip side of it, and I would just push back gently: I think the current toxicity of social media is far more censorious of free speech.
41:35Most of the people that I know don't want to engage in that toxic soup. Immediately, they do... they're hit with all kinds of terrible, terrible things.
41:45And, like, Reddit's an example. Reddit was a sinkhole, like, 10, 15 years ago... that was a shithole... and they cleaned it up, and you can actually go on it now and have a human experience, right?
41:57And I think that it's important to continue to foster places for people to go, so that when you look around and you realize you're in the Nazi bar, you can leave and go to another bar.
42:06But the thing is, you don't know. The algorithm doesn't tell you that they've notified the Nazis that you're there. It's not transparent.
42:15Like, on a newspaper, when they say "our top story," you see it.
42:20But it's like with advertising... they outlawed subliminal messaging because it's not conscious.
42:28The algorithm is unconscious. It's doing things to you you don't see, in the way that you don't see that, in a food lab, they're taking a raccoon's anus and turning it into raspberry flavoring.
42:39I think that the worry that I have is that, if you want somebody to develop a better version, you can't create instant liability anytime somebody gets mad that their kids saw trans content. You can't...
42:53We need... no, we need better options. And people are developing them, right? There's the whole Mastodon universe.
43:01I know it's not very big yet, but it's a decentralized place where people can build safe communities for themselves.
43:09There are environmental protection... can't we just view it in the same way we view environmental protection? If there are toxins we put in the air, and downstream of those are decisions that we make...
43:19And when you put those regulations in, you could say, well, that's just going to help the big corporations and all that.
43:24But don't we have to, at some level, use regulation to mitigate very real-world harms that are occurring at this moment through those companies?
43:34I think that when you pick censorship as your strategy, you are dealing with... no, that's what you want to do. You want to create liability for the company based on things that people say to each other on the network.
43:45And what you're doing is you're saying you should make sure that those people can't speak on your network, because they're causing harm on your platform. And I worry...
43:54That we do that? Don't we do that already with all forms of communication? I can't... I mean, there are libel laws, there are slander laws. Isn't that censorship?
44:02I can't say things on this show that are patently untrue, with malevolence, because we regulate that harm.
44:09Well, I mean, you can always sue the people who are harming you on the platforms. The question is secondary liability.
44:15Catturd161.
44:17Yeah. Yeah.
44:18I mean, honestly... but also, I just want people to be able to leave.
44:23And I think that if we... you know, we wrote a paper a little while ago called "Privacy First," and we did a little thought experiment about all of these social harms that are happening in other places, and what the world would look like if we just cut the knees out of the business model of surveillance.
44:39And, you know, they don't go away, but they get a lot smaller. And I think that there's a lot of things that we could try.
44:47I'd also like to see a lot more support for kids. I really think that a lot of the cases that are horrible that we're seeing... there's usually a lot of other things going on in these kids' lives, and we don't do a good job of taking care of them.
45:01But I also... I worry... I think that the censorship strategy feels really easy, and it feels really powerful, and I worry it's going to backfire. I really do.
45:12Yeah, yeah. I understand. I guess my only point is, I think... I think we define censorship slightly differently.
45:20We might.
45:22And we always go... and I look at it more as, like, with cigarette smoking. Like, social media, to me, is like secondhand smoke sometimes, in that I didn't choose to do it, but because of the algorithm, it still shows up for me.
45:38And if you shipped over to Bluesky, you don't have an algorithm feeding you things.
45:41But if I shipped over to Bluesky, then the world no longer makes sense.
45:44I'm sorry about that. I mean, honestly... but I think that the answer is more options for people.
45:49I think that a world in which there are five big media companies that decide everything that we see and how we see it... the answer isn't to try to take a dictator and make them a better dictator, right?
46:00We've got these... these social media dictators. The answer is to get rid of the dictators and make them less important.
46:07All right. Well done. And that's... and we do that through the privacy act... a privacy... a real privacy act...
46:13And defending our data.
46:14And defending our data. I love it.
46:16Well, thank you for a really thought-provoking and inspiring conversation that truly does make me look at it in a different way, because I'm coming at it from this way, and now I'm seeing it a bit from this way. And I appreciate it.
46:28This was fun.
46:36Cindy's book, Privacy's Defender... it's a great read. Cindy Cohn!
46:40We're going to take a quick break. We'll be right back after this.
47:02Hey, that's our show for tonight. Before we go, we're going to check in with your host for the rest of the week, Miss Desi Lydic. Desi!
47:12Desi, what do you have for the people this week?
47:16Oh, big holiday week, Jon. Passover and Easter. And I, for one, could not be more excited, which is why I brought some delicious chocolate eggs.
47:27Oh... for me?
47:28Yeah, yeah. And for you, a traditional Jewish shank bone.
47:45Oh, you... that's... thank you. You shouldn't have.
47:54Try it, Jon. Try it now.
48:01It's not really... may I have one of the chocolate eggs?
48:05And ruin your appetite for the shank bone? No way. Eat up. Come on. Cheers.
48:15Yes.
48:18Happy holidays, everybody!
48:21Desi!
48:24Here it is, your Moment of Zen.
48:30Back on Venezuela, where Chris and I have both had a chance to be there... I literally think they're going to put up a statue to President Trump, lowering the price of gas in America. So it's...
48:40Forget that. When are they going to do the statue?
48:51Sorry.