Transcript
03:07my dear Eliza
03:10it's been a month
03:13since I've played
03:16well, not tennis
03:17a little like tennis
03:18it's more like baby tennis
03:22it's bigger than ping pong
03:24and I like it, Eliza, it's so fun
03:33actri, actri
03:34what is actri
03:37what is it that remains
03:39it's hard to buy food to eat with pistachios
03:49no, no
03:50what we influencers
03:51are in line pretending
07:55No, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no,
07:58no, no, no, no, no, no, no, no, no, no, no, no, no.
08:00Oh, oh, oh.
08:05You happy now? You happy? Come and get it a ride. Take the shit out of my mouth.
08:11Oh, yuck, oh, yuck. Disgusting.
08:15You know what, you know what? In a way, this is a metaphor.
08:19because as Americans
08:20in this quagmire with no end
08:22in sight, we are all
08:24eating shit
08:29yeah, but you did it for real
08:36yeah, it's very disgusting
08:38thank you guys, Ronnie Chang, Michael Costa
08:40and Jordan Klepper, everybody
08:43good report, good report
08:45now meanwhile
08:47and I thank them all three for that
08:50brave reporting that they've done
08:53meanwhile back on the home front
08:54this weekend offered stark
08:56images articulating our nation's
08:58deep political divide
09:00as 8 to 9 million people
09:02took to the streets for the No Kings
09:04protest
09:05yeah, they did it
09:07while
09:10a slightly
09:11smaller group of, let's call them
09:14heritage Americans, took to the
09:16Reagan meeting room 1A
09:18on the second floor for their
09:20Yas Kings rally
09:21if you want an object lesson
09:24on what a strong commander
09:26in chief looks like
09:27take a look at the current resident of the
09:30Oval Office, President Donald J. Trump
09:32what he's been able to do in one year
09:33is truly remarkable
09:35great president, greatest president of my lifetime
09:36that guy is an energizer bunny
09:38we'll never get another president
09:40like Donald Trump, never
09:41and that's why it's important
09:42that we do everything that we can
09:44to try to get him re-elected
09:50I'm sorry
09:51that's Reverend Franklin Graham
09:53saying
09:54praise the Lord
09:56and f*** the Constitution
09:58now look
10:00far be it
10:01for me to question the wisdom
10:02of this year's CPAC convention
10:04but clearly
10:05the vibes this year
10:07were slightly off
10:08how many of you
10:09would like to see
10:10impeachment hearings?
10:13no
10:13that was the wrong answer
10:22that's CPAC
10:25that's a rookie
10:26crowd work mistake
10:27slap
10:28see they cheered
10:28for Trump's impeachment
10:29not realizing
10:30that you were calling
10:31for the counter
10:31but that's on you
10:32crowds are like dogs
10:33they react to tone
10:34you can get them excited
10:35about anything
10:36who wants to go to the vet?
10:37who wants to get the balls cut off?
10:42yes
10:42see in Trump land
10:43old Donald
10:44didn't let something
10:45as trivial as a war
10:47keep him from doing
10:48his rounds
10:49at the golf course
10:50and at a Saudi investment
10:51meeting in Miami
10:52because God forbid
10:53during a war
10:55he let the precipice
10:56of World War 3
10:57yuck his yum
10:58in any way
10:59that was Trump's weekend
11:01and I find it
11:02so astounding
11:03that this nuclear
11:05armed man baby
11:06doesn't seem to have
11:08any understanding
11:09of the confusion
11:10and anxiety
11:11that his ill-planned
11:13adventure in Iran
11:14is causing this country
11:15he's just trucking along
11:17like it's any old episode
11:18of The Apprentice
11:20here is Trump
11:21last night
11:21on Air Force One
11:22watch him try
11:24and focus
11:24for more than two sentences
11:25on the war
11:26he started
11:27before veering off
11:29into what really matters
11:30it truly is
11:31regime change
11:33and regime change
11:34is an imperative
11:35but I think
11:37we have it automatically
11:38I did something
11:39today
11:40I
11:41we just got these
11:42in from the architects
11:43a lot of people
11:44are
11:44talking about
11:45how beautiful
11:46the ballroom
11:47for 150 years
11:49they've wanted
11:50to build a
11:52ballroom
11:53at the White House
11:54here's another view
11:55this is coming
11:56from
11:57right opposite
11:58the treasury building
11:59here's a view
12:00on the south
12:01with the porch
12:03this is a view
12:04of it from the north
12:06and there'll be
12:07Corinthian
12:07which is
12:09considered the best
12:10most beautiful
12:10by far
12:14I know
12:14I know what you're thinking
12:16how do you get
12:17something that size
12:18on a plane
12:25but I guess
12:26he's allowed
12:27it's his emotional
12:27support ballroom
12:28picture
12:31now you may say
12:32well
12:32what prompted
12:34this
12:35presentation
12:36about the ballroom
12:37well it turns out
12:38this incredibly
12:39long winded
12:40visual aid
12:40assisted timeshare
12:41presentation
12:42was spurred by
12:43a critical
12:44New York Times
12:45article
12:46that had the
12:47gall to suggest
12:48that sometimes
12:49Trump's plans
12:50don't hold up
12:51to scrutiny
12:52the Times
12:53had architects
12:54analyze plans
12:55for the ballroom
12:56they pointed out
12:57design flaws
12:58like staircases
12:59that seemed
12:59to lead nowhere
13:03oh
13:04okay
13:05woke
13:05New York Times
13:08when M.C. Escher
13:10does that
13:11it's art
13:13oh
13:14but you put it
13:15on an actual
13:16building
13:16and suddenly
13:18this doesn't make
13:19any sense
13:19and
13:21I appear
13:22to be trapped
13:25for the ballroom
13:27Trump will pull
13:28an all-nighter
13:29for a point
13:30by point
13:31rebuttal
13:32for the war
13:33literally doesn't
13:35have the focus
13:36to answer
13:36one question
13:37about the dire
13:38consequences
13:39of his actions
13:40on his favorite
13:42network
13:42I think it is
13:45alarming
13:45that we have
13:47not been able
13:47to see
13:48or hear
13:49from any
13:49of the
13:50Iranian people
13:51and I think
13:52there is some
13:53general worry
13:53about them
13:54do you have
13:55any insight
13:55as to how
13:56they are doing
13:57do they have
13:58drinking water
13:59do they have
13:59food
13:59wow
14:00straightforward
14:01question
14:02it asked the
14:03president to put
14:04himself in the
14:04shoes of those
14:05we're purportedly
14:05trying to
14:06liberate
14:06and the suffering
14:08they may be
14:08going through
14:10the Iranian people
14:12are hurting
14:12sir
14:12do they have
14:14food
14:14right I do
14:15it's upsetting
14:16I do
14:17but first
14:18you remember
14:18when we had
14:19lunch years ago
14:20in the base
14:21of Trump Tower
14:21when it was
14:22a brand new
14:23building
14:26so the point
14:27is
14:27the Iranians
14:29might not
14:29have food
14:29but you remember
14:30you and I
14:31about 12
14:3113 years ago
14:32had a club
14:34sandwich
14:34remember that
14:36a long time
14:37ago
14:38yes
14:38a long time
14:38ago
14:39and you haven't
14:40changed
14:40you have not
14:41changed
14:41now I'm not
14:42allowed to say
14:43this
14:43it's the end
14:43of my political
14:44career
14:45but you may
14:45be even
14:46better looking
14:46okay
14:47so I don't
14:47know what
14:48you're doing
14:48but I will
14:49not say that
14:50I will not say
14:51that because
14:52that will end
14:52my political
14:53you know
14:54you're not
14:54allowed to say
14:55a woman's
14:55beautiful anymore
14:56you know
14:57it's funny
14:57Dana
14:57your question
14:59about the
14:59suffering
15:00of the Iranian
15:01people
15:01has somehow
15:02made me horny
15:03I don't know
15:03why
15:05you asked me
15:06do they have
15:06food
15:07do they have
15:08water
15:08and the whole
15:09time I'm
15:09thinking like
15:10what a piece
15:10of ass
15:12what a delightful
15:13piece of ass
15:14you are
15:15you remember
15:15that Dana
15:16remember how
15:16I made you
15:16uncomfortable
15:17at lunch
15:20you remember
15:21I do that
15:22I do that
15:22a lot
15:23to women
15:23I make
15:23them
15:23uncomfortable
15:24what was
15:25your question
15:26I don't
15:26remember
15:26your question
15:28the war
15:30other than
15:30the occasional
15:31tweet
15:31doesn't seem
15:32to occupy
15:33any space
15:34in Trump's
15:34brain
15:35and by the way
15:36not just
15:36for TV
15:37on Friday
15:38he addressed
15:39a room full
15:39of Saudi
15:40investors
15:41who you would
15:42think might be
15:42very concerned
15:43about the bombings
15:44in their
15:45neighborhood
15:46but he wants
15:47to let them
15:47know we don't
15:48have to talk
15:48about that
15:49at all
15:49I'm asked
15:50to take
15:51a few
15:51questions
15:52and unlike
15:53other politicians
15:54they would
15:55like the
15:55questions
15:56screened
15:56I don't
15:57ask for
15:57screening
15:58of the
15:58questions
15:59you can
15:59ask me
15:59anything
16:00you want
16:00you can
16:01talk
16:02sex
16:02you can
16:03whatever
16:03the hell
16:04you want
16:20we can
16:21ask about
16:22sex
16:28okay
16:40whose name
16:41is that
16:42is that
16:42your
16:42name
16:43whose name
16:43is that
16:46or you
16:47meant like
16:47general sex
16:48shit
16:48like
16:48love line
16:48shit
16:50sorry
16:55you know
16:56honestly
16:56his leering
16:57behavior
16:58is less
16:59commander-in-chief
17:00at war
17:01and more
17:01grandpa
17:02who's lost
17:03his filter
17:03in public
17:05instead of
17:05assuaging
17:06a nervous
17:06nation
17:07he's just
17:07embarrassing
17:07the whole
17:08family at
17:08dinner
17:09going
17:09hey
17:09do you
17:10see
17:10our
17:10waitress
17:10is a
17:11busty
17:11one
17:11huh
17:15just like
17:16your grandma
17:17was
17:21but like
17:22any good
17:22monarchy
17:24trump's
17:24loyal
17:25henchmen
17:25assure you
17:26that it's
17:26really the
17:26opposite
17:27of everything
17:28you're
17:28seeing
17:28he has
17:31encyclopedic
17:31molecular
17:32knowledge
17:33i was on the
17:34airplane with
17:35him and we were
17:36sitting across the
17:37table from each
17:38other we started
17:38talking about
17:39syria
17:41and he got
17:42a place
17:43mat and he
17:44turned it on
17:45its back
17:45and then he took
17:46a sharpie
17:47and he drew
17:47a perfect map
17:49of the
17:49mid-east
17:50and then he put
17:51the troop
17:51strength
17:51of every
17:53country
17:54on every
17:55border
17:56on that
17:57map
18:01has anyone
18:02thought about
18:03filming that
18:05let us see
18:06that
18:07because that's
18:07not what we
18:08see
18:08what we see
18:09is a president
18:10four weeks
18:10into a war
18:11he has yet
18:11to fully
18:12explain
18:12with objectives
18:13he has yet
18:14to fully
18:14define
18:14only displaying
18:16molecular
18:16knowledge
18:17in a cabinet
18:17meeting
18:18of his own
18:18pen preferences
18:20so i came
18:21here to have
18:21thousand dollar
18:22pens
18:22and you know
18:23you hand pens
18:24out you're
18:24signing and
18:25handing them
18:25out you're
18:25handing them
18:26to all these
18:26people
18:27beautiful pen
18:27ballpoint
18:29thousand bucks
18:30gold silver
18:30I'm handing out
18:31to kids that
18:32don't even know
18:32what they're
18:32what is this
18:33mommy
18:33so I take it
18:34out and I
18:35saw it and
18:36there's no
18:36ink and I
18:38got all you
18:38people looking
18:39and you say
18:39there must be
18:40something wrong
18:41with pen right
18:41here this pen
18:42is an interesting
18:43example it's the
18:43same thing
18:44here's what I
18:45called the guy
18:45I said I'd
18:47like to use
18:48your pen but
18:48I can't have
18:49a grey thing
18:49with a big
18:50S on it
18:51same sharp
18:52he said why
18:53can't make a
18:53dice
18:53he said what
18:54can you do
18:55he said I'll
18:55paint it black
18:56sir if you
18:56like in gold
18:59almost real
19:00gold
19:04that's a cabinet
19:05meeting during
19:07a fucking war
19:09don't tell us
19:09what to use
19:10to draw the
19:10map
19:11just draw
19:11the fucking
19:12map
19:13you know
19:14all we keep
19:15hearing from
19:15this administration
19:16is why the
19:17American people
19:17have to sacrifice
19:18for Trump's
19:20vision of
19:20America's
19:21greatness
19:21that these
19:23temporary
19:23disruptions are
19:24just part of
19:24the process
19:25and why can't
19:26we be patriots
19:27we have to be
19:28patient
19:29we have to
19:30suck it up
19:30whether it's
19:31high gas prices
19:32or whimsical
19:33tariff inflation
19:34or draconian
19:35ICE raids
19:35or temporary
19:36bill of rights
19:37suspensions
19:38it's on us
19:38to understand
19:40but Trump
19:41gets to be
19:41just the same
19:42old
19:42ain't I
19:43a stinker
19:44utterly
19:45self-absorbed
19:46remember when
19:47I used to
19:47want to
19:48fuck
19:48hot girls
19:49twat self
19:50can you imagine
19:51any other
19:52president
19:53let alone
19:53a wartime
19:54president
19:54being this
19:55indulgent
19:56and there
19:57were a thousand
19:58dollars apiece
19:59beautiful pen
20:00ballpoint
20:02thousand
20:02it was gold
20:03silver
20:03gorgeous
20:04but I'm
20:04handing out
20:05to kids
20:05that don't
20:05even know
20:06what the hell
20:06what is this
20:07mommy
20:07it's kids
20:08they're getting
20:09a pen for
20:10a thousand
20:10dollars
20:10they have
20:11no idea
20:11what it is
20:14now
20:15to be fair
20:16to Trump
20:16pen technology
20:17was at
20:18FDR's time
20:19quite primitive
20:21but Trump
20:22is the guy
20:22you want
20:23to be your
20:24king
20:24that's who
20:25you want
20:26it's important
20:26that we do
20:27everything that
20:28we can
20:28to try to
20:29get him
20:29reelected
20:30really
20:32this is the
20:33guy you'll
20:34break the
20:34constitution
20:35for
20:36the American
20:37monarchy
20:39begins with
20:40this guy
20:43this is the
20:44line we face
20:45in this country
20:46no kings
20:46versus kings
20:48but a word
20:49of caution
20:51generally
20:51with monarchies
20:52the first
20:54guy
20:54is the
20:55best guy
20:56it's the
20:57guy that's
20:58so good
20:58it makes
20:59the people
21:00want a
21:00king
21:01a Charlemagne
21:03an Alexander
21:06a Ralph
21:10but pretty
21:11soon
21:12corruption
21:12and inbreeding
21:14take their
21:15toll
21:15on the
21:16monarchy
21:16and turn
21:17your king
21:18into this
21:19guy
21:21old
21:22Charles
21:22the second
21:23yeah
21:24that's
21:25his real
21:25head
21:29he liked
21:30cheese
21:32it was
21:33his favorite
21:34food
21:35so go
21:36ahead
21:37but just
21:38understand
21:40we're starting
21:40a lot closer
21:41to Charles
21:43than Charlemagne
21:45when we come
21:45back Cindy Cohn
21:46will be joining
21:47me in the
21:47studio
22:00welcome back to the
22:02daily show
22:02my guest tonight
22:03she's the executive
22:04director of the
22:05Electronic Frontier
22:06Foundation
22:06and author
22:07of the new
22:07book
22:07Privacy's Defender:
22:09My 30-Year
22:09Fight Against
22:10Digital Surveillance
22:11please welcome
22:11to the program
22:12Cindy Cohn
22:28very very nice
22:29to see you
22:30I want to point
22:31out for the
22:32people at home
22:33who can't see
22:33she's wearing
22:34a let's sue
22:35the government
22:36t-shirt
22:41we have great
22:42designers at EFF
22:43they made this
22:43special
22:44what is EFF
22:45and what is that
22:47the Electronic
22:47Frontier Foundation
22:48is the world's
22:49biggest and oldest
22:50digital rights
22:51organization
22:52we were founded
22:52in 1990
22:53our basic goal
22:55is to make sure
22:55that when you go
22:56online your rights
22:57go with you
22:58so we do things
22:59around free speech
23:00around privacy
23:01and around trying
23:02to build an internet
23:03that stands with you
23:05rather than against you
23:06how did you know
23:09it's so prescient
23:10the idea
23:10I'm thinking back
23:11to 1990
23:13and the internet
23:14was
23:15you had a little
23:16AOL disc
23:17yeah
23:17maybe 95 or 6
23:19but whatever that
23:20right
23:20it might have been
23:21it was just like
23:22alt.binaries
23:24message boards
23:25how did you all
23:28have the prescience
23:29to see that
23:30this could be turned
23:31into a tool
23:32of surveillance
23:34and of
23:36government repression
23:37yeah I mean
23:38the founders
23:39of the EFF
23:40the John Perry Barlow
23:41John Gilmore
23:43and Mitch Kapor
23:43were involved
23:45in some
23:46early FBI
23:48missteps
23:48around the internet
23:49and they
23:50they worked with
23:50the FBI
23:51or they were like
23:51no they were like
23:52questioned by the FBI
23:54and there were a lot
23:55of raids going on
23:56and there was
23:58they weren't raided
24:00but they were asked
24:00a bunch of questions
24:01and they realized
24:02that the government
24:02didn't really understand
24:03how the internet
24:04was working either
24:05at least the FBI
24:05didn't
24:06and they thought
24:07we need to start
24:08an organization
24:09that's gonna think
24:10about how our
24:12civil liberties
24:13work in the digital age
24:16and a couple
24:17of them are pretty
24:18geeky
24:18and they realized
24:19that the internet
24:19we were building
24:20was not very secure
24:22it didn't have
24:23a lot of privacy
24:24or security in it
24:25right
24:25and the first fight
24:27we did was to try
24:28to free up a technology
24:29called encryption
24:30that now is the reason
24:32that we have the privacy
24:34and security
24:34we have online
24:35most of it
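The encryption she credits for today's online privacy can be illustrated with a toy sketch: a one-time pad, the simplest cipher with a security proof. This is only a classroom illustration, not what the internet actually uses (real traffic relies on schemes like AES inside TLS), and the message below is made up:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the matching key byte.
    # With a truly random key as long as the message, used only
    # once (a "one-time pad"), the ciphertext reveals nothing
    # about the plaintext.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet me at the usual place"
key = secrets.token_bytes(len(message))  # fresh random key, used once

ciphertext = xor_cipher(message, key)    # encrypt
recovered = xor_cipher(ciphertext, key)  # decrypt: XOR is its own inverse

assert recovered == message
```

The same function encrypts and decrypts because XOR undoes itself; much of practical cryptography is about making a short, reusable key behave as if it were this long random one.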
24:36this is your first
24:37lawsuit
24:38and you joined this
24:39because you're
24:39you're a lawyer
24:41and you're interested
24:42in this space
24:43or you had met
24:44these guys
24:44how did your involvement
24:45I met them
24:46kind of by chance
24:47honestly they showed up
24:48at a party
24:48at my house
24:50some of them
24:51but I was
24:53this is all sounding
24:53very
24:54where did this take place
24:56San Francisco
24:57okay
24:58in the Haight
24:59yeah yeah
24:59now I'm understanding
25:00but I
25:01there's a tub of Kool-Aid
25:02somebody put something
25:03in something
25:04it got a little messy
25:04no I was a
25:06I was a human rights lawyer
25:08I had done human rights
25:09I've been in Geneva
25:10and I came back
25:11to San Francisco
25:11to try to start
25:12a regular law practice
25:13and I met some
25:15of these guys
25:15kind of really
25:16by chance
25:17and they were
25:18already online
25:19you know
25:20this was 1990
25:201991
25:21they were online
25:23at this very early
25:24internet before we had
25:25the world wide web
25:26but they were doing
25:27really cool things
25:29they were collaborating
25:30across distance
25:31they were having
25:31these long chats
25:32back and forth, stuff
25:33that we take
25:33for granted now
25:34but it was pretty
25:35magical back then
25:36um
25:37it's really interesting
25:38too that they
25:39and you
25:40decided
25:40because I remember
25:41that time
25:42as a gold rush
25:43so the 90s
25:45as it started
25:45this was when like
25:46every internet startup
25:48anybody that you knew
25:49that was even adjacent
25:50if they had a word processor
25:52they were a startup
25:53yeah
25:53and they wanted
25:54VC funding
25:55and they were gonna
25:56you know
25:57make a ton of money
25:58and you and this group
26:00decide
26:01we're not going
26:02in that direction
26:02we're gonna go
26:04in the
26:04protecting citizens
26:06rights
26:07direction
26:08where the real money is
26:09yeah
26:10why would you
26:10my stock options
26:12are so great
26:13right
26:13yeah
26:14yeah
26:15I mean look
26:16I think that
26:16there are people
26:17who you know
26:18plenty of people
26:19in the early internet
26:19were interested
26:20in making money
26:21I don't have any
26:21judgment about that
26:22but we were thinking
26:23about how could
26:25this be a technology
26:26that supports people
26:27how it could be
26:27a technology of freedom
26:28how it could give people
26:31you know more power
26:32against the powerful
26:33forces in their lives
26:34and they were
26:35again very forward looking
26:37and they you know
26:38created an organization
26:39to try to do that
26:41I don't know
26:42where it comes
26:42so for me
26:43I've always just
26:44wanted to make
26:45the world better
26:45and then I found
26:46you know
26:46I'm not a technical
26:48person but I kind
26:48of landed
26:49in this place
26:50in this time
26:51and was able to do it
26:52in 90
26:53and then so
26:54you're working
26:55over this sort
26:55of decade
26:56and you get
26:57this great
26:57encryption case
26:58and you make it
26:58so that people
26:59have a right
27:00to encrypt
27:00their data
27:01and these kinds
27:02of things
27:02but then 9-11
27:04happens
27:04and that really
27:07supercharges
27:07is the surveillance
27:08state
27:08yeah
27:09after 9-11
27:10the Patriot Act
27:11both the Patriot Act
27:12which they did
27:13above board
27:14and a lot of stuff
27:15that we later
27:15discovered as a country
27:16they were doing
27:17secretly
27:18these metadata
27:19these hoovering up
27:20of all of our data
27:21and putting it
27:22through process
27:23yeah tapping into
27:24the internet backbone
27:25collecting metadata
27:26collecting telephone records
27:28some of this
27:29you might be able
27:30to read in the Patriot Act
27:31but most of it
27:32was really
27:33under
27:33you know
27:34secret interpretations
27:35and things like that
27:36but they had really
27:38flipped from
27:39the idea
27:40that you know
27:40you needed
27:42probable cause
27:42you needed
27:43a warrant
27:43you needed
27:44individualized suspicion
27:45to doing
27:46mass surveillance
27:46first
27:47and figuring out
27:48what they need
27:49second
27:49and that
27:51has only
27:51increased
27:52as the technology
27:53has increased
27:54was it that
27:55they had decided
27:56to treat
27:57virtual world
27:59as separate
28:00from physical world
28:01in other words
28:02the rules
28:03of like
28:03the fourth amendment
28:04might not apply
28:05to your digital life
28:07in the way
28:08that it would apply
28:09to your house
28:09yeah
28:10although they were
28:11using some
28:12earlier stuff
28:13that's in the
28:13non-digital world
28:14this idea
28:15that metadata
28:15isn't protected
28:16but the content
28:18is
28:18metadata meaning
28:20like the date
28:21and time
28:22and location
28:23and the other things
28:24that go along
28:24with your digital life
28:26correct
28:26but they were
28:27using an analogy
28:28from envelopes
28:29and letters
28:30in you know
28:31the actual
28:32physical world
28:33but in the digital world
28:35the metadata
28:35is much more powerful
28:37it can tell you a lot
28:38there's a very famous quote
28:40you know
28:40from a member
28:40of the NSA
28:41saying we kill people
28:42with metadata
28:43right
28:43because
28:44when you are able
28:46to collect it
28:46at scale
28:47and analyze it
28:48you can do a lot
28:49more with it
28:50than kind of
28:51the old school
28:52envelopes
28:52and you know
28:54I don't remember
28:55anybody saying
28:55we kill people
28:56with envelopes
28:57it hasn't happened
28:57that I know of
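Her point that metadata alone is revealing can be made concrete with a small sketch; the people, contacts, and records below are entirely invented for illustration:

```python
from collections import Counter
from datetime import datetime

# Hypothetical call metadata: caller, callee, timestamp -- no audio, no content.
call_log = [
    ("alice", "oncology clinic", datetime(2024, 3, 1, 9, 0)),
    ("alice", "oncology clinic", datetime(2024, 3, 4, 9, 15)),
    ("alice", "health insurer",  datetime(2024, 3, 4, 10, 0)),
    ("alice", "mom",             datetime(2024, 3, 4, 21, 30)),
    ("alice", "oncology clinic", datetime(2024, 3, 8, 9, 5)),
]

# Without hearing a single word, counting who gets called tells a story.
contact_counts = Counter(callee for _caller, callee, _when in call_log)
most_contacted, n_calls = contact_counts.most_common(1)[0]
```

Scaled from five toy records to billions, this kind of counting and pattern analysis is what turns "just metadata" into the intelligence the quote describes.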
28:58so this is one
29:00of the things
29:01they took this
29:01distinction
29:02that is actually
29:03an offline distinction
29:04that may have
29:05made some sense
29:05I'm not sure it did
29:06but in the online world
29:08it really has eroded
29:09our privacy
29:10and it's being used
29:11in ways now
29:12that are increasingly
29:13visible right
29:14I mean
29:15there's testimony
29:16in Oregon
29:16just last month
29:17from an ICE agent
29:18saying they used
29:19a Google-like thing
29:20to identify
29:21where people were
29:23that they ended up
29:24doing an ICE raid
29:25so that's
29:27And the company
29:28Palantir is very
29:28proud of the fact
29:29that they're
29:30sending metadata
29:34to accomplish
29:35those things
29:36that these large
29:37tech companies
29:38are working
29:39with the company
29:40so now
29:41is your fight
29:42are you fighting
29:43two behemoths
29:45are you fighting
29:45the government
29:46and also
29:48the tech companies
29:49Absolutely
29:50I mean we kind of
29:51always were right
29:52if you think about
29:53the post 9-11 surveillance
29:54they didn't come to us
29:55and ask us
29:56for our telephone records
29:57right
29:57they went to the
29:57telephone companies
29:58that's where they got it
29:59so they've always
30:00gone through
30:01the companies
30:03that provide us
30:03services
30:05in order to get
30:06information
30:06but now
30:07the number one
30:08purchaser of information
30:09from data brokers
30:10is the government
30:11and you know
30:12FBI Director
30:13Kash Patel
30:14just said to Congress
30:15we're going to
30:16keep doing it
30:17so
30:18it was something
30:19they were doing
30:19a little of
30:20and now
30:20they're really
30:21supercharging it
30:22and doing it
30:22at a level
30:23that's much greater
30:24than it was before
30:24but you're right
30:25we don't really
30:26have the luxury
30:27of a choice
30:27about whether
30:28we care more
30:29about government
30:30surveillance
30:30or private surveillance
30:31they work hand in hand
30:33and kind of always have
30:34how receptive
30:36have the courts
30:37been
30:38to the arguments
30:39that people's privacy
30:42is part and parcel
30:43to their security
30:45I mean
30:47good and bad
30:48I think we've got
30:49a Supreme Court
30:50that's somewhat
30:50hostile to the idea
30:51that privacy
30:52is a harm
30:53at all
30:53but we see
30:54a lot of lower
30:55courts
30:55being courageous
30:57I mean
30:57in many other issues
30:58we're seeing
30:59the same dynamic
31:00where we've got
31:02a lot of lower courts
31:03who are worried
31:03about people's privacy
31:04both in the commercial
31:06you know context
31:07and in the law
31:08enforcement context
31:09we've been chipping away
31:11at this post 9-11
31:12surveillance
31:13we've made a lot
31:13of progress
31:14over time
31:14not kind of
31:15in a you know
31:16one fell swoop
31:17but chipping away
31:18and chipping away
31:19and Congress
31:19has stepped in
31:20and done a little
31:21they need to do
31:22a lot more
31:23and there's a
31:23big vote
31:24coming up
31:25in April
31:26on some of
31:27the mass surveillance
31:28it's called
31:28FISA section 702
31:30so
31:31it's
31:32it's
31:33we're continuing
31:34the fight
31:34we have to
31:35continue the fight
31:36but we've also
31:36come quite a ways
31:38from where we were
31:38right after 9-11
31:39where all this stuff
31:40was secret
31:41and massive
31:41what do you want
31:42people to know
31:43how are we culpable
31:44in the dismantling
31:45of our own security
31:47and by that
31:48I mean
31:48I don't understand
31:51at all
31:52I just click
31:54sure
31:54like whenever
31:56it always comes on
31:57I'm like
31:57you know
31:57is it okay
31:58if we try
31:59and I'm just like
31:59I just want to buy
32:00shoes
32:01like
32:01so I just click
32:03and I do things
32:04and I'm just
32:05I don't know
32:06what I've given up
32:07yeah
32:08I really think
32:09that's blaming the victim
32:10you're the victim here
32:11this is
32:11this is cooked
32:12this is not set up
32:14so you have a real choice
32:15I don't
32:16thank you
32:17and so
32:17we need to fix this
32:18but we need to fix it
32:19with law
32:20with policy
32:20maybe with better technologies
32:22but like
32:22I think that the idea
32:24that individuals
32:25should bear responsibility
32:27for this
32:28it's pretty far down
32:29on the list
32:30of things
32:30because
32:31it's not like
32:32we have other options
32:33right
32:33I mean we do
32:34for some things
32:35I think it's great
32:35if you want to use
32:36an end-to-end
32:37encrypted tool
32:37like Signal
32:38rather than
32:39you know
32:39a tool that's not
32:41there are tools
32:42that can help you
32:43but I don't think
32:44I think that
32:45you should be blamed
32:46because the game
32:47is rigged against you
32:48on a lot of these
32:49privacy things
32:50and you know
32:51the original
32:52I think one of the
32:54original sins
32:54of the internet
32:55is everything
32:55is these little
32:56click wraps
32:57where I think
32:57somebody did a study
32:58it would take you
32:5937 years
33:00to read all the
33:01terms of service
33:01that you agree to
33:02in an ordinary
33:03internet experience
33:05that's crazy
33:06because I'm
33:06I'm halfway through
33:07I'm about halfway through
33:09it's not fair
33:09and it's not right
33:10we have to fix this
33:11with something
33:12other than
33:12personal responsibility
33:13so who
33:14who bears that
33:15responsibility then
33:16who is
33:17if I'm the victim
33:19who is the party
33:21that I'm going after
33:23am I going after
33:24Palantir
33:25am I going after
33:26you know
33:26they just had
33:27this big lawsuit
33:28where
33:29I think Meta
33:30was held responsible
33:31for
33:32certain
33:33so they get held
33:35responsible for that
33:36or is it
33:37that
33:38what am I trying
33:39to prevent
33:39am I trying
33:39to prevent
33:40the government
33:41from getting
33:42access
33:43to all of
33:45this data
33:45that should be
33:46proprietary for me
33:48am I trying
33:49to keep
33:49Meta
33:50and
33:50X
33:51and Instagram
33:52and all those
33:53other
33:53mega corporations
33:55or Palantir
33:55from being able
33:56to
33:56or even AI
33:58look
33:59AI supercharges
34:00here's what's crazy
34:01so AI
34:02basically runs on
34:04it's basically
34:05strip mining
34:06the totality
34:07of human existence
34:08it basically
34:10takes everything
34:11that we have
34:11ever accomplished
34:13through chemistry
34:14and poetry
34:15and art
34:16and music
34:16and it's fed
34:17into it
34:19and it gets
34:20to take
34:20everything
34:21that we are
34:22almost our essence
34:23and then if you
34:24ask
34:25open AI
34:26well what are you
34:28doing with it
34:28they go
34:29that's proprietary
34:30well so there's
34:31a few things there
34:32and there's a lot
34:33going on in that
34:34question
34:34but
34:35I'm the victim
34:36I'm sorry
34:36you are the victim
34:37and the first thing
34:38that you need to say
34:39is I want a comprehensive
34:40privacy law
34:41I want a law
34:43that limits
34:44when
34:44what these companies
34:46can
34:47track us
34:48just limits it
34:50we can do this
34:50we don't need a
34:51surveillance business
34:52model
34:52you know
34:53Moses didn't come
34:54down from the
34:54you know
34:55from the mountain
34:56with stone tablets
34:57that says
34:57the only way
34:58to make money
34:58on the internet
34:59is by spying
35:00on everyone
35:00so that's the good news
35:01we can do better
35:02to be fair
35:03wearing those
35:04meta glasses
35:04he was recording
35:05well because he
35:06wanted to know
35:06who people were
35:07but I think
35:08that there's that
35:09I also think
35:10so a comprehensive
35:11privacy law
35:12that says
35:13when information
35:13is collected
35:14by you
35:14from you
35:15for one purpose
35:16it can't be used
35:17for another purpose
35:18without a lot
35:20more than we
35:21than a click wrap
35:22here
35:22and a very
35:23very high standard
35:24and one of the
35:25things that can't
35:25be done
35:26is just handed
35:26to the government
35:27without a warrant
35:28who fights that
35:29the government
35:29or the tech company
35:31they both do
35:31we gotta take them
35:32both on
35:32but I think
35:33we have to
35:34otherwise
35:35we can lose now
35:37or we can fight
35:37and lose later
35:38I think we need
35:39to fight
35:39why is this technology
35:41so difficult
35:42for the government
35:43to wrap its arms around
35:44there were very clearly
35:46guardrails put into place
35:47for radio
35:48and television
35:48and the airwaves
35:50and it was considered
35:50a public good
35:51and it was a certain utility
35:52and there were a lot
35:54of responsibilities
35:55associated with it
35:56why has this been
35:57so much more difficult
35:59to wrap our arms around
36:00is it because
36:01it's faster
36:02no it's because
36:03it's not
36:04I mean the airwaves
36:05are regulated
36:06because there was
36:07a minimal amount
36:08of them
36:08that's why cable
36:08doesn't have nearly
36:09the regulation
36:10that the broadcast
36:11oh bless your heart
36:12that is correct
36:13yeah I mean
36:14so but the internet
36:16wasn't
36:16isn't
36:17it isn't limited
36:18it doesn't have
36:19limited you know
36:20broadcast airwaves
36:21or things like that
36:22so it could let
36:23everybody speak
36:24I think that's
36:25a good thing
36:25and I especially
36:26think that with
36:27the current regulators
36:28we have in power
36:29we should be very careful
36:31about arguing
36:32for a new regulatory
36:35structure over the internet
36:36I think that I'd like
36:38to see a comprehensive
36:39privacy law
36:39there's plenty of laws
36:40that we could pass
36:41but I'm not sure
36:42granting
36:44you know creating
36:45a second Brendan Carr
36:46for the internet
36:47is good for the internet
36:49right right
36:50well it's hard
36:52because
36:52you know
36:54what you hope is
36:56is that regulation
36:56is written
36:57you know
36:58the difficulty
36:58sometimes with government
36:59is they write regulation
37:00and then the courts
37:02who are textualists
37:02like for instance
37:03there's these immigration
37:04cases that are now
37:05rolling through the courts
37:06and the idea
37:07for immigration
37:08is
37:09the phrase
37:10arrive on
37:11America's shores
37:13and they're all arguing
37:14about does that mean
37:14you're first in line
37:15at the port of entry
37:16does that mean
37:16if you're fifth in line
37:18is that okay
37:19and it's really
37:20a failure
37:21of the imagination
37:22of Congress
37:23to write these laws
37:24but we
37:25we have to
37:25guard against
37:26these false choices
37:27yes
37:28I would think
37:29of
37:30I don't want to
37:31regulate
37:33harm reduction
37:34because I'm worried
37:35that
37:36the government
37:37may not be
37:38what I like
37:39if they misuse it
37:40well I mean
37:41I think that's right
37:42fully
37:43I don't think
37:44it's a regulator
37:45not kind of choice
37:46but I think
37:46you do need to think
37:47about what
37:48you're regulating
37:49and why
37:50and who you're
37:50empowering
37:51and a lot
37:52of the things
37:52that we see
37:53coming out of
37:53Congress right now
37:54that are regulation
37:56would either
37:56entrench the current
37:58tech giants
37:58and make it impossible
37:59for somebody else
38:00to start
38:01a different
38:02kind of social network
38:03talk about that
38:04what do you mean by that
38:05I mean
38:06some of the stuff
38:06that OpenAI
38:07said they wanted
38:08to do for AI
38:09that was regulatory
38:10basically would mean
38:11that we could never
38:12have civic AI
38:12we could never have
38:13a you know
38:14community
38:15kinds of
38:17they would monopolize
38:19the space
38:19because they would
38:20have the power
38:21to build in
38:23the loopholes
38:24or fight
38:24regulate
38:24or do any
38:25of those things
38:26but don't they
38:26really have
38:27I mean
38:27doesn't that exist
38:27now
38:28well it exists now
38:29but we could make
38:30it worse right
38:30we don't double down
38:32on that
38:32so here's an
38:33interesting one
38:34I'm curious
38:34what you would
38:35say about
38:37should we hold
38:38social media
38:39companies
38:40liable
38:41for harms
38:42or do you
38:43consider that
38:44where does that
38:45fit on your
38:46shouldn't restrict
38:47free speech
38:47with should protect
38:49Americans from harm
38:50I mean I think
38:51that one of the
38:52worries that I have
38:53on the regulatory
38:54on the legal side
38:55is that the
38:56Republican definition
38:57of harm
38:58and the Democratic
38:59definition of harm
39:00are very very
39:01different right
39:02the Republicans
39:02don't think
39:03any LGBTQ content
39:04they think that
39:05harms kids
39:06right
39:06and so we're
39:07seeing a couple
39:08of laws being
39:09floated in Congress
39:10now that will hold
39:11the companies
39:11responsible for harm
39:12so what that's
39:13going to mean
39:13is they're not
39:14going to let
39:14any of that
39:15content on it
39:16because if somebody
39:17claims they were harmed
39:18because their kids
39:18saw a trans person
39:19they're going to be
39:20able to sue the
39:21company
39:21so I worry about
39:24until we get really
39:25clear on what we
39:26mean by harms
39:27and shared agreement
39:28I right now
39:30I think you really
39:31have to ask yourself
39:31like why are the
39:32Republicans supportive
39:33of this and how
39:34are they defining
39:35this and some of
39:37the some Democrats
39:38should know better
39:39in my in my opinion
39:40because I think
39:41they're getting played
39:42um I think that
39:43for you know
39:44that's that's a kind
39:45of different question
39:46than your social media
39:47question
39:48um the the individual
39:50liability questions
39:51I I think that's a hard
39:53one too sometimes
39:54because I think that
39:55it's easy to blame
39:56technology for what
39:57are kind of complicated
39:58human problems
39:59I'm not saying it
40:00doesn't have
40:00the reason why I say
40:01this is because
40:02in my mind
40:05meta Instagram
40:06Twitter are not
40:08free speech
40:09they're algorithmically
40:10driven speech
40:11it's processed like
40:12like in the way
40:13Doritos is food
40:14like so
40:15it strikes me as
40:16like that
40:17that's actually not
40:18because free speech
40:19doesn't
40:20uh have a hierarchy
40:22there is not
40:23blue checks
40:23on free speech
40:24there's not
40:25uh ways to
40:26um
40:28that that
40:29you know
40:30their business model
40:31does not monetize
40:33hostility
40:34or other things
40:35like I don't
40:35I don't consider
40:36those social media
40:38companies
40:38free speech
40:39I consider it
40:40monetized
40:42incentivized
40:43algorithmic
40:44clusterf**ks
40:45that exist
40:46to create
40:47I mean I
40:48I think
40:48I think this might be
40:49a difference
40:50where we disagree
40:51because I really
40:52think that um
40:53you decide who gets
40:54to go on your show
40:55every night
40:55you decide what goes
40:56on the front of your show
40:57newspapers decide
40:59what goes on the headline
41:00and they do it
41:01based on how to try to
41:02sell more newspapers
41:03um I'm not defending
41:05everything
41:05I don't like these
41:06companies
41:06don't give me a wrong
41:07I want to undercut
41:08their entire business
41:09model
41:10but I think that when
41:11I think that when
41:12your strategy for doing
41:14that is encouraging
41:15censorship
41:17there's a lot of people
41:18who don't get to speak
41:19they don't get to be
41:20on your show
41:20they don't get to be
41:21on the New York Times
41:22but they get to speak
41:23here's the flip side of it
41:23and I would just push
41:24back gently
41:25is that
41:26I think the current
41:27toxicity
41:28of social media
41:30is far more
41:32censorious
41:33of free speech
41:35most of the people
41:36that I know
41:36don't want to engage
41:37in that toxic soup
41:39in the middle
41:40immediately they do
41:41they're hit with all
41:41kinds of
41:43terrible terrible
41:44things
41:45and like Reddit's
41:46an example
41:47Reddit was a sinkhole
41:48like 10-15 years ago
41:50that was a shithole
41:51and they cleaned it up
41:52and you can actually
41:53go on it now
41:54and have a human
41:56experience
41:56right
41:57and I think that
41:58it's important
41:59to continue to foster
42:00places for people
42:01to go
42:01so that when you
42:02look around
42:03and you realize
42:03you're in the Nazi bar
42:04you can leave
42:05and go to another bar
42:06but the thing is
42:07you don't know
42:08the algorithm
42:09doesn't tell you
42:11that they've notified
42:12the Nazis
42:13that you're there
42:14it's not transparent
42:15like on a newspaper
42:17when they say
42:18our top story
42:19you see it
42:20but it's like
42:21with advertising
42:22they outlawed
42:23subliminal messaging
42:24because it's not
42:27conscious
42:28the algorithm
42:29is unconscious
42:30it's doing things
42:31to you
42:31you don't see
42:32in the way that
42:33you don't see
42:34that in a food lab
42:35they're taking
42:36a raccoon's anus
42:37and turning it
42:38into raspberry flavoring
42:39I think that
42:39the worry that I have
42:41is that
42:42if you want somebody
42:44to develop
42:44a better version
42:45you can't create
42:47instant liability
42:48anytime somebody
42:49gets mad
42:50that their kids
42:50saw trans content
42:51but that's
42:52you can't
42:52but that's a false choice
42:53we need
42:54no
42:54we need
42:56we need
42:57better options
42:58and people are
42:58developing them
42:59right
42:59there's the whole
43:00Mastodon universe
43:01I know it's not
43:02very big yet
43:03but it's a
43:03decentralized
43:04place where people
43:05can build safe
43:06communities for
43:07themselves
43:07there are
43:09but think about it
43:09as environmental
43:10protection
43:10can't we just
43:11view it as
43:12in the same way
43:12we view
43:13environmental protection
43:14if there are toxins
43:15we put it in the air
43:16and downstream
43:17of those
43:18are decisions
43:19that we make
43:20and when you put
43:20those regulations
43:21in you could say
43:22well that's just
43:22going to help
43:23the big corporations
43:23and all that
43:24but don't we have
43:25to at some level
43:26use regulation
43:27to mitigate
43:28very real world harms
43:30that are occurring
43:31at this moment
43:33through those companies
43:34I think that
43:34when you pick
43:35censorship
43:36as your strategy
43:37you are dealing
43:39with no
43:39that's what you
43:40want to do
43:40you want to create
43:41liability for the company
43:43based on things
43:43that people say
43:44to each other
43:45on the network
43:45and what you're doing
43:46is you're saying
43:47you should make sure
43:48that those people
43:48can't speak
43:49on your network
43:50because they're
43:50causing harm
43:51on your platform
43:52and I worry
43:53that that's
43:54don't we do that
43:55already with all
43:56forms of communication
43:57I can't
43:58I mean
43:59there are libel laws
44:00there are slander
44:01isn't that censorship
44:02I can't say things
44:03on this show
44:04that are patently
44:05untrue
44:05with malevolence
44:06because
44:07we regulate
44:09that harm
44:09well I mean
44:10you can always
44:11sue the people
44:12who are harming you
44:12on the platforms
44:13the question is
44:14secondary liability
44:15cat turd 161
44:17yeah
44:18yeah
44:18I mean honestly
44:20but also
44:20I just want people
44:21to be able to leave
44:22and I think that if we
44:23you know we wrote a paper
44:24a little while ago
44:25called Privacy First
44:27and we did a little
44:28thought experiment
44:28about all of these
44:30social harms
44:31that are happening
44:31in other places
44:32and what the world
44:33would look like
44:34if we just cut the knees
44:35out of the business model
44:38of surveillance
44:39and you know
44:40they don't go away
44:41but they get a lot smaller
44:43and I think
44:44that there's
44:46a lot of things
44:46that we could try
44:47I'd also like to see
44:48a lot more support
44:49for kids
44:49I really think
44:50that the
44:52a lot of the cases
44:54that are horrible
44:55that we're seeing
44:55there's usually
44:56a lot of other things
44:57going on in these kids lives
44:58and we don't do
44:59a good job
44:59of taking care of them
45:01but I also
45:02I worry
45:02I think that the
45:03censorship strategy
45:04feels really easy
45:05and it feels
45:08really powerful
45:09and I worry
45:10it's going to backfire
45:10I really do
45:11yeah yeah
45:12I understand
45:14I guess
45:16my only point
45:17is I think
45:17I think we define
45:19censorship slightly differently
45:20we might
45:20and we always go
45:22and I look at it
45:23more as like
45:24with cigarette smoking
45:25like
45:28social media to me
45:29is like secondhand
45:31smoke sometimes
45:32in that
45:33I didn't choose
45:34to do it
45:34but because of
45:35the algorithm
45:36it still shows up
45:37for me
45:38and if you
45:38shipped over to
44:39Bluesky
45:40you don't have
45:40an algorithm
45:40feeding you things
45:41but if I shipped
44:42over to Bluesky
45:43then the world
45:43no longer makes sense
45:44I'm sorry about that
45:45I mean honestly
45:46but I think that
45:47the answer
45:48is more options
45:48for people
45:49I think that
45:50a world
45:50in which there are
45:51five big media companies
45:52that decide
45:53everything that we see
45:54and how we see it
45:55the answer isn't
45:56to try to take
45:57a dictator
45:58and make them
45:58a better dictator
45:59right
46:00we've got these
46:01social media dictators
46:02the answer is
46:03to get rid of
46:03the dictators
46:04and make them
46:05less important
46:06alright
46:07well done
46:09and we do that
46:10through the privacy act
46:11a privacy
46:12a real privacy act
46:13and defending our data
46:14and defending our data
46:15I love it
46:16well thank you
46:16for a really
46:18thought provoking
46:19and inspiring
46:20conversation
46:21that truly does
46:22make me look at it
46:23in a different way
46:24because I'm coming
46:25at it from this way
46:25and now I'm
46:26seeing it a bit
46:27from this way
46:27and I appreciate it
46:28this was fun
46:36Cindy's book
46:37Privacy's Defender
46:38it's a great read
46:39Cindy Cohn
46:40we're going to take
46:41a quick break
46:42we'll be right back
46:42after this
47:02hey that's it for tonight
47:03before we go
47:04we're going to check in
47:04with your host
47:05for the rest of the week
47:05Miss Desi Lydic
47:06Desi
47:12what do you have
47:15for the people
47:16this week
47:16oh big holiday week
47:18John
47:18Passover and Easter
47:20and I for one
47:21could not be more excited
47:22which is why
47:23I brought
47:24some delicious
47:26chocolate eggs
47:27for me
47:28yeah yeah
47:29and for you
47:31a traditional
47:33Jewish
47:34shank bone
47:46oh you
47:47that's
47:47thank you
47:48um
47:50you shouldn't have
47:52aw
47:53try it John
47:55try it now
48:00it's not really
48:02may I have one
48:04of the chocolate
48:04eggs
48:05and ruin your appetite
48:07for the shank bone
48:08no way
48:09eat up
48:10come on
48:11cheers
48:14yes
48:17happy holidays
48:19everybody
48:20Desi
48:23here it is
48:24your moment of Zen
48:30back on Venezuela
48:31where Chris and I
48:32have both had a chance
48:33to be there
48:34I literally think
48:35they're going to put up
48:36a statue to President
48:37Trump lowering the price
48:38of gas in America
48:39so it's a
48:40forget that
48:40when are they going to
48:41do the statue