Transcript
00:03From the most trusted journalists at Comedy Central, it's America's only source for news.
00:10This is The Daily Show with your host, Jon Stewart.
00:40Thank you very much.
00:45The opening of the show is starting to turn into, like, a wrestling open.
00:49Let's do some satire!
00:53Welcome to The Daily Show. My name is Jon Stewart. Great show for you tonight.
00:55Later on, I will be speaking with civil liberties attorney Cindy Cohn.
01:00She's going to be doing it.
01:06We're going to discuss her new book about digital privacy.
01:09And then she will show me how to post a Close Friends story.
01:18I recently joined Instagram and I don't know how to do it.
01:24But as many of you know, it's the one-month anniversary of the Iran war.
01:28And as we all know, one month is the escalated threat anniversary.
01:32If the Strait of Hormuz is not immediately open for business, we will conclude by blowing up and completely obliterating
01:39all of their electric-generating plants, oil wells, and Kharg Island.
01:44You better open that strait or we're going to blow you up.
01:52More than we already blew you up. We're going to double blow you up.
01:55We're going to blow you up.
01:58That's where we are at in our war.
02:12I believe we've entered what General Patton used to refer to as the human centipede portion of the war.
02:21I urge you not to look that up.
02:24And I understand why the world would like the Strait of Hormuz open.
02:30The conflict is disrupting global supply chains for other surprising and essential products.
02:36Grain, nuts, oil, saffron, dates.
02:39Paddles and pickleballs.
02:46Yeah, uh, I'm not sure how that snuck into the essential products store.
02:52But I guess we all make sacrifices in wartime.
03:07My dearest Eliza.
03:10It's been over a month since I've played, well, not tennis, kind of like tennis.
03:18It's more like, like baby tennis.
03:22It's like, well, it's bigger than ping pong.
03:24You really have to try it, Eliza.
03:26It's so fun.
03:33Acting. Acting.
03:35That was acting.
03:36Uh, what else is being blocked?
03:39It's getting harder to get food that comes with pistachios.
03:42You can't make Dubai chocolate bars.
03:48Oh, no.
03:50What will our influencers stand in line to pretend to eat?
03:55Dubai chocolate, my God, that's been an American staple for tens of days.
04:01I can't believe how the news has to frame world events to try and make Americans care.
04:07The whole region is being flattened.
04:11Innocent people are dying.
04:13Their food and fuel are in total chaos.
04:15And our news is like, if this goes on any longer, say goodbye to your stuffed crust pizza.
04:26Are Americans losing anything else?
04:29It turns out there is another key material being affected by the war in Iran, fertilizer.
04:33There could be helium shortages.
04:35Yes, the gas that's used in party balloons.
04:41Helium is a fundamental gas used in the production of advanced chip technology.
04:45You don't have to dumb it down to make us care. Oh, this war could even be bad for your promposals.
04:52Like, come on!
04:55For more on how the closing of the Strait of Hormuz is affecting Americans, we go live to
04:59Ronny Chieng, Jordan Klepper, and Michael Kosta, who are on the ground right now in the Strait
05:04of Hormuz.
05:06Uh, gentlemen, thanks for joining us.
05:08Uh, you know what I'm gonna do?
05:09Ronny.
05:10Uh, Ronny Chieng.
05:12Ronny, I'm gonna start with you.
05:14Uh, uh, what-what's the latest?
05:16Well, Jon, the latest is the Iranians think they can just close the Strait of Hormuz and
05:21stop us from eating pistachios.
05:24Well, I got a message for them from America.
05:30They can come and get these nuts out of my mouth!
05:33That's right.
05:37That's right, Jon.
05:38That's right, Jon, and I also have a message.
05:41I just want to add that if the Ayatollah thinks he's gonna stop Americans from getting
05:44our supply of helium, well, then I've got one thing to say.
05:52You pussy-pitches can come get it out of my mouth.
05:59That's amazing.
06:01Uh, and, uh, Jordan Klepper.
06:03Uh, yes, Jon.
06:08Uh, what do you have?
06:11Is there anything-is there anything else that the Strait is affecting?
06:14Uh, yes, uh, fertilizer, uh, shortage.
06:18Uh, back to you, Jon.
06:23Is there any message that you want to send about the fertilizer shortage?
06:28Oh, yeah, Jon, sure.
06:30Uh, big one.
06:31If you think you're gonna take our fertilizer, well, you're gonna have to come and take it
06:38from my hands, Jon.
06:42But-but-but your hand?
06:44What the f***?
06:45Eat it!
06:47Yeah, Jordan, I got no problem sending a message, all right?
06:51Hey, yo, these nuts don't rhyme!
06:55I'm not taking a bite.
06:56I-I just killed 10 million brain cells for America.
06:59Take a bite.
07:00No, I don't want to take a bite, okay?
07:01I mean, Jordan, the other guys are making the Ayatollah take things from their mouth.
07:11Does feel like you might not be as patriotic.
07:15I-I-I don't know what putting these things in our mouths proves, Jon.
07:20Yeah, it proves that we have the resolve to do what we have to.
07:24We die for the cause.
07:26They had luxury foods and party balloons!
07:30This is literally processed shit!
07:33I mean, this-this strait is disrupting baklava.
07:36Can I have baklava?
07:37Let me do baklava!
07:37Just eat it, you baby!
07:39No!
07:40I don't want to eat!
07:41If you don't, Iran wins.
07:45Yeah!
07:46Yeah!
07:49You want Iran to win or Donald Trump?
07:53Are those my only choices?
07:54E-I-I!
07:55No! No!
07:56E-I-I!
07:57Come on, come on!
07:58Okay, okay, okay!
08:00Oh!
08:01Oh!
08:03Oh!
08:05You're happy now!
08:07You're happy!
08:07Let me get it around!
08:09Take the shit out of my mouth!
08:11Oh, yes, no!
08:14Disgusting. You know what? You know what? In a way, this is a metaphor, because as Americans
08:20in this quagmire with no end in sight, we are all eating shit.
08:34Yeah, but you did it for real. Yeah, it's very disgusting. Thank you, guys. Ronny Chieng, Michael
08:40Kosta, and Jordan Klepper, everybody. Good report. Good report. Now, meanwhile, and I thank all three of them
08:49for that, that brave reporting that they've done. Meanwhile, back on the home front, this weekend
08:55offered stark images articulating our nation's deep political divide, as eight to nine million
09:02people took to the streets for the No Kings protest. Yeah, they did it. While
09:10a slightly smaller group of, let's call them, heritage Americans took to the Reagan Meeting
09:17Room 1A on the second floor for their Yas Kings rally. If you want an object lesson on what a strong
09:26commander-in-chief looks like, take a look at the current resident of the Oval Office, President
09:31Donald J. Trump. What he's been able to do in one year is truly remarkable. Great president. Greatest
09:36president of my lifetime. That guy is an Energizer Bunny. We'll never get another president like
09:40Donald Trump. Never. And that's why it's important that we do everything that we can to try to get
09:45him re-elected. I'm sorry, that's Reverend Franklin Graham, saying praise the Lord and the Constitution.
09:58Now, look, far be it for me to question the wisdom of this year's CPAC convention, but clearly the vibes
10:06this year were slightly off. How many of you would like to see impeachment hearings?
10:13No, that was the wrong answer.
10:22That's a rookie crowd-work mistake, slap. See, they cheered for Trump's impeachment, not realizing that
10:30you were calling for the counter. But that's on you. Crowds are like dogs. They react to tone.
10:34You can get them excited about anything. Who wants to go to the vet? Who wants to get the balls
10:38cut off?
10:42See, in Trumpland, old Donald didn't let something as trivial as a war keep him from doing his rounds
10:49at the golf course and at a Saudi investment meeting in Miami. Because God forbid, during a war,
10:55he let the precipice of World War III yuck his yum in any way. That was Trump's weekend, and I
11:01find it so
11:02astounding that this nuclear-armed man-baby doesn't seem to have any understanding of the confusion
11:10and anxiety that his ill-planned adventure in Iran is causing this country. He's just trucking along
11:17like it's any old episode of The Apprentice. Here is Trump last night on Air Force One. Watch him try
11:24and focus for more than two sentences on the war he started before veering off into what really matters.
11:30It truly is regime change, and regime change is an imperative, but I think we have it automatically.
11:38I did something today. I, uh, we just got these in from the architects. A lot of people are
11:44talking about how beautiful the ball... For 150 years, they've wanted to build a ballroom at the White House.
11:54Here's another view. This is coming from right opposite the Treasury Building. Here's a view
12:00on the south with the porch. This is a view of it from the north. And there'll be Corinthian, which
12:08is
12:09considered the best, most beautiful by far.
12:14I know, I know what you're thinking. How do you get something that size on a plane?
12:25But I guess he's... he's allowed. It's his emotional support ballroom picture.
12:31Now, you may say, well, what prompted this presentation about the ballroom? Well, it turns
12:38out this incredibly long-winded, visual-aid-assisted timeshare presentation was spurred by
12:43a critical New York Times article that had the gall to suggest that sometimes Trump's plans don't hold
12:51up to scrutiny. The Times had architects analyze plans for the ballroom. They pointed out design flaws,
12:58like staircases that seem to lead nowhere.
13:03Oh, okay, woke New York Times.
13:08When M.C. Escher does that, it's art.
13:13Oh, but you put it on an actual building,
13:16and suddenly it's, this doesn't make any sense,
13:20and I appear to be trapped.
13:25For the ballroom, Trump will pull an all-nighter for a point-by-point rebuttal. For the war, he
13:33literally doesn't have the focus to answer one question about the dire consequences of his actions
13:40on his favorite network. I think it is alarming that we have not been able to see or hear from
13:49any of
13:49the Iranian people, and I think there is some general worry about them. Do you have any insight as to
13:56how they are doing? Do they have drinking water? Do they have food?
13:59Wow. Straightforward question. It asked the president to put himself in the shoes of those we're
14:05purportedly trying to liberate, and the suffering they may be going through. The Iranian people
14:11are hurting, sir. Do they have food? Right. I do. It's upsetting. I do. But first, you remember when we
14:19had lunch
14:19years ago, in the base of Trump Tower, when it was a brand-new building?
14:26So the point is, the Iranians might not have food, but you remember, you and I, about 12, 13 years
14:32ago,
14:32had a club sandwich. Remember that?
14:36A long time ago. Yes, a long time ago. And you haven't changed.
14:40You have not changed. Now, I'm not allowed to say this. It's the end of my political career. But
14:45you may be even better looking. Okay? So I don't know what you're doing,
14:48but I will not say that. I will not say that, because that will end my political career. You know,
14:54you're not allowed to say a woman's beautiful anymore.
14:56You know, it's funny, Dana. Your... your question about the... the suffering of the Iranian people has
15:01somehow made me horny. I don't know why.
15:05I guess you asked me, do they have food, uh, do they have water, and the whole time I'm thinking,
15:09like, what a piece of ass. What a delightful piece of ass you are. You remember that, Dana? Remember how
15:16I made you uncomfortable at lunch? You remember? I do that. I do that a lot to women. I make
15:23them
15:23uncomfortable. What was your question? I don't remember your question. The war, other than the
15:31occasional tweet, doesn't seem to occupy any space in Trump's brain. And by the way, not just for TV.
15:37On Friday, he addressed a room full of Saudi investors, who, you would think, might be very
15:42concerned about the bombings in their neighborhood. But he wants to let them know we don't have to
15:48talk about that at all. I'm asked to take a few questions, and unlike other politicians, they
15:55would like the questions screened. I don't ask for screening of the questions. You can ask me
15:59anything you want. You can talk sex. You can do whatever the hell you want.
16:20We... we can ask about sex?
16:28Okay.
16:40Whose name is that? Is that your name? Whose name is that?
16:45Who are you? I meant, like, general sex shit. Like, Loveline shit.
16:50Sorry.
16:55You know, honestly, his leering behavior is less commander-in-chief at war and more grandpa who's
17:02lost his filter in public. Instead of assuaging a nervous nation, he's just embarrassing the whole
17:08family at dinner, going, hey, do you see our waitress? A busty one, just like your grandma was.
17:21But like any good monarchy, Trump's loyal henchmen assure you that it's really the opposite of
17:27everything you're seeing. He has, uh, encyclopedic, molecular knowledge. I was on the airplane with him,
17:35and we were sitting across the table from each other. We started talking about Syria,
17:41and he got a... a placemat, and he turned it on its back, and then he took a Sharpie, and
17:47he drew a
17:48perfect map of the Mideast. And then he put the troop strength of every country on every border
17:56on that map.
18:01Has anyone thought about filming that?
18:05Let us see that! Because that's not what we see. What we see is a president four weeks into a
18:11war
18:11he has yet to fully explain, with objectives he has yet to fully define, only displaying molecular
18:16knowledge, in a Cabinet meeting, of his own pen preferences.
18:20So I came here to have thousand-dollar pens. And you know, you hand pens out. You're signing,
18:24and you hand them out. You're handing them to all these people. Beautiful pen. Ballpoint.
18:29Thousand. It was gold, silver. Going, handing out to kids that don't even know what they're... What is
18:33this, Mommy? So I take it out and I saw it, and there's no ink, and I got all you
18:38people looking.
18:39And you say, there must be something wrong with pen. Right here, this pen is an interesting example. It's
18:43the same thing. Here's what... I called the guy. I said, I'd like to use your pen, but I can't
18:48have a gray
18:49thing with a big S on it saying Sharpie. You said, why can't you make it nice? So what can
18:54you do? He
18:55said, I'll paint it black, sir, if you like. In gold. Almost real gold.
19:04That's a Cabinet meeting. During a war. Don't tell us what to use to draw the map.
19:11Just draw the map! You know, all we keep hearing from this administration is why the American people
19:17have to sacrifice for Trump's vision of America's greatness. That these temporary disruptions are
19:24just part of the process. And why can't we be patriots? We have to be patient. We have to suck
19:30it up.
19:30Whether it's high gas prices or whimsical tariff inflation or draconian ICE raids or temporary Bill
19:36of Rights suspensions, it's on us to understand. But Trump gets to be just the same old ain't-I-a-stinker,
19:43utterly self-absorbed, remember-when-I-used-to-want-a-hot-girl's-twat self.
19:50Can you imagine any other president, let alone a wartime president, being this indulgent?
19:56And they were a thousand dollars apiece. Beautiful pen. Ballpoint. Thousand. It was gold, silver, gorgeous. But
20:04I'm handing out to kids that don't even know what they're... What is this, Mommy?
20:07These kids, they're getting a pen for a thousand dollars. They have no idea what it is.
20:15Now, to be fair to Trump, pen technology in FDR's time was quite primitive.
20:21But Trump is the guy you want to be your king? That's who you want? It's important that we do
20:27everything that we can to try to get him re-elected. Really? This is the guy you'll break the Constitution
20:35for? The American monarchy begins with this guy?
20:42I did it. This is the line we face in this country: No Kings versus Kings. But a word of
20:49caution.
20:51Generally, with monarchies, the first guy is the best guy. It's the guy that's so good it makes the
20:59people want a king. A Charlemagne. An Alexander. A Ralph.
21:10But pretty soon, corruption and inbreeding take their toll on the monarchy and turn your king
21:18into this guy: old Charles the Second. Yeah, that's his real head.
21:29He liked cheese.
21:32It was a favorite food.
21:35So, go ahead. But just understand, we're starting a lot closer to Charles than Charlemagne.
21:45When we come back, Cindy Cohn will be joining me in the studio.
22:00Welcome back to The Daily Show. My guest tonight, she's the executive director of the Electronic
22:05Frontier Foundation and author of the new book Privacy's Defender: My 30-Year Fight Against
22:10Digital Surveillance. Please welcome to the program Cindy Cohn.
22:28Very, very nice to see you. Uh, I want to point out, uh, for the people at home who can't
22:33see, she is
22:34wearing a Let's Sue the Government T-shirt.
22:41We have great designers at EFF. They made this special. What is EFF, and what is that? The Electronic
22:47Frontier Foundation is the world's biggest and oldest digital rights organization. We were founded
22:52in 1990. Our basic goal is to make sure that when you go online, your rights go with you. So
22:58we do things
22:59around free speech, around privacy, and around trying to build an internet that... that stands with you
23:05rather than against you. How did you know? It's so prescient, the idea. I'm thinking back to 1990,
23:13and the internet was... you had a little AOL disc. Yeah, maybe '95 or '96. But whatever. Right, it
23:20might
23:21have been. It was just, like, alt.binaries message boards. Yeah. How did you all have the prescience
23:29to see that this could be turned into a tool of surveillance and of, uh, government repression?
23:37Yeah, I mean, the founders of the EFF, uh, John Perry Barlow, uh, John Gilmore, and Mitch Kapor,
23:43were involved in some early FBI missteps around the internet. And they worked with the FBI, or they
23:51were, like... no, they were, like, questioned by the FBI. And there were a lot of raids going on, and,
23:57uh, there
23:58was, uh... they weren't raided, but they were asked a bunch of questions. And they realized that the
24:02government didn't really understand how the internet was working either. At least the FBI didn't.
24:06And they thought, we need to start an organization that's gonna think about how our civil liberties
24:13work in the digital age. Um, and a couple of them are pretty geeky, and they realized that the internet
24:19we were building was not very secure. It didn't have a lot of privacy or security in it. Right. And
24:26the
24:26first fight we did was to try to free up a technology called encryption, that now is the reason that
24:33we have
24:33the privacy and security we have online. Most of it. This was your first, uh, lawsuit, and you joined
24:38this because you're, uh, you're a lawyer and you're interested in this space, or you had met these guys?
24:44How did... how did your involvement... I met them kind of by chance, honestly. They showed up at a party
24:48at my
24:49house. Um, some of them. Um, but I was doing... This is all sounding very... Where did this take place?
24:56San Francisco. Okay, in the Haight? Yeah, yeah. Now I'm understanding. But there's a tub of Kool-Aid.
25:02Somebody put something in something. It's got a little messy. No. Um, I was a... I was a human rights
25:07lawyer. I had done human rights. I'd been in Geneva. And I came back to San Francisco to try to
25:12start a
25:12regular law practice. And I met some of these guys kind of really by chance. And they were already
25:19online. You know, this was 1990, 1991. They were online at this very early internet, before we had the
25:25World Wide Web. But they were doing really cool things. They were collaborating across distance. They
25:31were having these long chats, um, stuff that we take for granted now, but it was pretty magical back
25:36then. Um, it's really interesting, too, that they, and you, decided... because I remember that time as a gold
25:43rush. So the '90s, as it started, this is when, like, every internet startup... anybody that you knew that was
25:49even adjacent, if they had a word processor, they were a startup. Yeah. And they wanted VC funding, and they
25:56were going to, you know, make a ton of money. And you and this group decide, we're not going in
26:02that
26:02direction. We're going to go in the protecting-citizens'-rights direction. Where the real money is.
26:09Yeah. Why would you... My stock options are so great. Right. Yeah, yeah. I mean, look, I think that there
26:17are
26:17people who, you know... plenty of people in the early internet were interested in making money. I don't have
26:21any judgment about that. But we were thinking about how could this be a technology that supports people,
26:27how it could be a technology of freedom, how it could give people, you know, more power against the
26:33powerful forces in their lives. And they were, again, very forward-looking, and they, you know, created an
26:38organization to try to do that. Um, I don't know where it comes from for me. I've always just wanted
26:44to make
26:45the world better. And then I found... you know, I'm not a technical person, but I kind of landed in
26:49this place,
26:50in this time, and was able to do it. In '90, and then... So you're, you're working over this sort
26:55of decade,
26:56and you get this great encryption case, and you make it so that people have a right to encrypt their
27:00data
27:01and... and these kinds of things. But then 9/11 happens. Correct. And that really supercharges the
27:07surveillance state. Yeah, after 9/11, the Patriot Act... both the Patriot Act, which they did above board,
27:14and a lot of stuff that we later discovered, as a country, they were doing secretly.
27:18This metadata, this hoovering up of all of our data and putting it through process. Yeah, tapping
27:23into the internet backbone, collecting metadata, collecting telephone records. Um, some of this
27:29you might be able to read in the Patriot Act, but most of it was really under, under, you know,
27:34secret
27:35interpretations and things like that. Right. Um, but they had really flipped from the idea that, you know,
27:41you... you needed probable cause, you needed a warrant, you needed individualized suspicion, to doing
27:46mass surveillance first and figuring out what they need second. And, uh, that has only increased as the
27:52technology has increased. Um, was it that they... they had decided to treat the virtual world as separate from
28:00the physical world? In other words, the rules of, like, the Fourth Amendment might not apply to your digital
28:06life in the way that it would apply to your house. Yeah, although they were using some earlier stuff
28:13that's in the non-digital world, this idea that metadata isn't protected. Um, but the content of...
28:18Metadata meaning, uh, like the date and time and location and the other things that go along with
28:25your digital life. Correct. But they were using an analogy from envelopes and letters in, you know, the
28:32actual physical world. But in the digital world, the metadata is much more powerful. It can tell you a lot.
28:38There's a very famous quote, you know, from a member of the NSA, saying we kill people with metadata. Right.
28:43Because when you are able to collect it at scale and analyze it, you can do a lot more with
28:50it than
28:50kind of the old-school envelopes. And... and, you know... And I don't remember anybody saying we kill people with
28:56envelopes. It hasn't happened that I know of. But, but, but... So this is one of the things. They took this
29:01distinction,
29:02that is actually an offline distinction, uh, that may have made some sense. I'm not sure it did. But in
29:07the online world, it really has eroded our privacy, and it's being used in ways now that are increasingly
29:13visible. Right. I mean, in the... you know, there's testimony in Oregon just last month from an ICE agent saying
29:18they used a Google-like thing to identify where, uh, people were that they ended up doing an ICE raid.
29:25So,
29:25um, that's... And the company Palantir is very proud of the fact that they're sending, uh, metadata
29:34to... to accomplish those things. That they're... that these large tech companies are working with the
29:40company. So now, is your fight... are you fighting two behemoths? Are you fighting the government and also
29:48the tech companies? Absolutely. I mean, we kind of always were, right? That if you think about the post-
29:539/11 surveillance, they didn't come to us and ask us for our telephone records, right? They went to the
29:57telephone companies. That's where they got it. So they've always gone through the companies that
30:03provide us services, um, in order to get information. But now the number one purchaser of information from
30:09data brokers is the government. And, you know, FBI Director Kash Patel just said to Congress, we're going
30:16to keep doing it. So, um, it was something they were doing a little of, and now they're really super-
30:21charging it and doing it at a level that's much greater than it was before. But you're right, we...
30:25we don't really have the luxury of a choice about whether we care more about government surveillance or
30:30private surveillance. They... they work hand in hand, and kind of always have. How receptive have the
30:37courts been to the arguments that people's privacy is part and parcel of their security?
30:45I mean, good and bad. Um, I think we've got a Supreme Court that's somewhat hostile to the idea that
30:52privacy is a harm at all. Um, but we see a lot of lower courts... Yeah. ...being courageous. I mean,
30:57in many other
30:58issues we're... we're seeing the same dynamic, where, um, we've got a lot of lower courts who are worried
31:03about people's privacy, both in the commercial, you know, context and in the law enforcement context. We've
31:10been chipping away at this post-9/11 surveillance. We've made a lot of progress over time, not kind of
31:15in a, you know, one fell swoop, but chipping away and chipping away. And Congress has stepped in and done
31:20a little. Um, they need to do a lot more, and there's a big vote coming up in April on
31:27some of the mass
31:27surveillance. It's called FISA Section 702. So it's... it's... we're continuing the fight. We have to continue
31:35the fight. But we've also come quite a ways from where we were right after 9/11, where all this
31:40stuff was secret and massive. What do you want people to know? How are we culpable in the dismantling
31:45of our own security? And by that I mean, I don't understand at all. I just click sure, like, whenever
31:56it always comes on, and, like, you know, uh, is it okay if we try... and I'm just like, I
31:59just want to buy
32:00shoes. Like, so I just click and I do things, and I'm just... I don't know what I've given up.
32:07Yeah, I... I
32:08really think that's blaming the victim. You're the victim here. This is... thank God... this is cooked. This is
32:13not set up so you have a real choice. I don't... Thank you. So we need to fix this, but
32:19we need to fix it
32:19with law, with policy, maybe with better technologies. But, like, I think that the idea that individuals
32:25should bear responsibility for this, it's pretty far down on the list of things, because it's not
32:31like we have other options, right? I mean, we do for some things. I think it's great if you want
32:36to use
32:36an end-to-end encrypted tool like Signal rather than, you know, a... a tool that's not. There are tools
32:42that
32:42can help you. But I don't think that you should be blamed, because the game is rigged against you on
32:49a
32:49lot of these privacy things. And, you know, the original... I think one of the original sins of the
32:55internet is everything is these little click-wraps, where, I think somebody did a study, it would take
32:59you 37 years to read all the terms of service that you agree to in an ordinary internet experience.
33:05That's crazy, because I'm halfway through. I'm about halfway through. It's not fair, and it's not right.
33:10We have to fix this with something other than personal responsibility. So who... who bears that
33:15responsibility, then? Who is... if I'm the victim, who is the party that I'm going after? Am I going after
33:24Palantir? Am I going after... you know, they just had this big lawsuit, uh, where I think Meta was held
33:31responsible for, uh, certain... So... so they get held responsible for that? Or is it that... what am I trying
33:39to prevent? Am I trying to prevent the government from getting access to all of this data that should
33:46be proprietary for me? Am I trying to keep Meta and, uh, X and Instagram, all those other mega-corporations,
33:55or Palantir, from being able to... Or even AI. Look, AI supercharges... here's what's crazy. So AI basically
34:03runs on... it's basically strip-mining the totality of human existence. It basically takes everything that
34:11we have ever accomplished through chemistry and poetry and art and music, and it's fed into it,
34:19and it gets to take everything that we are, almost our essence. And then if you ask OpenAI,
34:26well, what are you doing with it, they go, that's proprietary. Well, so there's a few things there, and
34:32you... there's a lot going on in that question. But I'm sorry, you are the victim. And the first thing
34:38that you need to say is, I want a comprehensive privacy law. I want a law, okay, that limits when...
34:44what the... what these companies can track about us. Just limits it. We can do this. We don't need a surveillance
34:52business model. You know, Moses didn't come down from the, you know, from the... the mountain with stone
34:57tablets that say the only way to make money on the internet is by spying on everyone. So that's the
35:01good news. We can do better. To be fair, wearing those Meta glasses, he was recording. Well, because he
35:06wanted to know who people were. But I think that there's... that... I also think... So, a comprehensive
35:11privacy law that says when information is collected by you... from you... for one purpose, it can't be used
35:17for another purpose without a lot more than we... than a click-wrap here, right? And a very, very high
35:24standard. And
35:24one of the things that can't be done is just handing it to the government without a warrant. Who fights
35:29that, the government or the tech company? They both do. We got to take them both on. But I think
35:33we...
35:33we have to. Otherwise, we're... you know, we can lose now, or we can fight and lose later. I think
35:39we need
35:39to fight. Why is this technology so difficult for the government to wrap its arms around? You know,
35:44there were very clearly guardrails put into place for radio and television and the airwaves, and it was
35:50considered a public good, and it was a certain utility, and, uh, there were a lot of responsibilities
35:55associated with it. Why has this been so much more difficult to wrap our arms around? Is it because
36:01it's faster? No, it's because it's not... Um, I mean, the airwaves are regulated because there was a
36:07limited amount of them. That's why cable doesn't have nearly the regulation that the broadcast... Oh, bless
36:12your heart. That is correct. Um, yeah. I mean, so... but the internet wasn't... isn't... it isn't limited. Um, it
36:19doesn't
36:19have limited, you know, broadcast airwaves or things like that. So it... it could let everybody speak. I think that's
36:25a
36:25good thing. And I especially think that, with the current regulators we have in power, we should be
36:30very careful about arguing for a new regulatory structure over the internet. I... I think that... I'd like
36:38to see a comprehensive privacy law. There's plenty of laws that we could pass. But I'm not sure, right,
36:43granting... you know, creating a second Brendan Carr for the internet is good for the internet. Right, right.
36:51Well, it's hard, because, you know, what... what you hope is... is that regulation is written... You know, the
36:58difficulty sometimes with government is they write regulation, and then the courts, who are textualists...
37:02Like, for instance, there's these immigration cases that are now, uh, rolling through the courts, and the
37:06idea for immigration is the phrase arrive on America's shores, and they're all arguing about, does that
37:14mean you're first in line at the port of entry? Does that mean... are... if you're fifth in line, is
37:18that okay?
37:19are and it's really a failure of the imagination of congress to write these laws but we we have to
37:25guard against these false choices yes i would think of i don't want to regulate harm reduction because i'm
37:35worried that the government may not be what i like if they misuse it well i mean i i think
37:42that's right
37:43fully i don't think it's a it's a regulate or not kind of choice but i think you do need
37:47to think about what you're regulating and why and who you're empowering and a lot of the things that we see
37:53coming out of congress right now right that are regulation would either entrench the current tech
37:58giants and make it impossible for somebody else to start an a different kind of social network talk
38:04about that what do you mean by that i mean some of the stuff that open ai said they wanted
38:08to do for ai that was regulatory basically would mean that we could never have civic ai we could never have a you
38:14know community uh kinds of of of they would monopolize the space because they would have the power
38:21to build in the loopholes or fight regulation or do any of those things but don't they really have i
38:27mean doesn't that exist now well it exists now but we could make it worse right i mean we don't
38:31double down on that so here's an interesting one i'm curious what you would say about should we hold social
38:39media companies liable for harms or do you consider that where does that fit on your shouldn't restrict
38:47free speech with should protect americans from harm i mean i think that one of the worries that i have
38:53on the regulatory on the legal side is that the republican definition of harm and the democratic
38:59definition of harm are very very different right the republicans don't think any lgbtq content they think
39:05that harms kids right and so we're seeing a couple of laws being floated in congress now that will hold
39:11the companies responsible for harm so what that's going to mean is they're not going to let any of
39:15that content on it because if somebody claims they were harmed because their kids saw a trans person
39:19they're going to be able to sue the company um so i worry about until we get really clear on
39:26what we mean by harms and shared agreement i right now i think you really have to ask yourself like why are
39:32the republicans supportive of this and how are they defining this um and some of the uh some democrats
39:38should know better in my in my opinion because i think they're getting played um i think that for
39:44you know that's that's a kind of different question than your social media question um the the individual
39:50liability questions i i think that's a hard one too sometimes because i think that it's easy to blame
39:56technology for what are kind of complicated human problems i'm not saying it doesn't have the reason why i
40:01say it is is because in my mind meta instagram twitter are not free speech they're algorithmically
40:10driven speech it's processed like like in the way doritos is food like so it strikes me as like
40:17that that's actually not because free speech doesn't uh have a hierarchy there is not blue checks on free
40:24speech there's not uh ways to um that that you know their their business model does not monetize
40:33hostility or other things like i don't i don't consider those social media companies free speech i
40:40consider it monetized incentivized algorithmic clusterf**ks that exist to create
40:47i mean i i think yeah i think this might be a difference where we disagree because i really
40:52think that um you decide who gets to go on your show every night you decide what goes on the
40:57front of your show newspapers decide what goes on the headline and they do it based on how to try to
41:02sell more newspapers um but i'm not defending everything i don't like these companies don't
41:07get me wrong i want to undercut their entire business model sure but i think that when i think that
41:12when your your your strategy for doing that is encouraging censorship there's a lot of people
41:18who don't get to speak they don't get to be on your show they don't get to be on the
41:21new york times but they get to speak here's the flip side of it and i would just push back gently is
41:26that i think the current toxicity of social media is far more censorious of free speech most of the people
41:36that i know don't want to engage in that toxic soup the minute they do they're hit with
41:41all kinds of terrible terrible things and like reddit's an example reddit was a sinkhole like
41:4910 15 years ago that was a shithole and they cleaned it up and you can actually go on it
41:54now and have a human experience right and i think that it's important to continue to foster places for
42:01people to go so that when you look around and you realize you're in the nazi bar you can leave
42:05and go to another bar but the thing is you don't know the algorithm doesn't tell you that they've notified
42:12the nazis that you're there it's not transparent but like on a newspaper when they say our top story
42:19you see it but it's like with advertising they outlawed subliminal messaging because it's not
42:27conscious the algorithm is unconscious it's doing things to you you don't see in the way that
42:33you don't see that in a food lab they're taking a raccoon's anus and turning it into raspberry
42:38flavor i think that the worry that i have yeah yeah is that if you want somebody to develop a
42:45better version yes you can't create instant liability anytime somebody gets mad that their kids saw
42:50trans content but that you can't we need we need no we yeah we need we need better options and
42:58people are developing them right there's the whole mastodon universe i know it's not very big yet
43:03but it's a decentralized place where people can build safe communities for themselves there are
43:09think about it as environmental protection can't we just view it as in the same way we view environmental
43:13protection if there are toxins we put it in the air and downstream of those are decisions that we make
43:20and when you put those regulations in you could say well that's just going to help the big corporations
43:23and all that but don't we have to at some level use regulation to mitigate very real world harms
43:30that are occurring at this moment through those companies i think that when you pick censorship
43:36as your strategy you are dealing with no that's what you want to do you want to create liability for
43:42the company based on things that people say to each other on the network and what you're doing is
43:47you're saying you should make sure that those people can't speak on your network because they're
43:50causing harm on your platform and i worry that we do that don't we do that already with all forms
43:56of communication i can't i mean there are libel laws there are slander isn't that censorship i can't say
44:03things on this show that are patently untrue with malevolence because we we regulate that harm well i
44:10mean you can always sue the people who are harming you on the platforms the question is secondary
44:15liability cat turd 161 yeah yeah i mean honestly but also i just want people to be able to leave
44:22and i think that if we you know we wrote a paper a little while ago called privacy first and we did
44:27a little thought experiment about all of these social harms that are happening in other places and what
44:33the world would look like if we just cut the knees out of the business model of surveillance
44:39and and you know they don't go away but they get a lot smaller and i think that that there's
44:46a lot of things that we could try i'd also like to see a lot more support for kids i really think that
44:52uh a lot of the cases that are horrible that we're seeing there's usually a lot of other things
44:57going on in these kids lives and we don't do a good job of taking care of them but i
45:01also i worry i think that the censorship strategy feels really easy and it feels uh really uh powerful and i worry
45:10it's going to backfire i really do yeah yeah i i understand i i guess my only point is i
45:17think we define censorship slightly differently we might and and we always go and i i look at it
45:23more as like with cigarette smoking like social media to me is like secondhand smoke sometimes
45:32in that i didn't choose to do it but because of the algorithm it still shows up for me and
45:38if you shipped over to blue sky you don't have an algorithm feeding you things i mean if i shipped over to
45:42blue sky then the world no longer makes sense i'm sorry about that i mean honestly but i think that the
45:47answer is more options for people i think that a world in which there are five big media companies
45:52that decide everything that we see and how we see it the answer isn't to try to take a dictator
45:58and make them a better dictator right we've got these these social media dictators the answer is to get rid of
46:03the dictators and make them less important all right well done and that's and we do that through
46:10the privacy act a privacy the privacy act and defending our data and defending our data i love
46:15it well thank you for uh a really thought-provoking and inspiring uh conversation that truly does make
46:22me look at it in a different way because i'm coming at it from this way and now i'm seeing
46:27it a bit from this way and i appreciate it this was fun
46:36cindy's book privacy's defender it's a great read cindy cohn we're going to take a quick break we'll
46:42be right back after this
47:02hey that's our show for tonight before we go we're going to check in with your host for the
47:05rest of the week miss desi lydic desi
47:12what do you have for the people this week oh big holiday week jon passover and easter and i for
47:21one could not be more excited which is why i brought some delicious chocolate eggs oh for me yeah yeah
47:30and for you a traditional jewish shank bone
47:46oh you that's thank you um
47:51you shouldn't have
47:52oh try it jon try it now
48:01it's not really
48:02may i have one of the chocolate eggs and ruin your appetite for the shank bone no way
48:09eat up come on cheers
48:18happy holidays everybody
48:23here it is your moment of zen
48:30back on venezuela where chris and i have both had a chance to be there
48:34i literally think they're going to put up a statue of president trump lowering the price of gas in
48:39america so it's forget that when are they going to do the statue
48:51sorry