The ventral tegmental area and nucleus accumbens are in the "reward pathway" of a human brain. Morgan Freeman says, "A surge of dopamine can trigger pleasant feelings."
A DNA bank can store the genetic codes of hundreds of thousands of people; companies like 23andMe and Ancestry.com will sequence a person's DNA. Steve Mann's ideas inspired Google Glass.
Thanks for watching. Follow for more videos.
#cosmosspacescience
#throughthewormhole
#season7
#episode2
#cosmology
#astronomy
#spacetime
#spacescience
#space
#nasa
#spacedocumentary
#morganfreeman
#privacy
Category: 📚 Learning

Transcript
00:00We live under a billion unblinking eyes.
00:06A global surveillance system that solves crimes,
00:10uncovers terrorist plots,
00:13and helps stop abusive power.
00:16But are we ready for a world without secrets?
00:21Where not even our homes are off limits.
00:25And corporations know our every desire.
00:27Should we say goodbye to our privacy?
00:31Or is it time for the watched to become the watchers?
00:40Space.
00:42Time.
00:43Life itself.
00:47The secrets of the cosmos lie through the wormhole.
00:57Ever have the feeling you're being watched?
01:08Well, you probably are.
01:11If you live in a large city, surveillance cameras take your picture hundreds of times per day.
01:17Every transaction you make is electronically logged.
01:21Scanners at airports can peer through your clothes.
01:25The latest models can even detect your emotional state.
01:28And in those moments when you're not being tracked,
01:31we're busy giving away our personal information on social media.
01:35We think of our privacy as a fundamental right.
01:40Now it appears to be on the brink of extinction.
01:44Which sounds like a nightmare.
01:47But is it?
01:48This footage was shot by the Hawkeye 2 surveillance camera flying two miles above Ciudad Juarez in Mexico.
02:02Once every second for hours on end, it takes a picture of the entire city.
02:07Here, it documents the murder of a police officer by members of a drug cartel.
02:12But it also captures the movements of the assassins.
02:16It tracks their cars as they leave the scene.
02:19And leads the police to their hideout.
02:22Cities around the world are beginning to use these total surveillance systems.
02:27One could be watching you right now.
02:32Nick Bostrom runs the Future of Humanity Institute at Oxford University.
02:36He believes constant surveillance will radically reshape our lives.
02:42But we won't end up fearing it like Big Brother.
02:46Nick believes we'll embrace it.
02:49Surveillance technology might be one of those things that could change social dynamics in some fairly fundamental way.
02:55It's already in urban environments.
02:58There are a lot of cameras looking at us all the time.
03:01So it's a lot of eyeballs, but they are kind of semi-isolated.
03:04An obvious next step, where all these video feeds are stored in perpetuity and coupled with facial recognition systems.
03:13So you could automatically tag and keep track of where any individual has been, whom they have been talking with, what they have been doing.
03:24That sounds like a bad thing, but it doesn't have to be.
03:29Think about contagious diseases.
03:31The virus this man is carrying could spread across the city in just a few days.
03:37It could start a pandemic that kills tens of thousands.
03:41If you have a new outbreak, you have SARS or H1N1 or some new disease that pops up.
03:49It gets really important to try to trace who might have been exposed to the virus.
03:54And it's painstaking work. You have to interview the people and try to get who they have interacted with, and then you have to go and interview those people.
03:59But constant surveillance could spot the origins of the outbreaks in real time, locating the infected, dispatching medical teams to them, and establishing quarantines.
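The tracing step Bostrom describes, fanning out from one infected person through recorded contacts, is essentially a graph search. A minimal sketch, where the contact log, names, and hop limit are all invented for illustration:

```python
from collections import deque

def trace_exposures(contact_log, patient_zero, max_hops=2):
    """Breadth-first search over a contact graph: given records of who met
    whom (e.g. from camera co-location data), find everyone within
    max_hops contacts of the index case, with their distance in hops."""
    exposed = {patient_zero: 0}
    queue = deque([patient_zero])
    while queue:
        person = queue.popleft()
        if exposed[person] == max_hops:
            continue  # don't expand beyond the hop limit
        for a, b in contact_log:
            other = b if a == person else a if b == person else None
            if other is not None and other not in exposed:
                exposed[other] = exposed[person] + 1
                queue.append(other)
    return exposed

log = [("ana", "ben"), ("ben", "cara"), ("cara", "dev"), ("eve", "dev")]
print(trace_exposures(log, "ana"))  # → {'ana': 0, 'ben': 1, 'cara': 2}
```

Doing this by interviews takes days per hop; over a stored surveillance feed it is a single query, which is exactly the trade-off the narration is weighing.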
04:13Please return to your homes. This area is under temporary quarantine.
04:19Those things obviously can become more efficient the more detailed information you have.
04:25And if you nip it in the bud, potentially save millions of lives.
04:29Imagine if every single person you interacted with were tracked 24-7.
04:34Actually to be able to see what somebody has been up to in the past. Have they kept their promises?
04:41There are a lot of jerks and cheaters in the world who get away with it.
04:44And by the time people wise up, they have moved on to their next victims.
04:48It's kind of nice to be able to disempower the jerks and cheaters and encourage more people to behave in ways that are good.
04:58And if cameras track you every moment of the day, some aspects of your life would become a lot more convenient.
05:06We could go into a shop and just take what you need.
05:09And the camera recognizes who you are and automatically brings it up to your bank account and it's all taken care of.
05:16If you hadn't yet kicked the old habit of carrying a wallet, you'd never have to worry about remembering where you left it.
05:22In some ways, it's actually a return to a more normal human condition.
05:29We used to live in small tribes, small bands.
05:32You kind of know what everybody is doing, who they are, what they are up to, what they have been doing in the past.
05:37In some respects, it's not completely novel. It might be more a return to normalcy.
05:41Life under global surveillance might resemble life in a small village.
05:46But could we adapt to being constantly surveilled, even inside our own homes?
05:53Most people say they don't like being watched when they're eating, washing, or doing anything in the nude.
06:01But our homes are already full of cameras.
06:05From security cameras and cell phones to laptops and TVs.
06:10They're even hidden inside clocks that keep an eye on the nanny.
06:13You might think these cameras are harmless, but they all connect to the internet.
06:20Which means they can be hacked.
06:26Cognitive scientist Antti Oulasvirta lives in Finland, a country known for its notoriously shy people.
06:35They say you can spot an extroverted Finn because they're looking at your shoes, not their own.
06:40Antti realized his countrymen would be perfect guinea pigs for an experiment to see how people react to having no privacy.
06:50I have a motivation to keep the kitchen clean, but I have an extra motivation today, this camera.
06:57Antti wanted to see how people's behavior changed when their homes were wired for constant surveillance.
07:04So he persuaded several households to do what for Finns is unthinkable, submit to being watched for an entire year.
07:14We wired 10 households in Finland for 12 months, including cameras and microphones and even screen capture.
07:22So this is Bob, and Bob looks like a regular piece of home electronics, but it's not.
07:27Bob, which stands for Behavioral Observation System, records all the video and audio around the house.
07:37Bob also keeps track of all email, web traffic, online purchases, and television viewing habits.
07:44So we had them covered pretty well in all areas of ubiquitous surveillance.
07:47We didn't want to bust people for doing anything wrong. We simply wanted to see how they would react to not being able to be alone in their own homes.
07:56In the first weeks of the study, Antti noticed his subjects appeared unsettled by the presence of the cameras.
08:03They had to keep their impulses in check, control their shouting, and if there was a stressful situation playing out in their lives, that could amplify the stress.
08:13And, no surprise, they were sensitive about being naked.
08:18Being naked was, of course, an issue, and we left them a few spots in, for example, bathrooms, where they could be alone without the cameras.
08:27They were like fish in a fishbowl.
08:30But as time went on, Antti noticed something surprising.
08:35So after surveilling people for six months, we asked them to draw us a graph of their stress levels.
08:41They were stressed out in the beginning, but after a while, it leveled off.
08:48Eventually, the subjects started to relax.
08:51They stopped worrying about being seen naked.
08:54So the mentality was that, now you've seen me once walking down the kitchen naked, so what's the point of continuing to hide?
09:01And when they really needed privacy for a delicate conversation, they figured out how to get it.
09:06They went to cafes to have private conversations, and they avoided the cameras in creative ways.
09:13Antti's study shows we can adapt to almost constant surveillance.
09:18He admits that this was a special case.
09:21The subjects knew him and trusted him not to share the data.
09:24But to Antti's surprise, some people didn't care who was watching.
09:30We asked the subjects who would they least want to share the data with, and the most striking feature was some went as far as saying that it doesn't matter to whom you share the data.
09:39We have an amazing ability to adapt to changing environments.
09:47But if we learn to ignore cameras, it won't be long before we stop thinking about who's watching and why they are watching.
09:56They say, ignorance is bliss.
10:01But in this case, what you don't know could hurt you.
10:06In George Orwell's novel, 1984, everyone lived under the watchful eye of an authoritarian government, Big Brother.
10:19Big Brother.
10:20Today, in real life, there is a different watchful eye we should worry about.
10:27Big business.
10:29Ask yourself this.
10:31Do you know when corporations are watching you?
10:35Or how much they already know about you?
10:39The answer will shock you.
10:41Alessandro Acquisti always puts safety first.
10:52He knows the helmet will keep him safe on the road.
10:55And the same goes for his social media profile picture.
10:59I do have a Facebook profile.
11:01For my profile picture, I wear a motorcycle helmet.
11:06Your name and your profile picture are public by default.
11:11Therefore, they are searchable.
11:13So my question is, how much can I learn about you starting just from a photo of your face?
11:21Alessandro is a behavioral economist at Carnegie Mellon University in Pittsburgh.
11:27He's trying to find out what private corporations might be able to find out about you just by taking a picture of your face.
11:35Would you like to help us with a study?
11:36Sure.
11:37So Alessandro and his team developed their own data mining app.
11:42We took a shot of their faces and then you wait a few seconds.
11:47In the meanwhile, the shot is being uploaded to a cloud where we had previously downloaded a few hundred thousand images from Facebook profiles.
11:58The app uses commercially available facial recognition software to find a matching face online.
12:06This information is sent back to the phone and overlaid on the face of the subject in front of you.
12:13To see if we can identify you and see what information we can.
12:15Once it matches the photo to a social media profile, the software can find out someone's name, their city, their interests, and much more.
12:24Starting from just one snapshot of a person, no name and no personal information, we were able to lock onto the Facebook profiles of these subjects.
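The matching step Acquisti describes, comparing one snapshot against a gallery of scraped profile photos, is typically done by turning each face into a numeric embedding and taking the nearest neighbor. A toy sketch with invented three-number "embeddings" and hypothetical profile names (a real system would get these vectors from commercial face-recognition software):

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical gallery: face embeddings scraped from public profiles.
gallery = {
    "profile_A": [0.9, 0.1, 0.3],
    "profile_B": [0.2, 0.8, 0.4],
}

def identify(probe, gallery, threshold=0.9):
    """Return the best-matching profile if its similarity clears a
    threshold; otherwise report no match."""
    best = max(gallery, key=lambda name: cosine(probe, gallery[name]))
    return best if cosine(probe, gallery[best]) >= threshold else None

print(identify([0.85, 0.15, 0.32], gallery))  # → profile_A
```

The point of the demonstration survives the simplification: once the probe locks onto a public profile, the name attached to that profile unlocks everything else.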
12:36Wow, no way.
12:37And once you get to the Facebook profile, a world of information opens up.
12:43That's really eerie.
12:45Most of us post photos of ourselves online, but not everyone realizes that photos are also data and no one stole the data from us.
12:54We are willingly and publicly disclosing it.
12:59With a name and birthplace in hand, deeper corporate data mining can reveal date of birth, criminal record and can even make a close guess at someone's social security number.
13:11That's one digit away from my actual social security number. How'd you predict that?
13:15Once they have this information, there is virtually no limit to what else they may be able to find out about you.
13:24Alessandro says he designed this software demonstration as a warning.
13:29It took very little effort to develop it.
13:32Imagine what the corporations that rule the internet might already be doing.
13:37On any given day, 2.1 billion people are active on social media.
13:43They tweet 500 million times, they share on Facebook 1 billion times.
13:48They upload 1.8 billion photos.
13:51And every time you click on like, a record is made of what you like.
13:56Today, the internet is essentially a surveillance economy.
14:02Companies like Amazon can sell millions of dollars of merchandise in an hour.
14:06And much of that revenue comes through ads which are tailored to your preferences.
14:14The more a company can know you, the more they can manipulate you into clicking this link or buying this product.
14:20Alessandro is convinced we'll keep giving up our personal data and our privacy because corporations make it so easy for us.
14:30Marketers entice us into revealing more and more personal information.
14:37They work hard to make it a good experience for us.
14:40To us, it looks like the Garden of Eden where everything is free.
14:44You get free apps, free content, you get to play Angry Birds.
14:48All of these in exchange for, say, having your location tracked 1,000 times per day.
14:55Once corporations have collected enough of your location history, they know you better than you know yourself.
15:02They can predict where you will be at a particular time of day with 80% accuracy up to one year into the future.
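The predictors behind claims like this are built from mobility models fit to location history; a crude version just learns your most frequent place for each hour of the day. A sketch under that simplifying assumption, with invented GPS "pings" (published predictors are more sophisticated, e.g. Markov models over place sequences):

```python
from collections import Counter, defaultdict

def train(history):
    """history: (hour_of_day, place) observations, e.g. from phone GPS
    pings. Learns the most-visited place for each hour."""
    by_hour = defaultdict(Counter)
    for hour, place in history:
        by_hour[hour][place] += 1
    return {h: c.most_common(1)[0][0] for h, c in by_hour.items()}

def predict(model, hour):
    """Best guess for where this person will be at the given hour."""
    return model.get(hour)

pings = [(8, "cafe"), (8, "cafe"), (8, "gym"), (13, "office"), (13, "office")]
model = train(pings)
print(predict(model, 8))   # → cafe
print(predict(model, 13))  # → office
```

Even this naive model gets surprisingly far, because most people's days are highly routine; that regularity is what makes year-ahead prediction plausible.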
15:10With that kind of information, your phone, and the companies that control the data in your phone, will be able quite literally to steer your day.
15:22They may buy new shoes for you before you even know you need them.
15:27They may influence your decisions, which job you're going to take.
15:31Alessandro believes we're losing the battle for our privacy.
15:35And it's a battle we don't even know we're fighting.
15:37The problem is that the system is basically built around trying to nudge us into revealing more and more personal information so that we no longer know whether what's being collected about us will be used in our best interest or will be used to the best interest of another entity.
15:58But even if we wise up, we may have a hard time stopping ourselves from sharing our most intimate likes and needs online.
16:07Because this scientist believes we may already be hooked on sharing.
16:12Like a drug.
16:16People like to share.
16:19After all, we are social animals.
16:22But somehow the age of social media has got us sharing more and more.
16:27No matter how uninteresting it might be.
16:34Even though we know every post gives marketers more and more information about us.
16:40So, why do we do it?
16:45And could we stop ourselves even if we try?
16:51Psychologist Diana Tamir knows she's sometimes guilty of oversharing.
16:57Are you going to try this one?
16:58I'm going to try this one.
16:59Especially when she tries out her favorite new hobby.
17:05It's super satisfying to be able to do a route that you weren't able to do before.
17:09That's part of the joy of rock climbing.
17:12Getting to the top of a wall can feel rewarding.
17:15You don't have to conquer K2 to feel that basic human impulse.
17:19The impulse to talk about yourself.
17:28Check it out.
17:30People talk about themselves all the time.
17:31They talk about themselves when they're having a conversation with other people.
17:35Isn't that sweet?
17:36They talk about themselves when they're sharing information about themselves on social media
17:40or taking pictures of the thing that they ate for breakfast and posting it for the world to see.
17:44That's a really good one.
17:45There's a study that looked at what people tweet about on Twitter and they found that about 80% of what people are tweeting about is just their own personal experiences.
17:55Why do we enjoy this so much?
17:58As a neuroscientist, Diana thinks the answer may be hiding in our brains.
18:04So she designed an experiment using an MRI scanner to see how talking about ourselves versus others changes brain activity.
18:16Picture it like a public access talk show.
18:21With Diana taking the role of the interviewer.
18:25Is it rewarding to talk about yourself? Let's find out.
18:29And a ficus standing in for the scanner.
18:32Hey Adam. Hey.
18:34Do you get excited to dress up for Halloween?
18:36Diana asks her subjects to respond to questions about themselves or other people while showing them corresponding photographs.
18:46Does your dad like being photographed?
18:49Do you enjoy spending time in nature?
18:54Do you like being photographed?
18:55Do you enjoy having a dog as a pet?
19:00For Diana, the answers weren't important.
19:05What mattered was how her subjects' brains responded to the questions.
19:09All of them activated the prefrontal cortex, a region associated with higher thought.
19:17But something else happened when a subject answered questions about themselves.
19:22Diana saw activation in two brain regions.
19:26The ventral tegmental area and the nucleus accumbens.
19:30They belonged to what neuroscientists call the reward pathway.
19:34So we have these reward pathways in our brain that motivate our behavior by helping us to learn what things in the world feel rewarding, that we need or want or desire, like food or sex.
19:49The brain's reward system is powered by a key chemical called dopamine.
19:54A surge of dopamine can trigger pleasant feelings, which motivate us to seek further rewards.
19:59It's the same system that fires up when people do drugs like cocaine or eat chocolate.
20:06So, like drugs, sharing can become addictive.
20:10But why does the dopamine system activate when we talk about ourselves?
20:15Humans have a fundamental need to belong or connect with other people.
20:19So, social connection and making friends and interacting with people are something that we're highly motivated to get.
20:25So, being part of a group gets you more resources, food, reproductive options than if you were by yourself.
20:35Self-promotion helps establish us as members of a group.
20:40And for hundreds of thousands of years, being part of a group has been essential to our survival.
20:46Even when we can't see the other people in our group, we still have the instinctual urge to promote ourselves.
20:52Part of the reason that people share so much on social media is because it activates the same sort of neural systems as self-disclosing in person.
21:01Sharing stems from a deep evolutionary drive.
21:05That's why it's so easy to get hooked on it.
21:08Diana wanted to know how easy it would be for her subject to kick the habit of oversharing.
21:14So, she tried bribing them.
21:18What we were looking at is whether or not people would kind of forego some extra monetary rewards in order to answer a question about themselves.
21:25This time, Diana let her subjects decide.
21:29Talk about yourself and earn nothing.
21:32Or get paid to talk about somebody else.
21:34Can you tell me about whether you or your friend like spending time in nature?
21:40Money activates the dopamine system.
21:44In fact, our neural wiring has taught us to chase it.
21:48But they say money can't buy happiness.
21:51At least not as much happiness as you get when you talk about you.
21:54While some participants chose the money, most turned it down.
22:00We see that people place significant amounts of value on answering questions about themselves and significantly less value on answering questions about other people.
22:08Kind of really brought the point home that sharing information is rewarding.
22:15Our compulsion to share is part of our biological makeup.
22:19But our biology could be the next target in the assault on our privacy.
22:26Our most sensitive personal information may already have been sold to the highest bidder.
22:36Which would you hate to lose the most?
22:40Your phone or your wallet?
22:43If either one of these is stolen, it's a total hassle.
22:49Your private information is exposed.
22:52However, you can cancel bank cards, wipe the data from your phone, and change all of your passwords.
22:59But there is something else you leave behind every day that could be far more devastating to your privacy.
23:06A single strand of hair contains the most private information you have.
23:15Your DNA.
23:24Yaniv Erlich is a former hacker who used to break into banks to test their security.
23:29Now he's a computational biologist, and he's concerned about the security of a different kind of bank.
23:47A DNA bank, which can store the individual genetic code of hundreds of thousands of people.
23:53He believes that hackers will soon be able to break into those biobanks and steal our most valuable and most private asset.
24:06A number of large-scale biobanks offer you the opportunity to contribute your DNA to science.
24:12It just takes a simple cheek swab to get the DNA out of your mouth, and then in a matter of days, with the current technology, we can analyze your entire genome.
24:26Companies like 23andMe and Ancestry.com will sequence your DNA and send you back information about your family tree or whether you are at risk for certain inherited diseases.
24:38And scientists are using this huge database of genetic information to develop new cures for a wide range of diseases.
24:48So with all these types of information, scientists can really understand how the variability we see in the population is affected by the DNA material we have.
25:00If the contents of your DNA were stolen and disclosed, the consequences could be disastrous.
25:06Imagine being denied health insurance or losing your job because your genes show you're at high risk for a heart attack.
25:15So biobanks say they make sure your DNA remains anonymous.
25:21To increase the security, biobanks usually don't store your identifiers together with your genetic material.
25:27They will keep your name, telephone number and address totally separated from this information.
25:34This way, no one knows what is the origin of the genetic material that you gave.
25:39But Yaniv has found a serious flaw in biobank security.
25:43In fact, he's discovered that even those of us who have never had our DNA sequenced are at risk too.
25:49Our DNA is vulnerable to theft every single day.
25:56Just think about what happens when you get a haircut.
25:58Although DNA is something very personal, you shed it everywhere.
26:03You go to the barber shop.
26:05You get a shave.
26:06You leave some of your DNA on the blade.
26:09You take a sip from a glass.
26:11You have some of your saliva on the glass.
26:13You leave behind some of your DNA.
26:15Maybe if you chew a gum or you smoke a cigarette.
26:19You leave the cigarette butt behind, you leave some of your DNA.
26:21If a gene thief got a hold of your DNA, they could discover which inherited diseases you have,
26:29whether you have a tendency towards alcoholism or mental illness,
26:34and threaten to reveal that information to employers or insurers, unless you pay up.
26:43The key to being able to tie a piece of anonymous DNA to a name,
26:47whether in a biobank or a barber shop, is in the Y chromosome.
26:53If you're a male, we can know more about your paternal ancestry,
26:58because you inherited a short piece of DNA called the Y chromosome
27:02that you just get from your father's side.
27:05Now here's the funny thing about your Y chromosome.
27:08You get your surname from your father.
27:10He got it from his own father.
27:12And you got your Y chromosome from the same path.
27:14This creates a correlation between Y chromosome and surnames.
27:18In men, the Y chromosome contains patterns of repeating letters of DNA,
27:23a genetic fingerprint that passes from grandfather to father to son unchanged,
27:29just like a surname does.
27:31To prove that our genetic privacy is under threat,
27:35Yaniv pretends to be a gene thief.
27:36He downloads an anonymous DNA sequence from a biobank and zeros in on its unique Y chromosome patterns.
27:45Then he logs on to a genealogy database where people voluntarily upload their Y chromosome sequences,
27:52along with their names, to locate long-lost family.
27:56That allows him to match the anonymous biobank DNA to a specific surname.
28:01And since the anonymous biobank sequences are tagged with the age and state of residence of the person who supplied the DNA,
28:10a simple Internet search reveals their identity.
28:14He has done this successfully 50 times.
28:17I was so shocked by the results that I had to take a walk to think about the implications of our method for genetic privacy.
28:27It means that if hackers can get de-identified genetic information that is allegedly anonymous,
28:34it means that we cannot promise, we cannot guarantee full privacy,
28:37and we need to seek a different way to engage participants in these large-scale biobanks.
28:43In the wrong hands, a single strand of hair can ruin the life of the person who left it behind.
28:54How can we shield ourselves from this privacy onslaught?
28:58Lock ourselves in our homes and never go outside?
29:02Sterilize every room we've been in?
29:04One scientist thinks there's only one way to save our privacy.
29:09For him, the best defense is offense.
29:18Feels like pretty soon there won't be a minute of the day when we aren't being watched.
29:24Any device you own could be hacked into and used to spy on you.
29:29So, what's the answer?
29:33Go completely off-grid?
29:37Maybe there's another way.
29:39We could develop technology to know when we're being watched,
29:44and when we truly have privacy.
29:47Steve Mann has worn a computer every day for the last 38 years.
29:54In fact, he's been called the father of wearable computing.
29:59His ideas inspired better-known devices like Google Glass.
30:04But back when he began, his digital eyeglass was so bulky,
30:09he was often the subject of ridicule.
30:10So, 35 years of digital eyeglass, and finally we see how the industry is catching on to some of these concepts.
30:19So, I feel kind of vindicated after people laughed at me for all these sorts of stupid eyeglasses and crazy things.
30:25Steve, a professor at the University of Toronto, has a cult following among his students as the original cyborg.
30:33His digital eyewear is bolted to his skull.
30:38His interest in using technology to augment what he could see began when he was a kid.
30:44Then in the 1970s, I started to notice these things watching us and sensing us.
30:51Microwave motion detectors and burglar alarms and stuff like that.
30:55And I was wondering, well, why are all these machines spying on us?
30:57And today, he runs an entire research team dedicated to developing technology that can sniff out when we're being surveilled.
31:07So, this device will help you identify what devices are recording your sound.
31:12So, the lights over here move faster and bigger near a microphone.
31:17And then it locates the mic, and that's how you can sweep bugs.
31:21They have devices that can pick up radio waves, including those from your cell phone.
31:28So, the radio waves coming from my smartphone here, for example, if I block that with my hand, the wave is very weak.
31:35See how weak that wave is?
31:36When it's going through my hand, and then whereas if I hold it like this, the wave is much stronger.
31:41Perhaps his most important invention in this age of near-total surveillance is technology that can detect precisely when you're being watched by a camera.
31:54So, there's a camera inside this dome, and we don't know which way it's pointing because it's shrouded in this dark dome.
32:00But the light here, when it comes into the field of view of the camera, it glows, and when it goes out of the field of the camera, it goes dim again.
32:07And so, you can see here, it sort of paints out, if you will, the sight field of the camera.
32:13If I put my coat in front of it, my jacket, the bulb, I haven't moved the bulb at all.
32:19I've just blocked it with my jacket, and when I unblock it, it glows.
32:24Most of us are used to seeing cameras everywhere.
32:28But Steve believes that if we knew when we were being watched, we'd start asking more questions about who's watching and why.
32:35It could be the police, it could be a computer, it could be artificial intelligence, it could be machine learning, we often don't know.
32:43Many times, surveillance embraces hypocrisy.
32:46Wanting to watch and not be watched, wanting to see and not be seen.
32:50Wanting to know everything about us, but reveal nothing about itself.
32:53To redress that balance, Steve is working to commercialize technology to detect the zones where a camera sees us.
33:02What he calls its veillance field.
33:05Ryan Jensen is working with Steve on getting a veillance field detector into a wearable device.
33:11These are some glasses where I can see the veillance fields from this surveillance camera.
33:19So what it does is pokes and prods at the optical field until it figures out how much the camera is seeing.
33:26So you can really take this around and measure a whole veillance field from a surveillance camera.
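The glowing-bulb probe maps out empirically where a camera can see; geometrically, that field is just the set of points within the camera's field of view. A 2-D stand-in with hypothetical camera parameters:

```python
import math

def in_camera_view(cam_pos, cam_dir_deg, fov_deg, point):
    """True if `point` falls inside the camera's horizontal field of
    view. A geometric stand-in for what the bulb probe measures;
    position, heading, and FOV angle are all hypothetical."""
    dx, dy = point[0] - cam_pos[0], point[1] - cam_pos[1]
    angle_to_point = math.degrees(math.atan2(dy, dx))
    # Signed angular offset from the camera's heading, wrapped to ±180°.
    offset = (angle_to_point - cam_dir_deg + 180) % 360 - 180
    return abs(offset) <= fov_deg / 2

# Camera at the origin, pointing along +x, with a 60-degree field of view.
print(in_camera_view((0, 0), 0, 60, (5, 1)))  # → True
print(in_camera_view((0, 0), 0, 60, (0, 5)))  # → False
```

Sweeping a probe through space and recording where this test flips from true to false is, in effect, what "painting out" the camera's sight field does.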
33:31What I'm excited about is being able to finally see and know how much we're watched, know how much the watchers are watching us.
33:37Veillance fields aren't always places you want to avoid.
33:43Sometimes you may want to be watched.
33:46A lot of people who say, oh, you're into the sensing cameras, so you must be against cameras.
33:51And sometimes I'm walking home late at night in a dark alley and there's somebody sharpening a knife and somebody loading a gun down there.
33:57I might say, you know, you know what, I think I'd like to be watched.
34:00Sort of say, oh, there's veillance flux over there, I think I'm going to move towards the camera.
34:04What I'm really against is the one-sided veillance.
34:07Government or big business use cameras to watch regular people.
34:12But regular people rarely turn their cameras on big business and government.
34:17Steve believes wearable devices like his digital eyeglass can help us watch the watchers.
34:24Surveillance is a French word that means to watch from above.
34:28When we're doing the watching, we call that undersight, or sousveillance.
34:31Sousveillance.
34:33But Steve has already discovered that sousveillance can invite trouble.
34:37Recently, he walked into a fast food restaurant wearing his digital eyeglass and was confronted by employees enforcing policies that don't allow filming in their buildings.
34:46Despite the eyeglass being bolted to his skull, the employees tried to remove it, damaging it in the process.
34:57The cameras want to watch but not be seen.
34:59And in fact, even if you photograph cameras, you find very quickly people come running out to tell you, no cameras are allowed here.
35:05And you say, well, aren't those all cameras around here?
35:07Oh, no, but those aren't cameras. They're surveillance.
35:10Steve believes that if we know when we're being watched and if sousveillance becomes widespread,
35:16we'll finally have the weapons we need to fight back against the governments and corporations that constantly peer into our private lives.
35:24The goal is to create systems that improve the quality of people's lives, systems in which people are innately aware of what's happening,
35:34to create a society in which surveillance is balanced with sousveillance.
35:40One day, widespread digital eyesight will merge surveillance and sousveillance and transform society.
35:49Although someone may be watching you, you can now do your own watching.
35:55But what will a world where everyone is watched and everyone is a watcher look like?
36:01What will life be like in a world with no more secrets?
36:10We stand on the brink of a new era.
36:14Governments and corporations are peering into every corner of our lives.
36:20And we are developing tools to watch the watchers.
36:24So, will life in a world with almost no secrets be a living nightmare?
36:31Or will the naked truth set us free?
36:34Futurist and science fiction author David Brin thinks there is no point in trying to hide.
36:45So, he's putting it all on display.
36:48In this modern era, when eyes are proliferating everywhere,
36:54with the cameras getting smaller, faster, cheaper, more numerous every day,
36:58the human reflex is to say, get those things away from me, ban them.
37:06But over the long run, that approach is not only futile,
37:11it also is kind of cowardly.
37:18David has got used to the idea that even private spaces aren't so private anymore.
37:22He thinks the key to getting comfortable is to look to the past.
37:27After all, for most of human history, we lived without privacy.
37:32Thank you. Cheers.
37:35Our ancestors didn't have much of a concept of privacy.
37:38Families would crowd into single cottages,
37:41knowing each other's business, seeing everything that was going on.
37:45The advantage was, everybody knew your name.
37:48There was some sense of solidarity.
37:51But the olden times were no utopia.
37:54The disadvantages were huge.
37:56You were dominated by the Lord on the Hill and his thugs,
38:00and by the local busybodies, who knew everybody's business.
38:06Today, we have our own versions of these watchers.
38:10You can think of the Lord of the Village as the NSA or the FBI.
38:18The busybodies are the media and your neighbors, who can see almost anything you do.
38:24David thinks it's with our fellow citizens, not the government,
38:29that the battle to reclaim our privacy must begin.
38:32The first step is to make sure people who are watching us and talking about us can't hide.
38:37We're also used to personal gossip, where exchanging stories about other people is so natural that we put up with the filthier, more destructive aspects as just being part of life.
38:54What's going to bring this to a head is what's happening online.
38:59We all know about horrible crimes of bullying that have taken place online, empowered by anonymity.
39:08But we can use the tools of surveillance to expose prying eyes.
39:14The way to deal with the eyes is to spot them.
39:19Hey!
39:21To find out who's looking and hold them accountable.
39:24If we all look back at the watchers, we have the power to change the way they behave.
39:31It's a step towards what David calls the transparent society.
39:36Transparency can stamp out bad behavior from nosy neighbors.
39:41They won't be so quick to talk about you if they know you could talk about them.
39:45But it doesn't stop there.
39:47It ripples all the way up our society.
39:492013 was the best year for civil liberties in the United States of America in a generation.
39:59That was the year that the administration joined the courts in declaring a universal right of citizens to record their encounters with police.
40:08It is empowering the good cops.
40:11But it's empowering groups like Black Lives Matter to say what you do to us is what matters.
40:19And now we can prove it.
40:21A loss of privacy for those in power can make society better.
40:26In fact, to make a transparent society work, David believes the government's right to secrecy must be massively curtailed.
40:34It should be able to keep secrets for a while, like plans to arrest criminals or military invasions.
40:41But nothing should stay secret forever.
40:44Any practical, tactical value to a secret is going to decay over time.
40:49Let's say government agencies and corporations can get five years of secrecy for free.
40:59After five years, you have to cache the secret in a secure place and pay money to extend it another five years.
41:08It's for this reason that David supports whistleblowers like Edward Snowden.
41:12They shine a light in the dark corners of government.
41:16And in a free society, their leaks ultimately make us stronger.
41:22Everything leaks. Not a month goes by when something has not hemorrhaged all over the internet, getting headlines.
41:33But somehow Western governments and Western civilization keep surviving.
41:38In fact, it makes us better.
41:39Now think about our enemies.
41:43Terrorists, tyrannical governments, and criminal gangs.
41:47To them, it's lethal.
41:49The world is never going back to the way it was just two decades ago.
41:55Eyes will be everywhere.
41:58There will be no escaping them.
42:00But if we change our behavior, we can keep the privacy we need.
42:05Privacy is essential to being human.
42:07We're just going to have to defend it differently and redefine it.
42:13We will probably look back on the last couple of centuries as a golden age of privacy.
42:23A time before the age of almost total surveillance.
42:27But there is an upside.
42:31If we accept that we are going to be watched, then governments and corporations must accept the same.
42:39We need the privacy of our bedrooms.
42:42Government needs the privacy of its war rooms.
42:45Beyond that, our society will be transparent.
42:49And this loss of secrecy could herald a new age.
42:54The age of honesty.