00:00In 1999, the German newspaper Die Zeit ran an experiment.
00:04They asked a falafel salesman and former theater director, Salah Bengali, who in the world
00:10he would most like to be connected to.
00:12He chose his favorite actor, Marlon Brando.
00:15So the reporters then searched for a chain of friends, family, or acquaintances, people
00:20who knew each other on a first-name basis, who could connect Bengali to Brando.
00:25As it happens, Bengali had a friend in California.
00:28This friend worked alongside the boyfriend of a woman who was the sorority sister of
00:33the daughter of the producer of the film Don Juan DeMarco, starring Marlon Brando.
00:39So in total, it took just six steps, six degrees of separation.
00:44And the idea is that this is not a unique example, that you could connect any two people on the
00:49planet in six steps or less.
00:51But is it really true?
00:53And if it is, how does it affect our lives?
00:57How is this possible in a world of now eight billion people, that we could be that close,
01:02just six hops or less?
01:03Does that affect how diseases spread, how information travels?
01:07Our math showed, the question is not, why is the world small?
01:10It's really, how could it be otherwise?
01:12But then I got a call from the FBI.
01:16We are making the world smaller all the time, like it's supposed to be good.
01:21And yet it does expose you to toxicity and malevolence that you might have been shielded
01:25from.
01:26You look at the net effect of it, and it's actually been pretty negative by a lot of measures.
01:31People have suffered.
01:32It's not only dangerous in terms of disease propagation, but anything malevolent now has
01:38conduits that it didn't used to have.
01:42If we were all connected to everyone else on the planet completely at random, then it
01:47would be almost a mathematical certainty that any two of us would be connected through fewer
01:52than six steps.
01:54Let's suppose I have my hundred friends out of eight billion people.
01:58Each of them knows a hundred people.
02:00So two steps away from me is going to encompass a hundred times a hundred people.
02:06That's already 10 to the fourth people.
02:08And so if you do a hundred to the fifth power, that's 10 to the 10th, and that's more people
02:14than there are on earth.
02:16So notice that number is five.
02:18I said to the fifth power.
02:20That's the ballpark reason why six degrees of separation is true.
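If you want to check that back-of-envelope arithmetic yourself, here's a minimal sketch in Python. It's my own illustration, not anything from the video: it just multiplies out how many people a chain of hundred-acquaintance hops could reach if nobody's circles overlapped.

```python
# Back-of-envelope only: assumes 100 acquaintances each, chosen at random,
# with no overlap between anyone's circles (wildly optimistic, as discussed next).
FRIENDS_PER_PERSON = 100
WORLD_POPULATION = 8_000_000_000

reach = 1
for step in range(1, 7):
    reach *= FRIENDS_PER_PERSON            # each hop multiplies the reach by ~100
    print(f"step {step}: about {reach:,} people within reach")
    if reach >= WORLD_POPULATION:
        print(f"-> the whole planet is covered within {step} steps")
        break
```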
02:24But the shocking thing about this is the calculation you've just outlined is about having a hundred
02:30friends at random out of 10 billion, and they're all over the world.
02:35But we know that in the real world, that's nowhere near what the distribution of friends
02:40is like.
02:41Absolutely true.
02:42So this really crude calculation I did is absurd for the reason that you said.
02:47The world is very far from random.
02:49The truth is people naturally cluster geographically.
02:52Most of the people you know live close to you, and they also have a higher probability of
02:56knowing each other.
02:58If you calculate the fraction of people you know who also know each other, that is a measure
03:02of the clustering in the network.
03:05So let's try a model with a high degree of clustering.
03:08Imagine all eight billion people on earth are arranged into a circle.
03:13And say each person knows the 100 people closest to them.
03:16So 50 to the left and 50 to the right.
03:19In this case, the furthest person you can connect to is just 50 people away.
03:24So if you wanted to connect to someone on the other side of the planet through a chain of
03:27people who know each other, it would take 80 million steps.
03:32And to connect any two people would take on average 40 million steps.
03:37Even just getting 10% of the way there would take 8 million steps.
03:41And six steps would get you, well, here.
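To put numbers on that circle picture, here's another small sketch, again my own illustration, assuming nothing beyond the setup just described: 50 acquaintances on each side and no other links. One hop covers at most 50 positions around the ring, so the number of hops is just the ring distance divided by 50.

```python
import math

# Everyone on one big circle, knowing only the 50 people on either side.
N = 8_000_000_000        # people on the circle
REACH_PER_HOP = 50       # the farthest acquaintance is 50 positions away

def hops_between(a, b):
    """Fewest hops between positions a and b, moving at most 50 positions per hop."""
    ring_distance = min(abs(a - b), N - abs(a - b))
    return math.ceil(ring_distance / REACH_PER_HOP)

print(hops_between(0, N // 2))     # opposite side of the circle: 80,000,000 hops
print((N // 4) // REACH_PER_HOP)   # a typical pair is ~N/4 apart: 40,000,000 hops
print(6 * REACH_PER_HOP)           # six hops only get you 300 positions away
```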
03:46This is the paradox of six degrees of separation.
03:49We know that we live in these local clusters of friends and acquaintances, but we also seem
03:54to be able to connect anyone anywhere in just six steps.
03:58Ten years ago I did my own experiment on this and I found that the average Veritasium viewer
04:03was only 2.7 degrees of separation from me.
04:07In social science, this is known as the small world problem, named after the phenomenon where
04:12you're, say, on holiday somewhere and you bump into a stranger who somehow knows your
04:17best friend and you say, wow, it's such a small world.
04:24In the mid-1990s, two mathematicians, Duncan Watts and Steve Strogatz, set out to solve this
04:29small world problem.
04:30Duncan really sort of had very far-seeing imagination at that point.
04:35We had computers that allowed us to simulate environments that were too complicated for math
04:42to work.
04:43Up until then, physicists had studied networks that were ordered and regular, like crystal
04:47lattices.
04:48And mathematicians, like Paul Erdos, had done lots of work on totally random networks.
04:53But no one had studied what happens in between.
04:56There must be some enormous middle ground.
04:59And that's what Duncan and I felt like we're starting to explore.
05:03To study this middle ground, Watts and Strogatz imagined a simple, regular network of people,
05:08or nodes, dotted around a circle, each connected to a few of their nearest neighbors.
05:13And so we had this idea, we're going to start with the physicist end of regular, and now we're
05:17going to turn the randomness knob to make it more and more random through these random
05:22shortcuts.
05:23We all have some experience with shortcuts.
05:26I belong to this club called the Internet Chess Club.
05:29I got to be very friendly with a guy in Holland.
05:32That connection makes the world small, because now, even though my friends don't realize it,
05:38they're only one step away from a guy in Holland.
05:41And so that kind of connection, where you sort of connect to someone outside your normal circle,
05:47is what we came to call a shortcut.
05:50So they went round the circle, disconnecting some of the links and reconnecting them at
05:54random to a different node in the network.
05:57And as they did that, they watched what happened to the average number of steps it took to get
06:01from any one node in the network to another, hopping between connected nodes.
06:07In other words, the degree of separation.
06:10This is now the moment for the big reveal.
06:12As Duncan turned the knob in his computer simulations, as soon as he introduced a few shortcuts,
06:19the world immediately gets as small as a random graph.
06:22When they had rewired just 1% of the links to shortcuts, the average degree of separation
06:27dropped from 50 in the original fully ordered network to 10.
06:32But Watts and Strogatz also tracked how clustered the network was.
06:35That's the fraction of a node's connections that are also connected to each other.
06:39Or in other words, the fraction of my friends who are also friends with each other.
06:44What they found is that clustering remained high for much longer.
06:50The world immediately gets as small as a random graph.
06:54But it stays as clustered as if it were still regular.
06:57So you could simultaneously have the clustering that we know is real and the small world that
07:02we know is real.
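You can reproduce that qualitative result with off-the-shelf tools. The sketch below uses networkx's built-in Watts-Strogatz generator rather than the authors' original code, so the exact numbers will wobble from run to run, but the pattern is the same: the average separation collapses as soon as a few links are rewired, while the clustering barely moves.

```python
import networkx as nx

N, K = 1000, 10   # 1,000 nodes, each tied to its 10 nearest neighbours

for p in [0.0, 0.01, 0.1, 1.0]:   # fraction of links rewired into shortcuts
    G = nx.connected_watts_strogatz_graph(N, K, p, tries=100, seed=1)
    L = nx.average_shortest_path_length(G)   # average degree of separation
    C = nx.average_clustering(G)             # fraction of your friends who know each other
    print(f"rewired {p:>5.0%}:  separation ~{L:5.1f}   clustering ~{C:.2f}")
```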
07:05Now in Watts and Strogatz's model, they looked at a thousand nodes.
07:09But if you apply their model to the 8 billion people on Earth, well, then you would only
07:14need 3 out of every 10,000 friendships to be a shortcut.
07:18And the average degrees of separation drops to 6.
07:21Our math showed, the question is not, why is the world small?
07:25It's really, how could it be otherwise?
07:27Duncan started saying to me, this is about discovering a whole new universe and its properties
07:33and laws.
07:35I recognized that he was right.
07:37I just wanted to sort of reflect on what you said about these sort of shortcuts.
07:43I think I've had this phenomenon happen to me sometimes in my life where I'm sort of invited
07:49to an event and it seems like a very random event.
07:52Often I kind of feel like, I don't really want to go.
07:55You know, none of my friends are going.
07:57But then maybe at the last minute I just say like, well, let's just roll the dice.
08:02And I find that almost invariably those are productive meetings.
08:07I'm kind of wondering if there's a takeaway for people here, which is that they should
08:11put themselves in situations where the probability of forming these shortcut links goes up. Would that sort
08:17of increase the luck in your life?
08:20You have just put your finger on a very famous phenomenon in sociology that is called the strength
08:26of weak ties.
08:27Because you ask people how they got their job.
08:30And people would say, oh yeah, I heard about it from, you know, Randy.
08:35And then he'd say, oh, is Randy a friend of yours?
08:36And people invariably would say, no, he's an acquaintance.
08:39I wouldn't call him a friend, he's an acquaintance.
08:42That's a weak tie.
08:43The strong tie is your best friend or your circle of friends.
08:47Excited about their breakthrough, Watts and Strogatz wanted to test their small world model
08:52on some real world data.
08:54But this was 1996.
08:57We had to think, well, where are we going to get data on big networks where we could test
09:01this?
09:02And it was not so easy.
09:03The internet was not mapped out.
09:05Google didn't exist.
09:06So they turned to an unusual source.
09:10There was only one nervous system that had been mapped at that time, which was the worm
09:15C. elegans.
09:16A tiny worm, about a millimeter long, that you can find in the dirt, a favorite of neurobiologists.
09:23They knew every cell in the body of C. elegans from the time it's a single cell till it becomes
09:28a whole organism.
09:29So they had the total wiring diagram of that organism.
09:32Watts and Strogatz tested their model on the worm's neural network.
09:36The worm has precisely 282 neurons, and on average, they're connected to 14 others.
09:42If you lay that all out in a line along the worm's body, the neurons at the ends would
09:48be separated by around 40 steps, and the average degree of separation would be around 14.
09:54But when Watts and Strogatz ran the calculations, they found the average degrees of separation
09:58between any two neurons was just 2.65.
10:02To put that in context, if they were connected totally at random, it would be 2.25.
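That "totally at random" benchmark is easy to sanity-check. Here's a rough sketch that builds a graph with the same number of neurons and links as the worm's wiring diagram but connects them at random (this is the random comparison only, not the real C. elegans data):

```python
import networkx as nx

n, avg_degree = 282, 14          # same size and density as the worm's network
m = n * avg_degree // 2          # total number of links

# At this density a random graph is essentially always one connected piece,
# so the average separation is well defined.
G = nx.gnm_random_graph(n, m, seed=1)
print(nx.average_shortest_path_length(G))   # lands in the low 2s, near the 2.25 quoted above
```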
10:07And yes, okay, so bingo.
10:08That was a small world.
10:09Then we were popping the champagne.
10:10I mean, that was really exciting that nature had done that.
10:14So then we thought, well, okay, but this should be true of lots of networks because nature
10:18can't resist this mechanism.
10:21So they looked at Hollywood actors and power grids across the US.
10:25Sure enough, they were both small world networks.
10:27For example, in the database of over 200,000 Hollywood actors, the average degree of separation
10:33was less than four.
10:34Dangerfield was in Caddyshack with Bill Murray, and Bill Murray was in She's Having a Baby
10:41with Kevin Bacon.
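Degrees of separation like these are just shortest paths in a collaboration network: actors are nodes, and sharing a film is a link. Here's a toy sketch using only the films mentioned above, so it says nothing about the full 200,000-actor database; it just shows what "two steps from Kevin Bacon" means as a graph computation.

```python
import networkx as nx

# Actors are nodes; an edge means they appeared in a film together.
G = nx.Graph()
G.add_edge("Rodney Dangerfield", "Bill Murray", film="Caddyshack")
G.add_edge("Bill Murray", "Kevin Bacon", film="She's Having a Baby")

# Degree of separation = length of the shortest chain of shared films.
print(nx.shortest_path(G, "Rodney Dangerfield", "Kevin Bacon"))
print(nx.shortest_path_length(G, "Rodney Dangerfield", "Kevin Bacon"))  # 2
```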
10:43And then the real payoff for us as people interested in dynamical systems more than graph theory
10:47was, okay, so what, you know, so what if the world is small?
10:52Does that affect how things get in sync?
10:55Does it affect how diseases spread?
10:57Does it affect how information travels, whatever?
11:00And so we did a number of experiments again in the computer like that.
11:05Take disease.
11:06I wanted to know how a few shortcuts would affect how disease spreads through a network.
11:11So I asked Casper and the team to make a simulation.
11:14And then the question to you is, do you want to start with a completely regular world where
11:17it's completely clustered, or do you want to start with completely random?
11:21I would start with the regular world.
11:23Okay.
11:24There it goes.
11:26There's the spread of infections.
11:28Yeah.
11:29So it takes over the world completely.
11:31Well, if every step was a day, it would take 73 days for the infection to take over this
11:36entire world.
11:37Well, let's introduce a few shortcuts and see.
11:39Okay.
11:40Let's make a small world.
11:41Like 10%.
11:42Let's go.
11:43Boom.
11:44Wow.
11:45That's really dramatic.
11:48Right?
11:49That's really dramatic and very fast.
11:51Yeah.
11:52So fast.
11:53Yeah.
11:54After 26 days, the whole world.
11:55And that ramp up does look exponential at the beginning.
11:58Right?
11:59It kind of looks linear there as well, but it's almost like you can't go any faster.
12:02Yeah.
12:03Okay.
12:04Now let's make it a completely random network.
12:07Boom.
12:08Boom.
12:09Crazy.
12:10How many days now for a fully random network?
12:1425.
12:15Basically identical.
12:16Which is crazy because in the random case, all your links are random.
12:20You know, in the small world case, it's just 10%.
12:22It's like if one out of your 10 friends are a shortcut, which, you know, for some people
12:27might be a bit much, but I reckon for you, it's probably about right.
12:31Yeah, I got lots of shortcuts.
12:34But the crazy thing is that in this simulation, we only used 100 nodes.
12:40And if you apply the same model to the 8 billion people on earth, then you would actually need
12:45less than 1% of all your links to be shortcuts.
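The on-screen simulation itself isn't reproduced here, but the gist is easy to sketch. Below is a bare-bones susceptible-to-infected spread on a 100-node ring, where each "day" every infected node passes the infection to all of its neighbours. The day counts won't match the ones above (different wiring details and infection rules), but the effect is the same: a few shortcuts collapse the time it takes to reach everyone.

```python
import networkx as nx

def days_to_infect_everyone(G, first_case=0):
    """Each day, every infected node infects all of its neighbours."""
    infected = {first_case}
    days = 0
    while len(infected) < G.number_of_nodes():
        infected |= {nbr for node in infected for nbr in G.neighbors(node)}
        days += 1
    return days

N, K = 100, 4   # 100 people, each knowing their 4 nearest neighbours
for p, label in [(0.0, "regular ring"), (0.1, "10% shortcuts"), (1.0, "fully random")]:
    G = nx.connected_watts_strogatz_graph(N, K, p, tries=100, seed=2)
    print(f"{label:>14}: everyone infected after {days_to_infect_everyone(G)} days")
```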
12:49In 1998, Watts and Strogatz published their findings in a three-page article in Nature.
12:54And the paper took off.
12:56Within a few years, the paper already had hundreds of citations.
13:00By 2014, it was ranked the 63rd most cited paper of all time.
13:05And today, it's got around 58,000 citations.
13:08That's higher than Peter Higgs' paper on the Higgs boson and almost three times as many
13:12as Watson and Crick's Nobel Prize-winning paper on DNA.
13:15So it's probably worth making that distinction, that citations are one measure of impact.
13:20We're cited a lot more than Einstein, and I think you know who's more important.
13:25It's not us.
13:27But it does mean people thought it was worth citing.
13:30We had many tens of thousands of citations from people in far-flung fields, from neuroscience
13:36to sociology to graph theory to computer science.
13:39Even, you know, English literature, people would do things like draw networks between words.
13:44Is there any irony in the fact that this paper on small world networks goes viral itself?
13:53Yes, I think so. Maybe so.
13:56But then things got a little weird.
13:59That's when I started getting some strange phone calls.
14:02I got a call from somebody at the FBI.
14:05I was a little scared, what's the FBI calling me about?
14:08And so I called back, and the person who picks up says, hair and fiber.
14:14I was calling the hair and fiber network at the FBI, the people who do, you know, criminology
14:20based on what telltale hairs or fibers are left on the victim's clothes after they've been murdered.
14:26There was a guy who said, what happens when the police have a suspect and they say you have fibers on your sweater that match the hair of the victim?
14:36And then the defense lawyer says, well, you know, maybe the victim was on a bus and left her fibers on the bus.
14:43And then my client sat on the bus seat, a secondary transfer, they would call it, of these fibers.
14:48That doesn't prove anything.
14:49So the FBI wanted to know what's the probability of secondary transfers compared to primary transfers from actually killing the person.
14:57And like, I don't, what do I know?
15:01Now that's a Steve Strogatz problem.
15:04For most of us, a random caller telling you they're an FBI agent is probably a scam.
15:09Chances are they got your number from a data leak or a data broker.
15:13Any time you provide your name, phone number, even your social security number, that personal information can be scraped, packaged, and sold to anyone who will pay.
15:22And if criminals get hold of it, well, they can open credit card accounts in your name or even use it to stalk or harass you.
15:29Fortunately, today's video sponsor, Incogni, can help.
15:32With your permission, they'll send out a letter to each broker using the correct legal terms and keep insisting until your data comes down.
15:41And remember, even one unremoved profile can be enough for criminals to target you.
15:46If you sign up to Incogni's unlimited plans, you can even flag public websites where your information appears with their custom removals tool.
15:54Their data agents will take care of the rest.
15:56Since I started using Incogni 18 months ago, they've filed almost 700 requests for me, and over 600 of those have been completed.
16:04With their unlimited family plan, you can protect your whole family too.
16:08So to keep your information safe, head over to incogni.com slash Veritasium by clicking the link in the description or just use this QR code.
16:16And when you do, be sure to use the code Veritasium for 60% off your annual subscription.
16:21So I want to thank Incogni for sponsoring this video.
16:25And now back to networks.
16:28In 1998, Albert-László Barabási was studying the internet.
16:33At that time, there were around 800 million web pages.
16:36But despite the web's enormous size, Barabási found that on average, you could connect any two sites with just 19 clicks.
16:44Apparently, the web was a small world too.
16:47But the strange thing was, it didn't look anything like the small world network in Watts and Strogatz's model.
16:53We ended up mapping out a region of the world wide web, and we had a very clear expectation of how that network should look like.
17:02Barabási thought the distribution of pages and links would resemble a bell curve, similar to what you'd get for people's height across a population.
17:09Most sites would have some average number of links, and there would be very few outliers either side.
17:15But that is not what he saw.
17:17And so we measured the distribution, and it didn't look anything like what we expected.
17:22The curve started out steep.
17:24Loads of websites had not many links.
17:27Then there was this really long tail.
17:30And here we saw web pages that had not only a little more, but sometimes a hundred times more links than the average degree, or the average node, on the web.
17:41These were websites like Yahoo, super connectors that linked to thousands of other sites.
17:46Barabási called them hubs because when he mapped out the network, they resembled the hub of a wheel with spokes going out to hundreds of other pages.
17:54And it was these hubs that made the web a small world, not shortcuts.
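To see why that shape was such a surprise, here's the bell-curve expectation written out as a small sketch of my own: wire 10,000 pages together at random with about four links each, and look for hubs. The best-connected page ends up with only a few times the average, nowhere near the hundred-fold hubs Barabási measured.

```python
import networkx as nx

# The "bell curve" expectation: wire 10,000 pages together at random,
# with roughly 4 links each, and look for hubs.
n = 10_000
G = nx.gnm_random_graph(n, 2 * n, seed=5)     # ~20,000 links, average degree ~4

degrees = [d for _, d in G.degree()]
print("average links per page:", sum(degrees) / n)
print("best-connected page has:", max(degrees), "links")   # nowhere near 100x the average
```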
17:59So Barabási wondered: could this apply to other networks too?
18:03Most real networks, or virtually all large networks, follow two very fundamental principles.
18:12First, any large network out there never pops out as a large network, but it grows, right?
18:20You have a tiny World Wide Web in 1991, and now we have trillions of nodes on the World Wide Web.
18:27How did we get from one to a trillion?
18:29One node at a time, one website at a time.
18:32All the networks out there, no matter how old, how fast they emerged, they always emerged through some kind of growth process.
18:40So if you think about networks, you must build in that growth process.
18:44Number two, when a new node comes in, you join Facebook, who you're going to connect to, right?
18:50And it is somewhat unpredictable, but it's biased.
18:53Your connections are always biased towards the more connected nodes, simply because you are more likely to know more connected nodes than less connected nodes.
19:01He named this process preferential attachment.
19:05Barabási reasoned that these two principles could explain how hubs naturally emerge when a network grows.
19:12So together with his colleague, Réka Albert, he ran a simulation.
19:16We've also got a simulation for this.
19:19They started with a simple network of just a few connected nodes.
19:23Then they began adding new nodes to the network, one at a time.
19:27With just one condition, they'd be more likely to connect to nodes that already had more links.
19:33That's so cool.
19:35Looks very biological, very organic.
19:38Also, I like how the nodes come out and they don't just sort of stop in one spot.
19:44They kind of like wiggle around and like find their location.
19:48I really enjoy this.
19:50Like a space station.
19:52That's what I was thinking.
19:54Or like a space colony, right?
19:56Yeah, yeah, right?
19:57Like each little center one could be a planet, then you've got all the sort of stations going around it.
20:03Yes.
20:04So when Barabási and Albert let these networks evolve, hubs emerged.
20:09And we showed that growth and preferential attachment together naturally lead to the emergence of the hubs.
20:17With this simulation, Barabási and Albert showed how hubs could emerge in virtually any complex network.
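Here's a bare-bones sketch of that growth process (my own version, not Barabási and Albert's code): nodes arrive one at a time and attach to existing nodes with probability proportional to how many links those nodes already have. The endpoint list is the whole trick: a node appears in it once for every link it owns, so picking uniformly from that list is exactly the "rich get richer" bias.

```python
import random

def grow_network(final_size, links_per_new_node=2, seed=3):
    rng = random.Random(seed)
    edges = [(0, 1), (0, 2), (1, 2)]                       # a tiny starting core
    endpoints = [node for edge in edges for node in edge]  # each node listed once per link
    for new_node in range(3, final_size):
        targets = set()
        while len(targets) < links_per_new_node:
            targets.add(rng.choice(endpoints))             # biased toward well-connected nodes
        for t in targets:
            edges.append((new_node, t))
            endpoints += [new_node, t]
    return edges

degree = {}
for a, b in grow_network(10_000):
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1

sizes = sorted(degree.values())
print("typical node:", sizes[len(sizes) // 2], "links")
print("five biggest hubs:", sizes[-5:], "links")
```

Run it and the biggest nodes typically end up with hundreds of links while the typical node has two or three: the same lopsided, hub-dominated shape seen on the web, without any hub being put in by hand.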
20:24Take airports, for example.
20:26In 1955, Chicago O'Hare opened to commercial flights.
20:30Unlike neighboring airport Midway, it had long runways and plenty of space for new jet aircraft.
20:35Airlines began shifting service there.
20:38As more airlines connected flights to O'Hare, passengers had more options to connect, making it increasingly attractive.
20:45After deregulation in the 1970s, more airlines were free to add routes, and the feedback loop accelerated.
20:51Each new route made the airport more useful to passengers and more appealing to other airlines.
20:57Today, O'Hare is the most connected airport in the United States, with direct flights to well over 200 destinations.
21:04But we don't just see hubs in man-made networks.
21:07In food webs, you have a few keystone species, like Atlantic cod, that connect hundreds of predators and prey.
21:13And in the metabolic networks in our cells, you have a few molecules, like ATP, that govern hundreds of chemical reactions.
21:20In the neural networks in our brain, you have a few regions, like the prefrontal cortex, that link hundreds of different functions.
21:28Now, as each of these networks evolved and grew over time, you had new species, new reactions and new circuits that latched on to what was already well connected.
21:37And so you get this sort of natural growth.
21:40Now, preferential attachment isn't the only mechanism that can create hubs.
21:44There are plenty of other factors at play, particularly in these more complex biological systems.
21:50But what Barabási and Albert's simulation showed is that all it takes is a tiny bias when growing a network, and hubs end up being inevitable.
22:00Once hubs are there, they fundamentally change the way the system behaves and the way we understand that system.
22:07Hubs like O'Hare mean you can get pretty much anywhere in the world in just a few flights.
22:13But that connectivity also has consequences.
22:17In August 2025, thunderstorms shut down Chicago O'Hare, and 280 flights were cancelled and 80 were diverted.
22:26Overflow hit at least six other US airports, while some planes stuck in Chicago never left for Europe or Asia.
22:33Bad weather in Chicago totally changes not only the travel pattern in Chicago, but within 24 hours the whole country is being affected by that.
22:42And we see the same phenomenon in natural networks.
22:46Knocking out one keystone species, like Atlantic Cod, can destabilize an entire ecosystem.
22:51So this is what we call the Achilles heel of networks.
22:55And this could be good news or it could be bad news, right?
22:57Good news if you want to create drugs to kill bacteria: then you're going to go for the hubs.
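That Achilles heel is easy to demonstrate on a toy network. The sketch below assumes a Barabási-Albert-style network as a stand-in (no real metabolic or web data here), removes 5% of the nodes either at random or by going straight for the biggest hubs, and checks how much of the network still hangs together.

```python
import random
import networkx as nx

G = nx.barabasi_albert_graph(2000, 2, seed=4)   # a hub-dominated toy network
n_remove = 100                                  # knock out 5% of the nodes

def surviving_core(G, removed):
    """Size of the largest connected piece after removing the given nodes."""
    H = G.copy()
    H.remove_nodes_from(removed)
    return max(len(c) for c in nx.connected_components(H))

random_failures = random.Random(4).sample(list(G.nodes()), n_remove)
targeted_hubs = sorted(G.nodes(), key=G.degree, reverse=True)[:n_remove]

print("random failures leave a core of", surviving_core(G, random_failures), "nodes")
print("targeting hubs leaves a core of", surviving_core(G, targeted_hubs), "nodes")
```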
23:03This idea has created a whole new field of network medicine, where researchers develop drugs to target crucial parts of a disease's metabolic network.
23:11But understanding the role of hubs doesn't just help develop cures for a disease, it can help us control its spread.
23:18In 1990, Thailand was facing one of the fastest growing HIV epidemics in the world.
23:24The government tried broad campaigns like posters, TV ads, and school talks telling everyone to use condoms.
23:31But the infection kept spreading.
23:33So in 1991, the government tried something different.
23:36They started targeting hubs.
23:38They told brothels around the country that every client must use a condom or else they'd be shut down.
23:43And the impact was huge.
23:45For example, HIV infections among young men joining the military dropped by more than 50%.
23:50And by 2013, Thailand's Ministry of Public Health estimated the policy had prevented over 5 million infections.
23:57All because they realized the importance of hubs.
24:02Hubs and shortcuts make any complex network more connected than it seems.
24:06That means things spread quickly, whether that's airport delays, information, or disease.
24:12But could that impact run even deeper?
24:14I mean, could the structure of our social network influence our very behavior and beliefs, without us even being aware of it?
24:23Back in 1997, Watts and Strogatz investigated just that, using a game called The Prisoner's Dilemma.
24:29It's probably the most famous problem in game theory, and it's used to represent a ton of different conflicts we see in the real world.
24:35We've actually done a full video on it before, but here's a quick recap.
24:39The premise is simple.
24:40A banker with a chest full of gold invites you and another player to play.
24:44You each get two choices.
24:46You can cooperate or defect.
24:48If you both cooperate, you each get three coins.
24:51But if you defect while your opponent cooperates, you get five coins and they get nothing.
24:56And if you both defect, then you each get one coin.
25:00So what would you do?
25:01Suppose your opponent cooperates.
25:03Then you could also cooperate and get three coins.
25:06Or you could defect and get five coins instead.
25:09So you're better off defecting.
25:11But what if your opponent defects?
25:14Well, you could cooperate and get no coins.
25:17Or you could defect and at least get one.
25:20So no matter what your opponent does, your best option is always to defect.
25:26Now, if your opponent is also rational, they'll reach the same conclusion and therefore they'll also defect.
25:33And as a result, when you both act rationally, you both end up in the suboptimal situation of getting one coin each when you could have gotten three.
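Here's the payoff table written out as a tiny sketch, just to make the "always defect" logic concrete; the coin values are the ones from the game above.

```python
# (my coins, their coins) for every combination of moves
PAYOFF = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

# Whatever the opponent does, defecting never pays less than cooperating.
for their_move in ("cooperate", "defect"):
    if_i_cooperate = PAYOFF[("cooperate", their_move)][0]
    if_i_defect = PAYOFF[("defect", their_move)][0]
    print(f"if they {their_move}: cooperate pays {if_i_cooperate}, defect pays {if_i_defect}")
```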
25:42But in 1980, Professor Robert Axelrod found that if you play your opponent hundreds of times, well then cooperation wins out.
25:51He ran a tournament among the world's leading game theorists.
25:54And all the most successful strategies were nice.
25:57The winning strategy was called tit for tat because its default position was to cooperate and it would only defect in retaliation.
26:06He also showed that a small cluster of cooperators can work together to overcome a world of defectors.
26:12So that's kind of the scene we're set in, right?
26:14You get to this realistic place where you get tit for tat like strategies to sort of dominate the world.
26:20Because in Axelrod's tournament, every strategy played against every other strategy or they only interacted sort of with their near neighborhood, which is, you know, the small cluster.
26:29And now you could wonder, well, what if we start changing the way this works?
26:35What if we put them on a network?
26:37Well, Watts and Strogatz simulated their own version of the Prisoner's Dilemma that did just that.
26:42They set up a regular network where each player was connected to a few players on either side.
26:47Then they would simultaneously play against all of their connections.
26:51The rules were simple.
26:53If most of a player's connections cooperated, then that player would also cooperate.
26:58But if most of their connections defected, then they would defect in retaliation.
27:02They started with a small cluster of cooperators surrounded by defectors and they watched the network evolve.
27:09Over time, what they saw was cooperation spread, just like what Axelrod had found.
27:15But then they reran the simulation, this time with a few links rewired to shortcuts.
27:21And all of a sudden, the cooperators were crushed and they ended up with a world of defectors.
27:28And when they started from a totally regular network and gradually increased the fraction of shortcuts,
27:33they found there was this critical fraction beyond which the percentage of cooperators at the end of the game drops to zero.
27:40I think the thing that's really crazy is that you've taken the exact same strategies with all the same properties,
27:47same character traits and personalities, if you will.
27:50And you're not changing any of that.
27:52All you're changing is the way they're connected.
27:54And you go from a world where everyone's completely nice and working together,
27:58to one where it's filled with nastiness and people betraying each other.
28:02Only by changing how they're connected.
28:04It's like if the bulk of your interactions are sort of negative, then you start being negative too.
28:10And you just contribute to the overall negativity.
28:13Whereas if like a few people are nice, then you can imagine, oh, that makes me feel good.
28:17And so I'm going to be nicer to that person.
28:19The intuition for that is that cooperation is fostered by having little clumps.
28:24If I have a little clump of people that are kind of my buds, we get to have a lot of encounters
28:30and cooperation tends to emerge from familiarity, the same way that iteration helps.
28:35That if I know that I'm going to see you again, I'm going to encounter you again,
28:38it ends up being to my advantage to cooperate.
28:41Whereas like the world of the internet where anyone can get on Twitter and bad mouth anyone else,
28:46that tends to discourage cooperation.
28:49We don't have pockets.
28:51You don't have communities.
28:53It kind of explains this keyboard warrior phenomenon.
28:56And that, yeah, people say things on the internet they wouldn't say.
29:00They wouldn't. Most people are nice in real life.
29:02Yeah.
29:03It's funny, the small world, you know, you think the small world is a good thing,
29:06because that's a Disney song, right?
29:08It's a small world after all, like it's supposed to be good.
29:11And yet it does expose you to toxicity and malevolence that you might have been shielded from
29:16in the small town.
29:18Social media has kind of been toxic.
29:22The initial idea being, hey, we connect up a bunch of people
29:25and people have been separated geographically.
29:27We connect you with your old friends.
29:28You look at the net effect of it and it's actually been pretty negative by a lot of measures.
29:34Intrigued by the findings, Watts started wondering if the results applied to the real world too.
29:39For years after I did this work, I had wanted to test the hypothesis with actual human subjects.
29:49So he got some volunteers to play a similar game called the public goods game
29:54across different network structures.
29:56He was expecting that, like they found previously,
29:59more shortcuts in a network would make cooperation less likely to emerge.
30:03But what he found was the structure of the network had no effect.
30:08Cooperation was just as likely to emerge in a totally clustered network as it was in a totally random one.
30:15We were very puzzled by this result and then we kind of did some more work.
30:20When Watts dug deeper, he realized that the network structure did matter.
30:24In the more clustered networks, people were more likely to copy each other.
30:28So if by chance someone started out cooperating, then everyone would cooperate.
30:33But it was equally likely that someone would start out by defecting,
30:37in which case everyone else would defect.
30:39And over all the games they played, these two effects canceled each other out,
30:43which is why it seemed like the network structure didn't matter.
30:47It's sort of on a knife edge, right?
30:50Where like one person does something selfish and everything goes south.
30:56In another world, everybody kind of holds it together and everything goes well.
31:02It's crazy that the world could be like on a knife edge like that, you know,
31:05could tip one way or the other, kind of just depends on how someone gets out of bed that day.
31:10But then Watts realized something. See, in real life, you can choose who you hang out with.
31:16So he re-ran the experiment, allowing players to change who they were playing with.
31:20And this time he used the prisoner's dilemma so that players could easily identify the defectors.
31:25And the finding was clear. The more you allowed players to choose who they were playing with,
31:30the more likely they were to cooperate.
31:32You can make this a lot better for yourself by just acting and being decisive and being proactive about things.
31:38Yeah, yeah. It's the thing I try to teach my kids, too.
31:41Like, if someone's annoying you, just ignore them.
31:44Like, there's nothing to be gained by continuing to interact with people who are bringing negativity into your life.
31:51In fact, making a choice can be powerful in more than one way.
31:55There's something about the world that makes it prone to those upheavals,
31:59meaning it's always kind of poised on an edge of instability.
32:03And that gives each of us more power than you'd think we would have.
32:06It is actually possible for individual people to start movements that grow and take off.
32:13And ultimately, if you look at history, that is what happens.
32:16It's always one person who is stubborn and does something that leads to ten people, a thousand people,
32:23and things change because of it. It always starts with one person, somehow.
32:27It's the Steve Jobs quote, right? But the people who are crazy enough to think they can change the world are the ones who do.
32:33And the wonderful thing is, it all starts with you believing you have that power.
32:37Yeah.
32:39Learning all about network science has taught me many things.
32:42But perhaps the most important is that our networks shape us.
32:46But our actions shape the networks.
32:49So choose both wisely.
32:52Hey, if you made it this far, all the simulations we ran through with Derek,
33:03we will actually make them available on a website that you can go to so you can play around with them yourself.
33:08So thank you so much for watching.
33:10We really appreciate it.
33:12And yeah, see you for the next one.