Transcript
00:04Science fiction has given us a lot to look forward to, but now some say that very fiction
00:08might be informing the way humans wage war, and we're probably not going to like it.
00:13For years, tech experts have warned about the use of robots and drones in warfare, with
00:17movies like The Terminator serving as cautionary tales.
00:20But in the real world, we've already started to see this become a genuine issue.
00:24Police forces have already begun using Boston Dynamics' Spot robot to aid them
00:29in their jobs, something the American Civil Liberties Union says is worrying, as they
00:33warn it's simply too easy to change a policy and begin to arm those very robots.
00:37In fact, San Francisco already passed a law allowing robots to use deadly force.
00:42That policy has since been rescinded after public outcry.
00:45And while the US has used armed UAVs for years for surgical strikes around the
00:49world, recently Russia began using small, explosive-laden drones to strike civilian populations in
00:54Ukraine.
00:55Still, these robots are controlled at least in part by humans.
00:58But as in movies like The Matrix, AI could become a very real existential threat
01:02for humanity.
01:03Recently, more than a thousand industry leaders wrote an open letter to whoever is developing
01:07these tools, hoping to slow their development until proper safety protocols are in place.
01:12Many experts say it's entirely possible that somewhere in the world, an algorithm
01:16has already become self-aware,
01:18and we just haven't realized it yet.