00:00 Google had enunciated a policy a few years ago where they clearly stated that they would
00:06 not allow their AI systems to be used for weapons, and also not for illegal surveillance.
00:16 You have to assume that it's not a coincidence that this change in policy comes with a new
00:22 administration that has removed all the regulations on AI that were placed by the Biden administration
00:33 and is now placing a huge emphasis on the use of AI for military prowess.
00:41 Increasingly, the progress of the war in Ukraine is dictated by the use of either remotely
00:50 operated drones or fully autonomous drones, and that's been a radical change over the
01:00 nearly three years that that conflict has been going.
01:08 So I think many military strategists think that without this kind of capability, you
01:14 just can't fight a modern war.
01:16 The issues that are more concerning are really around the use of AI in weapons, that we would
01:26 have weapon systems where the AI that is controlling that weapon system is deciding who to attack,
01:34 when to attack, and so on.
01:36 It would presumably be operating with fairly general instructions, like attack and destroy
01:45 anything that moves in this region, for example.
01:49 But it could also be used in much more dangerous and harmful ways.
01:56 For example, kill anyone who fits the following description, and that description could be
02:06 by age, by gender, by ethnic group, by religious affiliation, or even a particular individual.
02:13 It seems reasonable, particularly when it comes to human extinction, which could
02:17 result from AI systems that are much more intelligent than humans, and therefore much
02:22 more capable of affecting the world than we are, to ask for cast-iron
02:30 guarantees, not just, well, we'll make an effort, or trust us, we know what we're doing.
02:39 Governments must require cast-iron guarantees in the form of either statistical evidence
02:48 or mathematical proof that can be inspected, that can be checked carefully, and anything
02:57 short of that is just asking for disaster.
03:01 There were, at last count, about 75 countries that had either developed or were using remotely
03:08 piloted weapons, and I think most of those are in the process of thinking about how to
03:14 convert them to fully autonomous weapons.
03:17 But on the other hand, there are more than 100 countries that have already stated their
03:21 opposition to autonomous weapons, and I think there's a good chance that we'll achieve the
03:30 necessary majority in the United Nations General Assembly to have a resolution calling for
03:36 a ban.