Automated Weapons and the Battlefield of 2050


European law protects you from automated decisions.
Which sounds like a minor thing, but if an automated system is making a meaningful decision
about your life — whether you get a pay rise, or if you get a formal warning for clocking
in late at work — you must be told that it was an automated decision, and you must be
able to get it reviewed by a human. At least, in Europe. And importantly, you can tell a
company to never make automated decisions about you and they must comply with that.
Now that doesn’t apply to little things like online personality quizzes, unless that quiz
decides — without a human in the loop — whether you’re a safe person to lend money to. But those are all fairly benign automated
systems, even if it might not feel like it when you’ve just been turned down for a loan. What about automated systems that are meant
to be harmful to humans? I’ve filmed at this tank once before. It’s
an old Soviet T-34 battle tank, put here in London two decades ago by an angry local resident
who was denied permission to build on this bit of land… but did have permission to
store vehicles here. Or maybe he got permission to put a “tank” there, because the local council
thought he was talking about some sort of storage tank, not an actual T-34. The story’s
got a bit muddled over the last twenty years or so. But this tank is designed for people. It’s
designed for human soldiers making human decisions. And that concept might not be the case for
very much longer. See, the US Army just put out a report called
Visualising the Tactical Ground Battlefield in the Year 2050. Which sounds like a really
dull comic book, but it’s actually a pretty good read, and they have some things to say
about automated decision making. The US Army believes that humans will be incapable
of handling all the information that’ll be thrown at them at the height of a future battle.
Computer processing power, they say, won’t be a limiting factor, and as humans grow more
accustomed to automated decision making, of course we’ll be happy to use that in the battlefield
too. If you have a soldier who’s happy with flipping the autopilot on their car, of course
they’re going to be happy doing exactly the same for their tank and for their gun.
“Battle rhythm”, and I’m quoting the report directly here, “will increase to the point
that, in many instances, humans will no longer be able to be ‘in the loop,’ but will instead
need to operate ‘on the loop’, where ‘humans can only observe the behaviors that are taking
place’… and ‘act after the fact or in anticipation of expected behaviors.’” So just to be clear: the US Army believes,
in the next few decades, that they will have autonomous weapons which will have patterns
of behaviour that an engineer is going to have to try and debug while it is firing live
ammunition. If you’re a programmer, working on open-source computer vision, or collision
avoidance, or, say, something like flocking behaviour algorithms, there is a chance that
your code will end up as part of that. And if you’ve got moral objections: well, good
for you, but it’s probably not going to stop anyone. And I suspect all the legislation that Europe
can muster won’t do a thing when you’ve got one of these, quite literally, calling the
shots.
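The flocking behaviour algorithms mentioned in the transcript are a good example of how innocuous-looking open-source code could end up in an autonomous system. Below is a minimal sketch of Craig Reynolds's three classic "boids" rules (separation, alignment, cohesion); all names and tuning constants are my own toy illustration, not code from any real project or weapons system.

```python
from dataclasses import dataclass

@dataclass
class Boid:
    x: float; y: float    # position
    vx: float; vy: float  # velocity

def step(boids, sep=0.05, align=0.05, coh=0.01):
    """Advance every boid one tick using the three classic flocking rules."""
    n = len(boids)
    cx = sum(b.x for b in boids) / n    # flock centre
    cy = sum(b.y for b in boids) / n
    avx = sum(b.vx for b in boids) / n  # average heading
    avy = sum(b.vy for b in boids) / n
    for b in boids:
        # cohesion: steer gently towards the flock centre
        b.vx += (cx - b.x) * coh
        b.vy += (cy - b.y) * coh
        # alignment: drift towards the average heading
        b.vx += (avx - b.vx) * align
        b.vy += (avy - b.vy) * align
        # separation: push away from very close neighbours
        for other in boids:
            if other is not b and abs(other.x - b.x) + abs(other.y - b.y) < 1.0:
                b.vx -= (other.x - b.x) * sep
                b.vy -= (other.y - b.y) * sep
        b.x += b.vx
        b.y += b.vy
```

Three local rules are enough to produce convincing coordinated group movement, which is exactly why code like this circulates freely, and why its authors have no say in where it ends up.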

100 thoughts on “Automated Weapons and the Battlefield of 2050”

  1. I love these videos.
    This topic, though – machines killing humans with no human interaction. I think that treaties like the Ottawa Treaty should be put into action. If not, a skilled hacker will be able to kill thousands of people with a few clicks. Drone-warfare tactics are already a grey zone.

  2. Today it's drones and autonomous tanks. Tomorrow, it will be Berserker probes on a mission to exterminate all life in the galaxy.

  3. As a Canadian, I appreciate the red maple leaf with the white background at the front (possibly back) of that peace tank. 0:47 1:47 2:41 Looks like there's another above the tracks briefly around 3:06.

  4. Reminds me of the plot of Congo. The book, not the movie.

    I'm not sure that I buy it; combat has actually slowed down since the effective range of weapons was increased to the point where it's basically "to the horizon".

  5. [WARNING – YOUTUBE AUTOMATED NOTICE]
    Hi,

    This video has been demonetized as you referenced "Weapon", "Battlefield", "Tank", "Tactic", "US Army", "Collision" – we need to take this course of action as we think our 'Advertisers' are two-month-old babies who cry when an adult whispers the word "bang", because you know those words are scary and you know how hard it is to get our advertisers to bed when you wake them up by accident in the middle of the night!

  6. Some fully automated military weapons already exist; look into the Metal Storm Area Denial System, and radar-activated anti-aircraft systems. You turn them on and they do the rest.

  7. Another benefit of automation is that, if absolutely everything is done by machines on both sides, fewer people will die.

  8. Dirty, dangerous, dishonest: these, these are the words that describe the choice to use a computer to kill a human being.

  9. Not under any circumstances, I repeat, not under any circumstances and in no context, is it okay to kill a man with a computer.

  10. Maybe it'll end death in war? Combatants will be autonomous on both sides with the aim of destroying and depleting as much of the enemy's resources as possible rather than using attrition as a tactic…

  11. Future? Isn't this already the present? But the US Army won't tell us.
    If I wanted to, had ten good engineers, enough money, one or two years' time, and access to all the data the NSA, CIA etc. collect over the internet, I could develop such a system. But I definitely DON'T want to!

  12. If two countries went to war, would it be more sensible not to have bloodied soldiers' wounds to dress, or limbs to amputate, or dead bodies lying everywhere? Could the scenario not be more humane? Could the countries not agree to dispose of certain numbers of humans in a humane way, rather than the nasty way war is waged now?

    Let's say the unemployed or those currently in jail should be sent to the gas chambers, NAZI style. Better that than to have people machine-gunned in cold blood, dying in agony for want of assistance from a medic or doctor, from wounds, on a muddy, dusty battlefield, yes?

    Assume it was agreed that territory could be gained by the other side amputating 452 limbs, ten fingers and two eyes. Would that be better than a mud- and blood-spattered unit desperately trying to fight for the life of a comrade under fire?

    Where would it end? Who knows? What are your thoughts on war?

  13. My personal prediction is that if the US Army can take humans off the battlefield and replace them with automated weapons, warfare will not change much. Here's the primary issue: your enemies are not idiots. They are smart. If they stand no chance of killing you, they will not stand around shooting at your automated weapons; it's a waste of ammo and men. They will disperse, go into hiding, and become guerrillas and terrorists. They will bomb that market your civilians go to for shopping, or that restaurant your officers go to for lunch.

    You need to provide a target of opportunity for the enemy to shoot at for them to appear. That's where the traditional infantry come in. That poor bloke with a rifle.

    That's what the US Army found out in Afghanistan and Iraq. They were so overwhelmingly powerful at dislodging a concentrated formation of any enemy in the world that their opponents simply reverted to being guerrillas. Then they kept having to send in the infantry just to patrol around, asking people, occasionally taking some shots and shooting back. A mortar in the 2000s is a lot more important than fancy aircraft.

  14. Wow, I watched Tom's video on how snow and confetti wreak havoc upon video quality, and I think the same is occurring in the background here, about a minute in.

  15. Or we could just connect ourselves to computers (and each other) and expand our own processing power. That would not only solve the problem of "automated decisions" but likely inoculate us against the threat of hostile AI.

  16. "Autopilot on their gun"
    This reminds me heavily of Psycho-Pass, an anime about a future where law enforcement are equipped with guns that detect the mental state of the target and how likely they are to commit a crime in future, making the gun unable to fire, stun, or kill depending on the result.
    A good watch if anyone is interested in this topic.

  17. So basically, the future of warfare will involve Engineers mindlessly banging their fully autonomous turrets with a wrench as it does all the killing for them? Do we get to pick our hats?

  18. Anybody working on open-source computer vision or any other projects similar to the ones Tom gives as examples towards the end: change your license quickly. "Any vehicle using this library is required to be painted as pink as physically possible, and covered in glitter", or something like that. Make it so stupid and impractical that they just can't use your libraries.

  19. Perfect. If every country uses only automated tanks, the tanks can shoot each other without any humans involved in a war. Therefore nobody dies; just some tanks get destroyed and a lot of money burnt.

  20. It's as simple as "no one's gonna have a reaction time fast enough to fight these machines", so it must be automated. Why are you such a hippy, Tom Scott?

  21. Wars without soldiers sound good to me. Maybe armies will develop into a standoff measure like nukes: you won't attack a country, not because you don't think you can win, but simply because it's too costly, and countries will keep sizable drone armies to prevent conventional wars from breaking out and to monitor the uncivilized parts of the world.

  22. So you want a Robot Uprising? Because that's how you get a Robot Uprising.
    Seriously, never give guns to machines.

  23. If strategy game AI is any indicator, the problem won't be the weapons system targeting wrong individuals, it will be pathing. Terribly embarrassing if, halfway through a battle, three quarters of your army decides to leave and drive 500 miles to come back to roughly the same position.

  24. We need a new open-source license that is, for the most part, permissive (as in MIT), but contains one copyleft-ish clause: this code may not be used for military purposes or other purposes of war, and any project that uses this code must contain the same clause.

  25. Battle technicians in the future may have to know computer code, their job being to approach automated weapons and hack/disable them. It would be the most dangerous job for a programmer.

  26. It's just gonna be robots killing each other. War will be safe for people. But what about the old soldiers? And the robots?

  27. How about YouTube automatically demonetizing videos which might represent the livelihood of creators?
    Can you tell YouTube (if you live in Europe) to stop that and review all your videos manually?

  28. Just put the code under a GPL License, so that any code derived from yours also has to be open source. I doubt the army would want everyone to see the source code to their systems.
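The commenter's GPL suggestion is usually put into practice with a machine-readable SPDX tag at the top of each source file; the GPL's copyleft terms then require derived works to be distributed under a compatible licence. A small sketch of the convention (the helper function below is hypothetical, purely to show the tag being read back):

```python
# SPDX-License-Identifier: GPL-3.0-or-later
# The SPDX tag above is the standard machine-readable way to declare a
# licence on a source file. Under the GPL's copyleft terms, derived works
# must be distributed under a compatible licence, which is the deterrent
# the commenter is counting on.

def license_of(source_text: str) -> str:
    """Read the SPDX identifier from a file's first line, if present (hypothetical helper)."""
    first_line = source_text.splitlines()[0]
    tag = "SPDX-License-Identifier:"
    return first_line.split(tag, 1)[1].strip() if tag in first_line else "unknown"
```

Whether copyleft would actually deter a military integrator is another question, but the tag at least makes the author's terms unambiguous and automatically auditable.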

  29. I've been to this tank before. I worked around the area for several years. One of my colleagues told me the same story, with the addition that the turret was pointed at the local council offices. However, when I open up maps and look at satellite imagery dated over the years, the turret seems to have been moved: pointing south and later NW in 2003, SW in 2009, south in 2010, NW in 2011, SW in 2018, facing the direction of the tank's front. So somebody's managing to turn it, and none of these directions ever pointed at the local council's building as far as I can see, unless I'm looking at the wrong/missing building(s).
