A few days ago I noticed a piece by Katy Waldman on Slate.com about how soldiers may be developing excessively close attachments to their robots, according to a professor doing research into “human/robot interrelations”.

“She interviewed 23 explosive ordnance personnel—22 men and one woman—who worked with robot sidekicks, looking at how they imagined the bots in relation to themselves. She found that some troops anthropomorphized their machines, assigning them names (at times painting them on), genders, and even personalities. And while the soldiers denied that affection for the bots colored their combat strategy, they reported feeling sad and angry when the equipment was destroyed.”

The irony, as Waldman notes, is that militaries are developing robots for purposes such as ordnance and explosive demolition precisely to spare human soldiers from these risky tasks; “if troops care too much about the bots to put them in danger, that hesitance could compromise outcomes in the field.”

However, yesterday the New York Times reported that while the US Air Force has done well in acquiring killer (if not lovable) robots, the Army has not had the same success in acquiring funding for so-called Autonomous Ground Vehicles. There have been some advances, such as the Legged Squad Support System, a vehicle the size of a cow that can carry 400 pounds of equipment. Watch this video and you can easily imagine soldiers becoming emotionally attached to this device. However, budget reductions, and the ongoing cost of supporting human soldiers wounded in the wars of the last decade, may well mean that the peaceful (or terrifying) future of killer robots predicted here by John Arquilla will not arrive anytime soon.

Responses

  1. I don't think that in the short-to-medium term this is going to pose a great problem. Soldiers always anthropomorphise things – an unseen enemy, distant FNG-type HQs, comms problems, shortages of food… in short, it's part of a soldier's armoury of mental and moral tools for dealing with being in a warzone, tools which have existed for thousands of years. The only reason this generates controversy is because it's visually similar to nightmare scenarios depicted in popular culture.

    Soldiers get mentally attached to each other in a warzone – while that has some negative effects when casualties are suffered (such as short-term anger and pain, and longer-term combat stress issues), overall it's a big part of the moral component of fighting power. Extending this to machines (at least to the point referred to in this article, i.e. naming them and getting upset when they're destroyed) is going to bring a few negative effects, but all the positives of comradeship are going to apply in this area too.