{Ethical flipside of smart tech} Social and ethical lessons from operating military drones and robots

The battlefield in the Middle East teaches important lessons on the virtualisation of real-life situations: on the social bonds that humans develop with machines, and, at the other end of the spectrum, on the impossibility of totally ‘virtualising’ an experience: the link with the physical world always comes back to remind us of its reality.

New research from the University of Washington explores the social bonds soldiers develop with their ‘bomb disposal’ robots in Iraq and Afghanistan. Soldiers often anthropomorphise the machines that help keep them alive, assigning them human attributes and even displaying empathy toward them, to the point of refusing replacement robots unless it is ‘their’ machine being repaired, or of holding funerals for their fallen brothers in robotic arms. One such funeral recently took place in Iraq: the tribute involved a 21-gun salute and the awarding of a Purple Heart and a Bronze Star Medal. ‘He’ was a MARCbot, an R2-D2-like robot designed to disarm explosives.

The other side of this “virtualisation” paradigm is the issue affecting drone* operators (* technically unmanned machines guided by operators, as opposed to fully automated robots) who are diagnosed with post-traumatic stress disorder despite never physically facing the battlefield. This runs counter to the initial concept of drone warfare, which assumed that combat’s devastating psychological effects had been mitigated. Instead, “moral injury” is now an accepted condition, one that shifts the focus from the violence done to people toward their feelings about what they have done to others. Yet 61% of Americans in the latest Pew survey support military drones because they do not risk US lives. The impacts are nonetheless serious: in a 2011 survey, 42% of operators reported moderate to high stress, and 20% reported emotional exhaustion or burnout.

The specific insight is that even in a highly digitised world that shields operators from direct physical action, psychological consequences still exist. The very idea of a robot or an artificial intelligence fulfilling our duties unsettles us. Whilst the technology is advancing at a fast pace, the moral and ethical burdens it carries have been largely unexamined. As in many domains, the military is at the vanguard of future civilian issues. Scientists, legal experts and philosophers are now joining forces to scrutinise the promise of intelligent systems and wrangle over their implications: after winning at the quiz show Jeopardy!, IBM’s supercomputer Watson will soon diagnose diseases, and will also be used by health insurers to assess customers. But what are the implications for businesses and customers when they want to contest decisions made by “black boxes”? The ethics of this emerging paradigm, experienced first in the extreme environment of the military, will ultimately need to be addressed by civilian organisations.

Xavier Rizos – 1/12/2013 – Sydney

Read more >

– Confessions of a Drone Warrior: http://www.gq.com/news-politics/big-issues/201311/drone-uav-pilot-assassination?currentPage=1

– In French: Grégoire Chamayou, Théorie du drone (‘Theory of the Drone’) http://atheles.org/lafabrique/livres/theoriedudrone/index.html


