Tuesday, July 23, 2013

Drone to Human: Leave the Thinking to Me

[Part of the series: 10 Questions to Spur the Drone Debate]


In early July, 1988, I was on a business trip in China. On the 4th of July, I was in Chengdu, in Sichuan province, when a brief exchange occurred that is etched in my memory.

We were scheduled to visit a leather tannery, and my counterpart from the local office of the China National Native Produce and Animal By-Products Import/Export Corporation (ANIBY for short) met me in the usual white van after breakfast for the long ride to the tannery. We may have been going to Leshan that day; or perhaps to Ya'an; or else to one of the other far-flung tanneries I used to visit.


"A missile departs the forward launcher of Vincennes
during a 1987 exercise."
(Source: Wikipedia)


As I climbed into the back of the van that morning, he said to me in Chinese, "Shi Gelei [Scarry]? Have you heard? A U.S. missile shot down an Iranian 747 . . . ."

I couldn't quite parse the sentence. My Chinese was pretty good, but the words he was saying weren't making any sense. What was he talking about?

He told me again, slowly, "In the Mideast - Iran - a plane, a 747. The United States. Shot it down."

I repeated the words back to him, putting the meaning together in my mind, as he looked me in the eyes and nodded. "More than 200 people were killed," he said.

Finally, I squinted, shook my head, and replied, incredulous: "No . . . How could a mistake like that happen . . . ?"

And then came his response, the part that I have never been able to forget:

"Mistake?" He smiled ironically. "I suppose some people might imagine it was a mistake . . . ."


Map of the Iran Air 655 shootdown
(Source: Wikipedia)


That's how I heard about the 290 people who died on Iran Air Flight 655.

It was the 4th of July, I was a long way from home, and I was with people whose worldview was completely independent of the one that surrounded me in the United States. At that moment, I realized that we Americans are deluded when we imagine that people in the rest of the world assume we are filled with nothing but good intentions and give us the benefit of the doubt. In July, 1988, I didn't yet understand why this was so, but that moment began a journey to find out. In the twenty-five years since, my thinking has evolved considerably.

I think of that day often. Nonetheless, I have to admit that I still hadn't quite made the connection between this incident and the work to rein in drone warfare, in which I am so involved, until reading P.W. Singer's book, Wired for War. Singer makes a very important point about the Iran Air Flight 655 case. As he explains it, the Aegis information system that spotted and interpreted the information about that civilian flight, and guided the shoot-down, is exactly the kind of "man-in-the-loop" system being used in the drone killings in the Mideast today: in theory, humans exercise their judgment in the process, but in reality the computer system is viewed as too "smart" to be second-guessed by a human being.


Aegis cruiser "control room": who's controlling whom?
(Source: Wikipedia)


I encourage everyone to read Singer on The Ethical, Psychological Effects Of Robotic Warfare. As he explains:
"Designed to defend Navy ships against missile and plane attacks, the [Aegis] system operates in four modes, from 'semi-automatic,' in which humans work with the system to judge when and at what to shoot, to 'casualty,' in which the system operates as if all the humans are dead and does what it calculates is best to keep the ship from being hit. Humans can override the Aegis system in any of its modes, but experience shows that this capability is often beside the point, since people hesitate to use this power . . . . [In the Iran Air 655 incident] Though the hard data were telling the human crew that the plane wasn't a fighter jet, they trusted the computer more. Aegis was in semi-automatic mode, giving it the least amount of autonomy, but not one of the 18 sailors and officers in the command crew challenged the computer's wisdom. They authorized it to fire."
Multiply this situation by 1,000 and you have the U.S. drone fleet today: skies full of lethal robots theoretically controlled by "pilots" . . . but, in fact, front-loaded with a bias to kill, and with little impetus to contradict that bias.
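
To make that dynamic concrete, here is a toy sketch in Python. Everything in it is hypothetical and invented purely for illustration; it has no relation to actual Aegis or drone software. It models Singer's point: an override can exist on paper while the loop is structurally biased toward approving whatever the machine says.

    # Toy model of "human-in-the-loop" automation bias.
    # All names and thresholds are hypothetical, invented for illustration.
    from dataclasses import dataclass
    from enum import Enum, auto

    class Mode(Enum):
        """Two ends of the spectrum of operating modes Singer describes."""
        SEMI_AUTOMATIC = auto()  # humans judge when and at what to shoot
        CASUALTY = auto()        # system acts as if all the humans were dead

    @dataclass
    class Track:
        computer_label: str  # what the system says the contact is
        sensor_label: str    # what the raw sensor data actually support

    def human_confirms(track: Track, trust_in_computer: float) -> bool:
        """The human 'override' exists, but in this toy model the operator
        defers to the computer whenever trust is high -- even when the
        hard data disagree, as in the Iran Air 655 case."""
        if track.computer_label == track.sensor_label:
            return True  # no conflict; approval is easy
        # Conflict: the override is only exercised if trust is low enough.
        return trust_in_computer > 0.5  # biased toward approving the machine

    if __name__ == "__main__":
        airliner = Track(computer_label="hostile fighter",
                         sensor_label="civilian airliner")
        # With high trust, conflicting sensor data never triggers an override:
        print(human_confirms(airliner, trust_in_computer=0.9))  # True -> fire

The point of the sketch is that the override branch is present but nothing in the loop favors exercising it -- which is Singer's observation that the capability to override "is often beside the point."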

Some people think it's not yet time to worry about drones, because the drones are "not really making decisions on their own" -- and they're certainly not at the point of trying to out-think humans. People reason, "There's still a 'man in the loop,' so what is there to worry about?" As Isaac Asimov illustrated so effectively in his "Robot" series, it is always tempting to think that the culprit we have to be afraid of is the robot:
"I don't like to have Nestor 10 continue to elude us. It's bad. It must be gratifying his swollen sense of superiority. I'm afraid that his motivation is no longer simply one of following orders. It think it's becoming more a matter of sheer neurotic necessity to out-think humans." (Isaac Asimov, "Little Lost Robot," p. 178)
But as Peter Singer's description of the Iran Air Flight 655 case illustrates, the real culprit is the humans "in-the-loop" who have ceased to exercise control.

So . . . what do we need to be more afraid of? Robots with a compulsion to out-think humans? Or humans who are afraid to second-guess the robots?

Let's start debating the drones now. (Here are nine more questions to guide the debate.)


(Page references are to the 1990 Byron Preiss Visual Publications edition of Robot Visions.)


Related posts

The article goes on, in what can only be dubbed "dronespeak," to explain away all the deadly consequences of the U.S. drone program in terms of information theory and neurobiology . . . the "swirl of data" . . . "multitasking" . . . "theta" . . . .

(See Drones, Dronespeak, and Death TV: "Intense" )

We can now entrust all the dirty work -- including war -- to robots. (Or can we?)

(See A Modest Proposal: Debate the Drones )

In my opinion, the reason to focus on drones is this: when we focus on drones, the general public is able to "get," to an unusual extent, the degree to which popular consent has been banished from the process of carrying out state violence. (Sure, it was banished long ago, but the absence of a human in the cockpit of a drone suddenly makes a light bulb go off in people's heads.) It takes some prodding, but people can sense that drone use somehow crosses a line. And that opens up the discussion about how our consent has been eliminated from the vast range of US militarism.

(See "Why focus on drone attacks?")
