[Part of the series: 10 Questions to Spur the Drone Debate]
"The government was offering the company a fortune, and threatening it with antirobot legislation in case of refusal. We were stuck then, and we're badly stuck now." (Isaac Asimov, "Little Lost Robot," p. 167)
Not surprisingly, the industrial companies that have developed an addiction to war production are now veering into an addiction to robotic war production. Just yesterday, at a meeting of the Anti-War Committee of Chicago, we were discussing the way in which Boeing has bet the company's future on next-generation drones and a $1.5 billion contract from the Navy. (More at What If Illinois Became a "War-Profiteer-Free Zone"?)
The "military-industrial complex" that Dwight Eisenhower warned against has been extended via a government-industrial robot complex.
And, as with the military-industrial complex, the government-industrial robot complex has thoroughly penetrated our country's education and research sector. While reading P.W. Singer's book, Wired for War, I was struck by the problem faced by all robotics researchers: sooner or later all roads lead to military use. Midwestern universities and colleges have a BIG drone problem.
The government calls the shots. If you think it doesn't, and/or that you're not implicated, you're kidding yourself. Four anecdotes come to mind.
First, a talk by a scientist at a conference entitled "The Atomic Age" in Chicago in 2011. "You would think there are some areas of study that the military is not interested in. For instance, I specialize in sand: how it flows, how it piles up, etc. What could be more boring than sand? Well, lo and behold, in 1991 the U.S. invaded Iraq, and suddenly the Army was interested in sand. Guess whose phone started to ring!" Nothing is immune from military interest; the more useful the technology, the more the military will demand to be involved.
Second, the film Inextinguishable Fire, by Harun Farocki. The film is about napalm, but it could easily be about drones. It shows people who work for a technology company trying mightily to disown the consequences of their company's products. ("I just work in the testing department . . . ." or "Actually, our products have lots of other uses . . . .") We need to start being honest: If you're a cog in the wheel of war, you're responsible!
Third, a story that most people are familiar with: shortly before the dropping of the atomic bombs on Hiroshima and Nagasaki, numerous Manhattan Project scientists signed a letter urging President Truman not to use the bomb. We all know how that turned out. (I was reminded of this incident by its dramatization in John Adams's powerful opera, Doctor Atomic.) The only way to stop governments from using weapons is to make sure they never get access to them in the first place.
Finally, a very similar account from the Soviet dissident Andrei Sakharov. Sakharov recounted, in his memoirs, the celebration after his scientific team's successful development of a working hydrogen bomb. "May our bombs only ever explode over test sites, not over cities!" he proclaimed in a toast. This was greeted with withering, profane sarcasm by the political commissar who was present: "You take care of making us strong, comrade; we'll choose the targets!"
What's the solution? Should we all become Luddites?
It's not clear how to keep the military's hands off robotic and other technologies. But the first step is to open our eyes and our mouths and start being honest about the extent of the problem. What are we prepared to do to keep technologies from being used by the government to injure people?
Let's start debating the drones now. (Here are nine more questions to guide the debate.)
(Page references are to the 1990 Byron Preiss Visual Publications edition of Robot Visions.)