War is a hellacious thing. People die, lives are destroyed, countries fall, economies collapse, and those directly involved never see sunrises the same way again. But out of war often come great technological and medical advances. I am the beneficiary of one such advance. In 1975 I got a new kneecap, had my joint stapled back together, and received a few other, now common, modifications. I mention that they are now common because when I had it done, amputation was still a legitimate option and, in fact, the more common one. I have two legs because my doctor knew a surgeon who had taken the new training provided, in part, by Du Pont. They paid for it because they supplied the plastics that made body parts, like my new kneecap. Back then the general rule of thumb held that the parts could last up to five years. I still have the original and it works okay. That means my fake kneecap is two years older, and more reliable, than my ex-wife.
Okay, that sounded bitter.
Anyway, as I said, war can bring progress as well as devastation. But one thing people forget is that some great art comes from conflict as well. And art, like war, often contains surprises. John Lundberg writes today about a collection of poetry that is being released … by the Taliban.
Poetry of the Taliban, a soon-to-be-released collection of poetry written by Taliban fighters, faced a storm of criticism this past week. The book’s editors — two scholars — acknowledged that one strongly voiced complaint they have heard is that their book will give voice to terrorists.
Richard Kemp, a former Commander of British Forces in Afghanistan, recently raised the stakes when he publicly decried the book as “self-justifying propaganda”:
What we need to remember is that these are fascist, murdering thugs who suppress women and kill people without mercy if they do not agree with them, and of course are killing our soldiers. It doesn’t do anything but give the oxygen of publicity to an extremist group which is the enemy of this country.
Alex Strick van Linschoten, one of the book’s editors, argues instead that the book reveals the human side of the Taliban: “The poetry shows that the Taliban are people just like we are, with feeling, concerns, anxieties like ours.” The book’s cover art makes this line of argument quite clear. It portrays a lone, distant figure in a gorgeous landscape — framing the book’s poets as more Wordsworth than warrior.
The poems themselves support both sides of the debate. Some of them rail against the U.S. and its allies, conveying a predictable fanaticism:
I know the black ditches
I always carry a rocket launcher on my shoulder;
I know the hot trenches
I always ambush the enemy;
I know war, conflict, and disputes
I will tell the truth even if I am hung on the gallows…
Others celebrate love and landscapes, and even convey doubt. These can be strikingly disarming:
It’s a pity that we are wandering as vagrants,
We did this all to ourselves.
John Jeffcock, a former British Army captain who edited the poetry collection Heroes, written by British soldiers, doesn’t find the universal, raw human elements of the poems surprising. He told the press, “They are written by soldiers. While you may not agree with their cause, they go through the same anguish and pain and heartache that British soldiers would do.”
In defending their book, the editors of Poetry of the Taliban also argue that there is a great deal of value in learning about one’s supposed enemy, whether you’re a military captain or a responsible civilian. Most people would agree. But is reading a book of Taliban poetry the right way to learn?
Flagg Miller, a UC Davis professor who four years ago translated some of Osama bin Laden’s poetry for the academic journal Language and Communication, was keenly aware of the poetry’s propagandistic potential. In an interview with The Times of London, he said, “The violence and barbarism of war can sicken anybody and poetry is a way to frame that violence in higher ethics.”
That bit of wisdom would apply in the case of this book as well, I think. And it seems foolish to believe — as the editors claim — that the book’s verse, so much of which was first published on the Taliban’s website, is not concerned with politics. The Taliban’s website serves a purpose — and that purpose is not to be a literary magazine.
The editors are right that their book will offer some remarkable insight into the minds of the Taliban’s soldiers — insight that a lot of people, including me, will find intriguing. But it seems only reasonable that, when reading poetry that highlights the Taliban’s “human side,” one would do well to keep the whole of who they are firmly in mind.
Poetry of the Taliban is scheduled for release in the United States next month.
I know the hot trenches
I always ambush the enemy;
Yeah, I’m really feeling his humanity and want to give him a hug. I can’t even remember why we have a conflict with these lovely people in the first place. What say y’all? Let’s send them an invite for a night at Old Country Buffet. My treat.
Then we can kill them.
Now that we’ve agreed on that let’s discuss HOW we are going to kill them. Show of hands?
Yes, you in the back ….
That’s right, we’re going to use sentient robots. Because heavily armed, nearly impossible-to-kill robot slaves are such a good idea. Jonathan D. Moreno has the whole story.
Much controversy has surrounded the use of remote-controlled drone aircraft or “unmanned aerial vehicles” in the war on terror. But another, still more awe-inducing possibility has emerged: taking human beings out of the decision loop altogether. Emerging brain science could take us there.
Today drone pilots operate thousands of miles away from the battlefield. They must manage vast amounts of data and video images during exceptionally intense workdays. They are scrutinized by superiors for signs of stress, and to reduce such stress the Air Force is attempting shift changes, less physical isolation on the job, and more opportunities for rest.
Yet even as this remarkable new form of war fighting is becoming more widely recognized, there are at least two more possible technological transitions on the horizon that have garnered far less public attention. One is using brain-machine interface technologies to give the remote pilot instantaneous control of the drone through his or her thoughts alone. The technology is not science fiction: Brain-machine interface systems are already being used to help patients with paralytic conditions interact with their environments, like controlling a cursor on a computer screen.
In a military context, a well-trained operator, instead of using a joystick for very complicated equipment, may be able to process and transmit a command much more rapidly and accurately through a veritable mind-meld with the machine.
There are enormous technical challenges to overcome. For example, how sure can we be that the system is not interpreting a fantasy as an intention? Even if such an error were rare it could be deadly and not worth the risk.
Yet there is a way to avoid the errors of brain-machine interface that could change warfare in still more fundamental and unpredictable ways: autonomous weapons systems combining the qualities of human intelligence that neuroscience has helped us understand with burgeoning information and communications technologies.
Even now there are defensive weapons systems on U.S. naval ships that routinely operate on their own, but with human monitoring. A new automated weapons system has been deployed at the demilitarized zone between North and South Korea. This robot sentry is said to be the first that has integrated systems for surveillance, tracking, firing and voice-recognition. Reportedly it has an “automatic” mode that would allow it to fire without a human command, but that mode is not being used.
Robot warriors, proponents argue, would not be subject to the fatigue, fear and fury that often accompany the chaos of combat — emotions can result in accidental injuries to friends or even barbaric cruelties motivated by a thirst for revenge and a sense of power. Others say the proponents of robot warriors are naive: What would inhibit dictators or nonstate actors from developing robotic programs that ignored the laws of war?
Moreover, some security analysts already worry that remote control unacceptably lowers the bar for a technologically superior force to engage in conflict. And will their adversaries, frustrated by their lack of opportunity to confront an enemy in person, be more likely to employ robotic terror attacks on soft targets in that enemy’s territory? Will this be the death knell of whatever ethos of honor remains in modern military conflict?
Another technology is even more radical. Neuroscientists and philosophers are exploring the parameters of “whole brain emulation,” which would involve uploading a mind from a brain into a non-biological substrate. It might be that Moore’s Law (the idea that computing capacity doubles about every two years) would have to persist for decades in order for a computer to be sufficiently powerful to receive an uploaded mind. Then again, the leap might come by means of the new science of quantum computing — machines that use atomic mechanical phenomena instead of transistors to manage vast amounts of information. Experiments with quantum computing are already being performed at a number of universities and national laboratories in the United States and elsewhere.
Robotic warriors whose computers are based on whole brain emulation raise a stark question: Would these devices even need human minders? Perhaps, if we’re not careful, these creatures could indeed inherit the Earth.
National security planners and arms-control experts have already begun to have conversations about the ethical and legal implications of neurotechnologies and robotics in armed conflict. For it is inevitable that breakthroughs will be incorporated into security and intelligence assets.
The various international agreements about weapons and warfare do not cover the convergence of neuroscience and robotic engineering. Thus new treaties will have to be negotiated, specifying the conditions under which research and deployment may proceed, what kinds of programming rules must be in place, verification procedures, and how human beings will be part of the decision loop.
Given the obvious dangers to human society, fully autonomous offensive lethal weapons should never be permitted. And though the technical possibilities and operational practicalities may take decades to emerge, there is no excuse for not starting to develop new international conventions, which themselves require many years to craft and negotiate before they may be ratified by sovereign states. The next presidential administration should lead the world in taking up this complex but important task.
A version of this article appeared May 12, 2012, on page A15 in some U.S. editions of The Wall Street Journal, with the headline: Robot Soldiers Will Be a Reality — and a Threat.
Arthur C. Clarke wrote often of how giant robotic brains would be the best thing ever to happen to humanity.
Arty obviously missed the whole Frankenstein memo.
Listen to Bill McCormick on WBIG (FOX! Sports) every Friday around 9:10 AM.