Do Androids Dream of Electric Sheep?, Philip K. Dick (1968)
Here’s a question:
If I pick up Arthur C. Clarke’s 2001: A Space Odyssey (1968) and George Orwell’s Nineteen Eighty-Four (1949), will I discover that their impact has diminished because those particular years have come and gone without the suggested future coming to pass?
Yes, yes, I realize: FICTION. But still. The authors gave us handles, something to dread, and it’s difficult to let go.
And of course I also realize: METAPHOR. I say: Those years from the 20th and 21st centuries resonate in ways I can’t ignore. Rarely do random numbers hold such meaning.
I’ve just finished Philip K. Dick’s Do Androids Dream of Electric Sheep? (hereinafter DADES), which specifies a date: Rick Deckard kills 6 android escapees on January 3, 2021. When I saw that, I couldn’t help thinking, Damn! Why didn’t I start this book 9 months ago? (Not to mention, what took me so long to read it in the first place?) But also, I’m relieved to find another dystopian vision has failed to come true.
For anyone who has never seen Blade Runner, DADES takes place in a dystopian world ruined by nuclear war and ecological devastation. The manufacture of androids is big business, especially given how useful the “andys” are for off-earth colonies. Andys are essentially slaves, designed with minimal emotive faculties and short life-spans; they have visible gender, but humans refer to them as “it”. They are things, not humans. They feel no loyalty to other andys, no sense of awe for any form of life, yet they fully understand their own situation as the property of humans and thus desire freedom.
Occasionally, andys escape enslavement, and when they reach earth, they must be “retired”. Which is to say, a bounty hunter earns $1000 for each android he destroys. Rick Deckard is a bounty hunter. No one in Deckard’s world makes clear why it is necessary to destroy the androids who escape Mars/enslavement. It’s possible to argue that any who have reached earth must have killed humans in order to escape, but once they arrive here, they’re mostly harmless. One of Deckard’s targets is an accomplished opera singer, and I don’t understand why she — “it”, Deckard would correct me — had to be killed. “Retired” is the proper term, he would point out. It’s impossible to kill something that isn’t actually living.
Two essential elements of the novel are missing from the movie, and they’re worth discussing here. One is the need humans feel to own and care for animals. The need is so great, there are companies that supply “electric” animals: robots that must be fed, groomed, taken to “vets”, etc. Deckard and his wife have an electric sheep. There are electric ostriches, toads, horses, along with dogs, cats, parakeets and goats. Living animals are so rare, there’s a weekly blue-book publication listing animals by species, latest sale price, and extinction status. At one point, Deckard buys a living goat, paying $3000 down with 36 additional payments of $500 per month, and that’s a bargain. It’s more than he can afford, but his need is too great. His sheep no longer satisfies.
And then there’s Mercerism. The religion of Dick’s dystopia, Mercerism’s inspirational leader (Wilbur Mercer) helps humans share empathic responses and modulate extreme emotions. It’s all very strange, with empathy boxes that take characters into a virtual world, and mood control machines that run from 0 to the high triple digits. It’s no surprise that people inhabiting Dick’s sunless and nearly lifeless Earth are in a perpetual funk and need help cheering themselves up. Electric pets can do only so much for a person. But why would they need a machine to de-escalate elation? I should think a glance out a window, or the daily cleanup of radioactive dust, would dampen any high spirits.
Turns out, though, that those resonant years (1984, 2001, 2021) are red herrings, not countdown clocks. I ought to let them go. What Dick foretold in DADES is this issue of drawing the line between human and android, a theme that the screenplay retains and that we’re still wrestling with all these years later. As we cede more control of our lives to AI, are we constructing individual dystopias where we need help empathizing with other humans? In building androids, are we simply creating a species we can consider non-human and thus suitable for enslavement? Or are we possibly creating a species we’ll eventually merge with?
In a well-known talk given in 1972, Dick wondered why much of his fiction up to then featured
… artificial constructs masquerading as humans. Usually with a sinister purpose in mind. I suppose I took it for granted that if such a construct, a robot for example, had a benign or anyhow decent purpose in mind, it would not need to so disguise itself. Now, to me, that theme seems obsolete. The constructs do not mimic humans; they are, in many deep ways, actually human already. They are not trying to fool us, for a purpose of any sort; they merely follow lines we follow, in order that they, too, may overcome such common problems as the breakdown of vital parts, loss of power source, attack by such foes as storms, short circuits….

Philip K. Dick, “The Android and the Human”, 1972
Later in the same talk, Dick asks:

… what is it, in our behavior, that we can call specifically human? That is special to us as a living species? And what is it that, at least up to now, we can consign as merely machine behavior, or, by extension, insect behavior, or reflex behavior?
Philip K. Dick isn’t merely asking what makes androids human. He also wants to know what might make humans robot-like. Deckard’s job is to destroy androids. When he shoots one, it doesn’t dissolve into a pile of wires and metal. There’s blood and guts. The only way to know the retired being is not human is through bone marrow analysis. When Deckard reminds himself to objectify each andy he kills — to refer to each as “it” — how much of his humanity does he give up? Dick argues that a human becomes an android when she is “pounded down, manipulated, made into a means without [her] knowledge or consent”. Or perhaps with her knowledge and consent? Deckard consented to being a bounty killer.
Ironically, Dick finds hope in reckless youth who can’t be controlled — the kid who “rebels not out of theoretical, ideological considerations, only out of what might be called pure selfishness.”
I have to laugh. Of course! If Dick is right, selfishness, the most human characteristic, is the one that will save us. If we’re lucky, we’ll always have young people flouting rules and laws, acting unpredictably, and teaching AI that it’ll never be the boss. I think I’ll sleep better at night now.