Perfect Servants

Essay by William H. Stoddard

January 2006

Isaac Asimov & Jack Williamson

In 1950, Isaac Asimov published I, Robot, a collection of related stories that had appeared over the preceding decade. In this series, Asimov established what became one of the standard formulas of science fiction: the robot as the ideal servant, using its superior strength and logic for the good of its human creators. This formula was so persuasive that it almost entirely replaced the older formula of the robot that turned against its creators and destroyed them. Asimov coined the name "the Frankenstein complex" for this older plot, after one of its prototypes — though in both Mary Shelley's Frankenstein and Karel Čapek's R.U.R., which coined the word "robot", the artificial beings were in fact manmade living organisms and not machines. In Asimov's view, the fear of robots was an expression of ignorance and superstition, backed up by the self-interest of the rich and powerful.

In 1947, before Asimov's book came out but after most of its constituent stories had been published, Jack Williamson published the short story "With Folded Hands". It offers an entirely different fear of mechanical beings: not that they will seek to destroy their creators, but that their eagerness to help and protect them will make them a subtler threat. Williamson's "humanoids" will not permit the wife of his protagonist to cook, because kitchens contain sharp tools and hot surfaces; nor to drive in the country, because human drivers sometimes have accidents; nor even to read, because novels are about unhappy people in dangerous situations. And when their human masters find this protection frustrating, the humanoids operate on their brains, removing the impulse to rebel and even the memory of rebellion.
  

Human laws regulating robots / humanoids

The striking thing about these two situations is that they both grow out of the same basic rules of conduct for robots. Asimov's Three Laws of Robotics are widely known:

  1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
      

Williamson offers a much briefer statement, but one that makes the same basic points:

To Serve and Obey, and Guard Men from Harm.
  

Williamson's principles don't explicitly include self-preservation, but his story makes it clear that his humanoids do preserve themselves — because if they failed to do so, they could not serve, obey, or guard anyone. So it appears that we have two different statements of the same basic assumptions. But they lead to quite different conduct by the beings they govern. Asimov sees the potential for making human life better; Williamson sees the threat to human freedom.

In the real world, at the time when Asimov and Williamson were writing, there were in fact entities that offered both sets of potentialities, and that were interpreted in both ways by different groups of people. Those entities were governments, and especially national governments. The first half of the Twentieth Century had seen a great shift in political beliefs and institutions, in which government played an increasingly intrusive role in daily life, in forms ranging from the outright totalitarianism of Germany and the Soviet Union to the far more limited increase in American state power under the Progressive movement and the New Deal.

It's possible, at the very least, to see an analogy between the two writers' views of mechanical automata and the spectrum of political views on increased federal regulation. Asimov's robots are flawlessly obedient servants, as, in the view of liberals of the time, government agencies were obedient servants of the public will, expressed through the votes of democratic majorities. Williamson's humanoids are nominally bound to obey human beings, but their overriding duty to protect against harm in fact authorizes them to take away all meaningful human freedom of choice, as, in the view of conservatives of the time, economic regulation used appeals to the public welfare to destroy competitive markets. It's possible, if one likes, to read both sets of automata as symbols for the state, and especially for the socialist state and the planned economy.
  

When I first thought of this analogy, I decided that it would be interesting to write about it, and that before doing so, I ought to reread both works. After rereading them, I realized that this interpretation was far less of a creative leap than I had thought. In fact, both Asimov and Williamson spell out the political and economic aspects of their views of mechanical automata, in a way that almost exactly matches this reading.
 

Robot law helping people: Asimov

For Isaac Asimov, the most important story is the last one in I, Robot, titled "The Evitable Conflict". It is almost a dramatized essay rather than a story, but for that very reason its intellectual content is unusually clear. Asimov envisions a future in which Earth is divided into four great regions, each with its own set of Machines to guide its economic decisions. The name "Machines" refers to enormously sophisticated computers with the same basic kind of brains as robots, and with the same obedience to the Three Laws of Robotics. The story presents a puzzle: all four sets of Machines are making slight errors, which cause small economic dislocations. As the World Coordinator investigates these errors, a common pattern emerges. Each error has some connection with one of the four regions, the Northern Region, which encompasses the United States, Canada, Australia, New Zealand, the British Isles, and the Soviet Union, and which is the primary home of the Society for Humanity, an association of powerful executives who oppose economic planning by the Machines, wanting to make their own decisions rather than be told what to do. And each error results, in the final outcome, in economic loss to these executives and their firms, and in a weakening of their political influence. The economic plans of the Machines aim at what is best for humanity, and what is best for humanity is the survival of the Machines and the growth of their control of the economy.

On one side, then, Asimov places the concept of a planned economy, which he portrays in his story as more efficient and productive than any humanly controlled economy — and, by implication, than any market economy. On the other, he places wealthy and powerful men who want economic freedom, and whom, in effect, his story defines as parasites, to be humanely neutralized — because, after all, they are human beings and must not be harmed, but it's all right for them to suffer a little mild harm for the sake of humanity as a whole. The World Coordinator notes their connection with the Northern Region — which includes the countries that historically had the strongest ideology of economic freedom. He also notes that both the Eastern and Tropic Regions have histories of humiliation by the Northern Region, and that though the Northern Region has the most economic power, it is losing out to the Tropic Region in the long run. These hint at a view of industrial capitalism as exploitative and parasitic. Robots are not merely symbols of a socialistic government, but the actual means of making one work through superhuman computational powers.
  

Humanoid law helping people: Williamson

On Jack Williamson's side, the story makes it clear that his humanoids replace all human economic enterprises. The replacement is largely peaceful: property owners assign all their assets to the humanoids in return for lifelong service. But when the protagonist refuses to make the exchange, he is driven into bankruptcy, and his business assets are seized by a bank now run by the humanoids. This legal takeover, though not carried out by violent revolution, exactly fits the classical socialist program of "expropriation of the expropriators". The result is a society where all human enterprises have been abolished and replaced by a single productive complex run entirely by the humanoids; that is, the humanoids exactly fit the economic role of the state in the classical socialist program.

The story that followed "With Folded Hands", a novel titled The Humanoids, offers a more complex treatment of the humanoids, one that can be read as ending with a favorable view of their activities. But along the way, Williamson has an opponent of the humanoids speak of his cause as "the worth and the dignity and the rights of each individual", exactly the values that opponents of socialism appealed to. Whether Williamson's endorsement of the humanoids is meant to be taken seriously, or as grim irony, like the ending of George Orwell's 1984, the metaphorical link is clear.
 

An efficient state of robots / humanoids & people

Science fiction, in the 1930s and early 1940s, attracted many fans who supported "progressive" causes — and in that era, "progressive" meant "socialist". Neither the brutality of Leninist regimes nor the insupportable costs of social democracy were as clear then as they are now. Socialism seemed to be a vision of a better future. And robots offered a technological equivalent of that vision. Where the older vision of the robot, or the artificial being, was of a brutal monster, superior to human beings in physical force and hostile to them — in effect, a symbol for the militaristic state — the new vision was of beings that were altruistic, rational, and efficient, and that would take over from human beings through their superior fitness to govern — the promise of the socialist state.

Asimov's stories present that vision as seen by a writer who accepts it as desirable; Williamson's "With Folded Hands" offers a cautionary response to it, by a writer who favors more individualistic values. And both writers seem to have been aware of the political issues that their seemingly fantastic stories reflected.

  

 © 2006 William H. Stoddard


  