“A very simple point,” Tonya Welton replied. “With all due respect, Donald,
“One other point. Consider the speechblock put on the staff robots, the one preventing them from saying who ordered them to go to the far wing of the labs that night. It seems to me that a mechanical device, an override circuit, would be more effective at setting an absolute block against speech concerning certain subjects than any intricate series of orders given to each and every robot. It would be easier to set up as well. And before you object that such a speechblock circuit would weaken the robot’s ability to obey the damned Three Laws, we are assuming that the attacker was not too fastidious about such things. Donald, how large a piece of microcircuitry would that take?”
“It could be made small enough to be invisible to the human eye, and could be wired in anywhere in the robot’s sensory system.”
“I’ll bet your people never even thought to look for a device like that.”
There was an uncomfortable silence before Tonya continued. “Even if you do insist on that,” she said at last, “there are documented cases where Three Law robots did kill human beings.”
Donald’s head snapped back a bit, and his eyes grew dim for a moment. Tonya looked toward him with some concern. “Donald, are you in difficulty?”
“No, I beg your pardon. I am aware of such cases, but I am afraid that the abrupt mention of them was most disturbing. The mere contemplation of such things is most unpleasant, and caused a slight flux in my motor function. However, I am recovered now, and I believe you can pursue your point without concern for me. I am now braced for it. Please continue.”
Tonya hesitated for a moment, until Kresh felt he had to speak. “It’s all right,” he said. “Donald is a police robot, programmed for special resilience where the contemplation of harm to humans is concerned. Go on.”
Tonya nodded, a bit uncertainly. “It was about a standard century ago, and there was a great deal of effort to hush it up, but there was a series of incidents on Solaria. Robots, all with perfectly functional Three Law positronic brains, killed humans, simply because the robots were programmed with a defective definition of what a human being was. Nor is the myth of robotic infallibility completely accurate. There have doubtless been other cases we don’t know about, because the cover-ups were successful. Robots can malfunction, can make mistakes.
“It is foolish to flatly assume that a robot capable of harming a human could not be built, or to believe that a robot with Three Laws could not inadvertently harm a human under any circumstances. For my part, I see the Spacer faith in the perfection and infallibility of robots as a folk myth, an article of faith, and one that is contradicted by the facts.”
Alvar Kresh opened his mouth to protest, but he never got the chance. Donald spoke up first.
“You may well be correct, Lady Tonya,” the robot said, “but I would submit that the myth is a needful one.”
“Needful in what way?” Tonya Welton demanded.
“Spacer society is predicated, almost completely, on the use of robots. There is almost no activity on Inferno, or on the other Spacer worlds, that does not rely in some way upon them. Spacers, denied robots, would be unable to survive.”
“Which is precisely the objection we Settlers have to robots,” Welton said.
“As is well known, and as is widely regarded as a specious argument by Spacers,” Donald said. “Deny Settlers computers, or hyperdrive, or any other vital machine knit into the fabric of their society, and Settler culture could not survive. Human beings can be defined as the animal that needs tools. Other species of old Earth used and made tools, but only humans need them to survive. Deny all tools to a human, and you sentence that human to all but certain death. But I digress from the main point.” Donald turned to look at Alvar and then turned back toward Welton.