‘Yes, sir.’
Defense pursed his lips. ‘In other words, it turns out that the man bringing the action for payment of $750,000 damages against my client, United States Robot and Mechanical Men, Incorporated, was the one who from the beginning opposed the use of the robot—although everyone else on the Executive Committee of the University Senate was persuaded that it was a good idea.’
‘He voted against the motion, as was his right.’
‘You didn’t mention in your description of the meeting any remarks made by Professor Ninheimer. Did he make any?’
‘I think he spoke.’
‘You think?’
‘Well, he did speak.’
‘Against using the robot?’
‘Yes.’
‘Was he violent about it?’
Dean Hart paused. ‘He was vehement.’
Defense grew confidential. ‘How long have you known Professor Ninheimer, Dean Hart?’
‘About twelve years.’
‘Reasonably well?’
‘I should say so, yes.’
‘Knowing him, then, would you say he was the kind of man who might continue to bear resentment against a robot, all the more so because an adverse vote had—’
Prosecution drowned out the remainder of the question with an indignant and vehement objection of his own. Defense motioned the witness down and Justice Shane called luncheon recess.
Robertson mangled his sandwich. The corporation would not founder for loss of three-quarters of a million, but the loss would do it no particular good. He was conscious, moreover, that there would be a much more costly long-term setback in public relations.
He said sourly, ‘Why all this business about how Easy got into the university? What do they hope to gain?’
The Attorney for Defense said quietly, ‘A court action is like a chess game, Mr Robertson. The winner is usually the one who can see more moves ahead, and my friend at the prosecutor’s table is no beginner. They can show damage; that’s no problem. Their main effort lies in anticipating our defense. They must be counting on us to try to show that Easy couldn’t possibly have committed the offense—because of the Laws of Robotics.’
‘All right,’ said Robertson. ‘That is our defense. An absolutely airtight one.’
‘To a robotics engineer. Not necessarily to a judge. They’re setting themselves up a position from which they can demonstrate that EZ-27 was no ordinary robot. It was the first of its type to be offered to the public. It was an experimental model that needed field testing and the university was the only decent way to provide such testing. That would look plausible in the light of Dr Lanning’s strong efforts to place the robot and the willingness of U.S. Robots to lease it for so little. The prosecution would then argue that the field test proved Easy to have been a failure. Now do you see the purpose of what’s been going on?’
‘But EZ-27 was a perfectly good model,’ argued Robertson.
‘It was the twenty-seventh in production.’
‘Which is really a bad point,’ said Defense somberly. ‘What was wrong with the first twenty-six? Obviously something. Why shouldn’t there be something wrong with the twenty-seventh, too?’
‘There was nothing wrong with the first twenty-six except that they weren’t complex enough for the task. These were the first positronic brains of the sort to be constructed and it was rather hit-and-miss to begin with. But the Three Laws held in all of them! No robot is so imperfect that the Three Laws don’t hold.’
‘Dr Lanning has explained this to me, Mr Robertson, and I am willing to take his word for it. The judge, however, may not be. We are expecting a decision from an honest and intelligent man who knows no robotics and thus may be led astray. For instance, if you or Dr Lanning or Dr Calvin were to say on the stand that any positronic brains were constructed “hit-and-miss,” as you just did, Prosecution would tear you apart in cross-examination. Nothing would salvage our case. So that’s something to avoid.’
Robertson growled, ‘If only Easy would talk.’
Defense shrugged. ‘A robot is incompetent as a witness, so that would do us no good.’
‘At least we’d know some of the facts. We’d know how it came to do such a thing.’
Susan Calvin fired up. A dullish red touched her cheeks and her voice had a trace of warmth in it. ‘We know how Easy came to do it. It was ordered to! I’ve explained this to counsel and I’ll explain it to you now.’
‘Ordered by whom?’ asked Robertson in honest astonishment. (No one ever told him anything, he thought resentfully. These research people considered themselves the owners of U.S. Robots, by God!)
‘By the plaintiff,’ said Dr Calvin.
‘In heaven’s name, why?’
‘I don’t know why yet. Perhaps just that we might be sued, that he might gain some cash.’ There were blue glints in her eyes as she said that.
‘Then why doesn’t Easy say so?’
‘Isn’t that obvious? It’s been ordered to keep quiet about the matter.’
‘Why should that be obvious?’ demanded Robertson truculently.
‘Well, it’s obvious to me. Robot psychology is my profession. If Easy will not answer questions about the matter directly, he will answer questions on the fringe of the matter. By measuring increased hesitation in his answers as the central question is approached, by measuring the area of blankness and the intensity of counter-potentials set up, it is possible to tell with scientific precision that his troubles are the result of an order not to talk, with its strength based on First Law. In other words, he’s been told that if he talks, harm will be done a human being. Presumably harm to the unspeakable Professor Ninheimer, the plaintiff, who, to the robot, would seem a human being.’
‘Well, then,’ said Robertson, ‘can’t you explain that if he keeps quiet, harm will be done to U.S. Robots?’