"Unleash your creativity and unlock your potential with MsgBrains.Com - the innovative platform for nurturing your intellect." » » "The Martian Way and Other Stories" by Isaac Asimov

Add to favorite "The Martian Way and Other Stories" by Isaac Asimov

Select the language in which you want the text you are reading to be translated, then select the words you don't know with the cursor to get the translation above the selected word!




Go to page:
Text Size:

‘Into every one. That will be borne out by any roboticist.’

‘And into Robot EZ-27 specifically?’

‘Yes, Your Honor.’

‘You will probably be required to repeat those statements under oath.’

‘I am ready to do so, Your Honor.’

He sat down again.

Dr Susan Calvin, robopsychologist-in-chief for U.S. Robots, who was the gray-haired woman sitting next to Lanning, looked at her titular superior without favor, but then she showed favor to no human being. She said, ‘Was Goodfellow’s testimony accurate, Alfred?’

‘Essentially,’ muttered Lanning. ‘He wasn’t as nervous as all that about the robot and he was anxious enough to talk business with me when he heard the price. But there doesn’t seem to be any drastic distortion.’

Dr Calvin said thoughtfully, ‘It might have been wise to put the price higher than a thousand.’

‘We were anxious to place Easy.’

‘I know. Too anxious, perhaps. They’ll try to make it look as though we had an ulterior motive.’

Lanning looked exasperated. ‘We did. I admitted that at the University Senate meeting.’

‘They can make it look as if we had one beyond the one we admitted.’

Scott Robertson, son of the founder of U.S. Robots and still owner of a majority of the stock, leaned over from Dr Calvin’s other side and said in a kind of explosive whisper, ‘Why can’t you get Easy to talk so we’ll know where we’re at?’

‘You know he can’t talk about it, Mr Robertson.’

‘Make him. You’re the psychologist, Dr Calvin. Make him.’

‘If I’m the psychologist, Mr Robertson,’ said Susan Calvin coldly, ‘let me make the decisions. My robot will not be made to do anything at the price of his well-being.’

Robertson frowned and might have answered, but Justice Shane was tapping his gavel in a polite sort of way and they grudgingly fell silent. Francis J. Hart, head of the Department of English and Dean of Graduate Studies, was on the stand. He was a plump man, meticulously dressed in dark clothing of a conservative cut, and possessing several strands of hair traversing the pink top of his cranium. He sat well back in the witness chair with his hands folded neatly in his lap and displaying, from time to time, a tight-lipped smile.

He said, ‘My first connection with the matter of the Robot EZ-27 was on the occasion of the session of the University Senate Executive Committee at which the subject was introduced by Professor Goodfellow. Thereafter, on the 10th of April of last year, we held a special meeting on the subject, during which I was in the chair.’

‘Were minutes kept of the meeting of the Executive Committee? Of the special meeting, that is?’

‘Well, no. It was a rather unusual meeting.’ The dean smiled briefly. ‘We thought it might remain confidential.’

‘What transpired at the meeting?’

Dean Hart was not entirely comfortable as chairman of that meeting. Nor did the other members assembled seem completely calm. Only Dr Lanning appeared at peace with himself. His tall, gaunt figure and the shock of white hair that crowned him reminded Hart of portraits he had seen of Andrew Jackson.

Samples of the robot’s work lay scattered along the central regions of the table and the reproduction of a graph drawn by the robot was now in the hands of Professor Minott of Physical Chemistry. The chemist’s lips were pursed in obvious approval.

Hart cleared his throat and said, ‘There seems no doubt that the robot can perform certain routine tasks with adequate competence. I have gone over these, for instance, just before coming in, and there is very little to find fault with.’

He picked up a long sheet of printing, some three times as long as the average book page. It was a sheet of galley proof, designed to be corrected by authors before the type was set up in page form. Along both of the wide margins of the galley were proofmarks, neat and superbly legible. Occasionally, a word of print was crossed out and a new word substituted in the margin in characters so fine and regular it might easily have been print itself. Some of the corrections were blue to indicate the original mistake had been the author’s, a few in red, where the printer had been wrong.

‘Actually,’ said Lanning, ‘there is less than very little to find fault with. I should say there is nothing at all to find fault with, Dr Hart. I’m sure the corrections are perfect, insofar as the original manuscript was. If the manuscript against which this galley was corrected was at fault in a matter of fact rather than of English, the robot is not competent to correct it.’

‘We accept that. However, the robot corrected word order on occasion and I don’t think the rules of English are sufficiently hidebound for us to be sure that in each case the robot’s choice was the correct one.’

‘Easy’s positronic brain,’ said Lanning, showing large teeth as he smiled, ‘has been molded by the contents of all the standard works on the subject. I’m sure you cannot point to a case where the robot’s choice was definitely the incorrect one.’

Professor Minott looked up from the graph he still held. ‘The question in my mind, Dr Lanning, is why we need a robot at all, with all the difficulties in public relations that would entail. The science of automation has surely reached the point where your company could design a machine, an ordinary computer of a type known and accepted by the public, that would correct galleys.’

‘I am sure we could,’ said Lanning stiffly, ‘but such a machine would require that the galleys be translated into special symbols or, at the least, transcribed on tapes. Any corrections would emerge in symbols. You would need to keep men employed translating words to symbols, symbols to words. Furthermore, such a computer could do no other job. It couldn’t prepare the graph you hold in your hand, for instance.’

Minott grunted.

Lanning went on. ‘The hallmark of the positronic robot is its flexibility. It can do a number of jobs. It is designed like a man so that it can use all the tools and machines that have, after all, been designed to be used by a man. It can talk to you and you can talk to it. You can actually reason with it up to a point. Compared to even a simple robot, an ordinary computer with a nonpositronic brain is only a heavy adding machine.’

Goodfellow looked up and said, ‘If we all talk and reason with the robot, what are the chances of our confusing it? I suppose it doesn’t have the capability of absorbing an infinite amount of data.’

‘No, it hasn’t. But it should last five years with ordinary use. It will know when it will require clearing, and the company will do the job without charge.’

‘The company will?’

‘Yes. The company reserves the right to service the robot outside the ordinary course of its duties. It is one reason we retain control of our positronic robots and lease rather than sell them. In the pursuit of its ordinary functions, any robot can be directed by any man. Outside its ordinary functions, a robot requires expert handling, and that we can give it. For instance, any of you might clear an EZ robot to an extent by telling it to forget this item or that. But you would be almost certain to phrase the order in such a way as to cause it to forget too much or too little. We would detect such tampering, because we have built-in safeguards. However, since there is no need for clearing the robot in its ordinary work, or for doing other useless things, this raises no problem.’

Dean Hart touched his head as though to make sure his carefully cultivated strands lay evenly distributed and said, ‘You are anxious to have us take the machine. Yet surely it is a losing proposition for U.S. Robots. One thousand a year is a ridiculously low price. Is it that you hope through this to rent other such machines to other universities at a more reasonable price?’

‘Certainly that’s a fair hope,’ said Lanning.

‘But even so, the number of machines you could rent would be limited. I doubt if you could make it a paying proposition.’

Lanning put his elbows on the table and earnestly leaned forward. ‘Let me put it bluntly, gentlemen. Robots cannot be used on Earth, except in certain special cases, because of prejudice against them on the part of the public. U.S. Robots is a highly successful corporation with our extraterrestrial and space-flight markets alone, to say nothing of our computer subsidiaries. However, we are concerned with more than profits alone. It is our firm belief that the use of robots on Earth itself would mean a better life for all eventually, even if a certain amount of economic dislocation resulted at first.

‘The labor unions are naturally against us, but surely we may expect cooperation from the large universities. The robot, Easy, will help you by relieving you of scholastic drudgery – by assuming, if you permit it, the role of galley slave for you. Other universities and research institutions will follow your lead, and if it works out, then perhaps other robots of other types may be placed and the public’s objections to them broken down by stages.’

Minott murmured, ‘Today Northeastern University, tomorrow the world.’

Angrily, Lanning whispered to Susan Calvin, ‘I wasn’t nearly that eloquent and they weren’t nearly that reluctant. At a thousand a year, they were jumping to get Easy. Professor Minott told me he’d never seen as beautiful a job as that graph he was holding and there was no mistake on the galley or anywhere else. Hart admitted it freely.’

The severe vertical lines on Dr Calvin’s face did not soften. ‘You should have demanded more money than they could pay, Alfred, and let them beat you down.’

‘Maybe,’ he grumbled.

Prosecution was not quite done with Professor Hart. ‘After Dr Lanning left, did you vote on whether to accept Robot EZ-27?’

‘Yes, we did.’

‘With what result?’

‘In favor of acceptance, by majority vote.’
