"Unleash your creativity and unlock your potential with MsgBrains.Com - the innovative platform for nurturing your intellect." » » "The Martian Way and Other Stories" by Isaac Asimov

Add to favorite "The Martian Way and Other Stories" by Isaac Asimov

Select the language in which you want the text you are reading to be translated, then select the words you don't know with the cursor to get the translation above the selected word!




Go to page:
Text Size:

‘It is like you to be more concerned for a machine than for a man.’ He looked at her with savage contempt.

It left her unmoved. ‘It merely seems so, Professor Ninheimer. It is only by being concerned for robots that one can truly be concerned for twenty-first-century Man. You would understand this if you were a roboticist.’

‘I have read enough robotics to know I don’t want to be a roboticist!’

‘Pardon me, you have read a book on robotics. It has taught you nothing. You learned enough to know that you could order a robot to do many things, even to falsify a book, if you went about it properly. You learned enough to know that you could not order him to forget something entirely without risking detection, but you thought you could order him into simple silence more safely. You were wrong.’

‘You guessed the truth from his silence?’

‘It wasn’t guessing. You were an amateur – and didn’t know enough to cover your tracks completely. My only problem was to prove the matter to the judge and you were kind enough to help us there, in your ignorance of the robotics you claim to despise.’

‘Is there any purpose in this discussion?’ asked Ninheimer wearily.

‘For me, yes,’ said Susan Calvin, ‘because I want you to understand how completely you have misjudged robots. You silenced Easy by telling him that if he told anyone about your own distortion of the book, you would lose your job. That set up a certain potential within Easy toward silence, one that was strong enough to resist our efforts to break it down. We would have damaged the brain if we had persisted.

‘On the witness stand, however, you yourself put up a higher counter-potential. You said that because people would think that you, not a robot, had written the disputed passages in the book, you would lose far more than just your job. You would lose your reputation, your standing, your respect, your reason for living. You would lose the memory of you after death. A new and higher potential was set up by you – and Easy talked.’

‘Oh, God,’ said Ninheimer, turning his head away.

Calvin was inexorable. She said, ‘Do you understand why he talked? It was not to accuse you, but to defend you! It can be mathematically shown that he was about to assume full blame for your crime, to deny that you had anything to do with it. The First Law required that. He was going to lie – to damage himself – to bring monetary harm to a corporation. All that meant less to him than did the saving of you. If you really understood robots and robotics, you would have let him talk. But you did not understand, as I was sure you wouldn’t, as I guaranteed to the defense attorney that you wouldn’t. You were certain, in your hatred of robots, that Easy would act as a human being would act and defend itself at your expense. So you flared out at him in panic – and destroyed yourself.’

Ninheimer said with feeling, ‘I hope some day your robots turn on you and kill you!’

‘Don’t be foolish,’ said Calvin. ‘Now I want you to explain why you’ve done all this.’

Ninheimer grinned a distorted, humorless grin. ‘I am to dissect my mind, am I, for your intellectual curiosity, in return for immunity from a charge of perjury?’

‘Put it that way if you like,’ said Calvin emotionlessly. ‘But explain.’

‘So that you can counter future antirobot attempts more efficiently? With greater understanding?’

‘I accept that.’

‘You know,’ said Ninheimer, ‘I’ll tell you – just to watch it do you no good at all. You can’t understand human motivation. You can only understand your damned machines because you’re a machine yourself, with skin on.’

He was breathing hard and there was no hesitation in his speech, no searching for precision. It was as though he had no further use for precision.

He said, ‘For two hundred and fifty years, the machine has been replacing Man and destroying the handcraftsman. Pottery is spewed out of molds and presses. Works of art have been replaced by identical gimcracks stamped out on a die. Call it progress, if you wish! The artist is restricted to abstractions, confined to the world of ideas. He must design something in his mind – and then the machine does the rest.

‘Do you suppose the potter is content with mental creation? Do you suppose the idea is enough? That there is nothing in the feel of the clay itself, in watching the thing grow as hand and mind work together? Do you suppose the actual growth doesn’t act as a feedback to modify and improve the idea?’

‘You are not a potter,’ said Dr Calvin.

‘I am a creative artist! I design and build articles and books. There is more to it than the mere thinking of words and of putting them in the right order. If that were all, there would be no pleasure in it, no return.

‘A book should take shape in the hands of the writer. One must actually see the chapters grow and develop. One must work and rework and watch the changes take place beyond the original concept even. There is taking the galleys in hand and seeing how the sentences look in print and molding them again. There are a hundred contacts between a man and his work at every stage of the game – and the contact itself is pleasurable and repays a man for the work he puts into his creation more than anything else could. Your robot would take all that away.’

‘So does a typewriter. So does a printing press. Do you propose to return to the hand-illumination of manuscripts?’

‘Typewriters and printing presses take away some, but your robot would deprive us of all. Your robot takes over the galleys. Soon it, or other robots, would take over the original writing, the searching of the sources, the checking and cross-checking of passages, perhaps even the deduction of conclusions. What would that leave the scholar? One thing only – the barren decisions concerning what orders to give the robot next! I want to save the future generations of the world of scholarship from such a final hell. That meant more to me than even my own reputation and so I set out to destroy U.S. Robots by whatever means.’

‘You were bound to fail,’ said Susan Calvin.

‘I was bound to try,’ said Simon Ninheimer.

Calvin turned and left. She did her best to feel no pang of sympathy for the broken man.

She did not entirely succeed.


Lenny

United States Robots and Mechanical Men, Inc., had a problem. The problem was people.

Peter Bogert, Senior Mathematician, was on his way to Assembly when he encountered Alfred Lanning, Research Director. Lanning was bending his ferocious white eyebrows together and staring down across the railing into the computer room.

On the floor below the balcony, a trickle of humanity of both sexes and various ages was looking about curiously, while a guide intoned a set speech about robotic computing.

‘This computer you see before you,’ he said, ‘is the largest of its type in the world. It contains five million three hundred thousand cryotrons and is capable of dealing simultaneously with over one hundred thousand variables. With its help, U. S. Robots is able to design with precision the positronic brains of new models.

‘The requirements are fed in on tape which is perforated by the action of this keyboard – something like a very complicated typewriter or linotype machine, except that it does not deal with letters but with concepts. Statements are broken down into the symbolic logic equivalents and those in turn converted to perforation patterns.

‘The computer can, in less than one hour, present our scientists with a design for a brain which will give all the necessary positronic paths to make a robot . . .’

Alfred Lanning looked up at last and noticed the other. ‘Ah, Peter,’ he said.

Bogert raised both hands to smooth down his already perfectly smooth and glossy head of black hair. He said, ‘You don’t look as though you think much of this, Alfred.’

Lanning grunted. The idea of public guided tours of U. S. Robots was of fairly recent origin, and was supposed to serve a dual function. On the one hand, the theory went, it allowed people to see robots at close quarters and counter their almost instinctive fear of the mechanical objects through increased familiarity. And on the other hand, it was supposed to interest at least an occasional person in taking up robotics research as a life work.

‘You know I don’t,’ Lanning said finally. ‘Once a week, work is disrupted. Considering the man-hours lost, the return is insufficient.’

‘Still no rise in job applications, then?’

‘Oh, some, but only in the categories where the need isn’t vital. It’s research men that are needed. You know that. The trouble is that with robots forbidden on Earth itself, there’s something unpopular about being a roboticist.’

‘The damned Frankenstein complex,’ said Bogert, consciously imitating one of the other’s pet phrases.

Lanning missed the gentle jab. He said, ‘I ought to be used to it, but I never will. You’d think that by now every human being on Earth would know that the Three Laws represented a perfect safeguard; that robots are simply not dangerous. Take this bunch.’ He glowered down. ‘Look at them. Most of them go through the robot assembly room for the thrill of fear, like riding a roller coaster. Then when they enter the room with the MEC model – damn it, Peter, a MEC model that will do nothing on God’s green Earth but take two steps forward, say ‘Pleased to meet you, sir,’ shake hands, then take two steps back – they back away and mothers snatch up their kids. How do we expect to get brainwork out of such idiots?’

Bogert had no answer. Together, they stared down once again at the line of sightseers, now passing out of the computer room and into the positronic brain assembly section. Then they left. They did not, as it turned out, observe Mortimer W. Jacobson, age 16 – who, to do him complete justice, meant no harm whatever.

In fact, it could not even be said to be Mortimer’s fault. The day of the week on which the tour took place was known to all workers. All devices in its path ought to have been carefully neutralized or locked, since it was unreasonable to expect human beings to withstand the temptation to handle knobs, keys, handles and pushbuttons. In addition, the guide ought to have been very carefully on the watch for those who succumbed.

But, at the time, the guide had passed into the next room and Mortimer was tailing the line. He passed the keyboard on which instructions were fed into the computer. He had no way of suspecting that the plans for a new robot design were being fed into it at that moment, or, being a good kid, he would have avoided the keyboard. He had no way of knowing that, by what amounted to almost criminal negligence, a technician had not inactivated the keyboard.

So Mortimer touched the keys at random as though he were playing a musical instrument.

He did not notice that a section of perforated tape stretched itself out of the instrument in another part of the room – soundlessly, unobtrusively.
