"Unleash your creativity and unlock your potential with MsgBrains.Com - the innovative platform for nurturing your intellect." » » "The Martian Way and Other Stories" by Isaac Asimov

Add to favorite "The Martian Way and Other Stories" by Isaac Asimov

Select the language in which you want the text you are reading to be translated, then select the words you don't know with the cursor to get the translation above the selected word!




Go to page:
Text Size:

‘I tried to get it to talk and it hit me.’

‘What do you mean: you tried to get it to talk? How did you try?’

‘I – I asked it questions, but it wouldn’t say anything, and I had to give the thing a fair shake, so I kind of – yelled at it and—’

‘And?’

There was a long pause. Under Susan Calvin’s unwavering stare, Randow finally said, ‘I tried to scare it into saying something.’ He added defensively, ‘I had to give the thing a fair shake.’

‘How did you try to scare it?’

‘I pretended to take a punch at it.’

‘And it brushed your arm aside?’

‘It hit my arm.’

‘Very well. That’s all.’ To Lanning and Bogert, she said, ‘Come, gentlemen.’

At the doorway, she turned back to Randow. ‘I can settle the bets going around, if you are still interested. Lenny can speak a few words quite well.’

They said nothing until they were in Susan Calvin’s office. Its walls were lined with her books, some of which she had written herself. It retained the patina of her own frigid, carefully-ordered personality. It had only one chair in it and she sat down. Lanning and Bogert remained standing.

She said, ‘Lenny only defended itself. That is the Third Law: A robot must protect its own existence.’

‘Except,’ said Lanning forcefully, ‘when this conflicts with the First or Second Laws. Complete the statement! Lenny had no right to defend itself in any way at the cost of harm, however minor, to a human being.’

‘Nor did it,’ shot back Calvin, ‘knowingly. Lenny has an aborted brain. It had no way of knowing its own strength or the weakness of humans. In brushing aside the threatening arm of a human being it could not know the bone would break. In human terms, no moral blame can be attached to an individual who honestly cannot differentiate good and evil.’

Bogert interrupted, soothingly, ‘Now, Susan, we don’t blame. We understand that Lenny is the equivalent of a baby, humanly speaking, and we don’t blame it. But the public will. U.S. Robots will be closed down.’

‘Quite the opposite. If you had the brains of a flea, Peter, you would see that this is the opportunity U.S. Robots is waiting for. That this will solve its problems.’

Lanning hunched his white eyebrows low. He said, softly, ‘What problems, Susan?’

‘Isn’t the corporation concerned about maintaining our research personnel at the present – Heaven help us – high level?’

‘We certainly are.’

‘Well, what are you offering prospective researchers? Excitement? Novelty? The thrill of piercing the unknown? No! You offer them salaries and the assurance of no problems.’

Bogert said, ‘How do you mean, no problems?’

‘Are there problems?’ shot back Susan Calvin. ‘What kind of robots do we turn out? Fully developed robots, fit for their tasks. An industry tells us what it needs; a computer designs the brain; machinery forms the robot; and there it is, complete and done. Peter, some time ago, you asked me with reference to Lenny what its use was. What’s the use, you said, of a robot that was not designed for any job? Now I ask you – what’s the use of a robot designed for only one job? It begins and ends in the same place. The LNE models mine boron. If beryllium is needed, they are useless. If boron technology enters a new phase, they become useless. A human being so designed would be sub-human. A robot so designed is sub-robotic.’

‘Do you want a versatile robot?’ asked Lanning, incredulously.

‘Why not?’ demanded the robopsychologist. ‘Why not? I’ve been handed a robot with a brain almost completely stultified. I’ve been teaching it, and you, Alfred, asked me what was the use of that. Perhaps very little as far as Lenny itself is concerned, since it will never progress beyond the five-year-old level on a human scale. But what’s the use in general? A very great deal, if you consider it as a study in the abstract problem of learning how to teach robots. I have learned ways to short-circuit neighboring pathways in order to create new ones. More study will yield better, more subtle and more efficient techniques of doing so.’

‘Well?’

‘Suppose you started with a positronic brain that had all the basic pathways carefully outlined but none of the secondaries. Suppose you then started creating secondaries. You could sell basic robots designed for instruction; robots that could be modelled to a job, and then modelled to another, if necessary. Robots would become as versatile as human beings. Robots could learn!’

They stared at her.

She said, impatiently, ‘You still don’t understand, do you?’

‘I understand what you are saying,’ said Lanning.

‘Don’t you understand that with a completely new field of research and completely new techniques to be developed, with a completely new area of the unknown to be penetrated, youngsters will feel a new urge to enter robotics? Try it and see.’

‘May I point out,’ said Bogert, smoothly, ‘that this is dangerous. Beginning with ignorant robots such as Lenny will mean that one could never trust First Law – exactly as turned out in Lenny’s case.’

‘Exactly. Advertise the fact.’

‘Advertise it!’

‘Of course. Broadcast the danger. Explain that you will set up a new research institute on the moon, if Earth’s population chooses not to allow this sort of thing to go on upon Earth, but stress the danger to the possible applicants by all means.’

Lanning said, ‘For God’s sake, why?’

‘Because the spice of danger will add to the lure. Do you think nuclear technology involves no danger and spationautics no peril? Has your lure of absolute security been doing the trick for you? Has it helped you to cater to the Frankenstein complex you all despise so? Try something else then, something that has worked in other fields.’

There was a sound from beyond the door that led to Calvin’s personal laboratories. It was the chiming sound of Lenny.

The robopsychologist broke off instantly, listening. She said, ‘Excuse me. I think Lenny is calling me.’

‘Can it call you?’ said Lanning.

‘I said I’ve managed to teach it a few words.’ She stepped toward the door, a little flustered. ‘If you will wait for me—’

They watched her leave and were silent for a moment. Then Lanning said, ‘Do you think there’s anything to what she says, Peter?’

‘Just possibly, Alfred,’ said Bogert. ‘Just possibly. Enough for us to bring the matter up at the directors’ meeting and see what they say. After all, the fat is in the fire. A robot has harmed a human being and knowledge of it is public. As Susan says, we might as well try to turn the matter to our advantage. Of course, I distrust her motives in all this.’

‘How do you mean?’

‘Even if all she has said is perfectly true, it is only rationalization as far as she is concerned. Her motive in all this is her desire to hold on to this robot. If we pressed her,’ (and the mathematician smiled at the incongruous literal meaning of the phrase) ‘she would say it was to continue learning techniques of teaching robots, but I think she has found another use for Lenny. A rather unique one that would fit only Susan of all women.’

‘I don’t get your drift.’

Bogert said, ‘Did you hear what the robot was calling?’

‘Well, no, I didn’t quite – ’ began Lanning, when the door opened suddenly, and both men stopped talking at once.

Susan Calvin stepped in again, looking about uncertainly. ‘Have either of you seen – I’m positive I had it somewhere about – Oh, there it is.’

She ran to a corner of one bookcase and picked up an object of intricate metal webbery, dumbbell shaped and hollow, with variously-shaped metal pieces inside each hollow, just too large to be able to fall out of the webbing.

As she picked it up, the metal pieces within moved and struck together, clicking pleasantly. It struck Lanning that the object was a kind of robotic version of a baby rattle.
