"Unleash your creativity and unlock your potential with MsgBrains.Com - the innovative platform for nurturing your intellect." » English Books » "The Martian Way and Other Stories" by Isaac Asimov

Add to favorite "The Martian Way and Other Stories" by Isaac Asimov

Select the language in which you want the text you are reading to be translated, then select the words you don't know with the cursor to get the translation above the selected word!




Go to page:
Text Size:

Bogert said, ‘How do you mean, no problems?’

‘Are there problems?’ shot back Susan Calvin. ‘What kind of robots do we turn out? Fully developed robots, fit for their tasks. An industry tells us what it needs; a computer designs the brain; machinery forms the robot; and there it is, complete and done. Peter, some time ago, you asked me with reference to Lenny what its use was. What’s the use, you said, of a robot that was not designed for any job? Now I ask you – what’s the use of a robot designed for only one job? It begins and ends in the same place. The LNE models mine boron. If beryllium is needed, they are useless. If boron technology enters a new phase, they become useless. A human being so designed would be sub-human. A robot so designed is sub-robotic.’

‘Do you want a versatile robot?’ asked Lanning, incredulously.

‘Why not?’ demanded the robopsychologist. ‘Why not? I’ve been handed a robot with a brain almost completely stultified. I’ve been teaching it, and you, Alfred, asked me what was the use of that. Perhaps very little as far as Lenny itself is concerned, since it will never progress beyond the five-year-old level on a human scale. But what’s the use in general? A very great deal, if you consider it as a study in the abstract problem of learning how to teach robots. I have learned ways to short-circuit neighboring pathways in order to create new ones. More study will yield better, more subtle and more efficient techniques of doing so.’

‘Well?’

‘Suppose you started with a positronic brain that had all the basic pathways carefully outlined but none of the secondaries. Suppose you then started creating secondaries. You could sell basic robots designed for instruction; robots that could be modelled to a job, and then modelled to another, if necessary. Robots would become as versatile as human beings. Robots could learn!’

They stared at her.

She said, impatiently, ‘You still don’t understand, do you?’

‘I understand what you are saying,’ said Lanning.

‘Don’t you understand that with a completely new field of research and completely new techniques to be developed, with a completely new area of the unknown to be penetrated, youngsters will feel a new urge to enter robotics? Try it and see.’

‘May I point out,’ said Bogert, smoothly, ‘that this is dangerous. Beginning with ignorant robots such as Lenny will mean that one could never trust First Law – exactly as turned out in Lenny’s case.’

‘Exactly. Advertise the fact.’

‘Advertise it!’

‘Of course. Broadcast the danger. Explain that you will set up a new research institute on the moon, if Earth’s population chooses not to allow this sort of thing to go on upon Earth, but stress the danger to the possible applicants by all means.’

Lanning said, ‘For God’s sake, why?’

‘Because the spice of danger will add to the lure. Do you think nuclear technology involves no danger and spationautics no peril? Has your lure of absolute security been doing the trick for you? Has it helped you to cater to the Frankenstein complex you all despise so? Try something else then, something that has worked in other fields.’

There was a sound from beyond the door that led to Calvin’s personal laboratories. It was the chiming sound of Lenny.

The robopsychologist broke off instantly, listening. She said, ‘Excuse me. I think Lenny is calling me.’

‘Can it call you?’ said Lanning.

‘I said I’ve managed to teach it a few words.’ She stepped toward the door, a little flustered. ‘If you will wait for me—’

They watched her leave and were silent for a moment. Then Lanning said, ‘Do you think there’s anything to what she says, Peter?’

‘Just possibly, Alfred,’ said Bogert. ‘Just possibly. Enough for us to bring the matter up at the directors’ meeting and see what they say. After all, the fat is in the fire. A robot has harmed a human being and knowledge of it is public. As Susan says, we might as well try to turn the matter to our advantage. Of course, I distrust her motives in all this.’

‘How do you mean?’

‘Even if all she has said is perfectly true, it is only rationalization as far as she is concerned. Her motive in all this is her desire to hold on to this robot. If we pressed her,’ (and the mathematician smiled at the incongruous literal meaning of the phrase) ‘she would say it was to continue learning techniques of teaching robots, but I think she has found another use for Lenny. A rather unique one that would fit only Susan of all women.’

‘I don’t get your drift.’

Bogert said, ‘Did you hear what the robot was calling?’

‘Well, no, I didn’t quite – ’ began Lanning, when the door opened suddenly, and both men stopped talking at once.

Susan Calvin stepped in again, looking about uncertainly. ‘Have either of you seen – I’m positive I had it somewhere about – Oh, there it is.’

She ran to a corner of one bookcase and picked up an object of intricate metal webbery, dumbbell shaped and hollow, with variously-shaped metal pieces inside each hollow, just too large to be able to fall out of the webbing.

As she picked it up, the metal pieces within moved and struck together, clicking pleasantly. It struck Lanning that the object was a kind of robotic version of a baby rattle.

As Susan Calvin opened the door again to pass through, Lenny’s voice chimed again from within. This time, Lanning heard it clearly as it spoke the words Susan Calvin had taught it.

In heavenly celeste-like sounds, it called out, ‘Mommie, I want you. I want you, Mommie.’

And the footsteps of Susan Calvin could be heard hurrying eagerly across the laboratory floor toward the only kind of baby she could ever have or love.


A Loint of Paw

There was no question that Montie Stein had, through clever fraud, stolen better than $100,000. There was also no question that he was apprehended one day after the statute of limitations had expired.

It was his manner of avoiding arrest during that interval that brought on the epoch-making case of the State of New York vs. Montgomery Harlow Stein, with all its consequences. It introduced law to the fourth dimension.

For, you see, after having committed the fraud and possessed himself of the hundred grand plus, Stein had calmly entered a time machine, of which he was in illegal possession, and set the controls for seven years and one day in the future.

Stein’s lawyer put it simply. Hiding in time was not fundamentally different from hiding in space. If the forces of law had not uncovered Stein in the seven-year interval, that was their hard luck.

The District Attorney pointed out that the statute of limitations was not intended to be a game between the law and the criminal. It was a merciful measure designed to protect a culprit from indefinitely prolonged fear of arrest. For certain crimes, a defined period of apprehension of apprehension – so to speak – was considered punishment enough. But Stein, the D.A. insisted, had not experienced any period of apprehension at all.

Stein’s lawyer remained unmoved. The law said nothing about measuring the extent of a culprit’s fear and anguish. It simply set a time limit.
