The Bicentennial Man and Other Stories Part 7


"I know better. As long as the Three Laws of Robotics exist, they cannot. They can serve as partners of mankind; they can share in the great struggle to understand and wisely direct the laws of nature so that together they can do more than mankind can possibly do alone; but always in such a way that robots serve human beings."

"But if the Three Laws have shown themselves, over the course of two centuries, to keep robots within bounds, what is the source of the distrust of human beings for robots?"

"Well"--and Harriman's graying hair tufted as he scratched his head vigorously--"mostly superstition, of course. Unfortunately, there are also some complexities involved that anti-robot agitators seize upon."

"Involving the Three Laws?"

"Yes. The Second Law in particular. There's no problem in the Third Law, you see. It is universal. Robots must always sacrifice themselves for human beings, any human beings."

"Of course," said George Ten.

"The First Law is perhaps less satisfactory, since it is always possible to imagine a condition in which a robot must perform either Action A or Action B, the two being mutually exclusive, and where either action results in harm to human beings. The robot must therefore quickly select which action results in the least harm. To work out the positronic paths of the robot brain in such a way as to make that selection possible is not easy. If Action A results in harm to a talented young artist and B results in equivalent harm to five elderly people of no particular worth, which action should be chosen?"

"Action A," said George Ten. "Harm to one is less than harm to five."

"Yes, so robots have always been designed to decide. To expect robots to make judgments of fine points such as talent, intelligence, the general usefulness to society, has always seemed impractical. That would delay decision to the point where the robot is effectively immobilized. So we go by numbers. Fortunately, we might expect crises in which robots must make such decisions to be few....But then that brings us to the Second Law."

"The Law of Obedience."

"Yes. The necessity of obedience is constant. A robot may exist for twenty years without ever having to act quickly to prevent harm to a human being, or find itself faced with the necessity of risking its own destruction. In all that time, however, it will be constantly obeying orders....Whose orders?"

"Those of a human being."

"Any human being? How do you judge a human being so as to know whether to obey or not? What is man, that thou art mindful of him, George?"

George hesitated at that.

Harriman said hurriedly, "A Biblical quotation. That doesn't matter. I mean, must a robot follow the orders of a child; or of an idiot; or of a criminal; or of a perfectly decent intelligent man who happens to be inexpert and therefore ignorant of the undesirable consequences of his order? And if two human beings give a robot conflicting orders, which does the robot follow?"

"In two hundred years," said George Ten, "have not these problems arisen and been solved?"

"No," said Harriman, shaking his head violently. "We have been hampered by the very fact that our robots have been used only in specialized environments out in space, where the men who dealt with them were experts in their field. There were no children, no idiots, no criminals, no well-meaning ignoramuses present. Even so, there were occasions when damage was done by foolish or merely unthinking orders. Such damage in specialized and limited environments could be contained. On Earth, however, robots must have judgment. So those against robots maintain, and, damn it, they are right."

"Then you must insert the capacity for judgment into the positronic brain."

"Exactly. We have begun to reproduce JG models in which the robot can weigh every human being with regard to sex, age, social and professional position, intelligence, maturity, social responsibility and so on."

"How would that affect the Three Laws?"

"The Third Law not at all. Even the most valuable robot must destroy himself for the sake of the most useless human being. That cannot be tampered with. The First Law is affected only where alternative actions will all do harm. The quality of the human beings involved as well as the quantity must be considered, provided there is time for such judgment and the basis for it, which will not be often. The Second Law will be most deeply modified, since every potential obedience must involve judgment. The robot will be slower to obey, except where the First Law is also involved, but it will obey more rationally."

"But the judgments which are required are very complicated."

"Very. The necessity of making such judgments slowed the reactions of our first couple of models to the point of paralysis. We improved matters in the later models at the cost of introducing so many pathways that the robot's brain became far too unwieldy. In our last couple of models, however, I think we have what we want. The robot doesn't have to make an instant judgment of the worth of a human being and the value of its orders. It begins by obeying all human beings as any ordinary robot would and then it learns. A robot grows, learns and matures. It is the equivalent of a child at first and must be under constant supervision. As it grows, however, it can, more and more, be allowed, unsupervised, into Earth's society. Finally, it is a full member of that society."

"Surely this answers the objections of those who oppose robots."

"No," said Harriman angrily. "Now they raise others. They will not accept judgments. A robot, they say, has no right to brand this person or that as inferior. By accepting the orders of A in preference to those of B, B is branded as of less consequence than A and his human rights are violated."

"What is the answer to that?"

"There is none. I am giving up."

"I see."

"As far as I myself am concerned....Instead, I turn to you, George."

"To me?" George Ten's voice remained level. There was a mild surprise in it but it did not affect him outwardly. "Why to me?"

"Because you are not a man," said Harriman tensely. "I told you I want robots to be the partners of human beings. I want you to be mine."

George Ten raised his hands and spread them, palms outward, in an oddly human gesture. "What can I do?"

"It seems to you, perhaps, that you can do nothing, George. You were created not long ago, and you are still a child. You were designed to be not overfull of original information--it was why I have had to explain the situation to you in such detail--in order to leave room for growth. But you will grow in mind and you will come to be able to approach the problem from a non-human standpoint. Where I see no solution, you, from your own other standpoint, may see one."

George Ten said, "My brain is man-designed. In what way can it be non-human?"

"You are the latest of the JG models, George. Your brain is the most complicated we have yet designed, in some ways more subtly complicated than that of the old giant Machines. It is open-ended and, starting on a human basis, may--no, will--grow in any direction. Remaining always within the insurmountable boundaries of the Three Laws, you may yet become thoroughly non-human in your thinking."

"Do I know enough about human beings to approach this problem rationally? About their history? Their psychology?"

"Of course not. But you will learn as rapidly as you can."

"Will I have help, Mr. Harriman?"

"No. This is entirely between ourselves. No one else knows of this and you must not mention this project to any human being, either at U. S. Robots or elsewhere."

George Ten said, "Are we doing wrong, Mr. Harriman, that you seek to keep the matter secret?"

"No. But a robot solution will not be accepted, precisely because it is robot in origin. Any suggested solution you have you will turn over to me; and if it seems valuable to me, I will present it. No one will ever know it came from you."

"In the light of what you have said earlier," said George Ten calmly, "this is the correct procedure....When do I start?"

"Right now. I will see to it that you have all the necessary films for scanning."

1a.

Harriman sat alone. In the artificially lit interior of his office, there was no indication that it had grown dark outside. He had no real sense that three hours had passed since he had taken George Ten back to his cubicle and left him there with the first film references.

He was now merely alone with the ghost of Susan Calvin, the brilliant roboticist who had, virtually single-handed, built up the positronic robot from a massive toy to man's most delicate and versatile instrument; so delicate and versatile that man dared not use it, out of envy and fear.

It was over a century now since she had died. The problem of the Frankenstein complex had existed in her time, and she had never solved it. She had never tried to solve it, for there had been no need. Robotics had expanded in her day with the needs of space exploration.

It was the very success of the robots that had lessened man's need for them and had left Harriman, in these latter times-- But would Susan Calvin have turned to robots for help? Surely, she would have-- And he sat there long into the night.

Maxwell Robertson was the majority stockholder of U. S. Robots and in that sense its controller. He was by no means an impressive person in appearance. He was well into middle age, rather pudgy, and had a habit of chewing on the right corner of his lower lip when disturbed.

Yet in his two decades of association with government figures he had developed a way of handling them. He tended to use softness, giving in, smiling, and always managing to gain time.

It was growing harder. Gunnar Eisenmuth was a large reason for its having grown harder. In the series of Global Conservers, whose power had been second only to that of the Global Executive during the past century, Eisenmuth hewed most closely to the harder edge of the gray area of compromise. He was the first Conserver who had not been American by birth and though it could not be demonstrated in any way that the archaic name of U. S. Robots evoked his hostility, everyone at U. S. Robots believed that.

There had been a suggestion, by no means the first that year--or that generation--that the corporate name be changed to World Robots, but Robertson would never allow that. The company had been originally built with American capital, American brains, and American labor, and though the company had long been worldwide in scope and nature, the name would bear witness to its origin as long as he was in control.

Eisenmuth was a tall man whose long sad face was coarsely textured and coarsely featured. He spoke Global with a pronounced American accent, although he had never been in the United States prior to his taking office.

"It seems perfectly clear to me, Mr. Robertson. There is no difficulty. The products of your company are always rented, never sold. If the rented property on the Moon is now no longer needed, it is up to you to receive the products back and transfer them."

"Yes, Conserver, but where? It would be against the law to bring them to Earth without a government permit and that has been denied."

"They would be of no use to you here. You can take them to Mercury or to the asteroids."

"What would we do with them there?"

Eisenmuth shrugged. "The ingenious men of your company will think of something."

Robertson shook his head. "It would represent an enormous loss for the company."

"I'm afraid it would," said Eisenmuth, unmoved. "I understand the company has been in poor financial condition for several years now."

"Largely because of government imposed restrictions, Conserver."

"You must be realistic, Mr. Robertson. You know that the climate of public opinion is increasingly against robots."

"Wrongly so, Conserver."

"But so, nevertheless. It may be wiser to liquidate the company. It is merely a suggestion, of course."

"Your suggestions have force, Conserver. Is it necessary to tell you that our Machines, a century ago, solved the ecological crisis?"

"I'm sure mankind is grateful, but that was a long time ago. We now live in alliance with nature, however uncomfortable that might be at times, and the past is dim."

"You mean what have we done for mankind lately?"

"I suppose I do."

"Surely we can't be expected to liquidate instantaneously; not without enormous losses. We need time."

"How much?"

"How much can you give us?"

"It's not up to me."

Robertson said softly, "We are alone. We need play no games. How much time can you give me?"

Eisenmuth's expression was that of a man retreating into inner calculations. "I think you can count on two years. I'll be frank. The Global government intends to take over the firm and phase it out for you if you don't do it by then yourself, more or less. And unless there is a vast turn in public opinion, which I greatly doubt--" He shook his head.

"Two years, then," said Robertson softly.

2a.

Robertson sat alone. There was no purpose to his thinking and it had degenerated into retrospection. Four generations of Robertsons had headed the firm. None of them was a roboticist. It had been men such as Lanning and Bogert and, most of all, Susan Calvin, who had made U. S. Robots what it was, but surely the four Robertsons had provided the climate that had made it possible for them to do their work.

Without U. S. Robots, the Twenty-first Century would have progressed into deepening disaster. That it didn't was due to the Machines that had for a generation steered mankind through the rapids and shoals of history.

And now for that, he was given two years. What could be done in two years to overcome the insuperable prejudices of mankind? He didn't know.

Harriman had spoken hopefully of new ideas but would go into no details. Just as well, for Robertson would have understood none of it.

But what could Harriman do anyway? What had anyone ever done against man's intense antipathy toward the imitation? Nothing-- Robertson drifted into a half sleep in which no inspiration came.

Harriman said, "You have it all now, George Ten. You have had everything I could think of that is at all applicable to the problem. As far as sheer mass of information is concerned, you have stored more in your memory concerning human beings and their ways, past and present, than I have, or than any human being could have."

"That is very likely."

"Is there anything more that you need, in your own opinion?"

"As far as information is concerned, I find no obvious gaps. There may be matters unimagined at the boundaries. I cannot tell. But that would be true no matter how large a circle of information I took in."

"True. Nor do we have time to take in information forever. Robertson has told me that we only have two years, and a quarter of one of those years has passed. Can you suggest anything?"

"At the moment, Mr. Harriman, nothing. I must weigh the information and for that purpose I could use help."

"From me?"

"No. Most particularly, not from you. You are a human being, of intense qualifications, and whatever you say may have the partial force of an order and may inhibit my considerations. Nor any other human being, for the same reason, especially since you have forbidden me to communicate with any."

"But in that case, George, what help?"

"From another robot, Mr. Harriman."

"What other robot?"

"There are others of the JG series which were constructed. I am the tenth, JG-10."

"The earlier ones were useless, experimental--"

"Mr. Harriman, George Nine exists."

"Well, but what use will he be? He is very much like you except for certain lacks. You are considerably the more versatile of the two."

"I am certain of that," said George Ten. He nodded his head in a grave gesture. "Nevertheless, as soon as I create a line of thought, the mere fact that I have created it commends it to me and I find it difficult to abandon it. If I can, after the development of a line of thought, express it to George Nine, he would consider it without having first created it. He would therefore view it without prior bent. He might see gaps and shortcomings that I might not."

Harriman smiled. "Two heads are better than one, in other words, eh, George?"

"If by that, Mr. Harriman, you mean two individuals with one head apiece, yes."

"Right. Is there anything else you want?"
