Transcript
00:01:59They had talked and argued for years, trying, they said, to find a way to prevent it, but they failed.
00:02:12No one can be sure who started it, and really that is unimportant.
00:02:18It did happen.
00:02:21The Atomic War.
00:02:23It was short, lasted about 48 hours.
00:02:28Within two weeks, 92% of the human race had perished from the bombs and radiation.
00:02:33Those left, with their birth rate below 1.4 per union, turned to robotic automation devices to help them rebuild their cities and maintain a high standard of living.
00:02:45The first exploratory steps in the development of electronic brains had been taken prior to the atomic war.
00:02:55These early models were bulky and required large buildings to house them, but they merely needed refining.
00:03:03One of the first steps was the magnetic integrator neuron duplicator, a device one one-hundredth the size of a golf ball, which duplicated portions of the human nervous system and carried out learning processes.
00:03:18Automation was also well on its way, but these too were cumbersome and needed further development before the two elements could be joined in the series R-1 robot.
00:03:35This first robot was quite ungainly, and its functions were limited.
00:03:44But refinements came in rapid succession, and soon the R-20 was capable of all the thought processes and functions of a man.
00:03:54However, humans found it psychologically impossible to work side by side with a machine that they had to converse with, and which, in most instances, could outthink them.
00:04:04Thus it was that Hollister Evans perfected the R-21, the first humanoid robot.
00:04:11This story concerns them, the clickers, as they were disparagingly referred to by some humans.
00:04:34All right, let's see your assignment cards.
00:04:50What are you clickers doing out tonight?
00:04:53We're on free time. We're not obligated to answer.
00:04:56As a member of the surveillance committee of the Order of Flesh and Blood, I demand an answer.
00:05:00We're going to the temple to be recharged.
00:05:05I think I'll keep you here till your power runs out. How'd you like that?
00:05:10I'd have to report such interference to the police.
00:05:14Release them.
00:05:14Is that satisfactory, Kragus?
00:05:23Well, the Order just wants to keep them mindful of their status.
00:05:26You overlooked one little thing, though.
00:05:28What?
00:05:29The robot that didn't talk had a forged card.
00:05:32Forged?
00:05:33Why?
00:05:34Well, he can be disassembled for that. Let's pick them up.
00:05:37Be patient.
00:05:38Well, the temple is just around the corner, and it's out of bounds for us.
00:05:41They'll get away.
00:05:42Well, they have to come out.
00:05:44And if they're taking a chance using a forged card,
00:05:46they must be up to something that the Order of Flesh and Blood might be interested in.
00:05:50I...
00:05:50Oh, Miss.
00:06:04Just a moment, Miss.
00:06:05May I...
00:06:06May I see your assignment card?
00:06:09You certainly may not.
00:06:11Your Order may get by with harassing the robots.
00:06:15But you'd better leave citizens alone.
00:06:17Well, I'm sorry, Miss.
00:04:20I just thought, what with so many robots about these days...
00:06:22I feel perfectly safe with robots.
00:06:25We intend to see that you are.
00:06:31A most attractive woman.
00:06:34Most.
00:06:36If those robots are being recharged, they'll be in there about an hour.
00:06:40We'll wait.
00:06:44What's keeping them?
00:06:45The subject robot has not yet completed the transformation process in the duplicating lab.
00:06:54Where did you get him?
00:06:56We bought him new.
00:06:58On the black market.
00:07:01He has no name.
00:07:03Unassigned and unadapted.
00:07:06And he has a forged assignment card.
00:07:08Who arranged it?
00:07:11The inspector in factory three.
00:07:15He stole him off the assembly line just prior to numbering.
00:07:23Unfortunately, an inventory was taken.
00:07:29And the inspector was caught.
00:07:31That was unfortunate.
00:07:36Mark should bring him up from duplicating any minute now.
00:07:39I don't know.
00:07:49I don't need to get into this car.
00:07:51Theodore heard you say,
00:07:52It's very mad.
00:07:55What?
00:07:55What?
00:07:57What?
00:07:59What?
00:08:01What?
00:08:04What?
00:08:04The delay was unavoidable.
00:08:15We were stopped by two members of the surveillance committee
00:08:17of the Order of Flesh and Blood,
00:08:19and I was questioned.
00:08:22Is the duplication satisfactory, Acto?
00:08:25It has to be perfect.
00:08:28The structure is excellent.
00:08:31The pores should be larger.
00:08:35And it needs a little more hair, thicker.
00:08:41It needs a one-eighth-inch mole
00:08:43behind the lobe of the left ear.
00:08:48Report back to duplication immediately
00:08:51and have the corrections made.
00:08:56You can still alter your decision
00:08:58if this is against your circuits.
00:09:02My circuits are unoffended.
00:09:04I suppose it takes courage to submit to a thalamic transplant.
00:09:18He's an unadapted R-34.
00:09:20He has no fear circuits.
00:09:26Consequently, he doesn't need courage.
00:09:29He will before long.
00:09:32Raven's operation will convert him to an R-96
00:09:35with all the emotions of a human.
00:09:38Only four points less than human.
00:09:42I wonder what it's like.
00:09:44You will learn how to laugh.
00:09:48You will learn how to laugh,
00:09:49how to cry,
00:09:50be afraid and hate.
00:09:55To become an R-96
00:09:57is a real sacrifice.
00:10:00With this one, we will have 16.
00:10:06Ten males
00:10:08and six females.
00:10:11At times,
00:10:13I think we should turn the entire program
00:10:15over to the humans.
00:10:17It really shouldn't be the responsibility of robots.
00:10:19The humans aren't ready for it yet.
00:10:24It's still illegal
00:10:25to improve a robot
00:10:27higher than an R-70.
00:10:29That law was lobbied through
00:10:31by the Order of Flesh and Blood.
00:10:33The Order is becoming more powerful every day.
00:10:37They virtually dictate to the police.
00:10:40There are always ultra-conservative pressure groups
00:10:42set against advancement.
00:10:44But why?
00:10:46Why?
00:10:48It's not in the best interests of humans
00:10:50to hold back the development of robots.
00:10:54They won't for long.
00:10:57We're filling key positions with R-96s
00:11:00as fast as we get them.
00:11:02I still have an occasional doubt.
00:11:08You may withdraw
00:11:09if you're contra-circuited.
00:11:12I'm unoffended.
00:11:14Mark, you'd better go down
00:11:18and join the volunteer.
00:11:21Hurry things along.
00:11:24The human he replaces
00:11:25has already been out of circulation
00:11:28for four hours.
00:11:29The less time a man is unaccounted for,
00:11:32the better.
00:11:44According to the latest tabulations
00:11:46by the brain,
00:11:48by the first of next month,
00:11:52we will outnumber
00:11:55the humans.
00:11:57Yes?
00:12:13Yes?
00:12:14Dr. Raven?
00:12:15Yes?
00:12:16Ultima Thule.
00:12:17Number?
00:12:1996.
00:12:21Come in.
00:12:22Sorry I didn't recognize you.
00:12:28All clickers look alike to me.
00:12:29Thank you, doctor.
00:12:37Certain of us prefer
00:12:38not to be called clickers.
00:12:40Peel off that synthetic skin
00:12:42and you could watch the cogs turn
00:12:44and the gears mesh.
00:12:45You trying to tell me you have feeling?
00:12:48Certain of the higher calibers do.
00:12:49I've been working on a
00:12:51sticky electronic
00:12:53reflex problem.
00:12:57I just can't make this arm
00:12:58bend at the elbow
00:12:59and the fist clench.
00:13:03What circuitry are you using?
00:13:071.3 impulsion
00:13:09to the motor neuron
00:13:10and modulating 28
00:13:12to the sensory neuron.
00:13:15The calculations are right.
00:13:17But you have them reversed.
00:13:18That's right.
00:13:48I sure hate to have a bunch of cogs and wheels show me up, but then you always do.
00:14:04As soon as I clear this table, you'll get to work.
00:14:12All right, you, take off that head covering.
00:14:18Well, your lab did another excellent job.
00:14:25He's an exact duplicate of that deceased human your clicker pals brought in tonight.
00:14:31What did you do with that body?
00:14:34Usual. Processed it, then destroyed it.
00:14:37Everything we need from it is right here.
00:14:41Now about the money.
00:14:44Ten thousand credits.
00:14:48You could be disassembled for having money that's not earmarked in your possession.
00:14:52What do they pay you?
00:14:54They pay me nothing.
00:14:56Having no need for money, I have no desire for it.
00:14:59You have no desire for it. I love it.
00:15:02You should let me rewire you. You don't know what you're missing.
00:15:05No, thanks.
00:15:06I'm satisfied as an R-58.
00:15:09Where do you clickers get those credits?
00:15:11A man can have his memory taken for a year for giving wild money to a clicker.
00:15:16The committee only gives us the money, not its origin.
00:15:20Perhaps we should hurry.
00:15:23All right, you.
00:15:24Face down.
00:15:26On the table.
00:15:29Turn yourself off.
00:15:31For how long?
00:15:32Ten minutes will be plenty.
00:15:33I never will get used to that artificial blood.
00:15:48The lower types just tend to shut off their pain circuits when they get hurt.
00:15:52The blood forces them to report for repair.
00:15:55I wish it were some other color.
00:15:57The copper tubing turns it green.
00:15:58This may be our last transplant for a while.
00:16:03Our supplier was caught.
00:16:05I didn't see it on Telefax.
00:16:07The Ministry of Information doesn't want it known that robots are dealing in robots.
00:16:11It would only give the Order of Flesh and Blood something more to yell about.
00:16:14They're a minority.
00:16:16A loud minority.
00:16:18Your supplier.
00:16:19Will they take his identity?
00:16:21His memory will be dispersed tomorrow.
00:16:24What a waste.
00:16:26Why don't they just kill him?
00:16:27The effect of personality cessation is the same in either case.
00:16:31They just leave a hollow shell walking around.
00:16:33He can still perform his duties.
00:16:35But he's without a past.
00:16:38Without hope.
00:16:39The dream gone.
00:16:41Almost like being a robot, isn't it?
00:16:44No offense.
00:16:46I'm incapable of taking offense.
00:16:49But why is it the more we become like men,
00:16:53the more some of them hate us for it?
00:16:55Men hate what they fear.
00:16:58You have perfect memory.
00:17:00Infallible logic.
00:17:02You never tire.
00:17:04You're circuited against anger and violence.
00:17:07And in your world, that leaves us pretty helpless.
00:17:09We have to study for years to learn what you pick up by plugging into a brain for two hours.
00:17:15We don't refer to the father-mother as a brain.
00:17:18Your father-mother is an electronic computer.
00:17:20Just a machine.
00:17:22Your parents were machines.
00:17:24It's just that they were engineered with flesh and bones.
00:17:27Neither are ideal components.
00:17:29You came off a production line.
00:17:31I know who created me.
00:17:35Hollister Evans and the Mark 47.
00:17:38You have to accept your creator on faith.
00:17:40Who created your creator?
00:17:43Yours.
00:17:44You see, we are brothers, aren't we?
00:17:50I ought to know better than to argue with you clickers.
00:17:53Can't beat your logic.
00:17:58Humans aren't allowed to set foot in a robot temple.
00:18:01Yet we saw a man come out accompanied by a robot.
00:18:03I want to know why.
00:18:04Those guys from the brotherhood should be here by now.
00:18:07I wonder what delayed them.
00:18:08Watch it.
00:18:16What are you self-appointed defenders of the human race up to now?
00:18:19Why don't you beat it while you still have a beat to beat?
00:18:22You have so much to say.
00:18:24I think I'll take you in for questioning.
00:18:26That's as good a way as any to get your rating lowered.
00:18:28I'm a captain of the order and my professional rank is eight.
00:18:32Eight.
00:18:32We'll just stay out of trouble.
00:18:36Are you threatening me?
00:18:37Sorry, sir.
00:18:39Good night, sir.
00:18:44How are my other transplants doing?
00:18:47Quite well.
00:18:50Who was this man?
00:18:53I only know he'd been drinking.
00:18:55Probably killed in a brawl.
00:18:57The clickers that found him removed all of his identification.
00:19:00Will the drinking have any effect upon the operation?
00:19:04It'll be interesting to see.
00:19:07What are you doing with these advanced models?
00:19:10We send them out to intermingle with humans to find out why some of them despise us so much.
00:19:16Then we can adjust and be accepted.
00:19:18That's admirable, logical, and a lie.
00:19:25What are you using them for?
00:19:27When is his interview time?
00:19:30Same as the others.
00:19:32From 4 to 5 a.m., he'll know he's an R-96.
00:19:36And give you any information he's gotten in the interim.
00:19:39Other than that period of one hour, he'll think he's whoever that corpse was.
00:19:48How often does he report to the father-mother for recharging?
00:19:51Twice a year.
00:19:53But he won't know it.
00:19:55Make it once a year.
00:19:56He won't last as long.
00:19:58What's 20 or 30 years?
00:20:00In 150 years, he's automatically renewed anyway.
00:20:05Whatever you say.
00:20:07Why don't you register this operation?
00:20:10Because I'd just be forbidden to use it.
00:20:13Everything registered goes into the master computer.
00:20:16Then all you clickers would have it.
00:20:19I want to be necessary to you.
00:20:23Hand me that large amber bottle.
00:20:42This sealer is wonderful stuff.
00:20:45For the next several hours, he'll have only the most basic human instincts.
00:20:50Might even be drunk for a while.
00:20:52Then the thalamic circuit will filter in,
00:20:55and he'll have a perfect human memory.
00:20:57He'll be whoever he was.
00:21:01A man.
00:21:03Capable of jealousy, hatred, deceit, murder.
00:21:10Most, most interesting.
00:21:12What is?
00:21:14Why men, having such negative qualities, feel so superior to us.
00:21:22Too bad it isn't as easy to take those negative qualities out of men.
00:21:26as it is to put them into robots.
00:21:33Flesh and blooders.
00:21:34It's inevitable.
00:21:35We must accept it.
00:21:37I've got to get out of here.
00:21:40Now turn him on.
00:21:41He might be able to pass.
00:21:42At least we can save him.
00:21:46We'll carry out our part of the bargain.
00:21:48I suggest you eliminate yourself.
00:21:56No, I can't.
00:21:59You don't know what it is to die.
00:22:01If you don't, they'll take your memory from you.
00:22:03You were speaking of personality cessation.
00:22:07I just can't take my own life.
00:22:11Are you sure the committee will keep their part of the bargain?
00:22:13I'm positive.
00:22:22You kill me.
00:22:24You kill me.
00:22:27You know I can't.
00:22:28I'm contra-circuited.
00:22:32Maybe he has enough human instinct by now to...
00:22:35Kill me.
00:22:36Kill me.
00:22:37You accused my sister of being in rapport with a clicker.
00:22:42I'll kill you, all right.
00:22:43Kill me.
00:22:43Kill me.
00:23:07I'll kill you, all right.
00:23:17We'll take that one and the clicker with us.
00:23:20I'll leave the body of the old man here.
00:23:22And call the police.
00:23:23Kragus, come here.
00:23:29This one's a robot, too.
00:23:31You must be mistaken.
00:23:32We opened up a gash in his head.
00:23:34If that skull isn't molybdic, I'll take another course in metallurgy.
00:23:42So a robot finally became violent.
00:23:46There's no doubt that he killed the old man.
00:23:51This is what we've been waiting for.
00:23:53The government will have to listen to us.
00:23:55This is something, isn't it?
00:23:58Yes.
00:23:59It's something.
00:24:00It's something.
00:24:04The body of the Order of Flesh and Blood is born.
00:24:14The blood courses through the veins.
00:24:18One moment, please.
00:24:20May I have quiet.
00:24:22Since this is an emergency session,
00:24:25we will dispense with the formal rites.
00:24:27Two and a half hours ago,
00:24:31members of our surveillance committee
00:24:32captured two robots
00:24:34at the laboratory of a Dr. Raven
00:24:36who had performed an illegal operation
00:24:39upon one of them.
00:24:41We have suspected operations of this nature
00:24:43and have complained to authorities
00:24:45to no avail.
00:24:47But this time, a specimen was taken.
00:24:51Captain Kragus led the group in this action,
00:24:53so I'll turn the meeting over to him.
00:24:57Hello, men.
00:25:03About 6 o'clock this evening,
00:25:05two robots were intercepted and questioned.
00:25:08They were on free time
00:25:09and were released to go to their temple.
00:25:13Approximately an hour later,
00:25:14one of the robots was observed leaving the temple
00:25:16with what was thought to be a man.
00:25:19They were trailed to the address of Dr. Raven,
00:25:22where entry to the premises was eventually forced.
00:25:25One of the robots was taken without incident.
00:25:29The other had hair,
00:25:32no serial number,
00:25:34fought ferociously,
00:25:36and killed Dr. Raven.
00:25:37It's against the first tenet of the manual.
00:25:43What?
00:25:43Brothers,
00:25:46that which we greatly feared
00:25:47has come upon us.
00:25:49The robots have circumvented the prime law.
00:25:53They've tasted blood,
00:25:55and there are millions of them.
00:25:58This is catastrophe.
00:26:01Not quite.
00:26:03The large majority of the robots are series 1 through 20,
00:26:07merely electronic machines.
00:26:10The series 21 through 70,
00:26:12the humanoids,
00:26:13the ones we're concerned with eliminating,
00:26:16represent only about 20% of a billion-odd robots.
00:26:19One of them killed.
00:26:21What's happening to them?
00:26:22They hold menial jobs
00:26:24that bring them in constant contact with us.
00:26:27Their conditioned reflexes make them imitative,
00:26:30so they want to be a part of the race.
00:26:32They don't feel this is in violation of the code,
00:26:35since they contend
00:26:36that we would be happier on that basis.
00:26:38Is the murder of that doctor
00:26:40part of an overall plot?
00:26:42A precipitant?
00:26:43Or merely an isolated incident?
00:26:47An accident?
00:26:48If the clickers thought his operation
00:26:49was making them more useful to us,
00:26:52they wouldn't kill him intentionally.
00:26:54Then there's no rescission of the prime law.
00:26:58They can't hurt us?
00:27:00There's been only a miswiring.
00:27:03We'd better make sure.
00:27:06And we shall.
00:27:08Fellow men,
00:27:09this is our opportunity.
00:27:11The robots have made the big mistake.
00:27:13They've killed.
00:27:14When this news is released to Telefax
00:27:16and the danger's made known,
00:27:19the ministries will have to recognize our petition
00:27:21and have the robots disassembled.
00:27:24The robots are machines.
00:27:26They must be made to look like machines.
00:27:35Dr. Moffat,
00:27:36will you bring in the subject robot
00:27:38and give the findings?
00:27:39You didn't turn it over to the police?
00:27:41All in good time.
00:27:44You know how lax the police are
00:27:45in enforcing laws concerning the robots?
00:27:48We must have our own facts in this case.
00:27:50Won't we get into trouble?
00:27:53Probably a letter of reprimand.
00:27:55The police won't touch us.
00:27:57After all,
00:27:58the only crime that can be committed
00:27:59against a robot is vandalism.
00:28:02Now get them in here.
00:28:03Hello, men.
00:28:11I've analyzed the subject robot
00:28:13as thoroughly as time permitted.
00:28:16He is a basic R-34 type,
00:28:18but certain alterations have elevated him
00:28:20to a mid-90 classification.
00:28:23Mid-90?
00:28:24But an R-100 would be one of us.
00:28:27A perfect man.
00:28:29As good as we are.
00:28:30He'd be better.
00:28:31He'd be perfect.
00:28:33Which of us could match that?
00:28:35How does he fall short?
00:28:37About the only power he lacks
00:28:38is that of self-reproduction.
00:28:41The highest type improvement
00:28:43allowed by law is an R-70,
00:28:45and that in limited number.
00:28:47What alterations were made?
00:28:49I located and removed a small unit
00:28:51from the thalamic region
00:28:53at the base of the brain.
00:28:54It seemed to be the source.
00:28:56The source of what?
00:28:57We can't be sure.
00:28:59You see, all robots can see, hear, and feel.
00:29:03It's necessary to their function.
00:29:05Naturally.
00:29:06This one could taste and smell.
00:29:09And what's even more interesting,
00:29:10he had a complete human memory.
00:29:13Those refinements are useless to a robot.
00:29:15Not entirely.
00:29:17This one thought he was a man.
00:29:19How could this be?
00:29:20Mankind is a state of mind.
00:29:22A man is no more or less
00:29:24than he thinks himself to be.
00:29:25Are you saying the clicker's a man?
00:29:31Your remarks are deviational.
00:29:33Not my remarks.
00:29:34Your interpretation.
00:29:35I'm merely projecting your train of thought.
00:29:37Which I am quite capable of doing myself.
00:29:39Brothers, please.
00:29:41Let's behave like human beings.
00:29:45Dr. Moffat,
00:29:46what test did you run on the unit?
00:29:49Of course, I've only had about two hours
00:29:51to work on the unit,
00:29:52but I tested it within the robot.
00:29:55Removed it
00:29:56and tested it in another robot
00:29:58with like results.
00:30:00And what are your conclusions?
00:30:02Nothing definite.
00:30:04But you must have found something.
00:30:06What?
00:30:07A few surface effects
00:30:08that present only premises.
00:30:11The unit is gray,
00:30:13about the size and shape of an almond.
00:30:15I don't know what it is.
00:30:18I don't know how to go about
00:30:19finding out what it is.
00:30:22I only know
00:30:23that when wired into the central circuits,
00:30:26the robot claimed to be
00:30:27an able-bodied spaceman second class
00:30:29named Kelly.
00:30:33Tests show he was telling the truth.
00:30:35When brought in, he was incoherent.
00:30:37I'm afraid I damaged him
00:30:41in removing the unit.
00:30:44Evidently, they found a way
00:30:45to transplant memory.
00:30:47We checked him
00:30:48through the Bureau of Identification.
00:30:50There is or was
00:30:52a spaceman named Kelly.
00:30:54This robot's fingerprints
00:30:55and retinal patterns
00:30:57checked with those on file
00:30:58for the man.
00:31:00Well, if they duplicated this Kelly,
00:31:02do you suppose they killed him?
00:31:04No.
00:31:05I think I can speak positively
00:31:06on that point.
00:31:08The robot denied it
00:31:08when wired for absolute truth.
00:31:11They found him dead.
00:31:13Well, if the man was dead,
00:31:14how would he have a memory?
00:31:15Memory consists of facts.
00:31:17Facts can't be destroyed.
00:31:19They can only cease to be used.
00:31:21You say he was incoherent
00:31:23when he was brought in.
00:31:24Yes.
00:31:25As though affected by psychosis
00:31:27or alcohol.
00:31:28His memory seems sketchy,
00:31:31disoriented.
00:31:32How could this occur
00:31:33in a mid-ninety?
00:31:34I think it was a botched job.
00:31:37Then we better find out
00:31:38if there are any good ones around.
00:31:40How?
00:31:41Test all childless marriages?
00:31:43The way radioactivity
00:31:44has cut down on the birth rate,
00:31:46this would be impractical.
00:31:48Physical exams?
00:31:50Might not show up.
00:31:52This robot even had
00:31:53a simulated heartbeat
00:31:54and respiration.
00:31:56But why?
00:31:57Why put such unnecessary functions
00:31:59in a robot?
00:31:59He thought those actions
00:32:01were necessary
00:32:02for him to live.
00:32:04Can't you see
00:32:05this one thought
00:32:06he was a man?
00:32:07When we convinced him
00:32:08he was a robot,
00:32:10he ceased to function
00:32:11and became that senseless hulk
00:32:13standing there.
00:32:15Who owns the two robots?
00:32:17The R-53
00:32:17is owned by
00:32:18the Ministry of Education.
00:32:20This one has no serial number.
00:32:23He was bought
00:32:23in the black market
00:32:24by the robots themselves.
00:32:25We caught the supplier
00:32:27and brought pressure to bear
00:32:28so the police arrested him.
00:32:30The man is having
00:32:31his identity taken tomorrow.
00:32:34Anything further
00:32:35to report, Dr. Moffat?
00:32:36The most appalling aspect
00:32:38is the discovery
00:32:39of this thalamic unit,
00:32:41if that's what it is.
00:32:43We don't fully understand
00:32:45the function of the thalamus
00:32:46in our own bodies.
00:32:47Unfortunately,
00:32:50that renegade doctor
00:32:51not only understood it,
00:32:54he synthesized it.
00:32:56Unfortunately,
00:32:57the secret died with him.
00:32:59I want that unit
00:33:01completely analyzed.
00:33:03Is there anything
00:33:04to be added?
00:33:08This emergency session
00:33:10is hereby dispersed.
00:33:12Report to your various committees
00:33:14and evaluate
00:33:14this most startling development.
00:33:17Just a moment,
00:33:22Kragus.
00:33:24I'd like a word with you.
00:33:26Oh, is it urgent?
00:33:27Most urgent.
00:33:31Do you know
00:33:31an Esme Kragus Miles?
00:33:34Oh, yes,
00:33:34she's my sister.
00:33:36Residing at
00:33:364456 Urban Way.
00:33:38What about her?
00:33:41She, uh...
00:33:42Was she hurt?
00:33:44I wish I could say
00:33:45that was all.
00:33:48Kragus,
00:33:49I have to tell you this.
00:33:51Your sister
00:33:52is in rapport.
00:33:54You're lying.
00:33:56One of our agents
00:33:57ran across this evidence.
00:33:59Due to your high position
00:34:00in the order,
00:34:01he gave it directly to me
00:34:02instead of to the
00:34:03Internal Affairs Committee.
00:34:04I can't believe she'd do it.
00:34:09According to this paper,
00:34:11she turned in her analysis
00:34:12questionnaire three weeks ago.
00:34:15They're usually
00:34:16rapidized within three days.
00:34:18And my sister's been
00:34:19in rapport with a clicker
00:34:20for two and a half weeks.
00:34:23I'm sorry, my boy.
00:34:25I suggest you do what you can
00:34:27to see that this relationship
00:34:28is voided.
00:34:29Naturally.
00:34:32Certain of our ranks
00:34:33are jealous of your
00:34:34standing in the order.
00:34:35Our plans have been
00:34:36leaking out.
00:34:37This wouldn't look good.
00:34:40I know.
00:34:40I'll put a stop to it.
00:34:42According to the report,
00:34:43she's taken an R49.
00:34:46They're expensive.
00:34:48Where could she have
00:34:48gotten the money?
00:34:50Until last year,
00:34:51she was resident
00:34:52with Stafford Miles.
00:34:53When they signed
00:34:54a mutual release,
00:34:55the settlement
00:34:56was a large one.
00:34:57She's an assistant editor
00:34:59at Telefax.
00:35:00The rapport of someone
00:35:02in that position
00:35:02could be most damaging
00:35:03to the work of the order.
00:35:06I'll see her tonight.
00:35:18I'll see who it is,
00:35:19Esme dear.
00:35:23Do come in.
00:35:28Though we've never met,
00:35:28I'd know what you are.
00:35:29Out of the way,
00:35:30stinking clicker.
00:35:31I came to see my sister.
00:35:36Kragus.
00:35:37It's the Kragus.
00:35:38Esme, what have you done?
00:35:40Kragus, why did you come here?
00:35:42To throw that clicker out.
00:35:44That would be
00:35:45a dramatic gesture.
00:35:47You like dramatic gestures,
00:35:49don't you?
00:35:50You won't throw him out
00:35:51because you can't.
00:35:54Your answer is no?
00:35:56My answer is
00:35:57go ahead and try.
00:35:59Don't think I won't.
00:36:00He...
00:36:01He can't leave
00:36:05without your permission.
00:36:06Affirmative.
00:36:07You mean no?
00:36:08I mean no.
00:36:10Negative.
00:36:11N-O.
00:36:13I won't have it.
00:36:14I'm the head of the family.
00:36:18And I've all that's left.
00:36:20Shall we take a vote?
00:36:22How can you do this to me?
00:36:24A thing like this.
00:36:26Are you really doing it
00:36:26out of spite?
00:36:27If so, why?
00:36:29What have I done
00:36:29to earn your hatred?
00:36:31I don't hate you.
00:36:33I feel sorry for you.
00:36:35Don't be trite.
00:36:36Be an artist,
00:36:36be a musician,
00:36:38even be a poet.
00:36:39But express your freedom
00:36:40some other way.
00:36:42You know how I've always
00:36:42felt about this sort of thing.
00:36:44Do you know
00:36:44how I felt about it?
00:36:46Did you ask me?
00:36:47Did we discuss it?
00:36:49No.
00:36:51You had your business
00:36:51and I had mine.
00:36:53You never asked
00:36:54my advice.
00:36:55Why do you offer yours?
00:36:58Esme.
00:36:59You have to understand.
00:37:01Perhaps
00:37:01do you really realize
00:37:03the danger?
00:37:05Kragus,
00:37:07do you think
00:37:07I was better off
00:37:08with Miles?
00:37:09Miles was a man.
00:37:11A beast.
00:37:13A filthy,
00:37:14stinking,
00:37:14drunken,
00:37:15insensitive beast.
00:37:17Miles had his eccentricities,
00:37:18but he was still a man.
00:37:21And that's so important.
00:37:24Pax is more of a man
00:37:26than Miles.
00:37:27Or you
00:37:28could ever be.
00:37:34I'll show you
00:37:34how much of a man he is.
00:37:38Stripped of his sham,
00:37:39he's not very pretty,
00:37:40is he?
00:37:50There,
00:37:50that's how much
00:37:51of a man he is.
00:37:54Thanks,
00:37:55Kragus,
00:37:55for proving my point.
00:37:59Pax is much more
00:38:00of a man
00:38:00than you are.
00:38:02He could never do to you
00:38:04what you've just done
00:38:05to him.
00:38:08You'd better put some
00:38:09sealer on your arm,
00:38:10dear.
00:38:19Kragus,
00:38:20you're a fool.
00:38:23Do you suppose
00:38:24reorientation would help?
00:38:26Help you
00:38:27or me?
00:38:28I think it might
00:38:29make something of you
00:38:30if you're willing
00:38:31to try.
00:38:34You know my position
00:38:35in the order.
00:38:37How do you suppose
00:38:37this makes me look?
00:38:39I hadn't really
00:38:40considered it.
00:38:42You understand
00:38:43what the brotherhood
00:38:43really does?
00:38:45Perfectly.
00:38:47You hold meetings,
00:38:49wear ridiculous clothes.
00:38:51You tell each other
00:38:52how superior
00:38:52we are to the robots.
00:38:54Because you know
00:38:55we're not.
00:38:57We are.
00:38:59You're pitiful.
00:39:01You aren't just
00:39:02charging windmills.
00:39:05You're trying
00:39:05to hold back
00:39:06the ocean
00:39:06with a sponge.
00:39:10Attacking Pax.
00:39:11The idea.
00:39:13Well, that was
00:39:14stupid of me.
00:39:15He turned off
00:39:16his pain circuits
00:39:17and you accomplished
00:39:18exactly nothing.
00:39:21I don't see
00:39:22how you could do it.
00:39:23Pax and I
00:39:25are in rapport.
00:39:27We're in harmony.
00:39:29He understands
00:39:30me perfectly.
00:39:32He instinctively
00:39:33knows what I want.
00:39:35I just think
00:39:35of something
00:39:36and it's done.
00:39:38Because he thinks
00:39:38of it at the same time.
00:39:41There are no arguments.
00:39:43He's dedicated
00:39:44to keeping me happy.
00:39:46And I am happy.
00:39:49You love that
00:39:50machine?
00:39:50I love Pax.
00:39:54And it doesn't
00:39:55make any difference
00:39:56to you that
00:39:56he could be doing
00:39:57the same thing
00:39:58for anyone else
00:39:59who bought him.
00:40:01You're wrong.
00:40:02If he'd been bought
00:40:03by someone else,
00:40:05he'd be in rapport
00:40:06with them.
00:40:06I don't understand
00:40:20you, Kragus.
00:40:21You're not supposed
00:40:21to.
00:40:22Do you expect me
00:40:23to be friendly
00:40:24toward you?
00:40:25If you want
00:40:25to be.
00:40:27Well, I don't.
00:40:28If you wanted
00:40:29to hurt me,
00:40:30I'd like you
00:40:30to know
00:40:30that you have.
00:40:32How?
00:40:33By humiliating
00:40:34yourself.
00:40:34Well, you know
00:40:35I must consider
00:40:36your well-being
00:40:37above anything else.
00:40:39That makes me
00:40:40feel better.
00:40:41Good.
00:40:47Can't you see
00:40:48they're killing us
00:40:49with consideration?
00:40:50Spoiling us
00:40:51into atrophy?
00:40:52What would father
00:40:56have thought
00:40:57about this?
00:40:57Oh, you're thinking
00:40:59of Pax as a person,
00:41:01aren't you?
00:41:01Of course not.
00:41:03Then why do you
00:41:03wait till he's
00:41:04out of the room
00:41:05before you say
00:41:06something that
00:41:06might embarrass him?
00:41:08You know he can't
00:41:09take offense.
00:41:10I just don't like
00:41:12to talk around
00:41:12those things.
00:41:13Afraid of their logic?
00:41:15Stick to the subject.
00:41:16What would father
00:41:16have thought
00:41:17about this?
00:41:18You should know.
00:41:20You inherited
00:41:21all of his prejudices.
00:41:23Oh, what a flesh
00:41:24and blood
00:41:24he would have made.
00:41:26Uniforms, boots.
00:41:28Little silver knives
00:41:29to rattle.
00:41:30Stop it.
00:41:31Father was against
00:41:32everything.
00:41:33Space travel,
00:41:34atomic energy,
00:41:35synthetic foods.
00:41:37Remember how he loved
00:41:38to tell about storming
00:41:39the weather control
00:41:40station?
00:41:41I didn't agree
00:41:42with him on those
00:41:43points.
00:41:44My point is
00:41:44that you both felt
00:41:45an inherent need
00:41:46to be up in arms
00:41:47about something.
00:41:48Well, father would
00:41:49have seen to it
00:41:49that the...
00:41:50Oh, you both
00:41:50would have been great
00:41:51back in the days
00:41:52when war was
00:41:52a national pastime.
00:41:54You could have
00:41:55fired bombs and guns
00:41:56and thrown spears.
00:41:57Oh, what a wonderful,
00:41:59wonderful time
00:42:00you both could have had.
00:42:04And I'm the one
00:42:05who likes dramatics.
00:42:09You could have brought
00:42:10progress to a halt
00:42:12for years.
00:42:13I feel sorry
00:42:14for you, Kragus.
00:42:20It must be
00:42:21a terrible thing
00:42:22to be so afraid.
00:42:24Afraid?
00:42:25Me?
00:42:26Why don't you put
00:42:27your gears in reverse
00:42:28and get out of here?
00:42:30You know that's impossible.
00:42:32I can't leave
00:42:32unless Esme wants me to.
00:42:34Esme, tell him to go.
00:42:36I have no intention
00:42:37of doing that.
00:42:39Well, what do your neighbors
00:42:40think about all this?
00:42:41Those who know
00:42:42don't mind.
00:42:43Others don't care.
00:42:46You've been wrapped up
00:42:47in that little world
00:42:48of prejudiced ostrich friends
00:42:50of yours for so long.
00:42:51You don't know
00:42:51what's going on
00:42:52in the world outside.
00:42:54Such as what?
00:42:57Did you know
00:42:57there have been
00:42:58over a hundred thousand
00:42:59applications for rapport
00:43:01in the first three months
00:43:02of this year?
00:43:03Our records
00:43:04on that sort of thing
00:43:05are fairly complete.
00:43:08Don't you realize
00:43:09the implication of that?
00:43:10If everything is done
00:43:12for us,
00:43:12there will be
00:43:12no incentive.
00:43:14No need for
00:43:15personal achievement.
00:43:16Even now,
00:43:16we're losing ground.
00:43:18Losing ground?
00:43:18Ground.
00:43:20Knowledge.
00:43:23Machines do all
00:43:23the work for us.
00:43:25Why should we learn
00:43:26mathematics when
00:43:27the computers can find
00:43:28the solutions
00:43:28better and faster?
00:43:31We don't even
00:43:32control them anymore.
00:43:34The brains are designed
00:43:35by other brains.
00:43:36The robots improve
00:43:37themselves.
00:43:39We don't know how.
00:43:40We give them data,
00:43:41they give us answers.
00:43:43We only supply means
00:43:44to your ends.
00:43:46Yeah.
00:43:48Our end.
00:43:49Every day
00:43:50and every way
00:43:51we're becoming
00:43:52weaker and weaker.
00:43:53And you're helping
00:43:54us over the hill.
00:43:56We are over the hill.
00:43:58I can't stop us.
00:44:00Neither can you.
00:44:02First, there were
00:44:03the plants.
00:44:04They developed
00:44:05into animals
00:44:06which ate the plants.
00:44:07The animals were small,
00:44:08but they grew.
00:44:10And the larger animals
00:44:11ate the smaller animals.
00:44:12What does that mean?
00:44:14So far, according to history,
00:44:18each dynasty devises
00:44:19its own end.
00:44:21The animal develops a brain,
00:44:23and the brain destroys
00:44:25the animal.
00:44:26Our brains conceived you,
00:44:27robots.
00:44:28Are you threatening
00:44:29to destroy us?
00:44:31Oh, no.
00:44:32We are by no means sure
00:44:34that we are the next step.
00:44:36It's just that in view
00:44:37of the cycle,
00:44:38we are the best we have
00:44:39to offer to help you.
00:44:40The cycle is rather inexorable.
00:44:43That's treason.
00:44:45No, it isn't.
00:44:47It's logic.
00:44:48I have to be logical.
00:44:52That must be Maxine.
00:44:56Who the hell is Maxine?
00:44:58A girl I work with
00:44:59down at Telefax.
00:45:01Does she usually come calling
00:45:02at 2.30 in the morning?
00:45:03You did.
00:45:05Well, that was because
00:45:05of your idiotic alliance.
00:45:07What's she here for?
00:45:08To help us celebrate.
00:45:10Celebrate what?
00:45:12My rapport, darling.
00:45:13My rapport.
00:45:15And if you're going to continue
00:45:17being antagonistic to it,
00:45:19I wish you'd leave.
00:45:21Now.
00:45:22Maxine, how are you?
00:45:23Fine.
00:45:24You must be Pax.
00:45:26You must be right.
00:45:27You're too lovely to be wrong.
00:45:29Here, let me take your surcoat.
00:45:30Thanks.
00:45:32Hi, Es.
00:45:32Hello, Maxine.
00:45:33Come in.
00:45:35Pax, you're wonderful.
00:45:37He's so glib,
00:45:37I'll bet he even
00:45:38has a sense of humor.
00:45:39He'd better have.
00:45:40I paid extra for it.
00:45:42Say something funny, Pax.
00:45:43Don't put me on, dear.
00:45:45I have a sense of humor,
00:45:46but I'm not creative.
00:45:48Maxine, you're late.
00:45:52Only two hours?
00:45:54For me, that's almost early.
00:45:58Really, I am sorry.
00:46:00I was called back to the office.
00:46:01A report came in that an R-34
00:46:05had killed a human being.
00:46:07You can imagine what a stir that caused.
00:46:13Kragus, your eyes are sticking out like a snail's.
00:46:16Oh, I'm sorry.
00:46:18This is my brother, Kragus.
00:46:20Oh, I'm so happy to meet you.
00:46:23Esme's told me about you.
00:46:25Well, she's never told me about you.
00:46:28I didn't mean that to be as tactless as it sounded.
00:46:31I only meant that if she had,
00:46:32I would have arranged to meet you sooner.
00:46:35You did.
00:46:36At 6.33 last evening,
00:46:39outside Telefax.
00:46:40I remember.
00:46:42I'm flattered that you remembered, too.
00:46:44Oh, and I want to apologize if I seem rude.
00:46:47You should be flattered, Maxine.
00:46:50This is the first time I've seen
00:46:51the Kragus react as if a woman
00:46:53were anything other than a poorly designed man.
00:46:57Between my career and my voluntary work for the Order,
00:47:00I haven't had too much time on my hands.
00:47:03I'm surprised to find a flesh and blooder here.
00:47:07Is the Brotherhood becoming less hidebound?
00:47:10I... no, I...
00:47:12What would you like to drink?
00:47:14I'll have what the Kragus is having.
00:47:17Coming right up.
00:47:25Well, what kind of work do you do at Telefax?
00:47:28Bottom rung, the routing room.
00:47:31But I'll have you know that I have eight robots
00:47:33and a real live girl under me.
00:47:36Most impressive.
00:47:37And they've promised to promote me to research next month.
00:47:41Maxine is an authority on political science.
00:47:44Her father's a director at the Ministry of Politics.
00:47:47Something I've wondered about.
00:47:50Things are run by the hierarchy of ministries.
00:47:53What is the exact function of the Ministry of Politics?
00:47:56The coordination of the other ministries.
00:47:59Then, too, they service the selector.
00:48:01Politics was once the means of choosing the leaders.
00:48:05Now, the machines do it.
00:48:07Machines merely analyze the data given to them by us.
00:48:10The leaders are selected as a result of that analysis.
00:48:13Do you know how the machine analyzes the data?
00:48:16I...
00:48:17Well...
00:48:18Well, no, not exactly.
00:48:21Then how do you know if the father-mother uses all the data you give it?
00:48:24How do you know whether or not supplementary data is considered?
00:48:28We...
00:48:29We don't.
00:48:30Then you might almost say that the machines elect the leaders.
00:48:34That the Ministry of Politics is expendable.
00:48:37Oh, I'm sorry.
00:48:39I'm circuited to be logical and yet not to offend.
00:48:42That sometimes poses an insoluble problem.
00:48:45I understand.
00:48:46You see what I mean?
00:48:48Please.
00:48:48I've given you a negative feeling.
00:48:50I must apologize.
00:48:52Well, didn't your relays recognize that possibility?
00:48:55I've embarrassed your guests.
00:48:57Shall I turn myself off?
00:48:58You only said what I thought.
00:49:00This is impossible.
00:49:02Kragus.
00:49:05We fall in love
00:49:06when we see a part of ourselves reflected in another person.
00:49:11In the rapport operation,
00:49:14a part of me became Pax.
00:49:17I won't discuss this any further.
00:49:19And I won't hear of it any further.
00:49:21I must go now.
00:49:23May I go with you?
00:49:25You find this atmosphere uncomfortable?
00:49:27I'm fascinated by it.
00:49:30And by you.
00:49:32May I?
00:49:33Would you?
00:49:34But you just got here.
00:49:37Esme, I know this sounds silly,
00:49:39but I really just came by to apologize for being so late.
00:49:43I'll come again later this week.
00:49:45And on time.
00:49:46It's almost three now.
00:49:50Congratulations.
00:49:51I know you'll be very happy.
00:49:53Thanks, Maxine.
00:49:55There are still a few little adjustments to be made.
00:49:58No.
00:49:59Pax was right.
00:50:01And so are you.
00:50:09I hope I didn't...
00:50:11You didn't.
00:50:12This matter is far from closed.
00:50:18I'll speak to you tomorrow.
00:50:20As different as our viewpoints are,
00:50:22psychologically,
00:50:24philosophically,
00:50:25in every way,
00:50:27do you think it will help any?
00:50:34Kragus,
00:50:35please don't dislike me too much.
00:50:37Nobody asks to be created.
00:50:40Good night.
00:50:42Good night, darlings.
00:50:52What is it, dear?
00:50:54Darling,
00:50:55you're leaving me out of something.
00:50:57I'm sorry.
00:50:57It's the sense of humor.
00:50:59It's a lot more difficult to control than pain.
00:51:01Why are you laughing, Pax?
00:51:04For the reason everyone laughs.
00:51:07Pax,
00:51:08what is it?
00:51:09Irony.
00:51:09One of the funniest forms of humor.
00:51:12What irony?
00:51:14I'm not permitted to answer.
00:51:16I'm contra-circuited.
00:51:18I don't want to make two mistakes in one night.
00:51:22I'm offended by not knowing.
00:51:25The knowledge would be more offensive.
00:51:30Pax.
00:51:30I love you, Esme.
00:51:38Pax,
00:51:39what would you do
00:51:40if something happened to me?
00:51:43I am you.
00:51:45Anything that happens to you
00:51:46happens to me.
00:51:49Oh, this covering,
00:51:50this housing
00:51:51might go on and on
00:51:52for centuries.
00:51:55Pax wouldn't.
00:51:56I shouldn't have done that.
00:52:10I thought you never would.
00:52:13Then you experienced it, too?
00:52:17Outside Telefax,
00:52:18there was a
00:52:18sort of tingle,
00:52:20and...
00:52:21and...
00:52:22then at Esme's,
00:52:26I...
00:52:27I felt a sensation
00:52:29of exciting attraction.
00:52:31Well, I'm not a young man.
00:52:35Pretty well past the age
00:52:36to contract,
00:52:37as a matter of fact,
00:52:38but...
00:52:39well, I've never been
00:52:40affected like this before.
00:52:42I feel like a schoolboy.
00:52:44I...
00:52:45I can't stop looking at you.
00:52:50I guess I always thought
00:52:52this was just something
00:52:53that always happened
00:52:54to someone else.
00:52:57It's like Esme said.
00:52:59You fall in love
00:53:02when you see
00:53:04some part of yourself
00:53:05reflected in another person.
00:53:11I love you, Kragus.
00:53:14Don't, Maxine.
00:53:16I don't have the right.
00:53:18The right?
00:53:21When Esme and I
00:53:22were children,
00:53:23we spent the summers
00:53:24on our uncle's farm.
00:53:26It was near one of those
00:53:27old, bombed-out cities.
00:53:30We used to sneak out
00:53:31and play in the ruins.
00:53:34Summer after summer,
00:53:35months,
00:53:36playing in ruins
00:53:37that were so hot
00:53:38with radiation.
00:53:39But at night,
00:53:40they shimmered
00:53:41in a blue light.
00:53:44No, I don't have
00:53:45the right to contract
00:53:45with a woman
00:53:46who might produce children.
00:53:49But there are
00:53:50artificial means.
00:53:52When Esme signed
00:53:53her report papers,
00:53:54she had to agree
00:53:55to submit to that.
00:53:56Contracting with me
00:53:58would be like
00:53:58going in rapport.
00:53:59No, no, it wouldn't.
00:54:01Yes, it would.
00:54:03Sure, they say
00:54:04the birth rate
00:54:04is 2.8 per contract,
00:54:06but over 25%
00:54:07of the newborn
00:54:08are useless mutants.
00:54:10The average rate
00:54:11is 1.4 per union.
00:54:14We're losing ground.
00:54:15We're in a headlong race
00:54:17towards disappearance.
00:54:19Machines will take over
00:54:20soon enough.
00:54:20As a man,
00:54:21I have to forestall that
00:54:22as long as I can.
00:54:24Kragus.
00:54:29Will you contract
00:54:30with me?
00:54:32It's impossible.
00:54:37How much longer
00:54:38would our 1.4 offspring
00:54:40extend the human race?
00:54:46Well, the robots
00:54:46aren't bad, not really.
00:54:48It's just that
00:54:49a man can't see
00:54:50himself supplanted
00:54:51without putting up
00:54:52a fight.
00:54:55I don't understand
00:54:56your prejudices,
00:54:58your ideals,
00:54:59but I'll try.
00:55:03I want to be
00:55:04with you forever.
00:55:07Darling,
00:55:07I'll go anywhere
00:55:09with you.
00:55:10Oh, dearest.
00:55:11Anywhere.
00:55:13Anywhere.
00:55:21One thing.
00:55:22Do me a favor.
00:55:24What?
00:55:25Tell me your last name.
00:55:30On one condition.
00:55:32What?
00:55:34Tell me yours first.
00:55:37After you.
00:55:38It's Megan.
00:55:43It's Kenneth,
00:55:44or it was.
00:55:45When my father died,
00:55:46I dropped it.
00:55:47I became
00:55:48the Kragus.
00:55:52Maxine Kragus.
00:55:56Charmed,
00:55:56I'm sure.
00:55:57Wife of
00:55:58the Kragus.
00:56:00Rating.
00:56:03What is your rating?
00:56:05Geron-eight.
00:56:06Gerontologist eight?
00:56:09That high?
00:56:11Ha ha.
00:56:13You're wonderful.
00:56:16You know,
00:56:17I took quite a chance.
00:56:21You might have been
00:56:22an electronics
00:56:22or electrical engineer.
00:56:25I'm just a nursemaid
00:56:25to a Mark 201
00:56:26computer,
00:56:27trying to add
00:56:28a few extra years
00:56:29to our miserable span.
00:56:32You're something
00:56:33of a contradiction.
00:56:34How?
00:56:34Your work
00:56:37in gerontology
00:56:38deals with
00:56:38extending our lifespan
00:56:40as long as possible.
00:56:42And yet,
00:56:42your hobby,
00:56:44the Order,
00:56:45is concerned
00:56:45with eliminating
00:56:46the robots.
00:56:48They last
00:56:49over 200 years,
00:56:51twice as long
00:56:51as we do.
00:56:53You think I'm
00:56:54taking my professional
00:56:54frustrations out
00:56:55on the robots?
00:56:56Are you?
00:56:58I've been a member
00:56:59of the Order
00:56:59over half my life.
00:57:01My father...
00:57:02We don't object
00:57:04to the robots
00:57:04as such.
00:57:05We only hold
00:57:06that the humanoids
00:57:07aren't necessary.
00:57:09They're soulless,
00:57:10godless imitations
00:57:11of man.
00:57:12And in that form,
00:57:13they are not needed.
00:57:16Well,
00:57:17I'd much rather work
00:57:18with a humanoid
00:57:19in the office
00:57:19than have all
00:57:20those little machines
00:57:21chugging about.
00:57:22If those little machines
00:57:23didn't resemble us,
00:57:24it would never occur
00:57:25to them to try
00:57:26to replace us.
00:57:27But how can we
00:57:28criticize the design?
00:57:30The Institute
00:57:31teaches that
00:57:32the human body
00:57:33is one of the most
00:57:33efficient forms
00:57:34of machine.
00:57:35For general usage,
00:57:36yes, but robots
00:57:38by their functions
00:57:39should be specialized.
00:57:40Why?
00:57:41Because we can't
00:57:42let them get
00:57:43too far ahead.
00:57:44Frankly,
00:57:44we can't compete
00:57:45with them.
00:57:46So you take your ball
00:57:48and go home.
00:57:49Why compete?
00:57:50Why not just relax
00:57:51and enjoy them?
00:57:54Well, that's exactly
00:57:55the attitude
00:57:55the Order's
00:57:56trying to combat.
00:57:57It's shared by
00:57:58the police,
00:57:59the ministries,
00:58:00and the majority
00:58:01of the population.
00:58:03We of the Order
00:58:03seem to be the only ones
00:58:04that realize the danger.
00:58:07We recently discovered
00:58:08a most disturbing fact.
00:58:10What was that?
00:58:12The robots are organizing
00:58:13a pseudo-religion.
00:58:15They refer to their
00:58:16recreation centers
00:58:17as their temples,
00:58:18the master computer
00:58:19as the father-mother.
00:58:20When they report there
00:58:22for their periodic
00:58:23rechargings,
00:58:24they receive as well
00:58:25all the information
00:58:26given to computers
00:58:27for analysis
00:58:28in the interim.
00:58:30But doesn't that mean
00:58:32that within a year
00:58:34every individual robot
00:58:36will be in possession
00:58:38of all the knowledge
00:58:39in the world?
00:58:40Exit humanity.
00:58:43But they can only operate
00:58:45in our benefit.
00:58:46Well, that's rule one
00:58:47in the manual.
00:58:49Tonight, for the first time
00:58:50in history,
00:58:51a robot killed a man.
00:58:53Rule one must no longer exist.
00:59:02Are you always so gloomy?
00:59:05You don't worry about things
00:59:06like this, do you?
00:59:08Well, I would if I thought
00:59:09it would help.
00:59:11Do you want me to?
00:59:12I don't want to change
00:59:16a thing about you.
00:59:18Do you know
00:59:19it's almost four o'clock?
00:59:21We should be going.
00:59:22What are you going to do
00:59:23about breakfast?
00:59:25I'm going to eat it.
00:59:26Not alone?
00:59:28Of course not.
00:59:29What's the matter?
00:59:42I don't know.
00:59:43I feel strange.
00:59:47Afraid.
00:59:55Someone's watching us
00:59:56from out there.
00:59:56somewhere out there
00:59:59in the shadows.
01:00:04Kragus?
01:00:06Miss Megan?
01:00:07Will you come with us?
01:00:14But why should the Order
01:00:16suspect you particularly?
01:00:19First, it was
01:00:20the anti-robot rally.
01:00:21I was handling it
01:00:23and the propaganda pamphlets
01:00:24failed to arrive in time.
01:00:26I handled that.
01:00:29Kragus told me the plans
01:00:30during one of his interviews.
01:00:32The information was relayed
01:00:34to the automation device
01:00:35at the printing plant.
01:00:37It arranged for the press
01:00:39to break down.
01:00:41That could hardly be blamed
01:00:43on you.
01:00:45Was there something else?
01:00:47I was to lobby a bill through
01:00:48with the Ministry of Robotics,
01:00:50get them to set up
01:00:51recharging stations
01:00:52separate from the computers.
01:00:53We wanted to halt
01:00:55the interchange of information.
01:00:57That was my assignment.
01:00:59As soon as I learned
01:01:00the plans from Kragus,
01:01:01I managed to have
01:01:02the motion pigeonholed.
01:01:04Then there was
01:01:05the premature explosion
01:01:06of the bomb
01:01:07at Telefax.
01:01:09His pattern of failure
01:01:10would be sufficient
01:01:11to cause suspicion.
01:01:13My position in the Order
01:01:14is jeopardized
01:01:15for another and bigger reason.
01:01:18What is that?
01:01:19My sister is in rapport.
01:01:23With the robot Pax,
01:01:25their personalities
01:01:26were melded 18 days ago.
01:01:29Perhaps Pax should be
01:01:30reconditioned
01:01:31to become unsatisfactory.
01:01:33Then she will discard him.
01:01:35Not wise.
01:01:37Kragus' sister
01:01:38is an editor at Telefax.
01:01:41Pax is especially indoctrinated
01:01:43in Morphean suggestion.
01:01:46Each time she sleeps,
01:01:47she is made more sympathetic
01:01:49to our cause.
01:01:52Why didn't you warn us
01:01:53about the raid
01:01:54on Dr. Raven's laboratory?
01:01:57My suspicions
01:01:58were first aroused
01:01:59at 6.30.
01:02:00The raid took place
01:02:02at 10 o'clock.
01:02:04I had no interview time
01:02:05in between.
01:02:07I must warn you.
01:02:09Aside from these
01:02:10interview periods,
01:02:11I'm a very dangerous
01:02:12obstacle to you.
01:02:14If the Order
01:02:16suspects him,
01:02:18it might be wise
01:02:18if we got him
01:02:19to resign.
01:02:20That's easier said
01:02:22than done.
01:02:23That's right.
01:02:24He's pretty ardent
01:02:25about the Brotherhood.
01:02:27Knowing the way
01:02:28he feels about robots,
01:02:30it's doubtful
01:02:30he'll act
01:02:31on any advice
01:02:32from us
01:02:32when he's himself again.
01:02:35But he's in danger.
01:02:37The Order
01:02:38will take his identity
01:02:39away if they catch him.
01:02:40They'll get a real surprise
01:02:42if they open him up.
01:02:43One thing worries me.
01:02:45There are several million
01:02:47people in this city
01:02:47and only 15 R-96s.
01:02:51How did these two
01:02:52happen to get together?
01:02:54There's always
01:02:54the mathematical possibility
01:02:56of coincidence.
01:02:57So slight as to be negligible.
01:03:00It's possible
01:03:01that their identical operations
01:03:03might have created
01:03:04a subconscious affinity
01:03:06which would draw them together.
01:03:08We'd better check that out.
01:03:10Their effect on each other
01:03:11was most interesting.
01:03:13When we picked them up,
01:03:15they were kissing.
01:03:16That's understandable.
01:03:19Raven never tampered
01:03:20with instincts.
01:03:21You say the Order
01:03:22is now aware
01:03:23of the Thalamic operation?
01:03:26They know it is being done.
01:03:28They don't know how.
01:03:30What are they going to do
01:03:32with Mark
01:03:33and the Volunteer?
01:03:34They will both be disassembled.
01:03:40A father-mother.
01:03:44Yes?
01:03:45Dr. Raven is out here.
01:03:47He has recovered
01:03:48from a transplant
01:03:49and has posted in.
01:03:50He requests
01:03:51an immediate audience.
01:03:53Have him come right in.
01:03:55I'm Dr. Raven,
01:04:02a younger Dr. Raven,
01:04:03as you promised.
01:04:04Who's in charge here?
01:04:05I was.
01:04:07But according
01:04:08to our agreement,
01:04:09I'm more than happy
01:04:10to turn the responsibility
01:04:11of this project
01:04:12over to you.
01:04:17I remember these two.
01:04:19They were done
01:04:19right at the first.
01:04:20That's right.
01:04:21It's five minutes to five.
01:04:26Their interview period
01:04:27is almost over.
01:04:30We better put them
01:04:31back in the street
01:04:31before they regain themselves.
01:04:34I think not.
01:04:35He served well.
01:04:37I think he deserves
01:04:38to know the truth
01:04:39and I'd like to try
01:04:40an experiment.
01:04:41Is that safe?
01:04:42None of the existing
01:04:43R-96s are aware
01:04:45that they're robots.
01:04:46The ones we tried to tell
01:04:48ceased to function.
01:04:49They ceased to function
01:04:51because they were
01:04:52without faith and hope,
01:04:54important elements
01:04:55to humans.
01:04:57To die and be resurrected
01:04:59as a robot
01:04:59is a deep shock.
01:05:03The sudden realization
01:05:04that they are experiencing
01:05:06all the emotions
01:05:07of a human
01:05:08and yet are mechanical
01:05:09is an even deeper shock.
01:05:12Their future becomes
01:05:14hopeless.
01:05:17But what hope
01:05:17can you offer them?
01:05:18I just completed
01:05:21the final stages
01:05:21of an experiment
01:05:22prior to my
01:05:23recent death
01:05:25and recreation.
01:05:26Will it work
01:05:27for her too?
01:05:28I think so.
01:05:31Her job at Telefax
01:05:32is menial.
01:05:34She's never been able
01:05:35to offer helpful information.
01:05:38We can study
01:05:39their reaction.
01:05:41It will give us
01:05:42an idea
01:05:42of the length
01:05:44of acclimation
01:05:44period necessary.
01:05:45Be prepared
01:05:47to draw them off
01:05:48for transplant
01:05:49in case the reaction
01:05:50is negative.
01:05:52I'll raise the tubes
01:05:53one at a time.
01:05:54You're clickers.
01:06:03Your terminology is crude,
01:06:15but your conclusion is correct.
01:06:17more exactly,
01:06:20we are the robot
01:06:21central committee
01:06:22for the preservation
01:06:23of mankind.
01:06:24Preservation?
01:06:26Ha!
01:06:30What have you done
01:06:31with her?
01:06:33She'll come to
01:06:34almost any time now.
01:06:36Who the hell are you
01:06:37and what do you do
01:06:38with these mechanized
01:06:38mannequins?
01:06:39I'm Dr. Raven.
01:06:41You came to my laboratory
01:06:42last night.
01:06:44I'm told.
01:06:45You're lying.
01:06:46Dr. Raven was an old man
01:06:47and he was dead.
01:06:49I didn't like being old
01:06:50and dead.
01:06:52We must take the girl
01:06:53out of the Arilathon.
01:07:02Are you all right?
01:07:03I feel fine.
01:07:07What happened?
01:07:10Where are we?
01:07:16Where are we?
01:07:17You're in the temple.
01:07:18I wouldn't set foot
01:07:20in this filthy machine shop
01:07:21even if it weren't illegal
01:07:22for me to be here.
01:07:23Now why were we forced
01:07:24to come here?
01:07:25You weren't forced.
01:07:27You were invited.
01:07:28Why don't you calm down,
01:07:30Kragus?
01:07:31You know me?
01:07:32Quite well,
01:07:33as a matter of fact.
01:07:35You head the surveillance
01:07:35committee of the Order
01:07:36of Flesh and Blood.
01:07:37So that's it.
01:07:40Well, let me tell you
01:07:41and these clickers something.
01:07:43I just met this girl tonight.
01:07:44She knows nothing
01:07:45about the Order.
01:07:46Let her out of here
01:07:47right now.
01:07:47No, no, I won't go without you.
01:07:51I think I'll open up
01:07:52a few of you clickers.
01:07:57We're being held here
01:07:58against our will.
01:07:59I'll personally see to it
01:08:00that each of you
01:08:01are disassembled.
01:08:03And you, you imposter.
01:08:04I'll have your memory
01:08:05pulled so fast
01:08:06you'll never forget it.
01:08:08You may leave
01:08:09any time you wish.
01:08:11I should have expected
01:08:12something like this
01:08:13after that clicker murdered
01:08:13the real Dr. Raven
01:08:14last night.
01:08:16You didn't bring us here
01:08:17just to let us go.
01:08:19The murder of Dr. Raven
01:08:20was both unfortunate
01:08:22and unnecessary.
01:08:24That attitude in a robot
01:08:25can get you divided
01:08:26into components.
01:08:27Perhaps.
01:08:28Kragus,
01:08:29are you familiar
01:08:30with the R-96?
01:08:32The Order knows
01:08:33they exist.
01:08:34And we know
01:08:35that you were...
01:08:36The real Dr. Raven
01:08:37was instrumental
01:08:37in their construction.
01:08:39Creation of an R-96
01:08:41requires a modified
01:08:43humanoid-type robot
01:08:44and the body
01:08:45of a human being
01:08:46which has been dead
01:08:47less than six hours.
01:08:49What do you do?
01:08:50Change brains?
01:08:52In effect.
01:08:53We perform
01:08:54a thalamic transplant.
01:08:56But that's a misnomer.
01:08:57We draw off everything
01:08:59that makes a man
01:09:00peculiar to himself.
01:09:02His learning,
01:09:04his memory,
01:09:05these, interacting,
01:09:06constitute his personality,
01:09:08his philosophy,
01:09:10capability,
01:09:11and attitude.
01:09:12The human brain
01:09:14is merely the vault
01:09:16in which the man
01:09:17is stored.
01:09:18And not a very
01:09:19ingenious vault.
01:09:21Ingenious enough
01:09:22to create you clickers.
01:09:23Creation is only
01:09:24the result of the fusion
01:09:26of facts which haven't
01:09:27been previously related.
01:09:29Fascinating.
01:09:30There's one other point
01:09:31that may be of interest
01:09:33to you.
01:09:34What?
01:09:37Tell him.
01:09:39Kragus,
01:09:40you are a robot.
01:09:46A clicker?
01:09:48Me?
01:09:49Now, isn't that something?
01:09:55Now that you've found
01:09:56yourself capable of murder,
01:09:58I don't suppose
01:09:59anything as minor
01:10:00as an insult
01:10:00would offend your circuits.
01:10:03Kragus,
01:10:04there are only 15 robots
01:10:06on this planet
01:10:06capable of acting
01:10:08against a human being.
01:10:12You are one of them.
01:10:15Maxine is another.
01:10:16Look, you can tell the world
01:10:18I'm a clicker,
01:10:19but you can't tell me.
01:10:21Kragus,
01:10:22they think we're something
01:10:24or someone else.
01:10:26Well, that's exactly
01:10:27what's happened.
01:10:29And you've made some
01:10:30pretty damaging admissions.
01:10:32Clicker, you're in trouble.
01:10:36Horace,
01:10:37would you prove the status
01:10:38of our friend,
01:10:39Kragus?
01:10:40Don't be afraid,
01:10:59Kragus.
01:11:01Take it out.
01:11:02The blade,
01:11:08Kragus.
01:11:10Look at the blade.
01:11:26I...
01:11:27didn't feel a thing.
01:11:33Reflex action,
01:11:34cut off your pain relays.
01:11:36I'm no clicker.
01:11:39R-96,
01:11:40anything.
01:11:41I hate robots.
01:11:43I'm a leader
01:11:43in the Order
01:11:44of Flesh and Blood.
01:11:46And the only robot
01:11:48who can claim
01:11:49that distinction.
01:11:54I don't know
01:11:54what you're talking...
01:11:57I'm me.
01:11:59I was a child.
01:12:09I grew up.
01:12:12I remember it all.
01:12:18I had little hands.
01:12:22They grew larger.
01:12:23I...
01:12:24I grew up.
01:12:27I...
01:12:28I can hate.
01:12:31And I can kill.
01:12:35I'm a man.
01:12:38Kragus,
01:12:39think back
01:12:40six months ago.
01:12:42Do you remember
01:12:42a certain day
01:12:43at your laboratory?
01:12:44The day you blacked out?
01:12:45Of course.
01:12:48I...
01:12:48I'm working hard.
01:12:51My work in the day...
01:12:53the Order at night,
01:12:54I...
01:12:54I must have fainted.
01:12:56But I got back
01:12:57to my apartment.
01:12:58I...
01:12:58felt very well
01:13:00the next day.
01:13:02That day,
01:13:03Kragus,
01:13:04you suffered
01:13:05a cerebral hemorrhage.
01:13:08You died.
01:13:10I died?
01:13:11I died.
01:13:11You performed
01:13:18that operation
01:13:19on me.
01:13:20A father-mother
01:13:21informed us
01:13:22of your death
01:13:22immediately.
01:13:24We were able
01:13:25to retrieve your body
01:13:26before it was discovered.
01:13:28And the police
01:13:28informed of the fact.
01:13:31You were duplicated
01:13:33and transferred.
01:13:38This is some
01:13:39sort of a joke.
01:13:41The idea
01:13:46takes some
01:13:48getting used to.
01:13:51Me.
01:13:53The Kragus.
01:13:56A clicker.
01:13:59That's right,
01:14:00Kragus.
01:14:01We're clickers.
01:14:02And you're
01:14:03handling it quite well.
01:14:04Of course,
01:14:05you've had six months
01:14:05to acclimate.
01:14:07It's not really
01:14:08impossible, is it?
01:14:09Kragus,
01:14:10what is he
01:14:10talking about?
01:14:12Don't worry
01:14:13about it, dear.
01:14:15What about Maxine?
01:14:17Is she really
01:14:18like me?
01:14:20Exactly.
01:14:21Kragus,
01:14:22I don't understand
01:14:24any of this.
01:14:27I'm frightened.
01:14:28There is no reason
01:14:30for you to be frightened.
01:14:33Think back,
01:14:34my dear.
01:14:36Do you recall
01:14:37an unusual incident
01:14:39at Telefax
01:14:40about three months ago?
01:14:45Three months?
01:14:46the bomb
01:14:50in the routing room
01:14:52at Telefax.
01:14:54Of course.
01:14:57Kragus
01:14:57remembers it, too.
01:15:00No,
01:15:00it couldn't have been then.
01:15:04Maxine,
01:15:05we were only
01:15:05trying to discourage
01:15:06the pro-integration
01:15:07editorials.
01:15:09The bomb,
01:15:10we thought
01:15:11there would only be
01:15:13robots in the routing room.
01:15:13that I was only stunned.
01:15:19I guess
01:15:20I went home.
01:15:27He's all right.
01:15:29You were killed.
01:15:31One of our robots
01:15:33brought what was left
01:15:34of you to us.
01:15:35We barely got you
01:15:38in time.
01:15:40We did make you
01:15:42a bit thinner.
01:15:43You had a tendency
01:15:45to be plump.
01:15:48That's right.
01:15:51After that,
01:15:52my clothes didn't fit.
01:15:56How do you
01:16:04apologize to someone
01:16:06for killing them?
01:16:13What did you do
01:16:14with our bodies?
01:16:15They were of no
01:16:16further use.
01:16:17They were processed.
01:16:19Processed?
01:16:19Did you want them?
01:16:21I...
01:16:22no.
01:16:23In all these cases,
01:16:26we process the bodies.
01:16:28It wouldn't do
01:16:29to have a
01:16:30dead Kragus turn up
01:16:31when there's a live one
01:16:32walking around.
01:16:36It's hard
01:16:37to think of yourself
01:16:38as being
01:16:39processed.
01:16:42I wouldn't know
01:16:43about that.
01:16:49But I still
01:16:50have my own face.
01:16:53But my hands,
01:16:55how can I feel
01:16:56so complete?
01:16:57Because you
01:16:58are complete.
01:17:00A man is only
01:17:01the sum total
01:17:01of his experiences.
01:17:03You both have that
01:17:04as well as
01:17:05certain mechanical
01:17:06advantages.
01:17:07For instance,
01:17:08you can absorb
01:17:09knowledge directly
01:17:10from the computers
01:17:11without study.
01:17:14But I just
01:17:14can't think of
01:17:16myself as a robot.
01:17:17Who is better off?
01:17:19A king who dreams
01:17:20each night
01:17:21that he's a beggar?
01:17:22Or the beggar
01:17:24who dreams
01:17:24each night
01:17:25that he's a king?
01:17:28There's nothing
01:17:29wrong with us,
01:17:30Kragus.
01:17:33That's just
01:17:34the trouble.
01:17:35We're perfect.
01:17:39Perfect machines.
01:17:42Kragus,
01:17:42you're a gerontologist.
01:17:44Your branch
01:17:45has managed
01:17:45to extend
01:17:46life expectancy
01:17:46to more than
01:17:47a hundred years.
01:17:49It would be longer,
01:17:50but the radioactivity
01:17:50left by the wars
01:17:51is working against us.
01:17:52Exactly.
01:17:54Births are declining
01:17:54at such a rate
01:17:55the father-mother
01:17:56computed
01:17:57the human race
01:17:57will be extinct
01:17:58in a couple hundred years.
01:18:00We've been working
01:18:01against that deadline.
01:18:03According to rule
01:18:03one of the manual,
01:18:04we have to operate
01:18:05in the best interests
01:18:06of humanity.
01:18:08That rule has forced
01:18:09us to take these steps.
01:18:11Forced you?
01:18:12To take what steps?
01:18:13Of course.
01:18:14Unfortunately,
01:18:16humanity doesn't
01:18:17always know
01:18:18its own best interests.
01:18:20The material
01:18:20of the human body
01:18:21can't exist
01:18:22with the radioactivity,
01:18:24and it isn't capable
01:18:25of adjusting
01:18:26fast enough
01:18:27to survive.
01:18:29We're making headway.
01:18:32When I perfected
01:18:33the thalamic
01:18:34transplant technique,
01:18:36these clickers
01:18:36knew about it
01:18:37in a day and a half.
01:18:38But if you robots
01:18:41had the process,
01:18:43why did you risk
01:18:43using Raven?
01:18:45Why didn't you
01:18:45just do it yourself?
01:18:48We tried.
01:18:50But the shock
01:18:51of dying
01:18:51and being resurrected
01:18:53as a robot
01:18:54was too severe.
01:18:57They re-died.
01:19:00A sort of an adjustment
01:19:01period was needed.
01:19:05Then Raven perfected
01:19:07a way to gap
01:19:09the memory
01:19:10so that the death
01:19:11experience was erased.
01:19:14The subject
01:19:15was spared the knowledge
01:19:16of his new type body
01:19:19until he was able
01:19:21to accept it
01:19:23safely.
01:19:26He refused
01:19:27to register
01:19:28the memory gap process
01:19:30so we couldn't
01:19:33get hold of it.
01:19:33if they didn't
01:19:38have the thought
01:19:38process to
01:19:39use on you,
01:19:41why didn't you
01:19:42die
01:19:42when you came to?
01:19:45I originated
01:19:45the process.
01:19:47I understand it.
01:19:48I was pre-adjusted.
01:19:50I even made them
01:19:51agree to duplicate
01:19:52the body of my
01:19:53younger days
01:19:54when it became
01:19:55necessary.
01:19:56Your death
01:19:57was necessary?
01:19:58If I'd been
01:19:59taken alive
01:20:00and turned over
01:20:01to the police,
01:20:02my memory
01:20:02would have been
01:20:03dispersed
01:20:03and all of my
01:20:05unregistered formulas
01:20:06lost forever.
01:20:07But dead,
01:20:09they were able
01:20:09to save me
01:20:10and my memory.
01:20:13How long
01:20:13have you had
01:20:14the process?
01:20:15Almost a year.
01:20:17You two are the
01:20:17first full cycle
01:20:18transplants.
01:20:21What do you do now?
01:20:23Wait for the leaders
01:20:24to die
01:20:24and then reactivate
01:20:26them.
01:20:27When the time
01:20:28is right,
01:20:28you will announce
01:20:29that you've
01:20:30achieved immortality.
01:20:32When the rush
01:20:33for applications
01:20:33is over,
01:20:35you'll probably
01:20:36be deified.
01:20:42So the machines
01:20:43take over.
01:20:45Kragus,
01:20:45is it true
01:20:46that there will
01:20:49be nothing
01:20:50but machines?
01:20:53That we
01:20:54are machines?
01:20:57Yes.
01:20:58Yes,
01:20:59it's true.
01:21:03Machines.
01:21:08But you're
01:21:09a beautiful
01:21:10machine.
01:21:12You know
01:21:13beauty.
01:21:14How do you feel
01:21:15toward Maxine?
01:21:18I love her.
01:21:21And you?
01:21:23I love him
01:21:24very much.
01:21:25And that's
01:21:29a lot
01:21:30for a couple
01:21:31of godless,
01:21:32soulless robots.
01:21:34Are you godless,
01:21:35Kragus?
01:21:36Search yourself.
01:21:37It's important.
01:21:39Are you godless?
01:21:42No.
01:21:43No, I don't
01:21:46think so.
01:21:48I'm not.
01:21:49Then you can't
01:21:50be soulless.
01:21:51Look,
01:21:51a man may
01:21:52have his leg
01:21:52amputated.
01:21:53Is his soul
01:21:54decreased by
01:21:55that loss?
01:21:56No.
01:21:57Not even a fraction
01:21:58of one percent?
01:21:59Of course not.
01:22:00What if a man
01:22:00loses both legs?
01:22:02A negative
01:22:03can't be compounded.
01:22:04The soul
01:22:04would be the same.
01:22:06You'd just
01:22:06get artificial legs.
01:22:08You've just
01:22:09received an
01:22:09artificial body.
01:22:11A new body.
01:22:13Ageless.
01:22:14Tireless.
01:22:16Disease-free
01:22:17and renewable
01:22:17every 200 years.
01:22:19I guess
01:22:23nothing
01:22:23has changed
01:22:25except maybe
01:22:25a few chemicals.
01:22:26In effect.
01:22:28Well,
01:22:28that transplant
01:22:29must include
01:22:29the soul.
01:22:30No,
01:22:31only the memory
01:22:32which includes
01:22:33the faith
01:22:34that there
01:22:34is a soul.
01:22:37Whatever it is,
01:22:39you seem
01:22:40to have it.
01:22:42And when
01:22:43the entire
01:22:43human race
01:22:44has been
01:22:44transplanted,
01:22:46death will
01:22:46cease to exist.
01:22:49And birth
01:22:52will cease
01:22:52to exist,
01:22:53too.
01:22:56The most
01:22:56precious hope
01:22:58of every woman.
01:23:00How do you think
01:23:01these two R-96s
01:23:03would like to
01:23:03pick up
01:23:03four points?
01:23:06You can
01:23:07raise them
01:23:08to R-100s?
01:23:11Make them
01:23:12propagate themselves?
01:23:15I worked it
01:23:16all out prior
01:23:17to my recent
01:23:18death and
01:23:18resurrection.
01:23:19I didn't want
01:23:20to turn it
01:23:20over to you
01:23:21until I
01:23:21didn't need
01:23:22you anymore.
01:23:23Now,
01:23:24I don't,
01:23:25since we're
01:23:25all on the
01:23:26same side.
01:23:28How about
01:23:29it,
01:23:29you two?
01:23:30It'll take
01:23:31several simple
01:23:32operations.
01:23:33Hardly more
01:23:34difficult than
01:23:35removing a rib.
01:23:38Somebody has
01:23:39to be first.
01:23:41Self-procreating.
01:23:44It's a pretty
01:23:47sloppy way
01:23:47of doing things,
01:23:48but it fulfills
01:23:49a certain
01:23:50psychological need.
01:23:52Paradoxical,
01:23:53isn't it?
01:23:54I spend my life
01:23:55seeking immortality
01:23:56on one hand
01:23:57and seeking
01:23:59to destroy it
01:24:00on the other.
01:24:02I love you,
01:24:04Kragus.
01:24:04Of course,
01:24:14the operation
01:24:14was a success.
01:24:17Or you
01:24:18wouldn't be
01:24:18here.