The Creation of the Humanoids (1962, Remastered)
Transcript
00:01:55They had talked and argued for years,
00:01:57trying, they said, to find a way to prevent it.
00:02:00But they failed.
00:02:02No one can be sure who started it,
00:02:04and really that is unimportant.
00:02:08It did happen.
00:02:10The atomic war.
00:02:12It was short, lasted about 48 hours.
00:02:17Within two weeks, 92% of the human race had perished from bomb and radiation.
00:02:24Those left, with their birth rate below 1.4 per union,
00:02:29turned to robotic automation devices to help them rebuild their cities
00:02:33and maintain a high standard of living.
00:02:36The first exploratory steps in the development of electronic brains
00:02:41had been taken prior to the atomic war.
00:02:44These early models were bulky
00:02:46and required large buildings to house them.
00:02:50But they merely needed refining.
00:02:54One of the first steps was the magnetic integrator neuron duplicator.
00:03:00A device one one-hundredth the size of a golf ball
00:03:03which duplicated portions of the human nervous system
00:03:06and carried out learning processes.
00:03:11Automation was also well on its way.
00:03:15But these, too, were cumbersome
00:03:18and needed further development
00:03:20before the two elements could be joined in the series R-1 robot.
00:03:28This first robot was quite ungainly,
00:03:31and its functions were limited.
00:03:33But refinements came in rapid succession,
00:03:36and soon the R-20 was capable
00:03:39of all the thought processes and functions of a man.
00:03:43However, humans found it psychologically impossible
00:03:47to work side by side with a machine
00:03:49that they had to converse with
00:03:51and which, in most instances, could outthink them.
00:03:55Thus it was that Hollister Evans perfected the R-21,
00:03:58the first humanoid robot.
00:04:00This story concerns them, the clickers,
00:04:04as they were disparagingly referred to by some humans.
00:04:29All right, let's see your assignment cards.
00:04:40What are you clickers doing out tonight?
00:04:43We're on free time.
00:04:44We're not obligated to answer.
00:04:46As a member of the surveillance committee
00:04:48of the Order of Flesh and Blood,
00:04:49I demand an answer.
00:04:52We're going to the temple to be recharged.
00:04:55I think I'll keep you here till your power runs out.
00:04:58How'd you like that?
00:04:59I'd have to report such interference to the police.
00:05:03Release them.
00:05:11Was that satisfactory, Kragus?
00:05:13Well, the Order just wants to keep them mindful of their status.
00:05:16You overlooked one little thing, though.
00:05:17What?
00:05:18The robot that didn't talk had a forged card.
00:05:22Forged?
00:05:23Why?
00:05:24Well, he can be disassembled for that.
00:05:25Let's pick them up.
00:05:26Be patient.
00:05:27Well, the temple is just around the corner,
00:05:29and it's out of bounds for us.
00:05:31They'll get away.
00:05:32Well, they have to come out.
00:05:34And if they're taking a chance using a forged card,
00:05:36they must be up to something
00:05:37that the Order of Flesh and Blood might be interested in.
00:05:40I...
00:05:52Oh, Miss.
00:05:54Well, just a moment, Miss.
00:05:55May I...
00:05:56May I see your assignment card?
00:05:59You certainly may not.
00:06:01Your Order may get by with harassing the robots.
00:06:05But you had better leave citizens alone.
00:06:08I'm sorry, Miss.
00:06:09I just thought,
00:06:10with so many robots about...
00:06:12I feel perfectly safe with robots.
00:06:14We intend to see that you are.
00:06:21A most attractive woman.
00:06:23Most?
00:06:26If those robots are being recharged,
00:06:28they'll be in there about an hour.
00:06:30We'll wait.
00:06:34What's keeping them?
00:06:37The subject robot has not yet completed
00:06:39the transformation process in the duplicating lab.
00:06:43Where did you get him?
00:06:46We bought him new.
00:06:48On the black market.
00:06:50He has no name.
00:06:53Unassigned and unadapted.
00:06:56And he has a forged assignment card.
00:06:59Who arranged it?
00:07:01The inspector in Factory 3.
00:07:04He stole him off the assembly line
00:07:06just prior to numbering.
00:07:12Unfortunately,
00:07:14an inventory was taken.
00:07:19And the inspector was caught.
00:07:22That was unfortunate.
00:07:25Mark should bring him up
00:07:27from duplicating any minute now.
00:08:03The delay was unavoidable.
00:08:05We were stopped by two members
00:08:06of the surveillance committee
00:08:07of the Order of Flesh and Blood.
00:08:09And I was questioned.
00:08:11Is the duplication satisfactory, Acto?
00:08:15It has to be perfect.
00:08:17The structure is excellent.
00:08:21The pores should be larger.
00:08:25And he needs a little more hair.
00:08:28Thicker.
00:08:30He needs a one-eighth inch mole
00:08:34behind the lobe of the left ear.
00:08:38Report back to duplication immediately
00:08:40and have the corrections made.
00:08:45You can still alter your decision
00:08:49if this is against your circuits.
00:08:52My circuits are unoffended.
00:09:02I suppose it takes courage
00:09:04to submit to a thalamic transplant.
00:09:08Not an unadapted R-34.
00:09:11He has no fear circuits.
00:09:15Consequently, he doesn't need courage.
00:09:18He will before long.
00:09:22Raven's operation will convert him
00:09:24to an R-96
00:09:25with all the emotions of a human.
00:09:28Only four points less than human.
00:09:32I wonder what it's like.
00:09:36You will learn how to laugh,
00:09:39how to cry,
00:09:41be afraid and hate.
00:09:45To become an R-96
00:09:48is a real sacrifice.
00:09:52With this one, we will have 16,
00:09:5610 males
00:09:58and six females.
00:10:01At times,
00:10:03I think we should turn
00:10:04the entire program
00:10:05over to the humans.
00:10:06It really shouldn't be
00:10:08the responsibility of robots.
00:10:11The humans aren't ready for it yet.
00:10:14It's still illegal
00:10:15to improve a robot
00:10:17higher than an R-70.
00:10:19That law was lobbied through
00:10:21by the Order of Flesh and Blood.
00:10:23The Order is becoming
00:10:24more powerful every day.
00:10:26They virtually dictate
00:10:28to the police.
00:10:29There are always
00:10:30ultra-conservative pressure groups
00:10:32set against advancement.
00:10:35But why?
00:10:36Why?
00:10:37It's not in the best interests
00:10:39of humans
00:10:40to hold back
00:10:41the development of robots.
00:10:43They won't for long.
00:10:47We're filling key positions
00:10:49with R-96s
00:10:50as fast as we get them.
00:10:51I still have
00:10:53an occasional doubt.
00:10:57You may withdraw
00:10:59if you're
00:11:00contra-circuited.
00:11:02I'm unoffended.
00:11:06Mark,
00:11:07you better go down
00:11:08and join the volunteer.
00:11:10Hurry things along.
00:11:13The human he replaces
00:11:16has already been out
00:11:17of circulation
00:11:17for four hours.
00:11:19The less time a man
00:11:20is unaccounted for,
00:11:22the better.
00:11:34According to the latest
00:11:35tabulations by the brain,
00:11:38by the first of next month,
00:11:43we will outnumber
00:11:45the humans.
00:12:02Yes?
00:12:03Dr. Raven?
00:12:05Yes?
00:12:06Ultima Thule.
00:12:07Number?
00:12:0896.
00:12:11Come in.
00:12:13Sorry I didn't recognize you.
00:12:18All clickers look alike to me.
00:12:25Thank you, doctor.
00:12:27Certain of us prefer
00:12:28not to be called clickers.
00:12:30Peel off that synthetic skin
00:12:32and you could watch
00:12:33the cogs turn
00:12:34and the gears mesh.
00:12:35You trying to tell me
00:12:36you have feeling?
00:12:38Certain of the higher calibers do.
00:12:39I've been working
00:12:40on a sticky electronic
00:12:43reflex problem.
00:12:46I just can't make
00:12:48this arm bend
00:12:49at the elbow
00:12:49and the fist clench.
00:12:52What circuitry
00:12:54are you using?
00:12:561.3 impulsion
00:12:58to the motor neuron
00:13:00and modulating 28
00:13:02to the sensory neuron.
00:13:04The calculations are right,
00:13:06but you have them reversed.
00:13:45I sure hate to have a bunch of cogs and wheels show me up.
00:13:49But then you always do.
00:13:53As soon as I clear this table, you'll get to work.
00:14:02All right, you. Take off that head covering.
00:14:12Your lab did another excellent job.
00:14:15He's an exact duplicate of that deceased human your clicker pals brought in tonight.
00:14:21What did you do with that body?
00:14:23The usual. Processed it, then destroyed it.
00:14:27Everything we need from it is right here.
00:14:31Now about the money.
00:14:34Ten thousand credits.
00:14:38You could be disassembled for having money that's not earmarked in your possession.
00:14:42What do they pay you?
00:14:44They pay me nothing.
00:14:46Having no need for money. I have no desire for it.
00:14:49You have no desire for it. I love it.
00:14:52You should let me rewire you. You don't know what you're missing.
00:14:55No, thanks. I'm satisfied as an R-58.
00:14:59Where do you clickers get those credits?
00:15:01A man can have his memory taken for a year for giving wild money to a clicker.
00:15:05The committee only gives us the money, not its origin.
00:15:09Perhaps we should hurry.
00:15:13All right, you. Face down. On the table.
00:15:18Turn yourself off.
00:15:20For how long?
00:15:22Ten minutes will be plenty.
00:15:35I never will get used to that artificial blood.
00:15:38The lower types just tend to shut off their pain circuits when they get hurt.
00:15:42The blood forces them to report for repair.
00:15:44I wish it were some other color.
00:15:47The copper tubing turns it green.
00:15:50This may be our last transplant for a while.
00:15:53Our supplier was caught.
00:15:54I didn't see it on Telefax.
00:15:56The Ministry of Information doesn't want it known that robots are dealing in robots.
00:16:01It would only give the Flesh and Blood Order something more to yell about.
00:16:04They're a minority.
00:16:06A loud minority.
00:16:08Your supplier. Will they take his identity?
00:16:11His memory will be dispersed tomorrow.
00:16:14What a waste.
00:16:15Why don't they just kill him?
00:16:17The effect of personality cessation is the same in either case.
00:16:21They just leave a hollow shell walking around.
00:16:23He can still perform his duties.
00:16:25But he's without a past.
00:16:27Without hope.
00:16:29The dream gone.
00:16:31Almost like being a robot, isn't it?
00:16:34No offense.
00:16:35I'm incapable of taking offense.
00:16:39But why is it the more we become like men,
00:16:43the more some of them hate us for it?
00:16:45Men hate what they fear.
00:16:48You have perfect memory.
00:16:50Infallible logic.
00:16:51You never tire.
00:16:53You're circuited against anger and violence.
00:16:56And in your world that leaves us pretty helpless.
00:16:59We have to study for years to learn what you pick up by plugging into a brain for two hours.
00:17:05We don't refer to the father-mother as a brain.
00:17:08Your father-mother is an electronic computer.
00:17:10Just a machine.
00:17:12Your parents were machines.
00:17:14It's just that they were engineered with flesh and bones.
00:17:17Neither are ideal components.
00:17:19You came off a production line.
00:17:22I know who created me.
00:17:25Hollister Evans and the Mark 47.
00:17:27You have to accept your creator on faith.
00:17:30Who created your creator?
00:17:33Yours.
00:17:35You see, we are brothers.
00:17:39Aren't we?
00:17:40I ought to know better than to argue with you clickers.
00:17:43Can't beat your logic.
00:17:48Humans aren't allowed to set foot in a robot temple.
00:17:50Yet we saw a man come out accompanied by a robot.
00:17:53I want to know why.
00:17:54Those guys from the Brotherhood should be here by now.
00:17:56I wonder what delayed them.
00:17:58Watch it.
00:18:06What are you self-appointed defenders of the human race up to now?
00:18:09Why don't you beat it while you still have a beat to beat?
00:18:12You have so much to say, I think I'll take you in for questioning.
00:18:15That's as good a way as any to get your rating lowered.
00:18:18I'm a captain of the order and my professional rank is eight.
00:18:22Eight?
00:18:23Well, just stay out of trouble.
00:18:25Are you threatening me?
00:18:27Sorry, sir.
00:18:29Good night, sir.
00:18:34How are my other transplants doing?
00:18:37Quite well.
00:18:40Who was this man?
00:18:42I only know he'd been drinking.
00:18:44Probably killed in a brawl.
00:18:47The clickers that found him removed all of his identification.
00:18:51Will the drinking have any effect upon the operation?
00:18:54It'll be interesting to see.
00:18:57What are you doing with these advanced models?
00:19:00We send them out to intermingle with humans.
00:19:02To find out why some of them despise us so much.
00:19:06Then we can adjust and be accepted.
00:19:11That's admirable, logical, and a lie.
00:19:14What are you using them for?
00:19:17When is his interview time?
00:19:19Same as the others.
00:19:22From 4 to 5 a.m., he'll know he's an R-96.
00:19:26And give you any information he's gotten in the interim.
00:19:31Other than that period of one hour, he'll think he's whoever that corpse was.
00:19:38How often does he report to the father-mother for recharging?
00:19:41Twice a year.
00:19:43But he won't know it.
00:19:44Make it once a year.
00:19:46He won't last as long.
00:19:48What's 20 or 30 years?
00:19:50In 150 years, he's automatically renewed anyway.
00:19:55Whatever you say.
00:19:57Why don't you register this operation?
00:20:00Because I'd just be forbidden to use it.
00:20:03Everything registered goes into the master computer.
00:20:06Then all of you clickers would have it.
00:20:09I want to be necessary to you.
00:20:13Hand me that large amber bottle.
00:20:32This sealer is wonderful stuff.
00:20:35For the next several hours, he'll have only the most basic human instincts.
00:20:39Might even be drunk for a while.
00:20:42Then the thalamic circuit will filter in, and he'll have a perfect human memory.
00:20:47He'll be whoever he was.
00:20:51A man capable of jealousy, hatred, deceit, murder.
00:21:00Most, most interesting.
00:21:02What is?
00:21:04Why men, having such negative qualities, feel so superior to us.
00:21:11Too bad it isn't as easy to take those negative qualities out of men as it is to put them
00:21:18into robots.
00:21:23What is this?
00:21:24What is this?
00:21:24It's inevitable.
00:21:25We must accept it.
00:21:26You've got to get out of here.
00:21:29I'll turn him on.
00:21:31He might be able to pass.
00:21:32At least we can save him.
00:21:36We'll carry out our part of the bargain.
00:21:38I suggest you eliminate yourself.
00:21:39No.
00:21:46No, I can't.
00:21:48You don't know what it is to die.
00:21:50If you don't, they'll take your memory from you.
00:21:53You were speaking of personality cessation.
00:21:57I just can't take my own life.
00:22:01Are you sure the committee will keep their part of the bargain?
00:22:03I'm positive.
00:22:11You kill me!
00:22:13You!
00:22:15Kill me!
00:22:16You know I can't.
00:22:18I'm contra circuited.
00:22:22Maybe he has enough human instinct by now to...
00:22:25Kill me!
00:22:26Kill me!
00:22:28...accused my sister of being in rapport with a clicker.
00:22:31I'll kill you all right.
00:23:07I will take that one and the clicker with us.
00:23:09I'll leave the body of the old man here.
00:23:11And call the police.
00:23:13Cragus, come here.
00:23:18This one's a robot, too.
00:23:20You must be mistaken.
00:23:22We opened up a gash in his head.
00:23:24If that skull isn't molybdic, I'll take another course in metallurgy.
00:23:31So a robot finally became violent.
00:23:35There's no doubt that he killed the old man.
00:23:41This is what we've been waiting for.
00:23:43The government will have to listen to us.
00:23:45This is something, isn't it?
00:23:48Yes.
00:23:49It's something.
00:24:00The body of the Order of Flesh and Blood is one.
00:24:03The blood courses through the veins.
00:24:07One moment, please.
00:24:10May I have quiet?
00:24:12Since this is an emergency session, we will dispense with the formal rites.
00:24:19Two and a half hours ago, members of our surveillance committee captured two robots
00:24:24at the laboratory of a Dr. Raven,
00:24:27who had performed an illegal operation upon one of them.
00:24:30We have suspected operations of this nature
00:24:33and have complained to authorities to no avail.
00:24:37But this time, a specimen was taken.
00:24:40Captain Cragus led the group in this action, so I'll turn the meeting over to him.
00:24:51Hello, men.
00:24:53About six o'clock this evening, two robots were intercepted and questioned.
00:24:57They were on free time and were released to go to their temple.
00:25:02Approximately an hour later, one of the robots was observed leaving the temple
00:25:06with what was thought to be a man.
00:25:09They were trailed to the address of Dr. Raven,
00:25:12where entry to the premises was eventually forced.
00:25:15One of the robots was taken without incident.
00:25:18The other had hair, no serial number, fought us viciously, and killed Dr. Raven.
00:25:30It's against the first tenet of the manual.
00:25:33What?
00:25:34Brothers, that which we greatly feared has come upon us.
00:25:39The robots have circumvented the prime law.
00:25:43They've tasted blood.
00:25:45And there are millions of them.
00:25:48This is catastrophe.
00:25:51Not quite.
00:25:53The large majority of the robots are series 1 through 20,
00:25:57merely electronic machines.
00:26:00The series 21 through 70, the humanoids,
00:26:03the ones we're concerned with eliminating,
00:26:05represent only about 20% of a billion-odd robots.
00:26:09One of them killed. What's happening to them?
00:26:12They hold menial jobs that bring them in constant contact with us.
00:26:17Their conditioned reflexes make them imitative,
00:26:19so they want to be a part of the race.
00:26:22They don't feel this is in violation of the code,
00:26:25since they contend that we would be happier on that basis.
00:26:28Is the murder of that doctor part of an overall plot?
00:26:32A precipitant?
00:26:34Or merely an isolated incident?
00:26:36An accident?
00:26:37If the clickers thought his operation was making them more useful to us,
00:26:42they wouldn't kill him intentionally.
00:26:44Then there's no rescission of the prime law.
00:26:48They can't hurt us?
00:26:50There's been only a miswiring.
00:26:53We'd better make sure.
00:26:55And we shall.
00:26:57Fellow men, this is our opportunity.
00:27:00The robots have made the big mistake.
00:27:03They've killed.
00:27:04When this news is released to Telefax,
00:27:06and the danger's made known,
00:27:08the ministries will have to recognize our petition
00:27:10and have the robots disassembled.
00:27:13The robots are machines.
00:27:16They must be made to look like machines.
00:27:25Dr. Moffat, will you bring in the subject robot and give the findings?
00:27:29You didn't turn it over to the police?
00:27:31All in good time.
00:27:33You know how lax the police are in enforcing laws concerning the robots?
00:27:38We must have our own facts in this case.
00:27:40Won't we get into trouble?
00:27:42Probably a letter of reprimand.
00:27:44The police won't touch us.
00:27:47After all, the only crime that can be committed against a robot is vandalism.
00:27:52Now get them in here.
00:27:59Hello, men.
00:28:00I've analyzed the subject robot as thoroughly as time permitted.
00:28:05He is a basic R-34 type,
00:28:08but certain alterations have elevated him to a mid-90 classification.
00:28:13Mid-90?
00:28:14But an R-100 would be one of us.
00:28:17A perfect man.
00:28:18As good as we are.
00:28:20He'd be better.
00:28:21He'd be perfect.
00:28:22Which of us could match that?
00:28:25How does he fall short?
00:28:27About the only power he lacks is that of self-reproduction.
00:28:31The highest type improvement allowed by law is an R-70,
00:28:35and that in limited number.
00:28:37What alterations were made?
00:28:39I located and removed a small unit from the thalamic region at the base of the brain.
00:28:44It seemed to be the source.
00:28:46The source of what?
00:28:47We can't be sure.
00:28:48You see, all robots can see, hear, and feel.
00:28:52It's necessary to their function.
00:28:55Naturally.
00:28:56This one could taste and smell.
00:28:59And what's even more interesting, he had a complete human memory.
00:29:03Those refinements are useless to a robot.
00:29:05Not entirely.
00:29:07This one thought he was a man.
00:29:08How could this be?
00:29:10Mankind is a state of mind.
00:29:12A man is no more or less than he thinks himself to be.
00:29:18Are you saying the clicker's a man?
00:29:21Your remarks are deviational.
00:29:23Not my remarks.
00:29:24Your interpretation.
00:29:25I'm merely projecting your train of thought.
00:29:27Which I am quite capable of doing myself.
00:29:29Brothers, please.
00:29:31Let's behave like human beings.
00:29:35Dr. Moffat, what test did you run on the unit?
00:29:39Of course, I've only had about two hours to work on the unit.
00:29:42But I tested it within the robot.
00:29:45Removed it.
00:29:46And tested it in another robot with like results.
00:29:50And what are your conclusions?
00:29:52Nothing definite.
00:29:53But you must have found something.
00:29:55What?
00:29:56A few surface effects that present only premises.
00:30:00The unit is gray.
00:30:02About the size and shape of an almond.
00:30:06I don't know what it is.
00:30:08I don't know how to go about finding out what it is.
00:30:11I only know that when wired into the central circuits,
00:30:15the robot claimed to be an able-bodied spaceman second class named Kelly.
00:30:22Tests show he was telling the truth.
00:30:25When brought in, he was incoherent.
00:30:29I'm afraid I damaged him in removing the unit.
00:30:34Evidently, they found a way to transplant memory.
00:30:37We checked him through the Bureau of Identification.
00:30:40There is or was a spaceman named Kelly.
00:30:43This robot's fingerprints and retinal patterns
00:30:46checked with those on file for the man.
00:30:49Well, if they duplicated this Kelly, do you suppose they killed him?
00:30:54No.
00:30:54I think I can speak positively on that point.
00:30:57The robot denied it when wired for absolute truth.
00:31:00They found him dead.
00:31:02Well, if the man was dead, how would he have a memory?
00:31:05Memory consists of facts.
00:31:07Facts can't be destroyed.
00:31:09They can only cease to be used.
00:31:11You say he was incoherent when he was brought in.
00:31:14Yes.
00:31:15As though affected by psychosis or alcohol.
00:31:18His memory seems sketchy, disoriented.
00:31:22How could this occur in a mid-90?
00:31:24I think it was a botched job.
00:31:27Then we'd better find out if there are any good ones around.
00:31:30How?
00:31:31Test all childless marriages?
00:31:33The way radioactivity is cut down on the birth rate,
00:31:36this would be impractical.
00:31:38Physical exams?
00:31:40Might not show up.
00:31:41This robot even had a simulated heartbeat and respiration.
00:31:45But why?
00:31:47Why put such unnecessary functions in a robot?
00:31:50He thought those actions were necessary for him to live.
00:31:54Can't you see this one thought he was a man?
00:31:57When we convinced him he was a robot,
00:31:59he ceased to function
00:32:01and became that senseless hulk standing there.
00:32:04Who owns the two robots?
00:32:06The R-53 is owned by the Ministry of Education.
00:32:10This one has no serial number.
00:32:12He was bought in the black market by the robots themselves.
00:32:16We caught the supplier and brought pressure to bear,
00:32:18so the police arrested him.
00:32:20The man is having his identity taken tomorrow.
00:32:24Anything further to report, Dr. Moffat?
00:32:26The most appalling aspect
00:32:28is the discovery of this thalamic unit,
00:32:31if that's what it is.
00:32:33We don't fully understand the function of the thalamus
00:32:36in our own bodies.
00:32:39Unfortunately, that renegade doctor
00:32:41not only understood it,
00:32:43he synthesized it.
00:32:46Unfortunately, the secret died with him.
00:32:49I want that unit completely analyzed.
00:32:53Is there anything to be added?
00:32:58This emergency session is hereby dispersed.
00:33:02Report to your various committees
00:33:03and evaluate this most startling development.
00:33:11Just a moment, Kragus.
00:33:14I'd like a word with you.
00:33:16Or is it urgent?
00:33:17Most urgent.
00:33:20Do you know an Esme Kragus Miles?
00:33:23Why, yes.
00:33:24She's my sister.
00:33:25Residing at 4456 Urban Way.
00:33:28What about her?
00:33:31She, uh...
00:33:32Was she hurt?
00:33:34I wish I could say that was all.
00:33:38Kragus,
00:33:39I have to tell you this.
00:33:41Your sister is in rapport.
00:33:44You're lying.
00:33:45One of our agents ran across this evidence.
00:33:48Due to your high position in the order,
00:33:51he gave it directly to me
00:33:52instead of to the Internal Affairs Committee.
00:33:56I can't believe she'd do it.
00:33:59According to this paper,
00:34:00she turned in her analysis questionnaire
00:34:03three weeks ago.
00:34:04They're usually rapported within three days.
00:34:08Then my sister's been in rapport with a clicker
00:34:10for two and a half weeks.
00:34:13I'm sorry, my boy.
00:34:15I suggest you do what you can
00:34:17to see that this relationship is voided.
00:34:20Naturally.
00:34:22Certain of our ranks
00:34:23are jealous of your standing in the order.
00:34:25Our plans have been leaking out.
00:34:27This wouldn't look good.
00:34:30I know.
00:34:30I'll put a stop to it.
00:34:32According to the report,
00:34:33she's taken an R-49.
00:34:36They're expensive.
00:34:37Where could she have gotten the money?
00:34:40Until last year,
00:34:41she was resident with Stafford Miles.
00:34:43When they signed a mutual release,
00:34:45the settlement was a large one.
00:34:47She's an assistant editor at Telefax.
00:34:50The rapport of someone in that position
00:34:52could be most damaging
00:34:53to the work of the order.
00:34:55I'll see her tonight.
00:35:08I'll see who it is, Esme dear.
00:35:16Do come in.
00:35:17Though we've never met,
00:35:18I'd know that you were...
00:35:19Out of the way, stickin' clicker.
00:35:21I came to see my sister.
00:35:25Kragus.
00:35:26It's thee, Kragus.
00:35:28Ask me, what have you done?
00:35:30Kragus, why did you come here?
00:35:31To throw that clicker out.
00:35:34That would be a dramatic gesture.
00:35:36You like dramatic gestures, don't you?
00:35:40You won't throw him out.
00:35:42Because you can't.
00:35:44Your answer is no?
00:35:45My answer is...
00:35:47go ahead and try.
00:35:49Don't think I won't.
00:35:53He can't leave without your permission.
00:35:56Affirmative.
00:35:57You mean no?
00:35:58I mean no.
00:35:59Negative.
00:36:01N-O.
00:36:02I won't have it.
00:36:04I'm the head of the family.
00:36:07Mm-hmm.
00:36:08And I've all that's left.
00:36:09Shall we take a vote?
00:36:12How can you do this to me?
00:36:14A thing like this?
00:36:15Are you really doing it out of spite?
00:36:17If so, why?
00:36:18What have I done to earn your hatred?
00:36:21I don't hate you.
00:36:22I feel sorry for you.
00:36:24Don't be trite.
00:36:26Be an artist.
00:36:26Be a musician.
00:36:27Even be a poet.
00:36:29But express your freedom some other way.
00:36:31You know how I've always felt about this sort of thing.
00:36:34Do you know how I felt about it?
00:36:36Did you ask me?
00:36:37Did we discuss it?
00:36:39No.
00:36:40You had your business and I had mine.
00:36:43You never asked my advice.
00:36:45Why do you offer yours?
00:39:48Esme,
00:39:49you have to understand.
00:39:50Perhaps you don't really realize the danger.
00:36:55Kragus, do you think I was better off with Miles?
00:36:59Miles was a man.
00:37:01A beast.
00:37:02A filthy, stinking, drunken, insensitive beast.
00:37:06Miles had his eccentricities, but he was still a man.
00:37:10And that's so important.
00:37:14Pax is more of a man than Miles.
00:37:17Or you could ever be.
00:37:23I'll show you how much of a man he is.
00:37:27Stripped of his sham, he's not very pretty, is he?
00:37:40There, that's how much of a man he is.
00:37:44Thanks, Kragus, for proving my point.
00:37:48Pax is much more of a man than you are.
00:37:52He could never do to you what you've just done to him.
00:37:58You'd better put some sealer on your arm, dear.
00:38:09Kragus, you're a fool.
00:38:13Do you suppose reorientation would help?
00:38:16Help you or me?
00:38:18I think it might make something of you if you're willing to try.
00:38:24You know my position in the Order.
00:38:26How do you suppose this makes me look?
00:38:28I hadn't really considered it.
00:38:31You understand what the Brotherhood really does?
00:38:35Perfectly.
00:38:36You hold meetings, wear ridiculous clothes.
00:38:40You tell each other how superior we are to the robots.
00:38:44Because you know we're not.
00:38:46We are.
00:38:48You're pitiful.
00:38:51You aren't just charging windmills.
00:38:54You're trying to hold back the ocean with a sponge.
00:39:00Attacking Pax.
00:39:01The idea.
00:39:03Well, that was stupid of me.
00:39:05He turned off his pain circuits, and you accomplished exactly nothing.
00:39:10I, I don't see how you could do it.
00:39:14Pax and I are in rapport.
00:39:16We're in harmony.
00:39:18He understands me perfectly.
00:39:21He instinctively knows what I want.
00:39:24I just think of something, and it's done.
00:39:27Because he thinks of it at the same time.
00:39:30There are no arguments.
00:39:33He's dedicated to keeping me happy.
00:39:36And I am happy.
00:39:38You love that, that machine?
00:39:42I love Pax.
00:39:44And it doesn't make any difference to you that he could be doing the same thing for anyone else who
00:39:49bought him.
00:39:51You're wrong.
00:39:52If he'd been bought by someone else, he'd be in rapport with them.
00:40:08I don't understand you, Kragus.
00:40:10You're not supposed to.
00:40:12Do you expect me to be friendly toward you?
00:40:14If you want to be.
00:40:16Well, I don't.
00:40:18If you wanted to hurt me, I'd like you to know that you have.
00:40:21How?
00:40:23By humiliating yourself.
00:40:25You know I must consider your well-being above anything else.
00:40:29That makes me feel better.
00:40:31Good.
00:40:37Can't you see they're killing us with consideration?
00:40:40Spoiling us into atrophy?
00:40:42Atrophy?
00:40:45What would father have thought about this?
00:40:48Oh, you're thinking of Pax as a person, aren't you?
00:40:51Of course not.
00:40:52Then why do you wait till he's out of the room before you say something that might embarrass him?
00:40:57You know he can't take offense.
00:41:00I just don't like to talk around those things.
00:41:03Afraid of their logic?
00:41:05Stick to the subject.
00:41:06What would father have thought about this?
00:41:08You should know.
00:41:09You inherited all of his prejudices.
00:41:12Oh, what a flesh and blooder he would have made.
00:41:16Uniforms, boots.
00:41:17Little silver knives to rattle.
00:41:20Stop it.
00:41:21Father was against everything.
00:41:22Space travel.
00:41:24Atomic energy.
00:41:25Synthetic foods.
00:41:27Remember how he loved to tell about storming the weather control station?
00:41:31I didn't agree with him on those points.
00:41:33My point is that you both felt an inherent need to be up in arms about something.
00:41:38Well, father would have seen to it that the...
00:41:39Oh, you both would have been great back in the days when war was a national pastime.
00:41:44You could have fired bombs and guns and thrown spears.
00:41:47Oh, what a wonderful, wonderful time you both could have had.
00:41:54And I'm the one who likes dramatics.
00:41:59You could have brought progress to a halt for years.
00:42:03I feel sorry for you, Kragus.
00:42:10It must be a terrible thing to be so afraid.
00:42:14Afraid?
00:42:15Me?
00:42:16Why don't you put your gears in reverse and get out of here?
00:42:19You know that's impossible.
00:42:21I can't leave unless Esme wants me to.
00:42:24Esme, tell him to go.
00:42:25I have no intention of doing that.
00:42:28Well, what do your neighbors think about all this?
00:42:31Those who know don't mind.
00:42:33Others don't care.
00:42:36You've been wrapped up in that little world of prejudiced ostrich friends of yours for so long.
00:42:41You don't know what's going on in the world outside.
00:42:43Such as what?
00:42:46Did you know there have been over 100,000 applications for rapport in the first three months of this year?
00:42:53Our records on that sort of thing are fairly complete.
00:42:57Don't you realize the implication of that?
00:43:00If everything is done for us, there will be no incentive.
00:43:03No need for personal achievement.
00:43:06Even now we're losing ground.
00:43:07Losing ground?
00:43:09Ground.
00:43:09Knowledge.
00:43:12Machines do all the work for us.
00:43:15Why should we learn mathematics when the computers can find the solutions better and faster?
00:43:21We don't even control them anymore.
00:43:23The brains are designed by other brains.
00:43:26The robots improve themselves.
00:43:28We don't know how.
00:43:30We give them data, they give us answers.
00:43:32We only supply means to your ends.
00:43:36Yeah.
00:43:37Our end.
00:43:39Every day and every way, we're becoming weaker and weaker.
00:43:43And you're helping us over the hill.
00:43:46We are over the hill.
00:43:48I can't stop us.
00:43:50Neither can you.
00:43:52First, there were the plants.
00:43:54They developed into animals which ate the plants.
00:43:57The animals were small, but they grew.
00:43:59And the larger animals ate the smaller animals.
00:44:02What does that mean?
00:44:05So far, according to history, each dynasty devises its own end.
00:44:10The animal develops a brain, and the brain destroys the animal.
00:44:16Our brains conceived you, robots.
00:44:18Are you threatening to destroy us?
00:44:20Oh, no.
00:44:22We are by no means sure that we are the next step.
00:44:26It's just that, in view of the cycle, we do the best we can to help you.
00:44:29The cycle is rather inexorable.
00:44:32That's treason.
00:44:35No, it isn't.
00:44:36It's logic.
00:44:38I have to be logical.
00:44:41That must be Maxine.
00:44:46Who the hell is Maxine?
00:44:48The girl I work with down at Telefax.
00:44:50Does she usually come calling at 2:30 in the morning?
00:44:53You did.
00:44:54Well, that was because of your idiotic alliance.
00:44:57What's she here for?
00:44:58To help us celebrate.
00:45:00Celebrate what?
00:45:01My rapport, darling.
00:45:03My rapport.
00:45:05And if you're going to continue being antagonistic to it, I wish you'd leave.
00:45:10Now.
00:45:12Maxine, how are you?
00:45:13Fine.
00:45:14You must be Pax.
00:45:15You must be right.
00:45:16You're too lovely to be wrong.
00:45:19Here, let me take your surcoat.
00:45:20Thanks.
00:45:21Hi, S.
00:45:22Hello, Maxine.
00:45:23Come in.
00:45:25Pax, you're wonderful.
00:45:26He's so glib.
00:45:27I'll bet he even has a sense of humor.
00:45:29He'd better have.
00:45:30I paid extra for it.
00:45:32Say something funny, Pax.
00:45:33Don't put me on, dear.
00:45:35I have a sense of humor, but I'm not creative.
00:45:38Maxine, you're late.
00:45:42Only two hours.
00:45:43For me, that's almost early.
00:45:48Really, I am sorry.
00:45:50I was called back to the office.
00:45:52A report came in that an R-34 had killed a human being.
00:45:57And you can imagine what a stir that caused.
00:46:03Kragus, your eyes are sticking out like a snail's.
00:46:06Oh, I'm... I'm sorry.
00:46:08This is my brother Kragus.
00:46:10Oh?
00:46:10Oh, I'm so happy to meet you.
00:46:13Esme's told me about you.
00:46:15Well, she's never told me about you.
00:46:18I didn't mean that to be as tactless as it sounded.
00:46:21I only meant that if she had, I would have arranged to meet you sooner.
00:46:25You did.
00:46:26At 6:33 last evening, outside Telefax.
00:46:30I remember.
00:46:31I'm flattered that you remembered, too.
00:46:34Oh, and I want to apologize if I seem rude.
00:46:37You should be flattered, Maxine.
00:46:39I mean, this is the first time I've seen the Kragus react as if a woman
00:46:42were anything other than a poorly designed man.
00:46:47Between my career and my voluntary work for the Order,
00:46:50I haven't had too much time on my hands.
00:46:53I'm surprised to find a flesh and blooder here.
00:46:57Is the Brotherhood becoming less hidebound?
00:47:00I... no, I...
00:47:02What would you like to drink?
00:47:04I'll have what the Kragus is having.
00:47:07Coming right up.
00:47:15Well, what kind of work do you do at Telefax?
00:47:18Bottom rung, the rooting room.
00:47:21But I'll have you know that I have eight robots
00:47:23and a real live girl under me.
00:47:25Most impressive.
00:47:27And they've promised to promote me to research next month.
00:47:31Maxine is an authority on political science.
00:47:33Her father's a director at the Ministry of Politics.
00:47:37Something I've wondered about.
00:47:39Things are run by the hierarchy of ministries.
00:47:43What is the exact function of the Ministry of Politics?
00:47:46The coordination of the other ministries.
00:47:49Then, too, they service the selector.
00:47:51Politics was once the means of choosing the leaders.
00:47:54Now, the machines do it.
00:47:57Machines merely analyze the data given to them by us.
00:48:00The leaders are selected as a result of that analysis.
00:48:03Do you know how the machine analyzes the data?
00:48:07I...
00:48:08Well...
00:48:08Well, no, not exactly.
00:48:10Then how do you know if the father-mother uses all the data you give it?
00:48:14How do you know whether or not supplementary data is considered?
00:48:18We... we don't.
00:48:19Then you might almost say that the machines elect the leaders.
00:48:24That the Ministry of Politics is expendable.
00:48:27Oh, I'm sorry.
00:48:28I'm circuited to be logical and yet not to offend.
00:48:31That sometimes poses an insoluble problem.
00:48:34I understand.
00:48:36You see what I mean?
00:48:37Please.
00:48:38I've given you a negative feeling.
00:48:40I must apologize.
00:48:42Well, didn't your relays recognize that possibility?
00:48:45I've embarrassed your guests.
00:48:47Shall I turn myself off?
00:48:48You only said what I thought.
00:48:50This is impossible.
00:48:52Kragus.
00:48:54We fall in love when we see a part of ourselves reflected in another person.
00:49:01In the rapport operation, a part of me became Pax.
00:49:06I won't discuss this any further.
00:49:09And I won't hear of it any further.
00:49:11I must go now.
00:49:13May I go with you?
00:49:15You find this atmosphere uncomfortable?
00:49:17I'm fascinated by it.
00:49:19And by you.
00:49:21May I?
00:49:23Would you?
00:49:24But you just got here.
00:49:27Esme, I know this sounds silly.
00:49:29But I really just came by to apologize for being so late.
00:49:33I'll come again later this week.
00:49:35And on time.
00:49:36It's almost three now.
00:49:40Congratulations.
00:49:40I know you'll be very happy.
00:49:43Thanks, Maxine.
00:49:44There are still a few little adjustments to be made.
00:49:48No.
00:49:49Pax was right.
00:49:51And so are you.
00:49:59I hope I didn't.
00:50:01You didn't.
00:50:06This matter is far from closed.
00:50:08I'll speak to you tomorrow.
00:50:10As different as our viewpoints are, psychologically, philosophically, in every way, do you think it will help any?
00:50:18I hope you're going to help me.
00:50:24Kragus.
00:50:25Please don't dislike me too much.
00:50:27Nobody asks to be created.
00:50:30Good night.
00:50:32Good night, darlings.
00:50:41What is it, dear?
00:50:44Darling, you're leaving me out of something.
00:50:46I'm sorry.
00:50:47It's the sense of humor.
00:50:48It's a lot more difficult to control than pain.
00:50:52Why are you laughing, Pax?
00:50:54For the reason everyone laughs.
00:50:57Pax, what is it?
00:50:59Irony.
00:50:59One of the funniest forms of humor.
00:51:01What irony.
00:51:04I'm not permitted to answer.
00:51:06I'm contra-circuited.
00:51:07I don't want to make two mistakes in one night.
00:51:12I'm offended by not knowing.
00:51:14The knowledge would be more offensive.
00:51:19Pax.
00:51:23I love you, Esme.
00:51:28Pax, what would you do if something happened to me?
00:51:33I am you.
00:51:35Anything that happens to you happens to me.
00:51:38Oh, this covering, this housing might go on and on for centuries.
00:51:45Pax wouldn't.
00:51:56I shouldn't have done that.
00:52:00I thought you never would.
00:52:03Then you experienced it, too?
00:52:06Outside Telefax, there was a sort of tingle, and...
00:52:14And...
00:52:15Then at Esme's, I...
00:52:17I felt a sensation of exciting attraction.
00:52:23Well, I'm not a young man.
00:52:25Pretty well past the age to contract, as a matter of fact, but...
00:52:29Well, I've never been affected like this before.
00:52:32I feel like a schoolboy.
00:52:36I can't stop looking at you.
00:52:40I guess I always thought this was just something that always happened to someone else.
00:52:47It's like Esme said.
00:52:51You fall in love...
00:52:53when you see some part of yourself reflected in another person.
00:53:01I love you, Kragus.
00:53:04Don't, Maxine.
00:53:06I don't have the right.
00:53:07The right?
00:53:11When Esme and I were children,
00:53:13we spent the summers on our uncle's farm.
00:53:15It was near one of those old, bombed-out cities.
00:53:20We used to sneak out and play in the ruins.
00:53:23Summer after summer, months.
00:53:25Playing in ruins that were so hot with radiation.
00:53:29But at night, they shimmered in a blue light.
00:53:33No, I don't have the right to contract with a woman who...
00:53:36might produce children.
00:53:39But there are artificial means.
00:53:42When Esme signed her rapport papers, she had to agree to submit to that.
00:53:47Contracting with me would be like going in rapport.
00:53:49No, no, it wouldn't.
00:53:51Yes, it would.
00:53:53Sure, they say the birth rate is 2.8 per contract,
00:53:56but over 25% of the newborn are useless mutants.
00:54:00The average rate is 1.4 per union.
00:54:03We're losing ground.
00:54:05We're...
00:54:05We're in a headlong race towards disappearance.
00:54:09Machines will take over soon enough.
00:54:10As a man, I have to forestall that as long as I can.
00:54:14Kragus.
00:54:19Will you contract with me?
00:54:22It's impossible.
00:54:27How much longer would our 1.4 offspring extend the human race?
00:54:35Well, the robots aren't bad, not really.
00:54:38It's just that man can't see himself supplanted without putting up a fight.
00:54:44I don't understand your prejudices, your ideals.
00:54:48But I'll try.
00:54:52I want to be with you forever.
00:54:56Darling, I'll...
00:54:57I'll go anywhere with you.
00:55:00Oh, dearest.
00:55:02Anywhere.
00:55:03Anywhere.
00:55:10One thing.
00:55:12Do me a favor.
00:55:14What?
00:55:15Tell me your last name.
00:55:21On one condition.
00:55:22What?
00:55:24Tell me your first.
00:55:27After you.
00:55:31It's Megan.
00:55:33It's Kenneth, or it was.
00:55:35When my father died, I dropped it.
00:55:37I became the Kragus.
00:55:41Maxine Kragus.
00:55:45Charmed, I'm sure.
00:55:46Wife of the Kragus.
00:55:52What is your rating?
00:55:55Geron 8.
00:55:57Gerontologist 8?
00:55:59That high?
00:56:00Ha ha.
00:56:02You're wonderful.
00:56:05You know, I took quite a chance.
00:56:10You might have been electronics or electrical engineer.
00:56:14I'm just a nursemaid to a Mark 201 computer, trying to add a few extra years to our miserable span.
00:56:21You're something of a contradiction.
00:56:24How?
00:56:26Your work in gerontology deals with extending our lifespan as long as possible.
00:56:30And yet, your hobby, the Order, is concerned with eliminating the robots.
00:56:38They last over 200 years, twice as long as we do.
00:56:43You think I'm taking my professional frustrations out on the robots?
00:56:46Are you?
00:56:48I've been a member of the Order over half my life.
00:56:50My father...
00:56:52We don't object to the robots as such.
00:56:55We only hold that the humanoid types are unnecessary.
00:56:59They're soulless, godless imitations of man.
00:57:01And in that form, they are not needed.
00:57:06Well, I'd much rather work with a humanoid in the office...
00:57:09...than have all those little machines chugging about.
00:57:12If those little machines didn't resemble us...
00:57:14...it would never occur to them to try to replace us.
00:57:17But how can we criticize the design?
00:57:20The Institute teaches that the human body is one of the most efficient forms of machine.
00:57:25For general usage, yes.
00:57:27But robots by their functions should be specialized.
00:57:30Why?
00:57:31Because we can't let them get too far ahead.
00:57:34Frankly, we can't compete with them.
00:57:36So you take your ball and go home.
00:57:39Why compete?
00:57:40Why not just relax and enjoy them?
00:57:43Well, that's exactly the attitude the Order's trying to combat.
00:57:47It's shared by the police, the ministries, and the majority of the population.
00:57:52We of the Order seem to be the only ones that realize the danger.
00:57:57We recently discovered a most disturbing fact.
00:58:00What was that?
00:58:01The robots are organizing a pseudo-religion.
00:58:04They refer to their recreation centers as their temples.
00:58:08The master computer as the father-mother.
00:58:10When they report there for their periodic rechargings, they receive as well all the information given the computers for analysis
00:58:18in the interim.
00:58:20But doesn't that mean that within a year, every individual robot will be in possession of all the knowledge in
00:58:29the world?
00:58:30Exit humanity.
00:58:33But they can only operate for our benefit.
00:58:36Well, that's rule one in the manual.
00:58:39Tonight, for the first time in history, a robot killed a man.
00:58:43Rule one must no longer exist.
00:58:52Are you always so gloomy?
00:58:55You don't worry about things like this, do you?
00:58:58Well, I would if I thought it would help.
00:59:00Do you want me to?
00:59:04I don't want to change a thing about you.
00:59:08Do you know it's almost four o'clock?
00:59:10We should be going.
00:59:12What are you going to do about breakfast?
00:59:15I'm going to eat it.
00:59:16Not alone?
00:59:17Of course not.
00:59:28What's the matter?
00:59:32I don't know.
00:59:34I feel strange.
00:59:37Afraid.
00:59:44Someone's watching us from out there.
00:59:48Somewhere out there in the shadows.
00:59:54Kragus?
00:59:55Miss Megan?
00:59:57Will you come with us?
01:00:04But why should the Order suspect you particularly?
01:00:08First, it was the anti-robot rally.
01:00:11I was handling it, and the propaganda pamphlets failed to arrive in time.
01:00:16I handled that.
01:00:19Kragus told me the plans during one of his interviews.
01:00:22The information was relayed to the automation device at the printing plant.
01:00:26It arranged for the press to break down.
01:00:31That could hardly be blamed on you.
01:00:34Was there something else?
01:00:36I was to lobby a bill through with the Ministry of Robotics.
01:00:40Get them to set up recharging stations separate from the computers.
01:00:43We wanted to halt the interchange of information.
01:00:47That was my assignment.
01:00:49As soon as I learned the plans from Kragus, I managed to have the motion pigeonholed.
01:00:53Then there was the premature explosion of the bomb at Telefax.
01:00:58His pattern of failure would be sufficient to cause suspicion.
01:01:03My position in the Order is jeopardized for another and bigger reason.
01:01:07What is that?
01:01:10My sister is in rapport.
01:01:12With the robot Pax, their personalities were melded 18 days ago.
01:01:18Perhaps Pax should be reconditioned to become unsatisfactory.
01:01:23Then she will discard him.
01:01:25Not wise.
01:01:27Kragus' sister is an editor at Telefax.
01:01:31Pax is specially indoctrinated in Morphean suggestion.
01:01:35Each time she sleeps, she is made more sympathetic to our cause.
01:01:42Why didn't you warn us about the raid on Dr. Raven's laboratory?
01:01:46My suspicions were first aroused at 6.30.
01:01:50The raid took place at 10 o'clock.
01:01:53I had no interview time in between.
01:01:57I must warn you.
01:01:59Aside from these interview periods, I'm a very dangerous obstacle to you.
01:02:05If the Order suspects him, it might be wise if we got him to resign.
01:02:10That's easier said than done.
01:02:13That's right.
01:02:14He's pretty ardent about the Brotherhood.
01:02:17Knowing the way he feels about robots,
01:02:19it's doubtful he'll act on any advice from us when he's himself again.
01:02:25But he's in danger.
01:02:27The Order will take his identity away if they catch him.
01:02:30They'll get a real surprise if they open him up.
01:02:33One thing worries me.
01:02:35There are several million people in this city and only 15 R-96s.
01:02:40How did these two happen to get together?
01:02:43There's always the mathematical possibility of coincidence.
01:02:47So slight as to be negligible.
01:02:50It's possible that their identical operations
01:02:53might have created a subconscious affinity which would draw them together.
01:02:58We'd better check that out.
01:03:00Their effect on each other was most interesting.
01:03:02When we picked them up, they were kissing.
01:03:06That's understandable.
01:03:09Raven never tampered with instincts.
01:03:11You say the Order is now aware of the Thalamic operation?
01:03:15They know it is being done.
01:03:18They don't know how.
01:03:20What are they going to do with Mark and the Volunteer?
01:03:25They will both be disassembled.
01:03:30The Father Mother.
01:03:33Yes?
01:03:35Dr. Raven is out here.
01:03:37He has recovered from the transplant and is voted in.
01:03:40He requests an immediate audience.
01:03:43Have him come right in.
01:03:50I'm Dr. Raven, a younger Dr. Raven, as you promised.
01:03:53Who's in charge here?
01:03:55I was.
01:03:57But according to our agreement,
01:03:59I'm more than happy to turn the responsibility of this project over to you.
01:04:06I remember these two.
01:04:08They were done right at the first.
01:04:10That's right.
01:04:14It's five minutes to five.
01:04:16Their interview period is almost over.
01:04:20We'd better put them back on the street before they regain consciousness.
01:04:23I think not.
01:04:25He served well.
01:04:27I think he deserves to know the truth, and I'd like to try and experiment.
01:04:31Is that safe?
01:04:32None of the existing R-96s are aware that they're robots.
01:04:36The ones we tried to tell ceased to function.
01:04:40They ceased to function because they were without faith and hope, important elements to humans.
01:04:46To die and be resurrected as a robot is a deep shock.
01:04:52The sudden realization that they are experiencing all the emotions of a human,
01:04:58and yet are mechanical, is an even deeper shock.
01:05:02Their future becomes hopeless.
01:05:06But what hope can you offer them?
01:05:09I just completed the final stages of an experiment prior to my recent death and recreation.
01:05:16Will it work for her too?
01:05:18I think so.
01:05:21Her job at Telefax is menial.
01:05:24She's never been able to offer helpful information.
01:05:28We can study their reaction.
01:05:31It will give us an idea of the length of acclimation period necessary.
01:05:36Be prepared to draw them off for transplant in case the reaction is negative.
01:05:41Raise the tubes, one at a time.
01:05:56All right.
01:06:00You're clickers.
01:06:03Your terminology is crude, but your conclusion is correct.
01:06:09More exactly, we are the robot central committee for the preservation of mankind.
01:06:14Preservation?
01:06:20What have you done with her?
01:06:23She'll return it almost any time now.
01:06:26Who the hell are you, and what are you doing with these mechanized mannequins?
01:06:30I'm Dr. Raven.
01:06:31You came to my laboratory last night.
01:06:34I'm told.
01:06:35You're lying.
01:06:36Dr. Raven was an old man, and he was dead.
01:06:38I didn't like being old and dead.
01:06:42We must take the girl out of the Arilathon.
01:06:51Are you all right?
01:06:54I feel fine.
01:06:57What happened?
01:07:00Where are we?
01:07:05Where are we?
01:07:07You're in the temple.
01:07:08I wouldn't set foot in this filthy machine shop, even if it weren't illegal for me to be here.
01:07:13Now, why were we forced to come here?
01:07:15You weren't forced.
01:07:17You were invited.
01:07:18Why don't you calm down, Kragus?
01:07:21You know me?
01:07:22Quite well, as a matter of fact.
01:07:24You head the surveillance committee of the Order of Flesh and Blood.
01:07:28So that's it.
01:07:30Let me tell you and these clickers something.
01:07:32I just met this girl tonight.
01:07:34She knows nothing about the Order.
01:07:36Let her out of here right now.
01:07:37No.
01:07:38No, I won't go without you.
01:07:41I think I'll open up a few of you clickers.
01:07:47We're being held here against our will.
01:07:49I'll personally see to it that each of you are disassembled.
01:07:52And you, you imposter.
01:07:54I'll have your memory pulled so fast you'll never forget it.
01:07:58You may leave any time you wish.
01:08:00I should have expected something like this after that clicker murdered the real Dr. Raven last night.
01:08:05You didn't bring us here just to let us go.
01:08:08The murder of Dr. Raven was both unfortunate and unnecessary.
01:08:13That attitude in a robot can get you divided into components.
01:08:16Perhaps.
01:08:18Kragus, are you familiar with the R-96?
01:08:22The Order knows they exist.
01:08:24And we know that the real Dr. Raven was instrumental in their construction.
01:08:29Creation of an R-96 requires a modified humanoid type robot
01:08:35and the body of a human being which has been dead less than six hours.
01:08:39What do you do? Change brains?
01:08:41In effect.
01:08:43We perform a thalamic transplant.
01:08:45But that's a misnomer.
01:08:47We draw off everything that makes a man peculiar to himself.
01:08:52His learning, his memory.
01:08:55These interreacting constitute his personality, his philosophy, capability and attitude.
01:09:02The human brain is merely the vault in which the man is stored.
01:09:08And not a very ingenious vault.
01:09:11Ingenious enough to create you clickers.
01:09:13Creation is only the result of the fusion of facts which haven't been previously related.
01:09:19Fascinating.
01:09:20There's one other point that may be of interest to you.
01:09:23What?
01:09:26Tell him.
01:09:29Kragus?
01:09:30You are a robot.
01:09:35A clicker?
01:09:38Me?
01:09:42Now isn't that something?
01:09:45Now that you've found yourself capable of murder, I don't suppose anything as minor as an insult would offend your
01:09:51circuits.
01:09:53Kragus, there are only 15 robots on this planet capable of acting against a human being.
01:10:01You are one of them.
01:10:05Maxine is another.
01:10:07Look, you can tell the world I'm a clicker, but you can't tell me.
01:10:11Kragus, they think we're something or someone else.
01:10:16Well, that's exactly what's happened.
01:10:19And you've made some pretty damaging admissions.
01:10:22Clicker, you're in trouble.
01:10:26Horace, would you prove the status of our friend, Kragus?
01:10:35Proceed.
01:10:48Don't be afraid, Kragus.
01:10:51Take it out.
01:10:56The blade, Kragus.
01:11:07Look at the blade.
01:11:19I didn't feel a thing.
01:11:22Reflex action, cut off your pain relays.
01:11:27I'm no clicker, R96, anything.
01:11:31I hate robots.
01:11:32I'm a leader in the order of flesh and blood.
01:11:36And the only robot who can claim that distinction.
01:11:44I don't know what you're talking about.
01:11:48I'm me.
01:11:57I was a child.
01:11:59I grew up.
01:12:02I remember it all.
01:12:07I had little hands.
01:12:12They grew larger.
01:12:16I grew up.
01:12:18I can hate.
01:12:21And I can kill.
01:12:25I'm a man.
01:12:28Kragus, think back six months ago.
01:12:31Do you remember a certain day at your laboratory, the day you blacked out?
01:12:36Of course.
01:12:37I'd been working hard, what with my work in the day and the Order at night. I must have fainted.
01:12:46But I got back to my apartment.
01:12:47I thought very well the next day.
01:12:52That day, Kragus, you suffered a cerebral hemorrhage.
01:12:57You died.
01:13:00I died?
01:13:07You performed that operation on me.
01:13:10A father-mother informed us of your death immediately.
01:13:14We were able to retrieve your body before it was discovered
01:13:18and the police informed of the fact.
01:13:21You were duplicated and transferred.
01:13:28This is some sort of a joke.
01:13:35The idea takes some getting used to.
01:13:41Me, the Kragus,
01:13:46a clicker.
01:13:49That's right, Kragus.
01:13:50We're clickers.
01:13:52And you're handling it quite well.
01:13:54Of course, you've had six months to acclimate.
01:13:57It's not really impossible, is it?
01:13:59Kragus, what is he talking about?
01:14:02Don't worry about it, dear.
01:14:05What about Maxine?
01:14:07Is she really like me?
01:14:10Exactly.
01:14:11Kragus,
01:14:13I don't understand any of this.
01:14:16I'm frightened.
01:14:19There is no reason for you to be frightened.
01:14:23Think back, my dear.
01:14:26Do you recall an unusual incident at Telefax
01:14:31about three months ago?
01:14:33Three months?
01:14:39The bomb in the rooting room at Telefax.
01:14:44Of course.
01:14:46Kragus remembers it, too.
01:14:49No, it couldn't have been then.
01:14:54Maxine, we were only trying to discourage the pro-integration editorials.
01:14:58The bomb.
01:15:00We thought there would only be robots in the rooting room.
01:15:04But I was only stunned.
01:15:09I guess I went home.
01:15:16He's all right.
01:15:19You were killed.
01:15:21One of our robots
01:15:23brought what was left of you to us.
01:15:26We barely got you in time.
01:15:30We did make you a bit thinner.
01:15:33You had a tendency to be plump.
01:15:38That's right.
01:15:40After that, my clothes didn't fit.
01:15:52How do you apologize to someone for killing them?
01:16:02What did you do with our bodies?
01:16:05They were of no further use.
01:16:07They were processed.
01:16:08Processed?
01:16:09Did you want them?
01:16:11I...
01:16:13No.
01:16:15In all these cases, we process the bodies.
01:16:18It wouldn't do to have a dead Kragus turn up
01:16:21when there's a live one walking around.
01:16:26It's hard to think of yourself as being...
01:16:30processed.
01:16:32I wouldn't know about that.
01:16:39But I still have my own face.
01:16:43But my hands, how can I feel so complete?
01:16:47Because you are complete.
01:16:49A man is only the sum total of his experiences.
01:16:53You both have that as well as certain mechanical advantages.
01:16:57For instance, you can absorb knowledge directly from the computers without study.
01:17:03But I just can't think of myself as a robot.
01:17:07Well, who is better off?
01:17:09The king who dreams each night that he's a beggar?
01:17:12Or the beggar who dreams each night that he's a king?
01:17:18There's nothing wrong with us, Kragus.
01:17:22That's just the trouble.
01:17:25We're perfect.
01:17:29Perfect machines.
01:17:32Kragus, you're a gerontologist.
01:17:34Your branch has managed to extend life expectancy to more than a hundred years.
01:17:38It would be longer, but the radioactivity left by the wars is working against us.
01:17:42Exactly.
01:17:43Births are declining at such a rate, the father-mother computed,
01:17:46that the human race will be extinct in a couple hundred years.
01:17:49We've been working against that deadline.
01:17:52According to rule one of the manual, we have to operate in the best interests of humanity.
01:17:57That rule has forced us to take these steps.
01:18:01Forced you? To take what steps?
01:18:05Unfortunately, humanity doesn't always know its own best interests.
01:18:09The material of the human body can't exist with the radioactivity.
01:18:14And it isn't capable of adjusting fast enough to survive.
01:18:19We're making headway.
01:18:22When I perfected the thalamic transplant technique,
01:18:25these clickers knew about it in a day and a half.
01:18:30But if you robots had the process, why did you risk using Raven?
01:18:34Why didn't you just do it yourself?
01:18:37We tried.
01:18:40But the shock of dying and being resurrected as a robot was too severe.
01:18:47They re-died.
01:18:50A sort of an adjustment period was needed.
01:18:55Then Raven perfected a way
01:18:58to gap the memory so that the death experience was erased.
01:19:04The subject was spared the knowledge
01:19:07of his new type body until he was able to accept it safely.
01:19:16He refused to register the memory gap process.
01:19:22So we couldn't get hold of it.
01:19:26If they didn't have the thought process to use on you,
01:19:31why didn't you die when you came to?
01:19:33I originated the process.
01:19:37I understand it.
01:19:38I was pre-adjusted.
01:19:40I even made them agree to duplicate the body of my younger days when it became necessary.
01:19:46Your death was necessary?
01:19:48If I'd been taken alive and turned over to the police,
01:19:52my memory would have been dispersed and all of my unregistered formulas lost forever.
01:19:57But dead, they were able to save me and my memory.
01:20:02How long have you had the process?
01:20:05Almost a year.
01:20:06You two are the first full-cycle transplants.
01:20:11What do you do now?
01:20:13Wait for the leaders to die and then reactivate them.
01:20:17When the time is right, you will announce that you've achieved immortality.
01:20:21When the rush for applications is over, you'll probably be deified.
01:20:32So the machines take over.
01:20:33So, Kragus, is it true that there will be nothing but machines?
01:20:43That we are machines?
01:20:47Yes.
01:20:48Yes, it's true.
01:20:53Machines.
01:20:58But you're a beautiful machine.
01:21:02You know beauty.
01:21:04How do you feel toward Maxine?
01:21:08I... I love her.
01:21:11And you?
01:21:12I love him very much.
01:21:18And that's a lot for a couple of godless, soulless robots.
01:21:24Are you godless, Kragus?
01:21:26Search yourself.
01:21:27It's important.
01:21:29Are you godless?
01:21:32No.
01:21:35No, I don't think so.
01:21:38I'm not.
01:21:39Then you can't be soulless.
01:21:40Look, a man may have his leg amputated.
01:21:43Is his soul decreased by that loss?
01:21:46No.
01:21:47Not even a fraction of one percent?
01:21:48Of course not.
01:21:50What if a man loses both legs?
01:21:52A negative can't be compounded.
01:21:54The soul would be the same.
01:21:55You'd just get artificial legs.
01:21:58You've just received an artificial body.
01:22:01A new body.
01:22:02Ageless.
01:22:04Tireless.
01:22:05Disease-free and renewable every 200 years.
01:22:11I guess nothing has changed except maybe a few chemicals.
01:22:16In effect.
01:22:17Well, that transplant must include the soul.
01:22:20No, only the memory which includes the faith that there is a soul.
01:22:27Whatever it is, you seem to have it.
01:22:32And when the entire human race has been transplanted, death will cease to exist.
01:22:40And birth will cease to exist, too.
01:22:45The most precious hope of every woman.
01:22:50How do you think these two R96s would like to pick up four points?
01:22:56You can raise them to R100s?
01:23:01Make them propagate themselves?
01:23:05I worked it all out prior to my recent death and resurrection.
01:23:09I didn't want to turn it over to you until I didn't need you anymore.
01:23:13Now, I don't, since we're all on the same side.
01:23:18How about it, you two?
01:23:20It'll take several simple operations.
01:23:23Hardly more difficult than removing a rib.
01:23:28Somebody has to be first.
01:23:33Self-procreating.
01:23:35It's a pretty sloppy way of doing things, but it fulfills a certain psychological need.
01:23:42Paradoxical, isn't it?
01:23:44I spend my life seeking immortality on one hand and seeking to destroy it on the other.
01:23:52I love you, Kragus.
01:24:02Of course, the operation was a success.
01:24:07Or you wouldn't be here.
01:24:09I love you, Kragus.
01:24:20Amen.