Extant (2014) s01e01 Episode Script

Re-Entry

[MOLLY COUGHING.]
[GROANS.]
[PANTING.]
ETHAN: Mom? Are you too sick to come to the party? I'm fine.
You know, it's just my body readjusting, that's all.
I'll be right there, okay? [GROANS.]
WOMAN [ON TV.]
: Still no answers as to what caused yesterday's blackout, leaving nearly 60,000 homes in the district without power.
That's the third blackout in-- I can't imagine.
I'm away from Bill and the kids for two days, and it's like - It's tough.
- Oh, yeah.
So, what's your secret? A lot of V-Chrons.
- And no alcohol? - Nope.
- I have not had a drink in 13 months.
- What? - Can't risk a DUI in space.
WOMAN: Can't wait to hear about it.
That's the saddest story I've ever heard.
We have to remedy that right now.
- We can't.
- No can'ts, we're doing shots.
- Ask the doctor here.
- Not until her tests come back.
- One shot.
- Listen, when her tests come back, I will personally write her a prescription for margaritas.
Thank you.
[BOY SHOUTING INDISTINCTLY.]
GIRL: Ethan, get off him! Stop.
You're gonna hurt him.
JOHN: Ethan.
Hey.
What are you doing? ETHAN: He wouldn't give me the ball.
And your solution was to push him down? Ethan, what do you say? - I'm sorry.
- It's okay.
If you do this again, no more party for you.
Understand? - So sorry, guys.
You okay? - Yeah.
All right, it's fine.
Let's go play.
You've gotta be exhausted.
Go on up.
I can take out the trash.
Moll.
Gotta get back to the routine, right? [CHIMES RINGING.]
[TRASHCAN BEEPING.]
[DOOR SQUEAKING AND BANGS.]
ETHAN: I was angry.
JOHN: Is that what you're supposed to do when you get angry? Hurt someone? No.
How do you think you would have felt if you'd really hurt Jake? - Bad.
- Well, I sure hope so.
- Mom's mad at me.
- No, no.
She's not mad.
But it's different now with her.
Your mom has to get used to being back home again.
She was up there a long time, all by herself.
It's gonna take a while for everything to feel normal again, but it will.
I promise.
Okay.
Get a good night's sleep.
We have a big day tomorrow.
I think I need a flip.
Let me see.
Well, yeah.
You're right.
All set.
I love you.
Will you leave my night-light on? Sure.
[BLEEPS.]
[SHOWER RUNNING.]
JOHN: What brought this on? I had a dream.
Do you dream about Marcus much? Less and less as I get older.
I still think about him every now and then.
What do you think about? How if he were still alive, there would be no us, no Ethan.
Without Marcus dying, we may have never even met.
Maybe you'd still be with him.
Well, you know what? I think we always end up where we're supposed to.
[BEEPING.]
COMPUTER: Approaching Yasumoto Corp.
Hey.
Listen to me.
When I talk about you today Hey.
When I talk about you today, I'm gonna talk a bit about how you were originally created for the program, but it doesn't mean I don't see you as real.
Okay? - I'm not.
- Yes, you are.
You exist.
Just like me.
Just like your mom.
Okay? [JULIE WHISTLES.]
Jubes! Rabbit! JULIE: Unh! [BOTH IMITATE EXPLOSION.]
- Look at you.
- He's been practicing.
- You nervous? - Uh, pfft.
- Walk in the park.
- Exactly.
SAM: I don't even know where to begin.
You were on the Seraphim station for 13 months.
A solo mission.
Yes.
You were alone the whole time? Well, that's why they call it a solo mission.
[MOLLY CHUCKLES.]
[SAM SIGHS.]
Then I-- I honestly-- I don't understand this.
Understand what? I know there weren't any other ISEA missions in that quadrant.
Molly, was there an emergency that I didn't hear about, an international crew that docked? No.
Why? You're pregnant.
What? - You're pregnant.
- But That's not possible.
[THUMP.]
[BEEPING.]
[AIR HISSING.]
Damn it.
BEN: Five dollars for the swear jar.
- Put it on my tab.
- Of course.
Let's run the probe's data deck and check for problems with the samples.
Maybe I contaminated them somehow before they left.
I find that unlikely.
- Aren't you sweet? - I have my moments.
- Molly.
- Go ahead.
- You have an incoming V-Chron.
- Bay 1, please.
Sending Bay 1.
JOHN: The camera's on.
Say hello.
- Hello.
JOHN: You wanna tell her? Go ahead.
I got accepted into school.
I had my last meeting with the principal this morning.
We're gonna wait for you to get back for the orien-- [STATIC NOISE.]
Ben? BEN: I'm detecting interference from a solar flare.
Shall I attempt to restart after it passes? Sure.
BEN: Attempt to restart-- Ben? Ben? [CLICK.]
There has to be some kind of explanation.
How about you made a mistake? Molly, I ran every single sample from your tether twice.
Are you sure there was no one else? How many times are you gonna ask me this? Even if there was, you know I can't get pregnant.
We tried for years.
Were you taking the fertility drugs? No.
I stopped taking those way before Ethan.
I'm just asking every rational question I can think of for the report.
You can't put this in the report.
Molly, this report goes straight to Director Sparks.
I can't lie.
I'm not asking you to lie.
I'm asking for some time.
As my friend.
As your friend, I should be admitting you for observation right now.
That's exactly what I'm afraid of.
I just got back to my family, Sam.
I don't wanna end up in quarantine.
Just give me some time to figure it out before I tell John.
At the Humanichs Project, we recognize that we live in a world of machines.
Most of us interact with one form or another from the time we wake up until the time we go to sleep.
We've outsourced many of our day-to-day activities to robots that have made our lives way more efficient, but in the process, we've removed the all-important element of human connection.
Deep down, we all know that robots are not really human.
A Medi-Assist bot has a 100-percent accuracy rating when administering a vaccine, but it cannot comfort a child who's afraid of a needle.
A task-android can process your bank deposit, but it can't share your joy when it helps you buy your first home.
It turns out the true uncanny value isn't visual at all.
It's the value of genuine connection.
The goal of the Humanichs Project is to bridge that divide by bringing humanity to the machine.
How are we going to do that? By creating an artificial intelligence designed from the very beginning to seek connection.
Programmed not by ones and zeroes that we type into interfaces, but by day-to-day human experience.
Ladies and gentlemen, let me introduce you to my son.
Ethan.
Go get them, Rabbit.
KERN: Harmon Kryger.
He passed away before I came onboard.
Were you close? Yeah.
- He was a good guy.
- So I heard.
Dr. Woods, I'm Gordon Kern.
- The new deputy director.
- That would be me.
It's a tragic situation.
Shall we? I have to be honest.
I spent all weekend preparing to grill you on your logs.
But your work is incredibly thorough.
Leaves very little left to the imagination.
There was one thing.
- The gap.
- Yes, the gap.
Well, that's easily explained, actually-- If you don't mind, we'll wait for Director Sparks.
I know he has questions.
Well, there's something I'm sure everyone in this room is curious about, but we're all either too polite or too embarrassed to ask.
The robot uprising, am I right? What's to stop the Humanichs from overthrowing us one day and enslaving their human overlords? Is that the question? Absolutely nothing.
How, then, would an owner control and manage their behavior? It's not a master-slave relationship.
My partners and I believe if we want machines to be more human, we have to give them the human experience.
They have to learn like children learn.
The Humanichs brain learns right from wrong, good from bad, the same way we all did, for the most part.
- What do you mean, for the most part? - There is no guarantee with any child, because in the end, they're free to choose their own path.
Dr. Woods, my name is Femi Dodd.
I chair a number of programs for Mr. Yasumoto.
What is the protocol in the event your experiment fails? Do you have an emergency plan for their shutdown? To preserve their power? Absolutely.
It's called Interlude Mode.
Do you mind? Excuse me, Dr. Woods.
I didn't mean for the resting mode.
I meant for their termination.
To kill them? That wording is a bit inelegant, but yes.
Do you have a child? I have a daughter.
Do you have a plan to kill her? - My daughter's a human being.
- I don't understand the difference.
Well, for starters, she has a soul.
With all due respect, Ms. Dodd, there is no such thing as a soul.
What you call a soul, I call a cumulative effect of a lifetime of experience.
Simple information traveling the neural pathways in your daughter's brain.
Believe it or not, Dr. Woods, there are people in this world who still believe that there is more to us than can be explained by science.
Well, those people are idiots.
SEGERS: Dr. Woods.
FEMI: I am one of those idiots.
JOHN: I'm sorry.
- I accept your apology.
I mean, I'm sorry you're one of those idiots.
- That's enough.
- How dare you? How dare you ask me do I have a contingency plan to murder my son? SEGERS: And you're asking us for funding that can unleash thousands, maybe millions of these Humanichs on the world.
It's a perfectly valid question.
What sort of controls and restrictions would you put in place to ensure they behave properly? And I gave you my answer, Mr. Segers.
Absolutely none.
The security and life support systems were back online almost immediately, but the communication system stayed down for approximately 13 hours.
As it says in my log book there, I spent most of that time trying to get the system up and running again, but, unfortunately, I was unsuccessful.
I then went to sleep for three hours, and when I woke up, everything seemed back to normal.
It took you all that time to reboot Ben? Have you tried to work on that thing? - Heh, heh.
No.
- Heh, heh.
During that 13 hours, did anything else happen? - Anything anomalous? - Uh Related to the solar flare? - Everything is right there in the log.
- There's one thing I'd like to clear up.
The security system records each quadrant of the Seraphim with banks of surveillance cameras.
Presumably, they were up and running as soon as that emergency power kicked back on, but that footage was deleted from the system's memory.
There had been several instances of interference in the months prior, but none quite to this level.
In the past, what I had done is make a copy of the footage, just in case.
I went to do the same thing this time, but instead of copying the footage, I accidentally deleted it.
The only record of that time is what I manually entered into my log.
I take complete responsibility for that error, sir.
Well, sounds like we didn't miss much.
Just me swearing at a bunch of inanimate objects.
[ALL LAUGH.]
I'm still waiting for your medical examinations, but as soon as I've had a chance to review them, and you complete a series of psychiatric evaluations, we'll put this one in the books.
Psychiatric evaluations? Just a few meetings in addition to your post-mortems, to help with your re-entry.
It's a change in agency policy after what happened with Harmon Kryger.
I probably should have warned you before I brought you over.
There's just as much bureaucracy now that we're in the private sector.
Sir, if you wanna make it, you'd better leave now.
Oh.
Off-campus meeting.
It's a pleasure to have you back, Molly.
You were missed.
Thank you, sir.
SPARKS: How's he doing? He's ready.
[BEEPING.]
SPARKS: Welcome back, Mr. Yasumoto.
We found something of an anomaly on the Seraphim.
An anomaly? Another solar flare knocked out the communications system.
- Like Kryger.
- Yes.
But, unlike Kryger, we're missing the security-camera footage for the 13 hours that followed.
That seems like an unlikely coincidence.
Even more so if you know the astronaut involved.
Molly Woods.
She claims she accidentally deleted it.
You don't believe her? No.
Why not? Because she doesn't make those kinds of mistakes.
This could be everything.
Stay close.
About that Her husband, John, gave a presentation to your board today, but was denied funding.
I can only do so much from the ISEA.
A gesture from you could put us closer to the family.
Might make it easier to keep an eye on her.
This is the last one.
[BEEPING.]
[JOHN CHUCKLES.]
[BEEPING.]
COMPUTER: John, you have an incoming call from Yasumoto Corp.
Dr. Woods.
Hideki Yasumoto.
Thanks for coming.
Oh, please.
It's an honor.
The honor is all mine.
I have a very important mission today, and I think you're just the guy to help me.
What kind of mission? Um It's a search.
A search for what? I realized that in order to further the goal of Humanichs, to give him a true human experience, he needed to be part of a family unit, to be raised in a home.
So I brought him into ours.
Heh.
My wife and I struggled for years with infertility, and, eventually, we were told we would never be able to have a child.
That can have a profound effect on a marriage.
As happy as we were, there was something missing.
And that was Ethan.
So you see Humanichs as a cure for childless couples.
Adoption, surrogacy, those are all viable options.
Thank you.
But not everyone is a viable candidate.
There will always be people with a need for companionship and connection.
If your board had let me finish, I would have told them that filling that void is way more important than building a more empathetic service bar.
I'm sure you understand that the future you're proposing raises great moral questions.
Yes, but it also raises great possibilities.
I agree.
Unfortunately, the long-term prospects are too murky for Yasumoto Corporation to invest in at this time.
The board's decision will have to stand.
We have responsibilities to our shareholders.
Of course.
I fully understand.
I really just appreciate your time.
However, as a private citizen, I would not be bound by those same responsibilities.
How is it? - Good.
Do you wanna try? - Okay.
That is good.
- Wanna try mine? - Sure.
I like yours better.
Wanna switch? Okay.
MAN: Excuse me, ma'am.
- Here you go.
- No, thanks.
This one is for you.
It's already paid for.
- By whom? MAN: Uh He's gone.
Sorry.
Come on, let's go.
Mom!! You made me drop my ice cream.
Okay.
Look, have this one.
I don't want that one.
I want another one, like this.
Okay, this is the one you chose.
I don't want it anymore.
Let's go.
- I want another ice cream.
- Ethan! We have to go right now.
- No! - Ethan! Ethan! Ethan! Ethan! Ethan! Ethan.
It was like this when I found it.
Your hair looks really pretty.
I don't know what's going on with him, but he is not the same as when I left here.
No, he's not.
He's a year older.
That is not what I'm talking about.
- He's changed.
- And so have you.
That's called life.
That's the consequence of our family being apart for so long.
It takes time to reconnect.
And I'm trying.
- Are you? - Yes.
Because it feels like you're pulling away from us.
I don't know what's happened, but I've felt it.
And he's felt it too, ever since you got back.
Well, it wasn't that easy before I left here.
Not easy like it was for you.
We've always seen him differently.
- I see him as our son.
- And so do I.
But the way he looked at me today, you should have seen it.
It was almost as if he hated me.
He doesn't hate you, Molly.
He loves you.
He doesn't love me.
He executes a series of commands that you've programmed into him.
He approximates a behavior that resembles love.
But that's not love.
I don't know what's going on with you, but you have to figure it out, because that kid is the closest we're ever gonna get to being parents.
I got the funding from Yasumoto.
We start immediately.
[GASPING.]
COMPUTER: Camera recording.
Marcus? Hello? Hello.
You need help? Help.
What can I do? Do.
It's okay.
It's okay.
It's okay.
It's okay.
It's okay.
It's okay.
It's okay.
JOHN: Molly? I'm sorry about yesterday.
[SIGHS.]
Well, you didn't say anything that wasn't true.
This opportunity with Yasumoto, it may not be the ideal time, but Molly, if we can make it work We'll make it work.
Have I said congratulations? No.
Congratulations.
[MOLLY LAUGHING.]
LAURIE: Re-entry can be a challenge after this kind of extended mission, especially if there are spouses and children involved.
So the agency would like us to meet for a few sessions.
I want this office to be a sanctuary for you.
A place where you can talk about anything and everything.
[OVER SPEAKERS.]
We didn't have this resource in place for your colleague, Dr. Kryger.
We should have.
The agency believes a lack of attention to his mental state was a factor in his suicide.
I agree.
Why don't we begin with your own assessment about where you are in your transition? How are you feeling? MOLLY: Physically? LAURIE: Whatever you'd like to share.
Physically.
Emotionally, mentally.
[COMPUTER BEEPING.]
Okay.
COMPUTER: Beginning playback.
[TABLET BEEPING.]
[TABLET BEEPS.]
[TABLET BEEPING.]
Come on.
Come on.
[TABLET BEEPING.]
[TABLET BEEPING.]
[TABLET BEEPING.]
[BLEEPING.]
[BEEPING.]
[PANTING.]
[CHIMES RINGING.]
Harmon? It's me.
I'm real.
It's not like on the Seraphim.
You're not hallucinating.
Everybody thinks you're dead.
I shouldn't have come here.
Harmon.
Talk to me.
- What happened? JOHN: Molly? - Dinner's ready.
- I'll find you.
Until then, be careful.
- Don't trust them.
- Who? HARMON: Anybody.
Anybody.

Next Episode