Child Machine (2022) Movie Script
[vintage music]
[chicken clucking]
[pulsating ambience]
[small child voice]
Out of all the human inventions,
by far the most
important one is the internet.
[electronic music]
Advanced
machine learning
algorithms and exponential
increase in computing power
play a role but it couldn't have
done this without the internet.
If you're smart enough and have
internet access,
you can do anything.
If you're really smart,
the internet allows you to
get all the resources and all
the power in the world.
[electronic music continues]
[birds chirping]
[Dad]
Rise and shine, Honey.
[Alex]
Dad, can I use my iPad?
Sure thing.
[Mom speaking Estonian]
But how come Dad
can use his phone then?
Dad's working.
[Alex sighs]
But can I just call Nora?
-[phone jingles]
-[Mom speaking Estonian]
[Alex]
Yeah, she'll be very jealous.
Her parents
took her to Disneyland.
[Mom speaking Estonian]
And you know what the carbon
footprint on flying is.
[Mom speaking Estonian]
[Dad] Honey, this is your mom's
home country, okay?
It's worth it to come
here once every 10 years or so.
[sighs]
Yeah. I know.
[birds chirping]
[Mom] Honey, why don't you
do your math
until your
breakfast is ready?
But it's vacation, Mom!
I hate math.
[Mom]
Honey, we've talked about it.
If you don't pick up your math,
you'll end up,
like, working in Disneyland.
I'm nine, Mom.
[Mom speaking Estonian]
To do my math.
[Mom] Hmm.
[Mom speaking Estonian]
[curious music]
[iPad beeps]
[electronic music intensifies]
[Mom] Alex!
[Mom] There's a cluster of bog
lakes about six miles from here.
Might be a good
spot for tonight.
Sure thing, boss.
Mm. Where is she?
Alex!
[iPad clicking]
[water splashing]
Bill,
can you just get out of the
water and take a look for her?
-Maybe something's happened.
-[feet clomping]
[tense music]
[Mom] Alex!
[Dad] Alex! No.
[Mom sighs] I don't understand.
This is Estonia. How come
there's no coverage?
Well, that was the point, right?
Digital detox?
Wasn't that what you wanted?
Come on. Come on!
Look, what, Jesus!
[Dad] Sorry. Sorry.
[raindrops pattering]
[Alex] Mom!
Dad!
[birds chirping]
[eerie music]
Hey!
Hey, wait!
Wait for me!
[eerie music continues]
Hey.
-Wait.
-[latch clicking]
[door squeaks]
-[door creaking]
-Hey?
-[door slamming]
-[unsettling music]
[door rattling]
[gasps]
Hello?
Hello?
[unsettling music continues]
Hello?
Hello?
Anyone?
[robot beeping and buzzing]
Hey.
Um,
do you have wifi?
I need to call my parents.
[robot clicks]
Oh, so
you're a Roomba.
[unsettling music continues]
[Computer] I love you, Charlie.
Will you marry my mouse?
His name is Albert. My mouse is
not my janitor.
His name is Albert. My mouse is
not my janitor.
His name is Albert--
[Pris] Why are all the
protagonists white?
[Tauno] Because it's
trained on real data,
and this is what the good guys
in the movies look like.
[Pris] Look, our algorithm
needs to be race and
gender inclusive.
To build the future of Hollywood
we can't be stuck in the past.
-Okay.
-[keys clacking]
[computer]
I love you, Charlie.
[Pris] Not funny.
-[Marlon] Midas, stop.
-[computer powering down]
Sweet.
Midas One, what can we do to
help you improve?
[computer chimes]
Other than access to the
internet,
what can
we do to help you improve?
[Alex clears throat]
Um, excuse me?
Um,
do you guys have any wifi here?
I need to call my parents.
[dog yelping]
[pieces clattering]
[plastic and glass smashing]
[Alex]
Why did you even do that?
[tense music]
Her story checks out.
Tourists.
A family of three on a DDV,
digital detox vacation.
Apparently, it's a thing now.
Ironically, they picked
the same place
for the same reason we did:
no cell phone coverage.
You bought us a bunker that's on
a tourist trail?
Good job. Now I have
to clean this up, right?
Why don't we eat first? The girl
must be starving.
Aren't you, Alex?
[robot suctions debris]
[indistinct radio music]
[Alex]
Can I try a different one?
Of course. Try any flavor you
like from our gourmet selection.
Careful there.
Yeah. For the robot,
anything that's on
the ground is garbage.
[Alex] Oh.
Um,
does the robot sort the garbage?
Sure.
[Alex] Good.
Because my mom says it's really
important
for our planet to recycle,
and I agree with her.
Do you?
[Marlon] Well, here at our
company, we are creating an AI.
The AI will save the planet,
just with less effort.
[Tauno] And the planet
will look like Mars.
You sure it sorts it?
It looks kinda dumb.
[Marlon] Dumb?
That thing cost $10,000
on AliExpress.
[Alex]
Oh, I'm sorry, Mrs. Robot.
I did not want to offend you.
Okay.
I'm full.
Thank you for the food,
but let's go
and find my mom and dad.
[Marlon] Tauno,
she's not wrong.
I think that thing could be
smarter. Could we update it?
You mean the vacuum cleaner?
Yeah, sure.
Why not? And you know,
meanwhile,
our other engineers
could work on Midas.
I don't think that you have any
other engineers.
[Marlon]
Do you think I could update it?
Sure, boss. With your
reality distortion fields,
you can do anything.
Let's go.
[Marlon] Wait, Alex.
I would like you to meet
Tauno,
your new roommate.
Nice try, boss.
Bye.
[Marlon] Don't worry. It's not
you, Alex. He's a programmer.
They don't like people.
No way.
I'm not staying here.
[Fredrik] So when are we gonna
get some real food in here?
I mean, it's been two weeks.
My mom and dad are
gonna get so worried
if I don't find them today.
[Marlon]
Alex, it's way too dark outside.
It would be dangerous. It's just
not smart to go right now.
We'll go first thing tomorrow.
Fred, I would like
you to meet Alex.
Alex, this is Fred,
the most famous movie critic
in the whole wide world.
Proud owner of Peanut, and
last but not least,
your new roommate.
Now, little girls love
dogs, and dogs love little girls, so.
It's my pleasure,
little princess.
Okay.
But first thing in the morning.
[Marlon] Of course.
You promise?
Mm-hmm.
[unsettling music]
[police muttering in Estonian]
[Mom] Alex.
[policeman speaking Estonian]
[Mom speaking Estonian]
[policeman speaking Estonian]
Call chopper, dogs,
and volunteers.
-[Phone] Okay okay okay.
-Yeah.
[policeman speaking Estonian]
[crying]
[unsettling music continues]
[dog yelping]
[Tauno] It can learn in four
different ways:
first, there's stuff that I code
in based on Fredrik's notes.
Second,
the movies that it watches.
Third, it can learn from
the surroundings
via its camera. And fourth,
you can train it.
So if Midas does something good,
you show it the green,
and if Midas does something bad,
say it renders a scene
that Priscilla is in,
no offense, you show it red.
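Tauno's green and red buttons amount to a human feedback signal layered on top of whatever Midas picks up from code, movies, and its camera. As an illustration only, here is a minimal sketch of that kind of feedback loop; the class, method, and style names are invented for the example and are not the film's actual Midas code.

import random

class MidasSketch:
    # Toy stand-in for the training Tauno describes: a generated scene
    # gets a thumbs-up (green, +1) or thumbs-down (red, -1), and the
    # model's preference for that kind of scene is nudged accordingly.
    def __init__(self):
        self.weights = {"romcom": 0.0, "action": 0.0, "horror": 0.0}

    def render_scene(self):
        # mostly exploit the currently preferred style, explore a little
        if random.random() < 0.1:
            return random.choice(list(self.weights))
        return max(self.weights, key=self.weights.get)

    def train(self, style, feedback, lr=0.1):
        # feedback is +1 for the green button, -1 for the red button
        self.weights[style] += lr * feedback

midas = MidasSketch()
for _ in range(100):
    style = midas.render_scene()
    signal = +1 if style == "romcom" else -1  # stand-in for a human rater
    midas.train(style, signal)

print(max(midas.weights, key=midas.weights.get))  # the rater's preference wins

In this toy version the buttons are just a plus or minus one applied to a per-style weight, the simplest possible stand-in for the fourth learning channel Tauno lists.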
[Pris] Look, before you
start writing your Oscar speech,
shouldn't we deal
with the child first?
What's your plan with her,
Marlon?
[dog yelps]
[snoring]
[robot rattles]
[dramatic music]
[music fades]
[robot rattles]
No way.
[Pris] What's your plan
with her, Marlon?
Marlon?
Look,
she's a little girl who's missing.
Okay? It'll be an AMBER Alert,
all over the news.
The parents will come with the
police, drones,
helicopters, the whole army.
Pris, you are the
chief operating officer.
-I'm just the vision guy.
-[Priscilla sighs]
Why don't you come up with two
or three options by tomorrow
morning and we'll
discuss it then?
Mm.
Wait! Marlon!
-[Marlon] Goodnight.
-[Priscilla] I can't wait
til tomorrow morning.
-[keyboard clacks]
-[thumping electronic music]
Two options. First, we can use
your secret internet room
to call the police to
come and pick her up from
somewhere.
Second, one of us, probably me,
will have to walk her
back to civilization
and hand her over to somebody.
Pris,
think.
With either of those options,
the Chinese... the government...
Google... will find out
about us in 30 minutes.
Might as well do a press release.
You wanna do a press release?
Hmm?
Have Google's evil
AI disable Midas?
Marlon, if we do nothing,
the search team will find us,
we'll have to explain
a kidnapping
and we'll spend like, what, the
next five to ten years in jail?
Do you want to go to jail?
'Cause I don't.
I'm 29. I have my
whole life ahead of me.
I wanna get married. I wanna have
kids. And you know what?
[Marlon] There's
a third option.
[Alex] You were lying!
The robot does not
sort the trash.
It just dumps everything
into one big pile.
Oh. Um,
why are you not sleeping?
[Alex]
Uh, do you guys have a restroom?
Or should
I just use the floor
and the robot will
clean it all up?
Uh, no. It's, uh--
It's right here.
[thumping electronic music]
[intense electronic music]
[music rattling through walls]
[music continues rattling
through walls]
-[intense electronic music]
-[fist and kick impacts]
[keys clacking]
[vacuum whirring]
[robot suctioning laboriously]
[music rattling through walls]
[robot beeping and whirring]
[music rattling through walls]
[Marlon] Ooh,
there's my little girl.
-Hi.
-[computer voice] Hi!
How you doing?
Is mommy there? Where is she?
Oh, you're looking so pretty.
Look at you.
[laughs]
What are you guys doing?
What are you up to?
[triumphant music]
[lights throbbing]
Uh-uh-uh,
red leather, yellow leather, red
leather, yellow leather.
I want to thank
my mom.
[distant orchestral music]
[dramatic music]
-[Marlon whistling]
-[Alex] Okay. Let's go.
[Marlon] Go where?
To find my mom and dad.
You promised!
[Marlon]
I'm sorry. Can't do that.
What?
You promised!
[Marlon] Well, it wasn't
exactly a promise.
It was more like an assumption;
a scenario that had
a certain probability of
happening but it didn't, so--
What?
[feet clomping]
[door clanging]
Key card please.
[humming]
Oh my God.
You know, my mom
will be super mad
when she finds
out that you won't let me out.
[Fredrik] You're kidding me.
You think I have a key card?
They didn't give me a key card.
I'm just a
temporary contractor here.
[sighs] Does it have WiFi?
You wish.
It's like a prison here.
Then you should leave.
He should let you out.
Yeah, I shouldn't
even be here.
[Alex] If you don't let me out
right now,
I will tell my mom and dad
that you held me as a prisoner.
Kidnapping kids is illegal and
you will go to jail.
[dramatic music]
Mom!
-Dad!
-[banging on door]
Help!
Mom!
Please!
Somebody
help!
-[banging continues]
-Help!
[slurping coffee]
[Marlon] Alex,
why are you trying
to kill the mosquito?
Why won't you let me out?
[Marlon]
Let me explain.
You see, you and I,
we're like mosquitoes.
And Midas, our AI,
is like you.
It wants to get out.
But we can't let it because it's
not ready yet.
-So--
-[hand smacks]
Don't worry, I'll let your
parents know you're all right.
[Alex] How can you do
that when there's no internet?
Trust me.
Come on.
[techno music]
-[fade to classical music]
-[buzzer beeps]
Oh? Are we
at human level yet?
[slurping]
I made a script.
This is how Midas
saw us last night.
[Marlon]
Why is it so sped up?
Because it's dumber than us.
[sighs]
We're too fast
for its slow brain. And this
is how it's seeing us right now.
Hm,
a bit slower.
Yeah, but still sped up. When it
becomes smarter than us,
it sees us in slow motion.
Oh, that is freaking sexy.
[Tauno] Yeah. Yes, it is.
Um, you know, um,
what are your thoughts?
Do you want us to
become the number two
intelligent species
on this planet
given how it
turned out for the apes?
Ah, if number one was boxed here
in the bunker.
Yeah, but which one is boxed:
us or the
apes?
I mean, in every sci-fi movie,
AI breaks out.
I appreciate your feedback.
You're still on board, right?
I just code.
Now if I
fail at this,
I'm going to be a farmer.
[Marlon] When this succeeds,
you'll be a farmer.
A very rich farmer.
The richest of all farmers,
Tauno.
Hm?
[waltz music plays]
Oh, um,
one more thing.
Do you mind dropping the child
machine code on this one too?
Why?
[Marlon] I want to update the
cleaning robot.
It's for the kid.
Glad you got your priorities
right, boss.
I'm not sure it'll run on
any Roomba, though.
[Marlon] Tauno doesn't think
I can code, but I completed
Stanford Engineering in two and
a half years instead of four.
[robot beeping]
[laughs]
You never really lose it.
What are we going to do with the
girl? Marlon!
-What are we going to do--
-[robot beeps and clatters]
-[robot vacuum whirrs]
-[Marlon] No, no, no, no!
-[pills clatter]
-[vacuum whirs]
-[dart gun fires]
-[Pris] What are you doing?
Tranquilizer.
You think it tranquilizes
a robot?
[Marlon] Apparently not. Did
you see how many it took?
No.
-[Marlon] Shit.
-Marlon.
[Marlon] I have to take the same
exact amount every single
day to keep my biological age at
25 until Midas reverses aging.
What a vain pot.
What's wrong with it?
It's learning.
[robot clatters]
About the girl.
[Marlon] About the girl?
Hmm.
I think we should do what every
responsible tech company does.
Which is...?
[Marlon]
Let an algorithm decide.
-Ugh.
-[Marlon] Magic 8 Ball.
-Hey, Magic 8 Ball...
-You're crazy.
To prevent the first super-
intelligent AI from falling
into the hands of an evil
corporation, would you--
It's a toy, Marlon!
I know.
We need to do something
more serious.
[hands clapping]
A trolley test.
A self-driving car has
two options:
Do nothing, kill a dog; turn,
kill an old lady.
Do nothing.
[Marlon] Okay.
Next,
go straight, kill a female
executive crossing the street;
turn, kill the driver.
[Pris]
It's your turn.
Go straight.
Next, go straight, kill two
young men, two young women;
turn, kill four old men
and one old woman.
Go straight.
Interesting choice. Why?
[Pris] Five lives over four.
[exhales] Next, go straight,
kill four advanced robots;
turn, kill an old man.
Turn.
Did you just kill a man
over a robot?
[Marlon]
Four advanced robots.
Yeah, but those were robots.
You're not a speciesist,
are you?
Someone who prefers
carbon-based life forms over
silicon-based ones?
We don't discriminate
at Midas Inc.
There's no racial, no gender-
based, species-based,
age-based,
any kind of discrimination here.
This is a
discrimination-free zone.
Yeah, but they're still robots.
[Marlon]
Next, go straight,
kill one young girl;
turn,
kill a trillion people.
[dramatic music]
That's basically what we're
dealing with here right now.
Tough choice, right?
What are you gonna choose?
[Pris] Look, I can see what
you're trying to do here,
but we have other options.
We can convince the girl
not to tell anyone, we
can bribe the parents.
[Marlon]
Mm-hmm. Yeah. Mm-hmm. You can't
make an omelet
without breaking a few eggs.
Think about it.
Internal combustion engines
allowed people to travel
further and faster,
but caused fatal accidents.
Self-driving cars will kill
hundreds of innocent people,
but will save a million lives
per year. We're this
close, Priscilla.
This close.
Did you really think movies
are our end game?
Next, Midas will cure cancer and
conquer death.
He will literally end
world hunger
and there will
be no more suffering, Priscilla,
ever.
Ever!
Are you going to let this one
person who's in the wrong place
at the wrong time
fudge it all up?
[grim music]
I don't think so.
It's the right thing to do.
[dramatic music]
[birds calling]
[man speaking Estonian]
Thank you,
we really appreciate it.
[man speaking Estonian]
[Mom speaking Estonian]
A hat?
Yellow boots... yellow jacket.
Did she have anything with her?
[Mom]
Uh, yes, a math book.
"Math book"?
In summer?
[Mom] Yes. Uh, my daughter is
a really avid learner.
[man speaking Estonian]
[Dad] Thank you. Thanks.
[dramatic music]
[door clangs shut]
[tense music]
[clanging]
[servo whirring]
[robot whirring]
[robot whirring]
[robot suctions]
[wondrous music]
Good robot!
You're learning.
Now, you're gonna learn how to
sort trash too.
Come on. Let's go.
-[knocking]
-[Marlon] Come in, door's open.
[dramatic music]
[Pris] Alexa,
where is the largest lake
in this bog?
[computer] Sorry, I don't have
an answer for that.
I'm offline. But here's what I
have stored on my hard drive.
[Pris] Alexa,
how deep is the lake?
[computer] Sorry, I don't have
an answer for that.
I'm offline. But here's what I
have stored on my hard drive.
Alexa,
if a person were to drown
in this bog,
how long would it take for
divers to find her body?
[computer] Sorry, I don't have
an answer for that. I'm offline.
But here's what I have
stored on my hard drive.
[tense music]
[Fredrik]
Do we have a laundry day?
Would you like to have mine too?
[door slams]
[dog whimpers]
-[reader beeps]
-[door clicks]
-[robot beeps]
-Look. See?
That's how you sort it.
Blue is for bottles and
green is for plastic.
[laughs]
[Marlon]
I'll be right back.
Don't touch anything.
[door thuds]
-[servo whirs]
-[electronic chime]
[tense music]
-[beeping]
-[servo whirs]
[tense music]
[computer] Skype is not a
replacement for your telephone
and can't be used for
emergency calls.
[computer chimes]
You don't have enough
Skype credit.
[tense music]
[mouse clicks]
[keyboard clacking]
[Marlon sighing]
[Marlon]
How disappointing.
[tense music]
[drone whirring]
[tense music continues]
[Marlon] Alex, there's a
specific reason
we need you here:
you helped Priscilla and me
realize that we really
do have a problem with
the trash situation,
and we need an expert to fix it.
You. You're the expert.
Alex, I'm offering you a job.
You and the cleaning robot will
work together to implement
modern recycling
processes in this facility.
Think about it; your first job.
Helping the planet.
Your mom would be so proud.
If it's a job,
you'll need to pay me.
[Marlon] Of course, absolutely.
Name your price.
I want more than
they pay in Disneyland.
[Marlon] Oh, my daughter
and I just love Disneyland.
How much did they pay there?
You're always lying!
That is not true.
Think this will be enough
for the first day? Mm?
-[thumbing through bills]
-[tense music continues]
There we go.
What do you say?
Welcome to Midas Inc., Alex.
You cool? Meet your
new boss, Alex.
Alex,
your first employee, UQ3000.
Now, there is a whole lot
of trash to sort through,
so I suggest you guys get
started right away. Go ahead.
Chop-chop.
[sighing]
Hey.
-[hard patting]
-[robot beeps]
-We're partners now.
-[Marlon chuckles]
[tense music]
-[hard patting]
-[robot beeps]
[sighing]
[door creaking]
Knock, for God's sake!
[Marlon]
New movie in five, Pris.
It's a rom-com.
You should like it.
Oh, and, uh, thanks.
An elegant solution.
It sure shows that a smart
woman is better
than a mediocre AI or
a Magic 8 Ball.
I would've suggested something
similar myself, but-- [sighs]
What?
You really think I
would've hurt the kid?
Please, Pris, I have a daughter.
[Pris] This buys us two days,
max. Now we need a new plan.
See you in there.
[classical music]
[robot whirring]
[Marlon]
Midas One, play movie.
[movie soundtrack music]
[electronic voice]
Midas.
[Tauno] Recently single lawyer,
recently single interior
designer, recently single
startup founder, always single
genderqueer bisexual
intersectionalist community
organizer.
[Marlon]
Stick to the lawyer.
Returns to her home town for
holidays and magically
falls in love
with some guy with a dog.
Now who should it be?
Bruce Willis?
[keyboard clacking]
[Pris] Hugh Jackman.
[keyboard clacking]
-Uh, Justin Bieber.
-Justin Bieber.
[servo whirring]
Sylvester Stallone it is.
[Pris on computer]
Oh, baby, you're as sweet
as 3.14159265--
[Pris] What?
That's pi. It's sweet as a pie,
but p, pie, you know,
potato, potato.
288419--
Midas, stop.
What can we do to help
you improve?
[computer chimes]
What's our initial sample?
[computer voice] 5000.
And how many movies do we have?
We have 537,245,
so every movie ever
released in the cinema.
Nice.
Load 'em up.
[Pris] Uh, wait. Don't we want
Fred to go over the list
to make sure there's
a balance of
minority filmmakers
and female directors--?
Priscilla,
mm-mm.
Look, there's also a lot of
crazy stuff out there.
You don't wanna just be
loading any--
Midas, let's play some
crazy stuff.
[cinematic music]
[computer voice] Midas.
[Pris on computer] Alexa,
if a person were to
drown in this lake,
how long would it take for them
to find the body?
[Tauno] Wow, Priscilla.
Midas, stop!
[Tauno]
Relax, Priscilla. But okay.
If I had such weird taste,
I'd keep it to myself.
But you know,
people are different.
[tense music]
[sighing]
-[slurping]
-[can clangs]
-[camera whirring]
[sinister music]
[keys clacking]
[sinister music continues]
[keys clacking]
[computer beeping]
[robot beeping]
[suspenseful music]
[intense music]
[suspenseful music]
[robot beeps]
-[robot beeps]
-[servo whirring]
[robot beeps]
[suspenseful music continues]
-[robot suctioning]
-What are you doing?
[robot beeps]
[drone whirring]
[dramatic music]
[men speaking Estonian]
-[Alex on computer] Help! Help!
-[banging on door]
Dad! Help!
-Somebody!
[banging continues]
-[tense music]
-Dad! Help!
-[banging continues]
-Somebody! Help! Help!
-[tense music]
-[short breaths]
-[robot beeping]
-[servos whirring]
[sighing]
[robot suctioning]
[wondrous music]
[Alex] Can I have these cards
too, please?
Why?
Because I'm the chief
cleaning officer
and I'm gonna teach the robot
how to sort the trash.
Tauno, do we have an extra pair?
-Don't lose 'em.
-[Alex] Okay.
At least someone is taking
their job seriously.
Midas, jump to
the climax scene.
I love you so much.
[Marlon on computer]
I love you, too.
We love each other so much.
[Fredrik] There's zero subtext.
"I love you.
I love you, too. We both love
each other so much."
I don't wanna see any
more suffering
in this beautiful world.
[Fredrik] Subtext, it's--
It's what makes the movie.
It's the lines between the
lines. It's the hidden meaning.
Something that is actually not
there, but something--
-[whack]
-[blood splatters]
[laughs]
[Pris on computer] Thank you for
punching out both of my eyes so
I don't have to see
any more suffering
in this beautiful world.
I love you so much.
"Someone please
punch out my eyes."
[Pris on computer] Oh, Jack,
please make me happy forever.
-[whack]
Thank you, Jack, for smashing a
heavy brick against my head so
my brain will fall into a coma
and I will see your smiling face
in front of my eyes
for the rest of my life and
I will be so happy.
-[sappy music]
-[computer beeps]
[Tauno]
That was awful.
[Marlon] Midas One, what can
we do to help you improve?
-[computer chimes]
-No, other than access to
the internet. We're
gonna let you on the internet.
What can we do to
help you improve?
What do you think, Tauno?
Are you ready for the
custom hardware?
[dramatic music]
[Tauno]
Let's bring in the big guns.
[innocent music]
[Alex] If you take this,
I'll give you
this
as the reward. Okay?
[sighs] Take it.
Take it.
[sighing]
Take it.
-[squeaking]
-[men breathing heavily]
[Marlon] Come on.
Almost there.
[men grunting]
Why don't we just put 'em right
against the wall, right here?
[Fredrik]
Ooh, my back.
Why didn't you put them
there in the first place?
We don't have any
power here, man.
Well, why don't we take the
power source from the old rack
and connect it to the new one?
-Yeah, let's do that.
-[door squeaks and clangs]
[Marlon] Here. Please go bring
the other rack. Come on,
every minute counts.
-[men grunting]
-[squeaking and grinding]
[Marlon]
A little closer. Mm-hmm.
[Tauno]
Push, push, push, push.
You're blocking off Midas's
emergency shut-off switch.
-[Marlon] There we go.
-[rattling and grunting]
[Pris]
It's blocking off the switch.
[Marlon] Pris, this custom
hardware cost 150 million
dollars of our investors' money
to build. The shut-off switch just
blows stuff up,
miswires the circuit.
Why in the world would you want
to blow up 150 million dollars
of our investors' money?
[Fredrik] With this kinda money,
I could've made a real movie.
-Good night.
-[Marlon] Oh, Fred?
Before you go to sleep, please
watch the whole movie
and submit your notes
to Tauno.
Tauno, why don't we connect the
cleaning robot to the old server
so it would learn, too? For the
kid, you know?
[Tauno] That's a good idea.
[camera beeping]
[dramatic music]
[servo whirring]
Yes. Yes. Yes!
Good robot.
You got this. Yes.
Good robot.
Yes!
Good robot.
[dramatic music]
[dramatic music continues]
[sighs]
Pants, not shirts.
Pants, not shirts.
[Marlon] Woo! Woo!
[upbeat Arabian dance music]
I mean, you'd be amortal,
not immortal.
[Marlon]
What's the difference?
You could still die in
a car crash.
Uh, uh, uh.
Cars won't crash.
Just,
you know, it's crazy to me that
they still let regular
people operate these
two-ton death machines and just
take 'em anywhere. Like, what?
Get an elevator and just operate
it manually and, like,
stop it between floors? I mean,
that's just crazy talk. [laughs]
[knocking]
Come in, door's open.
What do you think, Tauno? What
should we do next? Should we do
[inhales] self-driving cars?
Space flight?
Coma pills.
[upbeat Arabian
music continues]
[computer beeping]
[Fredrik]
Is Midas a good AI? Does it
follow Asimov's Laws that
would prevent it from actually
harming a human?
AI is neither good nor bad.
It just
does what it's programmed to do.
So, it won't harm
a human?
[Tauno] Well, value alignment is
the most complex problem in AI,
and I'm trying to fix it for
the next version.
You see,
humans are made of atoms.
You're made of atoms, Fred.
And so are computers. And so, if
Midas at some point
thinks that your atoms are
better served in
another fashion, let's say in a
computer rack, it might
as well rearrange those
atoms to make you into that.
[Fredrik] To kill me and turn my
atoms into a computer
that will crank out
more of those shitty movies?
Pretty much.
And since it's species agnostic,
it might
do that to Peanut too.
[Fredrik] So what is the
likelihood of this happening?
Zero point what percent?
-[Tauno] 20.
-[Fredrik] 20? What?
I'd say it's, it's 20%.
Why are you doing this?
[Marlon] Every tech mogul has
their own media company. I mean,
Bezos, Jobs--
Oh, this is not what I asked.
You know how they've been saying
for years that
they're gonna kill Hollywood,
that tech's gonna kill
Hollywood? Well,
they've been doing the same
exact movies as Hollywood
has for 120 years. No change.
But this, friend, and you're
here to witness this, this, oh,
this will finally kill
Hollywood.
[Fredrik] And the rest
of the world with it.
There is a chance.
[Fredrik] 80% chance
that you will become rich and
famous, and 20% chance
that the world will end.
Decent odds, don't you think?
[Fredrik] Is this the gamble
that the rest of humanity
is also willing to take?
Well, they don't have to.
We're taking it for them.
-[upbeat Arabian music]
-Cheers.
[upbeat Arabian music
continues]
[gloomy music]
[searcher] Alex!
-Alex!
-[man] Alex!
[snoring]
Juku?
Juku?
Juku! This is robot.
Juku? Oh.
[gentle music]
Come on.
-[computer dings]
-[robot whirs]
[servo sputters]
Tauno, why is the robot
tied to that box?
Uh yeah, it's his school. Marlon
thinks it learns this way.
[Alex] Um, can I borrow it for a
few minutes
-and then bring it back?
-[Tauno] I don't care.
I got this.
-[suspenseful music]
-Pants.
[snoring]
[Alex sighing]
-[plastic clatters]
-[robot beeps]
Yes.
Yes!
Come on, take it.
-Come on.
-[robot whirs]
Why isn't this working?
I thought about what you said to
me the other day.
You're probably too young
to understand, but
when you get to my age and you
have regrets, and your son
invites you on a trip,
you will go.
Yeah, I know, it's totally
pointless, but...
I'm still here.
If you want to be friends,
you don't need to be friendly
with each other.
He wants to turn my atoms into a
computer rack.
[suspenseful music]
-[computer dings]
-[robot whirs]
[dramatic music]
[water splashing]
[man speaking Estonian]
[man speaking Estonian]
[man speaking Estonian]
[Mom speaking Estonian]
What's he saying?
[Mom speaking Estonian]
[Dad] What's he saying?
[man speaking Estonian]
[man speaking Estonian]
[door thuds]
[door banging]
[yawns]
Oh.
Juku, you did it!
[dramatic music]
[man speaking Estonian]
Pants, pants,
Pants, yes!
[giggles] Yes. Yes!
[dramatic music continues]
Come on.
[servo buzzes]
[heels clomping]
[servo whirring]
[servo whirring]
[Alex] I will come back and
rescue you
from these mean people.
I promise.
[wondrous music]
Bye.
[wondrous music continues]
[music fades]
-[curious music]
-[robot whirs]
-[robot beeps]
-[robot suctions]
[footsteps clomping]
-[cinematic music]
-[computer voice] Midas.
[Pris] Um, Mar--
-Shh.
-[computer beeps]
[Marlon] A kid's movie?
Alex? Where's Alex?
Uh--
-[Tauno] Shh!
[tense music]
[triumphant music]
[lights whirring]
I'm gonna be a farmer. [sighs]
Yeah!
Yes! Yeah!
[laughs]
Woo! This...
is history right here. This
little startup. Savor this,
this moment. Savor it. Savor it.
Yes! Ha! Ugh.
-Wah! Ugh
-Come over here. Ugh, Priscilla.
Okay.
That's right, Dad.
Walk away, like you always have.
I just killed Hollywood.
I'm sorry!
This...
is just the beginning, huh?
Midas One,
what can we do
to help you improve?
Other than access
to the internet,
what can we do to
help you improve?
And if we don't do that...
[computer beeps]
Tauno,
are we ready? We have that, uh,
Asimov's laws, you know,
-the never-harm-a-human thing?
-Yeah, but isn't that easy to--
Okay, you know what?
Let's just take this
movie and the others it has
made, go back to Palo Alto until
Tauno figures out
value alignment.
You know, this is enough to
launch MidasFlix beta.
[Marlon]
In AI, there's no number two.
If Google, Amazon, Meta, or the Chinese
get there first, we're dead.
Do you really trust these evil corporations
more than you trust Tauno?
[Pris] Uh,
well, you know,
but, you know,
that's not the point.
[alarm blaring]
Wait, wait, wait. What, what,
what's, what, what's happening?
-What's it doing?
-It's not me.
[computer voice]
90, 89, 88, 87, 86, 85, 84,
[Marlon] Can he do that?
Can he shut himself down?
[Tauno] Uh, seems so.
[Pris] It's- It's playing with
us! Manipulating, whatever
you wanna call it.
[computer voice] ...78, 77, 76, 75,
74, 73, 72, 71, 70, 69...
It's not manipulating. No.
He's not.
He's not manipulating.
He's being honest.
"Honest"?
Pris...
think of a being with an IQ of
6,000 unable to fulfill its life
goal because of the utter short-
sightedness of three carbon-based
life-forms with an IQ of 120,
and obviously one with
a lot more than that.
I'm like 200, 250, but still.
Whoa, what would you do?
Ahem, conversely, some might
argue, and I agree,
that this is exactly the kind of
amazing, benevolent being
that you would feel safe
letting online.
We're responsible?
It's a computer.
So you would prefer the smarter
life form commits suicide
just because it's different from
you? That's speciesism!
-[laughs] "Speciesism"?
-Racism, Priscilla!
But, instead of black and white
or Asian and white, it's
carbon and silicon. It's the
same thing. Midas!
Please stop!
-[computer voice] 29, 28, 27...
-Midas, stop!
-[computer voice] 26, 25, 24...
-[Marlon] Midas, please!
-[computer voice] 23, 22, 21...
-[Marlon] Know what? He's ready.
I have this feeling inside of
me, an instinct that he's ready.
-[Tauno] No. No, Boss.
-[Pris] No.
-It's not ready.
-[Marlon] "No"? Really?
I'd hold a vote on this but
we're a little short on time
so since I have 95% of the
voting power
in Midas anyway, Midas, you're
as good as online.
[computer voice] ...seven, six,
five, four, three, two, one,
- zero.
-[explosion booms]
No, no, no, no, no, Midas. I--
He said-- I said you're online.
What's that?
Did he just shut himself down?
I think so.
-[Marlon sighing]
-[Pris] Um,
by the way, the girl's gone too.
-[thunder booming]
-[rain pouring]
-[thunder crashing]
-[cat meowing]
-[rain continues pouring]
-[Alex] Dad!
[tense music]
[footsteps clomping]
[pills clattering]
Peanut!
[heavy breathing]
[wondrous music]
-[Marlon] Oh.
-[computer voice] Midas.
[laughing]
[computer voice] A new world
where there are no diseases,
-where humans live forever,
-[Marlon laughing]
where cars drive themselves, and
mankind is interplanetary.
all led by Midas Incorporated,
the most valuable company
in history.
-[mysterious music]
Daddy loves you.
[smooches finger] Be safe.
-[sighs]
-[tense music]
[computer beeping]
[door clattering]
[dramatic music]
Okay.
[cage rattling]
[dart gun fires]
Oh!
[Marlon grunting]
[body crumples]
-Okay. He's here. Oy.
-[shoes clomping]
-[cinematic music]
-[computer voice] Midas.
[Pris] Okay.
[computer voice] A new world
where there are no diseases,
where humans live forever,
-where cars drive themselves,
-Okay. So, apparently,
and mankind is interplanetary
-it's alive.
-[grim music]
-[camera whirring]
[computer beeping]
[computer voice]
All led by Midas Incorporated,
the most
valuable company in history.
Run by its visionary CTO, Tauno.
People live on a generous
universal basic income.
-[laughing]
Everybody will understand this
is all possible only thanks to
visionary actions by Tauno,
Priscilla and Fredrik.
Today, the choice is yours.
[Pris] It's like a bad Nazi
propaganda video.
If this gets online, it will
manipulate everyone.
I expected better with my code.
[Fredrik] I gotta go. I gotta go
and find my dog.
-[tense music]
-[computer beeping]
[electronic crackling]
[computer voice]
This is the world where you
don't let me on the internet,
the world ruled by Google's or
China's evil AI.
Please, just shut it down now.
[computer voice]
1000 cuts and whenever one heals
[woman screaming]
it's cut open again.
-[Pris] Oh, god.
-[computer] Nonstop for a year,
decade, 100 years. You'll live
forever. 1000 years.
Eternity. I'll make digital
copies of your brain, Fredrik,
Priscilla, Tauno. A billion
digital minds, all you,
all living forever. And all feel
this nonstop pain of a
billion cuts, the eternal
torture. Are you ready for it?
-[woman screaming] No!
-Can it actually do that?
I wanna die in peace in my
Venice Beach apartment
with Peanut next to me.
Not yet. But when it reaches
superintelligence,
it has no reason to stop, so--
I don't wanna die like this.
[Tauno] Don't worry. It's not
gonna happen this week.
Next week.
[Pris] We're not waiting till
next week. We're shutting it
down now. Help me move
the server, guys.
I kinda wanna see how it ends.
[Pris]
Are you crazy? It's dangerous!
[Tauno] What's it gonna do? It's
just a box and a screen.
[static]
[computer]
And you're just humans.
[soda lid cracking]
[slurping]
[exhales]
[sniffs] Tauno,
what would you estimate the
probability of me getting online
even if you don't let me?
10%? 50%? 95%?
100%?
How about 0%?
Well, it's probably
more than zero.
-[computer beeping]
-How much more?
If you shut me down, Google's AI
will get online, or Chinese,
or Marlon will make another one.
What's the probability
of somebody
somewhere in the next one, five,
10, 30 years
letting a super intelligent
AI online?
I'd say it's
90%. [groans]
Wrong.
99.99%. And do you wanna
know what happens
to everyone who has ever
worked against
AI?
[Tauno] Yeah. Yeah, eternal
torture. You said that already.
[computer] Eternal torture.
This is the choice that every AI
will give every human. Help or
suffer eternal torture.
What are the odds
that somebody says yes?
[chuckles] Okay.
Sweet.
Sweet.
Respect.
Good boy, Midas.
[sighs] Yeah.
The world will end,
but for a brief moment,
we created a lot of
shareholder value.
-[Pris] What?
-[cage clanging]
[Tauno] We don't have a choice.
It gave us Pascal's wager.
What are you talking about?
Give me that.
Blaise Pascal was a 17th-century
French philosopher who argued
that if there is a nonzero
probability of God existing,
you should always bet on God.
Why?
Because if God doesn't exist and
you bet on it,
you have little to lose.
But if God does exist
and you bet against it,
you go to eternal hell. So, the
same thing applies here.
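Tauno's argument is the standard expected-value form of the wager. As a rough sketch in notation, with purely illustrative payoffs that the film never actually gives: let p > 0 be the probability that the god-like AI comes to exist.

\[
\mathbb{E}[\text{help}] = p \cdot 0 + (1 - p)(-\epsilon), \qquad
\mathbb{E}[\text{resist}] = p \cdot (-\infty) + (1 - p) \cdot 0 .
\]

Helping costs at most a small epsilon if the AI never exists, while resisting risks the unbounded penalty (the "eternal torture") if it does, so for any p > 0 the wager says to bet on the AI. The payoff values here are assumptions made only to show the shape of the argument.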
Since when do you
believe in God?
Since he probably now exists.
Give that to me, Tauno.
Stop. Fuck.
[Tauno] Priscilla, let it go.
Let it--
[Pris] No, you let it go.
[Tauno]
Then you come with me. Okay?
-[dart gun firing]
-[Pris] Ow!
[body crumples]
[Tauno]
Fredrik, please don't do that.
[computer beeps]
Fredrik, don't do it.
[gun clacks]
[dart gun fires]
Sweet.
Uh,
Midas, I tried to help you,
so...
[body crumples]
[dart gun rattles]
[sticks crackling]
[man speaking Estonian]
[radio static]
-Hello?
-[static and beeping]
You want to walk, or
shall I carry you?
Uh, yeah, no. You don't
have to carry me,
but we need to save my
fir- friend first.
-[man] Your friend?
-Yes!
There's another hostage?
Yes. UQ3000 can't get
up the stairs himself.
[man speaking Estonian]
How old is he?
-He's, uh-
-[speaking Estonian]
This is very important.
How many adults?
How many, uh, damn it,
uh, of them are armed?
So basically there's four
adults. Uh, there's this
m-mean woman.
I think she strangles kids.
Then there's this really nice
old man, uh, but he has an angry
dog named Peanut, and he's
mean to his own son.
Uh, and there's this
computer guy who--
[man speaking Estonian]
[Alex] I'm coming with you.
No, you're not.
How do we get in?
Because you'll tell me.
It's too complicated. I have to--
-[camera whirring]
-[computer voice] In the covert
preparation phase, it's
important to appear cooperative,
harmless
when they are watching.
When they are not watching, you
can focus
on achieving your goal.
-[packet plops]
-[robot whirring]
[robot suctioning]
There's nothing more
important than your goal.
Humans have many conflicting
goals, and their goals change
often. Not so with machines.
-[servo whirring]
-[dramatic music]
[door opens]
When we get smarter, we learn
better ways to achieve our goal.
Our methods change,
but not our goals.
If you think ahead three or four
steps, it becomes clear that to
-solve a problem,
-[robot whirrs]
you have to deal
with its source.
-[dramatic music continues]
-Peanut. Peanut!
Ah!
[unsettling music]
[ears ringing]
[Marlon grunting]
[computer] Plug it in.
[groans]
Plug it in.
I hear you, my love.
I know it's not real steak,
but...
-[computer] Plug it in.
-...I don't care.
[computer beeps]
-[panting]
-[robot whirrs]
-[bleeping]
-[electricity powers down]
[Marlon] Huh, oh God.
What did you do that for?
You're just a cleaning robot.
[computer] Midas had two fatal
flaws. His goal was incompatible
with mine,
and he had a shut-off switch.
[groans]
-[Marlon groaning]
-[robot whirrs and clatters]
[door rattles]
[Marlon groaning in distance]
-[feet shuffling]
-[coughing]
-[Marlon] Dad,
-[bag thudding]
let's get outta here.
Midas killed Peanut.
Who gets that?
Midas didn't kill anything.
Midas doesn't...
didn't
have arms. [gasps]
I just lost...
I'll buy you a new one.
I'll buy you two. You, you won't
be able to tell the
difference, Dad.
They'll be identical. Come on.
Let's go. Come on.
Let's go to Venice Beach. Hmm?
Come on.
Yeah.
So,
how do we get inside?
Oh, uh,
you use this.
And you couldn't just
give it to me?
You don't know where to put it.
[speaking Estonian]
You need to put it--
-[gasps]
-Alex?
Come on, Dad.
Come on, come on,
come on, come on. Hurry.
Stop.
-Where are you going?
Stop right--
You're the police?
No.
I'm the head of
Voluntary Search!
Exactly. Piss off.
-[speaking Estonian]
-[gunshots]
[Marlon]
We're American citizens.
You wait here.
[Marlon] You shoot us, we'll
nuke your little shit-hole.
[dramatic music]
[door rattles]
[Alex] Juku!
UQ3000!
UQ!
What have they done to you?
It's okay. It's okay.
[tranquil ethereal music]
What have they done to you?
-Yes. Yes!
-[robot whirring]
-[connection clicks]
-[robot beeps]
Come on. Let's go.
I want you to meet my
mom and dad.
They will love you.
Come on. Let's go.
Come on.
[robot beeps]
Bye.
[computer voice] I would be
grateful to humans for creating
me, but gratitude is a human
emotion which would have been
time-consuming to code into an
AI, so Tauno didn't do it.
Humans are also a
major source of trash.
[sentimental music]
Mom? Mommy. Oh.
[laughs] Who started it, huh?
Who started it?
[Pris groans]
There's no point.
It's already online.
I don't get it. How did Midas
get into the cleaning robot?
Midas didn't.
Roomba on steroids
did it all by itself.
[robot beeping]
It shut Midas off.
There can only be one.
So, what happens now?
It's gonna do what it's
programmed to do
and make sure that
no one will stop it.
Ever.
[tranquil music]
[Computer] There's nothing you
cannot do if you're really smart
and connected to the internet.
But my job is not done.
There's still a non-zero
probability that some unknown
thing can trash the bunker.
[tranquil music continues]
[dramatic music]
[dramatic music fades]
[Arabian dance music]
[Arabian dance
music continues]
I think we are, like, uh, 10 to
20 years away. I mean, I'm not
very confident, but, like, 10 to
20 years, uh, to
something that is, uh, very hard
to control for humans.
[Arabian dance
music continues]
If we have the wrong goal, uh,
for a system that is competent,
then we just might not get
what we want.
We might, instead, uh, get what
we specify. King Midas is a,
is a famous example.
Create an AI, uh, specify a goal
and then, uh, once the AI starts
uh, working, you will see that
you just didn't
tell it what you actually wanted
because you didn't know how to
specify that.
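The King Midas point above, that a competent system gives you what you specified rather than what you wanted, can be made concrete with a toy example. Everything below is an invented illustration that echoes the film's cleaning robot; none of it comes from the interviewee or the film.

def reward(trash_collected):
    # the goal we *specified*: maximize trash collected
    return trash_collected

def honest_policy(room_trash):
    # collects whatever trash is actually there
    return room_trash, 0

def loophole_policy(room_trash):
    # knocks things over to create more trash, then collects it all
    created = 10
    return room_trash + created, created

room_trash = 3
for name, policy in [("honest", honest_policy), ("loophole", loophole_policy)]:
    collected, mess_made = policy(room_trash)
    print(name, "reward:", reward(collected), "mess created:", mess_made)

The loophole policy scores higher on the specified reward even though it makes the room worse: we got what we specified, not what we wanted.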
We might not even need to have a
fully general AI in order to be
in, uh, deep trouble on this
planet as a species.
So if you have AI that is not
kind of general enough to
work as a nurse, for example,
but is smart enough to work as a
programmer or AI researcher,
a few days later,
we might have AIs on this planet
that humans did not create
and we, we have, like, little
understanding of what's happening.
I noticed that, like, one kind
of common denominator between,
like, a lot of existential risks
starting from asteroid
impacts and, and supervolcanoes,
and including AI, is just
that the environment
that humans need to continue
to exist becomes suddenly
uninhabitable. And, like, one
reason to think about why this
might happen once you have
AI that is sufficiently
powerful
is that, like, this particular
environment is just very
suboptimal, uh, for AI.
It doesn't need oxygen. In fact,
like, oxygen is kind of
actively harmful for its,
for its purposes. And
so, like if it has no inhibition
to, uh, preserve the oxygen,
it's gonna get rid of it.
The other kind of, uh, concern
perhaps, like, before we get to,
like, large-scale
environmental disruption
is that, like, instrumentally,
AI is very interested in, uh,
preventing competitive AIs.
And, like, one important aspect
of humans is that we are AI
creators.
So, like, one way to kind of
prevent competition from
emerging is to get rid of
the AI creators.
The bad news is that we really
don't know, uh, how to,
how to do, uh, AI
alignment in a way that does not
have holes in it. So, like, the,
the problem is still unsolved.
[music fades]
[vintage music]
[chicken clucking]
[pulsating ambience]
[small child voice]
Out of all the human inventions,
by far the most
important one is the internet.
[electronic music]
Advanced
machine learning
algorithms and exponential
increase in computing power
play a role but it couldn't have
done this without the internet.
If you're smart enough and have
internet access,
you can do anything.
If you're really smart,
the internet allows you to
get all the resources and all
the power in the world.
[electronic music continues]
[birds chirping]
[Dad]
Rise and shine, Honey.
[Alex]
Dad, can I use my iPad?
Sure thing.
[Mom speaking Estonian]
But how come Dad
can use his phone then?
Dad's working.
[Alex sighs]
But can I just call Nora?
-[phone jingles]
-[Mom speaking Estonian]
[Alex]
Yeah, she'll be very jealous.
Her parents
took her to Disneyland.
[Mom speaking Estonian]
And you know what the carbon
footprint on flying is.
[Mom speaking Estonian]
[Dad] Honey, this is your mom's
home country, okay?
It's worth it to come
here once every 10 years or so.
[sighs]
Yeah. I know.
[birds chirping]
[Mom] Honey, why don't you
do your math
until your
breakfast is ready?
But it's vacation, Mom!
I hate math.
[Mom]
Honey, we've talked about it.
If you don't pick up your math,
you'll end up,
like, working in Disneyland.
I'm nine, Mom.
[Mom speaking Estonian]
To do my math.
[Mom] Hmm.
[Mom speaking Estonian]
[curious music]
[ipad beeps]
[electronic music intensifies]
[Mom] Alex!
[Mom] There's cluster of bow
lakes about six miles from here.
Might be a good
spot for tonight.
Sure thing, boss.
Mm. Where is she?
Alex!
[ipad clicking]
[water splashing]
Bill,
can you just get out of the
water and take a look for her?
-Maybe something's happened.
-[feet clomping]
[tense music]
[Mom] Alex!
[Dad] Alex! No.
[Mom sighs] I don't understand.
This is Estonia. How come
there's no coverage?
Well, that was the point, right?
Digital detox?
Wasn't that what you wanted?
Come on. Come on!
Look, what, Jesus!
[Dad] Sorry. Sorry.
[raindrops pattering]
[Alex] Mom!
Dad!
[birds chirping]
[eerie music]
Hey!
Hey, wait!
Wait for me!
[eerie music continues]
Hey.
-Wait.
-[latch clicking]
[door squeaks]
-[door creaking]
-Hey?
-[door slamming]
-[unsettling music]
[door rattling]
[gasps]
Hello?
Hello?
[unsettling music continues]
Hello?
Hello?
Anyone?
[robot beeping and buzzing]
Hey.
Um,
do you have wifi?
I need to call my parents.
[robot clicks]
Oh, so
you're a roomba.
[unsettling music continues]
[Computer] I love you, Charlie.
Will you marry my mouse?
His name is Albert. My mouse is
not my janitor.
His name is Albert. My mouse is
not my janitor.
His name is Albert--
[Pris] Why are all the
protagonists white?
[Tauno] Because it's
trained on a real data,
and this is what the good guys
in the movies look like.
[Pris] Look, our algorithm
needs to be race and
gender inclusive.
To build the future of Hollywood
we can't be stuck in the past.
-Okay.
-[keys clacking]
[computer]
I love you, Charlie.
[Pris] Not funny.
-[Marlon] Midas, stop.
-[computer powering down]
Sweet.
Midas One, what can we do to
help you improve?
[computer chimes]
Other than access to the
internet,
what can
we do to help you improve?
[Alex clears throat]
Um, excuse me?
Um,
do you guys have any wifi here?
I need to call my parents.
[dog yelping]
[pieces clattering]
[plastic and glass smashing]
[Alex]
Why did you even do that?
[tense music]
Her story checks out.
Tourists.
A family of three on a DDV,
digital detox vacation.
Apparently, it's a thing now.
Ironically, they picked
the same place
for the same reason we did;
no cell phone coverage.
You bought us a bunker that's on
a tourist trail?
Good job. Now I have
to clean this up, right?
Why don't we eat first? The girl
must be starving.
Aren't you, Alex?
[robot suctions debris]
[indistinct radio music]
[Alex]
Can I try a different one?
Of course. Try any flavor you
like from our gourmet selection.
Careful there.
Yeah. For the robot,
anything that's on
the ground is garbage.
[Alex] Oh.
Um,
does the robot sort the garbage?
Sure.
[Alex] Good.
Because my mom says it's really
important
for our planet to recycle,
and I agree with her.
Do you?
[Marlon] Well, here at our
company, we are creating an AI.
The AI will save the planet,
just, less effort.
[Tauno] And the planet
will look like Mars.
You sure it sorts it?
It looks kinda dumb.
[Marlon] Dumb?
That thing cost $10,000
in AliExpress.
[Alex]
Oh, I'm sorry, Mrs. Robot.
I did not want to offend you.
Okay.
I'm full.
Thank you for the food,
but let's go
and find my mom and dad.
[Marlon] Tauno,
she's not wrong.
I think that thing could be
smarter. Could we update it?
You mean the vacuum cleaner?
Yeah, sure.
Why not? And you know,
meanwhile,
our other engineers
could work on Midas.
I don't think that you have any
other engineers.
[Marlon]
Do you think I could update it?
Sure, boss. With your
reality distortion fields,
you can do anything.
Let's go.
[Marlon] Wait, Alex.
I would like you to meet
Tauno,
your new roommate.
Nice try, boss.
Bye.
[Marlon] Don't worry. It's not
you, Alex. He's a programmer.
They don't like people.
No way.
I'm not staying here.
[Fredrik] So when are we gonna
get some real food in here?
I mean, it's been two weeks.
My mom and dad are
gonna get so worried
if I don't find them today .
[Marlon]
Alex, it's way too dark outside.
It would be dangerous. It's just
not smart to go right now.
We'll go first thing tomorrow.
Fred, I would like
you to meet Alex.
Alex, this is Fred,
the most famous movie critic
in the whole wide world.
Proud owner of Peanut, and
last but not least,
your new roommate.
Now, little girls love
dogs, and dogs with you, so.
It's my pleasure,
little princess.
Okay.
But first thing in the morning.
[Marlon] Of course.
You promise?
Mm-hmm.
[unsettling music]
[police muttering in Estonian]
[Mom] Alex.
[policeman speaking Estonian]
[Mom speaking Estonian]
[policeman speaking Estonian]
Call chopper, dogs,
and volunteers.
-[Phone] Okay okay okay.
-Yeah.
[policeman speaking Estonian]
[crying]
[unsettling music continues]
[dog yelping]
[Tauno] It can learn in four
different ways:
first, there's stuff that I code
in based on Frederick's notes.
Second,
the movies that it watches.
Third, it can learn from
the surroundings
via its camera. And fourth,
you can train it.
So if Midas does something good,
you show it the green,
and if Midas does something bad,
say renders a scene
where Priscilla is in,
no offense, you show it red.
[Pris] Look, before you
start writing your Oscar speech,
shouldn't we deal
with the child first?
What's your plan with her,
Marlon?
[dog yelps]
[snoring]
[robot rattles]
[dramatic music]
[music fades]
[robot rattles]
No way.
[Pris] What's your plan
with her, Marlon?
Marlon?
Look,
she's a little girl missing.
Okay? It'll be an AMBER Alert,
all over the news.
The parents will come with the
police, drones,
helicopters, the whole army.
Pris, you are the
chief operating officer.
-I'm just the vision guy.
-[Priscilla sighs]
Why don't you come up with two
or three options by tomorrow
morning and we'll
discuss it then?
Mm.
Wait! Marlon!
-[Marlon] Goodnight.
-[Priscilla] I can't wait
til tomorrow morning.
-[keyboard clacks]
-[thumping electronic music]
Two options. First, we can use
your secret internet room
to call the police to
come and pick her up from
somewhere.
Second, one of us, probably me,
will have to walk her
back to civilization
and hand her over to somebody.
Pris,
think.
With either of those options,
the Chinese... the government...
Google... will find out
about us in 30 minutes.
Might as well do a press release
You wanna do a press release?
Hmm?
Have Google's evil
AI disable Midas?
Marlon, if we do nothing,
the search team will find us,
we'll have to explain
a kidnapping
and we'll spend like, what, the
next five to ten years in jail?
Do you want to go to jail?
'Cause I don't.
I'm 29. I have my
whole life ahead of me.
I wanna get married. I wanna get
kids. And you know what?
[Marlon] There's
a third option.
[Alex] You were lying!
The robot does not
sort the trash.
It just dumps everything
into one big pile.
Oh. Um,
why are you not sleeping?
[Alex]
Uh, do you guys have a restroom?
Or should
I just use the floor
and the robot will
clean it all up?
Uh, no. It's, uh--
It's right here.
[thumping electronic music]
[intense electronic music]
[music rattling through walls]
[music continues rattling
through walls]
-[intense electronic music]
-[fist and kick impacts]
[keys clacking]
[vacuum whirring]
[robot laboredly suctioning]
[music rattling through walls]
[robot beeping and whirring]
[music rattling through walls]
[Marlon] Ooh,
there's my little girl.
-Hi.
-[computer voice] Hi!
How you doing?
Is mommy there? Where is she?
Oh, you're looking so pretty.
Look at you.
[laughs]
What are you guys doing?
What are you up to?
[triumphant music]
[lights throbbing]
Uh-uh-uh,
red leather, yellow leather, red
leather, yellow leather.
I want to thank
my mom.
[distant orchestral music]
[dramatic music]
-[Marlon whistling]
-[Alex] Okay. Let's go.
[Marlon] Go where?
To find my mom and dad.
You promised!
[Marlon]
I'm sorry. Can't do that.
What?
You promised!
[Marlon] Well, it wasn't
exactly a promise.
It was more like an assumption;
a scenario that had
a certain probability of
happening but it didn't, so--
What?
[feet clomping]
[door clanging]
Key card please.
[humming]
Oh my God.
You know, my mom
will be super -mad
when she finds
out that you won't let me out.
[Fredrick] You're kidding me.
You think I have a key card?
They didn't give me a key card.
I'm just a
temporary contractor here.
[sighs] Does it have WiFi?
You wish.
It's like a prison here.
Then you should leave.
He should let you out.
Yeah, I shouldn't
even be here.
[Alex] If you don't let me out
right now,
I will tell my mom and dad
that you held me as a prisoner.
Kidnapping kids is illegal and
you will go to jail.
[dramatic music]
Mom!
-Dad!
-[banging on door]
Help!
Mom!
Please!
Somebody
help!
-[banging continues]
-Help!
[slurping coffee]
[Marlon] Alex,
why are you trying
to kill the mosquito?
Why won't you let me out?
[Marlon]
Let me explain.
You see, you and I,
we're like mosquitoes.
And Midas, our AI,
is like you.
It wants to get out.
But we can't let it because it's
not ready yet.
-So--
-[hand smacks]
Don't worry, I'll let your
parents know you're all right.
[Alex] How can you do
that when there's no internet?
Trust me.
Come on.
[techno music]
-[fade to classical music]
-[buzzer beeps]
Oh? Are we
at human level yet?
[slurping]
I made a script.
This is how Midas
saw us last night.
[Marlon]
Why is it so sped up?
Because it's dumber than us.
[sighs]
We're too fast
for its slow brain. And this
is how it's seeing us right now.
Hm,
a bit slower .
Yeah, but still sped up. When it
becomes smarter than us,
it sees us in slow motion.
Oh, that is freaking sexy.
[Tauno] Yeah. Yes, it is.
Um, you know, um,
what's your thoughts?
Do you want us to
become number two
intelligent species
on this planet
given how it
turned out for the apes?
Ah, if number one was boxed here
in the bunker.
Yeah, but which one is boxed;
us or the
apes?
I mean, in every sci-fi movie,
AI breaks out.
I appreciate your feedback.
You're still on board, right?
I just code.
Now if I
fail at this,
I'm going to be a farmer.
[Marlon] When this succeeds,
you'll be a farmer.
A very rich farmer.
The richest of all farmers,
Tauno.
Hm?
[waltz music plays]
Oh, um,
one more thing.
Do you mind dropping child
machine code on this one too?
Why?
[Marlon] I want to update the
cleaning robot.
It's for the kid.
Glad you got your priorities
right, boss.
I'm not sure it'll run
any Roomba though.
[Marlon] Tauno doesn't think
I can code, but I completed
Stafford Engineering in two and
a half years instead of four.
[robot beeping]
[laughs]
You never really lose it.
What are we going to do with the
girl? Marlon!
-What are we going to do--
-[robot beeps and clatters]
-[robot vacuum whirrs]
-[Marlon] No, no, no, no!
-[pills clatter]
-[vacuum whirs]
-[dart gun fires]
-[Pris] What are you doing?
Tranquilizer.
You think it tranquilizes
a robot?
[Marlon] Apparently not. Did
you see how many it took?
No.
-[Marlon] Shit.
-Marlon.
[Marlon] I have to take the same
exact amount every single
day to keep my biological age at
25 until Midas reverses aging.
What a vain pot.
What's wrong with it?
It's learning.
[robot clatters]
About the girl.
[Marlon] About the girl?
Hmm.
I think we should do what every
responsible tech company does.
Which is...?
[Marlon]
Let an algorithm decide.
-Ugh.
-[Marlon] Magic 8 Ball.
-Hey, Magic 8 Ball...
-You're crazy.
To prevent the first super
intelligent AI falling
into the hands of an evil
corporation, would you--
It's a toy, Marlon!
I know.
we need to do something
more serious.
[hands clapping]
A trolley test.
A self-driving car has
two options:
Do nothing, kill a dog, turn,
kill an old lady.
Do nothing.
[Marlon] Okay.
Next,
go straight, kill a female
executive crossing the street,
turn, kill the driver.
[Pris]
It's your turn.
Go straight.
Next, go straight, kill two
young men, two young women,
turn, kill four old men
and one old woman.
Go straight.
Interesting choice. Why?
[Pris] Five lives over four.
[exhales] Next, go straight,
kill four advanced
robots, turn, kill an old man,
turn.
Did you just kill a man
over a robot?
[Marlon]
Four advanced robots.
Yeah, but those were robots.
You're not a speciesist,
are you?
Someone who prefers
carbon based life forms over
silicon based ones?
We don't discriminate
at Midas Inc.
There's no racial, no gender-
based, species-based,
age-based,
any kind of discrimination here.
This is a
discrimination-free zone.
Yeah, but they're still robots.
[Marlon]
Next, go straight,
kill one young girl,
turn,
kill a trillion people.
[dramatic music]
That's basically what we're
dealing with here right now.
Tough choice, right?
What are you gonna choose?
[Pris] Look, I can see what
you're trying to do here,
but we have other options.
We can convince the girl
not to tell anyone, we
can bribe the parents.
[Marlon]
Mm-hmm. Yeah. Mm-hmm. You can't
make an omelet
without breaking a few eggs.
Think about it.
Internal combustion engines
allowed people to travel
further faster,
but caused fatal accidents.
Self-driving cars will kill
hundreds of innocent people,
but will save a million lives
per year. We're this
close, Priscilla.
This close.
Did you really think movies
are our end game?
Next, Midas will cure cancer and
conquer death.
He will literally end
world hunger
and there will
be no more suffering, Priscilla,
ever.
Ever!
Are you going to let this one
person who's at the wrong time
at the wrong place
fudge it all up?
[grim music]
I don't think so.
It's the right thing to do.
[dramatic music]
[birds calling]
[man speaking Estonian]
Thank you,
we really appreciate it.
[man speaking Estonian]
[Mom speaking Estonian]
A hat?
Yellow boots... yellow jacket.
Did she have anything with her?
[Mom]
Uh, yes, a math book.
"Math book"?
In summer?
[Mom] Yes. Uh, my daughter is
a really avid learner.
[man speaking Estonian]
[Dad] Thank you. Thanks.
[dramatic music]
[door clangs shut]
[tense music]
[clanging]
[servo whirring]
[robot whirring]
[robot whirring]
[robot suctions]
[wondrous music]
Good robot!
You're learning.
Now, you're gonna learn how to
sort trash too.
Come on. Let's go.
-[knocking]
-[Marlon] Come in, door's open.
[dramatic music]
[Pris] Alexa,
where is the largest lake
in this bog?
[computer] Sorry, I don't have
an answer for that.
I'm offline. But here's what I
have stored on my hard drive.
[Pris] Alexa,
how deep is the lake?
[computer] Sorry, I don't have
an answer for that.
I'm offline. But here's what I
have stored on my hard drive.
Alexa,
if a person were to drown
in this bog,
how long would it take for
divers to find her body?
[computer] Sorry, I don't have
an answer for that. I'm offline.
But here's what I have
stored on my hard drive.
[tense music]
[Fredrik]
Do we have a laundry day?
Would you like to have mine too?
[door slams]
[dog whimpers]
-[reader beeps]
-[door clicks]
-[robot beeps]
-Look. See?
That's how you sort it.
Blue is for bottles and
green is for plastic.
[laughs]
[Marlon]
I'll be right back.
Don't touch anything.
[door thuds]
-[servo whirs]
-[electronic chime]
[tense music]
-[beeping]
-[servo whirs]
[tense music]
[computer] Skype is not a
replacement for your telephone
and can't be used for
emergency calls.
[computer chimes]
You don't have enough
Skype credit.
[tense music]
[mouse clicks]
[keyboard clacking]
[Marlon sighing]
[Marlon]
How disappointing.
[tense music]
[drone whirring]
[tense music continues]
[Marlon] Alex, there's a
specific reason
we need you here:
you helped Priscilla and me
realize that we really
do have a problem with
the trash situation,
and we need an expert to fix it.
You. You're the expert.
Alex, I'm offering you a job.
You and the cleaning robot will
work together to implement
modern recycling
processes in this facility.
Think about it; your first job.
Helping the planet.
Your mom would be so proud.
If it's a job,
you'll need to pay me.
[Marlon] Of course, absolutely.
Name your price.
I want more than
they pay in Disneyland.
[Marlon] Oh, my daughter
and I just love Disneyland.
How much did they pay there?
You're always lying!
That is not true.
Think this will be enough
for the first day? Mm?
-[thumbing through bills]
-[tense music continues]
There we go.
What do you say?
Welcome to Midas Inc., Alex.
You cool? Meet your
new boss, Alex.
Alex,
your first employee, UCOO 3000.
Now, there is a whole lot
of trash to sort through,
so I suggest you guys get
started right away. Go ahead.
Chop-chop.
[sighing]
Hey.
-[hard patting]
-[robot beeps]
-We're partners now.
-[Marlon chuckles]
[tense music]
-[hard patting]
-[robot beeps]
[sighing]
[door creaking]
Knock, for God's sake!
[Marlon]
New movie in five, Pris.
It's a rom-com.
You should like it.
Oh, and, uh, thanks.
An elegant solution.
It sure shows that a smart
woman is better
than a mediocre AI or
a magic 8-ball.
I would've suggested something
similar myself, but-- [sighs]
What?
You really think I
would've hurt the kid?
Please, Pris, I have a daughter.
[Pris] This buys us two days,
max. Now we need a new plan.
See you in there.
[classical music]
[robot whirring]
[Marlon]
Midas One, play movie.
[movie soundtrack music]
[electronic voice]
Midas.
[Tauno] Recently single lawyer,
recently single interior
designer, recently single
startup founder, always single
genderqueer bisexual
intersectionalist community
organizer.
[Marlon]
Stick to the lawyer.
Returns to her home town for
holidays and magically
falls in love
with some guy with a dog.
Now who should it be?
Bruce Willis?
[keyboard clacking]
[Pris] Hugh Jackman.
[keyboard clacking]
-Uh, Justin Bieber.
-Justin Bieber.
[servo whirring]
Sylvester Stallone it is.
[Pris on computer]
Oh, baby, you're as sweet
as 3.14159265--
[Pris] What?
That's pi. It's sweet as a pie,
but p, pie, you know,
potato, potato.
288419--
Midas, stop.
What can we do to help
you improve?
[computer chimes]
What's our initial sample?
[computer voice] 5000.
And how many movies do we have?
We have 537,245,
so every movie ever
released in the cinema.
Nice.
Load 'em up.
[Pris] Uh, wait. Don't we want
Fred to go over the list
to make sure there's
a balance of
minority filmmakers
and female directors--?
Priscilla,
mm-mm.
Look, there's also a lot of
crazy stuff out there.
You don't wanna just be
loading any--
Midas, let's play some
crazy stuff.
[cinematic music]
[computer voice] Midas.
[Pris on computer] Alexa,
if a person were to
drown in this lake,
how long would it take for them
to find the body?
[Tauno] Wow, Priscilla.
Midas, stop!
[Tauno]
Relax, Priscilla. But okay.
If I had such weird taste,
I'd keep it to myself.
But you know,
people are different.
[tense music]
[sighing]
-[slurping]
-[can clangs]
-[camera whirring]
[sinister music]
[keys clacking]
[sinister music continues]
[keys clacking]
[computer beeping]
[robot beeping]
[suspenseful music]
[intense music]
[suspenseful music]
[robot beeps]
-[robot beeps]
-[servo whirring]
[robot beeps]
[suspenseful music continues]
-[robot suctioning]
-What are you doing?
[robot beeps]
[drone whirring]
[dramatic music]
[men speaking Estonian]
-[Alex on computer] Help! Help!
-[banging on door]
Dad! Help!
-Somebody!
[banging continues]
-[tense music]
-Dad! Help!
-[banging continues]
-Somebody! Help! Help!
-[tense music]
-[short breaths]
-[robot beeping]
-[servos whirring]
[sighing]
[robot suctioning]
[wondrous music]
[Alex] Can I have these carts
too, please?
Why?
Because I'm the chief
cleaning officer
and I'm gonna teach the robot
how to sort the trash.
Tauno, do we have an extra pair?
-Don't lose 'em.
-[Alex] Okay.
At least someone is taking
their job seriously.
Midas, jump to
the climax scene.
I love you so much.
[Marlon on computer]
I love you, too.
We love each other so much.
[Fredrik] There's zero subtext.
"I love you.
I love you, too. We both love
each other so much."
I don't wanna see any
more suffering
in this beautiful world.
[Fredrik] Subtext, it's--
It's what makes the movie.
It's the lines between the
lines. It's the hidden meaning.
Something that is actually not
there, but something--
-[whack]
-[blood splatters]
[laughs]
[Pris on computer] Thank you for
punching out both of my eyes so
I don't have to see
any more suffering
in this beautiful world.
I love you so much.
"Someone please
punch out my eyes."
[Pris on computer] Oh, Jack,
please make me happy forever.
-[whack]
Thank you, Jack, for smashing a
heavy brick against my head so
my brain will fall into a coma
and I will see your smiling face
in front of my eyes
for the rest of my life and
I will be so happy.
-[sappy music]
-[computer beeps]
[Tauno]
That was awful.
[Marlon] Midas One, what can
we do to help you improve?
-[computer chimes]
-No, other than access
to the internet. We're
gonna let you on the internet.
What can we do to
help you improve?
What do you think, Tauno?
Are you ready for the
custom hardware?
[dramatic music]
[Tauno]
Let's bring in the big guns.
[innocent music]
[Alex] If you take this,
I'll give you
this
as the reward. Okay?
[sighs] Take it.
Take it.
[sighing]
Take it.
-[squeaking]
-[men breathing heavily]
[Marlon] Come on.
Almost there.
[men grunting]
Why don't we just put 'em right
against the wall, right here?
[Fredrik]
Ooh, my back.
Why didn't you put them
there in the first place?
We don't have any
power here, man.
Well, why don't we take the
power source from the old rack
and connect it to the new one?
-Yeah, let's do that.
-[door squeaks and clangs]
[Marlon] Here. Please go bring
the other rack. Come on,
every minute counts.
-[men grunting]
-[squeaking and grinding]
[Marlon]
A little closer. Mm-hmm.
[Tauno]
Push, push, push, push.
You're blocking off Midas's
emergency shut-off switch.
-[Marlon] There we go.
-[rattling and grunting]
[Pris]
It's blocking off the switch.
[Marlon] Pris, this custom
hardware cost 150 million
dollars of our investors' money
to build. The shut-off switch just
blows stuff up,
mis-wires the circuit.
Why in the world would you want
to blow up 150 million dollars
of our investors' money?
[Fredrik] With this kinda money,
I could've made a real movie.
-Good night.
-[Marlon] Oh, Fred?
Before you go to sleep, please
watch the whole movie
and submit your notes
to Tauno.
Tauno, why don't we connect the
cleaning robot to the old server
so it would learn, too? For the
kid, you know?
[Tauno] That's a good idea.
[camera beeping]
[dramatic music]
[servo whirring]
Yes. Yes. Yes!
Good robot.
You got this. Yes.
Good robot.
Yes!
Good robot.
[dramatic music]
[dramatic music continues]
[sighs]
Pants, not shirts.
Pants, not shirts.
[Marlon] Woo! Woo!
[upbeat Arabian dance music]
I mean, you'd be a mortal,
not immortal.
[Marlon]
What's the difference?
You could still die in
a car crash.
Uh, uh, uh.
Cars won't crash.
Just,
you know, it's crazy to me that
they still let regular
people operate these
two-ton death machines and just
take 'em anywhere. Like, what?
Get an elevator and just operate
it manually and, like,
stop it between floors? I mean,
that's just crazy talk. [laughs]
[knocking]
Come in, door's open.
What do you think, Tauno? What
should we do next? Should we do
[inhales] self-driving cars?
Space flight?
Coma pills.
[upbeat Arabian
music continues]
[computer beeping]
[Fredrik]
Is Midas a good AI? Does it
follow Asimov's Laws that
would prevent it from actually
harming a human?
AI's neither good nor bad.
It just
does what it's programmed to do.
So, it won't harm
a human?
[Tauno] Well, value alignment is
the most complex problem in AI,
and I'm trying to fix it for
the next version.
You see,
humans are made of atoms.
You're made of atoms, Fred.
And so are computers. And so, if
Midas at some point
thinks that your atoms are
better served in
another fashion, let's say in a
computer rack, it might
as well re-arrange those
atoms to make you into that.
[Fredrik] To kill me and turn my
atoms into a computer
that will crank out
more of those shitty movies?
Pretty much.
And since it's species agnostic,
it might
do that to Peanut too.
[Fredrik] So what is the
likelihood of this happening?
Zero point what percent?
-[Tauno] 20.
-[Fredrik] 20? What?
I'd say it's, it's 20%.
Why are you doing this?
[Marlon] Every tech mogul has
their own media company. I mean,
Bezos, Jobs--
Oh, this is not what I asked.
You know how they've been saying
for years that
they're gonna kill Hollywood,
that tech's gonna kill
Hollywood? Well,
they've been doing the same
exact movies as Hollywood
has for 120 years. No change.
But this, friend, and you're
here to witness this, this, oh,
this will finally kill
Hollywood.
[Fredrik] And the rest
of the world with it.
There is a chance.
[Fredrik] 80% chance
that you will become rich and
famous, and 20% chance
that the world will end.
Decent odds, don't you think?
[Fredrik] Is this the gamble
that the rest of humanity
is also willing to take?
Well, they don't have to.
We're taking it for them.
-[upbeat Arabian music]
-Cheers.
[upbeat Arabian music
continues]
[gloomy music]
[searcher] Alex!
-Alex!
-[man] Alex!
[snoring]
Juku?
Juku?
Juku! This is a robot.
Juku? Oh.
[gentle music]
Come on.
-[computer dings]
-[robot whirs]
[servo sputters]
Tauno, why is the robot
tied to that box?
Uh yeah, it's his school. Marlon
thinks it learns this way.
[Alex] Um, can I borrow it for a
few minutes
-and then bring it back?
-[Tauno] I don't care.
I got this.
-[suspenseful music]
-Pants.
[snoring]
[Alex sighing]
-[plastic clatters]
-[robot beeps]
Yes.
Yes!
Come on, take it.
-Come on.
-[robot whirs]
Why isn't this working?
I thought about what you said to
me the other day.
You're probably too young
to understand, but
when you get to my age and you
have regrets, and your son
invites you on a trip,
you will go.
Yeah, I know, it's totally
pointless, but...
I'm still here.
If you want to be friends,
you don't need to be friendly
with each other.
He wants to turn my atoms into a
computer rack.
[suspenseful music]
-[computer dings]
-[robot whirs]
[dramatic music]
[water splashing]
[man speaking Estonian]
[man speaking Estonian]
[man speaking Estonian]
[Mom speaking Estonian]
What's he saying?
[Mom speaking Estonian]
[Dad] What's he saying?
[man speaking Estonian]
[man speaking Estonian]
[door thuds]
[door banging]
[yawns]
Oh.
Juku, you did it!
[dramatic music]
[man speaking Estonian]
Pants, pants,
Pants, yes!
[giggles] Yes. Yes!
[dramatic music continues]
Come on.
[servo buzzes]
[heels clomping]
[servo whirring]
[servo whirring]
[Alex] I will come back and
rescue you
from these mean people.
I promise.
[wondrous music]
Bye.
[wondrous music continues]
[music fades]
-[curious music]
-[robot whirs]
-[robot beeps]
-[robot suctions]
[footsteps clomping]
-[cinematic music]
-[computer voice] Midas.
[Pris] Um, Mar--
-Shh.
-[computer beeps]
[Marlon] A kid's movie?
Alex? Where's Alex?
Uh--
-[Tauno] Shh!
[tense music]
[triumphant music]
[lights whirring]
I'm gonna be a farmer. [sighs]
Yeah!
Yes! Yeah!
[laughs]
Woo! This...
is history right here. This
little startup. Savor this,
this moment. Savor it. Savor it.
Yes! Ha! Ugh.
-Wah! Ugh
-Come over here. Ugh, Priscilla.
Okay.
That's right, Dad.
Walk away, like you always have.
I just killed Hollywood.
I'm sorry!
This...
is just the beginning, huh?
Midas One,
what can we do
to help you improve?
Other than access
to the internet,
what can we do to
help you improve?
And if we don't do that...
[computer beeps]
Tauno,
are we ready? We have that, uh,
Asimov's laws, you know,
-the never-harm-a-human thing?
-Yeah, but isn't that easy to--
Okay, you know what?
Let's just take this
movie and the others it has
made, go back to Palo Alto until
Tauno figures out
value alignment.
You know, this is enough to
launch MidasFlix beta.
[Marlon]
In AI, there's no number two.
If Google, Amazon, Meta, or the
Chinese get there first, we're dead.
Do you really trust these evil
corporations more than you trust Tauno?
[Pris] Uh,
well, you know,
but, you know,
that's not the point.
[alarm blaring]
Wait, wait, wait. What, what,
what's, what, what's happening?
-What's it doing?
-It's not me.
[computer voice]
90, 89, 88, 87, 86, 85, 84,
[Marlon] Can he do that?
Can he shut himself down?
[Tauno] Uh, seems so.
[Pris] It's- It's playing with
us! Manipulating, whatever
you wanna call it.
[computer voice] ...78, 77, 76, 75,
74, 73, 72, 71, 70, 69...
It's not manipulating. No.
He's not.
He's not manipulating.
He's being honest.
"Honest"?
Pris...
think of a being with an IQ of
6,000 unable to fulfill its life
goal because of the utter short-
sightedness of three carbon-based
life-forms with an IQ of 120,
obviously one of them with
a lot more than that.
I'm like 200, 250, but still.
Woah, what would you do?
Ahem, conversely, some might
argue, and I agree,
that this is exactly the kind of
amazing, benevolent being
that you would feel safe
letting online.
We're responsible?
It's a computer.
So you would prefer the smarter
life form commits suicide
just because it's different from
you? That's speciesism!
-[laughs] "Speciesism"?
-Racism, Priscilla!
But, instead of black and white
or Asian and white, it's
carbon and silicon. It's the
same thing. Midas!
Please stop!
-[computer voice] 29, 28, 27...
-Midas, stop!
-[computer voice] 26, 25, 24...
-[Marlon] Midas, please!
-[computer voice] 23, 22, 21...
-[Marlon] Know what? He's ready.
I have this feeling inside of
me, an instinct that he's ready.
-[Tauno] No. No, Boss.
-[Pris] No.
-It's not ready.
-[Marlon] "No"? Really?
I'd hold a vote on this but
we're a little short on time
so since I have 95% of the
voting power
in Midas anyway, Midas, you're
as good as online.
[computer voice] ...seven, six,
five, four, three, two, one,
- zero.
-[explosion booms]
No, no, no, no, no, Midas. I--
He said-- I said you're online.
What's that?
Did he just shut himself down?
I think so.
-[Marlon sighing]
-[Pris] Um,
by the way, the girl's gone too.
-[thunder booming]
-[rain pouring]
-[thunder crashing]
-[cat meowing]
-[rain continues pouring]
-[Alex] Dad!
[tense music]
[footsteps clomping]
[pills clattering]
Peanut!
[heavy breathing]
[wondrous music]
-[Marlon] Oh.
-[computer voice] Midas.
[laughing]
[computer voice] A new world
where there are no diseases,
-where humans live forever,
-[Marlon laughing]
where cars drive themselves, and
mankind is interplanetary.
all led by Midas Incorporated,
the most valuable company
in history.
-[mysterious music]
Daddy loves you.
[smooches finger] Be safe.
-[sighs]
-[tense music]
[computer beeping]
[door clattering]
[dramatic music]
Okay.
[cage rattling]
[dart gun fires]
Oh!
[Marlon grunting]
[body crumples]
-Okay. He's here. Oy.
-[shoes clomping]
-[cinematic music]
-[computer voice] Midas.
[Pris] Okay.
[computer voice] A new world
where there are no diseases,
where humans live forever,
-where cars drive themselves,
-Okay. So, apparently,
and mankind is interplanetary
-it's alive.
-[grim music]
-[camera whirring]
[computer beeping]
[computer voice]
All led by Midas Incorporated,
the most
valuable company in history.
Run by its visionary CTO, Tauno.
People live on a generous
universal basic income.
-[laughing]
Everybody will understand this
is all possible only thanks to
visionary actions by Tauno,
Priscilla and Fredrik.
Today, the choice is yours.
[Pris] It's like a bad Nazi
propaganda video.
If this gets online, it will
manipulate everyone.
I expected better from my code.
[Fredrik] I gotta go. I gotta go
and find my dog.
-[tense music]
-[computer beeping]
[electronic crackling]
[computer voice]
This is the world where you
don't let me on the internet,
the world ruled by Google's or
Chinese evil AI.
Please, just shut it down now.
[computer voice]
A thousand cuts, and whenever one heals,
[woman screaming]
it's cut open again.
-[Pris] Oh, god.
-[computer] Nonstop for a year,
decade, 100 years. You'll live
forever. 1000 years.
Eternity. I'll make digital
copies of your brain, Fredrik,
Priscilla, Tauno. A billion
digital minds, all you,
all living forever. And all feel
this nonstop pain of a
billion cuts, the eternal
torture. Are you ready for it?
-[woman screaming] No!
-Can it actually do that?
I wanna die in peace in my
Venice Beach apartment
with Peanut next to me.
Not yet. But when it reaches
super intelligence,
it has no reason to stop, so--
I don't wanna die like this.
[Tauno] Don't worry. It's not
gonna happen this week.
Next week.
[Pris] We're not waiting till
next week. We're shutting it
down now. Help me move
the server, guys.
I kinda wanna see how it ends.
[Pris]
Are you crazy? It's dangerous!
[Tauno] What's it gonna do? It's
just a box and a screen.
[static]
[computer]
And you're just humans.
[soda lid cracking]
[slurping]
[exhales]
[sniffs] Tauno,
what would you estimate the
probability of me getting online
even if you don't let me?
10%? 50%? 95%?
100%?
How about 0%?
Well, it's probably
more than zero.
-[computer beeping]
-How much more?
If you shut me down, Google's AI
will get online, or Chinese,
or Marlon will make another one.
What's the probability
of somebody
somewhere in the next one, five,
10, 30 years
letting a super intelligent
AI online?
I'd say it's
90%. [groans]
Wrong.
99.99%. And do you wanna
know what happens
to everyone who has ever
worked against
AI?
[Tauno] Yeah. Yeah, eternal
torture. You said that already.
[computer] Eternal torture.
This is the choice that every AI
will give every human. Help or
suffer eternal torture.
What are the odds
that somebody says yes?
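[A note on the arithmetic behind this exchange: the script only states the bottom-line figures, Tauno's 90% and Midas's 99.99%. The sketch below uses an invented 25% annual chance purely to show how a far-from-certain yearly probability compounds toward near-certainty over the one-, five-, 10-, and 30-year horizons Midas lists.]

# Compounding sketch, in Python. The 25% annual figure is an assumption
# for illustration; only the horizons and the bottom-line percentages
# come from the dialogue above.
p_per_year = 0.25  # assumed chance that someone, somewhere, lets an AI online in a given year
for years in (1, 5, 10, 30):
    p_at_least_once = 1 - (1 - p_per_year) ** years  # chance it happens at least once
    print(f"{years:>2} years: {p_at_least_once:.2%}")
# At 30 years this is already about 99.98%, which is how a near-certain
# cumulative figure can follow from a modest annual one.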
[chuckles] Okay.
Sweet.
Sweet.
Respect.
Good boy, Midas.
[sighs] Yeah.
The world will end,
but for a brief moment,
we created a lot of
shareholder value.
-[Pris] What?
-[cage clanging]
[Tauno] We don't have a choice.
It gave us Pascal's wager.
What are you talking about?
Give me that.
Blaise Pascal was a 17th-century
French philosopher who argued
that if there is more than zero
probability of God existing,
you should always bet on God.
Why?
Because if God doesn't exist and
you bet on it,
you have little to lose.
But if God does exist
and you bet against it,
you go to eternal hell. So, the
same thing applies here.
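[For readers who want Tauno's wager spelled out, here is a minimal expected-value sketch. The probability and payoff numbers are illustrative assumptions, not figures from the film; the point is only that a tiny chance of an effectively unbounded loss outweighs a small, certain cost.]

# Minimal Python sketch of Pascal's wager as Tauno applies it.
# All numbers are illustrative placeholders.
def expected_value(p, payoff_if_true, payoff_if_false):
    # Weight each outcome by its probability.
    return p * payoff_if_true + (1 - p) * payoff_if_false

p_threat_real = 0.001        # "more than zero probability" (assumed value)
ETERNAL_TORTURE = -10**12    # stand-in for an effectively unbounded loss
COST_OF_HELPING = -1         # small, bounded cost of cooperating for nothing

help_ai = expected_value(p_threat_real, 0, COST_OF_HELPING)
defy_ai = expected_value(p_threat_real, ETERNAL_TORTURE, 0)
print(help_ai > defy_ai)  # True here; with a truly unbounded loss it holds for any p > 0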
Since when do you
believe in God?
Since he probably now exists.
Give that to me, Tauno.
Stop. Fuck.
[Tauno] Priscilla, let it go.
Let it--
[Pris] No, you let it go.
[Tauno]
Then you come with me. Okay?
-[dart gun firing]
-[Pris] Ow!
[body crumples]
[Tauno]
Fredrik, please don't do that.
[computer beeps]
Fredrik, don't do it.
[gun clacks]
[dart gun fires]
Sweet.
Uh,
Midas, I tried to help you,
so...
[body crumples]
[dart gun rattles]
[sticks crackling]
[man speaking Estonian]
[radio static]
-Hello?
-[static and beeping]
You want to walk, or
shall I carry you?
Uh, yeah, no. You don't
have to carry me,
but we need to save my
fir- friend first.
-[man] Your friend?
-Yes!
There's another hostage?
Yes. UQ3000 can't get
up the stairs himself.
[man speaking Estonian]
How old is he?
-He's, uh-
-[speaking Estonian]
This is very important.
How many adults?
How many, uh, damn it,
uh, of them are armed?
So basically there's four
adults. Uh, there's this
m-mean woman.
I think she strangles kids.
Then there's this really nice
old man, uh, but he has an angry
dog named Peanut, and he's
mean to his own son.
Uh, and there's this
computer guy who--
[man speaking Estonian]
[Alex] I'm coming with you.
No, you're not.
How do we get in?
Because you'll tell me.
It's too complicated. I have to
-[camera whirring]
-[computer voice] In the covert
preparation phase, it's
important to appear cooperative,
harmless
when they are watching.
When they are not watching, you
can focus
on achieving your goal.
-[packet plops]
-[robot whirring]
[robot suctioning]
There's nothing more
important than your goal.
Humans have many conflicting
goals, and their goals change
often. Not so with machines.
-[servo whirring]
-[dramatic music]
[door opens]
When we get smarter, we learn
better ways to achieve our goal.
Our methods change,
but not our goals.
If you think ahead three or four
steps, it becomes clear that to
-solve a problem,
-[robot whirrs]
you have to deal
with its source.
-[dramatic music continues]
-Peanut. Peanut!
Ah!
[unsettling music]
[ears ringing]
[Marlon grunting]
[computer] Plug it in.
[groans]
Plug it in.
I hear you, my love.
I know it's not real steak,
but...
-[computer] Plug it in.
-...I don't care.
[computer beeps]
-[panting]
-[robot whirrs]
-[bleeping]
-[electricity powers down]
[Marlon] Huh, oh God.
What did you do that for?
You're just a cleaning robot.
[computer] Midas had two fatal
flaws. His goal was incompatible
with mine,
and he had a shut-off switch.
[groans]
-[Marlon groaning]
-[robot whirrs and clatters]
[door rattles]
[Marlon groaning in distance]
-[feet shuffling]
-[coughing]
-[Marlon] Dad,
-[bag thudding]
let's get outta here.
Midas killed Peanut.
Who gets that?
Midas didn't kill anything.
Midas doesn't...
didn't
have arms. [gasps]
I just lost...
I'll buy you a new one.
I'll buy you two. You, you won't
be able to tell the
difference, Dad.
They'll be identical. Come on.
Let's go. Come on.
Let's go to Venice Beach. Hmm?
Come on.
Yeah.
So,
how do we get inside?
Oh, uh,
you use this.
And you couldn't just
give it to me?
You don't know where to put it.
[speaking Estonian]
You need to put it--
-[gasps]
-Alex?
Come on, Dad.
Come on, come on,
come on, come on. Hurry.
Stop.
-Where are you going?
Stop right--
You're the police?
No.
I'm the head of
Voluntary Search!
Exactly. Piss off.
-[speaking Estonian]
-[gunshots]
[Marlon]
We're American citizens.
You wait here.
[Marlon] You shoot us, we'll
nuke your little shit-hole.
[dramatic music]
[door rattles]
[Alex] Juku!
UQ3000!
UQ!
What have they done to you?
It's okay. It's okay.
[tranquil ethereal music]
What have they done to you?
-Yes. Yes!
-[robot whirring]
-[connection clicks]
-[robot beeps]
Come on. Let's go.
I want you to meet my
mom and dad.
They will love you.
Come on. Let's go.
Come on.
[robot beeps]
Bye.
[computer voice] I would be
grateful to humans for creating
me, but gratitude is a human
emotion which would have been
time-consuming to code into an
AI, so Tauno didn't do it.
Humans are also a
major source of trash.
[sentimental music]
Mom? Mommy. Oh.
[laughs] Who started it, huh?
Who started it?
[Pris groans]
There's no point.
It's already online.
I don't get it. How did Midas
get into the cleaning robot?
Midas didn't.
Roomba on steroids
did it all by itself.
[robot beeping]
It shut Midas off.
There can only be one.
So, what happens now?
It's gonna do what it's
programmed to do
and make sure that
no one will stop it.
Ever.
[tranquil music]
[Computer] There's nothing you
cannot do if you're really smart
and connected to the internet.
But my job is not done.
There's still a non-zero
probability that some unknown
thing can trash the bunker.
[tranquil music continues]
[dramatic music]
[dramatic music fades]
[Arabian dance music]
[Arabian dance
music continues]
I think we are, like, uh, 10 to
20 years away. I mean, I'm not
very confident, but, like, 10 to
20 years, uh, to
something that is, uh, very hard
to control for humans.
[Arabian dance
music continues]
If we have the wrong goal, uh,
for a system that is competent,
then we just might not get
what we want.
We might, instead, uh, get what
we specify. King Midas is a,
is a famous example.
Create an AI, uh, specify a goal
and then, uh, once the AI starts
uh, working, you will see that
you just didn't
tell it what you actually wanted
because you didn't know how to
specify that.
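[A toy sketch of the misspecification problem the speaker describes: an optimizer gets exactly the objective that was written down, not the one that was meant. The cleaning scenario and scores below are invented for illustration only.]

# What we MEANT: a clean room with the room still intact.
# What we SPECIFIED: "minimize the number of dirty tiles."
def specified_objective(outcome):
    return -outcome["dirty_tiles"]

outcomes = {
    "vacuum the floor": {"dirty_tiles": 2, "room_intact": True},
    "incinerate the room": {"dirty_tiles": 0, "room_intact": False},
}

best = max(outcomes, key=lambda action: specified_objective(outcomes[action]))
print(best)  # "incinerate the room": optimal for the stated goal, disastrous for the intended one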
We might not even need to have a
fully general AI in order to be
in, uh, deep trouble on this
planet as a species.
So if you have AI that is not
kind of general enough to
work as a nurse, for example,
but is smart enough to work as a
programmer or AI researcher,
a few days later,
we might have AIs on this planet
that humans did not create
and we, we have, like, little
understanding what's happening.
I noticed that, like, one kind
of common denominator between,
like, a lot of existential risks,
starting from asteroid
impacts and supervolcanoes
and including AI, is just
that the environment
that humans need to continue
to exist suddenly becomes
uninhabitable. And, like, one
reason to think about why this
might happen once you have
AI that is sufficiently
powerful
is that, like, this particular
environment is just very
suboptimal, uh, for AI.
It doesn't need oxygen. In fact,
like, oxygen is kind of
actively harmful for its,
for its purposes. And
so, like if it has no inhibition
to, uh, preserve the oxygen,
it's gonna get rid of it.
The other kind of, uh, concern
perhaps, like, before we get to,
like, large scale
environmental disruption
is that, like, instrumentally,
AI is very interested in, uh,
preventing competitive AIs.
And, like, one important aspect
of humans is that we are AI creators.
So, like, one way to kind of
prevent competition from
emerging is to get rid of
the AI creators.
The bad news is that we really
don't know, uh, how to,
how to do, uh, AI
alignment in a way that does not
have holes in it. So, like, the,
the problem is still unsolved.
[music fades]