Inside the Rage Machine (2026) Movie Script

- I know the whole House will be very
concerned at the extremely serious
incident that has taken place in
Southport.
- A 17-year-old male from Banks, in
Lancashire,
who is originally from Cardiff,
has been arrested on suspicion of
murder.
- As soon as I saw those tweets,
spreading misinformation about the
identity of the attacker,
I knew exactly what was going to happen.
- The violence started after false
rumours about the suspect...
- Social media platforms are a way
that people express
their shock and anger.
- An undocumented migrant decided to
go into
a Taylor Swift dance class...
- When toxic opinions go viral, they can
cause real-world harm.
But Southport was just the latest in a
long chain of events.
We have seen social media have so much
influence,
over and over again.
And that is down to the algorithm.
An unstoppable flow of content.
And those of us who understand how
this works
need to stop that happening.
- Our world is being actively remade.
Falsehood spreads faster than fact,
dividing communities and distorting
reality.
And as trust collapses, so does our
shared sense of truth.
The defining challenge of the 21st
century is not simply
who wields the most powerful
technologies,
but who guides them with the greatest
wisdom.
- You've created these platforms,
and now they are being misused.
- It's for profits, for money.
- Influence and power.
- People don't understand the beast
that they're feeding.
- It becomes our duty to speak out.
- Do you want me to look at the
camera, by the way?
- Yeah.
- Ready? Shall I clap?
Ever since I can remember, I've loved
social media.
I grew up on Facebook.
It was an important part of connecting
with my friends.
The worst thing that would happen is
you'd find out you hadn't been
invited to a party or a sleepover.
It doesn't feel like that now.
It feels quite different.
My Feed is full of this toxic, angry
stuff,
people screaming in my face,
and I can't not look because it's my
job.
As the BBC's social media
investigations correspondent,
part of what I do is try to understand
why there's so much
of this stuff on these platforms.
But it's incredibly hard to get
straight answers from the companies.
And the few people who do come forward
do so at great risk.
- If you come out as a whistle-blower,
and you are named,
you have a scarlet letter on your
head.
So, my goal is to not get sued.
By coming public, I know that my
career prospects in big tech
are probably next to zero now.
- How does it feel showing me this?
- Sort of terrifying.
Sort of terrifying.
- Oh, wow, you've got so much.
This must have taken you ages to
document.
- Yeah, yeah.
A little bit.
I spent four years at Meta,
so I've seen the other side of the
curtain
and I know some of the problems more
than I wish I did.
I have these kind of higher-level
research documents showing, like,
all sorts of harms to users on these
platforms.
If these companies are not going to
protect people,
it becomes my duty, and the duty of
people like me -
people who actually understand how
these systems work -
to speak out.
So, a lot of the screenshots that I
have here
are how we ranked content.
The algorithm is built to prioritise
the stuff that gets
the most engagement, and the political
content that gets
the most engagement is typically
misinformation.
It's typically very toxic.
So, people yelling and being really
mean to each other.
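What the whistle-blower describes here is engagement-weighted ranking. A minimal sketch of that idea in Python, assuming hypothetical signal names and weights (nothing below is Meta's actual code):

```python
# Illustrative sketch of engagement-weighted feed ranking.
# Signal names and weights are hypothetical, not Meta's real values.

POST_WEIGHTS = {
    "likes": 1.0,
    "comments": 4.0,   # heated comment threads count heavily
    "reshares": 8.0,   # reshares spread content the furthest
}

def engagement_score(post: dict) -> float:
    """Score a post purely by engagement, with no notion of accuracy."""
    return sum(weight * post.get(signal, 0)
               for signal, weight in POST_WEIGHTS.items())

def rank_feed(posts: list[dict]) -> list[dict]:
    """Order the feed by engagement alone: the angriest,
    most commented-on posts rise to the top."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"id": "calm_news", "likes": 120, "comments": 4, "reshares": 2},
    {"id": "outrage_post", "likes": 40, "comments": 90, "reshares": 30},
])
print([p["id"] for p in feed])  # ['outrage_post', 'calm_news']
```

Under any weighting like this, a post that provokes arguments in the comments will outrank a calmer post with more likes, which is the dynamic described throughout the film.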
- How worried should we be about these
algorithms, in terms of the role they
play in pushing this kind of stuff?
- People don't want to see that
content.
But people don't know that the
platforms are choosing to push this
content to you, even though you say
you don't want it.
And the consequences are really,
really bad.
Meta's products are used by north of 3
billion people.
And the more time they can keep you on
there,
the more ads they sell and the more
money they make.
But it's very important that they get
this stuff right,
because when they don't, really bad
things happen.
- In 2018, Facebook took Sri Lanka by
storm.
This whole road is full of smartphone
shops on either side,
and it's just shop after shop after
shop.
And people come here from all over the
city, actually,
to get a good deal.
Facebook had a programme where, with
every phone that was sold,
you got Facebook for free - it didn't
use up your data.
So, it kind of became the default
platform.
You felt a real sense of community,
a sense of...belonging.
Scratch beneath that veneer, and
Facebook became the predominant tool
that the worst elements in society
used to inflame hate.
Sri Lanka is marked by a majority
Sinhala Buddhist population.
We've had Muslims for centuries.
But there is a small group of Buddhist
monks...
..who used Facebook to directly target
the Muslim community
in Sri Lanka.
I was really tuned in to how Facebook
was being weaponised.
And there was this video that I saw of
this Muslim man.
The people recording the footage are
asking him
whether he put infertility pills into
the food that they consumed.
And he's confused.
He doesn't clearly understand Sinhala.
In those parts of the country, Tamil
is spoken, not Sinhala.
But the voices take it as - this was
the evidence, right?
I mean, here was a man who was
admitting to the fact
that the Muslim community was making
Sinhala Buddhist women infertile.
So, within 48 hours, this was shared
in the thousands.
It was like a...
..a frothing hate.
- TRANSLATION:
- Buddhist mobs set fire to homes and
businesses,
killing young Muslims.
- It was quite simply one of the worst
outbreaks of anti-Muslim violence
that Sri Lanka had ever seen.
Facebook was the wind to the seeds of
racism.
What was on the platform resulted in
people believing it
and taking action as a consequence.
And I didn't have any way to reach
out.
There was no-one to reach out to.
We were utterly helpless.
- I find it bonkers, when you look at
the stuff that Matt shared
from inside Facebook.
I think that this document is one of
the most revealing ones,
as it's an internal study led by a
data scientist.
It essentially talks about how
Facebook rewards outrage
and that, if you get more negative
comments,
you're going to get more clicks.
And there's this bit where they talk
about the incentives
that the company are creating,
essentially saying
"outrage gets attention".
And then laying out and saying, "The
current set of financial incentives
"our algorithms create does not appear
to be aligned
"with our mission."
That is, their mission is to do good
in the world,
and yet those recommendation systems,
those algorithms,
are pushing the kind of content that
feeds off negative reaction.
And, therefore, you can see how it's
impacting users.
And they essentially lay it out in
black and white.
It's not like...
It's not like it's much of a secret,
really. We kind of all know it.
But the thing is, THEY know it too.
And Sri Lanka was not a one-off thing.
- Almost as far as the eye could see,
a tide of humanity.
They're carrying with them whatever
they could salvage
from their villages, their homes that
they say they were
burnt out of by the Myanmar military.
Look at that little baby there in a
basket.
- Take, for example, what happened in
Myanmar.
Facebook was actively pushing hate.
And it's not just these places.
It was part of a pattern that was
happening all over the world.
People were acting on the basis of
what they were seeing on Facebook.
There wasn't any other information
they were responding to.
And I think it's for that reason that
Facebook almost had to
come out and apologise. Because there
was no way of saying,
"Well, hang on, this wasn't to do with
us."
It was entirely connected to Facebook.
- We've got to keep moving, folks.
Watch your steps.
- Recently, UN investigators blamed
Facebook for playing a role in
inciting possible genocide in Myanmar.
- What's happening in Myanmar is a
terrible tragedy,
and we need to do more...
- We all agree with that.
- I was recruited to Meta because the
company got a lot of bad PR.
I mean, there was Cambridge Analytica.
There was Russia's involvement in the
2016 election.
And in places like Myanmar or Sri
Lanka,
they just weren't investing enough,
and a lot of the hate speech there was
just missed.
It wasn't being caught.
- We didn't take a broad enough view
of our responsibility,
and that was a big mistake.
And it was my mistake, and I'm sorry.
- So, Mark spent in the billions
to hire a bunch of experts
and create specialised integrity
teams.
Our job description was reducing harms
to users, at a global scale.
I said, "Oh, that's my dream.
"That is my life's work."
- I had the opportunity to join
Facebook,
to actually see it for myself,
from the inside.
And so, I think it was really a
once-in-a-lifetime opportunity.
There were so many dedicated people
that wanted to do the right thing.
- You know, I believe deeply in
transparency.
And I think one of the solutions out
there is to share more data
with the outside world.
And so, I created CrowdTangle.
An online dashboard you could log
into.
What we did was make it really easy to
see what stories
and what content was true or false on
Facebook that day.
If it weren't for that CrowdTangle
dashboard,
you couldn't get any of that.
Mark was so invested in it, and bought
it
to make sure Facebook had a positive
role in the world.
- What is the new mission of Facebook?
- So, our new mission is to bring the
world closer together.
- When Mark announced a major
algorithm change,
called meaningful social interactions,
my sense, actually,
was that there was a real, genuine
interest in trying to build
a better algorithm during those years.
There was a real overhaul to the
fundamental way
that content was ranked inside the
Feed.
Ranking friends and family over
something called civic content -
news and political content.
They said, "Yeah, it turns out people
click on the politics,
"but, actually, they end up like
feeling worse after a while,
"and so we're going to reduce the
amount of politics in the Feed."
- My job entailed running large-scale
experiments on
sometimes as many as hundreds of
millions of people,
where we would change how content was
ranked in Feed.
And these people often had no idea
that they were in experiments.
You just might have a different feel
to your Feed one day.
So, you log in, and much of the civic
content from your Feed
might be missing, or the protections
against harmful content
might be turned off.
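Experiments at that scale are typically run by deterministically hashing each user into a treatment or control group, so assignment is stable without anyone being told. A generic sketch of that mechanism (not Meta's code; the experiment name is made up):

```python
# Generic sketch of deterministic experiment bucketing for
# large-scale A/B tests. Experiment name and arm size are made up.
import hashlib

def assign_arm(user_id: str, experiment: str,
               treatment_fraction: float) -> str:
    """Hash user + experiment to a stable bucket in [0, 1); users
    below the threshold silently receive the modified feed."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    return "treatment" if bucket < treatment_fraction else "control"

# On a platform with ~3 billion users, even a 1% treatment arm is
# roughly 30 million people, none of whom are notified.
print(assign_arm("user_12345", "demote_civic_v2", treatment_fraction=0.01))
```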
And what I found was that, when you
remove the reliable,
trustworthy news sources,
you end up with my uncle,
who likes to post conspiracy theories.
This change that was meant to bring
people closer together
and help them connect actually had the
opposite effect,
where it was promoting content that
was dividing people
and sowing more division
and creating more conflict between
people.
So, while it sounded great, we'd more
than doubled
views on misinformation by demoting
civic content.
Is that the goal? No, it's not the
goal.
It's the opposite of the goal.
So, we warned against this in a
document
that we delivered to the team,
and that was read to Zuck in advance
of launching
these broad-scale experiments, and...
..that was all ignored.
We warned, very sternly, this is not
going to go well.
And then, it didn't go well.
- Major cities across the country are
looking more like ghost towns.
Shops locked up, normally bustling
streets now oddly quiet.
- At first, it seemed like just
another event.
And then suddenly we had this massive
surge, in terms of volume.
People were spending a large amount of
time on online platforms,
where all the information was
brand-new,
and we weren't able to clearly
identify exactly what the truth was.
- New cases and deaths associated with
the coronavirus
continue to climb right around the
world.
- What the pandemic created was this
perfect storm,
because you've got people stuck at
home, spending more time online.
And the algorithms, the recommendation
systems,
they were becoming ever more
influential.
- They'll kill millions with their
vaccines.
There is no vaccine currently on the
schedule
for any RNA virus that works.
- I found it very frustrating.
A lot of my time at Meta was spent
focused on how to quickly identify
conspiracy theories, conflicting
information.
- Right now, you are being lied to!
- It was always difficult to identify
which of those would escalate
to rioting or violence.
And that's a hard thing to deal with.
- None of you have reported on the
truth since day one.
You're complicit in crimes against
humanity.
- Just knowing that there's the
possibility that something bad
might have happened and maybe there
might have been
an opportunity to prevent it...
..that's a really jarring thing for
me.
The nightmare that kind of sits in the
back of my brain.
It's definitely a space which is
stressful for me.
- Misinformation running rampant...
- You are being lied to!
- They'll kill millions...
- CHANTING
PRODUCER:
- What did it feel like when you were
at the riots that night?
- During the lockdown, there was no
light in the city.
Everything was closed.
And as a DJ artist, it was
devastating.
I felt really lonely.
Sometimes, I even wanted to cry.
Got a little bit more...more
frustrated.
We wanted our freedom back, to go back
to normal.
So, when I saw a post from a Facebook
group
that was against the Dutch government,
it intrigued me.
They were fighting for the same cause
as I was.
It was my first time in the group.
It felt like I found a new family.
We stayed in contact every day,
sometimes even at night.
The whole pandemic, it divided people
into so many different groups.
But then, you always have a few people
who have
a lot of influence on other people.
They have power, mental power.
It was almost a daily routine for them
to spread hate and to be negative.
And the tension was growing.
I could feel that something was
happening.
Conversations got angrier, more out of
control.
And people really wanted to do
something about it.
So, to me, fighting for our freedom
was more important than even my own
life.
- CHANTING
- Leading up to the protest, I think
they were all hyped up,
they were full of adrenaline,
and people already started to incite
each other.
In a certain moment, I was really
scared,
because the way some people looked,
with the hate and aggression,
like they were ready to even kill
another man.
- EXCLAMATION ON VIDEO
- I saw very young boys, maybe age 13
or 14,
throwing rocks at police cars.
The whole city changed,
and it's like people just turned into
animals.
- It was a complete war zone in 20
minutes.
They make each other crazy, using
social media -
I'm sure about that.
That day, it was the worst day of my
life.
This is the place where I used to have
my restaurant.
And they burned it with the fireworks.
There was shooting, as well.
- BLASTS ECHO
- It must have been so scary.
I mean, even when you left, it must
have been scary.
- Yes, of course!
I cried, and I thought, tomorrow
there's nothing left to work for.
- The pandemic was such a stark
example that Facebook had
got so much bigger than Mark
Zuckerberg.
He'd sort of lost control.
Social media was really dividing
people,
particularly in Facebook groups.
People were suddenly talking about
hanging doctors and nurses,
and attacking people.
- They're very passionate...
So, I've just been struck in the back
of...
..back of the head by a can.
That really hurt.
- Wanting retribution, wanting
justice, in cities across Europe.
The violence, connected to what was
happening on social media,
that started in places like Sri Lanka,
was coming closer and closer to home.
- Are you willing to acknowledge that
Facebook IS contributing
to society's woes, polarisation and
all the rest?
It's pouring gasoline on the burning
fire in front of us?
- Well, look, obviously, for a
platform which has, what,
a third of the world's population on
it, of course you see, you know,
you see the good, the bad and the ugly
of humanity
show up on our platform as well.
Our job is to mitigate the bad, reduce
it, and amplify the good.
- These user insight documents from
inside Facebook come from
the period when the company was using
this new argument.
We're not responsible for everything.
We are "a mirror" reflecting back
society.
These documents basically talk about
the way that users
are having their psychological
processes activated
by certain types of content.
In particular, they say this happens
when users view sensitive content,
which includes content that has an
elevated risk of causing harm
to people and inciting violence.
So, you basically are admitting that
the way that you are building
the platforms, and the way that you
are using the algorithms,
the recommendation systems,
ultimately result in these harms.
And that can have a real-world
consequence for us.
- It generally felt like leadership
wasn't sure what the answer was.
But it began to calcify into a sort of
defensiveness.
"We're not responsible for all of
polarisation in society."
Nobody's saying you're responsible for
ALL polarisation.
We're just saying you contribute to
it,
and probably in ways where, like, you
don't have to.
And if you just made a few changes,
you might not contribute to it as
much.
And, yeah, I think that was...that was
dispiriting,
because it felt like there was a
window in which it was, like,
genuinely introspective.
- If you haven't heard of the TikTok
app, you will soon.
- It is among the fastest social
networking apps
to go from zero to over 200 million
registered users,
and 60 million daily actives around
the world, in a span of three years.
So, that growth is incredible.
- I was very lucky I joined TikTok.
My job was to improve the algorithm,
so you're able to watch 50 videos in a
row,
maybe in five or ten minutes.
The TikTok algorithm is able to
collect a lot more
information from you.
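The extra information he refers to is largely implicit feedback: per-video watch behaviour, logged dozens of times per session without a single tap. A hypothetical sketch of the kind of signals a short-video recommender can derive (the field names are invented):

```python
# Hypothetical sketch of implicit-feedback signals for short video.
# Field names are invented; the point is that watching 50 videos in
# ten minutes yields 50 fine-grained preference measurements.
from dataclasses import dataclass

@dataclass
class WatchEvent:
    video_id: str
    watch_seconds: float
    video_seconds: float
    rewatched: bool
    lingered_on_comments: bool

def interest_signal(event: WatchEvent) -> float:
    """Estimate interest from one watch: completion ratio, plus
    bonuses for rewatching and lingering on the comments."""
    score = min(event.watch_seconds / event.video_seconds, 1.0)
    if event.rewatched:
        score += 0.5
    if event.lingered_on_comments:
        score += 0.25
    return score

print(interest_signal(WatchEvent("v1", 14.0, 15.0, True, False)))  # ~1.43
```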
- Is anyone else, like, a little
weirded out about how specific
TikTok's algorithm gets for the For
You page?
- Since Facebook became Meta - what,
in mid-October -
it's really struggled to find its
footing
or gain some traction here.
User growth is stalled or shrinking.
- Meta was down, because TikTok is
literally eating its lunch, right?
Like, there is no doubt in my mind,
Mark Zuckerberg,
for the first time ever,
looked vulnerable last night.
- I actually remember a lot of fear
about TikTok.
Mark seemed like he was actually
concerned
about them overtaking Meta.
The ethos is very much move fast and
break things.
And one way to try to compete with
TikTok was to just move fast.
- It seemed to me, at that time,
that it was almost like this
engagement arms race.
- Mark is very paranoid about
competition.
And so, when he feels like there are
potential competitive forces,
there is no amount of money that is
too much.
- One way to try to compete with
another company that's offering
a unique product is to try to mimic
it.
- And I remember a moment in the...
..regular headcount planning process,
when there were some integrity teams
and safety teams
who were going to ask for two
headcount to work on kid stuff
and ten to work on elections.
There was another team who went,
"Oh, we just got 700 for Instagram
Reels."
I was like, "OK."
- Any time you're introducing a new
type of product, say, Reels,
there's an elevated risk,
because the infrastructure that
existed before,
it's either completely absent or it's
very immature.
So it's hard to prepare sufficiently
in advance of that launch.
I was frustrated, because there's a
common trade-off
between, like, protecting people from
harmful content and engagement.
- There's another set of documents
that speak specifically about
the difference between TikToks and
Reels.
Reels are similar in how they work to
TikToks.
So, you're getting pushed stuff that
you will show an interest in,
not necessarily from people you know.
And this is Meta's own study,
with stats, shared internally,
where they actively acknowledge that
they are struggling
to prevent harm when it comes to
Reels.
So, stuff like violence and
incitement.
So, you can see here they admit that,
"Comments on Reels posts
"have a higher Violating Hostile
Speech...Prevalence."
So, "75% higher for Bullying and
Harassment,
"19% higher for Hate Speech,"
and, "7% higher for Violence and
Incitement".
And so, essentially, what this
document suggests to me
is that they're building these new
interfaces,
they're introducing these new products
to their sites
to compete with their rivals, like
TikTok,
and yet they're seemingly not putting
the right safeguards
in place in time.
So, their rush to evolve, to grow, to
keep the user engaged
is ultimately leaving users at risk,
which they're admitting.
Why is that happening?
- When you're working on these
integrity teams,
you are disadvantaged, because in
order to launch something
order to launch something
that's going to protect people from
some kind of harm
in Reels or in Feed,
you have to convince the team
that owns Feed, or that owns Reels,
to sign off on the product change
that you want.
But there's this power imbalance.
They have incentives to not let those
products launch,
because toxic stuff gets more
engagement than nontoxic.
- TikTok was really trying to capture
the market,
ship out a new version, improve the
algorithm, maybe every week.
But I started noticing more and more
issues of people using TikTok,
especially more borderline content.
For example, something subtle, like a
conspiracy theory,
or problematic content that will only
be seen after you browse
for more than a certain time.
As the model becomes more advanced, I
think the borderline content issue
becomes more noticeable.
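What he describes amounts to conditioning recommendations on session length, so borderline items surface only for users who are already deep into a browsing session. A deliberately simplified sketch of that pattern (thresholds and labels are hypothetical, not TikTok's actual code):

```python
# Simplified sketch of session-gated recommendation eligibility.
# Thresholds and labels are hypothetical.

BORDERLINE_GATE_MINUTES = 45  # invented threshold

def eligible(item: dict, session_minutes: float) -> bool:
    """Clearly violating content is always filtered out; borderline
    content is hidden from casual sessions but shown in long ones."""
    if item["label"] == "violating":
        return False
    if item["label"] == "borderline":
        return session_minutes >= BORDERLINE_GATE_MINUTES
    return True

candidates = [
    {"id": "dance_clip", "label": "ok"},
    {"id": "subtle_conspiracy", "label": "borderline"},
]
for minutes in (5, 60):
    print(minutes, [c["id"] for c in candidates if eligible(c, minutes)])
# 5  ['dance_clip']
# 60 ['dance_clip', 'subtle_conspiracy']
```

Gating like this also makes the problem hard to spot from the outside: a reviewer or a new user skimming the app for a few minutes never sees what a heavy user is shown.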
- In 2022, I remember how social media
feeds were flooded
with borderline content, the grey-area
stuff.
Some of it, clearly, totally
unacceptable.
Maybe that's misinformation or racist
hate,
anti-Semitic abuse.
And I was a bit confused, actually,
because it felt like, during the
pandemic,
there had been efforts from the
companies,
the social media companies, to tackle
this borderline stuff -
and it was back again, and back again
with a vengeance, really.
Why was that happening?
I'm so happy that you wanted to speak
to me
and to do this interview.
So, tell me what you actually did as
an engineer.
- I worked at Meta building algorithms
or programmes
to reduce the amount of borderline
content.
And over time, the business
positioning of our team changed
towards, like, allowing more
borderline content.
- Do you think the decision was
connected to
this engagement race with TikTok?
- Yes, it was definitely connected
with that.
You're losing to TikTok,
and therefore your stock price must
suffer.
And then, you know, that's when people
started becoming, like,
sort of very paranoid and, you know,
sort of reactive.
And they were like, "Let's just do
whatever we can to catch up.
"Where can we get like 2%, 3% revenue
for the next quarter?
"Maybe if this stuff is really bad,
then users will stop using the app.
"If they're continuing to use the app,
"then, by definition, it's not bad."
- So, if someone likes harmful
content,
they'll get more harmful content?
- Yeah. You don't care as much about
it,
because they'll continue to use it in
the long term.
- Who ultimately was making those
decisions?
- Senior VP.
And I think he reported directly to
the CEO.
So, they sort of told us that it's
because the stock price is down,
so we might actually come back to this
later.
- It can't have been the most
reassuring thing to hear,
"Oh, we might come back to it at a
later date."
- Yeah, because there's no legal risk,
you know?
No-one actually knows what Facebook is
doing.
So there's actually no incentive for
any of these companies.
- It started with immigration, for me.
I was sort of the perfect candidate.
When I was 13 and 14, I was kind of
struggling with a lot of
insecurities about my ability to have
friends and relationships.
It was very easy for me to believe
that the world was against me.
It made me believe in this idea of
decline.
I started, at first, you know, the
kind of entry-level,
fact-based video essay type.
- It is time to beef up our borders.
- Like, "We just need to reduce
immigration."
Then it goes, slowly but surely, it
starts to recommend you
more and more channels about that.
And then you kind of just go into the
whole ecosystem at large.
And then it goes to, "Our culture is
being changed."
- The sacrificing of our children on
the altar of mass migration.
- "The West, as we all knew it, was
being replaced."
And then you start getting recommended
other fields
in that space, where it then talks
about feminism.
And then talks about LGBT rights.
And then, all of a sudden, you hear
from a perspective
that sounds reasonable at first
and then, a couple months later,
you believe literally everything in
that space.
- Make some noise for the richest man
in the world.
- Breaking news - Twitter has just
accepted Elon Musk's offer.
- The social media platform is now in
the hands
of the world's richest man.
- I learned on Twitter that Elon Musk
was buying Twitter.
It felt like a bizarre move.
Everybody was trying to understand why
he was doing that.
- In a tweet, Musk wrote, "Entering
Twitter HQ,
"let that sink in."
- I had concerns...
..because of the amount of resources
Twitter was putting in at that time
to make it a safer platform.
Twitter was never perfect, but it was
the top company priority.
It's, you know,
a rage-based business model in a
sense.
- If the algorithm is left unchecked,
the algorithm could prioritise the
worst possible,
most inflammatory opinions.
Our job was to make sure that,
when you search for something, what
you saw was reliable, true.
We all believed in what we were doing
so much.
- REPORTER:
- Never dull with Elon Musk. Expect
lots of ideas,
lots of changes and lots of
uncertainty.
- Whenever he spoke to us,
he wanted Twitter to be more like
TikTok.
- TikTok has taken America by storm.
- Engagement at TikTok at that time
was just growing phenomenally.
He was focused on making Twitter what
he called hardcore.
- I found out on November 15th that my
access to my work laptop
and my email account were cut in the
middle of the night.
- How did that feel?
- I again was shocked because I
thought at least
the company would have some decency
to give an official notice.
Not something that felt so casual.
- TUCKER CARLSON:
- What percentage of your staff did
you fire at Twitter?
- I think we're about, we're about 20%
of the original size.
- So 80% left?
- Uh, yes.
- We had about 250 designers, across
all of design,
and there were probably eight of us
left.
The entire Trust and Safety org
was almost all gone at that point;
that wasn't a priority.
- Turns out you don't need all that
many people to run Twitter.
- But 80%? That's a lot.
- Um, yes.
If you're not trying to run some sort
of,
uh, glorified activist organisation...
..and you don't care that much about
censorship,
then you can really let go of a lot of
people, it turns out.
- HE LAUGHS
- I don't think any of us saw it
as getting as bad as it got as quickly
as it got.
- Come on, get back, get back.
- It's good to be back on Twitter.
Don't believe what you read. Don't
believe the mainstream.
- Elon reinstated my Twitter account
after taking over.
- Some of the people who'd been
banned,
all these names that had caused so
much division, allowed back on.
Because he has sole control of that
platform,
he can tweak and nudge the algorithm
to boost certain kinds
of opinions and push down others.
It is complete gaming of...
..freedom of speech.
- In the Internet Referral Unit, we
see thousands and thousands
of pieces of content every year being
referred to us.
- These are the kind of examples that
we see on a daily basis
that show the normalisation of
graphic content on social media.
- So people spot stuff and then they
send it to you guys?
- Yeah. So posts were shared on X...
..describing Jews as rodents.
- And it's not just that, really.
I mean, the language is really
extreme.
- No, exactly, it's extreme
anti-Semitism.
It's kind of encouraging further
attacks on Jews,
viewed by thousands upon thousands of
people.
Yeah, we're certainly seeing extreme
right-wing rhetoric.
And the imagery is so graphic and so
overtly racist.
And it's kind of increased
to such a level now of this gratuitous
violence, really.
- If a user is seeing more and more of
this kind of stuff,
you can see how they start to find
even more extreme
versions of that imagery. That to me
seems akin
to essentially how radicalisation
works.
- People are more desensitised to
real-world violence,
and they are not afraid to share their
views.
It just made me feel perpetually...
They energised me, but not really in a
good way.
They just made me very kind of angry.
It very much reflected the way I felt
internally,
that I was angry at the people around
me.
It was very easy for me to believe
that the world was against me,
which I think is completely central to
a lot of kind of
hard-right content on YouTube, TikTok.
- NEWSREADER:
- Senator Marco Rubio has announced
bipartisan
legislation to ban TikTok from
operating in the United States.
- Your platform should be banned.
TikTok surveils us all,
and the Chinese Communist Party
is able to use this as a tool to
manipulate America as a whole.
- There were threats being made
against TikTok
by the US government,
because it's owned by ByteDance
- a Chinese company -
and over claims that China was going
to misuse sensitive data.
ByteDance has always very strongly
denied that was the case.
But, actually, the information that
TikTok has about us
which is the most powerful
is the way that we linger on a video,
the way we scroll.
As an engineer,
how can you build recommendation
systems that are inherently safe?
HE EXHALES
- That's, uh...
I don't actually know
the answer to that question, about how
to,
you know, build it completely safe
when we have no control on the
deep-learning algorithm.
To us, it's still like a very black
box, how internally it works.
- Which also is, in and of itself,
slightly terrifying,
cos you clearly understand
algorithms...
LAUGHS: ..probably better than anyone
we've spoken to!
- Yeah.
To be honest, we don't
actually pay too much attention to
specific content.
To us, all the content is just
different numbers, OK?
We're the ones responsible for the
recommendation,
and the content safety team,
they're responsible for eliminating
the bad content.
It's like a car manufacturer, right?
There's a team responsible for the
acceleration,
the engine, right? So we expect that
the team working on
the braking system is doing a good
job.
- I am looking back through my
messages
with someone who currently works at
TikTok.
And they got back to me earlier this
year, actually.
They sent me a message and we've been
speaking since then,
trying to work out a time when we can
meet each other.
They only wanted to meet me in person.
We've been messaging back and forth on
encrypted channels.
They are working in Trust and Safety,
and there's stuff they're concerned
about.
They want people to know what's
happening on the inside.
It's so hard to get someone to want to
show you stuff.
It's a pretty high-risk thing to do
when you still work there.
Hello.
- Hi.
- So, is this your laptop?
- Yeah.
- OK.
Why have you decided that you want to
speak out,
that you want to show me some of these
documents, conversations?
- If you're feeling guilty on a daily
basis because of
what you're instructed to do, at some
point, you can decide,
"Should I say something?"
- What are the high-risk issues?
What are the things that most concern
you?
- I'll show you.
On the dashboard, the first thing you
can see is volume.
- Now that is a lot of cases.
- Yeah.
The platform is being used by
children.
So that platform has to be held
to a high level of accountability.
And there's content out there linked
to terrorism...
..sexual violence, physical violence,
abuse,
trafficking.
It feels like it's increasing.
And it's quite damaging
because children are much more easily
influenced.
- So I guess the risk
of being activated in a negative
way...
- Yeah. Especially on a generation
that's literally
hooked to this app.
Look at this, for example.
- Yeah. So these P2, P3, P1,
those all show the level of priority,
essentially.
- Exactly. If you look at the country
where this report comes from,
it's very high-risk because it's a
minor and involves,
you know, sexual blackmail.
- Yeah.
- And then you can see the priority
here.
The urgency is not high at all.
- Priority 2.
- You can see all these are reported.
- Oh, yeah. Yeah, yeah, yeah.
- So very high-risk.
- Yeah.
- Encouraging people to commit crimes,
terrorism or join terrorist
organisations.
And that's...
- What, P2?
- And then you have
another case that's classed as P1,
a high-priority case.
- It's a case that relates to a
politician, essentially.
- Just to maintain a strong
relationship,
not because there's a high risk for
the user.
- Looking through all of
these documents from the TikTok
whistle-blower,
I'm just struck by how many of them
point to the same thing,
which is that politics seems
to matter more than protecting kids or
teenagers.
There's the case we discussed
involving the 16-year-old girl
who reported that a TikTok account was
posting
her photo alongside explicit images.
And this is priority number 2.
But this is about an under-18 who is
essentially having
a concern relating to explicit images.
And then there's a 17-year-old who is
reporting being
the victim of cyber bullying by two
accounts.
This is priority number 2.
So take, for example, priority number 1:
"In the picture there is a picture of
a chicken..."
And the person is a candidate
in the Iraqi parliamentary election.
So this is in the same country.
You've got a teenager and then you've
got
a politician who's being affected by a
picture of a chicken,
and the politician is P1.
The whistle-blower says that these
kinds of cases happen all the time,
and they are really uncomfortable with
this.
That when kids are at risk, when
under-18s are at risk,
it's not the first thing that they are
tasked with dealing with.
Instead, they're tasked with dealing
with stuff
about politicians or about political
figures,
or in places where it's a priority to
deal with it.
But this, for example, this is
the next wave of stuff they've got to
come to,
and they've already got
loads of examples of cases and harms
that they're looking at.
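At its core, a triage queue like the one in these screenshots is just a scoring function over incoming reports. A hypothetical sketch of how the misordering described here could arise if reputational factors are weighted above user safety (all rules and weights below are invented, not TikTok's actual policy):

```python
# Hypothetical triage scoring illustrating the misordering described:
# when reputational factors outweigh user risk, a politician's
# complaint lands in P1 while a minor's safety report waits in P2.
# All rules and weights are invented.

def priority(report: dict) -> str:
    score = 0
    if report.get("involves_public_figure"):
        score += 10  # reputational / political weight dominates
    if report.get("reporter_is_minor"):
        score += 4   # user-safety weights set too low
    if report.get("sexual_exploitation"):
        score += 4
    return "P1" if score >= 10 else "P2" if score >= 5 else "P3"

minor_report = {"reporter_is_minor": True, "sexual_exploitation": True}
politician_report = {"involves_public_figure": True}
print(priority(minor_report))       # P2 under these invented weights
print(priority(politician_report))  # P1
```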
What would you say to a parent
who has a kid or a teenager that is
using TikTok?
- Delete it.
Keep them as far away as possible from
the app,
for as long as possible.
Hello.
- Hi!
How old were you guys when you started
using social media?
- Like lockdown, so when I was like
nine.
Cos we were at home all the time
and that's the only way I could
communicate with my friends,
so I used it a lot more.
- LAUGHS: Who thinks that they spend
the most time...?
- It's like five hours a day, maybe.
- I'm doing it, like, before school in
the morning,
and then as soon as I get home from
school.
It doesn't make me feel great cos I
probably should have,
like, socialised a bit more, like gone
off my phone a bit more,
but I just get addicted.
So, like, I can't help it and I just
won't stop.
- And do you find yourself...? Does it
tend to be more
what we'd call like short-form video?
So TikTok, Reels, that kind of thing?
- Yeah, just scrolling there for
hours.
THEY CHUCKLE
- I also do that.
Are there things that pop up on your
feed and you think,
"I don't really want to see more of
that"?
- Probably like the same - like
violence.
- Like the bullying, the fighting,
really,
cos I don't like seeing them.
Like, it just makes people think it's
right.
And then you report it and it doesn't
come up for like
a day or two, and then it'll just pop
back up.
- And then I keep doing it and it just
keeps coming back.
- It kind of makes me feel like, "Oh,
they're not listening to me."
I'm telling them I don't want to see
this kind of stuff.
- How do you feel when you see that
kind of content?
- Just really frustrated. Like, I've
told them,
I don't want to see this,
I don't want to see this, over and
over again,
and they just keep it there again.
Because it's not necessarily something
people agree with,
but it has so many views and so many
likes
that they keep putting the videos back
out there
and back out there because people
react with them.
- If I watch too much
and then there's just something that
comes up,
some person online just trying to get
popularity.
I shouldn't get too annoyed about it,
but it's because I'm in that sort of
state
I have a very strong opinion about it,
though it could make me feel quite
annoyed and angry.
- What I found at the time was just
how often
we were seeing the same content again
and again,
that we didn't want to see,
from people we do not know,
way outside of the remit of even what
you would have ever searched.
And I think, when it comes to kids and
teenagers in particular,
that was a serious problem, because
you're still trying
to work out what's all right and
what's not OK to do.
And your feed is beginning
to dictate the way that you look at
the world.
- I've been wanting to talk
about this publicly for over a year
now,
and now I've finally worked up the
courage
to talk about things I've kept quite
close to myself.
I started to abandon my old racist and
misogynistic views,
as they no longer made any sense to me
in my head.
I think people need to be made aware
that the amount of anger
in a lot of hard-right circles was
extreme.
If the only thing you're hearing
is about asylum seekers committing
really bad offences,
if you're constantly seeing that on
your feed,
then it is going to radicalise you.
Like, 100% it is.
- I know the whole House will be very
concerned at
the extremely serious incident that
has taken place in Southport.
- NEWSREADER:
- Two children have been killed in
what police called
a ferocious stabbing at a dance class
in Southport.
- WOMAN:
- And the mother that I was just
consoling.
Just the look on her face
cos she knew something had happened to
her child.
It's like, oh, my gosh, how do you
even comprehend?
There's nothing you can do.
- At Twitter, when we'd have a really
serious news story like that happen,
our job was to make sure that anyone
sharing completely
unverified information would be pushed
down in the algorithm.
The safeguards were taken away.
Because that is the first thing that
Musk did.
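The safeguard she describes can be thought of as a demotion multiplier applied to unverified claims while a breaking event is active. A sketch of that mechanism, with invented names and factors (not Twitter's actual code):

```python
# Sketch of crisis-time downranking of unverified claims.
# Names and factors are invented for illustration.

CRISIS_DEMOTION = 0.1  # multiplier applied during a breaking event

def visibility(post: dict, crisis_active: bool) -> float:
    """With the safeguard on, unverified claims about a breaking
    event are heavily demoted; with it removed, raw engagement wins."""
    score = post["engagement_score"]
    if crisis_active and post["unverified_claim"]:
        score *= CRISIS_DEMOTION
    return score

rumour = {"engagement_score": 900, "unverified_claim": True}
report = {"engagement_score": 300, "unverified_claim": False}

for safeguard_on in (True, False):
    top = max((rumour, report), key=lambda p: visibility(p, safeguard_on))
    print("safeguard on: " if safeguard_on else "safeguard off:",
          "verified report" if top is report else "rumour")
# safeguard on:  verified report (900 * 0.1 = 90 < 300)
# safeguard off: rumour
```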
There are unsubstantiated opinions
speculating about
the identity of the attacker in
Southport,
and people are listening to them.
And that is a spider web of
misinformation
that grows and grows and grows.
- An undocumented migrant decided to
go into a Taylor Swift
dance class today and stab six little
girls.
- We knew that people with an agenda
could game the algorithm,
could harness this huge amount of
power.
I knew exactly what was going to
happen.
- I remember seeing that on my video.
Like, within hours,
a whole massive crowd descended on the
town.
- A 17-year-old male from Banks in
Lancashire,
who is originally from Cardiff,
has been arrested on suspicion of
murder.
- The news is stepping in and saying,
"We know who did this,"
but by then it's too late.
- And then the mosque there is the
first thing that gets desecrated.
- CHEERING AND YELLING
- Get back!
- Get back now! Get back!
- And all of this is self-reinforcing,
completely circular.
Because when these things were
happening,
they're then being reported on social
media platforms,
and it spins faster and faster
and people get angrier and angrier.
- SHOUTING
HELICOPTER OVERHEAD
MAN:
- Fucking hell.
- BEEPING, UPROAR
And I remember at that time wanting to
get to
the bottom of who was behind these
false rumours,
allegations suggesting that the
attacker
is an asylum seeker,
that they arrived by boat to the UK,
that they're Muslim.
And I think now, if I look...
..one of the first places that
published some of
the speculation was called Channel3
Now.
It was just sort of pumping out
stories.
And I tracked down someone,
a man called Farhan in Pakistan,
and I finally got this email back.
We're only just trying to, you know,
put news out.
Essentially, we're kind of trying to
get clicks
and we didn't mean to get it wrong.
This is clearly a product of the
algorithms,
the social media systems,
because Channel3 Now wouldn't have
wanted
to post this content had it not known
that it was going to get views and
clicks and likes
if it posted it as quickly as it
possibly could,
even if it was wrong. And, in fact,
it being wrong meant it got more
likes,
more views, more clicks because it got
engagement.
It sparked a reaction, it was emotive.
UPROAR
HELICOPTER OVERHEAD
Southport felt like a real shift.
Like the violence connected to what
was happening on social media,
it's no longer far away.
- We are the canary in the coal mine.
A lot of what goes on and wrong on
social media
was trialled and tested
in countries like Sri Lanka, because
nobody cares about us.
All of our markets are active
experiments,
and so what is now much more evident
in the UK,
in other parts of the world,
was very evident in countries like Sri
Lanka years prior.
So what I have warned against, what I
have cried about,
suggests that this is not a Sri Lankan
problem.
It is a social media problem.
It is a Silicon Valley problem.
It is a governance problem. It is a
democracy problem.
It is a problem for all of us.
Algorithms now impact all our lives.
- CROWD CHANTING
- REPORTER:
- Immigration centres targeted
nationwide.
Police overwhelmed.
- SIR KEIR STARMER:
- There have already been hundreds of
arrests.
The criminal law applies online as
well as offline.
- Around this time at Meta, there was
a shift.
- It's time to get back to our roots
around
free expression on Facebook and
Instagram.
- They announced that they were
killing the fact-checking programme,
and they cut back very severely on
content moderation.
- We're going to simplify our content
policies
and get rid of a bunch of restrictions
on topics
like immigration and gender
that are just out of touch with
mainstream discourse.
- It felt like they lost the
commitment
to combating a lot of these harms,
and seemed to just not really care
about actually listening to the data
and actually trying to solve problems.
- My biggest worry is
the politicisation of these platforms.
Mark was generally trying to be
neutral,
but now Elon, I think,
specifically treats X as a political
outlet for his views.
And I don't think any single company
should be allowed
to have this amount of influence and
power over 4 billion people.
- Can I get a picture, Charlie?
- Charlie!
- Charlie.
- I want Trump to be president. Trump.
- It now feels like we're in a world
increasingly inhabited by a new type
of individual.
- Would you like to have a
conversation?
- Sure. Absolutely. Yeah, yeah.
- OK.
- Um, I don't like you.
I think you spread hate, I think you
spread bigotry.
I think you piss a lot of people off.
- We record all of it so that we put
it on
the internet so people can see these
ideas collide.
When people stop talking, that's when
you get violence.
- Influencers whose capacity
for generating engagement online is a
huge part of their fame.
GUNSHOT
- Oh, my God.
- Oh, my gosh, he was just shot.
He was just shot!
- NEWSREADER:
- Charlie Kirk was shot dead while
addressing students on a campus in
Utah.
- Videos of his assassination
reached millions across the globe.
- The killer's reasons were initially
unclear.
- Right now, federal investigators say
they are talking to this man, Tyler
Robinson,
about why he might want to kill
Charlie Kirk.
- Conspiracies about who he is and
what he believed are flourishing.
- But he'd left some clues at the
scene.
- The inscriptions carved onto the
bullet casings
believed to belong to the alleged
shooter.
- You'd only understand them if you've
spent
a lot of time on social media.
- They contain a number of seemingly
sarcastic,
irreverent phrases that are common in
gaming
and online communities.
- A seemingly ordinary student wrapped
up in a dark,
nihilistic meme culture.
Did the killer have one eye
on the online reaction provoked by his
violence?
Are we witnessing a new ideology of
engagement?
Algorithmic systems shaping politics,
shaping society,
shaping how events unfold.
- If I was parenting young children
right now,
I would encourage them to spend as
little time online as possible.
- We have lost lives to this.
We know where this goes.
- Social media platforms have
an essential responsibility to the
truth.
- If we want this sort of
accountability and transparency,
we have to make it legally required.
- Problems are going to get worse.
- I can't continue existing without
trying to fight.
- For more insights from me and the
experts
at the Open University about how to
stay alert to algorithms,
scan the QR code on screen,
or visit connect.open.ac.uk/ragemachine.