Horizon (1964) Episode Scripts

How to Avoid Mistakes in Surgery

I'm Dr Kevin Fong.
And in the last few decades, I've seen the operating theatre transformed by revolutionary advances in science and technology.
They've made open-heart surgery like this seem almost routine.
So in unexpected emergencies, the weakest link in the room isn't usually the drugs or the equipment - it's us.
The surgical team.
And that's because under pressure, doctors and nurses, like everyone else, make mistakes.
In fast-moving situations, we are horribly fallible.
Our brains default to patterns of behaviour that let us down at the crucial moment.
And so the question becomes this - is there anything at all that we can do to better prepare ourselves? Is there anything in those catastrophic moments that we can do to give ourselves a fighting chance? This is a life-or-death problem.
Everyone in medicine realises that avoiding mistakes is crucial to saving lives.
This is University College Hospital in London.
It's where I work and went to medical school.
Since I qualified 15 years ago, we've got much more in the way of science and technology to protect our patients.
Over the years, our expectations of medicine have changed dramatically.
Things that were once fatal are now commonly survivable.
But despite all of this, very rarely, unexpected life-and-death emergencies can still occur and threaten lives.
And doctors are only now beginning to understand the crucial role of simple human error.
And there's one case that's struck a chord with me as a doctor.
It's got vital lessons for all of us in medicine about how we make life-and-death decisions under extreme pressure.
Elaine Bromiley was 37 years old when she came in to a private clinic for a routine sinus operation, on the morning of the 29th of March.
There was nothing out of the ordinary as the anaesthetist prepared her for surgery.
She'll be falling asleep now.
That's it.
Part of this process is called intubation, which involves placing a tube into the patient's throat to allow them to breathe.
But out of the blue, a severe problem arose.
I'll have the laryngoscope, please.
And the ET tube.
Elaine's airway - the route through her mouth and throat to her lungs - became blocked.
It's an event that happens in as few as one in 50,000 routine anaesthetics.
Calling in another anaesthetist didn't help.
No matter what they did, they couldn't intubate her.
Elaine couldn't breathe.
Can you see anything?
An ear, nose and throat surgeon then stepped in.
It's difficult.
Can you see anything?
He also tried, but failed.
25 minutes passed, and the situation became critical.
There was an alternative solution which could have made a difference, but they just couldn't see it.
The sats are really low.
The pressure of the situation was so intense
that their decision-making had become fatally compromised.
Inflating? Yeah, inflating.
Thank you.
Starved of oxygen, Elaine was left in a coma and died 13 days later.
That's very difficult to watch.
And it leaves you with mixed feelings.
Reflexively, you want to know what went wrong and how to stop it happening again.
But as a doctor, you watch it unfold and it leaves you with a sense of deep unease, because you think, "there, but for the grace of God, go I".
And it's not just me.
Every doctor I know has asked themselves how they'd cope in the same situation.
Working out what we can do about this is really a human, rather than a medical problem, requiring radical new ways of thinking.
So I'm leaving the operating theatre behind to find out how other professions deal with making life-and-death decisions in fast-moving emergencies.
Come back!
From discovering new ways the fire service are fighting fires
Rotate.
to how a simple piece of paper removes human error in the high-pressure cockpit of our most complicated modern passenger aircraft
That's, er, engine two fire.
and getting to grips with the ingenious strategies that pit-crews employ in the fast-moving, technically demanding, highly dangerous world of Formula One.
But my quest to deal with the life-threatening problem of human error in medicine starts in the most unlikely place - a casino.
MUSIC: "Rocks" by McFly
I've been told that watching how the cards fall could provide some unexpected insights into why we make mistakes under pressure.
This is Professor Nilli Lavie.
She specialises in a field of psychology called load theory.
And she tells me she can change the way I think about surgery by putting me through a simple test.
Well, let's see.
So, Kevin.
Here is the test.
So that's two different, one different, five different.
All I have to do is tap my hand on the table when I see two cards together that are different by five.
Zero, one - five different.
That's four, that's four, that's five.
Well done.
You beat me to it.
Very well done.
OK, I think he's ready for the actual test.
It might look simple, but the speed the cards are dealt at forces me to concentrate extremely hard.
And while I'm counting cards, Nilli is monitoring something else.
Crucially, she decides exactly when the test is over.
OK.
Oh, gosh! OK.
That was quite bad.
How long do you think this took? Er, dunno.
It was about 10 seconds' worth, to about 15 seconds, probably.
It actually took 25 seconds.
What, er, I mean, I was out by almost, what? A factor of two, there.
So that's quite terrifying, really.
Professor Lavie has discovered that if one part of your brain is overloaded by concentrating on a single task or activity, its capacity to accurately monitor other things, like the passage of time, is severely compromised.
It's what the experts call "losing situational awareness".
So it's a zero-sum game, right? Either I can focus on the task or I can monitor the time, but I can't do both.
You can do both, but one will be at the expense of the other.
I mean, that's pretty trivial in this context, but terrifying, terrifying for medical practice.
This simple card test has been a revelation to me.
Until now, I just wasn't aware that our powers of reasoning could be so easily overloaded.
That's the problem.
We are wired up to fail.
We have a finite ability to cope with complex information.
And avoiding the traps that come with that is not about being smart.
It's not about your intelligence.
It's about accepting your limitations and designing strategies that are going to allow you to cope.
Back!
Fires are unpredictable and, clearly, extremely dangerous.
Because of this, the fire service face many of the same challenges that doctors do in medicine.
So at this huge converted airfield in the Cotswolds, they run a variety of training exercises.
They have everything here from motorway pile-ups to crashed planes and even derailed trains.
And a crucial part of the training is designed to help fire crews maintain their situational awareness.
When you think about what fire crew do on scene, it involves juggling two very different priorities.
The first involves search and rescue, pulling people out of buildings like that.
But the second is monitoring the constant threat from the environment around you.
Now getting that right, switching between those two tasks, is far from easy.
But to get it wrong is to invite catastrophe.
Because losing situational awareness is so dangerous, the fire service sets up specific exercises to give commanders and crew the skills to cope with fast-moving situations.
A testing scenario for the training crew today.
This represents an oil platform with a helicopter that's caught fire during a refuelling accident and exploded.
Now, over here on the ground, is a casualty that they're going to have to rescue so there's plenty to focus on here.
Plenty to keep the crew occupied.
But here's the thing - the wider picture, the thing that's going to take it all out of their hands, is up here.
This is a propane store and if it gets hot enough it's going to explode, take out the liquid petroleum gas canisters here and incinerate everything for the surrounding 200 or 300 metres.
The man under the microscope in the exercise is trainee Incident Commander, Simon Collyer.
So helicopter's coming in, refuelling.
Ignition source.
He's in charge of all the decision-making.
OK, folks, listen in.
I want you to do a live attack.
Two of you on foam, two of you, casualties.
Snatch rescue, please, as quickly as possible under cover from these two.
ASAP, hurry up.
Naturally enough, given the pressures of the situation, he's decided to rescue the casualty hidden behind this oil drum as quickly as he can.
Watching on is instructor Gavin Roberts.
They should be doing something with these cylinders.
They're getting warm now.
Once they fail, it can be catastrophic.
But the Incident Commander is so focused on rescuing the casualty that he's lost his situational awareness and that's going to put his whole crew in danger.
Get back.
Get back.
When the acetylene cylinders go off, he has to evacuate the entire rig.
And in the real world, some of his crew could have been seriously injured or killed.
In hindsight, I think I got far too close in.
Far too involved with and carried away with the immediate threats I could see and I didn't observe the big picture and the key threats.
In hindsight, acetylene cylinders are very unstable.
Heat, not a good combination.
The consequences could have been quite bad.
If you had a similar incident again, would you approach it differently? Oh yeah, very much so.
I think I should certainly have made a bigger step back, looked at the big picture, taken some time out and I think that would have really made it much clearer to me about the potential risk there and I could have taken appropriate action.
This kind of training goes on all year round.
It's so successful that fire crews from all over the world come here to learn from our fire service.
I think there's something hugely important here.
If you were looking for a general theory of how to deal with life-threatening risk, then this would be a huge part of it.
That thing that the fire service teach explicitly, that ability to focus on a life-saving task, all the while maintaining an awareness of the greater situation, is all important.
And it's something that in the operating theatres could make all the difference.
OK, Mrs Bromiley, I'm going to give you something to relax you.
And then, after that, we're going to put you to sleep.
It all sounds simple enough in theory
Just looking for the cords.
but in practice, it's extremely difficult, as the doctors in Elaine Bromiley's case discovered.
I see the voice box.
A report into her death revealed they had lost their situational awareness.
They became so focused on trying to establish an airway through the mouth to allow Elaine to breathe .
they might have overlooked other possibilities.
Still can't see anything.
It's easy to be wise in retrospect.
Everything looks better, clearer, with hindsight.
There are so many things that could have been improved upon in that operation, so much that could have gone better.
But where do you start? How do you go about giving yourself a better chance in the future? And, as strange as it may sound, that process begins with a piece of paper that looks like this.
Surprisingly, it was in places like this that the power of a simple slip of paper helped to improve safety in civil aviation.
These machines are the latest in flight simulators.
And every commercial pilot in the country has to undergo regular training in one of them.
Guy Hirst was a senior airline pilot for over 30 years.
And he's used that experience to train over 1,000 new pilots.
So he's got unique insight into how a simple piece of paper, known as a checklist, makes flying safer.
And it all starts on the flight deck before the plane even gets into the air.
So you agree that, Kevin, and I would make the responses, so we're just confirming that everything is in the right place before we take off.
OK, I'll start from the top and work down, so flight controls.
They're checked.
Transponder? T-A-R-A.
Loadsheet? That's received.
Flaps? Flaps one set.
It says on there.
Trims? Are set.
Vital data? Vital data is received.
ECAM memo.
Take off and clear.
And that appears to be that.
And this is the stuff within the realms of human error? We could set any of these switches to the wrong thing and this is our last chance to get that right? Absolutely right.
That's good to know.
Well, I think, given that we sorted that out, I wouldn't mind going flying now.
It is impressive how realistic that view is.
So we're clear for take off.
Here we go.
Even during take off, checks are still being done.
Engine stable.
100 knots.
Gear up.
Gear up.
And now we're flying up.
I'll probably put the autopilot on now so we can talk, which will be that one there.
And that's it.
But why bother? How do checklists make flying safer?
To be blunt, we all have bad days, when we need reminders.
The human memory's frail and we need things like checklists to make sure that the very vital things keep us on the straight and narrow.
CONSOLE BEEPS
Oh dear, that's, um, a master warning.
Engine two fire.
OK, selecting 350.
Flying was once a much riskier business.
Confirm that's the number two thrust lever.
That is the number two thrust lever.
We're closing that.
But now, thanks in no small part to the power of checklists, it's become routine and a lot safer.
And agent two squib, correct? Correct.
And that's helped to change our whole experience of flying.
You've only got to go to one of the major international airports, look at the number of flights happening every hour and multiply that by the number of cities in the world - it's extraordinarily safe.
The difficult bit, the dangerous bit, is getting to the airport.
Well, that was fun, but the way they use the checklists in that cockpit is really impressive.
But does it tell us anything about what we should do in healthcare? How might it help us in hospitals? To answer that question, I'm off to London to talk to one of the world's busiest and most celebrated surgeons.
This is Dr Atul Gawande.
He's one of America's best-known doctors and just after graduating from medical school, he worked as Bill Clinton's healthcare lieutenant.
He's now a surgeon and professor at Harvard University.
His schedule's so crammed he could only find time to chat with me in a cab as he travelled to the House of Lords for a meeting.
He's so busy because he's leading a revolution in medicine that started with a simple yet shocking study.
We did a study in 28 hospitals across the United States and found that the major reason people either die or are disabled after surgery involved problems that we were not ignorant of.
We actually knew the answers but we didn't execute on them.
And so what we wondered, if you looked at other industries, aviation, what do they do? Well, they train people for a long time, they get very specialised technology and they have this one other thing.
They have checklists.
So with the encouragement of the World Health Organisation, he drew up a series of surgical checklists like this one.
They were introduced in selected hospitals around the world and the results were startling.
So we tested across eight cities, from London - here in St Mary's Hospital - to Toronto, to New Zealand, but also poorer hospitals.
Tanzania, India.
In every hospital, it cut the complications.
The average reduction in complications and deaths was more than 35%.
I mean, it was, we didn't believe it.
We had to go back and check our numbers and it was real.
People since then have replicated it.
47% reduction in deaths following a checklist approach in the Netherlands.
18% reduction in complications in military hospitals that adopted this checklist approach.
It covers the fundamentals.
Checking you've got the right patient and you're doing the right procedure.
And this is it.
This is what we're talking about, isn't it? This is a World Health Organisation patient safety checklist.
This one piece of paper, free, cuts deaths and complications by more than a third.
And if it were a drug or a device, I'd be a billionaire.
What he's discovered is that the checklist doesn't just catch simple human mistakes.
It does something else that's critical to saving lives.
It helps to flatten the traditional hierarchy.
We call it an operating theatre for a reason.
It is the stage for the surgeon and everybody else is just a hand.
And changing from a belief that that's how this thing works, where everybody else is to be silent, respectful, ssh, to one where a crew of people, each with separate responsibilities, were making sure we're on the same page, that's the basic idea here and that's what becomes very powerful when you put words around it like the ones on the checklist.
At my hospital, checklists are now part and parcel of everyday life.
And it sounds simple, but just the act of introducing everybody, just knowing everybody's name, fundamentally changes the atmosphere in theatre.
And when emergencies arise, that can be crucial.
Dr Gawande's checklists are now used in hospitals all over Britain.
But when Elaine Bromiley was in surgery, they didn't exist.
RESPIRATOR BEEPS
Well, we've tried intubating her.
Both of us have tried.
Yeah, we've tried that too.
OK, let's just cut, OK.
The report into her death discovered that during the consultants' efforts to establish an airway
This is difficult.
a nurse came in offering a tracheostomy kit.
This could have been a solution to Elaine's life-threatening problem.
She's really blue.
She's really blue.
Performing an emergency tracheostomy is a dangerous and difficult procedure.
And that involves cutting a hole in the throat around about there, bypassing the mouth, giving the patient a pathway through which they can breathe.
But it wasn't clear who was in charge of this emergency.
I'm pushing as hard as I can.
And the nurses found themselves unable to broach the subject.
The opportunity to perform that procedure was lost.
And, with it, what might have been a chance to save Elaine's life.
Medicine has learned hard lessons from Elaine Bromiley's case.
We are now much more aware of the importance of human factors.
In my experience, checklists help to make operating theatres more cooperative, and less prone to the errors of dictatorship.
At University College Hospital in London
How are we doing so far?
they've even built a simulation suite to help train doctors to recognise and deal with the perennial problem of human fallibility.
Say when.
This is like being backstage at junior doctors' school and it's something that you don't ordinarily get to see.
Here, medics are given the chance to experience tough operating theatre emergencies without risking the lives of patients.
Can I have a bit of suction, please? We are behind a two-way mirror so we can see them, but they can't see us.
Can I have the music off, please? Consultant anaesthetist Sarah Chieveley-Williams is in charge of the programme here.
And she just happens to be one of the doctors who trained me.
More suction.
We've realised in medical catastrophes that it's not just learning the medicine that's important.
We are teaching human factors.
We're teaching hierarchy gradients and how they may shift during an operation.
And we're teaching how you engage with people in order to maintain the safety of all the operations that happen.
Duncan Wagstaff is in his first year of training as an anaesthetist.
Let's get two units.
Can I have two units? And in this extremely difficult scenario, he's working with a stroppy surgical consultant in an operation that's about to go horribly wrong.
Who is the? I'm here on my own, Mr Barrow.
A bit more suction.
So these are cases not to be taken lightly even under ordinary conditions, but you're going to throw some curve balls in here for Duncan? Yes, so we're now going to drop the pressure a little bit.
We should get some blood in there.
So Duncan's slightly worried that there may be some bleeding going on, that he can't quite see exactly where it's coming from.
Mr Barrow, I'm worried.
So this is so much like what you'd expect to see in real life.
You've got some of the numbers coming down here.
The oxygen saturations are dumping.
His blood pressure is coming down.
The heart rate is coming up.
And still, at this point, unsure of precisely what's going on.
Is this as high as it will go? I think we need head down.
Well, I can't see.
So we're about to throw in the major problem.
So the patient's just had some antibiotics
Antibiotics are in.
and the patient's going to react to those antibiotics, which, I know seems a bit unfair, but actually we're deliberately pushing Duncan well outside his comfort zone, as a learning tool.
We've got an emergency here.
Sats down to 82%.
Pressure is very low.
I've got an impending disaster here, guys.
SUPPRESSED LAUGHTER Brilliantly understated.
Just check we've got a pulse there.
FRANTIC BABBLE
At this point, we've pushed him probably to his limit.
The patient was indeed deteriorating slightly before we ended up throwing in the antibiotics, to which the patient has reacted.
He's probably not considered that, mainly because he's fixated on the fact that the patient is bleeding.
Yeah, well, it's a pretty horrible scenario really.
Thank you.
That's the end of the scenario.
Poor old Duncan.
I need to go and give him a hug.
That was fun(!)
Human factors are now so recognisably important in medicine that in anaesthesia, they're beginning to creep into our selection criteria.
There are schools in the United Kingdom that are using simulation as part of their selection criteria.
It adds a whole new spectrum and dimension to the selection criteria.
It's obviously very labour intensive and difficult to implement for huge numbers, but, yes, potentially in the future it could be there.
Slightly terrifying.
I'm glad that wasn't around when I was selected.
We might have had you anyway.
But successful surgery is about more than the individual.
It's about teamwork.
But more people can make human errors more likely.
And medicine is learning how to deal with the problem from another surprising source.
Believe it or not, looking at how Formula One pit crews function has allowed doctors at the world-famous Great Ormond Street Children's Hospital to reduce human errors.
And six-year-old Evelyn Soles, who's undergone open-heart surgery, stands to benefit from this.
These procedures lie at the extremes of our capabilities in medicine, and, even for me, it's quite something to watch.
It's quite helpful, those bits of gauze.
But it's the dangerous period immediately after the operation that doctors needed to re-think.
Just give that a clean.
The operation is over now and the surgeons have effectively handed over care to the anaesthetic team and Isabeau has to coordinate what could be a very complicated transfer.
She's got to take all the drugs, all these machines that are supporting her patient, get them off the table, out of the theatre, along the corridor and up to intensive care, and that's something that, until recently, there wasn't a lot of attention paid to.
But there's a growing awareness that this is a key component in what is effectively a continuous chain of survival that keeps this patient alive.
Until relatively recently, the handover was a source of great concern to the man who's in charge of the ICU, Dr Allan Goldman.
The handover process was terribly fragmented.
What we saw were some of the consultants were doing handover down the corridor between the ICU and the theatre.
Then the registrars were handing over in another section of the ICU.
The nurses were doing another handover over there.
All these simultaneous transfers and handovers, and at the same time, people were moving all the technology.
The nurses were trying to write down little bits of information on their scrubs.
And only when you looked at it at that point, you realised how chaotic this was.
All the little things matter, and when the little things start going wrong, you start getting errors cascading that lead to a big thing going wrong.
Dr Goldman and a colleague then had an unusual moment of revelation in front of the telly watching an F1 pit stop.
When we looked at a pit stop, these were really the experts of how teams from different specialties come together, reconfigure as a single unit, perform a complex task under time pressure in such an effective way.
When they looked more closely at F1 pit crews in action
they discovered that each individual on the team had a very simple but very specific, clearly defined task.
There's a guy to take off the tyre.
A guy to put it on.
Someone to take the used wheel away.
And crucially there's one person in charge - the man with the lollipop.
He decides when the car is ready to go back on the track.
What else did you pick up from watching how they operate? I think it's just a very professional approach, so there's the leadership, there's constant use of checklists.
There's great focus on task allocation at a pit stop where one person has got one or two jobs that they're doing.
A lot about situational awareness and then contingencies for things going wrong and a definitive plan.
So how did that affect what they did after operations?
I'm just going to put the gastric tube down and then I'll switch off the machine.
Thank you.
We just simplified the process into three phases.
So phase one is we'd just transfer the technology with no talking, so everybody knew what they were doing, so it's exactly like theatre.
' There is nothing connected here that doesn't need to be connected.
No drug, no piece of monitoring, no piece of life support.
All of it's got to go over.
All of it, ALL of it has got to work.
Phase two is organised by the person in charge of the transfer, and that person is the anaesthetist.
If you just point out that potassium is low and they need to top it up.
They decide when the patient can be moved, and that only happens after they run through a transfer checklist, which they call an aide-memoire at Great Ormond Street.
So we just hand over information in a fixed, consistent manner.
Right, so ready to go.
Only when everything is checked can the patient be moved from the operating table to the trolley.
Thank you.
For six-year-old Evelyn Soles, the four-hour open-heart surgery has been very successful.
Before these changes were introduced, there was no protocol that guided the transfer process.
Now, just like in F1, everyone around the patient has a simple but specific role.
Phase three is defined by the careful and heavily structured passing on of vital information when the patient arrives in the intensive care unit.
So this is Evelyn Soles.
She had an AVSD and a coarctation repair when she was four months old.
She developed subaortic stenosis and she had that resected when she was about four years old.
So today she's had a Ross Procedure.
I think our focus has been on the extreme of difficult technical procedures, and actually sometimes, as you know, the rewards are in very simple processes between humans.
She's had a low dose of morphine, five ml.
Last set of gases
And it's these procedures that are helping to save the lives of critically unwell children like Evelyn.
Dr Goldman calculates that there's been a 40% reduction in human error since he introduced these new Formula One-inspired protocols.
She got quite breathless
When you step back and look at that process, without adding any drugs or any extra technology, they manage to save lives.
Slowly but surely, medicine is waking up to the experience of others.
Improvements in engineering have had a major impact on aviation.
But human factors dominate training and practice in the industry.
And it's the same in the fire brigade and even in the high-octane world of Formula One.
And, as a doctor, they're beginning to change my everyday working life.
In my experience, they're making routine practice much safer.
But what I'm less sure about is the stuff that we can't plan for.
Those routine days that become your worst nightmare out of the blue.
When we face emergencies in medicine, they're usually messy and they unfold in seconds, giving doctors only the narrowest window before life is lost.
And so what I want to know is, in those extreme situations, is there anything at all that we can learn from the experience of others? In the recent past there's one event that stands out from all the others.
In airports across America, 15th January 2009 was a day like any other.
Until just before three in the afternoon on the east coast.
US Airways Flight 1549 was cleared for take off from La Guardia, New York's local airfield .
and bound for Charlotte, North Carolina.
Captain Chesley "Sully" Sullenberger barely had time to get settled into his chair before disaster struck.
His plane hit a flock of birds.
And his day flipped from routine into the worst of all possible emergencies in the blink of an eye.
The birds filled the windscreen as if it were a Hitchcock film.
I could feel and hear the thumps and thuds as we struck them.
At least two birds had gone through the right engine and one or two through the left.
I felt severe vibrations.
I heard terrible noises, as the engines were being damaged, that I'd never heard in an aeroplane before, and then the thrust loss was sudden, complete, bilaterally symmetrical.
Both engines at once.
I knew this was going to be an ultimate challenge of a lifetime, unlike anything I'd ever experienced before.
Losing both engines in a bird strike was something that Captain Sullenberger had never experienced.
An event so rare that he hadn't even trained for it.
I could feel my blood pressure, my pulse, shoot up.
I sensed my perceptual field narrow in a kind of tunnel vision because of this sudden stress.
It was marginally debilitating.
It absolutely impaired my ability to process what was happening.
What happened next has become the stuff of legend.
With no engine power his aircraft, with 155 passengers on board, had become little more than a clumsy glider.
He started going through his emergency checklist and let the local air traffic control know about his desperate problem.
His voice sounds measured and calm but that's not how he felt.
Listening to it now I can hear the slight raspiness, the slight higher pitch in my voice.
I know you're under stress.
His options were limited.
He couldn't get back to La Guardia or to any other local airport.
So he took the momentous decision to land the plane on the Hudson River, right in the heart of Manhattan.
I never thought I was going to die that day.
I was confident that, based upon my training and my experience, I could find a way to solve this problem and if I could find a way to deliver the aircraft to the surface intact, it would float long enough for us to be rescued.
Not a single person died that day and there were no major injuries.
Sullenberger was hailed as a hero.
But for him, this was more about training than heroism.
Over many decades, thousands of people in aviation had worked very hard to create a robust, resilient safety system in which we operate and that formed the firm foundation on which we could innovate, improvise to solve this crisis.
We set the tone, created a shared sense of responsibility, flattened the hierarchy, opened channels of communication and we have teams trained in the consistent application of best practices with well-learned, well-defined roles and responsibilities to each other and to the passengers.
In other words, I took what we did know, applied it in a new way to solve in 208 seconds this problem we'd never seen before.
There's a bunch of stuff that I never fully appreciated about Flight 1549 until that conversation.
The first thing is that Sully's first response, when things start to go wrong, is to be scared.
Fear's a natural experience.
In medicine, when things go wrong, the first thing you feel is scared, but what gets him through that is following the protocols.
He sticks to the procedures.
He starts the checklist even though he knows he's not going to have time to finish reading it.
And it's that, the idea that you standardise until you absolutely have to improvise, that makes everything better and allows them to survive.
But Captain Sullenberger had one big advantage in surviving this unprecedented emergency and that is locked away in his brain.
He's not even aware of it because it's a brand new scientific discovery.
And the good news is that we can all access it.
Uncovering exactly what it is means travelling to Michigan State University and the labs of psychology professor Jason Moser.
It also involves agreeing to wear some less than flattering headgear.
'These electrodes allow Jason to measure the electrical activity 'inside my brain, and look at what happens when I make a mistake.
' What are you hoping to read off me today? What we really want to see is how quickly your brain reacts to mistakes and how well it responds and bounces back from mistakes.
We really want to look at what are the neurobiological underpinnings of decision-making and mistake-making.
I'm all wired up and ready to go.
In this line of onscreen letters, all I have to do is press the left button if the middle letter is an 'N' and the right button if it's an 'M'.
It sounds simple but I'm still making mistakes.
It's a pretty straightforward task.
You feel like you really shouldn't make a mistake.
'M' is right, 'N' is left.
I feel like there shouldn't be any excuse for getting this test wrong.
Jason, meanwhile, directs everything from a control suite next door, watching as my brain ticks away.
So how did I do? All right, Kev, so what you're looking at here on this left screen, we're looking at these two responses that your brain puts out when you make a mistake.
This first response is the, "Oh no, something's wrong" response.
You can see that in these cold blue colours here.
That's basically the brain saying, "Something's gone awry.
What's up?" And within just a few hundred milliseconds as we kind of crawl across time here, you can see that this next brain blip, that hot red is telling us, "Now I'm paying attention.
"I see that I've made a mistake.
That's what's wrong "and now I'm going to do something about it.
" 'The faster your brain goes from blue to red, 'the more positive your attitude is to making mistakes.
' 'And that's vital.
' So you're correcting your mistake in that moment right after your brain tunes in and says, "Whoops, I've made a mistake.
"Let's zone in on it, let's fix it.
" You're correcting that response right away and following that, when you see the next set of letters, you're not only correcting that response that you just made wrong, but on the next set of letters you're perfectly accurate.
100% of the time you bounce right back after making a mistake and you get that next one right every time.
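That "bounce-back" measure has a simple shape: look at every trial that immediately follows an error, and ask what fraction of those were answered correctly. A minimal sketch (my own illustration, not Jason Moser's actual analysis code):

```python
def post_error_accuracy(trials):
    """Fraction of trials immediately following an error that are correct.

    trials: list of booleans, one per trial, True = correct response.
    Returns None if no errors occur before the final trial.
    """
    # Collect the outcome of each trial that directly follows a mistake.
    followups = [trials[i + 1] for i in range(len(trials) - 1) if not trials[i]]
    if not followups:
        return None
    return sum(followups) / len(followups)

# One error on the third trial; the very next trial is correct,
# so post-error accuracy is 1.0 - the "100% bounce-back" described above.
print(post_error_accuracy([True, True, False, True, True]))  # 1.0
```

A score of 1.0 is the pattern described here: every mistake is immediately followed by a correct response.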
'And, in a crisis, being positive about errors is essential.
'If you have a negative attitude to mistakes, 'you not only take longer to correct them, 'but Jason's research has shown that you end up making more of them.
' Learning from mistakes is something that runs deep in the DNA of the airline industry.
Every pilot, including Captain Sullenberger, is brought up with a positive attitude to errors.
In the end, it's that which makes flying so much safer.
Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died, or many people died.
So we have purchased, at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations.
We cannot have the moral failure of forgetting these lessons and have to relearn them.
And the airlines are already learning from Flight 1549.
There are now new procedures in place for a double bird strike.
It's a search for progress, rather than for someone to blame.
And it's a lesson medicine needs to learn.
Human error is always going to be with us.
It's how we deal with that that really matters.
I've spent my whole career looking for ways we can wrap science and technology around fragile physiology to protect it.
And it is a genuine revelation to me that we might do the same for psychology, making ourselves less fallible, giving ourselves in that moment the best possible chance.