The Slow Fix: Solve Problems, Work Smarter and Live Better in a Fast World. Carl Honore
Military folk have always known that owning up to mistakes is an essential part of learning and solving problems. Errors cost lives in the air force, so flight safety has usually taken precedence over fare bella figura, the urge to save face. In the RAF’s long-running monthly magazine, Air Clues, pilots and engineers write columns about mistakes made and lessons learned. Crews are also fêted for solving problems. In a recent issue, a smiling corporal from air traffic control received a Flight Safety Award for overruling a pilot and aborting a flight after noticing a wingtip touch the ground during take-off.
In the RAF, as in most air forces around the world, fighter pilots conduct no-holds-barred debriefings after every sortie to examine what went right and wrong. But that never went far enough. RAF crews tended to share their mistakes only with mates rather than with their superiors or rival squadrons. As one senior officer says: ‘A lot of valuable experience that could have made flying safer for everyone was just seeping away through the cracks.’
To address this, the RAF hired Baines Simmons, a consulting firm with a track record in civil aviation, to devise a system to catch and learn from mistakes, just as the transportation, mining, food and drug safety industries have done.
Group Captain Simon Brailsford currently oversees the new regime. After joining the RAF as an 18-year-old, he went on to fly C-130 Hercules transport planes as a navigator in Bosnia, Kosovo, northern Iraq and Afghanistan. Now 46, he combines the spit-and-polish briskness of the officers’ mess with the easy charm of a man who spent three years as Equerry to Her Majesty Queen Elizabeth II.
On the whiteboard in his office he uses a red felt-tip pen to sketch me a picture of a crashed jet, a dead pilot and a plume of smoke. ‘Aviation is a dangerous business,’ he says. ‘What we’re trying to do is stop picking up the deceased and the bits of the broken aeroplane on the ground and pull the whole story back to find out the errors and the near misses that can lead to the crash, so the crash never happens in the first place. We want to solve issues before they become problems.’
Every time crew members at RAF Coningsby catch themselves doing something that could jeopardise safety, they are now urged to submit a report online or fill in one of the special forms pinned up at workstations all over the base. Those reports are then funnelled to a central office, which decides whether to investigate further.
To make the system work, the RAF tries to create what it calls a ‘just culture’. When someone makes a mistake, the automatic response is not blame and punishment; it is to explore what went wrong in order to fix and learn from it. ‘People must feel that if they tell you something, they’re not going to get into trouble, otherwise they won’t tell you when things go wrong, and they might even try to cover them up,’ says Brailsford. ‘That doesn’t mean they won’t get told off or face administrative action or get sent for extra training, but it means they’ll be treated in a just manner befitting what happened to them, taking into account the full context. If you make a genuine mistake and put up your hand, we will say thank you. The key is making sure everyone understands that we’re after people sharing their errors rather than keeping it to themselves so that we’re saving them and their buddies from serious accidents.’
RAF Coningsby rams home that message at every turn. All around the base, in hallways, canteens and even above the urinals, posters urge crew to flag even the tiniest safety concern. Toilet cubicles are stuffed with laminated brochures explaining how to stay safe and why even the smallest mishap is worth reporting. Hammered into the ground beside the main entrance is a poster bearing a photo of the Station Flight Safety Officer pointing his finger in the classic Lord Kitchener pose. Printed above his office telephone number is the question: ‘So what did you think of today?’ The need to admit mistakes is also instilled in cadets at military academy. ‘It’s definitely drilled into us from the start that “we prefer you mess up and let us know”,’ says one young engineer at RAF Coningsby. ‘Of course, you get a lot of stick and banter from your mates for making mistakes, but we all understand that owning up is the best way to solve problems now and in the future.’
The RAF ensures that crew see the fruits of their mea culpas. Safety investigators telephone everyone who flags up a problem within 24 hours of the report, and later tell them how the case was concluded. They also conduct weekly workshops with engineers to explain the outcome of every investigation and why people were dealt with as they were. ‘You can see their eyebrows go up when it’s clear they won’t be punished for making a mistake and they might actually get a pat on the back,’ says one investigator.
Group Captain Stephanie Simpson, a 17-year veteran of the RAF, is in charge of safety in the engineering division at Coningsby. She has quick, watchful eyes and wears her hair scraped back in a tight bun. She tells me the new regime paid off recently when an engineer noticed that carrying out a routine test on a Typhoon had sheared off the end of a dowel in the canopy mechanism. A damaged canopy might not open, meaning a pilot trying to eject from the cockpit would be mashed against the glass.
The engineer filed a report and Simpson’s team swung into action. Within 24 hours they had figured out that an elementary mistake during the canopy test could damage the dowel, and that there was no requirement to go back and check the part afterwards. Flight crews immediately inspected the suspect part across the entire fleet of Typhoons in Europe and Saudi Arabia. The procedure was then changed to ensure that the dowel is no longer damaged during the test.
‘Ten years ago this would probably never have been reported – the engineers would have just thought, “Oh, that’s broken, we’ll just quietly replace it,” and then carried on,’ says Simpson. ‘Now we’re creating a culture where everyone is thinking, “Gosh, there could be other aircraft on this station with the same problem that might not be spotted in future so I’d better tell someone right now.” That way you stop a small problem becoming a big one.’
Thanks to Patounas’s candour, an RAF investigation discovered that a series of errors led to the near miss above the North Sea. His own failure to hear the order to bank left was the first. The second was that the other pilots changed course even though he did not acknowledge the fresh heading. Then, after Patounas overshot, the whole team failed to switch on their lights. ‘It turned out a whole set of factors were not followed and if anyone had done one of the things they should have, it wouldn’t have happened,’ says Patounas. ‘The upside is this reminds everyone of the rules for doing a Phase 3 VID at night. So next time we won’t have the same issue.’
Others in his squadron are already following his lead. Days before my visit, a young corporal pointed out that certain procedures were not being properly followed. ‘What she said was not a particularly good read, but that’s going in her report as a positive because she had the courage of her convictions to go against the grain when she could have been punished,’ says Patounas. ‘Twenty years ago, she wouldn’t have raised the question, or if she had, she’d have been told, “Don’t you say how rubbish my squadron is! I want my dirty laundry kept to me,” whereas I’m saying thank you.’
The RAF is not a paragon of problem-solving. Not every mistake or near miss is reported. Similar cases are not always dealt with in the same way.