The risks are highest when learners are at a beginner to intermediate stage. They know the basics, and have gained some confidence, but don't know enough to get themselves out of trouble.
This is called Stage 1 in the Gordon Model of learning: unconscious incompetence.
While this is true, in the context of alpine climbing, where I first heard this statement, the bold alpinists who die young are very much not beginner-intermediates. So I've interpreted it as something other than just the "Bathtub Curve"[1] applied to dangerous pursuits.
Rather, there is a certain amount of objective risk in alpine environments, and the more time you put yourself in that environment, especially in locations you aren't familiar with, the greater the chance that something will eventually go wrong.
I'm always surprised by the number of famous alpinists who weren't killed on their progressive, headline-capturing attempts but rather on training climbs and lesser objectives.
My wife teaches people to ride horses for a living, so we talk about the safety of that.
Many of the people you hear about getting seriously injured riding are professionals or people who compete at a high level. They are doing dangerous things, and doing a lot of them.
We don't think it is that dangerous at the level we ride; in maybe 15 years we've had one broken bone.
The other day I noticed that we had acquired a used horse blanket from another barn in the area, one that is a running joke at our barn because of its bad safety culture. It is a "better" barn than ours in that it is attached to the show circuit at a higher level than the bottom, but we are always hearing about crazy accidents that happen there. When I was learning to ride there, they had a confusing situation, with too many lessons going on at once, where I wound up going over a jump by accident after a "near miss" in which I almost did. (I never thought I could go over a jump and survive; as it was, I had about two seconds to figure out that I had to trust the horse and hang on, and I did alright...)
Another good analogy: in the US Air Force, the flight crews considered most dangerous are those with the highest collective rank. Sure, the young crews are still learning, but the old ones think they know it all and have often forgotten critical details.
(Example) When you go climbing somewhere, you have something like a 40% chance of getting killed that you can mitigate completely by skill, and an additional 0.1% chance that something goes wrong by some fluke, which you can't mitigate at all.
Those are pretty good odds if you go climbing 10 times a year, and pretty bad ones if you go 1000 times, because the fluke risk compounds with every outing.
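A quick back-of-the-envelope sketch of how that small unmitigable risk compounds, using the hypothetical 0.1% per-outing figure from the example above (not real climbing statistics):

    # Hypothetical per-outing fluke risk from the example above (0.1%),
    # assuming the skill-mitigable portion has been fully mitigated away.
    p_fluke = 0.001

    for outings in (10, 100, 1000):
        # Outings are treated as independent, so survival probabilities multiply.
        survival = (1 - p_fluke) ** outings
        print(f"{outings:>4} outings: {survival:.1%} chance of never hitting the fluke")

    # Prints roughly 99.0%, 90.5%, and 36.8% respectively.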
They wouldn't be famous if they didn't succeed on headline-capturing attempts, and there are only so many of those you can realistically make in a life. They are dead, however, because doing dangerous things often enough will kill a substantial number of practitioners.
No, the risks are greatest when you reach complacency. Beginners, even bold ones, take some care. You mostly see this in jobs like forklift driving, because it takes years of doing the same thing every day before you get expert enough to be complacent.
There is also something called "Normalization of Deviance", best defined by a quote: "Today, the term normalization of deviance — the gradual process by which the unacceptable becomes acceptable in the absence of adverse consequences — can be applied as legitimately to the human factors risks in airline operations as to the Challenger accident." *
Most of you have probably heard of it in the context of fighter pilots doing riskier and riskier maneuvers, but it applies just as well to drivers who speed a lot: 80 starts seeming really slow after years of doing it.
Thanks for posting these; I'd only seen Normalisation of Deviance mentioned in two YouTube videos by Mike Mullane and never thought to look any further. Two excerpts I found interesting in this context:
"Chapter nine she explains how conformity to the rules, and the work culture, led to the disaster, and not the violation of any rules, as thought by many of the investigators. She concludes her book with a chapter on lessons learned."
"She mainly emphasizes on the long-term impact of institutionalization of the political pressure and economic factors, that results in a “culture of production”."
Vaughan's book The Challenger Launch Decision doesn't tell this truth: the root cause of the accident can be traced back a decade, to the acceptance of a design that was "unsafe at any speed".
Every other manned space vehicle had an escape system. The crew of the Challenger was not killed by the failure of the SRB or the explosion of the external tank, but rather when the part of the orbiter they were in hit the ocean. They could have built the crew compartment as a reinforced pod with parachutes or some other ability to land, but they chose not to because they wanted the payload section in the rear.
In the case of Columbia it was the fragile thermal protection system that did the astronauts in. There was a lot of fear in the first few flights that the thermal tiles would get damaged and fail, and once they thought they'd dodged that bullet they didn't worry about it so much.
"Normalization of deviance" was a formal process in the case of the space shuttle: there were meetings where people went through a list of a few hundred unacceptable situations that they convinced themselves they could accept, often by taking some mitigations.
When the design was finalized, it was estimated that a loss of vehicle and crew would happen about 2%-3% of the time, which was about what we experienced: 2 losses in 135 flights, roughly 1.5%. (Originally they planned to launch 50 missions a year, which would have meant the continuous trauma of losing astronauts and replacing vehicles.)
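Spelling out that arithmetic as a quick sketch (the 2% figure is the low end of the estimate quoted above; 135 is the total number of shuttle missions actually flown):

    # Low end of the design-era loss-of-vehicle-and-crew estimate (2% per flight).
    p_loss = 0.02

    # Expected losses over the program as actually flown (135 missions).
    print(f"Expected losses over 135 flights: {p_loss * 135:.1f}")  # ~2.7 (2 occurred)

    # Expected losses per year at the originally planned flight rate.
    print(f"Expected losses per year at 50 flights: {p_loss * 50:.1f}")  # ~1.0 per year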
It's easy to conclude that it was a particular scandal that one particular concern got dismissed during a "normalization of deviance" meeting, but given a poorly designed vehicle it was inevitable that, after thousands of good calls, there would eventually be a critical bad one.
"Normalization of deviance" is frequently used for a phenomenon entirely different from what Vaughan is talking about: something informal that happens at the level of individuals and small groups. That is, the forklift operators who come to the conclusion that it is OK to smoke pot at work, the surgeon who thinks it is OK not to wash his hands, etc. A group can pressure people to do the right thing here, but it's something different from the slow-motion horror of a bureaucracy that tries to do the right thing but cannot.
I'm reminded of Louis Slotin experimenting with the "Demon Core". The core was surrounded by two half-spheres of beryllium, and it would go critical if the two halves were not kept separated from each other.
The standard protocol was to use shims between the halves, as allowing them to close completely could result in the instantaneous formation of a critical mass and a lethal power excursion. Under Slotin's own unapproved protocol, the shims were not used and the only thing preventing the closure was the blade of a standard flat-tipped screwdriver manipulated in Slotin's other hand. Slotin, who was given to bravado, became the local expert, performing the test on almost a dozen occasions, often in his trademark blue jeans and cowboy boots, in front of a roomful of observers. Enrico Fermi reportedly told Slotin and others they would be "dead within a year" if they continued performing the test in that manner. Scientists referred to this flirting with the possibility of a nuclear chain reaction as "tickling the dragon's tail", based on a remark by physicist Richard Feynman, who compared the experiments to "tickling the tail of a sleeping dragon".
On the day of the accident, Slotin's screwdriver slipped outward a fraction of an inch while he was lowering the top reflector, allowing the reflector to fall into place around the core. Instantly, there was a flash of blue light and a wave of heat across Slotin's skin; the core had become supercritical, releasing an intense burst of neutron radiation estimated to have lasted about a half second. Slotin quickly twisted his wrist, flipping the top shell to the floor. The heating of the core and shells stopped the criticality within seconds of its initiation, while Slotin's reaction prevented a recurrence and ended the accident. The position of Slotin's body over the apparatus also shielded the others from much of the neutron radiation, but he received a lethal dose of 1,000 rad (10 Gy) neutron and 114 rad (1.14 Gy) gamma radiation in under a second and died nine days later from acute radiation poisoning.
That reminds me of when a family friend from church needed help clearing his land and thought the easiest approach would be to teach an overconfident 14-year-old (me) to drive his tractor. He told me I would never be worse at operating it than the second time we went out. He was right.