> When there was xenon poisoning in the upper half of the core, the safety rods were designed in such a way that, at least initially, they were increasing (and not decreasing) the core reactivity.
I wonder if any reactor-design groups do "fuzz testing" on simulated models, checking that they can recover from very weird states even if it's not clear how the state could have been reached in the first place.
For example, having one section just arbitrarily xenon-poisoned, while another is arbitrarily too hot.
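The idea above can be sketched as a property-based fuzz test: generate arbitrary (possibly physically unreachable) states and check that a control law brings each one back to nominal. Everything below is a toy illustration with made-up dynamics and constants, not real reactor physics or any actual design group's practice:

```python
import random

def step(state):
    """Advance a toy reactor model one step under a simple feedback controller."""
    power, xenon, temp = state["power"], state["xenon"], state["temp"]
    # Controller: push toward nominal power 1.0 and compensate xenon
    # absorption, with limited control authority (clamped to [-1, 1]).
    control = min(max(2.0 * (1.0 - power) + 0.5 * xenon, -1.0), 1.0)
    # Net reactivity: control, minus xenon absorption, minus an overheating penalty.
    rho = control - 0.5 * xenon - 0.3 * max(temp - 1.0, 0.0)
    power = power * (1.0 + 0.1 * rho)             # multiplicative power change
    xenon = xenon + (0.05 * power - 0.1 * xenon)  # production vs. decay
    temp = temp + 0.1 * (power - temp)            # slow thermal response
    return {"power": power, "xenon": xenon, "temp": temp}

def recovers(state, steps=2000, tol=0.1):
    """Does the controller bring this state back near nominal power?"""
    for _ in range(steps):
        state = step(state)
    return abs(state["power"] - 1.0) < tol

random.seed(0)
failures = []
for _ in range(1000):
    # Arbitrary starting states: e.g. heavily xenon-poisoned, or far too hot.
    s = {"power": random.uniform(0.1, 3.0),
         "xenon": random.uniform(0.0, 2.0),
         "temp": random.uniform(0.5, 3.0)}
    if not recovers(s):
        failures.append(s)

print(f"{len(failures)} of 1000 fuzzed states failed to recover")
```

Any state that fails to recover is a candidate "weird state" for the designers to study, even if it's not clear how the plant could have reached it in the first place.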
Yes. Reactors now are designed never to have positive feedback loops that can result in uncontrollable power spikes, and worst-case simulations are run to prove it.
The Russian atomic regulator outright shut down a project to design a light-water breeder reactor, saying it was never going to be licensed. Light-water breeder reactors are barely theoretically possible, and they require trade-offs that limit the amount of water within the reactor core. Those trade-offs result in a positive void coefficient. It was supposed to be offset by other safety features, but better safe than sorry.
Yes, essentially this happens. PWRs and BWRs have operating limits on their power shape derived from doing those kinds of analyses.
They tend to be more physical than "arbitrarily xenon-poisoned", but they represent a variety of extreme and nominal states that form an operating envelope, and then healthy margins are applied on top of that.
1. The failure to scale the education quickly enough. Nation-scale nuclear energy was new when the RBMK line was introduced. Demand for nuclear engineers skyrocketed, and it was impossible to train the required number of professionals to the same standards as nuclear scientists in just a few years. Meanwhile, the RBMK assumed deeper knowledge of its design than they had.
2. The system that made academicians (the official Academy of Sciences title) equivalent to mid-to-large-caliber politicians within their area of expertise. As a result, Dollezhal's pride ran unchecked and prevented him from addressing well-known design flaws (which had already caused the 1975 accident).
Neither reason is unique to the USSR, and both can be learned from (something that is often ignored because "it can't happen here").
Re: "The failure to scale the education quickly enough" - Midnight in Chernobyl makes the case that the USSR actively fought to prohibit any disclosure of Soviet nuclear reactor vulnerabilities, mitigations, and accidents to the reactor operators.
The Soviets did not fail to educate their workforce; the Soviets intentionally left their workforce in the dark.
> The Soviets did not fail to educate their workforce; the Soviets intentionally left their workforce in the dark.
Yes, and guess who was behind that? This is not either-or; I'm already talking about that in point 2, which is a high-level cause. Dollezhal's NIKIET specifically refused to admit its problems or to sign the document pointing out the RBMK's flaws, and it got away with things like this multiple times. This was a consequence of a select few engineers and scientists having full carte blanche, basically acting as limited-scale politicians within their sphere of influence. This was not a problem of the nuclear industry in particular; it applied to all critical industries, like aerospace, where the "chosen ones" immediately started fighting for their place in the sun, stalling their fields.
> was a “nukie” when he enlisted. Dumb as a box of rocks
Not sure what time period this took place, but if you're referring to US Navy Nukes, then ~20 years ago Nukes needed a high ASVAB score, 95+ (top 5% of test takers). I think it's a bit lower now - and there are other tests they can combine scores with - since they all leave for high-paying jobs after their enlistment ends. The standard now may not be what it was for simply being a Nuke; but to also be a Submariner (a volunteer position), he would have had to learn all of the submarine systems (mechanical, electrical, navigation, weapons, safety, etc.), and then a board of senior submariners would decide whether he was proficient enough to serve on a submarine. You may be being a little harsh calling your BIL "dumb as a box of rocks" if he accomplished all that.
Yes, the RBMK was not an inherently safe design (in fact it was deeply flawed), but it could be operated safely if you didn't deviate from procedures. Which they did, because they were never told about the design flaws (they were a state secret).
Huh, did you stop watching after episode 1? The HBO series was all about how it was the system that caused the accident to happen. The fact that it was a known failure mode was a major plot point.
If you watch the court scene at the end, they clearly repeat the KGB line that the local operators were primarily to blame for "reckless" decisions. Perhaps you missed this obvious conclusion of the series.
People are free to draw their own conclusions from the series. The conclusions I drew are:
1. The blame game started right away ("Of course you are responsible. I was not even in the building" - Bryukhanov to Dyatlov on the night of the explosion).
2. During the test, some actions were taken with complete disregard for safety and consequences. Some people would call this reckless.
3. If this accident did not happen at Chernobyl it would have happened somewhere else.
Procedures are written under the assumption that the actual system behaves like the theoretical system the engineer has in their head. It never quite does. There's always a gap, and this gap nearly always requires deviating from procedures to ensure safe operation.
Deviating from procedures prevents as many accidents as it causes. Safety cannot be based on adherence to procedure. Safe systems must be designed to take advantage of (and be protected against, I suppose) human ingenuity.
>Deviating from procedures prevents as many accidents as it causes.
And they weren't making some small deviation from procedure. They were doing something that was expressly forbidden, and they knew why it was forbidden. In the time leading up to the accident it would have been difficult to distinguish what they were doing from intentionally trying to cause a meltdown. In a reactor experiencing xenon poisoning, instead of shutting down for 24 hours (the procedure), they removed every mechanism restraining the nuclear reaction that they could.
This isn't a smart little deviation, it's pouring gasoline on a fire and hoping for a good outcome. It is hard to describe how stupid this was.
They are referring to a wider range of situations than this very specific incident. Many accidents have been avoided by modifying procedures on the fly. Pilots discovering that letting go of the yoke/stick resolved some seemingly unrecoverable spins is one example. Though obviously, techniques that work eventually get adopted into the procedures.
It generally requires operators to know how the systems work in great detail which would have avoided the specific Chernobyl incident.
> It generally requires operators to know how the systems work in great detail which would have avoided the specific Chernobyl incident.
Hence the above comment doesn't fit with the discussion.
The level of "bad" being done by the engineers was right out of the "never do this" list of actions. Unfortunately there's a lot of evidence that bravado or "experience" is what led to them thinking that they knew better.
Not following procedure is fine if it's "turn it left, not right." It's not good when, instead of turning a valve, they take a hammer to it. That's throwing the instructions away, which makes any sensible discussion from there on moot.
I’d be careful how you apportion blame among the operators vs. the power plant designers vs. the people who designed the test vs. the USSR’s culture at the time.
In that exact situation you could swap many people inside the USSR into those roles and likely get a similar result. The specific people who decided to cover up the RBMK’s flaws were responding to the same incentives as the people who covered up an earlier serious incident at Chernobyl on September 9th, 1982, etc.
The U.S.S.R. had a lot of nuclear issues, and there’s arguably more to learn from the indirect causes than the direct ones.
Taking out an excessive number of control rods is something that many people have attempted including US operators. If the reactor lets you do it then that in and of itself is a massive design flaw.
Sigh, I'm done. Y'all would fail any technical interview I'd ever give regardless of your experience level.
Going on and on about the merits of not following procedure in a discussion that started with a disaster which could have triggered a global mass extinction event is not the appropriate response. They were doing things which were very well known to be dangerous, which they themselves knew to be dangerous, that had been known to be dangerous since literally the first nuclear reactor at the Manhattan Project some 40 years previous.
And you’d absolutely fail mine, on both technical grounds and for hyper-focusing on irrelevant details. “Could have triggered a global mass extinction event”? WTF.
I can only hope you’re never involved in safety critical systems. Procedures simply can’t account for multiple independent failures at the same time in complex systems without defaulting to individual judgment, the problem space just grows too large.
To me, Chernobyl is an example of the classic conundrum that engineers cannot possibly think of every single weird thing that could happen, so nothing can be made perfectly safe. It applies to software design just as much as to a nuclear reactor. Sometimes it takes a failure actually happening before something can be made safer. Some things are just more consequential when they fail, making the learning from failure much more expensive.
Not really. Corners were cut, the ultimate issue which caused the explosion was known beforehand, and the operators violated several points of standard procedure as well as doing several basic unwise things. It was not at all a case of unknown edge cases but stupid piled on top of stupid until the damn thing exploded.
The biggest, stupidest action was trying to operate a reactor that was very clearly experiencing xenon poisoning, including the many unsafe things they did to try to overcome the poisoning. I'm pretty sure modern reactors still shut down for 24 hours to avoid the xenon issue. This was well known; even without the design flaws it was a huge risk, and anyone with an ounce of sense would have known not to do what they did leading up to attempting to scram the reactor.