The NTSB does a fantastic job and should be held up as a model of how to incorporate post-failure analyses and actions into your process. When the report is released, it should be good reading for many in the tech and engineering world.
>The Joint Authorities Technical Review, which produced the report, was led by Chris Hart, a former chairman of the National Transportation Safety Board, and included representatives from the F.A.A., NASA and aviation regulators from Europe, China, Brazil and other countries.
Most (all?) European countries have similar accident investigators who likewise produce reports focused on Prevention Of Future Harm. In the UK these are the MAIB (maritime accidents, i.e. boats), RAIB (rail) and AAIB (air).
As with NTSB investigations, a critical legislative intervention is that accident investigations are NOT admissible as evidence in a civil court. That means witnesses have one less reason to cover anyone's backside or to lie to an investigator.
Unfortunately, as the MAIB examples show, it isn't always enough. It's nothing short of astonishing how often a sole watchkeeper on a big ship is somehow unaware of where the ship is, almost exactly as though they were asleep, and the BNWAS (a device that alerts the rest of the crew when the bridge isn't paying attention, e.g. because the watchkeeper has fallen asleep) is for some reason not working or has its logging disabled. But I don't remember once reading in these reports that an officer confessed, "Yeah, I was asleep."
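For anyone curious how a BNWAS works, it's essentially a watchdog timer. Here's a minimal sketch of the idea, purely illustrative and not any real ship's system (the timings, names, and escalation behavior are my own assumptions): the watchkeeper must show activity within a dormant period, otherwise an alarm sounds on the bridge and then escalates to the rest of the crew.

```python
import time

class BridgeWatchAlarm:
    """Toy model of a bridge watch alarm (BNWAS-like watchdog).

    The watchkeeper must show activity within `dormant_s` seconds.
    If they don't, an alarm sounds on the bridge; if still no response,
    the alarm escalates to the rest of the crew.
    """

    def __init__(self, dormant_s=300, bridge_grace_s=15, log=print):
        self.dormant_s = dormant_s            # quiet period before the first warning
        self.bridge_grace_s = bridge_grace_s  # time to cancel before escalation
        self.log = log                        # if logging is disabled, nothing gets recorded
        self.last_reset = time.monotonic()

    def reset(self):
        """Called whenever the watchkeeper shows activity (button press, motion sensor)."""
        self.last_reset = time.monotonic()
        self.log("watch reset acknowledged")

    def poll(self):
        """Call periodically; returns the current alarm stage."""
        idle = time.monotonic() - self.last_reset
        if idle < self.dormant_s:
            return "quiet"
        if idle < self.dormant_s + self.bridge_grace_s:
            self.log("ALARM: bridge warning")
            return "bridge_alarm"
        self.log("ALARM: escalate to rest of crew")
        return "crew_alarm"
```

The point of the investigations' findings is that if the device is switched off or its logging disabled, none of this protection (or evidence) exists.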
How do civil courts judge accidents, then, if not using the NTSB report? Do they have to come up with a separate investigation which discovers the same facts, but with a plausible parallel construction?
If a lawyer wants to introduce evidence in a civil case, then yes, they'd need to go gather that evidence themselves.
They can expect to run into plenty of ass-covering, exactly as they would if there were no accident investigation authority. Plenty of people who won't talk to them, or at the least won't talk unless subpoenaed, and will bring a lawyer to shut down as much as possible.
Sometimes I read NTSB reports for fun. It's a pleasure to read such well-thought-out analyses that perfectly straddle dissection of technical minutiae and layman accessibility. In my opinion, it's the pinnacle of technical writing. It's also incredibly important work: these kinds of reports have made air travel orders of magnitude safer than it was just 50 years ago.
A full decision in a good court case is pretty interesting to read too. A judge working through exactly why, in this particular scenario, they have decided to reach the decision they did.
For example, Baskin v. Bogan and Wolf v. Walker (state appeals on gay marriage that a circuit court decided to bundle together, since both states' arguments essentially go "Yeah, but gay marriage doesn't make babies, so it makes sense we don't allow it").
Now there are two much more famous gay-marriage cases in the US, before the US Supreme Court. The problem is, those decisions are garbage. Windsor is "good" only in that it had the effect gay-marriage proponents wanted, but the arguments offered by the justices on both sides feel like weak excuses. Scalia seems to be desperately reaching for reasons why it's OK to do something manifestly unfair that he happens to support, while Kennedy and the majority have to do mental gymnastics to explain why a case in which one side has no real standing is even allowed to proceed to a decision in the first place (the real reason being: so they can decide in favour of Windsor).
Much older, but very interesting for this reason and with the great benefit that it doesn't involve any politics you might worry would taint your understanding, would be Carlill v Carbolic Smoke Ball Company in England. Read as three of the best legal minds essentially /invent/ modern contract law to find a path to getting Mrs Carlill her money from the scumbags who lied to her about their bogus cure for the flu.
It's a terrible accident (that happened during an air show/race), but the work that went into figuring out exactly what went wrong and where is awe-inspiring. Also a very interesting (but tragic) story from a human standpoint. Here's the video of the accident: https://www.youtube.com/watch?v=JyWUTXuXjr0
(No gore, but still: viewer discretion advised. Eleven people lost their lives and sixty-nine were injured.)
This one from 2004 involves a modern regional jet (a Bombardier RJ) dead-stick crashing on a repositioning flight. The crew's decision-making and remedial actions boggle the mind. I'd call it a case study in horrendous pilotage. Thankfully, no innocents died.
The more I think about this, the more I think that the top levels of management, and anyone who was just trying to "get this plane out the door", need to be held accountable. Fear of a competitor shipping before you is no excuse to cut corners, especially when the consequences of doing so are so grave.
Sure, but I think regulators have different incentives than Boeing management. The only situation I can think of that would hold them equally accountable is if there were collusion or corruption. I think in this case it was a lack of resources, laziness, or being overly trusting of or naive about what Boeing was telling them.
That doesn't seem nearly as malicious. Part of the problem with regulation is constant budget cuts and never any expansion, even though the workload usually grows; it's very easy to believe that the poor job done in this situation was simply the only way they could make it work given the constraints. Actively cutting corners to try to beat the competition is much more of a deliberate decision, and it bears more responsibility.
I respect Apple's style of releasing products. Rather than releasing a half-baked, crappy product (think Samsung Galaxy Fold), take your time and make a truly remarkable product. First-mover advantage is worthless if your product crashes (pun intended).
"The task force said the certification documents that Boeing provided to the F.A.A. “were not updated during the certification program to reflect the changes” made to MCAS. It added that two critical documents that describe the potential dangers of a system like MCAS, the system safety assessment and the functional hazard assessment, “were not consistently updated."
Interesting tidbit: after the second crash, the Ethiopian authorities decided to send the black box to Europe for analysis, apparently not trusting US investigators with the data.
> Interesting tidbit: after the second crash, the Ethiopian authorities decided to send the black box to Europe for analysis, apparently not trusting US investigators with the data.
William Langewiesche [0] describes lack of airmanship as contributing to the 737 Max crashes and cites another reason for distrust:
"In the case of the Ethiopian investigation, we have an airline and an investigative body that historically have not been able to isolate themselves from the country’s dysfunctional political life."
"After the cockpit voice recorder was dug out of the wreckage, it was shielded from the N.T.S.B. and whisked to Paris. There, for reasons unknown, French accident investigators agreed to download its contents in private onto a drive for an immediate return to Addis Ababa, where the information remains mostly locked away today and has been withheld in full form from any outside observers."
Training is an issue, because the pilots had at their fingertips the means to recover from the malfunction (turning off the stab trim via the cutoff switches).
The plane was traveling fast enough that the aerodynamic load on the control surfaces meant the pilots weren't able to manually trim the plane back into alignment.
The only way to overcome those forces was to use the electric motors to adjust the rear stabilizer trim. However, those motors are also disabled by the stab trim cutoff switches.
After-the-fact simulations confirmed that such forces could well have prevented manual control.
There's no evidence MCAS upset looks exactly like runaway trim in every case, most cases, some cases, or these specific cases. This is a supposition made only by Boeing, and repeated by others. What has since been demonstrated is that it's possible for MCAS upset to result in a mistrim significant enough that the pilots can't recover in time if they're at a low enough altitude.
So I reject the categorical claim they could have recovered with the flip of a switch.
Further to that point, the preliminary report on ET302 shows they did turn the stabilizer trim switches to cutoff, and yet they couldn't move the trim manually, likely due to mistrim forces. The logical reason they turned stabilizer trim back on is that it was the only way to get electric-motor-assisted trim to try to get out of the mistrim. But MCAS immediately triggered again, resulting in more than 20 degrees nose down and negative G forces on the pilots; it was impossible to recover from that. The final act of MCAS, had it come from a human pilot, would surely be considered sabotage. It's that ridiculous a reaction under any circumstance: even had the airplane's angle of attack been great enough to legitimately trigger MCAS, it was a gross overreaction, at low altitude, commanding a path steeply below the horizon and incompatible with survival.
> There's no evidence MCAS upset looks exactly like runaway trim in every case
If uncommanded trim motor action is forcing the airplane into a dangerous dive, it's runaway trim. The pilots must have thought it was runaway trim as they were desperately trying to counter it.
> So I reject the categorical claim they could have recovered with the flip of a switch.
But that's what did happen on the previous Lion Air flight, which landed safely.
> it was the only way to get electric motor assisted trim to try and get out of the mistrim.
That's right.
> But MCAS immediately triggered again
The electric trim switches override the MCAS.
So why didn't the Ethiopian pilots do that override? I don't know, and neither do you. We'll have to see what the NTSB report says.
Should those training processes not be blamed? After all, it wasn’t Southwest Airlines that crashed and the sketchy founder of Lion Air certainly didn’t have safety as a core value.
The weird remarks about Ethiopia were quite out of place; Ethiopian Airlines has an excellent safety record. It's also telling that Ted Cruz insisted very quickly after the second crash that the MAX be grounded: it would have been very bad for his career if one had crashed in Dallas and he had declared them safe.
I'm looking for an aviation thread where this article was discussed, but I can't find it right now.
But it was heavily criticised. Just from a cursory reading you can see many cases of subtle and not so subtle language choices that show a heavy bias and a string of one-sided arguments.
Does he consider airmanship lacking in the USAir 427 crash, or in Colgan Air 3407? In both cases the pilots pulled back on the yoke the entire time, making recovery impossible, exactly contrary to all of their training. He's dismissive of the "startle factor" in Air France 447 as well; in fact he doesn't even bring it up, or any of the other factors in the final report, he just dings the pilots for lacking airmanship.
This is not an insignificant point. MCAS was flawed, no question. As a pilot of an airplane with enhanced stability control, I have found that anything that tries to “help” by nose down or nose up during times of unusual flight attitudes is unwelcome and honestly scares the hell out of me when the autopilot kicks on and overrides my inputs. Disengaging the autopilot works for a moment, then it kicks back on: the solution is to pull the breaker when that happens. Granted, in my case, I was attempting to intentionally stall the airplane during familiarization training in a Cessna T206. However, an autopilot with “stability protection” can theoretically fly you into the ground if you aren’t trained correctly. So I am not a fan of the airplane trying to “protect” the pilot by taking over control. However, reacting to system anomalies is a critical part of training. Meaning knowing how and when to react to such situations is fundamental to flying the airplane.
If we stipulate the MCAS system was faulty, that still doesn't relieve the pilots of responsibility. The Ethiopian copilot had only 300 total flight hours, so it's fair to say that copilot wasn't experienced with flying in general, let alone with type-specific experience in an airliner. Since the captain would have been flying the airplane, the copilot would have been the one running through the checklists. Airliners require two pilots for a reason; that copilot had no business being in the right seat of a technologically advanced airplane. Both pilots specifically failed to follow the procedures. First, they didn't adjust the throttle at all during the event (it remained at climb power throughout). Second, they re-engaged MCAS: while they pulled the horizontal stabilizer trim cutout switches correctly, they failed to realize that they then had to use the manual trim wheels, since the electric yoke-mounted trim would be inoperative at that point.
Yes, MCAS was deficient; however, 90% of aviation training is learning how to respond when things go wrong. Complaining about one issue while ignoring the other is disingenuous. Good pilots wouldn't have crashed that plane. There is a reason that airline pilots in the US must have 1,500 flight hours before being allowed in the right seat; there is a reason that airliner crashes in the US are so exceedingly rare. The 737 Max incident revealed plenty about Boeing and system design, but it also shined a light on the effects of substandard pilot qualification in places like Indonesia and Ethiopia. Considering the US flew the 737 Max vastly more than anyone else, a broken airplane would statistically have resulted in a US crash, yet the two crashes we did have were with airlines from countries with debatable pilot qualification processes. Attempting to keep the cockpit voice recorders from public view and giving them to France (home of Airbus, it might be added) suggests that Ethiopia had something to hide regarding their pilots' actions. If it was clearly 100% "Boeing's fault," then Ethiopia would want all data surrounding the crash to be on the front page of every newspaper, if only to bolster their case that their pilots weren't at fault. But instead, they hide the data. Why hide supposedly exculpatory evidence? Because it wasn't exculpatory at all.
I get it, MCAS bad. But those passengers would still be alive if it weren't for bad pilots. Given that Ethiopian Airlines is a crown jewel and a vital marketing tool for the country, it's clear why they wanted to hide any hint that the flag carrier's pilots were questionable.
The pilots did not perform ideally, but to say their actions were outside of what could be reasonably expected is foolish. This is a system for which they received no training, which acts in an obscure and intermittent fashion, and which is accompanied by a slew of cockpit warnings.
Furthermore, with the overspeed and extreme trim, the trim wheel was likely inoperable, or else required so much force that the pilots reasonably thought it was inoperable. Due to the overspeed, the pilots were in an impossible situation where they concluded they needed electronic trim control while also knowing that this system was threatening their demise.
And not only that, it follows a well-established pattern of (poor) accident investigations that look at the event most proximate to the accident (the pilots in the cockpit) and assign the blame there, instead of focusing on systemic issues that were much greater contributors to the overall situation that led to the accident.
The pilots did the best they could. They increased airspeed to have at least some lift despite the excessive mistrim, and they turned electric trim back on because in the 737-NG autotrim stops when the yoke is pulled. No one told them about MCAS and how it behaves differently from the model that they knew.
I think there is a lot of misinformation in your post.
The pilots didn't follow the airspeed unreliable procedure, and oversped the aircraft. Maybe it's the best they could do, but definitely not what they were expected (and trained) to do.
Electric trim is not turned off "when the yoke is pulled" - it's momentarily turned off when electric trim thumb switches on the yoke are pressed.
And they switched it back on because they couldn't manually re-trim due to very high speed that the plane was travelling at that point.
> But those passengers would still be alive if it weren’t for bad pilots.
Are you an experienced pilot, intimately familiar with the 737 MAX, or otherwise an aviation expert qualified to be a judge of this?
I can't see how you could make such a statement with any authority otherwise, and your profile does not seem to indicate you are.
I am neither, but it has also come to light that Boeing hid MCAS from airlines and pilot training materials, and even hid details from the FAA.
Together with the quick worldwide action by aviation authorities and the prolonged grounding, even in the US, that is enough to make this line of reasoning very doubtful.
> Are you an experienced pilot, intimately familiar with the 737 MAX, or otherwise an aviation expert qualified to be a judge of this?
I am not a pilot, but I worked on the stab trim design for the 757. There are cutoff switches for the stab trim on the console, and their purpose is to stop uncommanded trim movement. They were successfully used on another Lion Air flight to recover from MCAS malfunction.
The electric thumb switches will also override MCAS and can be used to trim the stabilizer back to normal, and then cut off further trim with the cutoff switches. In both incidents the pilots were able to bring the trim back with the thumb switches, multiple times, but it apparently did not occur to them to shut off the trim after doing so.
I'm very interested to see the NTSB report on this.
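To make the sequence described above easier to follow, here's a toy model of the interaction as this thread describes it. To be clear, this is my own sketch, not Boeing's logic; the names, units, and numbers are invented for illustration. The assumptions: the yoke thumb switches drive the electric trim motor and override MCAS, while the console cutoff switches disable the electric motor entirely, stopping MCAS but also removing motor-assisted trim.

```python
class ToyTrimSystem:
    """Illustrative model of the MCAS/cutoff interaction described above.

    Assumptions (from the discussion, not from Boeing documentation):
    - MCAS drives the stabilizer nose-down via the electric trim motor.
    - The yoke thumb switches also drive the electric motor and override MCAS.
    - The console cutoff switches disable the electric motor entirely,
      which stops MCAS but also removes motor-assisted trimming.
    """

    def __init__(self):
        self.trim_units = 0.0         # 0 = in trim, negative = nose-down mistrim
        self.electric_enabled = True  # state of the console cutoff switches

    def mcas_command(self, nose_down_units):
        if self.electric_enabled:
            self.trim_units -= nose_down_units

    def thumb_switch(self, nose_up_units):
        if self.electric_enabled:
            self.trim_units += nose_up_units   # counters/overrides the MCAS input

    def cutoff(self):
        self.electric_enabled = False          # stops MCAS and all electric trim


# The recovery sequence described in the comment above:
trim = ToyTrimSystem()
trim.mcas_command(2.5)   # uncommanded nose-down movement
trim.thumb_switch(2.5)   # first trim back to normal electrically...
trim.cutoff()            # ...then cut off further electric (and MCAS) trim
assert trim.trim_units == 0.0 and not trim.electric_enabled
```

The order matters in this toy model for the same reason it mattered in the thread: throw the cutoff while still badly mistrimmed and the only tool left is the manual wheel, which at high airspeed may not be usable.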
In the 737-NG you could turn off autotrim and still have electric trimming. This was changed in the 737-MAX, if you turned on electric trimming you'd also turn on MCAS. That change was poorly documented, and at sufficient airspeed you had to rely on electric trimming because the aerodynamic forces on the horizontal stabilizer would be too high to turn the trim wheel manually.
> This was changed in the 737-MAX, if you turned on electric trimming you'd also turn on MCAS.
The electric trim switches override MCAS. This is according to Aviation Week, Aug 19, and is consistent with Boeing's bulletins on the matter and with the pitch profile from the flight data recorder - both sets of pilots had overridden MCAS with the electric trim switches in multiple cycles before their crashes.
Four seconds later — and only 35 seconds after the nose down problem first occurred — the co-pilot suggested they initiate the emergency procedure recommended by Boeing, and disable the MCAS system by flipping switches in the cockpit.
“The pilots diagnosed and executed the procedure within 35 seconds — that’s lightning fast,” said Jason Goldberg, a spokesman for the pilots’ union of American Airlines, one of the biggest US operators of the 737 Max aircraft.
I'd be careful about using Financial Times as a source. I've seen so many articles about this not written by aerospace people and full of errors.
The flight data recorder showed that the pilots had successfully countered the MCAS input more than once with the electric trim switches. In the Lion Air crash, the pilots successfully countered it 25 times. At any one of those times, the pilots could then have turned it off with the cutoff switches. My source is Aviation Week, Aug 19, 2019.
A bulletin Boeing issued on Nov 6 says: "Electric trim input will stop the automatic nose-down stabilizer movement," and goes on to say "The only way to stop the cycle is to follow the runaway stabilizer checklist and toggle the console-mounted cutout switches."
This was apparently done by the previous Lion Air flight which encountered the same issue and landed safely.
The Ethiopian Air pilots also successfully used the electric trim switches to override MCAS. After two cycles of that, the pilots did think to throw the cutout switches, but with the nose still down. They should have trimmed the plane back to normal with the electric switches and then thrown the cutout switches.
All according to AW, which I am much more inclined to believe than other reports, until we see the NTSB report.
No, I don't think you do get it. It's not just about the shockingly clueless engineering of MCAS. It's also about Boeing persuading the FAA, and the FAA allowing itself to be persuaded, that additional pilot training for the Max wasn't necessary. You can't criticize other countries for inadequate pilot training without acknowledging that Boeing and the FAA contributed to that inadequacy, and therefore have blood on their hands.
Remember that the previous Lion Air flight, the one before the flight that crashed, also had uncommanded trim movement, and the pilots there simply shut it off with the cutoff switches and landed without difficulty.
The cutoff switches are there to deal with trim runaway, and there is training for that.
The flight data publicly available for the three flights shows rather different MCAS upset behavior in each case. (They look remarkably similar only compared to what you'd expect of a normal flight.) The oscillations in the JT 34 case were not nearly as aggressive as in the other flights, there's no evidence a significant mistrim happened, and in fact the JT 34 pilots didn't recognize it as mistrim; it was the jump-seat pilot who reportedly recognized something (we know nothing publicly about his thought process so far) and apparently recommended setting the stabilizer trim to cutoff.
Is there training for getting out of mistrim at low altitude without the benefit of electric trim? Is there training even for MCAS upset as distinguished from runaway trim? We already know there isn't a way to simulate angle of attack sensor failure induced MCAS upset, in MAX simulators. Exactly how was it demonstrated that MCAS upset looks like runaway trim? And how much faster it commands nose down compared to the typical runaway trim case?
>Is there training for getting out of mistrim at low altitude without the benefit of electric trim?
No.
>Is there training even for MCAS upset as distinguished from runaway trim?
No.
Remember, MCAS was dropped from the manual and not included in the training pilots would be exposed to prior to being handed a MAX. It was nowhere in that presentation. That was covered in the 60 Minutes exposé.
>Exactly how was it demonstrated that MCAS upset looks like runaway trim?
It wasn't. Take a look at the ET302 preliminary crash report. You'll see attached to it the documentation pages from Boeing even remotely related to MCAS. In fact...
> Note the condition description explicitly mentions a condition where uncommanded trim is running continuously.
The trim system repeatedly coming on and driving the nose down is runaway trim. The pilot already knows he's in trouble because the trim is running, and he's looking at the checklist on how to stop it. And there is the information on how to stop it. Getting hung up on the definition of "continuous" while the plane augers in is something a computer would do, not a human pilot.
Which is why airliners still have human pilots, not computers, in command.
I was going to comment earlier on something you posted, but I wasn't sure it was the time, Walter. But hey, here goes.
In reply to someone else you posted to the effect that
>if the trim goes uncommanded, that's runaway trim.
If there is anything I did take out of Langewiesche's article, it is that for some subset of the human species, the mental step you and I make naturally (stab runaway on continuous uncommanded trim reduces to stab runaway on any unaccountable trim) is not, in fact, natural to everyone.
I've come to realize I have a team member who is one of those people. I have to be very careful with instructions to them, almost like programming. If I handed them that piece of paper, then asked them whether a pulsing trim system demanded that procedure, I'm not willing to stake my life on them getting it. So, life being what it is, I have started to take the possibility of someone being of that disposition into account more frequently.
I've been somewhat disappointed at how frequently I run into it. I'm not saying there is anything wrong with that type of person, just that I can't assume the capability with enough confidence to be comfortable that the connection will be made, without demonstrable proof in the form of a simulator session or two.
It's just not a given I'm capable of assuming away anymore. I've seen (and even been the unwitting subject of) too many counterexamples, albeit in less than life-threatening conditions.
I certainly couldn't wrap my mind around why they wouldn't have made the mental connection while actually successfully retrimming the plane. After several months of devoting a hell of a lot of mental cycles to meta-cognition though, I've found I have my own corpus of "Oh, what the hell, how did I not make that connection til now?" which has been the result of many years of habit building.
I think I realized this earlier on, but couldn't convincingly articulate it. It came originally from the idea of psychological anchoring and its effect on subsequent responses in the presence of priming. It's a fairly well-researched phenomenon, and network theory also suggests it's a near certainty that this type of inability to grok can happen if mentation is an emergent result of the internetworked mass of neural nets that is the gray stuff between our ears.
It's still one of those fuzzy hunchy sentiments though, so not really something worth writing a paper about.
I understand that people are different and process things differently. There's nothing wrong with them. But some people aren't suited to be pilots, and flight school is supposed to wash them out. I also don't believe that misinterpreting "continuous" was a factor in the crashes.
One failure scenario behind runaway trim is mechanical damage that creates an intermittent short circuit, producing uncommanded trim. "Continuous" or intermittent, you're going to want to cut off the stab trim, as you would any dangerous piece of machinery that is randomly turning itself on and off.
I have a difficult time conceiving of concluding that "the trim is coming on randomly and pointing me at the ground, but since it's not continuous I'll just let it keep doing that rather than turning it off." I'm not a pilot, but I bet my response would be more like "Holeee phuck, wtf is wrong with the trim, it's going to kill us all! Turn it off! Turn it off!"
As I mentioned, I spent 3 years working on the stab trim system for the 757. Although I didn't come up with the idea, the engineers who did said that's exactly why the cutoff switches are clearly labeled and within easy reach. They're for "the stab trim is possessed by demons trying to crash the plane, shut it off NOW!"
Interestingly, the emergency checklist says to turn off the autopilot first, and if that doesn't stop the trim, then shut off the trim. My tendency would be to stop the trim, then turn off the autopilot, and fly manually the rest of the trip.
For example, there was a crash some years ago where the pilots got an air pressure warning. They dug out the checklist, and started following the procedure. They passed out from hypoxia before getting very far through it.
The checklist was then changed so that the #1 item was "put on your oxygen mask". These things are all so obvious in hindsight, but sadly too much gets learned the hard way.
The cutoff switches stop the runaway, but MCAS could max out trim to the extent that it was literally physically impossible to correct manually without steeply descending to take aerodynamic load off the control surfaces. Doing this has obvious complications when it's only a few minutes after takeoff and the pilots are already struggling to maintain any altitude at all.
FAA testing disproves your contention that "good pilots wouldn't have crashed that plane." That was part of the reason the FAA eventually mandated a rearchitecture of the flight computer, which represented a single physical point of failure.
One of the three seasoned test pilots (the civilian one) ended up losing the plane due to a single-event upset cascading into a false-positive activation of MCAS, consistent with the Ethiopian Airlines disaster, i.e. the system activating at a non-extreme AoA.
Also, not pulling power was justified: the malfunctioning AoA sensor resulted in an airspeed-unreliable state, the response to which is essentially "set the throttles to what they should be for that stage of the flight", which was climb-out in Ethiopia's case, and from a hot-and-high airport to boot. They also ran through the procedures written by Boeing to a T. The fact that Boeing intentionally withheld important implementation details from the pilots can't really be used against them; the measure of incompetence is not knowing something you absolutely should. The Ethiopian pilots had none of that info, and Lion Air swapped in a garbage part, had bad paperwork in the logs, and furthermore was known by Boeing to be at risk of "flying by rote". This damns Boeing even more, since they knew of that weak spot in operating style yet still left out the information those pilots would have needed to safely fly the plane, ostensibly so as not to draw scrutiny from regulators.
In short, needing a master airman to fly a demonstrably dangerous (and possibly unairworthy, given the explicit prescriptive criteria in the FARs) aircraft doesn't say much for the overall safety of the aircraft, which puts fault squarely in the corner of the manufacturer that self-certified the safety of its own design.
Anecdote: after TMI, the NRC mandated that all nuclear plants install a system to automatically initiate auxiliary feedwater to the steam generators if certain criteria were met. The Combustion Engineering version of this was called the Aux Feed Actuation System, or AFAS. During the next refueling outage it was being installed and I was giving lectures on its design and operation (I had moved from operations to training by then), and the very first question from the very first operator training session was:
"How do you turn it off?" My answer was "You can't." They were not happy.
Much is said about the FAA's relationship with Boeing, but the failure of other worldwide authorities to question the FAA's line after the first crash was also a tragedy, because if they had been a little less deferential and more independent, perhaps the Ethiopian crash wouldn't have happened. Pilots' unions wrote letters of concern about MCAS after the first crash; it wasn't taken seriously by any regulator, and the result was deadly.
No and I'm actually shocked it took two crashes for all of this to come to light. You'd think that over 100 dead passengers would have sent shock waves through the industry. I would have guessed every 737 MAX pilot in the world would have been scouring resources on learning about the crash and how it might impact them in the future.
The corporate spokesperson should be automated. Just have a robot do this job. It's boilerplate language. It requires no adaptation or creative thinking. And it'd be no more or less credible if delivered by a robot.
Possibly, but I don't think there was much of a risk. Boeing already had--presumably--at least a pretty decent idea about the review's contents by piecing together hints from document requests and questions or maybe even someone calling in a favor to get a draft copy. The Joint Authorities Technical Review (JATR) involved a lot of agencies and personnel from multiple countries.
On the other hand, that also poses a problem for any attempt to change the report. There are a lot of iterative copies out there, with both Reuters[0] and the Times obtaining copies independently. Pressuring the JATR team to change its findings would leave an absolute ton of fingerprints, and be instantly obvious once reporters start reviewing a diff of the report.
On the other hand, if the final report released today is much more benign than what is reported here, wouldn't that make the suspicion of conflicts of interest a lot worse?
Nice to see the rest of the world isn't content to let Boeing/the FAA off the hook. I'm also seeing every indication I was confident would be found back in March...
Poignant points follow.
>The report found that while the F.A.A. had been made aware of MCAS, “the information and discussions about MCAS were so fragmented and were delivered to disconnected groups” that it “was difficult to recognize the impacts and implications of this system.”
>The task force said it believed that if F.A.A. technical staff had been fully aware of the details of MCAS, the agency would probably have required additional scrutiny of the system that might have identified its flaws.
This troubles the crap out of me. I was under the impression aviation used a radically different development process specifically designed to avoid this type of thing. However, this rings of a bad attempt at flirting with some sort of Lean or Agile process methodology. I certainly hope that's not the case, but I've seen these same exact symptoms when pressure is put on people not to document or analyze things beyond the bare minimum required by law.
>A broad theme of the report is that the F.A.A. was too focused on the specifics of the new system and did not put sufficient effort into understanding its overall impact on the plane. In certification documents that Boeing submitted to the F.A.A., MCAS was not evaluated as “a complete and integrated function” on the new plane.
>The report also said Boeing had failed to inform the F.A.A. as the design of MCAS changed during the plane’s development. A New York Times investigation revealed that the system changed dramatically during that process, making MCAS riskier and more powerful, and that key F.A.A. officials were unaware of major changes to the system.
>The task force said the certification documents that Boeing provided to the F.A.A. “were not updated during the certification program to reflect the changes” made to MCAS. It added that two critical documents that describe the potential dangers of a system like MCAS, the system safety assessment and the functional hazard assessment, “were not consistently updated.”
These three points are consistent with a regulation scheme where the primary regulator is no longer the primary driver of the regulation process and is dependent on the regulated to raise awareness of "there may be issues here." Active participation of an adversarial regulator is absolutely essential.
>Boeing also failed to thoroughly stress-test the design of MCAS, according to the report, which found that “the design assumptions were not adequately reviewed, updated or validated.”
This is a straight-up unwillingness to test, a "Steve says this'll never happen" cultural attitude. There is always a push against testers to minimize the cost of their testing as much as possible; this reinforces for me how strong that pressure really was. The single-event upset testing they did back around June, while admittedly covering a highly unlikely occurrence, is a textbook, bog-standard test case in any aerospace/safety-critical design. Not to have it accounted for is tantamount to heresy from this humble tester's point of view. Lord only knows what other tests were shot down because everyone else in the management chain was more interested in getting the plane out the door.
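For readers outside aerospace, the kind of test being talked about is cheap to express. Below is a hedged sketch of a bit-flip fault-injection test; the monitor under test, the encoding, and the pass criterion are invented stand-ins for illustration, not Boeing's actual test articles or requirements.

```python
def flipped(word, bit):
    """Return `word` with one bit flipped (a simulated single-event upset)."""
    return word ^ (1 << bit)

def check_aoa_plausibility(raw_word):
    """Hypothetical monitor under test: reject implausible AoA readings.

    Stand-in logic: decode a 16-bit sensor word to degrees and flag
    anything outside a sane range instead of acting on it.
    """
    angle_deg = (raw_word & 0xFFFF) * (180.0 / 0xFFFF) - 90.0
    return "reject" if not (-30.0 <= angle_deg <= 30.0) else "accept"

def test_single_event_upset():
    healthy = 0x8200  # a plausible mid-range reading, about 1.4 degrees
    assert check_aoa_plausibility(healthy) == "accept"
    # Worst case: an upset in a high-order bit produces a wildly wrong reading.
    # The monitor must refuse to act on it rather than command anything.
    for bit in (14, 15):
        assert check_aoa_plausibility(flipped(healthy, bit)) == "reject"

test_single_event_upset()
```

The test itself is trivial; the cultural point is that someone has to be allowed to write it, run it, and act on a failure.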
>In addition, the report criticized Boeing for not adequately assessing the extra effort pilots might have to make to deal with MCAS, and it noted that Boeing had removed mention of MCAS from a draft of the pilot’s manual. As a result of that decision, some key F.A.A. officials were not fully aware of MCAS and were “not in a position to adequately assess training needs,” the report found.
This and the previous statements with respect to tests left undone and documentation/awareness-raising failures constitute, for me, sufficient evidence of a potentially deliberate regulatory hack. I'm glad we have attorneys general conducting a criminal probe in this case. The convenient way the failures just happened to fall into line, with everything going off without any of the people who could have stopped this error chain from propagating actually doing so, seems to me a bit far-fetched as coincidence. Accidents happen, yes. But this represented too many disparate failure modes in the process not to have been the result of a deliberate optimization decision made by someone; and that decision maker needs to be held to account.
>To address some of these shortcomings, the report recommends that the F.A.A. update the certification process to allow the agency to be more involved in the design process early on.
>Overall, the report found fault with the process for certifying a new plane based on an old design, saying that it “lacks an adequate assessment of how proposed design changes integrate with existing systems.”
These two points revolve around the grandfathered certification process in general, which has been a big disincentive to designing new planes. I'm fairly sure there needs to be a greater threshold of proof attached to a grandfathered design after this point. I understand the intent to streamline by reusing already-proven design elements, but it fairly obviously invites design complacency from the ground up, as it were, and emboldens the financial side of the house to pressure engineering into skipping large portions of the work that needs to be done.
>It recommended that the F.A.A. confirm that the Max is in fact compliant with regulations having to do with the plane’s flight guidance system, flight manual and stall demonstration.
This last point may doom the MAX depending on how accommodating regulators are willing to be.
Without MCAS, the airframe fails FAR 25.173 from the pilot's point of view. Period. That is beyond contention at this point. There will need to be a judgement call, with regard to the delta in control forces, as to whether a regular pilot can be educated through training alone to safely compensate for those divergences from the prescribed longitudinal stability behavior. In the end, it is left to the certifying authorities.
If it were me in that position, with the authority to make a binding proclamation on it; my answer would be no. Absolutely not.
I've dug into the history of aircraft certification enough to agree with some of the older-school regulators from the days before electronics were a given in flight control systems. We must make flyable planes first. If we let flyability be compromised by economic factors and compensated for by gadget wizardry, then we're walking down the path where pilots are not "flying" the plane. This is unacceptable in my eyes. If we are going to put so much trust in pilots that we lock the occupation behind years of education and draconian requirements for physiological conditioning and educational refresh, then the pilot should be as much a design consideration in the flying of the plane as possible. Which means, at a minimum, the pilot needs to be able to fly the machine with minimal automation functioning, by virtue of passive flight characteristics. I'm not against fly-by-wire as a means of control-signal translation to provide a common interface for the pilot, mind, but the computer should not have to be resorted to in order to mimic passive longitudinal stability. That is a sin that cannot be pardoned in aerospace design for civil transport, and yet we've had more than enough history of attempts to do so to make it clear it's a bad idea.
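To make the FAR 25.173 point above a bit more concrete: static longitudinal stability is typically demonstrated as a stick-force-versus-speed property, where holding a speed below trim requires a steadily increasing pull and a speed above trim requires a push. The sketch below is a toy check of that property only; the sample data and the gradient threshold are invented for illustration and are not the regulation's actual numbers.

```python
def stable_stick_force_curve(points, min_gradient_lb_per_kt=0.1):
    """Toy check of static longitudinal stability from flight-test points.

    `points` is a list of (speed_knots, stick_force_lb) samples around a trim
    speed, with pull forces positive and push forces negative. The curve is
    "stable" if the force opposes the speed change (pull below trim, push
    above) with at least `min_gradient_lb_per_kt` of slope magnitude.
    NOTE: the threshold here is an illustrative placeholder, not FAR text.
    """
    pts = sorted(points)
    for (v1, f1), (v2, f2) in zip(pts, pts[1:]):
        gradient = (f2 - f1) / (v2 - v1)
        # Force must fall as speed rises (pull fades, then push grows),
        # i.e. a negative slope of at least the required magnitude.
        if gradient > -min_gradient_lb_per_kt:
            return False
    return True

# Invented sample data: trimmed at 250 kt, pull when slower, push when faster.
samples = [(230, 8.0), (240, 4.0), (250, 0.0), (260, -4.0), (270, -8.0)]
assert stable_stick_force_curve(samples)
```

The certification argument over the MAX is essentially whether the bare airframe meets that kind of requirement on its own, or only with MCAS propping it up.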
There is a precedent for "flight envelope protection" for fly-by-wire airliners, which has been applied to all modern larger airliners except for the 737. (directly competitive A320, as well as A350, A380, 777, 787, 747-8, ...)
Small problems with the fly-by-wire system on the A320 get found and fixed on a routine basis. Bigger problems have been found in the past, but how to manage the risks in this kind of system is well known and that knowledge base was entirely ignored in the development of the 737 because Boeing couldn't be bothered to bring the 737 into the 1980s, never mind the 21st century.
Granted, but flight envelope protection has its warts too, and I don't wish to turn this into a Boeing vs. Airbus design-philosophy flame war (I probably chose my words poorly and failed to convey my true meaning).
Where your examples pass my test of acceptable fly-by-wire is that even in their most automation-crippled state (direct law), the airframe demonstrates longitudinal stability, and the automation doesn't induce undesired behavior detrimental to flying the plane. The system can thereby be thought of as a hardware abstraction layer between the inputs and the control surfaces, one that faithfully translates pilot inputs into output behavior.
Boeing's MCAS fails this sniff test. If the input sensor breaks, it actually generates dangerous input that actively detracts from the flyability of the plane. The airframe itself necessitates this dangerous subsystem because uncorrected, no test pilot would be willing to sign off that it was consistent with the older 737's handling characteristics.
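To put that "hardware abstraction layer" distinction in toy form (my own illustration, not any actual control law): a pure direct-law path simply follows the pilot's input, whereas an augmentation layer that trusts a single sensor can keep injecting nose-down commands when that sensor fails high, regardless of what the pilot is asking for. The trigger angle and authority numbers below are invented.

```python
def direct_law(stick_input):
    """Pass-through: surface command faithfully follows pilot input (-1..+1)."""
    return max(-1.0, min(1.0, stick_input))

def augmented_law(stick_input, aoa_deg, aoa_trigger_deg=15.0, authority=0.6):
    """Toy augmentation keyed to a single AoA sensor (invented numbers).

    If the (single) sensor reads above the trigger, nose-down is blended in,
    whether or not the reading is genuine.
    """
    cmd = direct_law(stick_input)
    if aoa_deg > aoa_trigger_deg:
        cmd -= authority          # uncommanded nose-down contribution
    return max(-1.0, min(1.0, cmd))

# Pilot holds a neutral stick. With a healthy sensor, both laws agree.
assert augmented_law(0.0, aoa_deg=5.0) == direct_law(0.0) == 0.0
# With the single sensor failed high, the augmentation pushes the nose down
# even though the pilot commanded nothing.
assert augmented_law(0.0, aoa_deg=40.0) == -0.6
```

The pass-through path degrades gracefully; the sensor-driven path fails actively, which is the sniff test I'm describing.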
Just wanted to be clear: I'm not anti-Airbus. I'm anti-"Hey, let's automate this, but do it with minimal scrutiny and recklessly little awareness-building for the end operator."
Which is what happened with the MAX, and it seems like a page torn out of the McDonnell Douglas playbook.
It is not Boeing vs. Airbus; it is the 737 vs. all other modern airliners.
FBW on the 7x7 (for x > 3) is not that different from Airbus airliners, except that Airbus has sidestick controls that are not mechanically connected, whereas Boeings have yokes that are mechanically connected between the two pilots.
If Boeing had used the same flight controls that they use on all their post-1970 planes it would be OK; Airbus has nothing to do with it.
FBW refers simply to the lack of a hydraulic/mechanical connection between the flight controls and the control surfaces. The 747 has such a connection; thus, the 747 is not an FBW aircraft (although I think some control surfaces on the 747-8 may be FBW).
Good, thorough observations. A couple things I'd chime in on:
> I was under the understanding aviation used a radically different development process specifically designed to avoid this type of thing.
I think this is confusing the development process with the oversight process. It's possible for the development team to follow more traditional approaches and still leave the FAA in the dark.
> Active participation of an adversarial regulator is absolutely essential.
Completely agree, but unfortunately regulators often perceive themselves (or are pressured) to be facilitators as much as overseers. When the incentives of the regulators and the manufacturers align too much ("lets get this out the door for cost/schedule/lobbyist reasons") the cultural adversarial aspect is watered down. Layer on top of that a bootstrapped regulator who has to cover down on multiple systems and they may no longer be technically capable of fulfilling a staunch adversarial role and are relegated to ensuring the process "checks the boxes"
> This is a straight-up unwillingness to test
I feel like when cost/schedule pressure hits, thorough testing is the first thing to go. Good quality oversight helps prevent bad things from happening. When it's been a long time since bad things last happened, people begin to think that quality oversight is no longer necessary, without making the connection that it may be the direct cause of that "goodness". This feels very much like a human psychology problem rather than a technical issue, similar to the cultural issues that resulted in the Space Shuttle Challenger/Columbia disasters. We need to realize these (sometimes painful) oversight mechanisms are meant to save us from ourselves, and not lose sight of that when recent success gives us a dose of hubris.
> evidence of a potentially deliberate regulatory hack...that decision maker needs to be held to account.
My hunch is that part of the problem will be that this won't be hung on one "decider". The responsibility for these decisions often gets diffused across multiple people, and it never becomes crystal clear who ultimately bears responsibility. (This may play into the cultural aspect as well; I wonder whether, if it were clear that these decisions land on one single person, the weight of that responsibility would give them more pause.) A good leader understands the buck stops with them... I'll wait and see if that actually plays out.
"In a statement, the Boeing spokesman Gordon Johndroe said “safety is a core value for everyone at Boeing,”"
If that were true, such a critical feature would have had redundant sensors. There is no reason not to have them except cost. Even more so if one considers that other sensors ARE redundant; they even feed either the pilot's or the co-pilot's displays, never both.
The cost in this case was $1,000,000 per airframe sold if Class D simulator training was required, which resulted in the decision to go with the single-sensor design.
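For the software-minded, here's a hedged sketch of what redundancy buys you at the system level: with two (or three) sensors you can at least detect disagreement and disarm the dependent function instead of acting on one bad value. This is entirely illustrative; the thresholds and behavior are my assumptions, not the MAX's actual fix.

```python
def select_aoa(readings_deg, max_disagreement_deg=5.5):
    """Cross-check redundant AoA sensors before letting any function use them.

    Returns (value, status). With two sensors we can only detect disagreement
    and fail safe; with three we can also vote out the bad one.
    Thresholds are illustrative assumptions.
    """
    if len(readings_deg) < 2:
        # Single sensor: no way to tell a failure from a real reading.
        return readings_deg[0], "unmonitored"
    spread = max(readings_deg) - min(readings_deg)
    if spread > max_disagreement_deg:
        if len(readings_deg) >= 3:
            # Majority vote: take the median, which discards one wild value.
            value = sorted(readings_deg)[len(readings_deg) // 2]
            return value, "voted"
        return None, "disagree: disarm dependent functions"
    return sum(readings_deg) / len(readings_deg), "ok"


assert select_aoa([4.5, 5.5]) == (5.0, "ok")                 # healthy pair
assert select_aoa([4.8, 74.5])[1].startswith("disagree")     # one vane failed high
assert select_aoa([4.8, 74.5, 5.1]) == (5.1, "voted")        # triple: vote it out
```

None of this is exotic; it's the baseline engineering people expected from a "safety is a core value" company, which is why the single-sensor decision stings.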
> The review also said there were signs that Boeing employees who worked on behalf of the F.A.A. to certify the Max had at times faced conflicts of interest.
LOL. 100% of the time they have a conflict of interest.
But from the article it sounds like they plan to continue with the employees regulating their own company.
The engineers now report those concerns to the company Chief Engineer, who reports to Muilenburg, the CEO. There is also a committee tasked with oversight and investigation of anonymously submitted concerns.
>The board called for revamping oversight of a controversial program that delegates Federal Aviation Administration (FAA) authority to company engineers and technical experts, allowing Boeing to help certify its own airplanes as safe and airworthy.
>That recommendation calls for Boeing experts, known as “authorized representatives” under the FAA’s Organization Designation Authorization (ODA) program, to report to a new aviation safety organization within the company, called Product and Services Safety, rather than to business and program managers.
So this would be the FAA "blessed" folks reporting to an internal board separate from sales/business/project management. So technically, yes. The FAA is not directly in the loop.
>Boeing said the new internal organization would be headed by a vice president who reports directly to company leadership. It also would be tasked with overseeing Boeing’s Accident Investigations Team, safety review boards and investigations of “cases of undue pressure and anonymous product and service safety concerns raised by employees,” according to the announcement from CEO Dennis Muilenburg and board members detailing the recommendations.
>The new structure “should increase awareness and reporting of, and accountability for, safety issues within the company,” Boeing said.
cough That's an unfortunate acronym... Boeing Accident Investigations Team, henceforth known as B.A.I.T.
I normally favour the latter all things being equal.
However, in this case I sense that the battle with Airbus (and the key role of the 737 Max in it) was considered strategic, and failure possibly an existential threat for Boeing and hence the US aviation industry.
Big complex systems fail, sometimes there is a good target to point fingers, sometimes there isn’t.
Fear of punishment is a really bad way to reduce future failures, and being satisfied having found somebody to punish makes you stop looking for systematic issues which caused and will continue to cause failures.
The FAA/Boeing oversight process failed, or more correctly happened much too late (this report, a post mortem, being part of that).
It is though. It isn't enough to have strict regulations. You need regulators that are willing to enforce them and companies that follow them. We have neither, thus it's time to punish the people who are responsible not only for what led to the accidents, but also for the state of these agencies and companies.
You do, though; air travel is very regulated and incredibly safe, and it definitely is not that way by default.
Focus on punishing people and, one, you teach people to be better at avoiding punishment, and two, you keep punishing new people for doing the same thing over and over.
Air travel is very safe and regulated because of behavior that persisted until recently. It's clear that systemic deviation has happened since the 90s that is leading to a system that can't be trusted in the same way that it could be historically. It's just that these are changes in a complex system that take years to surface as real problems that affect people. As long as financial incentives are so high that they override regulatory concerns as they do now, punishment is certainly in order.
It's not the kind of thing where you make a change and planes fall out of the sky. No, you make small, gradual changes in processes and in safety margins until at some point in the distant future, planes fail catastrophically as a direct result of decisions made as an organization 1-3 decades ago. It's the same set of behaviors that led to the Challenger disaster.
> Air travel is very safe and regulated because of behavior that persisted until recently
Air travel is very safe because the entire world of aviation has focused on analysis and improvement rather than infighting and blaming. If anything, I worry more about how the aviation community worldwide seems to be fragmenting right now, and the effect that may have on safety going forward.
But then I do read stories about the FAA and EASA still cooperating very tightly and respectfully, realize most of the infighting is by armchair quarterbacks on the Internet and the pros are still doing the Right Thing, and I sleep a little better.
> Air travel is very safe because the entire world of aviation has focused on analysis and improvement rather than infighting and blaming
Except you have one institution, the FAA, and one company, Boeing, who clearly no longer have a focus on maintaining regulatory processes to keep safety the main priority. While Boeing is cutting corners everywhere in order to save a buck, the FAA is looking the other way and letting Boeing do what they want. This is something that has already had serious consequences, and will continue to unless there are serious changes on a systemic level and there are punishments for those who are responsible for allowing these systemic changes to have occurred.
Also, I don't appreciate your underhanded insult. This isn't the place for it.
> Also, I don't appreciate your underhanded insult. This isn't the place for it.
Sorry you took that as an insult, it was not intended as such. It's a generic observation that on places like HN we've collectively lost our shit and started acting like we are aeronautical engineers. It's actually one of the big factors that has recently caused me to lose a lot of confidence in the HN crowd. Thought we were above it. Even in middle age it seems I'm still capable of wide-eyed naivete.
>Sorry you took that as an insult, it was not intended as such. It's a generic observation that on places like HN we've collectively lost our shit and started acting like we are aeronautical engineers. It's actually one of the big factors that has recently caused me to lose a lot of confidence in the HN crowd. Thought we were above it. Even in middle age it seems I'm still capable of wide-eyed naivete.
Things are getting a little heated, so I just figured I'd chime in, because I've unintentionally become a bit of an expert on this.
It is correct that blameless postmortems have been a major contributor to the culture of aviation safety. I don't necessarily question that culture so much as the prevailing executive culture at Boeing since the McDonnell Douglas merger.
In that case, the aviation safety culture is being undermined by executive pressure, which has been a more recent thing, as far as I can tell from my deep dive into aviation history. Only after the merger did Boeing show signs of trying to "financially engineer" the company into a vessel of shareholder value growth, over and above its primary mission of making safe, high-quality aircraft.
In regards to
>It's a generic observation that on places like HN we've collectively lost our shit and started acting like we are aeronautical engineers.
Those who grapple with the questions and issues of aeronautical engineering, who run down facts, find evidence, and employ sound reasoning to make predictions are, de facto, aeronautical engineers, in the sense of people doing the tasks normally associated with the responsibility of a practicing aeronautical engineer.
They are not a protected class of people, and anyone willing to pick up the tools and use them correctly basically is one. The protected class is the Professional Engineer (in the U.S.; I'm not sure about anywhere else), who has been granted authority by the state to sign off on engineering projects.
The elitism communicated by calling someone out for not being a practitioner in their day job is not desirable or conducive to fruitful discussion, and is regularly frowned upon.
One doesn't pop out of the womb an engineer after all, so drawing such lines only serves to discourage inquisitive minds.
IF a company shortchanges safety to beat someone to market...
I think a good look at corporate personhood is in order. If a company causes deaths due to their rush to get a product to market, charge the company, and if guilty, the board at the time of the crime must serve the sentence.
Do that, and you'd see a lot less shitty behavior from corporations.
It is very easy to do a post hoc accusation, “look it’s obvious, they experienced a failure in their new product therefore it was rushed to market” and then make everybody feel good about putting the bad guys away.
But that doesn’t get you safety or improvements. You can’t test anything infinitely, you can’t predict the future, all you can do is have standards and processes and a willingness to learn from mistakes.
Putting the “bad guys” away doesn’t help anybody decide when ready is ready. Turning collections of mistakes into crimes doesn’t prevent future mistakes.
Fear doesn’t prevent mistakes it encourages them.
Looking at everything that goes wrong as a crime makes more go wrong not less.
Yes, it does, because it makes people accountable for their actions and makes them fear punishment, so they'll avoid criminal actions in the first place.
>You can’t test anything infinitely, you can’t predict the future, all you can do is have standards and processes and a willingness to learn from mistakes.
Sounds great, so why didn't Boeing and the FAA do that? That's the thing that makes them criminally liable.
"Bad guys" are supposed to get put away when they intentionally do things that get people hurt or killed. This is called "criminal negligence", which is what Boeing is guilty of. People don't go to jail for mere accidents; they have to be proven beyond a reasonable doubt to have acted negligent to a criminal degree (i.e., usually intentionally). Boeing intentionally made design decisions that were contrary to safety, and then intentionally tried to hide what had happened and keep the planes in the air even though they were deadly.
>Turning collections of mistakes into crimes doesn’t prevent future mistakes.
Yes, it does, when those "mistakes" were criminal negligence. That's why we have laws that make these mistakes criminal.
Fear of punishment means a culture of preventing punishment by never admitting failure, never “ratting” any colleagues out for failure, and treating investigators like enemies.
Look at police and healthcare to see how badly your strategy works.
You are passing a lot of judgement with almost no information besides a few newspaper articles cashing in on the public interest by speculating.
I am either not selling the idea of an open culture towards failure very well or such conversations really attract people who like blame, vengeance, and pitchforks.
I'm not proposing punishing people for making mistakes, I'm claiming that punishing people for criminal negligence is absolutely useful, and the court system thankfully agrees. Is this really a hard concept for you?
>You are passing a lot of judgement with almost no information besides a few newspaper articles cashing in on the public interest by speculating.
Bullshit. There has been more than sufficient information unearthed in the last half-year or so about the criminal negligence that Boeing committed, and how the FAA simply looked the other way. This is not arguable at this point.
People who care about blaming and punishing are focusing on vengeance instead of prevention.
You are just so incredibly certain. Why have investigations at all? magduf already knows. And after the blaming and punishment is decided, people with that attitude don't care about the issue any more. Problem solved, until the next thing happens and we need to hang somebody new.
Your response is obnoxious and patronizing, and you can shove it. Yes, I do know. It's been in the news for the better part of a year now; the causes of these plane crashes have been thoroughly investigated and completely publicized. We know full well exactly why these crashes happened now, and the decisions that led up to them. So yes, punishment is completely warranted when it's been proven that these hundreds of lives were lost because of willful criminal negligence.
Do you have any sources for that? I'd agree with the GP: one reason for the astounding safety in aviation is the conscious decision to leave blame and criminalisation out of the picture, and focus instead on the causes and prevention of accidents.
What you wrote goes against everything I know about aviation safety.
As an example, every NTSB report comes with this explicit disclaimer:
> The NTSB does not assign fault or blame for an accident or incident; rather, as specified by NTSB regulation, “accident/incident investigations are fact-finding proceedings with no formal issues and no adverse parties ... and are not conducted for the purpose of determining the rights or liabilities of any person.” 49 C.F.R. § 831.4. Assignment of fault or legal liability is not relevant to the NTSB’s statutory mission to improve transportation safety by investigating accidents and incidents and issuing safety recommendations. In addition, statutory language prohibits the admission into evidence or use of any part of an NTSB report related to an accident in a civil action for damages resulting from a matter mentioned in the report. 49 U.S.C. § 1154(b).
Keep in mind the NTSB is a fact finding organization. They don't care, and in fact the protections you mention are there to ensure everyone has every reason to be honest.
If there is any evidence of criminal negligence, it won't be in the facts of what happened. It'll be in the signals and narratives spelled out by the information revealed by any criminal probes of Boeing.
The aviation safety culture can have its cake in this regard, but in order to eat it as a society, we still need an accountability mechanism for those in a position to destructively influence industry structures through malignant optimization techniques, which, unfortunately, is squarely within Boeing's wheelhouse.
That's two separate mechanisms being reasoned about.
One, the NTSB answers "What the #$@! happened?"
The criminal probe will need to be able to uncover a beyond a reasonable doubt answer of "Why the #$@! did this happen, and who was ultimately involved?"
I'm not necessarily holding my breath anyone will ultimately be found accountable, but that the investigation is happening at all may be enough to discourage the behavior for the most part.
>You think "aviation safety" means that outright criminal negligence should go unpunished? That goes against all common sense.
Ultimately, mostly. It requires you to think bigger, to expect mistakes and wrongdoing, and to focus on ways to prevent the ultimate effect ("plane crash") rather than an intermediate cause (mistakes, negligence, fraud, laziness, stupidity, random acts of god, etc.).
You have to think of humans as part of the system, and part of its failures, too. Humans fail just like steel beams fail. You need to engineer for it, and when your system fails, not blame your tools and materials but look at how the system is designed instead (and not by fixing the designers by putting them in jail, but by fixing the design).
You do this by striving to understand failures when they happen (not by pointing fingers and insisting on prison). As you understand individual causes of failure, you look back a step further and a step further, until you are satisfied that you've fixed the immediate problem and everything that let it happen.
You don’t stop when you find out that Bob is lazy and a liar and didn’t do his job. It’s naive and dangerous to think that there won’t be another Bob next time or that whatever you do to punish Bob will eliminate all Bobs from the future forever.
You want to fix the design principles that weren't there and led to the faulty design, then the tests that didn't catch it, then the testing requirements that didn't call for appropriate tests, then the auditing that didn't catch the lapse, then the acceptance tests, the pilot reporting pipeline and prioritization, incident response lapses, missing regulation, missing enforcement audits, wrong training, missing law, missing insurance requirements, missing risk notification, change notification, process overcomplexity analysis, and on and on and on.
Flying people around is hugely complex and involves many, many actors, too many of them to say the buck stops anywhere. Every responsibility for correct action is shared across several organizations and thousands of people. Finding which ones to hang is counterproductive to safety if you're actually trying to improve, because it's a distraction and there's never one single point of failure; there is a whole series of things that have to go wrong, and if it really does come down to only a few people being responsible, then it is everyone's fault for being foolish enough to let it get that way.
Punishment can be there as an afterthought for particularly egregious acts, but if it is the main focus, you are accepting and encouraging failure and ensuring that you will regularly be hanging Bobs in the village square.
All that stuff you talk about here, design principles, testing, etc., doesn't actually work when the people at the very top, both at the manufacturer and at the FAA, are actively working to undermine it. That's why people have to be punished. We're not talking about a case where some rank-and-file engineers were a little lazy, or where some worker on the assembly line screwed up; we're talking about a case where the top executives directed the entire company to design an unsafe plane, and then collaborated with the regulatory agency to certify it as safe even though it wasn't. You can't fix that with processes, the way you can deal with errors by lower-level people. You have to hold people accountable and punish them, because these actions were willful criminal negligence.
Totally get the point about blame-free improvement, and that pointing fingers makes people hide stuff, but isn't that the problem here: people hid stuff?
So the question is how do you stop people hiding stuff?
What if it's not a process thing, but a behaviour issue?
You can have rules and systems - but if people willfully don't follow them - hard to see how you deal with that without sanctions.
i.e., what happens if it wasn't an honest mistake, but rather a dishonest choice?
Or both. As is often the case with industry leaders and regulators, the tail wags the dog: usually the regulator is totally incompetent and defers to the very people they are supposed to be regulating. It's not a conspiracy per se, it's how the system works, but people don't generally know that or like to hear it, so they say that kind of talk is a conspiracy theory.
Just as an example, if you point to Goldman Sachs top executives bouncing back and forth between Federal Government positions and Goldman Sachs... the record speaks for itself, but people will say that's how it's supposed to work, and that suggesting there is a problem with it is a conspiracy theory.
My consumer confidence in civil aviation is shaken. I think there should be a way to opt out of flying on aircraft that I deem unsafe. Additionally, airline and travel websites should be required, by law, to display the safety record of the airplane that will fly me. Finally, if the airlines pull a bait and switch at the airport, the airline should be required to put me on a different flight that meets my aircraft criteria.
>NYT doesn’t paywall until you’ve read multiple articles in a month. Since you already like it perhaps you should subscribe.
Just to play a bit of devil's advocate, I don't like NYT, but as a willful participant of HN I must read the articles linked for discussion, which inevitably eats up NYT hits.
The problem is that I like HN, and I feel a duty to read the article that is up for discussion if I am to participate in the discussion itself.
tl;dr: paywalled news hits coming from a news aggregator and discussion site are not an especially useful metric for determining how much of your user base 'likes' your service.