
Corollary: Climate science models should be built by engineers, not by scientists. Task people who 'use what is known and understood to construct systems that can be validated'. Perhaps the whole scientist-as-faux-priest meme should be toned down a notch.


This notion misrepresents what climate science models are and how they are developed. If there were some way to "validate" them that everyone could generally agree upon, it would be possible to move this work into the domain of engineering. But climate models vary quite a lot in their structure, their dynamical equations, the simplifications they make for numerical tractability, and which datasets they use and how those are mapped to model parameters. The role of climate scientists is to hash out which models are more sensible than others and to push the boundaries of accuracy and certainty with new methodologies.

This is why the best we can do from a policy perspective at the moment is model aggregation: that is, taking dozens of models run by various research teams and combining them into policy recommendations along with uncertainty brackets. Distilling the state of current research into a policy digest is the role of the Intergovernmental Panel on Climate Change (IPCC), which is currently in its sixth iteration of this process. You can keep up to date and read previous reports here: https://www.ipcc.ch/
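As a rough illustration of what "model aggregation with uncertainty brackets" means (a toy sketch with made-up numbers, not the IPCC's actual methodology), you can think of it as pooling projections from several independent models and reporting both the ensemble mean and the spread:

  import numpy as np

  # Hypothetical end-of-century warming projections (degrees C) from
  # several independent climate models under the same emissions scenario.
  model_projections = {
      "model_a": 2.1,
      "model_b": 2.9,
      "model_c": 3.4,
      "model_d": 2.6,
      "model_e": 3.1,
  }

  values = np.array(list(model_projections.values()))

  # Simple multi-model ensemble: report the mean and a spread-based
  # uncertainty bracket (here, the 5th-95th percentile of the ensemble).
  ensemble_mean = values.mean()
  low, high = np.percentile(values, [5, 95])

  print(f"Ensemble mean warming: {ensemble_mean:.1f} C")
  print(f"Uncertainty bracket:   {low:.1f} - {high:.1f} C")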


If they are anywhere near as sensitive to 'Bayesian priors', aka parameters made up out of whole cloth, as the COVID-19 models, then I take a hard pass.

Edit: Perhaps that was a bit too much of an 'in jest' reply. I'm frustrated by the whole mathematical-rituals schtick, when in reality the whole thing is at best an unfalsifiable qualitative guess. The parameter space is immense, and we only have one measurable trajectory through it for validation.
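To make the prior-sensitivity worry concrete, here is a toy sketch (my own hypothetical example, not any actual COVID-19 model): a simple exponential-growth fit where two different priors on the growth rate, updated on the same sparse case counts, produce wildly different projections a month out.

  import numpy as np

  # A few days of hypothetical early case counts (the one observed trajectory).
  days = np.array([0, 1, 2, 3, 4])
  cases = np.array([10, 13, 18, 23, 31])

  # Toy model: cases(t) ~ Poisson(10 * exp(r * t)); grid-based Bayesian update on r.
  r_grid = np.linspace(0.01, 1.0, 500)

  def posterior(prior):
      log_like = np.zeros_like(r_grid)
      for t, c in zip(days, cases):
          lam = 10 * np.exp(r_grid * t)
          log_like += c * np.log(lam) - lam  # Poisson log-likelihood (up to a constant)
      post = prior * np.exp(log_like - log_like.max())
      return post / post.sum()

  # Two priors over the growth rate r: one favouring slow growth, one fast growth.
  prior_slow = np.exp(-((r_grid - 0.1) ** 2) / (2 * 0.05 ** 2))
  prior_fast = np.exp(-((r_grid - 0.5) ** 2) / (2 * 0.05 ** 2))

  for name, prior in [("slow-growth prior", prior_slow), ("fast-growth prior", prior_fast)]:
      post = posterior(prior)
      r_mean = np.sum(r_grid * post)
      projected = 10 * np.exp(r_mean * 30)  # projected cases on day 30
      print(f"{name}: posterior mean r = {r_mean:.2f}, day-30 projection = {projected:.0f}")

With only five data points, the two priors pull the posterior growth rate apart enough that the day-30 projections differ by roughly an order of magnitude, which is the kind of sensitivity I'm grumbling about.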


> The parameter space is immense

This. This. This.

I've seen claims that the number of parameters in the Imperial College agent-based model was ca. 400. Perhaps it was 40.

Compare that to a quadratic or quartic fit to some log-linear data. If you have to represent the ratio of poorly-constrained model parameters (40 or 400) to apparently necessary parameters (say 4) _using Big-O notation_, then science has arguably left the premises.
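For contrast, here is a minimal sketch (hypothetical data, not the Imperial College model) of the kind of low-parameter fit being alluded to: a quadratic fit to the log of case counts, which gets by with three fitted coefficients.

  import numpy as np

  # Hypothetical daily case counts over a few weeks (the "log-linear data").
  days = np.arange(21)
  cases = np.array([12, 15, 19, 25, 33, 41, 55, 70, 88, 110, 140, 170,
                    205, 240, 280, 320, 355, 390, 415, 430, 440])

  # Quadratic fit in log-space: log(cases) ~ a*t^2 + b*t + c  (3 parameters total,
  # versus the tens or hundreds of parameters in an agent-based model).
  coeffs = np.polyfit(days, np.log(cases), deg=2)
  fitted = np.exp(np.polyval(coeffs, days))

  print("Fitted coefficients (a, b, c):", np.round(coeffs, 4))
  print("Day 20 actual vs fitted:", cases[-1], round(fitted[-1], 1))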



