> That's actually quite a long time for an AI model.
Sure, but it's several orders of magnitude smaller than the parent physics-based weather model it's emulating, which may take 2-3 hours of wall-clock time on an HPC system with a few thousand cores allocated to running it.
> I should probably read the paper to find out why but does anyone know? Is the model predicting the weather in 10 minutes time and just run iteratively 2000 times for a 14 day forecast?
Basically, yes. It's run autoregressively but in 6-hour forecast intervals. The added complexity is the iterative nature of the diffusion process used to generate each of those forecast intervals.
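In pseudocode, that rollout structure looks roughly like the sketch below. This is a toy illustration of the autoregressive loop, not the actual model: `diffusion_step` here is a hypothetical placeholder for the iterative denoising that produces each 6-hour forecast, with a trivial update rule standing in for a real sampler.

```python
import numpy as np

def diffusion_step(state, n_denoise=20, rng=None):
    """Hypothetical stand-in for one 6-hour forecast produced by
    iterative diffusion denoising, conditioned on the current state.
    A real sampler would run a learned denoiser over many solver steps."""
    rng = rng or np.random.default_rng(0)
    x = rng.standard_normal(state.shape)  # start from noise
    for _ in range(n_denoise):
        # toy "denoising": pull the sample toward the conditioning state
        x = 0.5 * (x + state)
    return x

def rollout(initial_state, lead_days=14, step_hours=6):
    """Autoregressive rollout: each 6-hour forecast is fed back in
    as the conditioning input for the next one."""
    n_steps = lead_days * 24 // step_hours  # 14 days -> 56 steps
    state = initial_state
    trajectory = []
    for _ in range(n_steps):
        state = diffusion_step(state)
        trajectory.append(state)
    return trajectory

traj = rollout(np.zeros((4, 4)))
print(len(traj))  # 56 forecast steps
```

So the outer loop runs ~56 times for a 14-day forecast, and the "iterative" cost the paper discusses is the inner denoising loop inside each step.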