Hmm interesting. Do you think deterministic simulation is a much more accessible approach than verification? Also curious what tooling you think is still missing/not up to par?
I assume by deterministic simulation you mean a harness that lets you deterministically run specific executions of a distributed system (e.g., forcing particular event interleavings, failures, etc.)?
If so, yes, it's more accessible. The paper I linked earlier (Sieve) used such a technique to reliably execute test plans, so that any bugs found can be reproduced. But when it comes to correctness, I still think verification is much more powerful.
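To make the idea concrete, here's a minimal sketch of that kind of harness (my own toy illustration, not Sieve's actual design): a seeded RNG picks which pending event to deliver next, so a single seed fully determines the interleaving and any failing run can be replayed exactly.

```python
import random

class Sim:
    """Toy deterministic scheduler: the seed fully determines the interleaving."""
    def __init__(self, seed):
        self.rng = random.Random(seed)
        self.pending = []   # (node, event) pairs not yet delivered
        self.log = []       # delivery order observed in this run

    def send(self, node, event):
        self.pending.append((node, event))

    def run(self):
        # Deliver pending events in an order chosen by the seeded RNG.
        while self.pending:
            i = self.rng.randrange(len(self.pending))
            node, event = self.pending.pop(i)
            self.log.append(f"{node}:{event}")
        return self.log

def interleaving(seed):
    sim = Sim(seed)
    for node in ("a", "b"):
        for ev in ("prepare", "commit"):
            sim.send(node, ev)
    return sim.run()

# Same seed -> same schedule, so a bug found under seed 42 reproduces on demand;
# sweeping over seeds explores many distinct interleavings.
assert interleaving(42) == interleaving(42)
```

Real harnesses also virtualize time, network faults, and disk I/O behind the same seeded scheduler, but the reproducibility argument is the same.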
W.r.t. tooling, I think there's still quite a usability and knowledge gap to cross. The way I see it, we've had decades to figure out how to engineer complex software systems from simpler parts, with a lot written about software engineering and testing in general.
Full-system verification, in contrast, is still really young, and breaking a system up into an implementation, a specification, and proofs currently requires considerable creativity and even awareness of how the verifier works. The Anvil project I mentioned above brought together systems, distributed computing, reliability, and verification experts (including Verus authors) to pull this off.
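For a flavor of what that implementation/specification split looks like, here's a tiny Verus-style sketch (hedged: this is a toy in the spirit of the Verus tutorial, not code from Anvil). The `spec fn` is the mathematical specification, the executable `fn` is the implementation, and the `ensures` clause is the proof obligation the verifier discharges:

```rust
use vstd::prelude::*;

verus! {

// Specification: pure math over unbounded ints, never compiled to runtime code.
spec fn min_spec(x: int, y: int) -> int {
    if x <= y { x } else { y }
}

// Implementation: ordinary executable Rust over machine integers.
// The `ensures` clause ties it back to the spec; Verus proves the link.
fn compute_min(x: u64, y: u64) -> (r: u64)
    ensures r == min_spec(x as int, y as int)
{
    if x <= y { x } else { y }
}

} // verus!
```

Even in this toy, notice you already need to know some verifier mechanics (spec vs. exec modes, `int` vs. `u64`); scaling that split to a whole Kubernetes controller, as Anvil does, is where the creativity comes in.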