Bevy XPBD: A physics engine for the Bevy game engine (joonaa.dev)
151 points by lukastyrychtr on July 8, 2023 | hide | past | favorite | 20 comments



> Note that after 0.2, development might be a lot more inconsistent for a while because of my studies, as I will be graduating from high school next year

Impressive!


If you're interested in how XPBD works, I can highly recommend the YouTube series of one of the authors who wrote the paper, Matthias Müller: https://www.youtube.com/c/TenMinutePhysics/videos


Reading the presentation https://matthias-research.github.io/pages/tenMinutePhysics/0... , am I right in thinking that vanilla Position-Based Dynamics is literally just how Quake does collision, and then clips velocity against geometry? And the extension is simply to add a softness parameter that interacts in elastic-energy space?

So basically, going from reality to PBD, there are three layers of simplification:

First Level: assume that the discrepancy between inertial and constrained position is caused by a constant force. Guess a force vector, simulate a constant-force scenario in a vacuum, and fine-tune your guess until the final position makes sense, to get the accompanying velocity.

Second Level: actually, we don't really care about the velocity progression within the timestep, so we instead just assume that the discrepancy is caused by a velocity boost at the end of the last frame. Guess an impulse vector, simulate a constant-velocity scenario in a vacuum, and fine-tune your guess until the final position makes sense, to get the accompanying velocity.

Third Level: actually, we don't really even care about the position progression within the timestep, so we instead just snap the position to the closest point that makes sense, say that your object made a beeline from the last position to this position, and your velocity is just the displacement over time.
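That third level is essentially vanilla PBD's core loop. A minimal sketch in Python (the function and parameter names are illustrative, not from any real engine), for a 1-D particle constrained to stay above a floor:

```python
def pbd_step(x, v, dt, gravity=-9.81, floor=0.0):
    """One position-based step: predict, project, derive velocity."""
    v_pred = v + gravity * dt     # integrate external forces
    x_pred = x + v_pred * dt      # inertial (unconstrained) position
    x_new = max(x_pred, floor)    # snap to the closest valid position
    v_new = (x_new - x) / dt      # velocity is just displacement over time
    return x_new, v_new

# A falling particle never penetrates the floor:
x, v = 0.1, 0.0
for _ in range(100):
    x, v = pbd_step(x, v, dt=1/60)
assert x >= 0.0
```

Note that the velocity update is what makes this more than position clamping: the object "remembers" that it was stopped, so momentum into the constraint is discarded rather than accumulated.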


I'm not familiar with the implementation of Quake's physics, but my recollection of the PBD paper was that it was basically a "mathing up" of what was already a fairly common way of handling physics in games. Thomas Jakobsen wrote a very influential paper in 2001 about the character physics in the original Hitman that popularized a lot of the same ideas later presented in the PBD paper.

What is really interesting to me is that later on in 2016 Miles Macklin et al. from Nvidia released the Extended Position Based Dynamics paper (the XPBD referenced in the article), which bridged the gap between hacky-gamey PBD and a principled fully physics-based derivation. The physical derivation was explored and refined further in Primal/Dual Descent Methods for Dynamics.

And finally, most interesting was the Small Steps in Physics Simulation paper by the same Nvidia group, which showed that a simplified variation of XPBD that got rid of iterative solving in order to increase the physics sim framerate is actually a state-of-the-art dynamics solver. As in, many dynamics problems are currently solved most accurately/efficiently using this overgrown hacky algorithm game programmers came up with to make dragging around corpses look better.

Kind of parallels the whole graphics cards for gamers morphing into GPUs for AI transition, just in a more niche way.


The two key insights that drive the game-physics approach of PBD (which follows decades of spaghetti-at-the-wall experimentation) essentially come down to: choosing a source of error that can be controlled, and not throwing away information too readily.

You end up using position because you can then solve for "only give me an answer with a valid position" - addressing it through motion makes it an indirect process, and then errors become subject to positive feedback loops. This biases PBD towards losing energy, but that's desirable for stability, and XPBD reduces the margin of error.

You avoid throwing away information by being cautious about when you "go forward" with a solution to the next timestep, possibly keeping multiple solution sets alive and picking between them heuristically. This is something you can do extensively when you are aiming for simple physics with abstract dynamics (platforming games, fighting games, etc.) - you know what kinds of solutions will "look right" already, therefore test all of them, make a ranking, backtrack as needed. When realism is needed the principle still works - you can still rank solutions by making up a metric - it's just made more complicated by the number of answers you get with complex dynamics.

That explains why XPBD moves away from "substepping" the physics: it's more important to "go wide" and scan for a single, high-quality solution than to try to linearize each aspect and hope that using smaller steps will reduce the error for you, which was a common approach for abstract dynamics and resulted in biases like "x-axis movement is favored over y". The secret sauce in XPBD's design is in getting the desired qualities in a more analytic fashion, without so much brute-force computation.


> That explains why XPBD moves away from "substepping" the physics

Interestingly, XPBD has moved back to substepping! The relatively recent "Small Steps in Physics Simulation" from Nvidia goes into it, but I can outline the reasoning briefly.

In a physics simulation, there are 2 main sources of error, the integrator and the solver. Breaking that down a bit:

The integrator is an algorithm to numerically integrate the equations of motion. Some possibly familiar integrators are Euler, Verlet and Runge-Kutta. Euler is a simple integrator with relatively high error (the error scales linearly with timestep size). The most common version of Runge-Kutta is more complex, but its error scales with the 4th power of the timestep.
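You can see that linear scaling numerically with a toy ODE (nothing here is specific to physics engines; `euler` is just a throwaway name):

```python
import math

def euler(f, x0, t1, n):
    """Explicit Euler with n steps: first-order accurate."""
    x, dt = x0, t1 / n
    for _ in range(n):
        x += f(x) * dt
    return x

# Integrate x' = x from x(0)=1 to t=1; the exact answer is e.
exact = math.e
err_coarse = abs(euler(lambda x: x, 1.0, 1.0, 100) - exact)
err_fine   = abs(euler(lambda x: x, 1.0, 1.0, 200) - exact)

# Halving the step size roughly halves the error:
assert 1.8 < err_coarse / err_fine < 2.2
```

A 4th-order method run through the same experiment would shrink the error by roughly 16x per halving instead of 2x.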

The solver comes into play because the most stable flavors of integrator (so-called implicit or backwards integrators) spit out a nonlinear system of equations you need to solve each physics frame. Solving a nonlinear system to high accuracy is a difficult iterative process with its own zoo of algorithms.
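For intuition, here's a plain Gauss-Seidel sweep on a toy linear system (the projected variant used in physics adds a clamping step for inequality constraints like contacts, but the sweeping structure is the same; names here are illustrative):

```python
def gauss_seidel(A, b, x, sweeps):
    """Repeatedly sweep the equations, updating each unknown in place
    using the most recent values of the others."""
    n = len(b)
    for _ in range(sweeps):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

# Diagonally dominant system: 4x + y = 9, x + 3y = 10
# Exact solution: x = 17/11, y = 31/11.
x = gauss_seidel([[4.0, 1.0], [1.0, 3.0]], [9.0, 10.0], [0.0, 0.0], 50)
assert abs(x[0] - 17/11) < 1e-9 and abs(x[1] - 31/11) < 1e-9
```

The catch mentioned above is the iteration count: each sweep only chips away at the error, so getting the residual low can eat a lot of the frame budget.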

XPBD uses an implicit Euler-esque integrator and a simple, but relatively inefficient, Projected Gauss-Seidel solver. For most games, the linear error from the integrator is ugly but acceptable when running at 60 or even 30 frames a second. Unfortunately, for the solver, you have to spend quite a bit of time iterating to get that error low enough. The big insight from the "Small Steps" paper is that the difficulty of the nonlinear equations spat out by the integrator scales with the square of timestep (more or less -- nonlinear analysis is complicated). So if you double your physics framerate, you only have to spend a quarter of the time per frame in the solver! It turns out generally the best thing to do is actually run a single measly iteration of the solver each physics frame, and just fill your performance budget by increasing your physics frames-per-second. This ends up reducing both integrator and solver errors at no extra cost.
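A rough sketch of that scheduling idea, using a 2-link pendulum of unit-length distance constraints (all names are invented, and the constraint projection is plain PBD rather than full XPBD with compliance): spend the frame budget on many tiny substeps with a single Gauss-Seidel pass each, rather than many solver iterations on one big step.

```python
import math

GRAVITY_Y = -9.81

def project(a, b, rest, w_a, w_b):
    """One position projection of a distance constraint between a and b,
    split according to the inverse masses w_a, w_b."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    d = math.hypot(dx, dy)
    corr = (d - rest) / (d * (w_a + w_b))
    a[0] += w_a * corr * dx; a[1] += w_a * corr * dy
    b[0] -= w_b * corr * dx; b[1] -= w_b * corr * dy

def frame_small_steps(pts, vel, dt, substeps):
    """One frame: many substeps, a single constraint pass per substep."""
    h = dt / substeps
    for _ in range(substeps):
        prev = [p[:] for p in pts]
        for p, v in zip(pts, vel):
            v[1] += GRAVITY_Y * h              # integrate gravity
            p[0] += v[0] * h; p[1] += v[1] * h
        # one Gauss-Seidel pass over both constraints -- no inner loop
        project([0.0, 0.0], pts[0], 1.0, 0.0, 1.0)  # immovable anchor -> p0
        project(pts[0], pts[1], 1.0, 1.0, 1.0)      # p0 -> p1
        for p, v, q in zip(pts, vel, prev):         # velocity from displacement
            v[0] = (p[0] - q[0]) / h; v[1] = (p[1] - q[1]) / h

# After a second of swinging, both links stay close to unit length:
pts = [[1.0, 0.0], [2.0, 0.0]]
vel = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(60):
    frame_small_steps(pts, vel, dt=1/60, substeps=8)
assert abs(math.hypot(pts[0][0], pts[0][1]) - 1.0) < 0.05
assert abs(math.hypot(pts[1][0] - pts[0][0], pts[1][1] - pts[0][1]) - 1.0) < 0.05
```

The single pass never fully satisfies both constraints at once, but because each substep is tiny, the leftover residual is tiny too, which is the paper's point.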


Just related to this: I found a neat overview of various PBD-adjacent developments - XPBD is far from the only thing: https://doi.org/10.1002/cav.2143


`TypeError - Cannot convert argument to a ByteString because the character at index 14 has a value of 283 which is greater than 255.`

That's the second Netlify app I've seen crashing like this after being linked from HN this week... Strangely enough, archive.is was able to scrape the site just fine without an error[0]

[0] https://archive.is/557mo


Looks really nice and well integrated. I'm a bit surprised not to see any mention of soft bodies.

Does anyone know if they've implemented the "Small Steps" paper as well?


Very cool; it's nice to see some development in this area since it seems very lacking compared to graphics frameworks. Afaik, the only real options for indie engine developers are PhysX, Bullet, or ReactPhysics3D. I'm unable to get consistent behavior in ReactPhysics3D, and Bullet has notoriously terrible documentation. PhysX has served me well, but it would be cool to see a more diverse ecosystem.


Rapier has a Bevy plugin - is there a reason you don't see it as a real option, at least as much as this bevy_xpbd?

https://rapier.rs/docs/user_guides/bevy_plugin/getting_start...


There is now also JoltPhysics https://github.com/jrouwe/JoltPhysics which is used in Horizon Forbidden West.


Semi-off topic, but related: does anyone know of a Rust (or Bevy) library for working with Tetrahedral meshes?

Most of the interesting stuff being done with the XPBD solver in SideFX Houdini uses them (e.g. for soft body simulations).


Interesting. So bevy_rapier isn't liked as much because it doesn't map onto Bevy's ECS nicely. Does everything need to integrate with the ECS? How does it help a physics engine, specifically?


I write C++, and I have little hope or motivation to learn Rust. I would gladly learn it in a class with a good teacher, but not without one. Programming languages should not have a steep learning curve.

Disclaimer: I managed to parse an embryonic programming language using a parser generator (lexy).


If you have a good understanding of modern C++ then I don't think you'll find Rust that hard to learn. I think most people talking about a steep learning curve are probably coming from garbage collected languages.

I've found Rust just formalises, and gives the compiler the ability to check and enforce, concepts already familiar to most C++ programmers confident with move semantics, the lifetime of temporaries, etc.

Deciphering errors from the borrow checker still takes some practice, but for C++ programmers used to deciphering template instantiation errors it's not so hard.


Maybe not for you, but I'm pretty sure most people consider the learning curve of C++ "steep" too.


Not really - C is easy enough, and you can still use a small subset of C++ that is quite easy to use.

That's not possible in Rust.


Rustlings gives a great introduction to the language:

https://github.com/rust-lang/rustlings

Disclaimer: I write JavaScript


I'm not a big Rust guy and have no plans on really diving deep into it for any of my projects. But I'm very impressed with what people have accomplished with it, and I love the tooling around it, so I still think I might be using it for things. That said, I don't currently program in C++ either, and I suspect that the features they (C++ and Rust) enable don't contribute to the success of the projects made with them as much as other things having more to do with good project management.



