Docker Compose for those services, while the language itself runs natively, has been the best solution to this problem for me in the past: Docker Compose for Redis, Postgres, Elastic, etc.
IMO Docker for local dev is most beneficial for Python, where local installs are so all over the place.
This is exactly what I recently set up for our small team: use Docker Compose to start Postgres and Redis, and run Rails and Sidekiq natively. Everyone is pretty happy with this setup: we no longer have to manage Postgres and Redis via Homebrew, and it means we're using the same versions locally and in production.
If anyone is curious about the details, I simply reused the existing `bin/dev` script set up by Rails by adding this to `Procfile.dev`:
```
docker: docker compose -f docker-compose.dev.yml up
```
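For reference, here's a minimal sketch of what that `docker-compose.dev.yml` could look like (the image tags, password, and volume name are placeholders, not our actual file; pin the versions to whatever you run in production):

```yaml
# docker-compose.dev.yml -- hypothetical sketch; pin image tags to match production
services:
  postgres:
    image: postgres:16          # placeholder tag
    environment:
      POSTGRES_PASSWORD: postgres
    ports:
      - "5432:5432"
    volumes:
      - pg_data:/var/lib/postgresql/data
  redis:
    image: redis:7              # placeholder tag
    ports:
      - "6379:6379"

volumes:
  pg_data:
```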
The only issue is that foreman (the gem used by `bin/dev` to start multiple processes) doesn't have a way to mark one process as depending on another, so this relies on Docker starting the Postgres and Redis containers fast enough that they're up and running before Rails and Sidekiq need them. In practice it means we have to run `docker compose` manually the first time (and, I suppose, every time we update to new versions) so that Docker downloads the images and caches them locally.
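One possible mitigation (just a sketch, not something we've adopted yet): give the containers healthchecks, then start them with `docker compose up --wait`, which pulls missing images, runs detached, and blocks until every service reports healthy:

```yaml
# Hypothetical healthcheck additions to the services in docker-compose.dev.yml
services:
  postgres:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 2s
      retries: 15
  redis:
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 2s
      retries: 15
```

The catch is that `--wait` only helps if something runs it *before* foreman starts, rather than as one of foreman's processes.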
For your issue, could you handle bringing up your docker-compose 'manually' in `bin/dev`? Maybe conditionally, by checking whether the image exists locally with `docker images`, then tearing it down and running foreman after it completes?
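Something like this rough sketch of a `bin/dev` wrapper, assuming Compose v2 and healthchecks on the services (`up` pulls any images missing locally on the first run, and `--wait` blocks until the services are healthy):

```sh
#!/usr/bin/env bash
# bin/dev -- hypothetical sketch; starts the containers before foreman
set -e

# Pulls missing images on first run, then blocks until healthchecks pass
docker compose -f docker-compose.dev.yml up --detach --wait

exec foreman start -f Procfile.dev "$@"
```

(The `docker:` line would come out of `Procfile.dev`, since the script now owns the containers.)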
Yeah, and Python has been most of my exposure to local Docker, so that may be coloring my experience here. Running a couple of off-the-shelf applications in the background with Docker isn't too bad, but having it at the center of your dev workflow, where you're constantly rebuilding a container around your own code, is just awful in my experience.