I know this is a dumb/basic question, but it's one I've wondered about and don't fully understand. How is Docker that much easier/better than just writing some other kind of script that configures a machine for your app to run in and deploys it?
Docker is like a package manager, except it's better than that: it's a package manager that works across distributions. There's already a full stack for just about everything, and you don't have to go through the normal package-management red tape.
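To make that concrete, here's a rough sketch of what "grabbing a full stack" looks like; redis and postgres are real images on the public registry, but the tags and options you'd actually want are up to you:

```
# Pull prebuilt stacks straight from the public registry: no distro
# packages, no PPAs, no "which version does my distro ship" questions.
docker pull redis
docker pull postgres

# Run one; the container carries its own userland, so it behaves the
# same on Ubuntu, Fedora, CoreOS, or anything else running Docker.
docker run -d --name cache redis
```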
Docker also provides a level of separation and isolation that makes configuration easier to manage and improves security.
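As a hedged example (the image name and values are made up; the flags are standard docker run options):

```
# The container gets its own filesystem, process table, and network
# stack. Configuration is injected per container instead of edited
# into the host, and only the ports you publish are reachable.
docker run -d --name myapp \
  -e APP_ENV=production \
  -p 8080:80 \
  myorg/myapp:1.0
```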
It gives you building blocks with a standard interface (ports, volumes, environment variables) for connecting to other pieces.
It allows me to easily customize a build off of an existing stack.
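For instance, "customizing off an existing stack" is mostly just a FROM line in a Dockerfile. The sketch below assumes a Flask app served by gunicorn; the package choices and module name are arbitrary:

```
# Start from someone else's working stack...
FROM python:2.7

# ...and layer your own customizations on top. The interface to the
# outside stays the same regardless of what's inside: ports, volumes, env.
RUN pip install flask gunicorn
COPY . /app
WORKDIR /app
EXPOSE 8000
CMD ["gunicorn", "-b", "0.0.0.0:8000", "app:app"]
```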
It provides a binary distribution of an application, so you know the whole system that you are deploying is exactly the same one you tested.
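Concretely, you ship the image itself rather than a recipe for rebuilding it; the registry and tag names here are placeholders:

```
# Build and test once, then push the binary image to a registry...
docker build -t registry.example.com/myapp:1.0 .
docker push registry.example.com/myapp:1.0

# ...and on the production host, pull and run that exact same image.
docker pull registry.example.com/myapp:1.0
docker run -d registry.example.com/myapp:1.0
```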
Not to mention that, because intermediate build steps are committed as images, you don't have to worry about a mistake in your script costing you everything that came before it.
Sometimes (okay, every day) I find myself trying a bunch of different things to get something to work. With Docker, you've got a series of "save points" along the way. You can continue building an image from the last good point, and end up with a really clean image that represents the shortest path from bare install to "what I need to get done."
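Those save points are just Docker's layer cache. A rough illustration (the package names and download URL are made up): each instruction produces a cached intermediate image, so only the step you're still fiddling with gets rebuilt.

```
FROM ubuntu:14.04

# Cached after the first successful build.
RUN apt-get update && apt-get install -y build-essential curl

# Also cached; editing only the step below won't rerun this one.
RUN apt-get install -y libxml2-dev libssl-dev

# The step I'm still experimenting with: only this layer is rebuilt.
RUN curl -fsSL https://example.com/tool.tar.gz | tar xz -C /opt
```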
I was playing around with a Perl script about a month ago that would strip all the RUN statements out of a Dockerfile, convert them into a shell script and use that to bootstrap a Vagrant box. So what I'm saying is, if you don't know what you're doing (or just like to experiment with OSes), Docker is really interesting.
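That script isn't reproduced here, but a crude shell sketch of the same idea looks something like this (it ignores multi-line RUN statements and everything that isn't a RUN):

```
# Rough sketch only: turn every RUN line in a Dockerfile into a line
# of a provisioning script you could hand to Vagrant.
{
  echo '#!/bin/sh'
  grep '^RUN ' Dockerfile | sed 's/^RUN //'
} > bootstrap.sh
chmod +x bootstrap.sh
```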