I've found many benefits in this, such as easily having machine-specific configuration side by side with my unified dotfiles and no backup clutter. This works especially well for things like vim configs, too.
I think Puppet in this case is overkill. A simple bash script with a small function to check whether the file already exists would do just fine, and you don't have to install another application (Puppet).
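Something like this is all it takes (just a sketch; the ~/dotfiles directory and the file names are illustrative):
#!/bin/bash
# Minimal sketch: symlink a few files from ~/dotfiles into $HOME,
# skipping anything that already exists there.
DOTFILES="$HOME/dotfiles"

link_dotfile() {
    local src="$DOTFILES/$1"
    local dest="$HOME/.$1"
    # -h also catches broken symlinks that -e would miss
    if [ -e "$dest" ] || [ -h "$dest" ]; then
        echo "Skipping $dest (already exists)"
    else
        ln -s "$src" "$dest"
        echo "Linked $dest -> $src"
    fi
}

for f in bashrc vimrc gitconfig; do
    link_dotfile "$f"
done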
Nice method, but with a bash script you wouldn't have to manually add any new files to your Makefile. I use this one: https://gist.github.com/975295. Also handles forcibly linking a file if it exists already (with the -f flag).
Of course, you're just using it for one folder so it's not an issue for you ;)
Yeah I can see how this could be overkill. I initially started by writing my own script but I made that script more complicated than necessary (trying to handle removing files and such). So the end result was that I didn't trust the script with my files.
Perhaps there's room for a simple Puppet-style configuration manager that lives in just a single script, in some ways similar to SQLite.
Agreed, I use a relatively simple shell script to link all of my files in a "home" directory in Dropbox to my real home directory on multiple machines:
# named link.sh; greps out some common things we don't want to link, and vim stuff lives in a different dotfiles repo
cd "$(dirname "$0")"
for F in $(ls -a1 | grep -v link.sh | egrep -v "^\.\.?$" | grep -v .DS_Store | egrep -v "\.hg(ignore)?$" | egrep -v "^vim$"); do
    # check for a symlink first, since -e is also true for a symlink that resolves
    if [ -h "$HOME/$F" ]; then
        echo "Already symlinked $F, skipping..."
    elif [ -e "$HOME/$F" ]; then
        echo "**** Found existing $F, skipping..."
    else
        echo "Linking $F"
        ln -s "$PWD/$F" "$HOME/$F"
    fi
done
This would be a little hard for me because I like to keep my system configuration files (such as fstab and the php config) in the same repository, although I could set up a standard naming scheme like root.etc.fstab that links to /etc/fstab.
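A rough sketch of how that naming scheme could be resolved (the root. prefix and the dry-run style are just an illustration):
#!/bin/bash
# Sketch: anything named root.etc.fstab, root.etc.hosts, ... gets linked to the
# matching absolute path, turning the dots in the name into slashes.
# Note: this naive mapping breaks for filenames that themselves contain dots.
shopt -s nullglob
for F in root.*; do
    TARGET="/$(echo "${F#root.}" | tr '.' '/')"
    echo "Would link $PWD/$F -> $TARGET"
    # sudo ln -s "$PWD/$F" "$TARGET"   # uncomment once the dry run looks right
done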
I prefer to keep things simple, and version the home directory directly. I renamed the ~/.git repository to ~/.dotfiles.git in order to keep it out of the way, and set up a broad .gitignore, which can always be overridden with -f. When I need to work on the repository I use the GIT_DIR environment variable. No need for managing symlinks.
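Roughly, the setup looks like this (a sketch, not the exact commands; the tracked filenames are illustrative):
# One-time setup, run from $HOME
cd "$HOME"
git init
mv .git .dotfiles.git            # keep the repo from shadowing other git checkouts
echo '*' > .gitignore            # ignore everything by default; override with add -f
GIT_DIR="$HOME/.dotfiles.git" git add -f .gitignore .bashrc .vimrc
GIT_DIR="$HOME/.dotfiles.git" git commit -m "initial dotfiles"

# Day to day, point git at the hidden repo only while working on it
export GIT_DIR="$HOME/.dotfiles.git"
git status                       # with GIT_DIR set, the cwd ($HOME here) is the work tree
unset GIT_DIR                    # so other repos behave normally again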
That wouldn't work so well for me because I like to keep my system configuration files, like Apache's, in my configuration folder as well. But I can see why that approach appeals to you.
God, I like reading stuff like this, but personally I think articles about keeping your entire home directory under version control are more interesting (Joey Hess does this, I believe, and probably others I don't know about).
But I think I am going to switch to Puppet, because there are a few things I always need done on a new machine: adding myself as a sudoer, installing packages, editing the imapd config (for a local IMAP cache), and so on.
I do something similar with Bitbucket and hg. Over time, I've found I rarely need system-specific branches anymore. Each computer has a .bashrc-local for local shell customizations, kept outside the repo. Otherwise, it's enough to detect the OS flavor and adjust symlinks that way.
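In practice that boils down to a couple of lines like these (a sketch; .bashrc-local is the file mentioned above, the per-OS filenames are illustrative):
# At the end of .bashrc: pull in per-machine tweaks that never enter the repo
[ -f "$HOME/.bashrc-local" ] && . "$HOME/.bashrc-local"

# In the link script: pick OS-specific files by flavor
case "$(uname -s)" in
    Darwin) ln -sf "$PWD/bash_profile.osx"   "$HOME/.bash_profile" ;;
    Linux)  ln -sf "$PWD/bash_profile.linux" "$HOME/.bash_profile" ;;
esac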
I just use SpiderOak. It includes my dot files during its normal backup routine, it has file revisioning included, and all my files are available immediately upon installing SpiderOak to a new computer.
Personally, I just back up my dotfiles via FTP as they don't change very often, but a lot of my fellow Arch Linux users seem to like backing up their dotfiles to GitHub. I have never used GitHub myself, but everyone seems happy with it.
Here's my install script: https://github.com/shazow/dotfiles/blob/master/install.sh