I'm looking for a better way of managing my dot/configuration files across multiple environments.
What I'm Doing Currently
I've got a dotfiles git repo with some build scripts that dynamically build files, detect the operating system, and fill in the templates accordingly. This works okay-ish. Sort of. It could be better. The downside is that I'm always making teeny tiny tweaks to my dotfiles (particularly my .vimrc and .zshrc files), and so the individual instances drift apart pretty quickly. This is driven by the fact that I always forget (read: am too lazy) to do "commit/push" after I make a small change, and "pull/build/deploy" whenever I sit down at a different workstation. So, I almost always end up re-writing the dotfile for any given new workstation from an obsolete version. Which is lame.
Features I'd Like
- A non-local central place to store my configuration files.
- A method of making them OS- or machine-specific. For example, I've got an alias on my Mac that aliases rm to trash for a more ergonomic delete. I don't necessarily need this on a Linux server, and I definitely don't need it on a Windows machine (see the sketch after this list).
- Ideally, it should be able to handle tiny tweaks robustly, so that it syncs up after the change (or periodically) without me having to type a bunch of git commands.
- It would be nice if it was easy to deploy to a new, fresh machine. What this means is that if it uses a build or deploy script, that should run without having to install a bunch of stuff beforehand, since that kind of defeats the purpose. I'm currently using a Ruby script, but, as you might know, Ruby is anything but simple to set up and it's certainly not consistent across OS's.
- There are certain pieces (keys, usernames, etc.) that are needed to fully configure a machine, but I don't necessarily want to put those out in public. I'm not sure how to separate the secret/private stuff from the public stuff.
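To illustrate the rm-to-trash case from the list above, the guard in a shared .zshrc is just something like this (a minimal sketch, assuming a trash command is installed on the Mac, e.g. via Homebrew):

```sh
# Only define the macOS-specific alias when we're actually on macOS.
if [ "$(uname -s)" = "Darwin" ]; then
  alias rm='trash'   # assumes a `trash` CLI is installed
fi
```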
Anyways, I'm happy to implement as much custom scripting as needed. In fact, I super duper love it. I'm just looking for some ideas.
Top comments (13)
I use stow. I have a repo where each folder in there is a configuration for some thing, like zsh, bash, vim, tmux, etc. Then I have a setup script, setup.sh, that runs stow on everything I want to run it on. I also have a flag in there so if I run it as root, it installs only what root needs. Each of those names like tmux is a folder in my repo's dir, and when I install them I default to my home directory, so the command that is run equates to stow -v -R -t ${HOME} tmux, for example.
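A stripped-down sketch of what a script like that boils down to (the package list and the root check here are illustrative, not the actual setup.sh):

```sh
#!/usr/bin/env bash
# Illustrative setup script: stow each package folder into $HOME.
set -e

PACKAGES="zsh bash vim tmux"   # each name is a folder in the dotfiles repo

# Root only gets the subset it actually needs.
if [ "$(id -u)" -eq 0 ]; then
  PACKAGES="bash vim"
fi

for pkg in $PACKAGES; do
  # -R restows (cleans up stale links), -t sets the target directory.
  stow -v -R -t "${HOME}" "$pkg"
done
```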
And the directory structure inside each folder mirrors what should end up in $HOME, so that will link the .tmux directory into my $HOME directory. tmuxp is a python binary for managing preset tmux pane configs, like tmuxinator.
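To make that layout concrete, each package folder mirrors the target directory, something like this (file names are hypothetical):

```
dotfiles/
└── tmux/             # the stow package folder
    ├── .tmux.conf    # becomes ~/.tmux.conf
    └── .tmux/        # becomes ~/.tmux/ (the linked directory)
        └── plugins/
```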
Lastly, in my .bashrc and .zshrc (I default to zsh) I check if the file $HOME/.env_secrets is available, which will have API keys and other secret keys, and it is NOT kept in git. This way I can manage secrets, db connections, etc. I would link mine, but it has stuff in it I don't want public. But if there's any part of the config you want to know about, just let me know.
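That check is just a guarded source along these lines (the file name matches what's described above; the rest is illustrative):

```sh
# In .bashrc / .zshrc: pull in secrets only if the (untracked) file exists.
if [ -f "$HOME/.env_secrets" ]; then
  . "$HOME/.env_secrets"
fi
```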
I also have a _setup directory that has anything specific I want to run based on OS, such as installing applications. I run Arch, so I have an arch_setup.sh, and an osx_setup.sh for installing things in Homebrew, though that one is pretty out of date since I haven't updated it for OSX in a while, as I have been using Linux.
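Choosing which of those to run is typically just a small dispatch in the setup script (a sketch, not the exact code):

```sh
# Pick the OS-specific installer (illustrative paths).
case "$(uname -s)" in
  Linux)  ./_setup/arch_setup.sh ;;
  Darwin) ./_setup/osx_setup.sh ;;
esac
```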
Now that I think about it, might be a good post to talk about how I have this setup. lol
One last thing: at the start of the setup script I make sure to get any submodules, so I can pull in tmux/nvim configs I use that others make. That way I can still use them without maintaining them.
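That submodule step is normally just the standard git one-liner:

```sh
# Pull in third-party tmux/nvim configs tracked as submodules.
git submodule update --init --recursive
```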
Stow is great.
Wow! There is a lot here. Thank you! I haven’t even heard of stow, I’ll look into it. Thanks for such a detailed answer!
Of course. This "how to manage dotfiles" question is a path I have treaded on and off for a long time, and I have gone through a handful of iterations. Sure, there are tools out there that give you command line binaries to add/remove things in your env, but I found that they tend to overcomplicate things.
I have a ~/.dotfiles directory that I put all my dotfiles in and store them on GitHub. I use .gitignore for any dotfiles that should only be local. I have a script that will source all the files in that directory, and I just point my ~/.bash_profile at the index.

Here is my git repo; it's minimal, I don't use a huge amount of aliases.
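The "index" approach boils down to a loop like this (file names are illustrative, not the actual repo layout):

```sh
# ~/.bash_profile just does:  . "$HOME/.dotfiles/index.sh"

# index.sh (hypothetical name): source every other file in the directory.
for f in "$HOME"/.dotfiles/*; do
  case "$f" in
    */index.sh) ;;                 # skip ourselves
    *) [ -f "$f" ] && . "$f" ;;
  esac
done
```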
I'm in a similar boat. I also have a git repo for my dotfiles, and I also make tiny tweaks that I forget to "push upstream," mostly in my .vimrc.
It's not much help, but I honestly think I'll just stick with what I've got. Since it's only used across 3-4 machines, it's not worth it to me personally to set up something more complex or automated.
just my unhelpful $.02
I use stow and a custom install script and just... remember to push and pull, I guess. I do have conditionals in my configuration files where possible to let them work on systems I have to use for work as well as systems I prefer.
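Those conditionals tend to look something like this (hostnames and file names made up):

```sh
# Work machines pull in extra config; personal machines skip it.
case "$(hostname)" in
  work-*) [ -f "$HOME/.work_profile" ] && . "$HOME/.work_profile" ;;
esac
```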
I've also experimented with using different branches for different systems, but that just seems like more trouble than it's worth to remember to commit stuff back to master when appropriate.
Stow is a bad idea, in my opinion. It separates each config file into its own folder with the same structure as your home directory, rather than collecting the conf files together in one main folder with that structure. And it doesn't have built-in support for grouping files for different machines or OSes.
Check the dotfiles page on the ArchWiki for all kinds of tools.
If you want to group some of the confs for machine 1 and some for machine 2, the best grouping solution is provided by dotdrop, which groups files through its config settings rather than duplicating them into another folder.
Features of dotdrop
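For the grouping part specifically, dotdrop drives everything from a YAML config, roughly along these lines (names and paths are made up; check the dotdrop docs for the exact schema):

```yaml
# config.yaml (illustrative)
config:
  dotpath: dotfiles        # where the stored copies live in the repo
dotfiles:
  f_vimrc:
    src: vimrc
    dst: ~/.vimrc
  f_aliases_mac:
    src: aliases_mac
    dst: ~/.aliases_mac
profiles:
  laptop:                  # e.g. `dotdrop install -p laptop`
    dotfiles:
    - f_vimrc
    - f_aliases_mac
  server:
    dotfiles:
    - f_vimrc
```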
What about putting all your .dotfiles in a cloud service like dropbox? It will keep everything in sync and you don't have to do anything as far as that is concerned.
The only thing you have to do is set up symlinks or source your .dotfiles from dropbox.
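The symlink route is just something like this (paths are hypothetical):

```sh
# Keep the real files in Dropbox; the home directory only holds links.
ln -sf "$HOME/Dropbox/dotfiles/.vimrc" "$HOME/.vimrc"
ln -sf "$HOME/Dropbox/dotfiles/.zshrc" "$HOME/.zshrc"
```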
For points #1, #3, and #5, dropbox would be your friend.
You could still keep things in sync with a remote repo while using .gitignore for sensitive data. But if things are synced with dropbox then you are okay.
One issue is that if you need to build and deploy every time there is a change, you will need some cron job or maybe a file-monitoring script (if a file changes, then build and deploy).
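The cron version of that could be as simple as an entry like this (the interval and script name are made up):

```sh
# crontab -e: rebuild/redeploy the dotfiles every 15 minutes.
*/15 * * * * cd "$HOME/dotfiles" && ./build.sh >/dev/null 2>&1
```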
edit: after I hit 'submit' I remembered that dropbox recently changed their pricing strategy. You used to be able to have an unlimited number of devices synced; now it's limited to 3 on the free plan. I don't know how many systems you have to sync with, but a cloud-based storage of some kind might help - dropbox or otherwise.
Ohh that’s not a bad idea either. I’ll take a look at a few different cloud storage strategies. Thanks!
I don't really have any good suggestions for the others, but for feature request 2 you could conditionally source files based on their existence. I do this with work related aliases and environment variables.
I keep all my dotfiles in a git repo (github.com/gonsie/dotfiles). The readme there has some details, but at a high level: I download the repo and keep that separate from my 'installed' dotfiles. I use a synch.sh script to synchronize the repo with what is installed, and I use git to synchronize across machines. The synch script has some smarts and knows how to set up a new machine; otherwise, it diffs the repo and installed files to figure out what I've changed.
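A heavily simplified sketch of that diff-and-decide idea (not the actual synch.sh from that repo):

```sh
#!/usr/bin/env bash
# Report drift between the repo copies and the installed dotfiles.
REPO="$HOME/dotfiles"

for src in "$REPO"/*; do
  [ -f "$src" ] || continue        # skip directories in this sketch
  name=$(basename "$src")
  dst="$HOME/.$name"
  if [ ! -e "$dst" ]; then
    echo "missing: $dst (new machine? install from repo)"
  elif ! diff -q "$src" "$dst" >/dev/null; then
    echo "changed: $dst differs from the repo copy"
  fi
done
```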
I can recommend yadm. It stores your dotfiles in a git repo.
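Day to day, yadm behaves like git pointed at your home directory, so the workflow is roughly (URL is a placeholder):

```sh
# Bootstrap on a new machine.
yadm clone https://example.com/you/dotfiles.git

# Normal workflow uses the same verbs as git.
yadm add ~/.vimrc
yadm commit -m "tweak vimrc"
yadm push
```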