Hacker News
Way to store your dotfiles: A bare Git repository (2016) (atlassian.com)
121 points by goranmoomin on April 6, 2019 | 58 comments


I use something even simpler

My dotfiles git repo is meant to be cloned in my home directory. It comes with this .gitignore committed in the repo:

  /*
  !/.vim/
  /.vim/.netrwhist
Basically it ignores everything in my home directory, unless I explicitly `git add` it, which matches my workflow. For the few cases where I want to notice changes (like the entire ~/.vim/ subdirectory), I explicitly un-ignore it as you can see above.
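For anyone wanting to try this, here's a minimal sketch of the setup. A temp dir stands in for $HOME so the demo is self-contained; in real use you'd run it in your actual home directory:

```shell
# Sketch of the workflow above; a temp dir stands in for $HOME.
demo=$(mktemp -d)
cd "$demo"
git init -q
cat > .gitignore <<'EOF'
/*
!/.vim/
/.vim/.netrwhist
EOF
# "/*" ignores everything at the top level, .gitignore included,
# so tracking any new file takes an explicit force-add:
git add -f .gitignore
git -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "ignore everything by default"
touch .bashrc          # untracked and ignored: status stays clean
git status --short     # prints nothing
```

Note the `-f`: since `/*` matches everything, a plain `git add` on a new file will refuse with "paths are ignored".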

The only downside I've experienced is that my bash PS1 prompt shows the status of the dotfiles repo (branch/dirty/etc) in any directory inside my home dir. I've learnt to ignore it, and it doesn't interfere with cd-ing into an actual directory that's its own git repo.


That's clever, thanks for sharing this technique!

In the past I wrote a bash setup script for my dotfiles repository which pretty much does the opposite, symlinking a combination of shared and OS-specific directories and files into my home dirs. One definite advantage of your technique is that no special setup script is required. I'm thinking I can obviate the need for splitting common and OS-specific dirs by detecting whether the OS is Linux or macOS within the scripts themselves and using an if statement to gate their execution or sourcing.

This concept is related to another HN personal favorite of mine: "Best thing in your bash_profile / aliases" [0]. Lots of interesting command-line shell optimization and slick hack ideas in there.

Thanks again for showing me a superior way :)

[0] https://news.ycombinator.com/item?id=18898523


i do the same (dotfiles with symlinking script) and it’s mostly great, but I’d recommend against os-specific switching in the scripts and files themselves. it’s the way i’d always done it previously but i switched to git-branch-per-system and it’s much better. the problems started when “chromebook linux” was different than “other laptop linux” and then “server linux” (not to mention os x) and there kept being more and more messy logic. separate git branches is way easier and has the benefit of a sort of “inheritance” of the shared stuff (branching and rebasing)


I think that method requires git to scan all the files in the directory so it can then ignore them. The advantage of the “showUntrackedFiles no” method is that git will only look at the tracked files, which is much faster if you have a million files in your home dir, like I do. (Or so I believe.)
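For reference, here is the bare-repo setup from the article that this comment refers to, sketched with a temp dir standing in for $HOME and a shell function in place of the article's alias (aliases don't expand in non-interactive scripts):

```shell
# The article's bare-repo technique, condensed:
home=$(mktemp -d)                  # stand-in for $HOME in this demo
git init -q --bare "$home/.cfg"
config() { git --git-dir="$home/.cfg" --work-tree="$home" "$@"; }
config config status.showUntrackedFiles no
touch "$home/untracked-file"       # untracked files in the work tree...
config status --short              # ...produce no output at all
```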


I tried quickly looking into this to verify the claim, but haven't yet found anything to explain why this would be the case. Do you have any details about how or why git will scan everything in this case?

I did find a pretty neat stack overflow post on the subject of gitignore whitelisting [0]. If we can get to the bottom of possible performance impact, would be cool to add the info there.

[0] https://stackoverflow.com/a/15320746/293064


Without spending too much time on it, I think a cursory check shows git doing "the smart thing" with ignore rules.

  $ mkdir foo
  $ cd foo
  $ mkdir $(seq 1 1000)
  $ git init .
  $ strace -c git status
    On branch master

    No commits yet

    nothing to commit (create/copy files and use "git add" to track)
    % time     seconds  usecs/call     calls    errors syscall
    ------ ----------- ----------- --------- --------- ----------------
     75.97    0.058830          57      1035        15 openat
     16.48    0.012764           6      2002           getdents
      3.87    0.002999           3      1021           close
      2.91    0.002254           2      1022           fstat
      0.19    0.000147           8        18        14 lstat
      0.13    0.000100           4        24           read
      0.12    0.000092           6        16        12 stat
      0.09    0.000072           3        21        14 access
      0.06    0.000048          10         5           write
      0.04    0.000033          33         1           unlink
      0.03    0.000020           3         8           rt_sigaction
      0.02    0.000018           2        12           mprotect
      0.02    0.000015          15         1           munmap
      0.01    0.000010           3         4           getcwd
      0.01    0.000006           0        16           mmap
      0.01    0.000006           6         1           ioctl
      0.01    0.000005           2         3           brk
      0.01    0.000004           4         1           chdir
      0.00    0.000003           2         2           getpid
      0.00    0.000003           3         1         1 readlink
      0.00    0.000002           1         2           rt_sigprocmask
      0.00    0.000002           2         1           set_tid_address
      0.00    0.000002           2         1           set_robust_list
      0.00    0.000002           2         1           prlimit64
      0.00    0.000000           0         1           execve
      0.00    0.000000           0         1           arch_prctl
    ------ ----------- ----------- --------- --------- ----------------
    100.00    0.077437                  5221        56 total
Now let's add an ignore-all and check:

  $ echo "/*" > .gitignore
  $ strace -c git status
    On branch master

    No commits yet

    nothing to commit (create/copy files and use "git add" to track)
    % time     seconds  usecs/call     calls    errors syscall
    ------ ----------- ----------- --------- --------- ----------------
     46.25    0.000568         284         2           getdents
     16.78    0.000206           6        35        14 openat
      7.41    0.000091           6        16        12 stat
      7.17    0.000088           4        21        14 access
      5.62    0.000069           3        25           read
      4.72    0.000058           3        18        14 lstat
      4.15    0.000051           2        22           close
      2.93    0.000036          36         1           unlink
      2.28    0.000028           1        23           fstat
      0.81    0.000010           3         4           getcwd
      0.65    0.000008           1         8           rt_sigaction
      0.41    0.000005           5         1           ioctl
      0.33    0.000004           4         1           chdir
      0.24    0.000003           2         2           getpid
      0.24    0.000003           3         1         1 readlink
      0.00    0.000000           0         5           write
      0.00    0.000000           0        16           mmap
      0.00    0.000000           0        12           mprotect
      0.00    0.000000           0         1           munmap
      0.00    0.000000           0         3           brk
      0.00    0.000000           0         2           rt_sigprocmask
      0.00    0.000000           0         1           execve
      0.00    0.000000           0         1           arch_prctl
      0.00    0.000000           0         1           set_tid_address
      0.00    0.000000           0         1           set_robust_list
      0.00    0.000000           0         1           prlimit64
    ------ ----------- ----------- --------- --------- ----------------
    100.00    0.001228                   224        55 total


Apparently either I’m remembering an ancient and fixed git behavior, or I’m just totally wrong. Either way, awesome, this is much simpler!


You can use the `git check-ignore` command to check if a certain directory is ignored. I used it in my zsh config to suppress the status of ignored directories in my prompt.


I don't understand how the curl http://site.com | bash anti-pattern has become so widespread. Especially with -k.


It originated as a way for people who aren't familiar with the CLI to install things. People have now been trained to expect this level of simplicity. I've worked with people who will blindly copy and paste these lines into terminals, having absolutely no idea what they do, and even blindly type in their sudo password when prompted. It's basically the worst of all worlds from a security perspective. In my opinion this practice should be burned to the ground.

Normally when I see this I will manually download the bash script, read through exactly what it does, and manually type in each command instead of running the script directly. This way I know what it is doing, and it can't hide command output by piping into /dev/null and doing something without my knowledge.

Seriously people. Never. Trust. Bash Scripts.


There's this cautionary tale you might find interesting:

https://www.vidarholen.net/contents/blog/?p=746

And seriously, just use ShellCheck. It should be in your repos.


Downloading and reading the bash script isn't a solution either. Even if it isn't deliberately malicious, those things usually want to just puke files into some arbitrary corner of your system or homedir - really the author just made a "works fine on my system" crutch rather than doing the actual work of packaging. Then to double down, they'll often add some "clever" hooks to self-auto-update your local junkheap from their git nightly, because releasing deliberate versions doesn't "move fast and break enough things".

I either want to pull from the standard distribution repositories and rely on things updating automatically, compile from source tarballs with explicit version numbers, or at the very worst have a path-independent binary tarball that can be unpacked anywhere. If you can't manage any of these, then your project simply isn't ready for general availability.


I've seen scripts that "clean up" with an 'rm *.o' statement (for example), which strongly suggests the potential for total disaster if you blindly run it from the wrong directory. Glad I'm not the only one with script paranoia.


Especially since it is possible, server-side, to detect the difference between merely downloading a script and actually piping it to a shell[0], you should never do this even if you've examined the script first.

[0] - https://www.idontplaydarts.com/2016/04/detecting-curl-pipe-b...


I don't know why you are being downvoted, but I'd like to know. I have a very official and very governmental API that isn't properly set up, and the official doc says to use curl -k to talk to it. I had a long argument with one of the devs, with a PoC and a live example, about how it was a bad idea, especially considering how easily it could be fixed, but... the 'feature' is still there.

I suppose the next $2,000-a-day consultant will get them the memo.


It's fair to say that the technique described by SneakyCobra is amazing. I previously used this to manage my own dotfiles, but there are still a few problems with SneakyCobra's approach:

* Initial setup can be tricky, even for experienced users
* Incorrect use can potentially destroy your home directory

So, I wrote a little utility called "SDF: Sane dotfile manager" that makes the technique used by SneakyCobra approachable to a complete novice and hence more reliable to use.

You can find the introductory text here (https://shreyanshja.in/blog/sane-dotfiles-management/) and the source code here (https://github.com/shreyanshk/sdf).

Let me know how it works for you. :-)


Seems much more complicated compared to:

    git clone https://github.com/my/dotfiles.git
    cd dotfiles
    # this is little more than `find . -maxdepth 1 -exec ln -sf {} ~/ \;`
    ./install
How are others here managing their dotfiles?


I use stow. But honestly, I don't think it's better than git.


Same - git repository (I presume), and GNU stow. Stow is a good way to maintain symlinks, but I see the value of not having that additional dependency. The OP's method is nifty, I might try it out.


It's orthogonal, isn't it? You keep your dotfiles in a repo, then have stow "install" the checkout/clone thereof.

EDIT: Also on stow - I tried it a couple times and it never seemed flexible enough. Can it handle things like keeping ~/.config/nvim and ~/.vim in the same stow directory? (And for that matter, can it link them together? I like having the same config for vim and neovim)


Re: your edit, yeah it can! Set up a folder, say 'vim' for your example, and then inside of that folder you have a '.vim' directory and a '.config/nvim' directory.

Then when you run `stow vim` from the parent of the original 'vim' folder it will symlink everything in there to ~.

You can even have more folders that have a '.config/foobar' inside of them, and when you stow those it will all work itself out nicely!

Edit: You can find examples of this in my dotfiles: https://github.com/AnthonyWharton/dotfiles/ (I use stow, if you didn't guess :P). For example, look at the 'i3' folder and the 'polybar' folder; they both add things to '~/.config'.


I just have both directories in my dotfiles repo and have the vim one symlinked to the nvim one. Admittedly I haven't fully explored the capabilities of stow, there might be a better way.


Are you using stow for some specific reason? Or just trying it out?


Inertia


I do almost exactly the same as you; my install.sh is a glorified wrapper around `ln -s`, but for each file, it verifies whether the file is already symlinked and if not renames the original to something like `.foo.bak.$(date -I)`. This is probably overkill, but it was especially nice when I was just starting to version control my dotfiles and still found unmanaged files sometimes that contained things worth saving.


I do the same! https://github.com/mikew/dotfiles/blob/master/install.sh#L26...

Although yours is more sound, I'm just moving files to `*.old`.



Can you share?


Sure! https://gitlab.com/snippets/1844438

EDIT: Some quirks to note, especially if you want to steal this script:

1. I use ~/.local/etc as my dotfile directory.

2. I support multiple shells but have all of them use ~/.profile rather than shell-specific files (most config overlaps, and there's a case stmt that deals with per-shell settings).

3. The vim/neovim bit at the bottom will fail silently if the directory already exists; thankfully this is rare, but it should be fixed some time.


I use a combination of stow and git via a script I wrote that I call stash - https://github.com/scotte/stash - it's very basic and simple, but has served me well.


I'm using this approach, but I'm still looking for a good way to manage dotfiles for multiple machines. Having separate branches feels clunky, since there is a lot of overlap and a tweak may involve making the same change on several branches. Any recommendations for managing this situation?


I have a low-key solution. I check in files named by the hostname. For example `.bashrc.[hostname]`. Then I have a quick conditional in all my .bashrc files that checks for a hostname-specific file. This way, I commit them all, but only the relevant ones get loaded.


Surprised nobody has mentioned YADM - it can do per-device files and/or per-device templating too (jinja2 syntax). It's just a thin wrapper around git so you can use any git commands too.

http://yadm.io


There's nothing really to gain from using git directly. Yadm is awesome.

I put up my dotfiles here: https://github.com/thingfox/dotfiles (sample documentation repository with examples). In it I show how I did the templating for ssh hosts amongst other configs.

This allows me to remove the most sensitive data and use the https://yadm.io/docs/encryption option for that.

The https://yadm.io/docs/bootstrap feature is also awesome as is https://yadm.io/docs/alternates


The reason I want to be able to have different files for different machines is to make slight variations to some of my dotfiles. I used to use branches, but it was too much error-prone work keeping all my branches up to date. I switched to a system where I template my dotfiles, but now I have to expand those templates for them to actually work. I do have control over when to expand the templates and how to install them. There are a bunch of different ways to do this depending on what you want, but what I ended up doing was:

1. Template files using a syntax that was easy find / replace using a regex. You could use an existing one if you like.

2. Generate a bash install script with all the file variants embedded as base64 strings. I can build this script locally, but I also have a travis ci build that pushes up the install.sh script as a gh-pages like branch.

3. I can now curl the install.sh script from any machine I want and bootstrap my dotfiles. The only install time dependencies are bash, curl, git, base64, mkdir, and echo so it's a very portable self-contained script.

4. During install time, I use a case on hostname to determine which files to use and I use git to put them into my $HOME directory using a similar strategy described by the article.

github: https://github.com/djblue/dotfiles install.sh: https://git.io/vxQ4g
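As a rough illustration of step 2, generating an installer with a file embedded as base64 might look something like this. The file name, hostname, and contents here are hypothetical stand-ins, not the author's actual setup:

```shell
set -eu
demo=$(mktemp -d)
cd "$demo"

# "bashrc.laptop" stands in for a real templated per-host variant:
printf 'export EDITOR=vim\n' > bashrc.laptop
encoded=$(base64 < bashrc.laptop)

# Emit a self-contained installer; only sh/base64 needed at install time.
# The \$ escapes keep hostname/HOME expansion for install time, while
# $encoded is expanded now, baking the file contents into the script.
cat > install.sh <<EOF
#!/bin/sh
case "\$(hostname)" in
  laptop)
    echo "$encoded" | base64 -d > "\$HOME/.bashrc"
    ;;
esac
EOF
chmod +x install.sh
```

The generated install.sh can then be published (e.g. to a gh-pages-style branch) and curled from any new machine.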


I template trees using a python script, for both ~ and /. The template language isn't even currently that complex; basic per-host conditionals suffice. The host pushing the config does the templating, serializes it, and shoves it over ssh to the receiver. This way I can do things like leave passwords in config files (e.g. mpd.conf) and not have them end up on, say, a VPS. Another example is having helpful comments in authorized_keys to say where a key is from without that information ending up on the hosts themselves.

The receiving host runs python, so it can do things like refuse to overwrite files that have been changed locally. I still need to add a notion of hooks to run on the receiver when a given file is changed. If the remote dependency on python/ssh becomes a problem, I will simply add an option to dump a tarball locally.

I really tried to use ansible et al, but those tools seem to be geared towards managing large groups of essentially identical hosts, rather than generally differing hosts with some commonality.


I use https://github.com/thoughtbot/rcm

it's great. It has tags to only pull up specific dotfiles (say for emacs, .config etc), and supports configurations for multiple hosts and multiple source folders.


This is overkill, but I have a DAG of profiles. Each profile can refer to one or more parent profiles. When I produce a config for a particular profile, a small Python script applies the profiles starting from the root node(s).

To avoid trashing my home directory, this actually is done to the side and committed into a bare git repository (this part is similar to the article). Afterwards, I use `git --git-dir=... --work-tree=~ checkout -p` to apply any changes one-by-one, allowing me to preserve any local edits I may have made.

All this is available as a sparsely documented Python package: https://github.com/frutiger/stratum


Check $uname and other variables with if statements to activate aliases depending on OS, username etc.
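That gate might look something like this in a shared ~/.bashrc (the alias choices are just illustrative):

```shell
# Pick per-OS aliases based on uname output:
case "$(uname -s)" in
  Darwin) alias ls='ls -G' ;;           # BSD/macOS ls coloring
  Linux)  alias ls='ls --color=auto' ;; # GNU ls coloring
esac
```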

I also keep a barebones “core” of aliases/functions that i use everywhere (e.g. on linux servers as well as my current macos laptop). And then a file that contains the non-core stuff that only get used on my dev environment (MacBook) but not on servers.

It’d help if you provided examples of what differences you have between machines though. Most should be pretty simple e.g. slightly different dir structure, different package managers etc.


I ended up putting everything common into the master branch, and keeping only the varying parts (not the common parts) in machine-specific branches. These are normally "include files" that end up in ~/.profile.d/, ~/.emacs.d/, etc.

I have to check out both the main branch and the machine-specific branch in separate directories, and use symlinks. OK by me, though; I don't set up a new workstation often.


Why do you want different configs on different machines? I think the "easy" solution is "don't do that";) But of course that's useless advice if you actually have some usecase, so a suggestion: Have case/if blocks on $(hostname), or even do something like `test -f ~/dotfiles/bashrc.$(hostname).local && source ~/dotfiles/bashrc.$(hostname).local`


As others have mentioned, sourcing files based on hostname / uname might help:

    try-source () {
      if [ -f "$1" ]; then
        source "$1"
      fi
    }
    
    try-source "$HOME/.localrc.$(uname -s)"
    try-source "$HOME/.localrc.$HOSTNAME"
    try-source "$HOME/.localrc.$(uname -s).$HOSTNAME"


I just use vcsh combined with (optionally) myrepos to handle this.

https://github.com/RichiH/vcsh

https://myrepos.branchable.com/


I use exactly the same approach, but for convenience I use the following script as a wrapper for git: https://gist.github.com/hectorm/8891f7f1916a4379f84dfaccc6fa...


The problem with keeping all dotfiles in a single repo is that if you want to get an older version of one particular dotfile, you'll also be getting older versions of other dotfiles as well.

I want every dotfile I use to be independent of the rest and a log that shows changes to just that one dotfile, so I store each of them in separate repos and use GNU Stow[1] to manage them.

The above is actually a bit of an oversimplification of what I do, as I store related dotfiles in a single repo as well, so that (for example) all my weechat dotfiles are in a single repo, as I rarely want to checkout a single file independently of the rest there.

[1] = https://www.gnu.org/software/stow/


> you want to get an older version of one particular dotfile, you'll also be getting older versions of other dotfiles as well.

That doesn't really follow; it is easy to check out old versions of a single file with git:

    git checkout <revision> <filename>
for ".bashrc from two commits ago on the current branch", that would be:

    git checkout HEAD~2 .bashrc
Hopefully I'm not just misunderstanding your point here.


Or you could learn how git already does those things


Git can fetch a single file as of any commit with 'git show' and show the history of any single file with 'git log' or 'git diff'. Any decent web/GUI tool will do these things too.

A Git repo per small text file seems like overkill to me.
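Concretely, the per-file operations mentioned here look like the following. The toy repo setup at the top is just for the demo:

```shell
# Toy repo with three commits touching .bashrc (demo setup only):
demo=$(mktemp -d)
cd "$demo"
git init -q
for i in 1 2 3; do
  echo "alias v$i" > .bashrc
  git add .bashrc
  git -c user.name=d -c user.email=d@example.com commit -qm "change $i"
done

# The per-file operations, no per-file repos needed:
git show HEAD~2:.bashrc          # prints "alias v1": content two commits ago
git log --oneline -- .bashrc     # commits that touched just this file
git diff HEAD~2 -- .bashrc       # how it changed since then
```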


This sounds like a somewhat uncommon use case to me, but wouldn't a simple git per-file checkout work here?

Something like:

  git checkout c3f2ab -- configs/.vimrc


You need just 3 lines to start storing your dotfiles in git:

    cd ~
    git init
    echo "*" >.git/info/exclude
Enjoy!


Could you explain how this works/is meant to work?


You basically make your home dir a git repo and by default ignore everything inside it. When you want to store a dotfile in that repo, you have to force add it (`git add -f`) and it will be tracked.

You can learn more about how git ignores things here: https://git-scm.com/docs/gitignore
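A quick demonstration of that behavior, with a temp dir standing in for your home directory:

```shell
demo=$(mktemp -d)
cd "$demo"
git init -q
echo "*" > .git/info/exclude     # ignore everything by default
touch .bashrc .vimrc
git status --short               # prints nothing: all files excluded
git add -f .vimrc                # force-add to start tracking it
git status --short               # now shows "A  .vimrc"
```

Unlike a committed .gitignore, .git/info/exclude stays local to the clone, so the ignore-everything rule itself never shows up as a tracked file.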


This could lead to a potential security issue. Imagine there's a misconfiguration in your dotfiles -- now it's public.


This is why I recommend not committing actual dotfiles but committing templates instead.

Yadm, a thin wrapper around git, allows for alternate files, encryption, and templating. See my post https://news.ycombinator.com/item?id=19594859

https://github.com/thingfox/dotfiles

https://yadm.io/docs/encryption

https://yadm.io/docs/bootstrap

https://yadm.io/docs/alternates


Who says your git repo has to be public? Use a GitHub private repo, host the repo yourself behind SSH on a $5 Digital Ocean droplet, use the free private repos that come with Gitlab.com... securing git repositories is a solved problem.


There are a lot of dotfiles repos on GitHub and it doesn't seem to be a problem (except if you check in private credentials, but that's not a problem unique to dotfiles).

If you rely on your configuration to be secret to be secure it's just security by obscurity and not worth much anyway.


> If you rely on your configuration to be secret to be secure it's just security by obscurity and not worth much anyway.

What I had in mind is that the average person wouldn't be a target, but publicly declaring their security vulnerability would attract attacks they wouldn't receive otherwise.


Only key harvester bots on GitHub/etc would notice. If you’re being consciously attacked by someone then you have bigger problems to deal with.

The solution isn't particularly hard either: simply source a secrets file and make sure you add it to the gitignore file.
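That is, something like this in a tracked rc file (".secrets" is a hypothetical file name; the real point is that whatever you pick goes in .gitignore):

```shell
# In ~/.bashrc (committed): pull in credentials from an untracked file.
secrets="$HOME/.secrets"          # listed in .gitignore, never committed
if [ -f "$secrets" ]; then
  . "$secrets"                    # silently skipped if the file is absent
fi
```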



