# Doing It Right: Passwords, Keys and Backups

Like most Computer Nerds, I know that:

* People should use a password manager
* People should use 2-factor auth whenever possible
* People should back up their keys and such
* People should have regular backups, ideally offsite, and test them

And, also like most Computer Nerds, none of those things were really as true as I wanted them to be for myself. I spent some time over the past couple of days fixing that and am pretty happy with where I landed.

## Passwords

This was basically a mess when I started. Over the years, I'd tried a few things, so I had passwords stored in:

* Dashlane (a password manager)
* Chrome autofill
* An existing pass(1) store
* My home directory as regular files, named after the site
* A notebook on my desk
* My brain

=> https://www.dashlane.com

And some passwords were also generated using an ad-hoc scheme I made up involving HMACs, with some but not all of the results duplicated in one or more of the above. It sucked.

I wanted to move all of them into one place, and specifically, I wanted that place to be:

* Available on all my devices (currently two Mac laptops, a Linux desktop, a Chromebook, and my Android phone)
* Synced between those devices, so I don't have to retype passwords on each device to move them around
* Not dependent on a single centralized service
* Encrypted at rest, so that I can keep backups of my password store without worrying about encrypting those backups separately
* Stored in a way that I understand, so that if/when something inevitably goes wrong I have some chance of fixing it

There are a bunch of options that meet all these criteria, but I ended up settling on "passwordstore", aka pass(1), which is actually a shell script that wraps gpg and git.

=> https://passwordstore.org/

This is pleasingly debuggable (despite being a shell script) and basically does what I want, when combined with a suitable Android client. It uses gpg to encrypt the individual passwords in the store, and you can configure it to use git internally. I pretty much did this:

1. Generate a brand new gpg keypair, to be used to encrypt the password store[1]
2. Create a new git repo on sr.ht, my git host of choice
3. Set up a new password store on one of my machines, using that gpg keypair and that git repo
4. Import everything
5. `pass git push`
6. Copy[2] the gpg keypair to every other machine I want to have access to my passwords on
7. Create new password stores on those machines, tell them to use the sr.ht repo as their upstream, `pass git pull`
8. Voila, passwords!

Then I use `pass git push` and `pass git pull` to kick off syncing as needed. I could do that with cron, but I don't modify passwords often enough for that to be very useful.

The actual pass tool has a very simple interface; I usually use it like this:

    $ pass -c foo.com/email@host.com

and, after maybe unlocking the gpg key, the password ends up on my clipboard. Adding new ones is:

    $ pass insert foo.com/email@host.com   # add an existing one
    $ pass generate foo.com/email@host.com # make a new strong one

I haven't yet set up the Android client, but there is one that is pretty well regarded.

## Keys

This one was actually pretty straightforward, but annoying, because gnupg is annoying. I minted two brand new gpg keys - one for my password store specifically, and one for encrypted mail and such. That part's pretty easy, since I did not have a previous gpg key to worry about.
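For concreteness, minting one of these looks roughly like the following, assuming a 2.x gnupg (see the annoyances list below for why it's --full-generate-key rather than plain --gen-key):

    # interactive; lets you override the default key type, size, and expiry
    $ gpg --full-generate-key

    # find the new key's id, which is what e.g. `pass init` wants
    $ gpg --list-secret-keys --keyid-format long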
For ssh keys, like most people, I had a mix of keys, with different hosts I've had access to at different times all having their own authorized_keys files and such. What I ended up doing was generating a brand new ssh keypair for each machine I actually ssh from, then collecting all of their public keys into a new authorized_keys file, which I copied to every machine I could remember, overwriting the old authorized_keys. I then destroyed every other keypair I could find that *wasn't* one of the new keypairs, which will surely not come back to bite me later[3]. I also made one backup keypair, which isn't on any of my machines but is included in my offsite backups - see below about those.

This section would not be complete without a list of gpg annoyances, so here it is:

* gpg believes "real names" are >= 5 characters; you need to pass --allow-freeform-uid to disable this restriction
* gpg defaults to making 3072-bit RSA keys with 2-year expiry, and if you want to change those behaviors you need to pass --full-generate-key
* merely importing a gpg key is not enough; you also need to --edit-key it to mark it as trusted, or gpg won't use it as a recipient
* my elderly macbook appears to have gpg 1.4.19 for no obvious reason[4]
* gpg is very hard to compile from source on macos without using homebrew

## Backups

This is the one that was the most fiddly to set up. It required less manual data entry than passwords, but a lot more physical labor. My goals with backups are to protect against these things, in decreasing order of likelihood:

1. Me making dumb mistakes with computers
2. Individual disk or machine failures
3. My house burning down / getting flooded / etc
4. Some kind of malware attack affecting all my machines

It's pretty easy to protect against (1) and (2) by keeping copies of my data on external disks, and that's what I was doing before. Protecting against (3) and (4) is a bit trickier, and I ended up settling on a 3-layer approach:

### Layer 1: Local Copies

I already had some existing scripts for this (see a previous post) and I can just keep using those. Basically, I plug an external disk in every so often, run the backup script, and it emits a tarball onto the external disk. If the disk fills up, I delete some older tarballs. Easy life!

=> https://elly.town/d/blog/2022-01-18-backups.txt

### Layer 2: Tarsnap

Tarsnap is basically "encrypted tarball backups in The Cloud", and it provides pretty cheap off-site backups for me. I have about 30 GiB of data, and almost none of it ever changes, so the incremental cost of taking new tarsnap backups is just about zero.

=> https://tarsnap.com

To set up tarsnap, I made an account, put some money into it, then installed the tarsnap client on each machine[5], set up keys for it, excluded some things I know I don't want to back up (like my Steam game library), and let it run. It takes a good while to do its initial backup, but subsequent incremental backups are nice and fast.

On every machine I have a script that runs tarsnap backups of the important data on that machine. I currently run it every so often by hand, but I could easily configure it to run from cron instead.

One kind of neat tarsnap design feature that I didn't make use of (yet): each machine's key file actually contains three separate keys - a read key, a write key, and a delete key. You could, in theory, scrub the delete key off a machine if you wanted to, thus preventing someone who compromises that machine from deleting its backups.
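Doing that would look roughly like this (the paths, user, and machine name are placeholders, not what I actually use):

    # one-time setup: register this machine and generate its full key
    $ tarsnap-keygen --keyfile /root/tarsnap.key \
          --user me@example.com --machine myhost

    # derive a key file that can make and read backups, but not delete them
    $ tarsnap-keymgmt --outkeyfile /root/tarsnap-rw.key -r -w /root/tarsnap.key

    # then stash the full key somewhere safe and keep only the
    # restricted one on the machine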
I'm not so worried about that, so I didn't bother doing this, but tarsnap does support it. Once I had provisioned the tarsnap keys, I needed to make sure I don't lose those either, which leads to...

### Layer 3: Hard Backups

I have a safe deposit box at a local bank, which is both very physically secure and well protected against fire, floods, etc. It is also a huge pain in the butt to actually access (which is by design), so it's not a good place to store regular backup images.

What I ended up doing here is buying a handful of USB sticks. Each of those sticks holds a "hard backup", which is a tree of files like this:

    /aegis                  # backup of my phone's Aegis 2FA state
        /aegis.json.gpg
    /backups
        /$HOST-$DATE.tar.gz.gpg
    /gpg
        /main-pub.gpg
        /main-priv.gpg
        /pass-pub.gpg
        /pass-priv.gpg
    /README.txt
    /recovery-codes.txt.gpg
    /pass                   [git clone of my password store]
    /ssh
        /stick.gpg
        /stick.pub
    /tarsnap
        /$HOST-tarsnap.key.gpg

Everything except the README, the gpg keypairs themselves, and the ssh public key is encrypted with gpg to the key stored as /gpg/main-{pub,priv}.gpg, while the password store git clone is encrypted to the dedicated password store keypair, stored as /gpg/pass-{pub,priv}.gpg. The two gpg keypairs are themselves protected with very strong passphrases[6] which I don't use for anything else.

The README file contains some contact info for me and offers a reward for returning the stick if found, in case I happen to lose one, but since the data is encrypted it doesn't much matter from a security perspective either way.

A shell script generates/updates these USB sticks, and my plan is that every couple of months I'll generate a new one, then visit my safe deposit box and swap the new stick for the oldest one currently in the box, so I can wipe that one for reuse. These are definitely my backups of last resort, since they are annoying to get to and very likely to be pretty out of date when I use them, but at least they exist.

## Testing Backups

Of course, it isn't really the backups that matter - it's the restores. For local backup images it's very easy to test them by plugging them in and checking whether some sample of the data is still there. For tarsnap it's almost as easy, since one can do the normal tar operations (list files, or extract individual ones) and spot-check that stuff's there.

For the hard backups it's more interesting, since for those the scenario where I'd use them is a restore from nothing. To test those, the approach I've settled on (which I haven't used yet!) is to spin up a clean Linux image in a VM on one of my machines and try to ensure that I can get back access to, like, my website, my bank, Slack, etc. That should look like:

* Import both gpg keypairs from the stick image
* Decrypt the ssh keypair and check that I can ssh somewhere
* Decrypt the tarsnap machine keys and check that they all work
* Decrypt the various 2fa recovery codes and check that they are legible
* Check that I can retrieve my test password and that it is correct[7]
* Check that I can decrypt the various backup tarballs and that they contain files

It would be nice if I could automate that testing, but at the moment I'm not sure how, so I have to do this by hand.
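If I do eventually automate it, it will probably start life as something like this sketch (the mount point and the test entry's name here are made up):

    #!/bin/sh -e
    # run inside a throwaway VM, with a hard-backup stick at /mnt/stick
    STICK=/mnt/stick

    # import both keypairs; gpg will prompt for their passphrases
    gpg --import "$STICK"/gpg/main-priv.gpg "$STICK"/gpg/pass-priv.gpg

    # spot-check that every backup tarball decrypts and lists cleanly
    for f in "$STICK"/backups/*.tar.gz.gpg; do
        gpg --decrypt "$f" | tar tzf - >/dev/null && echo "ok: $f"
    done

    # check the password store clone against the known test entry
    PASSWORD_STORE_DIR="$STICK"/pass pass show foo.com/test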
## Future Stuff

I'd like to move my email off gmail, probably to something like mxroute, so that I can send and receive gpg-encrypted email without dying. I would also like to look at hacking pass(1) to use age instead of gpg, but I'm a bit nervous about moving away from the standard tool, even though it sucks.

Also, I need to figure out what to do about my phone. I'm using Aegis for 2FA, but I don't yet have access to my password store on my phone, and I'm not sure whether I want to. We'll see.

As always, thanks for reading, and feel free to email me with thoughts :D

[1]: pass encrypts the password data, but *not* the identifiers, so the password store leaks your user@host pairs. Not a huge deal for my use case, since the repo on sr.ht is itself access-controlled; to get the repo, one would have to compromise sr.ht or one of my machines.
[2]: surprisingly annoying!
[3]: because it almost certainly will, I kept a copy of all the old keypairs.
[4]: I eventually figured out why this is and fixed it, but it's too tedious to relate here.
[5]: Not my work machines (one of the macbooks & the chromebook), since a) they have their own backup machinery already and b) secops would have words with me about using an external backup system.
[6]: It is possible, with a bit of practice, to memorize 128- or 256-bit random passphrases. That's what I did for these.
[7]: I inserted a test password into pass(1) while I was setting it up and just left it there, and now I can use it as a consistency check on the password store. Yay :)