Unison might be worth a look; it provides bidirectional merging and command-line operation. It’s what I’d use if I were mostly working with binary files and didn’t want a history.
Rsync, which someone else recommended, is really aimed at efficient unidirectional replication, not at keeping two directories in sync across machines that are both being changed and are only intermittently connected.
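For concreteness, the two invocations look roughly like this (the hostnames and paths are made up):

```sh
# Unison: reconciles changes in both directions, prompting on conflicts;
# -batch auto-applies the non-conflicting changes
unison -batch ~/music ssh://laptop//home/me/music

# Rsync: one-way mirror; anything changed on the far side gets overwritten
rsync -a --delete ~/music/ laptop:music/
```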
> config files
If it’s mostly text and you’re going to want to review changes, keep a history, and do a lot of merging, I’d use git and symlink the live files to point into the git repo. I have a custom helper script for this, but stuff like GNU Stow is aimed at exactly this, and I’d recommend that someone look at it before rolling their own. Here’s an example of someone using it with git in this role:
I agree with that guy about using bare git repos as the “master” copy, even if one of the machines in question also hosts the bare repos and you technically end up with some redundant data on it. It makes life easier; no machine is “special”.
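Concretely, the stow-plus-bare-repo setup might look something like this (the paths are hypothetical, and this is a sketch rather than a full dotfiles workflow):

```sh
# On the "master" host: a bare repo as the authoritative copy
git init --bare ~/repos/dotfiles.git

# On each machine, including the master host itself
# (remote machines would clone user@host:repos/dotfiles.git instead):
git clone ~/repos/dotfiles.git ~/dotfiles
mkdir -p ~/dotfiles/bash
mv ~/.bashrc ~/dotfiles/bash/.bashrc
cd ~/dotfiles && git add -A && git commit -m "track bashrc" && git push

# Stow replaces ~/.bashrc with a symlink into ~/dotfiles/bash/
stow -d ~/dotfiles -t ~ bash
```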
If I had both binary files that I wanted kept in sync without a history (say, a music collection) and text files that I did want a history for (say, my dotfiles), I’d use both.
Yes. I wouldn’t be preemptively worried about it, though.
Your scan is going to try to read (and maybe write) each sector and check whether the drive returns an error for that operation. In theory, the adapter could report a read or write error even though the operation actually succeeded, or it could return some kind of bogus data instead of an error.
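If the scan in question is something like badblocks, the probes it issues look like this (the device name is a placeholder; double-check it before running anything that writes):

```sh
# Read-only surface scan: read every block, report the ones that error out
# (-s shows progress, -v is verbose about errors)
sudo badblocks -sv /dev/sdX

# Non-destructive read-write test: read each block, write a test pattern,
# read it back, then restore the original contents
sudo badblocks -nsv /dev/sdX
```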
But I wouldn’t expect the adapter to actually misbehave like that, and I wouldn’t be particularly worried about the prospect. It’s sort of a “could my grocery store checkout counter person murder me” thing: theoretically yes, but I wouldn’t worry about it unless I had some specific reason to believe that was the case.