I also have multiple versions of my bash_profile with syntax specific to the OS. It checks whether we're on macOS or Linux with a kernel check and then reads the appropriate ancillary bash_profile for that platform. Anything that can live in the main bash_profile with the same command on both platforms lives there, and anything that needs to be system-specific is in the other one.
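A minimal sketch of that kernel check, assuming the ancillary files are named `~/.bash_profile.macos` and `~/.bash_profile.linux` (those names are my invention, not the poster's):

```shell
# Pick the platform-specific profile based on the kernel name.
# File names below are assumptions for illustration.
case "$(uname -s)" in
    Darwin) os_profile="$HOME/.bash_profile.macos" ;;
    Linux)  os_profile="$HOME/.bash_profile.linux" ;;
esac

# Source it only if it actually exists and is readable.
if [ -r "$os_profile" ]; then
    . "$os_profile"
fi
```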
I have all my important functions as individual files that get loaded with the following:
function loadfuncs() {
    local funcdir="$HOME/.dotfiles/functions/"
    # Finder on macOS sometimes drops a .DS_Store here; remove it so it isn't sourced
    [ -e "${funcdir}.DS_Store" ] && rm "${funcdir}.DS_Store"
    local n=0
    for i in "${funcdir}"*; do
        source "$(realpath "$i")"
        n=$(( n + 1 ))
    done
}
loadfuncs
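For illustration, a file in that functions directory might look like the following (the file name and function are hypothetical examples, not from the post); loadfuncs simply sources every file it finds, so each function becomes available in the shell:

```shell
# Hypothetical contents of ~/.dotfiles/functions/mkcd
# One function per file; sourcing the file defines the function.
mkcd() {
    mkdir -p "$1" && cd "$1"
}
```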
Interesting way to go about it. Though when I'm at the point where I need differences between linux and darwin, I'm probably going to do that at the home-manager level.
I'm surprised no one mentioned ansible yet. It's meant for this (and more).
By SSH keys I assume you're talking about authorized_keys, not private keys. I agree with other posters that private keys should not be synced; just generate new ones and add them to the relevant servers' authorized_keys with Ansible.
Passwords will be brute-forced if the attack can be done offline.
Private SSH keys should never leave a machine. If a key gets compromised without you knowing, then in the worst case you will still revoke its access once the machine's lifespan is over. If you copy one key around, it may get compromised on any of the systems, and you will never revoke the access it has.
And you may not want to give all systems the same access everywhere. With one key per machine, you can have more granularity for access.
I use Ansible for this as well. It's great. I encrypt secrets with Ansible vault and then use it to set keys, permissions, config files, etc. across my various workstations. Makes setup and troubleshooting a breeze.
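As a sketch of that workflow (the host group, user name, and file paths below are made-up examples, not the poster's actual setup), a playbook might distribute a public key and a config file like this:

```yaml
# Illustrative playbook; "workstations", "alice", and the paths are assumptions.
- hosts: workstations
  tasks:
    - name: Install my public key
      ansible.posix.authorized_key:
        user: alice
        state: present
        key: "{{ lookup('file', 'files/id_ed25519.pub') }}"

    - name: Deploy bashrc
      ansible.builtin.copy:
        src: files/bashrc
        dest: /home/alice/.bashrc
        owner: alice
        mode: "0644"
```

Secrets referenced by such a playbook can live encrypted in the repo via ansible-vault, as described above.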
On my devices like PCs, laptops or phones, syncthing syncs all my .rc files, configs, keys, etc.
For things like servers, routers, etc., I rely on OpenSSH's ability to send environment variables over, which I use to send my aliases and functions.
On the remote I have [ -n "$SSH_CONNECTION" ] && eval "$(echo "$LC_RC" | { { base64 -d || openssl base64 -d; } | gzip -d; } 2>/dev/null)"
in whatever is loaded when I connect (.bashrc, usually)
On the local machine: alias ssh="$([ -z "$SSH_CONNECTION" ] && echo 'LC_RC=$(gzip < ~/.rc | base64 -w 0)') ssh"
That's not the best way to do that by any means (it doesn't work with dropbear, for example), but for cases like that I have other non-generic, one-off solutions.
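The encode/decode pair at the heart of that trick can be sketched in isolation. This round-trips a sample file through gzip and base64 the same way the LC_RC variable does (the sample file and its contents are made up; note that GNU base64 uses -d to decode, while macOS spells it differently, which is why the original falls back to openssl):

```shell
# Local side: compress and base64-encode the rc file into a variable.
# tr -d '\n' stands in for GNU base64's -w 0 for portability.
tmpf=$(mktemp)
printf 'alias ll="ls -l"\n' > "$tmpf"
LC_RC=$(gzip < "$tmpf" | base64 | tr -d '\n')

# Remote side: decode and decompress; eval-ing this would define the aliases.
decoded=$(echo "$LC_RC" | base64 -d | gzip -d)
echo "$decoded"
```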
I love this solution; I've been using it for years. I had previously just been using the home-directory-as-a-git-repo approach, and it never quite felt natural to me and came with quite a few annoyances. Adding stow to the mix was exactly what I needed.
It's simply the best; it does all the annoying background things like the web UI, machines, versioning, verifying, etc. If you disable global discovery you can use it through LAN only.
That's my solution. I have a 'Sync' folder in every device's home folder, and then I use some aliases to decide whether to grab the bash_aliases file or replace it:
alias dba='diff -s ~/.bash_aliases ~/Sync/.bash_aliases' # compare files
alias s2ba='cp ~/Sync/.bash_aliases ~/' # Push from Sync folder to current bash aliases
alias ba2s='cp ~/.bash_aliases ~/Sync/' # Push from current bash aliases to Sync folder
By far, the diff alias is the most used. It allows for a quick check on what is different between files w/o having to open them up
I use a git repo combined with the basic install utility. Clone the repo, run the app installer, then run the install script. For symlinks I just use a zsh script.
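A minimal version of such a symlink script might look like the following (the repo path and the file list are assumptions for illustration, not the poster's actual script):

```shell
#!/usr/bin/env zsh
# Hypothetical symlink step: link each tracked dotfile from the repo
# into a target directory. Repo location and file list are assumptions.
link_dotfiles() {
    local repo="$1" target="$2"
    local f
    for f in .zshrc .gitconfig .tmux.conf; do
        # -s symlink, -f replace existing, -n don't follow an existing link
        ln -sfn "$repo/$f" "$target/$f"
    done
}
# Usage: link_dotfiles "$HOME/dotfiles" "$HOME"
```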
I keep my dotfiles in a git repo and just do a git pull to update them. That could definitely be a cron job if you needed.
SSH keys are a little trickier. I'd like to tell you I have a unique key for each of my desktop machines, since that would be best practice, but that's not the case. Instead I have a Syncthing shared folder. When I get around to cleaning that up, I'll probably do just that and keep an authorized_keys and known_hosts file in git so I can pull them to the hosts that need them, with a cron job to keep them updated.
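The cron job could be as simple as a single crontab entry (the repo path and hourly schedule below are assumptions):

```
# crontab -e, then add a line like this to pull the repo hourly:
0 * * * * cd "$HOME/dotfiles" && git pull -q
```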
Use a git repo and the stow tool. For updating, you only need to run git pull (and stow if you create a config for a new piece of software). If you modify some config, just git add && git commit && git push.
This way, you also get a full change history of your config.
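For readers unfamiliar with stow: each program gets its own directory in the repo, mirroring the layout under $HOME, and `stow <package>` symlinks the contents into place. The package names and paths below are made-up examples; the ln commands show the symlinks the real invocation would create:

```shell
# Hypothetical layout:
#   ~/dotfiles/vim/.vimrc
#   ~/dotfiles/git/.gitconfig
repo=$(mktemp -d)            # stand-in for ~/dotfiles in this sketch
mkdir -p "$repo/vim" "$repo/git"
touch "$repo/vim/.vimrc" "$repo/git/.gitconfig"

# Real usage would be:  cd ~/dotfiles && stow vim git
# which creates symlinks equivalent to:
home=$(mktemp -d)            # stand-in for $HOME
ln -s "$repo/vim/.vimrc" "$home/.vimrc"
ln -s "$repo/git/.gitconfig" "$home/.gitconfig"
```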
1Password does this for me when it comes to SSH keys, and it's great. All I have to do on a new machine is set up the ssh-agent, which is also practically preconfigured. The actual key never leaves the password manager.
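That agent setup is typically a one-line addition to ~/.ssh/config pointing IdentityAgent at 1Password's agent socket. The path below is the macOS one; it differs per platform (Linux usually uses ~/.1password/agent.sock), so check your install:

```
# ~/.ssh/config — macOS socket path shown; adjust for your platform.
Host *
    IdentityAgent "~/Library/Group Containers/2BUA8C4S2C.com.1password/t/agent.sock"
```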