nomad fmt was applied already - granted, it is not a small, easy-to-read job file; it might be easier to split it up into separate jobs.
I will look into making this into a Pack - I have never built one because I have never shared my config like this before. I don't know how popular they are among selfhosters either!
I think an easy first step would be to contribute a sample job file like this to the Lemmy docs website. Then people can adapt it to their setups. I find there is a lot more to configure in Nomad than in Docker Compose, for example, because you stop assuming everything will be on a single box, which changes networking considerably (see the sketch below). There is also the question of whether to use Consul, Vault, etc.
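To make the networking point concrete, here is a stripped-down sketch of roughly what one group could look like - not my actual job file; the datacenter name, service name, and image tag are placeholders you would adapt:

```hcl
# Minimal sketch only, to illustrate the networking/service-discovery side of Nomad.
job "lemmy" {
  datacenters = ["dc1"]

  group "backend" {
    network {
      # Map a dynamic host port to lemmy_server's default port inside the container.
      port "api" {
        to = 8536
      }
    }

    # Registering a service (e.g. in Consul) is how other groups and jobs find
    # this one, instead of relying on a single shared docker-compose network.
    service {
      name = "lemmy-backend"
      port = "api"
    }

    task "lemmy" {
      driver = "docker"

      config {
        image = "dessalines/lemmy:0.19.3" # pin whatever version you actually run
        ports = ["api"]
      }
    }
  }
}
```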
Agreed as a first step. Pack is relatively new and not popular currently because there isn't a great "marketed" repo, so to speak. Hopefully that'll change now that it's on the Nomad website.
Personally I think Lemmy instance admins could benefit a lot from the scaling capabilities of Nomad. Hopefully it keeps growing in popularity.
I'm also using Nomad to run Lemmy, glad to see someone else is too! I did create separate jobs for each component though, and am using Traefik instead of nginx.
What are you using for storage in your Nomad cluster?
Yep, I am using Traefik -> nginx. I simply add the Traefik tags to the nginx service. I didn't include that in the example file to keep it simple.
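Roughly, it is just tags like these on the nginx service - the router name, hostname, entrypoint, and cert resolver here are examples, so use whatever your Traefik setup expects:

```hcl
service {
  name = "lemmy-nginx"
  port = "http"

  tags = [
    "traefik.enable=true",
    "traefik.http.routers.lemmy.rule=Host(`lemmy.example.com`)",
    "traefik.http.routers.lemmy.entrypoints=websecure",
    "traefik.http.routers.lemmy.tls.certresolver=letsencrypt",
  ]
}
```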
As for the storage, I use SeaweedFS (it has a CSI plugin, really cool, works well with Nomad), but as a CSI volume it's not suitable for backing Postgres' filesystem: the lookups are so noticeably slower that your Lemmy instance will be laggy. So I decided to use a normal host volume, so the DB writes to disk directly, and you can back that up to S3-compatible storage with this (also cool). Could be SeaweedFS, AWS, Backblaze...
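For reference, a host volume is declared on the Nomad client and then claimed by the job - something along these lines, with paths, names, and the image tag as placeholders:

```hcl
# Nomad agent config on the client that should run Postgres (not the job file):
client {
  host_volume "postgres-data" {
    path      = "/opt/postgres-data"
    read_only = false
  }
}
```

```hcl
# In the job file, the group claims the host volume and mounts it into the task:
group "postgres" {
  volume "postgres-data" {
    type      = "host"
    source    = "postgres-data"
    read_only = false
  }

  task "postgres" {
    driver = "docker"

    volume_mount {
      volume      = "postgres-data"
      destination = "/var/lib/postgresql/data"
    }

    config {
      image = "postgres:16" # use whichever version your Lemmy release expects
    }
  }
}
```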
I think SeaweedFS is suitable for your pictrs storage though, be it through its S3 API (supported by pictrs) or through a SeaweedFS CSI volume that stores the files directly.
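If you go the CSI route for pict-rs, the volume stanza looks something like this - assuming the volume was already created/registered with the SeaweedFS CSI plugin (`nomad volume create` / `nomad volume register`); names and image tag are placeholders:

```hcl
group "pictrs" {
  volume "pictrs-data" {
    type            = "csi"
    source          = "pictrs-data" # ID of the volume registered with the CSI plugin
    attachment_mode = "file-system"
    access_mode     = "single-node-writer"
  }

  task "pictrs" {
    driver = "docker"

    volume_mount {
      volume      = "pictrs-data"
      destination = "/mnt" # pict-rs keeps its files under /mnt by default
    }

    config {
      image = "asonix/pictrs:0.4"
    }
  }
}
```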
I hope that answers it! Do let me know what you end up with.