Uses for local AI?
I'm using Ollama on my server with the WebUI. It has no GPU, so replies aren't quick, but they're not too slow either.
I'm thinking about removing the VM since I just don't use it. Are there any good uses or integrations with other apps that might convince me to keep it?
55 comments
https://github.com/hendkai/paperless_sort_low_quality_ollama lets AI tag your Paperless-ngx files based on their content.
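The core idea is simple enough to sketch: send the document's text to a local Ollama instance and have the model pick tags from a fixed list. Here's a minimal sketch against Ollama's `/api/generate` endpoint, assuming Ollama on its default port 11434; the model name and tag list are illustrative placeholders, not taken from the linked repo:

```python
"""Sketch: tag a document's text with a local Ollama model.

Assumes Ollama runs on localhost:11434 (its default port).
The model name and ALLOWED_TAGS are illustrative placeholders.
"""
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint
ALLOWED_TAGS = ["invoice", "receipt", "contract", "correspondence"]  # example tags


def build_prompt(text: str) -> str:
    """Ask the model to pick matching tags from a fixed list."""
    return (
        "Choose the tags that match this document from this list: "
        + ", ".join(ALLOWED_TAGS)
        + ". Reply with a comma-separated list only.\n\nDocument:\n"
        + text
    )


def parse_tags(reply: str) -> list[str]:
    """Keep only tags we recognise, ignoring case and stray whitespace."""
    candidates = [t.strip().lower() for t in reply.split(",")]
    return [t for t in candidates if t in ALLOWED_TAGS]


def tag_document(text: str, model: str = "llama3") -> list[str]:
    """Send one non-streaming generate request to Ollama, parse the reply."""
    payload = json.dumps(
        {"model": model, "prompt": build_prompt(text), "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())["response"]
    return parse_tags(reply)
```

Restricting the model to a fixed tag list and filtering its reply keeps hallucinated tags out; the actual repo also talks to the Paperless-ngx REST API to write the tags back, which is omitted here.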