graveyard_bloom@alien.top to Self-Hosted Main@selfhosted.forum • "Ollama - super easy to host local LLM" • 1 year ago
Ollama is pretty sweet. I'm self-hosting it with 3B models on an old X79 server, and I wrote a neat terminal AI client, "Jeeves Assistant", that sends requests to it over the local network.
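For anyone curious how a client like that talks to Ollama: the server listens on port 11434 by default and exposes a `/api/generate` endpoint. Below is a minimal sketch of a terminal client (not the actual Jeeves Assistant code, which isn't shown in the post); the model tag `llama3.2:3b` and the `ask`/`build_payload` helper names are my own assumptions for illustration.

```python
import json
import urllib.request

# Default Ollama endpoint; swap localhost for the server's LAN address.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="llama3.2:3b"):
    # Non-streaming request body for Ollama's /api/generate endpoint.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(prompt, model="llama3.2:3b"):
    # POST the prompt and return the model's full response text.
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Tiny REPL-style loop: type a prompt, print the answer.
    print(ask(input("jeeves> ")))
```

With `stream` set to `True` instead, Ollama returns one JSON object per token, which is what you'd read line by line to print a live-typing effect in the terminal.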