tehnomad@alien.top to Self-Hosted Main@selfhosted.forum • Best GPUs for self-hosted AI?
1 year ago
The best consumer NVIDIA card is the 3090 Ti because of its 24GB of VRAM, which lets you run larger LLMs. I have a 3060 12GB, which works pretty well with 7B and 13B models.
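As a rough sizing rule, the VRAM needed just to hold a model's weights is parameter count times bytes per parameter. Here's a minimal sketch (the helper name is mine, and real usage needs extra headroom for the KV cache and activations):

```python
# Rough VRAM estimate for LLM weights: parameters * bytes per parameter.
# Illustrative helper only; real inference adds KV-cache and activation
# overhead on top of the weights.
def weight_vram_gb(params_billions: float, bits_per_param: int) -> float:
    """Return approximate VRAM (in GB) needed just to hold the weights."""
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

# A 13B model in fp16 needs ~26 GB of weights (too big for a 12GB card),
# but 4-bit quantization brings that down to ~6.5 GB.
print(weight_vram_gb(13, 16))  # 26.0
print(weight_vram_gb(13, 4))   # 6.5
```

This is why quantized 7B and 13B models fit comfortably on a 12GB card while a 24GB card opens up bigger models.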
I use a ZFS pool for my data and a combination of LXCs, plus Docker inside an LXC, to run my services. Proxmox is flexible enough that you can get pretty much any configuration to work. I even have my Intel iGPU passed through to my Docker LXC so my Jellyfin container can use it. Caddy and Authelia run in one LXC for reverse proxying and authentication, and Caddy can point to the Docker LXC by its IP address. I use bind mounts to expose folders on my ZFS pool to the LXCs and Docker.
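For the bind mounts, a Proxmox LXC mount point is a single line in the container's config (the container ID and paths below are hypothetical, not from the post):

```
# /etc/pve/lxc/101.conf -- container ID 101 and paths are examples.
# Bind-mount a ZFS dataset from the host into the LXC at /mnt/media:
mp0: /tank/media,mp=/mnt/media
```

The same line can be added from the host shell with `pct set 101 -mp0 /tank/media,mp=/mnt/media` instead of editing the file by hand.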
One advantage of running Caddy in Docker is that you can use the caddy-docker-proxy module to generate a Caddyfile automatically from your containers' Docker labels.
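A sketch of what those labels look like in a compose file (the service and domain are hypothetical; caddy-docker-proxy turns them into a `reverse_proxy` site block):

```yaml
# docker-compose.yml sketch -- service name and domain are examples.
# caddy-docker-proxy reads these labels and generates roughly:
#   jellyfin.example.com { reverse_proxy jellyfin:8096 }
services:
  jellyfin:
    image: jellyfin/jellyfin
    labels:
      caddy: jellyfin.example.com
      caddy.reverse_proxy: "{{upstreams 8096}}"
```

The `{{upstreams 8096}}` placeholder resolves to the container's internal IP and port, so you never hard-code addresses in the Caddyfile.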
I started my ZFS pool from scratch with new hard drives. If you want to reuse your existing drives without wiping your data, you may want to look into MergerFS.
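MergerFS pools already-formatted drives into one mount without touching the data on them. A minimal fstab sketch (all paths and options here are illustrative assumptions):

```
# /etc/fstab sketch -- paths are examples. Pool two existing drives
# into one union mount at /mnt/storage without reformatting them.
/mnt/disk1:/mnt/disk2  /mnt/storage  fuse.mergerfs  defaults,allow_other,category.create=mfs  0 0
```

Unlike ZFS, this gives you no redundancy by itself; people commonly pair MergerFS with SnapRAID for parity.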