Jaff_Re@alien.top to Self-Hosted Main@selfhosted.forum • Best GPUs for self-hosted AI? • 1 year ago
The Tesla P40 is a good low-budget option: it has 24 GB of VRAM and CUDA cores. I've run 13B LLMs on a single card and it did well, and they're cheap enough that you can buy several if you have the slots.
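A rough back-of-the-envelope check on why 13B models fit in the P40's 24 GB: weight memory is roughly parameters × bits-per-parameter. This is a minimal sketch with approximate numbers; real usage adds KV cache and runtime overhead, and the 4.5-bit figure is an assumed average for common 4-bit quant formats.

```python
# Rough VRAM estimate for a 13B-parameter LLM at common precisions.
# Approximation only: ignores KV cache and framework overhead.

PARAMS = 13e9          # parameter count of a 13B model
GIB = 1024 ** 3        # bytes per GiB
P40_VRAM_GIB = 24      # Tesla P40 memory

def model_size_gib(bits_per_param: float) -> float:
    """Approximate weight memory in GiB at the given precision."""
    return PARAMS * bits_per_param / 8 / GIB

for name, bits in [("fp16", 16), ("int8", 8), ("~4-bit quant", 4.5)]:
    size = model_size_gib(bits)
    verdict = "fits" if size < P40_VRAM_GIB else "does not fit"
    print(f"{name}: ~{size:.1f} GiB -> {verdict} in 24 GB")
```

The takeaway: fp16 weights alone are borderline at ~24 GiB, but 8-bit or 4-bit quantized 13B models fit with plenty of headroom for context.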