TheBigBrother@lemmy.world to Selfhosted@lemmy.world · English · edited 4 months ago
What's the bang-for-the-buck go-to setup for AI image generation and LLM models?
kata1yst@sh.itjust.works · English · edited 4 months ago
KoboldCPP or LocalAI will probably be the easiest way out of the box that has both image generation and LLMs. I personally use vLLM and HuggingChat, mostly because of vLLM's efficiency and speed.
DarkThoughts@fedia.io · 4 months ago
It's probably dead, but Easy Diffusion is, imo, the easiest for image generation. KoboldCPP can be a bit weird here and there, but it was the first thing that worked for me for local text generation with GPU support.