baatliwala@lemmy.world to memes@lemmy.world · 3 days ago
The AI revolution is coming (image, lemmy.world)
Lurker@sh.itjust.works · 3 days ago
Deepseek is good locally.
Mora@pawb.social · 2 days ago
As someone who is rather new to the topic: I have a GPU with 16 GB VRAM and only recently installed Ollama. Which size should I use for Deepseek R1? 🤔
Lurker@sh.itjust.works · 2 days ago
You can try them from smallest to biggest. You can probably run the biggest one too, but it will be slow.
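To make that concrete, here is a minimal sketch of trying one size from Python. It assumes the official `ollama` Python package and a running local Ollama server; the `deepseek-r1:14b` tag and the idea that it fits in 16 GB of VRAM are assumptions to check against the Ollama model library, not a tested recommendation.

```python
# Minimal sketch, assuming the official `ollama` package (pip install ollama)
# and that the model has been pulled first, e.g. `ollama pull deepseek-r1:14b`.
# The tag below is an assumption; smaller tags (e.g. 7b/8b) also exist, and
# bigger ones will still run but slow down once they no longer fit in VRAM.
import ollama

response = ollama.chat(
    model="deepseek-r1:14b",  # assumed tag; swap for a smaller or larger size
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)

# Print the model's reply text from the chat response.
print(response["message"]["content"])
```

Starting with a smaller tag and stepping up until generation gets noticeably slow is an easy way to find the largest size your card handles comfortably.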