@baatliwala@lemmy.world to memes@lemmy.world • 2 months ago — "The AI revolution is coming"
@Mora@pawb.social • 2 months ago
As someone who is rather new to the topic: I have a GPU with 16 GB of VRAM and only recently installed Ollama. Which size should I use for DeepSeek R1? 🤔
Lurker • 2 months ago
You can try them from smallest to largest. You can probably run the biggest one too, but it will be slow.
@kyoji@lemmy.world • 2 months ago
I also have 16 GB of VRAM, and the 32b version runs OK. Anything larger would take too long, I think.
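To make the "try from smallest to largest" advice concrete, here is a rough back-of-the-envelope sketch (my own rule of thumb, not from the thread): at 4-bit quantization, a model's weights need roughly its parameter count in billions × 0.5 GB, plus some fixed overhead for the KV cache and runtime. The size list and the 1.5 GB overhead figure are assumptions for illustration.

```python
def approx_vram_gb(params_billion, bits_per_weight=4, overhead_gb=1.5):
    """Rough VRAM estimate for a quantized model.

    This is a coarse sketch: real usage varies with quantization
    scheme, context length, and runtime. overhead_gb is an assumed
    allowance for the KV cache and framework buffers.
    """
    weights_gb = params_billion * bits_per_weight / 8  # bits -> bytes -> GB
    return weights_gb + overhead_gb

# Common distilled DeepSeek R1 sizes (in billions of parameters)
for size in [1.5, 7, 8, 14, 32, 70]:
    fits = "fits" if approx_vram_gb(size) <= 16 else "spills to CPU/RAM"
    print(f"{size}b ~ {approx_vram_gb(size):.1f} GB VRAM -> {fits}")
```

By this estimate a 14b model sits comfortably in 16 GB, while 32b slightly exceeds it, which matches the experience above: a 32b model still "runs OK" because layers that don't fit get offloaded to system RAM, at the cost of speed.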