baatliwala@lemmy.world to memes@lemmy.world · 1 month ago
The AI revolution is coming (lemmy.world)
hmmm@sh.itjust.works · 1 month ago
Deepseek is good locally.
Mora@pawb.social · 1 month ago
As someone who is rather new to the topic: I have a GPU with 16 GB VRAM and only recently installed Ollama. Which size should I use for Deepseek R1? 🤔
hmmm@sh.itjust.works · 1 month ago
You can try from the smallest up to the bigger ones. You can probably run the biggest too, but it will be slow.
kyoji@lemmy.world · 1 month ago
I also have 16 GB VRAM and the 32b version runs OK. Anything larger would take too long, I think.
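As a rough rule of thumb for picking a model size for a given GPU, the sketch below estimates VRAM from parameter count. It is a back-of-envelope calculation, not something from this thread: it assumes roughly 4-bit quantization (the default for Ollama's quantized model tags) and a flat overhead figure for KV cache and buffers, and the `vram_estimate_gb` helper is a hypothetical name.

```python
# Back-of-envelope VRAM estimate for a locally run quantized LLM.
# Assumptions (not from the thread): ~4 bits per weight, plus a
# flat ~1.5 GB overhead for KV cache and runtime buffers.

def vram_estimate_gb(params_billion: float, bits: int = 4,
                     overhead_gb: float = 1.5) -> float:
    """Estimate GPU memory in GB: weight bytes plus rough overhead."""
    # For N billion params at b bits each, weights take N * b / 8 GB.
    weights_gb = params_billion * bits / 8
    return weights_gb + overhead_gb

if __name__ == "__main__":
    # Common DeepSeek R1 distill sizes on Ollama, checked against 16 GB VRAM.
    for size in (1.5, 7, 8, 14, 32, 70):
        est = vram_estimate_gb(size)
        verdict = "fits" if est <= 16 else "exceeds"
        print(f"{size:>4}B @ 4-bit ≈ {est:.1f} GB -> {verdict} 16 GB VRAM")
```

Under these assumptions a 14B model (≈8.5 GB) fits comfortably in 16 GB, while a 32B model (≈17.5 GB) slightly exceeds it, which is consistent with the comment above: Ollama can offload the overflow to CPU RAM, so the 32B version still runs, just more slowly.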