prostagma@lemmy.world to memes@lemmy.world · English · 21 days ago
Life of a PC gamer (lemmy.world) · 88 comments
Domi@lemmy.secnd.me · 21 days ago
39 GB is very small; DeepSeek R1 without quantization at full context size needs almost a full TB of RAM/VRAM. The large models are absolutely massive, and you will still find some crazy homelabber who runs them at home.
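A rough back-of-envelope sketch of where that "almost a full TB" comes from, assuming the published 671B parameter count and the model's native FP8 weights; the KV-cache and overhead figures below are loose assumptions, not measured numbers:

```python
# Back-of-envelope memory estimate for serving DeepSeek R1 unquantized.
# Assumptions: 671B parameters at 1 byte each (FP8, as released),
# plus KV cache for a long context and runtime/activation overhead.
# The overhead range is a guess for illustration, not a benchmark.

PARAMS = 671e9          # published parameter count
BYTES_PER_PARAM = 1     # FP8 weights

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9

# KV cache + activations + framework overhead: assumed to add
# tens to a few hundred GB depending on context length and batch size.
overhead_gb_low, overhead_gb_high = 50, 300

print(f"Weights alone:       ~{weights_gb:.0f} GB")
print(f"Weights + overhead:  ~{weights_gb + overhead_gb_low:.0f}"
      f"-{weights_gb + overhead_gb_high:.0f} GB")
```

With the weights alone at roughly 670 GB, it is easy to see how a full-context deployment creeps toward a terabyte of combined RAM/VRAM.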
87Six@lemmy.zip · 21 days ago (edited)
All that RAM for the idiot AI to tell me what I can find on Stack Overflow with one Startpage search.