@[email protected] to Programmer [email protected] • 1 day agoWork of pure human soul (and pure human sweat, and pure human tears)lemmy.worldimagemessage-square49arrow-up1356arrow-down134
minus-square@[email protected]linkfedilink4•20 hours agoyou don’t even need a supported gpu, I run ollama on my rx 6700 xt
minus-square@[email protected]linkfedilink1•1 hour agoI have the same gpu my friend. I was trying to say that you won’t be able to run ROCm on some Radeon HD xy from 2008 :D
minus-square@[email protected]linkfedilink2•14 hours agoYou don’t even need a GPU, i can run Ollama Open-WebUI on my CPU with an 8B model fast af
minus-square@[email protected]linkfedilink2•3 hours agoI tried it with my cpu (with llama 3.0 7B), but unfortunately it ran really slow (I got a ryzen 5700x)