pkkkzip 6 hours ago

Forgot about R1, what hardware are you using to run it?

syntaxing 6 hours ago

I haven’t run QWQ yet, but it’s a 32B model, so figure about 20GB of RAM with a Q4 quant, and closer to 25GB for the Q4_K_M one. The quantized GGUFs usually show up within a day or two of release (we should see the Q4 in the next hour or so). I personally use Ollama on a MacBook Pro. Any M-series MacBook with 32GB+ of RAM will run this.
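For reference, here’s a rough sketch of what that looks like with the Ollama Python client once the quant lands, plus the back-of-the-envelope memory math. The `qwq` model tag is an assumption; check the Ollama registry for the actual name and quant suffix.

    # Sketch using the ollama Python client (pip install ollama).
    # Assumes the model has already been pulled (e.g. `ollama pull qwq`);
    # the exact tag/quant on the registry may differ once the GGUFs land.
    import ollama

    # Rough memory estimate: 32B params at ~4.5 bits/weight for Q4_K_M
    # (K-quants average a bit over 4 bits because of block scales).
    params = 32e9
    bits_per_weight = 4.5
    print(f"~{params * bits_per_weight / 8 / 1e9:.0f} GB for weights alone")  # ~18 GB, plus KV cache/overhead

    response = ollama.chat(
        model="qwq",  # hypothetical tag for illustration
        messages=[{"role": "user", "content": "Why is the sky blue?"}],
    )
    print(response["message"]["content"])

That ~18GB for weights plus KV cache and runtime overhead is where the ~20-25GB figure comes from, which is why 32GB of unified memory is the comfortable floor on Apple Silicon.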