simonw 4 hours ago

This one is pretty impressive. I'm running it on my Mac via Ollama - only a 20GB download, tokens spit out pretty fast and my initial prompts have shown some good results. Notes here: https://simonwillison.net/2024/Nov/27/qwq/

cherioo 3 hours ago

What hardware are you able to run this on?

torginus 28 minutes ago

Sorry for the random question, but I wonder if you know: what's the status of running LLMs on non-NVIDIA GPUs nowadays? Are they viable?

naming_the_user 2 hours ago

Works well for me on an MBP with 36GB of RAM, with (just barely) no swapping.

I've been asking it to perform relatively complex integrals, and it either manages them (with step-by-step workings) or comes very close, with small errors that can be rectified by following the steps manually.

simonw 2 hours ago

M2 MacBook Pro with 64GB of RAM.
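For context on why the ~20GB download fits comfortably on these machines: a rough back-of-envelope estimate for a quantized 32B-parameter model (QwQ-32B-Preview has ~32 billion parameters). The bits-per-parameter figure below is an assumption, roughly typical of a 4-bit quantization with overhead, not an exact spec:

```python
# Rough memory footprint of a quantized 32B-parameter model.
# bits_per_param is an illustrative assumption (~4-bit quant plus overhead),
# not an exact figure for any specific Ollama build.
params = 32e9
bits_per_param = 5
gb = params * bits_per_param / 8 / 1e9
print(f"~{gb:.0f} GB")  # ~20 GB
```

That leaves headroom on a 36GB or 64GB machine for the KV cache and the OS, which matches the "no swapping (just)" experience above.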