wetwater 6 days ago

I've only recently started looking into running these models locally on my system. I have limited knowledge of LLMs, and even less when it comes to building my own PC.

Are there any good sources I can read up on for estimating the hardware specs required to run 7B, 13B, 32B, etc. models locally?

TechDebtDevin 6 days ago

VRAM Required = Number of Parameters (in billions) × Number of Bytes per Parameter × Overhead[0].

[0]: https://twm.me/posts/calculate-vram-requirements-local-llms/
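As a rough back-of-the-envelope sketch in Python (the fp16 default and the 1.2x overhead factor here are just assumed example values, not fixed constants):

    # Rough VRAM estimate: params (billions) x bytes/param x overhead
    # Billions of params x bytes per param works out directly to GB.
    def estimate_vram_gb(params_billions, bytes_per_param=2, overhead=1.2):
        # bytes_per_param: 2 for fp16/bf16, 1 for 8-bit, ~0.5 for 4-bit quants
        return params_billions * bytes_per_param * overhead

    for size in (7, 13, 32):
        print(f"{size}B @ fp16: ~{estimate_vram_gb(size):.1f} GB")

So a 7B model at fp16 lands around ~17 GB before quantization, which is why 4-bit quants are so popular on consumer cards.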

manmal 6 days ago

Don’t forget to add a lot of extra space if you want a usable context size.
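The KV cache grows linearly with context length. A minimal sketch of that term, assuming Llama-2-7B-style shapes (32 layers, 32 KV heads, head dim 128, fp16) purely as an example:

    # KV cache per token = 2 (K and V) x layers x kv_heads x head_dim x bytes/elem
    def kv_cache_gb(context_len, layers=32, kv_heads=32, head_dim=128, bytes_per_elem=2):
        per_token = 2 * layers * kv_heads * head_dim * bytes_per_elem
        return context_len * per_token / 1e9

    print(f"4k context: ~{kv_cache_gb(4096):.1f} GB extra")

That's roughly 2 GB extra at a 4k context for a 7B model, and models with grouped-query attention (fewer KV heads) need proportionally less.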

TechDebtDevin 6 days ago

Wouldn't that be your overhead var?

wetwater 6 days ago

That's neat! Thanks.