nosecreek 4 days ago

Related question: what is everyone using to run a local LLM? I'm using Jan.ai and it's been okay. I also see OpenWebUI mentioned quite often.

Havoc 4 days ago

LM Studio if you just want an app. OpenWebUI is just a front end; you'd need either llama.cpp or vllm behind it to serve the model.
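
For anyone unfamiliar with that split, a minimal sketch of the setup: llama.cpp's `llama-server` exposes an OpenAI-compatible API, and OpenWebUI is pointed at it via its `OPENAI_API_BASE_URL` setting. The model path is a placeholder, and `host.docker.internal` assumes Docker Desktop (on Linux you'd use the host's address instead):

```shell
# Serve a local model with llama.cpp's OpenAI-compatible server
# (the GGUF path below is a placeholder; use any model file you have)
llama-server -m ./models/your-model.gguf --port 8080

# Run OpenWebUI and point it at the llama.cpp backend
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:8080/v1 \
  ghcr.io/open-webui/open-webui:main
```

Then OpenWebUI is reachable at http://localhost:3000 and talks to llama.cpp for inference.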

op00to 4 days ago

LMStudio, and sometimes AnythingLLM.

fennecfoxy 4 days ago

KoboldCPP + SillyTavern has worked the best for me.