neumann 22 hours ago

This is great, and I like seeing all the implementations people are making for themselves.

Anyone using any opensource tooling that bundles this effectively to allow different local models to be used in this fashion?

I'm thinking it would be nice to run this fully locally to access my code or my private GitHub repos from the command line, and to swap models out (presumably via llama.cpp or Ollama).

sagarpatil 19 hours ago

Most IDEs and coding tools support an OpenAI-compatible endpoint, so you can host whatever model you like locally and point them at it. Check out Roo Code.
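
For what it's worth, here's a minimal sketch of what "OpenAI-compatible endpoint" means in practice, assuming Ollama's default port (11434) and an example model name ("llama3" is just a placeholder for whatever you've pulled). llama.cpp's llama-server exposes the same kind of /v1 API, typically on port 8080.

    from openai import OpenAI

    # Point the standard OpenAI client at a locally hosted model.
    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        api_key="ollama",                      # required by the client, ignored by Ollama
    )

    response = client.chat.completions.create(
        model="llama3",  # example name; use whichever local model you've pulled
        messages=[{"role": "user", "content": "Summarize this diff: ..."}],
    )
    print(response.choices[0].message.content)

Any tool that lets you override the base URL (most IDE plugins and CLI agents do) can be switched between local models the same way, just by changing the model name.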