_1 2 days ago

This looks really nice. We've been considering developing something very similar in-house. Are you guys looking at supporting MLC Web LLM, or some other local models?

calcsam 2 days ago

Yup! We rely on the AI SDK for model routing, and they have an Ollama provider, which will handle pretty much any local model.