How are you running the model? Mistral's API, a local version through Ollama, or something else?
Is Mistral on OpenRouter?
Yup https://openrouter.ai/provider/mistral
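Since the weights aren't available to run locally, OpenRouter's OpenAI-compatible chat-completions endpoint is one way to call it. A minimal sketch in Python, using only the standard library; the model slug `mistralai/mistral-medium-3` is an assumption — check openrouter.ai/models for the exact identifier:

```python
import json
import urllib.request

def build_openrouter_request(prompt: str, api_key: str,
                             model: str = "mistralai/mistral-medium-3"):
    """Build (but don't send) a chat-completion request for OpenRouter.

    OpenRouter exposes an OpenAI-compatible API at
    /api/v1/chat/completions; the model slug here is an assumption.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://openrouter.ai/api/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Pass a real key and call urllib.request.urlopen(req) to actually send it.
req = build_openrouter_request("Hello", api_key="sk-or-...")
print(req.full_url)
```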
I guess it can't really be run locally https://www.reddit.com/r/LocalLLaMA/comments/1kgyfif/introdu...