Great job! Having Ollama support would be useful as well [1]! [1] https://github.com/ollama/ollama
My first thought as well. I already have Ollama set up to run LLM tasks locally, and I don't want to duplicate that, but it would be fun to try this front end with it.
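For what it's worth, pointing a front end at an existing local Ollama instance is just an HTTP call, so no models would need to be duplicated. A minimal sketch, assuming Ollama is listening on its default port (11434) and a model like "llama3" has already been pulled:

```python
import requests

# Default local Ollama endpoint (assumption: standard install, port 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_ollama(prompt: str, model: str = "llama3") -> str:
    # stream=False asks Ollama to return one JSON object instead of a stream.
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_ollama("Say hello in one sentence."))
```

Ollama also exposes an OpenAI-compatible endpoint, so a front end that already speaks the OpenAI chat API could likely be pointed at it with just a base-URL change.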