alkh 3 days ago

Great job! Having Ollama support would be useful as well [1]!

[1] https://github.com/ollama/ollama

totetsu 3 days ago

This was my immediate thought as well. I already have Ollama set up to run LLM tasks locally; I don't want to duplicate that setup, but it would be fun to try this front end with it.
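
For what it's worth, Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1, so a front end that lets you set a custom base URL can often be pointed at a local Ollama instance even without dedicated support. A minimal sketch with the openai Python client (the model name is just whatever you've already pulled; the API key is required by the client but ignored by Ollama):

    from openai import OpenAI

    # Point the OpenAI client at the local Ollama server's
    # OpenAI-compatible endpoint.
    client = OpenAI(
        base_url="http://localhost:11434/v1",
        api_key="ollama",  # any non-empty string works; Ollama ignores it
    )

    # "llama3" is a placeholder: use any model pulled via `ollama pull`.
    resp = client.chat.completions.create(
        model="llama3",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(resp.choices[0].message.content)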