w8nC 20 hours ago

Now it’s just a wrapper around hosted APIs.

Went with my own wrapper around llama.cpp and stable-diffusion.cpp, with the option of prompting a hosted model if I don't like the local result; even then, the local output makes a good starting point for the hosted model to improve on.
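That local-first, hosted-fallback flow might look like the sketch below. Everything here is hypothetical: `local_generate` stands in for a llama.cpp call, `hosted_refine` for a hosted-API call, and `looks_good` for what is really a human judgment about the draft.

```python
def local_generate(prompt: str) -> str:
    # Stub standing in for a llama.cpp invocation (hypothetical; a real
    # version would shell out to the llama.cpp binary or hit its server API).
    return "local draft for: " + prompt

def looks_good(result: str) -> bool:
    # Hypothetical acceptance check; in the comment above this is simply
    # "if I don't like the result", i.e. a human decision, not a heuristic.
    return len(result) > 40

def hosted_refine(prompt: str, draft: str) -> str:
    # Stub for a hosted-API call; the local draft seeds the request so the
    # hosted model improves on a starting point rather than starting cold.
    return "hosted refinement of: " + draft

def generate(prompt: str) -> str:
    # Try local first; escalate to hosted only when the draft falls short.
    draft = local_generate(prompt)
    if looks_good(draft):
        return draft
    return hosted_refine(prompt, draft)
```

The point of seeding the hosted request with the local draft is that the hosted model critiques and refines rather than generating from scratch.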

It also obfuscates any requests sent to hosted APIs, because why feed them insight into my use case when I just want to double-check the local AI's algorithmic choices? The ground-truth relationships that function and variable names imply are my little secret.
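One way to do that kind of obfuscation is to consistently rename identifiers before the code leaves the machine, so the hosted model sees the algorithm's shape but not the domain meaning of the names. This is a minimal regex-based sketch, not the commenter's actual implementation; a real version would use a proper tokenizer and a fuller keyword/builtin list.

```python
import re

# Minimal keyword set so control flow survives renaming (hypothetical;
# a real version would use Python's `keyword` module and a tokenizer).
_KEYWORDS = {"def", "return", "if", "else", "for", "in", "while", "import"}

def obfuscate(code: str) -> tuple[str, dict]:
    """Rename every identifier to an opaque token, consistently, and
    return the rewritten code plus the mapping for de-obfuscating the
    hosted model's reply locally."""
    mapping: dict[str, str] = {}

    def rename(match: re.Match) -> str:
        name = match.group(0)
        if name in _KEYWORDS:
            return name
        if name not in mapping:
            mapping[name] = f"v{len(mapping)}"
        return mapping[name]

    return re.sub(r"[A-Za-z_]\w+", rename, code), mapping
```

Keeping the mapping local means the hosted model's feedback on `v0` and `v1` can be translated back to the real names without the provider ever seeing them.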

Patrick_Devine 19 hours ago

Wait, what hosted APIs is Ollama wrapping?