Workaccount2 3 days ago

This is also compounded by the fact that LLMs are not deterministic: the same prompt can yield a different response every time. And people tend to judge based on one-off experiences.

otabdeveloper4 3 days ago

> LLMs are not deterministic

They can be. The cloud-hosted LLMs add a gratuitous randomization step to make the output seem more human. (In the same vein as the moronic idea of selling LLMs as sci-fi human-like assistants.)

But you don't have to add that randomization. Nothing much is lost if you don't. (Output from my self-hosted LLMs is deterministic.)
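For illustration, the randomization being described is temperature sampling. A minimal sketch (not any particular provider's implementation, and the function name is made up for this example): at temperature 0 the sampler collapses to greedy argmax, which always returns the same token for the same logits, while any positive temperature draws from a softmax distribution.

```python
import math
import random

def sample_token(logits, temperature, rng=random):
    """Pick a token index from raw logits.

    temperature == 0 -> greedy argmax (deterministic given the logits);
    temperature > 0  -> softmax sampling (randomized).
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Scale logits by temperature, then apply a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling from the resulting distribution.
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Greedy decoding always picks the same token for the same logits:
# sample_token([1.0, 3.0, 2.0], temperature=0) -> 1, every time.
```

Higher temperatures flatten the distribution (more randomness); lower ones sharpen it toward the argmax.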

CharlesW 3 days ago

Even at temperature = 0, LLM output is not guaranteed to be deterministic. https://www.vincentschmalbach.com/does-temperature-0-guarant...
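One commonly cited mechanism (this is a general floating-point fact, not a claim about any specific model): parallel GPU reductions can sum the same numbers in different orders across runs, and floating-point addition is not associative, so the computed logits can differ slightly from run to run. A tiny sketch:

```python
# Floating-point addition is not associative: summing the same numbers
# in a different order can give a different result.
a, b, c = 1e16, -1e16, 1.0

left = (a + b) + c   # -> 1.0
right = a + (b + c)  # -> 0.0: c is absorbed into the huge magnitude of b

print(left, right, left == right)
```

If two candidate tokens' logits differ only at that scale, the argmax itself can flip, so even greedy decoding at temperature 0 isn't guaranteed to be reproducible.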