og_kalu 8 days ago

>My intuition for this comes from our inability to be confidently precise in our probabilistic rationalizations: I don't know about you, but I don't know whether I'm 57.1% or 57.01983% confident in an expression.

LLMs are probabilistic and notoriously unable to be confidently precise in their probabilistic rationalizations.

woodruffw 8 days ago

> LLMs are probabilistic and notoriously unable to be confidently precise in their probabilistic rationalizations.

Sure. To tie these threads together: I think there are enough other distinguishing properties to make me reasonably confident that my thought process isn't like an LLM's.

(Humans are imprecise, LLMs are imprecise, thermometers are imprecise, but don't stick me or my computer in an oven, please.)

og_kalu 8 days ago

>Sure. To tie these threads together: I think there are sufficient other different properties that make me reasonably confident that my thought process isn't like an LLM's.

It doesn't have to be like an LLM's to be probabilistic.