yellowapple 10 days ago

Neither does the mainstream approach, given the whole hallucination problem.

But then again, humans "hallucinate" in this sense all the time, too.

WesolyKubeczek 9 days ago

I thought the end goal is to make something that's way more than human in capabilities.

The thing is, if we produce something that is worse than humans (and right now LLMs are worse than humans with good search indexes at hand), there's not much point in doing it. It's probably less expensive to bear, raise, and educate actual humans. And to educate a human, somehow you don't need to dump the whole internet and all pirated content ever created into their heads.

kazinator 9 days ago

> LLMs are worse than humans with good search indexes at hand

But "good search indexes" are all LLM-based.

:)