_Algernon_ 4 days ago

This is worse because the AI slop is full of hallucinations which they will now confidently parrot. No way in hell does this type of person verify or even think critically about what the LLMs tell them. No information is better than bad information. Less information, used while practicing the ability to evaluate it critically, is better than an excess of bad information.

golergka 3 days ago

Do you have examples of recent models hallucinating when asked to summarize a text?