And models often make reasoning errors. Many users will want to check that the sources substantiate the conclusion.
As they should, even for a Google search.
I see search engines as a drip feed from a firehose, not some magical thing that's going to get me the 100% correct, 100% accurate result.
Humans are the most prolific liars; I could never fully trust search results anyway, since Google may surface something that looks right even though the author may be heavily biased, uninformed, or any number of other things.