This would be like claiming in 2010 that, because PageRank was public, search was a solved problem with no secret sauce; the following decade proved that false.
At a time when statistical models couldn't understand natural language, the click stream from users was the secret sauce.
Today a consumer-grade >8B decoder-only model does a better job of predicting whether some (long) string of text matches a user query than any bespoke algorithm would.
The only reason encoder-only models are preferred over decoder-only models for retrieval is that you can precompute and cache their embeddings over the corpus ahead of time.
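A toy sketch of that caching asymmetry, with a hypothetical `embed` function standing in for a real encoder model (the hash-seeded vectors are purely illustrative, not an actual embedding method):

```python
import numpy as np

DIM = 8

def embed(text: str) -> np.ndarray:
    # Stand-in for an encoder-only model: maps text to one fixed-size
    # vector, independent of any query. Seeded from the string's hash
    # so the example is self-contained and deterministic within a run.
    seed = abs(hash(text)) % (2**32)
    v = np.random.default_rng(seed).standard_normal(DIM)
    return v / np.linalg.norm(v)

corpus = ["apple pie recipe", "rust borrow checker", "history of jazz"]

# Encoder-only style: embed each document ONCE, offline, and cache it.
doc_vecs = np.stack([embed(d) for d in corpus])

def search(query: str) -> str:
    # Per query: one embedding plus cheap dot products against the cache.
    q = embed(query)
    return corpus[int(np.argmax(doc_vecs @ q))]

# A decoder-only model scoring (query, document) jointly would instead
# need one full model call per document at query time: nothing about
# the corpus can be precomputed independently of the query.
print(search("apple pie recipe"))
```

The design point is that the encoder's per-document work is query-independent, so it amortizes across all future queries, while a joint query-document scorer pays the full model cost for every pair.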