It just shows that they're unimaginative and good at copying.
What’s wrong with copying?
If they can only copy, which I'm not saying is the case, then their progress would be bounded by whatever the leader in the field is producing.
In much the same way, if an LLM can only copy from its training data, then its capability is bounded by the human output it was trained on.