I would argue that while it _feels_ wasteful to us humans, since we perceive it as a "big recomputation of the rendered graphics", technically it's not.
The redrawing of anything that changes in your UI requires GPU computation anyway, and some simple blur is quite efficient to add. It's likely less expensive than animating DOM objects that aren't optimized as GPU layers.
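For illustration, here's a minimal sketch of what "optimized as GPU layers" means in practice. It assumes a hypothetical positioned element with class `.box`: animating a layout property like `left` forces per-frame CPU work, while animating `transform` can run on the compositor.

```typescript
// Hypothetical example: two ways to slide a box 200px to the right.
// Assumes an element <div class="box"> with position: relative.
const box = document.querySelector<HTMLElement>('.box')!;

// Slow path: `left` is a layout property, so the browser recomputes
// layout and repaints on the main thread every frame.
box.animate([{ left: '0px' }, { left: '200px' }], { duration: 300 });

// Fast path: hint the browser to promote the element to its own
// compositor layer, then animate a compositor-friendly property.
box.style.willChange = 'transform';
box.animate(
  [{ transform: 'translateX(0)' }, { transform: 'translateX(200px)' }],
  { duration: 300 },
);
```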
Additionally, seeing how nowadays even the simplest sites tend to load 1+ MB of JS and trackers galore, all eating at your CPU resources, I'd put that bit of blur for aesthetics very far down on the "wasteful" list.
I generally agree. The caveat is that this only holds for some values of "some simple blur", and the one described in the article is not one in my book.
For reference: for every pixel in the input, a naive blur has to average roughly pi * r^2 pixels, where r is the blur radius (the area of the sampling disc).
This blows up quite quickly. Not quickly enough that my $5K MacBook really breaks a sweat with this example, but GPU performance is one of the most insidious things a dev can fail to account for: it's often far worse on other people's devices.
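To make the blowup concrete, a quick back-of-envelope sketch (the numbers and the 1080p frame size are mine, just to illustrate the pi * r^2 scaling):

```typescript
// A naive, non-separable blur reads every input pixel within the
// blur radius: roughly pi * r^2 texture samples per output pixel.
function samplesPerPixel(radius: number): number {
  return Math.PI * radius * radius;
}

const framePixels = 1920 * 1080; // a 1080p framebuffer

for (const r of [4, 16, 64]) {
  const perPixel = Math.round(samplesPerPixel(r));
  const perFrame = perPixel * framePixels;
  console.log(
    `radius ${r}: ~${perPixel} samples/px, ` +
      `~${(perFrame / 1e9).toFixed(1)}B samples/frame`,
  );
}
// radius 64 alone is ~12.9k samples/px, around 26 billion reads per
// 1080p frame, which is why real implementations tend to use
// separable or downsampled approximations instead.
```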