>The top end Pentium 4 had a TDP of 115W vs. 170W for the current top end Ryzen 9000 and even worse for current Intel.
You can't compare CPUs based on TDP; it's an almost entirely useless metric for this. About the only thing it's good for is making sure you have a sufficient heatsink and cooling system, because all it really tells you is roughly how much heat the cooler has to dissipate under sustained load. No one runs their CPUs flat-out all the time unless it's some kind of data center or something; we're talking about PCs here.
What's important is idle CPU power consumption, and that's significantly better these days.
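If you want to check this for yourself on Linux, here's a minimal sketch that samples the package energy counter and averages it to watts. The path assumes the RAPL powercap interface is exposed under the usual intel-rapl name (recent kernels expose AMD similarly), and reading it may require root on newer kernels; treat it as a rough illustration, not a calibrated measurement.

```python
# Sample the cumulative package energy counter twice and convert to watts.
# Assumed path: Intel-style RAPL powercap domain; may require root to read.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # assumed package domain

def read_uj() -> int:
    with open(RAPL) as f:
        return int(f.read().strip())

WINDOW = 5.0                       # seconds; leave the machine idle meanwhile
start = read_uj()
time.sleep(WINDOW)
delta_uj = read_uj() - start       # counter wraps eventually, but not in 5 s
print(f"avg package power: {delta_uj / WINDOW / 1e6:.2f} W")
```

Run it once at idle and once under load and the gap between the two numbers makes the point far better than any TDP figure does.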
>older phones or tablets with <5W CPUs, so why do those go out of support so fast?
That's an entirely different situation because of the closed and vendor-controlled nature of those systems. They're not PCs; they're basically appliances. It's a shitty situation, but there's not much people can do about it, though many have tried (CyanogenMod, GrapheneOS, etc.).
>Plenty of others who don't even care about 4k
Not everyone cares about 4k, it's true (personally I like it but it's not that much better than 1080p). But if you can't tell the difference between 1080p and an NTSC TV, you're blind.
>1080p TVs and even some 720p TVs are still sold new
Yes, as I said before, we're seeing diminishing returns. (Or should I say "diminishing discernible improvements"?)
Also, the 720p stuff is only in relatively small screens. You're not going to find a 75" TV at 720p or even 1080p; those are all 4k. The low-res panels are relegated to small budget models where such high resolution would be pointless anyway.
For most videos, the difference between 1080p and 4k ain't that large.
But for certain video games on a large screen, I can definitely tell the difference between 1080p and 4k, especially strategy games that present a lot of on-screen information.
Btw, as far as I can tell, modern screens use significantly less power than the CRTs of old, especially per unit of area, even when that CRT is still perfectly functional.
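To put a rough number on it, here's a back-of-envelope sketch; the ~80W CRT and ~25W LCD draws, and the screen sizes, are ballpark assumptions on my part, not measurements.

```python
import math

# Back-of-envelope power-per-area comparison; all figures are rough ballparks.
def area_sq_in(diagonal_in: float, aspect_w: int, aspect_h: int) -> float:
    # derive width x height from the diagonal and aspect ratio
    scale = diagonal_in / math.hypot(aspect_w, aspect_h)
    return (aspect_w * scale) * (aspect_h * scale)

crt_area = area_sq_in(16.0, 4, 3)    # ~17" CRT, ~16" viewable, assumed
lcd_area = area_sq_in(24.0, 16, 9)   # 24" LED LCD, assumed

print(f"CRT: {80 / crt_area:.2f} W per sq in")   # ~80 W assumed draw
print(f"LCD: {25 / lcd_area:.2f} W per sq in")   # ~25 W assumed draw
# -> roughly 0.65 vs 0.10 W per square inch on these assumptions
```

On those (admittedly rough) assumptions, the old CRT burns several times more power per square inch of picture, which is why "it still works fine" isn't the same as "it's efficient."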