> So how bright do we want/need our displays to be? How many bits? For me, even a 500-nit TV in a dark room is enough, viewing from about 10' away.

The advantage of high peak brightness (e.g., > 1000 nits) is brighter small regions such as specular highlights. 500 nits full field is eye-searing.
500 nits is extremely bright. The standard for professional imaging has been 100 candelas per square meter (cd/m², i.e. nits) for decades. In fact, when you go to a theater, you are typically looking at 30 to 50 nits.
Modern high dynamic range imaging can enhance specular highlights when and where necessary. That said, because of observer adaptation, in most environments this tends to be largely pointless. As someone mentioned, the viewing environment can be far more critical than the screen parameters. Even a 100-nit screen can feel blinding in a dark environment before adaptation.
Something most people don't realize is that the quality of the blacks, or lowlights, in an image is a perceptual effect, not an absolute characteristic of the human visual system. This means a super-bright screen in a reflective environment will "pollute" your black-level perception, with the net effect of collapsing the range of the image (everything darker than a certain perceptual point will simply read as black).
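To put rough numbers on that "collapsing" effect, here's a minimal sketch of the usual ambient-contrast arithmetic. The reflectance and illuminance figures are illustrative assumptions on my part (not from the post above), and reflected luminance is modeled with the simple Lambertian approximation L = E × R / π:

```python
import math

def effective_contrast(white_nits, black_nits, ambient_lux, reflectance=0.02):
    """Sequential-contrast model: light reflected off the screen adds
    equally to both the white level and the black level.
    reflectance=0.02 assumes a fairly good matte/AR-coated panel."""
    reflected = ambient_lux * reflectance / math.pi  # cd/m^2 (nits)
    return (white_nits + reflected) / (black_nits + reflected)

# A 500-nit display with a 0.05-nit native black level:
dark_room = effective_contrast(500, 0.05, ambient_lux=0)    # 10000:1
office    = effective_contrast(500, 0.05, ambient_lux=300)  # roughly 250:1
```

Even modest office lighting (~300 lux here) lifts the perceived black floor by a couple of nits, which is enough to crush a 10000:1 panel down to a few hundred to one. That is the mechanism behind the black-level "pollution" described above.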
As a matter of course, I cut the brightness of all of my computer monitors by at least 50%. I am convinced that a huge part of the visual fatigue people complain about when working long hours comes from staring at a light bulb (the screen) pounding them at 500 nits all day. There is no doubt that has negative consequences.
Source: Among other things, I studied Color Science at the Rochester Institute of Technology.
A properly light-controlled, dark ambient room does a lot more for highlights than higher peak brightness does.