> locally-installed software that intends to be used by one or more public websites has to run an HTTP server on localhost
if that software uses a pull approach instead of a push one, the server becomes unnecessary
bonus: then you won't have websites grossly probing local networks that aren't theirs (ew)
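concretely, a pull approach might look something like this sketch: the local app polls the public site for work over an outbound connection instead of listening on a port (the URL and JSON shape here are hypothetical placeholders):

    # Pull sketch: the local app polls the website's API for pending work
    # instead of exposing an HTTP server on localhost. The endpoint URL
    # and response shape are made-up placeholders, not a real API.
    import json
    import time
    import urllib.request

    POLL_URL = "https://example.com/api/pending-jobs"  # hypothetical

    def handle(job):
        print("processing", job.get("id"))  # do the local work here

    def poll_once():
        with urllib.request.urlopen(POLL_URL, timeout=10) as resp:
            for job in json.load(resp):
                handle(job)  # outbound-only; no inbound connections

    while True:
        poll_once()
        time.sleep(30)  # or long-poll; either way the site never dials in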
It's now harder to view HTML and XML files that use XSLT by just opening them in a web browser (things like NUnit test-run output). To view these properly -- to get the CSS, XSLT, images, etc. to load -- you typically have to run a web server at that file path.
Note: this is why the viewers for these tools will spin up a local web server.
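For example, Python's built-in http.server module is one quick way to do that; the directory and file name below are placeholders:

    # Serve the output directory over HTTP so the browser will load the
    # linked CSS, XSLT, and images. Equivalent to running
    # `python -m http.server 8000` from inside that directory.
    import functools
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    handler = functools.partial(SimpleHTTPRequestHandler,
                                directory="/path/to/test-output")  # placeholder
    HTTPServer(("127.0.0.1", 8000), handler).serve_forever()
    # then open http://127.0.0.1:8000/results.html in the browser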
With local LLMs and AI it is now common to have separate servers for different tasks (LLM, TTS, ASR, etc.) running together, and they need to communicate with each other to build services like local assistants. I don't want to jump through the hoops of running these over SSL (including getting a self-signed cert trusted), etc. just to run a local web service.
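As a rough sketch of the plumbing (the ports and endpoint paths below are made up, not any particular project's real API), the services just talk plain HTTP over loopback, no TLS required:

    # Hypothetical local assistant turn chaining separate localhost
    # servers (ASR -> LLM -> TTS) over plain HTTP. Ports and paths are
    # placeholders, not any specific project's actual API.
    import json
    import urllib.request

    def post_json(url, payload):
        req = urllib.request.Request(
            url, data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req, timeout=60) as resp:
            return json.load(resp)

    def assistant_turn(audio_b64):
        text = post_json("http://127.0.0.1:5001/asr", {"audio": audio_b64})["text"]
        reply = post_json("http://127.0.0.1:5002/llm", {"prompt": text})["reply"]
        return post_json("http://127.0.0.1:5003/tts", {"text": reply})["audio"]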
I'm not sure any of that is necessary for what we're talking about: locally-installed software that intends to be used by one or more public websites.
For instance, my interaction with local LLMs involves 0 web browsers, and there's no reason facebook.com needs to make calls to my locally-running LLM.
Viewing HTML/XML files in the browser should be easier, but it already has the issues you describe today. It might make sense, IMO, for browsers to allow requests to localhost from websites that are themselves served from localhost.