userbinator 8 days ago

but on the other hand, having a rising browser engine might eventually remove this avenue for fingerprinting

If what I've seen from Cloudflare et al. is any indication, it's the exact opposite --- the amount of fingerprinting and "exploitation" of implementation-defined behaviour has increased significantly in the past few months, likely in an attempt to kill off other browser engines; the incumbents do not like competition at all.

The enemy has been trying to spin it as "AI bots DDoSing" but one wonders how much of that was their own doing...

3
SoftTalker 8 days ago

It's entirely deliberate. CloudFlare could certainly distinguish low-volume but legit web browsers from bots, as much as they can distinguish chrome/edge/safari/firefox from bots. That is, if they cared to.
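The distinction being claimed here can be sketched as a header-consistency heuristic. This is purely illustrative -- real bot-detection stacks (Cloudflare's included) lean on TLS fingerprints, JS challenges, and behavioral signals rather than headers alone, and the function below is a made-up example, not anyone's actual pipeline:

```python
def looks_like_real_browser(headers: dict) -> bool:
    """Crude illustrative heuristic: real browsers, including niche ones,
    send a consistent set of request headers along with a User-Agent.

    The point is that such a check does not need an allow-list of the
    big four engines -- an honest minority browser can still pass."""
    ua = headers.get("User-Agent", "")
    has_accept = "Accept" in headers
    has_lang = "Accept-Language" in headers
    # Accept any non-empty UA as long as the header set is consistent;
    # an engine allow-list here is exactly what would lock out small browsers.
    return has_accept and has_lang and bool(ua)

# A low-market-share browser with honest headers passes:
print(looks_like_real_browser({
    "User-Agent": "NetSurf/3.10",
    "Accept": "text/html",
    "Accept-Language": "en",
}))  # True

# A bare scripted request with no headers does not:
print(looks_like_real_browser({}))  # False
```

The design choice being illustrated: consistency checks generalize to unknown-but-honest clients, whereas an allow-list of known engines rejects them by construction.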

hansvm 8 days ago

Hold up, one of those things is not like the other. Are we really blaming webmasters for 100x increases in costs from a huge wave of poorly written and maliciously aggressive bots?

refulgentis 8 days ago

> Are we really blaming...

No, they're discussing increased fingerprinting / browser profiling recently and how it affects low-market-share browsers.

hansvm 8 days ago

I saw that, but I'm still not sure how this fits in:

> The enemy has been trying to spin it as "AI bots DDoSing" but one wonders how much of that was their own doing...

I'm reading that as `enemy == fingerprinters`, `that == AI bots DDoSing`, and `their own == webmasters, hosting providers, and CDNs (i.e., the fingerprinters)`, which sounds pretty straightforwardly like the fingerprinters are responsible for the DDoSing they're receiving.

That interpretation doesn't seem to match the rest of the post though. Do you happen to have a better one?

userbinator 8 days ago

"their own" = CloudFlare and/or those who have vested interests in closing up the Internet.

jillyboel 7 days ago

Your costs only went up 100x if you built your site poorly

hansvm 7 days ago

I'll bite. How do you serve 100x the traffic without 100x the costs? It costs something like 1e-10 dollars to serve a recipe page with a few photos, for example. If you serve it 100x more times, how does that not scale up?

jillyboel 7 days ago

It might scale up, but if you're anywhere near efficient you're way overprovisioned to begin with. The compute cost should be minuscule due to caching, and bandwidth is cheap if you're not with one of the big clouds. As an example, according to dang, HN runs on a single server, and yet many websites that get posted to HN, and thus receive a fraction of the traffic, go down due to the load.
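The caching argument can be made concrete with a toy model. The per-request costs below are invented round numbers for illustration, not measurements: the claim is only that when traffic concentrates on a small set of pages, 100x requests costs roughly 1x the render work plus cheap cache hits.

```python
from functools import lru_cache

# Hypothetical costs: rendering a page takes ~5 ms of CPU,
# serving a cached copy ~0.05 ms.
RENDER_MS = 5.0
CACHE_HIT_MS = 0.05

@lru_cache(maxsize=None)
def render_page(path: str) -> str:
    return f"<html>{path}</html>"  # stands in for the expensive render

def cost_ms(requests: list[str]) -> float:
    """Total CPU cost, charging full render price only on cache misses."""
    total = 0.0
    for path in requests:
        hits_before = render_page.cache_info().hits
        render_page(path)
        hit = render_page.cache_info().hits > hits_before
        total += CACHE_HIT_MS if hit else RENDER_MS
    return total

# 1,000 requests spread over just 10 distinct pages:
# 10 renders + 990 cache hits = 10*5.0 + 990*0.05 = 99.5 ms,
# versus 5,000 ms if every request were rendered from scratch.
reqs = [f"/recipe/{i % 10}" for i in range(1000)]
print(cost_ms(reqs))  # 99.5
```

The same logic is why a single well-cached server can absorb an HN front-page spike: the marginal request is a cache hit, not a render.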

immibis 7 days ago

You got 100x the traffic if your traffic was near zero to begin with.

cyanydeez 8 days ago

I don't think they're doing this to kill off browser engines; they're trying to sift browsers into "user" and "AI slop", so they can prioritize users.

This is entirely a web crawler 2.0 apocalypse.

nicman23 8 days ago

man i just want a bot to buy groceries for me

baq 8 days ago

That’s one of the few reasons to leave the house. I’d like dishes and laundry bots first, please.

dodslaser 8 days ago

You mean dishwashers and washing machines?

baq 8 days ago

Yes, but no. I want a robot to load and unload those.

dec0dedab0de 7 days ago

I have been paying my local laundromat to do my laundry for over a decade now, it's probably cheaper than you're imagining and sooo worth it.

baq 7 days ago

my household is 6 people; it isn't uncommon to run 3 washing machine loads in a day, and days without at least one are rare. I can imagine the convenience, but at this scale it sounds a bit unreasonable.

dishwasher runs at least once a day, at least 80% full, every day, unless we're traveling.

nicman23 5 days ago

i mean it is called a dishwasher

extraduder_ire 7 days ago

I think "slop" only refers to the output of generative AI systems. bot, crawler, scraper, or spider would be a more apt term for software making (excessive) requests to collect data.