cosmic_quanta 2 days ago

> sounds like it would be hard to implement a web server handling 10k+ concurrent requests on commodity hardware?

In practice, it is not. The canonical Haskell compiler, GHC, is excellent at transforming operations on immutable data, as Haskell programs are written, into efficient mutations, at the runtime level. Also, since web development is quite popular in the Haskell community, lots of people have spent many hours optimizing this precise use case.
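As a small illustration of that claim (a sketch, not a statement about any particular program): the strict left fold below mutates nothing at the source level, yet with `-O2` GHC typically unboxes the accumulator and compiles it to a tight counting loop with no per-step allocation.

```haskell
import Data.List (foldl')

-- Pure, immutable-looking code: no mutable variables in sight.
-- With optimizations on (-O2), GHC typically fuses the list away,
-- unboxes the accumulator, and emits a plain counting loop.
sumTo :: Int -> Int
sumTo n = foldl' (+) 0 [1 .. n]

main :: IO ()
main = print (sumTo 1000000)
```

Compiling with `ghc -O2 -ddump-simpl` lets you inspect the Core and see the accumulator carried as a raw `Int#` rather than a heap-allocated value.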

In my experience, the real downside is that compilation times are a bit long -- the compiler is doing a LOT of work after all.

eru 2 days ago

> The canonical Haskell compiler, GHC, is excellent at transforming operations on immutable data, as Haskell programs are written, into efficient mutations, at the runtime level.

Yes, at the level of native machine code and memory cells, there's not that much of a difference between immutability + garbage collection, and higher level source code that mutates. Thanks to GC you are going to overwrite the same memory locations over and over again, too.

whateveracct 2 days ago

Programmers for some reason really don't understand that generational garbage collection provides locality: the nursery is a small, contiguous region that fresh allocations fill sequentially. I'm really surprised how often I see C/C++/Rust types miss this.

eru 2 days ago

I think that only applies to a moving GC. A conservative GC (like the Boehm GC for C) can't safely move objects -- it can't reliably tell pointers from integers, so it can't update them -- and thus doesn't do anything for locality.

Of course, even a moving GC has limits, it won't turn a hashtable into something that has local accesses.
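The hashtable caveat can be made concrete: even if a copying collector lays storage out contiguously, a hash-style access pattern still jumps all over it. A toy sketch below (the "hash" is just a made-up scramble, assumed here purely for illustration):

```haskell
import Data.Array

-- A moving GC can compact this array's payload, but access *order*
-- still determines locality: walking the indices sequentially is
-- cache-friendly, while visiting them in a hash-scrambled order
-- defeats prefetching -- even though both compute the same sum.

n :: Int
n = 1024

arr :: Array Int Int
arr = listArray (0, n - 1) [0 ..]

-- Toy scramble standing in for a hash function (not a real hash):
-- 769 is coprime to 1024, so this is a permutation of the indices.
scramble :: Int -> Int
scramble i = (i * 769) `mod` n

sequentialSum, scrambledSum :: Int
sequentialSum = sum [arr ! i | i <- [0 .. n - 1]]
scrambledSum  = sum [arr ! scramble i | i <- [0 .. n - 1]]
```

Same elements, same result, very different memory traffic -- and no amount of GC-side compaction changes the second access pattern.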