It doesn't actually have "large memory allocations" due to immutable data structures. That's a meme, not reality. Persistent data structures, especially at small scale, don't carry huge performance penalties: an update doesn't copy the entire structure over and over, it copies the O(log n) spine from the root to the changed node and shares everything else.
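A tiny sketch of what persistence looks like in practice, using `Data.Map` from the containers package (a balanced binary tree). The names here are just for illustration; the point is that the "old" map survives an insert untouched, and only a logarithmic path of new nodes is allocated:

```haskell
import qualified Data.Map.Strict as Map

-- Persistent update: Map.insert returns a new map that shares all
-- unchanged subtrees with the old one. Only the O(log n) path from
-- the root down to the insertion point is freshly allocated.
main :: IO ()
main = do
  let m  = Map.fromList [(k, k) | k <- [1 .. 1000 :: Int]]
      m' = Map.insert 1001 1001 m
  -- The original map is unmodified (persistence), and the "copy"
  -- cost was a handful of tree nodes, not 1000 entries.
  print (Map.size m, Map.size m')
```

GHC's runtime never deep-copies anything here; both versions of the map coexist, sharing almost all of their nodes.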
Haskell's generational GC is also fast when most allocations die young, which is inherently true for web server handlers: collection cost is proportional to the live data you keep, not the garbage you produce.
Deforestation helps with that
A composition of a catamorphism (fold) with an anamorphism (unfold) — a hylomorphism — can eliminate a lot of the in-between allocations.
Basically, it looks like you're building a ton of intermediate structure and then consuming it, which means much of that in-between allocation can be fused away.
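The build-then-consume fusion above can be sketched with a hand-rolled `hylo` (the same shape `recursion-schemes` provides). The names `ListF`, `countDown`, and `mult` are made up for this example. Because each layer is folded as soon as it's unfolded, the intermediate "list" never exists as a whole:

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Base functor for cons lists: one layer of structure at a time.
data ListF e r = Nil | Cons e r deriving Functor

-- Hylomorphism: an anamorphism (coalg, which builds) fused with a
-- catamorphism (alg, which consumes). Each layer produced by coalg
-- is handed straight to alg, so the full intermediate structure is
-- never materialized.
hylo :: Functor f => (f b -> b) -> (a -> f a) -> a -> b
hylo alg coalg = alg . fmap (hylo alg coalg) . coalg

-- Coalgebra: unfold n into the layers of [n, n-1 .. 1].
countDown :: Int -> ListF Int Int
countDown 0 = Nil
countDown n = Cons n (n - 1)

-- Algebra: consume one layer by multiplying.
mult :: ListF Int Int -> Int
mult Nil        = 1
mult (Cons x r) = x * r

-- factorial as a hylomorphism: build-then-consume, nothing in between.
factorial :: Int -> Int
factorial = hylo mult countDown
```

Written naively as `product [1..n]`, you'd conceptually allocate a list and then fold it; the hylomorphism (and GHC's fusion rules for the list version) collapse that into a straight loop over layers.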
Interesting optimizations, and a little mind-blowing when you first see them fire.