What sort of environment do you need to be in to have to compose concurrency like this instead of relying on Go's native scaling?
The same sort of environment in which one uses abstractions like "functions" instead of relying on the language's native ability to run sequential instructions.
It's generally good for languages to provide relatively low-level functionality and let libraries build on top of it, because, as the programming language world has now learned many times over, the hardest code to change is the code in the language and its standard library. It isn't the job of the language itself to provide every possible useful iteration on the base primitives it provides.
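As one concrete illustration, golang.org/x/sync/errgroup lives outside the standard library yet composes bounded, error-propagating concurrency entirely out of the goroutines and channels the language already ships. A minimal sketch:

```go
package main

import (
	"context"
	"fmt"

	"golang.org/x/sync/errgroup"
)

func main() {
	// errgroup is an ordinary library: cancellation and error
	// propagation built in userland on top of native primitives.
	g, ctx := errgroup.WithContext(context.Background())
	g.SetLimit(4) // at most 4 tasks in flight at once

	for _, name := range []string{"a", "b", "c", "d", "e"} {
		name := name // capture for the closure (pre-Go 1.22)
		g.Go(func() error {
			if err := ctx.Err(); err != nil {
				return err // another task failed; stop early
			}
			fmt.Println("processing", name)
			return nil
		})
	}
	if err := g.Wait(); err != nil {
		fmt.Println("failed:", err)
	}
}
```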
Batching is a pattern I've had to build manually in the past to push large amounts of analytics data to a database. I'd queue individual events to be logged, map-reduce them into batches, and then run insert-on-duplicate-update queries against the database; otherwise the volume of incoming events was enough to saturate the connection pool and make the app inoperable.
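A minimal sketch of that shape, assuming a hypothetical Event type, an event_counts table, and MySQL-style upsert syntax (none of which are from the original): a worker drains a channel and flushes one multi-row upsert whenever the batch fills or a timer fires.

```go
package eventbatch

import (
	"database/sql"
	"log"
	"strings"
	"time"
)

// Event is a hypothetical analytics event; Name stands in for whatever
// columns make up the table's unique index.
type Event struct {
	Name string
}

// batchEvents drains events from ch and writes them in groups of up to
// maxBatch rows, or whenever flushEvery elapses, so the database sees a
// handful of large upserts instead of one round trip per event.
func batchEvents(db *sql.DB, ch <-chan Event, maxBatch int, flushEvery time.Duration) {
	batch := make([]Event, 0, maxBatch)
	ticker := time.NewTicker(flushEvery)
	defer ticker.Stop()

	flush := func() {
		if len(batch) == 0 {
			return
		}
		// Build one multi-row upsert (MySQL syntax; adapt for your DB).
		var sb strings.Builder
		sb.WriteString("INSERT INTO event_counts (name, count) VALUES ")
		args := make([]any, 0, len(batch))
		for i, ev := range batch {
			if i > 0 {
				sb.WriteByte(',')
			}
			sb.WriteString("(?, 1)")
			args = append(args, ev.Name)
		}
		sb.WriteString(" ON DUPLICATE KEY UPDATE count = count + 1")
		if _, err := db.Exec(sb.String(), args...); err != nil {
			log.Println("flush failed:", err) // real code would retry or spill to disk
		}
		batch = batch[:0]
	}

	for {
		select {
		case ev, ok := <-ch:
			if !ok {
				flush() // channel closed: write whatever is left
				return
			}
			batch = append(batch, ev)
			if len(batch) >= maxBatch {
				flush()
			}
		case <-ticker.C:
			flush()
		}
	}
}
```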
Even a further optimization produced significant performance gains: if an app instance knew it had already run the insert-on-duplicate-update for a specific unique index, which it tracked in a hash map, it would only run updates from then on to increment the count of occurrences of that event.
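A sketch of that optimization under the same assumptions (the seenKeys type and event_counts table are hypothetical): a guarded hash map records which keys this instance has already upserted, so later events for the same key take the cheaper UPDATE path.

```go
package eventbatch

import (
	"database/sql"
	"sync"
)

// seenKeys remembers which unique keys this app instance has already
// upserted, so later events for the same key skip the insert path and
// run a cheaper UPDATE that just bumps the occurrence count.
type seenKeys struct {
	mu   sync.Mutex
	seen map[string]bool
}

func newSeenKeys() *seenKeys {
	return &seenKeys{seen: make(map[string]bool)}
}

// Log records one occurrence of the named event.
func (s *seenKeys) Log(db *sql.DB, name string) error {
	s.mu.Lock()
	already := s.seen[name]
	s.seen[name] = true
	s.mu.Unlock()

	if already {
		// Row is known to exist: a plain UPDATE suffices.
		_, err := db.Exec(
			"UPDATE event_counts SET count = count + 1 WHERE name = ?", name)
		return err
	}
	// First sighting on this instance: upsert in case the row is missing.
	_, err := db.Exec(
		"INSERT INTO event_counts (name, count) VALUES (?, 1)"+
			" ON DUPLICATE KEY UPDATE count = count + 1", name)
	return err
}
```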