Love the idea, some weirdness though:
> Here's a practical example: finding the first occurrence of a specific string among 1000 large files hosted online. Downloading all files at once would consume too much memory, processing them sequentially would be too slow, and traditional concurrency patterns do not preserve the order of files, making it challenging to find the first match.
But won't this example process ALL items? It won't stop when a batch of 5 finds a match, will it?
It will stop. Otherwise the example wouldn't make sense. There's one important detail I haven't clarified well enough in that part of the README.
For proper pipeline termination, the context has to be cancelled. So it should have been like:
```go
func main() {
	ctx, cancel := context.WithCancel(context.Background())
	defer cancel()

	urls := rill.Generate(func(send func(string), sendErr func(error)) {
		for i := 0; i < 1000 && ctx.Err() == nil; i++ {
			send(fmt.Sprintf("https://example.com/file-%d.txt", i))
		}
	})

	// ...
}
```
One of the reasons I've omitted context cancellation in this and some other examples is that everything happens inside the main function. I'll probably add cancellations to avoid confusion.