There's a lot you can do with jq to transform complex JSON into an intermediate format designed as input for simpler Python processing.
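As a minimal sketch of that split: let jq flatten the nested JSON into JSON Lines, and keep the Python side a trivial loop over flat records. The jq filter and the `user`/`amount` field names here are made-up examples, not anything from a real dataset.

```python
import json

def totals_by_user(lines):
    """Sum 'amount' per 'user' from an iterable of JSON Lines strings.

    Upstream, jq does the slicing and flattening, e.g.:
        jq -c '.events[] | {user: .user.name, amount: .amount}' data.json \
            | python totals.py
    so each line arriving here is already a flat, simple object.
    """
    totals = {}
    for line in lines:
        record = json.loads(line)
        totals[record["user"]] = totals.get(record["user"], 0) + record["amount"]
    return totals
```

In a script you'd pass `sys.stdin` as `lines`; the Python stays stateless-looking because jq already did the structural work.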
My experience with Dagster has made me appreciative of plain, simple Python code and highly skeptical of frameworks.
The longer you stick with Python, the more elegant you'll find your code becomes. It takes time and trial and error to go from being annoyed with your code to being comfortable with it, and to stop looking for a tool or framework to be the silver bullet.
Thanks for sharing that perspective. I do think jq is great for slicing up JSON before handing it off to Python, especially if your transformations are primarily stateless and can be expressed as simple map/filter operations. And yes, a lot of frameworks can turn into "magic black boxes" that complicate what should be simple.
That said, my gripe is that when you move beyond "transform this data" and step into "transform, then push to some API, handle partial success, retries, etc.," the line-by-line or chunk-by-chunk side-effect logic in Python can get gnarly fast. That's the part I wish there were a standard, declarative approach for: something that doesn't require a big workflow orchestrator (like Dagster) but also doesn't devolve into piles of imperative glue code.
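To make the gripe concrete, here's roughly the glue you end up hand-writing: chunking, per-chunk retries with backoff, and partial-success bookkeeping. The `push` callable is hypothetical (stand-in for any API client), and the retry policy is just an illustrative default.

```python
import time

def push_in_chunks(records, push, chunk_size=100, max_retries=3, backoff=1.0):
    """Push records to some API in chunks, retrying each failed chunk.

    `push` is a hypothetical callable that sends one chunk and raises on
    failure. Chunks that still fail after max_retries are collected rather
    than aborting the whole run, so callers can handle partial success.
    Returns (succeeded, failed) lists of chunks.
    """
    succeeded, failed = [], []
    for start in range(0, len(records), chunk_size):
        chunk = records[start:start + chunk_size]
        for attempt in range(max_retries):
            try:
                push(chunk)
                succeeded.append(chunk)
                break
            except Exception:
                if attempt == max_retries - 1:
                    failed.append(chunk)  # give up on this chunk, keep going
                else:
                    time.sleep(backoff * 2 ** attempt)  # exponential backoff
    return succeeded, failed
```

None of this is hard individually, but every pipeline reinvents some variant of it, which is exactly the part I'd like a standard declarative answer for.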
I agree no tool is a silver bullet. In many cases, plain, well-structured Python is enough. But once you need concurrency, chunk-based error handling, or incremental streaming with side effects, you end up coding your own partial solution anyway, and that's where I start wishing for a more composable pattern in the standard Python ecosystem. Until then, jq + Python is definitely a solid approach for the simpler jobs!
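For the incremental-streaming case, the closest thing to a composable stdlib-only pattern I know is generators plus `itertools`: transform lazily, chunk lazily, and isolate side-effect failures per chunk. `transform` and `sink` below are hypothetical stand-ins; this is a sketch of the pattern, not a library.

```python
from itertools import islice

def chunked(iterable, size):
    """Lazily yield lists of up to `size` items from any iterable."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

def process_stream(records, transform, sink, chunk_size=50):
    """Transform records lazily and hand them to `sink` chunk by chunk.

    An error in one chunk's side effect is recorded rather than aborting
    the stream, so the rest of the data still flows through.
    Returns a list of (chunk, exception) pairs for the failed chunks.
    """
    errors = []
    for chunk in chunked(map(transform, records), chunk_size):
        try:
            sink(chunk)
        except Exception as exc:
            errors.append((chunk, exc))
    return errors
```

Because `map` and `chunked` are both lazy, nothing is materialized beyond one chunk at a time, which is what makes this workable on large streams.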