I feel like it's getting harder to tell what dagger is _actually_ for these days.
At first we'd hoped it could replace Jenkins - it provided an alternative way to run and debug CI pipelines, right on your machine! You could write in Go and just import what you needed. The dev direction feels more scattered now: trying to replace Docker, be a new shell(?), and, weirdly, be some kind of LangChain. Doing something different doesn't make it better. A new set of complicated CLI args is no better than what we started with (shell scripts, or Jenkinsfiles to integrate Docker builds). I'm a little bummed that the project has seemingly drifted (in my view) from the original mission.
If I may offer a different perspective: Dagger has always been a general-purpose composition engine, built on container tech. Its most successful use case is CI - specifically taking complex build and test environments, and making them more portable and reproducible. But we never claimed to replace Jenkins or any other CI platform, and we've always been very open with our community about our desire to expand the use of Dagger beyond CI. We also never claimed to replace Docker, or to "be a shell" (note that the title of this HN page doesn't reflect the title of our post in that regard).
Every feature we ship is carefully designed to stay consistent with the rest of the platform. For example, Dagger Shell is built on the same Dagger Engine that we've been steadily improving for years; it's just another client. Our goal is to build a platform that feels like Lego: each new piece makes all the other pieces more useful, because they can all be composed together into a consistent system.
It's a fair point - my opinions and use case are my own, and I didn't mean to imply or assume there were promises not kept. The Dagger team has been nothing but supportive, and I do think it has built a great community.
That said, in the early days it was definitely pitched for CI/CD - and this is how we've implemented it.
> What is it?
> Programmable: develop your CI/CD pipelines as code, in the same programming language as your application.

> Who is it for?
> A developer wishing your CI pipelines were code instead of YAML
https://github.com/dagger/dagger/blob/0620b658242fdf62c872c6...
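For reference, this is roughly what that pipelines-as-code model looks like with the Go SDK. A minimal sketch, assuming a standard Dagger Go SDK setup; exact method names may have shifted across SDK versions:

    // Minimal sketch of a test pipeline written with the Dagger Go SDK
    // (illustrative only; API details may differ between SDK versions).
    package main

    import (
        "context"
        "fmt"
        "os"

        "dagger.io/dagger"
    )

    func main() {
        ctx := context.Background()

        // Connect to the Dagger Engine (BuildKit under the hood).
        client, err := dagger.Connect(ctx, dagger.WithLogOutput(os.Stderr))
        if err != nil {
            panic(err)
        }
        defer client.Close()

        // Mount the project source into a Go container and run the tests.
        src := client.Host().Directory(".")
        out, err := client.Container().
            From("golang:1.22").
            WithDirectory("/src", src).
            WithWorkdir("/src").
            WithExec([]string{"go", "test", "./..."}).
            Stdout(ctx)
        if err != nil {
            panic(err)
        }
        fmt.Println(out)
    }

The whole pipeline is plain Go: importable, testable, and runnable locally against the same engine CI uses.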
Edit: This functionality/interaction with the dagger engine still exists today, and is what we rely on. The original comment is more of an observation on the new directions the project has taken since then.
Yes, it's a fair observation. In terms of use cases, we did focus exclusively on CI/CD, and only recently expanded our marketing to other use cases like AI agents. It's understandable that this expansion can be surprising; we're trying to explain it as clearly as possible, and it's a work in progress.
I just wanted to clarify that in terms of product design and engineering, there is unwavering focus and continuity. Everything we build is carefully designed to fit with the rest. We are emphatically not throwing unrelated products at the wall to see what sticks.
For example, I saw your comment elsewhere about the LLM type not belonging in the core. That's a legitimate concern that we debated ourselves. In the end we think there is a good design reason to make it core; we may be wrong, but the point is that we take those kinds of design decisions seriously and we take all use cases into account when we make them.
Why not just use "Dagger is a modern compositional framework, with applications from CI/CD to AI agents", so people perhaps better understand that it's meant as a "framework" rather than a "tool" for a specific use case?
Yes, this is what we're going for. At the moment the website says "cross-platform composition engine". But elsewhere in this thread, someone complained that it's too vague. It's hard to find a good balance between "too specific" and "too general".
I feel Nix is eating their lunch. Even though Nix as a language can be tough, it's a simpler solution than Dagger: it provides a shell and almost complete reproducibility, and isolation can be added through containers.

Working with dependencies to build source code using Nix is usually straightforward, but getting the right version of a binary that doesn't have the same love as the major languages (e.g. terraform, flux, etc.) is one thing Nix really needs to improve. Marcelo's package version search[1] is a way to discover the particular nixpkgs hash, but then you need to explicitly pull in that nixpkgs revision for that particular binary and do a lot of mental gymnastics in your Nix files. Nix could implement a syntax where you define the package versions as an attribute set, and it then internally discovers the nixpkgs hashes for those versions and installs them. Flox and DevBox follow this pattern, but I don't see why you need an external tool when this could be embedded in Nix (the CLI) itself.
[1] http://lazamar.github.io/download-specific-package-version-w...
IMO dagger isn't really comparable to nix.
We tried to fit dagger where we had Jenkins - not just for binary builds, but for everything around them: mounting secrets for git clones / NPM installs, integration tests, Terraform execution, build notifications, and logging.
Caching is great, and dagger/nix both have interesting options here, but that's more of a bonus and not the core value prop of a build orchestrator.
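For concreteness, the secret-mounting piece looks roughly like this with the Go SDK. A sketch assuming an already-connected *dagger.Client; the NPM_TOKEN env var, image tag, and function name are just placeholders:

    // Sketch: mount a secret for an npm install, given an existing client.
    // Requires the "context", "os", and "dagger.io/dagger" imports.
    func npmInstall(ctx context.Context, client *dagger.Client) error {
        token := client.SetSecret("npm-token", os.Getenv("NPM_TOKEN"))

        _, err := client.Container().
            From("node:20").
            WithDirectory("/app", client.Host().Directory(".")).
            WithWorkdir("/app").
            // Exposed to the command as an env var, kept out of image layers and logs.
            WithSecretVariable("NPM_TOKEN", token).
            WithExec([]string{"npm", "ci"}).
            Sync(ctx)
        return err
    }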
Yeah, we're turning a lot of our Jenkins Groovy into Dagger one-liners. It makes it super easy to "run" CI locally before pushing, but you still need the orchestrator / scheduler. Also really loving the Dagger Cloud UI for build logs over the Jenkins UI.
> We tried to fit dagger where we had Jenkins
"Tried", implying it didn't go well and isn't a fit for replacing Jenkins?
I've found the Nix ecosystem to be lacking: missing packages, wrongly built packages (from the official upstream), and out-of-date versions. Homebrew still outclasses Nix in this regard (quality over quantity).
After taking Nix for a spin, I cannot be bothered to learn another custom tool with a bespoke language when I already have containers for doing the same things.
For Dagger, I can choose from a number of languages I already know, and the Docker concepts map over nearly 1-to-1.
> I've found the Nix ecosystem to be lacking: missing packages, wrongly built packages (from the official upstream), and out-of-date versions. Homebrew still outclasses Nix in this regard (quality over quantity).
That's also the case with the Docker ecosystem. On top of that, you need to take into account the base image, versions, etc.
In the end, what I look for is being able to build my source code, with runtime dependencies and supporting tools that won't change over time, for the architecture that I need.
These days, the Nix vs. alternatives debates remind me of the Emacs vs. alternatives debates of yore.
I have not yet encountered actual Nix usage in my professional or open source work, so I don't see Nix as eating anyone's lunch.
I happen to know Nix, but it's from previous experience where I also learned to greatly dislike it. Anyway, I did an interview yesterday where the company was using Nix to build, and they were completely shocked I had Nix experience.
So yeah, nowhere near eating anyone's lunch, let alone the behemoth that is Docker.
This is an interesting take, especially since I've had exactly the opposite impression. Since you mention Homebrew, I assume you're talking about running Nix on macOS, which has been eye-opening for me. The ecosystem is much larger, I very rarely don't find what I'm looking for in nixpkgs, and if it's not there, it _definitely_ won't be in Homebrew. I only use Homebrew for casks nowadays (and even that is managed through Nix). There have also been times when I've tried to install something from Homebrew on a colleague's machine and it was a huge pain to set up correctly (having to manually install some dependencies), whereas everything worked without effort using Nix. The only downside IMHO is that package authors tend to update Homebrew first if anything, since that's considered the default. Nix will typically get the updated version a few days later, but that's fine for me.
Homebrew supports both macOS and Linux.
I've definitely encountered packages available in Homebrew but not nixpkgs, so the idea that "if it's not in nixpkgs, it won't be in Homebrew" is wrong. Another package I use has been out of date for months. Again, it's quality over quantity, and Nix lacks the quality that I deem more important for myself.
Someone else published a nixpkg for my project, but it is wrong. As an OSS maintainer, I have reduced the number of places I publish to - there are too many package managers these days for my limited time.
> I've found the Nix ecosystem to be lacking: missing packages, wrongly built packages
This is an interesting problem you've faced; their marketing claim is that they have the most comprehensive catalog of packages (and I'm inclined to believe it). I very rarely run into broken packages, and that's usually resolved by using a stable release for that specific package - and it's not like my usage of packages is lightweight (7292 lines of Nix config). That's on NixOS (and Silverblue, and Ubuntu) at least.
That's on NixOS, but on other distros there are issues. When I tried Nix last year, installing Alacritty, for example, required an OpenGL wrapper, and Neovim couldn't compile plugins without environment hackery.
Things just work with pacman or dnf.
Theoretically, you could always use the previous version / iterate on it.
I think this approach of seeing what sticks and trying out a lot of different things can be nice, but not in its current form.
Although I have installed Dagger to give it a try, I am just not sure how to make it work; the quickstart / hello world doesn't work.
No, I don't want to make an AI; I just want to see what you really are.
And so I do agree with your statement!
Yup - with the introduction of LLM as a core API, we're basically going to stop upgrading / using the project unless things start moving in a saner direction.
Why is that? It’s just an extra type that has no impact on any other types. It’s a tool in your toolbox that you can use when you need it and ignore otherwise.
Asked the opposite way - why is this a core API type of a BuildKit interface? Having API calls to various LLMs as core functionality of a build system is just straight-up weird. It doesn't fit, it confuses developers on my team when they try to read/contribute to our shared build libraries, and it worries me about the direction the project is going.
It seems strange that, especially with the push for modules, this was integrated as a core type. It has nothing to do with BuildKit or builds.
Dagger isn't just a build system. I use it for all sorts of things; one example would be translating Go types to CUE, or the other way around. This happens in a container.
With LLMs becoming core to the development workflow, it kinda makes sense to have a primitive, since LLM I/O is a bit different from other functions. I haven't tried it, but this thread had me go look at the details, and now it's on my roadmap: probably make something of an agentic workflow that uses a tool in a container and see how it works out. I'm still skeptical that Dagger is a good tool or ecosystem for LLM work without integrations for all the extra stuff you need around LLMs and agents.
Dagger aims to be more than a BuildKit interface and more than just a build system. Many people have been using it outside of CI for years.
But I can appreciate your perspective. Thank you for sharing your point of view.