My pet peeve (in general, not specific to UV, which I genuinely appreciate) is using comment sections for controlling code execution.
Using comments for linters and developer notes is perfectly acceptable. However, for configuration or execution-related data, a far superior pattern would be something like:
UV_ENV = {
"dependencies": { "requests": "2.32.3", "pandas": "2.2.3" }
}
This approach has clear advantages:
- It's valid Python syntax.
- It utilizes standard, easily parsable data structures rather than ad-hoc comment parsing, which makes creation and validation smooth.
- Crucially, it adheres to a core principle: if you remove all comments from your code, it should still execute identically.
I agree, but I would go a step further.
You’re using a magic constant that doesn’t do anything at runtime. It’s only there to be parsed by static analysis. In your case that’s uv doing the parsing but another tool might delete it as unused code. In the sense that it’s one thing pretending to be another, for me, it’s in the same category as a magic comment.
Instead, why not make a call to uv telling it what to do? For example:
import uv

uv.exec(
    dependencies=["clown"],
    python=">=3.10",
)

from clown import nose
The first call can be with any old Python runtime capable of locating this hypothetical uv package. The uv package sets up the venv and Python runtime and re-exec(3)s with some kind of flag, say, an environment variable. In the second runtime uv.exec is a noop because it detects the flag.
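For illustration, a minimal sketch of what that hypothetical uv.exec could do internally; everything here is made up (the sentinel variable, the _prepare_environment helper, and of course this uv module itself, which has no such API today):

import os
import sys

_FLAG = "UV_EXEC_BOOTSTRAPPED"  # made-up sentinel environment variable

def _prepare_environment(dependencies, python):
    # Hypothetical: resolve an interpreter matching `python`, build a venv with
    # `dependencies` installed, and return the path to that venv's interpreter.
    raise NotImplementedError

def exec(dependencies=None, python=None):
    if os.environ.get(_FLAG):
        return  # second pass: already inside the prepared environment, so no-op

    # First pass: set up the environment, then replace this process with the
    # same script running under the prepared interpreter, flag set.
    interpreter = _prepare_environment(dependencies or [], python)
    os.execve(interpreter, [interpreter, *sys.argv], {**os.environ, _FLAG: "1"})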
There's justification for that:
https://peps.python.org/pep-0723/#why-not-use-possibly-restr...
Thanks for the link.
My counterpoints to the PEP’s arguments are (1) we’re about to run Python so we presumably have a Python parser on hand anyway; and (2) for the foreseeable future it is going to be capable of parsing all previous versions of Python.
It’s a bit fast and loose, though. I can see that it’s helpful for long-term stability to have two completely separate languages for dependencies versus the actual code, with the former being far more reduced and conservative than the latter.
If you use Python 4.98 triple walrus operators to say requires_version:::=">=4.98", it would definitely be annoying for any version prior to that to not even be able to parse the requirements, let alone try to meet them.
But that means that instead of uv running Python, Python now runs uv, which (I would imagine) has all kinds of implications from a development perspective.
I agree that your proposed way of doing things would theoretically be among the conceptually cleanest, but on the other hand, in all kinds of scripts the shebang has been sort of a comment with big implications as well, so I am not sure being dogmatic is worth it here.
I don't agree with the argumentation.
It might be specified that it needs to be proper JSON. And proper JSON is much more maintainable (and extensible) than impromptu syntax (which starts out manageable, but step by step descends into parsing hell).
One of uv's justifications is that it isn't dependent on Python, and so there's no circular bootstrap problem. Things are now at the point with uv where you tell the person you're sharing a script with:
1. Get the install command from the uv site and run it (if they don't already have it installed).
2. Run the script with uv.
It literally cannot get simpler than that. Making uv an importable package means assuming Python is present or easily installed on every system, and if that were the case, uv wouldn't be becoming a thing in the first place.
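Concretely, assuming the installer one-liner currently published on the uv site (the script name is just a placeholder):

curl -LsSf https://astral.sh/uv/install.sh | sh   # step 1, only if uv isn't installed yet
uv run ./shared_script.py                         # step 2, uv reads the inline metadata and does the rest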
That makes your code depend on UV where otherwise it wouldn’t.
Remember, the specification for indicating dependencies in a comment in a script is a PEP (723), and it's tool-agnostic.
" In your case that’s uv doing the parsing but another tool might delete it as unused code."
That's probably the goal. It's only there for one tool. If it's not used, we want it to have no impact on the running app. Like comments.
Different Python versions have different syntax grammars, so if the rest of your file has new syntax, an older Python might not be able to execute even the first few lines.
For example if you run this on python3.6:
print("hello")
match 123:
    case _: pass
you won't even get a "hello".

What is the problem with that? I see no reason to expect that would work.
The whole point of writing a "self-contained" script is that it should run anywhere. uv bundles its own cpython runtime(s) for this purpose, but relying on script execution prior to invocation of uv breaks this.
The trick in the featured article would allow me to drop a script written in modern python syntax on my outdated ubuntu LTS box and have it "just work", while GP's suggestion would not.
> Instead, why not make a call to uv telling it what to do?
One important aspect to remember is that this isn't intended to be a uv-specific feature. It's a (proposed) Python standard feature that other Python package managers will implement in the future. So whatever solution they come up with has to work with any standards-compliant package manager, not just uv.
This isn't a uv invention, uv is only using the standard PEP 723 like other tools.
That’s fair. I agree with other replies though that parsing and evaluating imperative code is a lot tougher and less flexible than adhering to the principle of least power and making it declarative data.
It’s also worth noting that using comments is exactly how the shebang line works in the first place. It’s just so well-ingrained after 45 years that people don’t notice that it’s a shell comment.
> if you remove all comments from your code, it should still execute identically
It still -does- execute identically. Provided you install the same dependencies.
I don't see this as changing the semantics of the code itself, rather just changing the environment in which the code runs. In that respect it is no different from a `#!/bin/bash` comment at the top of a shell script.
I completely agree. Hope something like this is eventually standardized.
Problem is that uv probably does not want to execute anything to find out dependencies, so it would have to be a very restrictive subset of python syntax.
The fact that this is needed at all of course highlights a weakness in the language. The import statements themselves should be able to convey all information about dependencies.
UV just implemented PEP 723[1], which is now PyPA Inline Script Metadata[2]. That's no longer provisional, it is standardized already! It's unfortunate that Python didn't have some non-comment way to provide this functionality.
[1] https://peps.python.org/pep-0723/
[2] https://packaging.python.org/en/latest/specifications/inline...
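For reference, combining that standard with a uv shebang like the one discussed elsewhere in the thread, the top of such a script looks roughly like this (the -S form assumes GNU env; requests is just a placeholder dependency):

#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
import requests

print(requests.get("https://example.org").status_code)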
> The fact that this is needed at all of course highlights a weakness in the language. The import statements themselves should be able to convey all information about dependencies
What languages convey the version of the dependencies in a script’s import statements?
This has come up a LOT on HN in the past few months, some other recent examples:
https://news.ycombinator.com/item?id=43500124
https://news.ycombinator.com/item?id=42463975
I like uv and all, but I take exception to the "self-contained" claim in two regards:
1) The script requires uv to already be installed. Arguably you could make it a shell script that checks if uv is already installed and then installs it via curlpipe if not (see the sketch after these two points)... but that's quite a bit of extra boilerplate and the curlpipe pattern is already pretty gross on its own.
2) Auto-creating a venv somewhere in your home directory is not really self-contained. If you run the script as a one-off and then delete it, that venv is still there, taking up space. I can't find any assertion in the uv docs that these temporary virtual environments are ever automatically cleaned up.
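The wrapper alluded to in point 1 would be something like this sketch (installer URL as published by uv; the script name is a placeholder):

#!/bin/sh
# install uv via the published curl|sh installer if it is missing, then hand off
command -v uv >/dev/null 2>&1 || curl -LsSf https://astral.sh/uv/install.sh | sh
exec uv run --script ./actual_script.py "$@"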
Right, you need to have uv installed, and if you don't, you'll probably have to install it manually or through `curl | sh`. I think this is a valid complaint. Something to consider is that it will become less of an issue as package managers include uv in their repositories. For example, uv is already available in Alpine Linux and Homebrew: https://repology.org/project/uv/versions.
Another thing is that inline script metadata is a Python standard. When there is no uv on the system and uv isn't packaged but you have the right version of Python for the script, you can run the script with pipx: https://pipx.pypa.io/stable/examples/#pipx-run-examples. pipx is much more widely packaged: https://repology.org/project/pipx/versions.
curl | sh is an abhorrent practice and should never be used.
The alternative is to wait for the 10 different distros to all package your program and then update it once in a blue moon.
No, the alternative is to package it yourself and offer it with a signing key. If you make a .deb and .rpm, you’ve covered a large majority of end users.
That sounds worse than the status quo, a lot of developers use Arch Linux, NixOS, other uncommon (to non-devs) distros.
Why is signing key with .deb/.rpm better than `curl | sh` from a HTTPS link on a domain owned by the author? .deb/.rpm also contain arbitrary shell commands.
If the shell script happens to have key verification built in to it, then not much from the perspective of provenance verification, but that’s rare IME. Also, using the OS’s package manager means that you can trivially uninstall it.
I tried to hack together a shebang with docker+uv to solve this kind of problem, and it sort of does because that’s maybe more common than uv for a random dev machine (especially since tfa says it’s a gong project).
This works but doesn’t cache anything so the download for every run is awkward. This can probably be fixed with a volume tho?
Something like this: https://hugojosefson.github.io/docker-shebang/#python
You usually have to install something before you can run a program on your computer, so installing uv doesn't seem that bad to me. I still wouldn't call this self-contained because when you run the program it downloads who knows what from the internet!
To me, fully self-contained is something more like an AppImage
Agree 100%. Using something like py2exe creates a self contained "python script". This comes with a lot of problems for the developer but minimum problems for the user.
A nitpick: uv’s package deduplication means virtualenvs do not take up space unless they have unique dependencies.
> I can't find any assertion in the uv docs that these temporary virtual environments are ever automatically cleaned up.
That’s a good point. I wonder if at least they are reused when you run the script several times.
I took a closer look; uv installs the inline required packages in its cache directory `~/.cache/uv` (if they are not already there). So the packages will probably exist until the cache is cleared, e.g. with `uv cache clean`.
It's not that the inline requirements make a new `.venv` directory or something; uv seems to link the packages from a central location and reuse them if already there.
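For reference, the cache can be inspected and emptied with uv's cache subcommands (exact behaviour may vary by version):

uv cache dir     # print the cache directory uv is using
uv cache clean   # remove everything uv has cached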
My understanding is that uv creates a hash of the script name, python version and dependencies when creating the venv. So if none of those change, it will reuse the venv.
As mentioned in other comments, the "self-contained" claim depends on `uv` being installed.
For those who want a really self-contained Python script, I'd like to point out the Nuitka compiler [0]. I've been using it in production for my gRPC services with no issues whatsoever - just "nuitka --onefile run.py" and that's it. It Just Werks. And since it's a compiler, the resulting binary is even faster than the original Python program would be if it were bundled via Pyinstaller.
The author's GitHub page [1] contains the following text:
Other than software development, my passion would be no other. It's my life mission to create the best Python Compiler I can possibly do or die trying, ... of old age.
[0] https://nuitka.net/

We do the same with Nix; the shebang line looks like this:
#!/usr/bin/env nix-shell
#! nix-shell -i python3 -p "python312.withPackages (pkgs: [ pkgs.boto3 pkgs.click ])"
With this, the only requirement is Nix on the system; you don't even need Python to be installed!

While that is true, there are still lots of PyPI packages not yet packaged with nixpkgs, so this is not as universal an approach as uv.
> you don't even need Python to be installed!
Note that this is exactly the case in TFA - uv takes care of installing Python ad-hoc.
Yup, and you can apply the same technique to any language. The obvious example is bash with all the dependencies specified, but I’ve also hacked up quick single file rust scripts using nix shebangs.
Having Nix installed is a much stronger requirement than having uv.
How do you do the same thing with `nix shell` (the flake-based command) instead of `nix-shell`?
I really like this pattern, but unfortunately I haven't been able to get it to work with my LSP (pyright, in Helix), even when running my editor via uv (`uv run hx script.py`).
I could always do `uv run --with whatever-it-is-i-need hx script.py`, but that's starting to get redundant.
I have my own ugly uve script
$ cat ~/.local/bin/uve
#!/bin/bash
temp=$(mktemp)
uv export --script "$1" --no-hashes > "$temp"
uv run --with-requirements "$temp" vim "$1"
unlink "$temp"
Hope editors can support `uv python find --script` soon.

FYI, have a look at "trap .. EXIT" to defer cleanups like your unlink. It's neat because it will run even if the script is interrupted or fails before the unlink.
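A sketch of that suggestion applied to the uve script above (same commands, just with the cleanup registered up front):

#!/bin/bash
temp=$(mktemp)
trap 'rm -f "$temp"' EXIT   # cleanup runs even if the script fails or is interrupted
uv export --script "$1" --no-hashes > "$temp"
uv run --with-requirements "$temp" vim "$1"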
Seems analogous to bundler/inline [1] on the Ruby side of the world. Happy to see something similar in Python— it’s really handy!
[1] https://bundler.io/guides/bundler_in_a_single_file_ruby_scri...
This looks quite useful! Is uv a safer choice for deploying a Python-based project long term? I'm referring to the Anaconda rug pull: I was using it for managing dependencies about 5 years ago, but then they changed some rules so that any of my clients that are organizations with over 200 employees are no longer free to use Anaconda; they must pay for a commercial license.
uv is licensed under either MIT or Apache-2.0[0]
They can always stop developing it or fork to a different license, and all future work belongs under that license, but you can't backdate licenses, so what exists is guaranteed open source. If you're super worried, you can create a fork and just keep it in sync.
But this is essentially true about any other OSS project so I wouldn't be concerned. As far as I'm aware, conda was never open sourced and had always distributed binaries.
[0] https://github.com/astral-sh/uv?tab=readme-ov-file#license
Just because something is open source doesn't mean it will be maintained. With uv there is the slight peculiarity that it's written in Rust rather than Python. So you need to count on there being an active group of Rust devs who care about Python.
Because it uses PyPI I'm happy to use it as a package manager and dev tool. The worst that can happen is I have to switch back to pip etc. But I wouldn't use it as a runtime dependency of a package. Use pyinstaller for that.
The use case for this kind of trick I think is developer utility scripts in repos. I wouldn't want to tie any of my personal utils to uv. If it needs dependencies I'll just make a package, which is dead easy now.
The really great thing is that the inline metadata format is an accepted PEP spec, so even if uv goes down the tubes there will be other tools that can be dropped in to support it.
I think anaconda's rug pull was on the repository (you can still use packages from conda-forge for free).
uv just uses pypi, so it would be just a question of changing from uv to pip, poetry or whatever, all packages would still be coming from the same place.
As I understand it, relicensing is possible when a project has a Contributor License Agreement (CLA), which says that you're signing over the copyright of your contribution to the project's owners. (Who will eventually be bought out by the worst rich person you can think of - yes, him.)
I peeked in uv's contributing guide and issues and didn't see any CLA. In PyTorch the CLA was mentioned at the top of the contributing guide.
Although, there should have been a community fork of the last FOSS version of Anaconda. That's what happened with Redis, and Redis uses a CLA: https://github.com/redis/redis/blob/unstable/CONTRIBUTING.md...
Don't ever sign a CLA, kids. Hell, only contribute to copyleft projects. We get paid too much to work for free.
Not having a CLA prevents relicensing, but open source licenses aren't revocable anyway.
That's not been tested much in the courts. Recent rulings suggest OSS licenses at least have consideration and thus can't be blanket-terminated without some justification, but the USC provides that justification (with some onerous requirements: 2-10 years of notice, it has to happen in a certain time window, ...), even for licenses stating irrevocability.
This seems like a good packaging alternative to containerizing for smaller utilities. Now to convince all my coworkers to install uv..
I love it, it feels like an extension of these other HN posts:
- Uv's killer feature is making ad-hoc environments easy (valatka.dev): https://news.ycombinator.com/item?id=42676432
- Using uv as your shebang line (akrabat.com): https://news.ycombinator.com/item?id=42855258
I learned to love uv because of this use case, but I still find it against the Zen of Python that an official (and, dare I say, extremely useful!) PEP is not supported by the official Python tools.
This is the first time that Python didn't come with "batteries included" from my POV.
Now I also have two Python dependency managers in my system. I know there are volumes to talk about Python dependency management but all these years, as long as a project had a requirements.txt, I managed to stick to vanilla pip+venv.
That's been a bit of a trend for the Python build specs. Pretty sure pyproject.toml predates the tomllib library. So for a few versions you had to specify your module's metadata in a language that Python couldn't read natively.
Which is worse than just having a default way for including metadata that's not used. That's what makes it metadata after all. Otherwise it would just be Python syntax
Has anyone gotten this to work on Windows? I wanted to use this trick for some tooling for a game mod I'm working on but couldn't get the shebang trick to work.
The regular CPython installer on Windows installs the py launcher and associates it with .py files. The py launcher supports shebang lines.
This was covered in a blog post about this same topic that was posted here a few days ago. According to that you have to omit the -S: https://thisdavej.com/share-python-scripts-like-a-pro-uv-and...
https://news.ycombinator.com/item?id=43500124
I haven't tried it myself, I simply changed the file association so all .py files are opened with uv run as standard.
https://docs.python.org/3/using/windows.html#python-launcher...
Interesting. The workflow I've been using skips the CPython installer and only uses uv.
Windows doesn't support shebang lines as you probably know, but if you associate uv with .py files you'll get the same result.
I think it should be something like this:
ftype Python.File=C:\Path\to\uv.exe run %L %*
If you don't use the CPython installer the Python.File file type might not be defined, so you might need to set that with `assoc` first:
assoc .py=Python.File
I use this frequently on Windows and Linux. These are the steps I take:
$> uv init --script <script_name>.py
$> uv add --script <script_name>.py <pkg1> <pkg2> ...
$> uv add --script <script_name>.py --dev <dev_pkg1> <dev_pkg2> ...
$> uv run <script_name>.py
Hope this helps :)
Unfortunately, `uv add --dev` doesn't work with `--script`:
> uv -V
uv 0.6.10
> uv add --script foo.py --dev ruff
error: the argument '--script <SCRIPT>' cannot be used with '--dev'
Usage: uv add --script <SCRIPT> --link-mode <LINK_MODE> <PACKAGES|--requirements <REQUIREMENTS>>
For more information, try '--help'.
There is currently no mention of `uv add --script foo.py --dev ...` in https://docs.astral.sh/uv/guides/scripts/.
Inline script metadata in Python doesn't standardize development dependencies. I wrote a recent comment about how I develop scripts with `pyproject.toml` to have a regular development environment: https://news.ycombinator.com/item?id=43503171.
Interesting about the --dev flag not working with scripts. I wrote this from memory and feel like I've installed dev dependencies for a script, but it must be a hallucination because I just tried and it doesn't work. Thanks for the correction.
Aside: angle brackets should be avoided in shell examples; one mistaken enter and work can be destroyed.
I haven't done dev on windows in many years, but IIRC windows doesn't have shebang support.
But it does support registering extension handlers[1], so if you name your scripts with say .pyuv and register "uv run --script %1", or whatever it would take to run uv, as the handler of .pyuv files, it should work. Unless uv does something funky that is.
You could do this during an installation step, for example.
[1]: https://learn.microsoft.com/en-us/windows/win32/shell/fa-fil...
I'm not sure it's useful, but someone posted how to do a similar thing for Racket with PoweScript https://onor.io/2025/01/more-scripting-with-racket.html
Overall, and admittedly from a bit of a distance, uv run feels like a reinvention of Zero Install, but for only Python.
I also wondered why virtual environments were invented for Python when general environment managers (like Modules) already existed.
These packaging and environment problems have never been specific to Python
uv has little in common with 0install, which has its origins in RISC OS's application directories.
uv is an attempt to fix the fragmented Python development environment tooling story.
I mean yes, you are correct. UV does a lot of different things, but this particular "self-contained app" feature is a lot like zero install. You run your app, its dependencies are automatically downloaded and cached for other uv/zero install apps to use, and it's all transparent and easy.
In this case, it's nothing specific to uv though: there's a PEP outlining how this stuff is declared, and uv is just one of the tools that happens to support the format. I wouldn't be at all surprised if pipx also supports it.
So how does this guarantee that it will never raise some libc error, or similar? Unfortunately I have become sceptical about "self contained" distribution methods.
This isn't self contained in that sense, it's deferring dependency management to runtime, with uv apparently doing that reliably enough for the use case.
Hey I have done the same for Swift scripts! (Well I have rewritten what Homebrew’s creator did some time ago, but does not maintain anymore, to be precise.)
help get this supported in vscode by liking the issue here: https://github.com/microsoft/vscode-python/issues/24916
Even better: self-contained uv run python scripts generated by AI: https://everything.intellectronica.net/p/the-little-scripter
When Python has something akin to Tcl's starkits, then it'll be cooking with gas -- I might even use it again. Py2exe came close, but was not cross-platform.
Hmmm this seems bad
That code is bad for several reasons including not catching+handling exceptions (and possibly retrying), and accessing the JSON properties w/o get()
The overhead of re-installing stuff and setting up a header seems very unnecessary for running a simple script.
If this is about sending some Python for somebody else to run easily - the recipient should always check the code. You should never run arbitrary code. For example, there have been hacks performed using YAML loader (crypto exchange).
For dependencies, use the standard pyproject.toml
I do not really understand how this is self contained, when you have to install additional software to run it.
My approach is to use Python's built-in venv: https://gist.github.com/Szpadel/43794d606d9924e7fea3e63fb800...
That way you can run scripts with external packages with only a basic Python installation.
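For reference, a minimal sketch of that general idea (not the linked gist): on first run the script builds a venv next to itself, installs its requirements with pip, and re-executes itself with the venv's interpreter. The directory name and the dependency list are placeholders.

import os
import subprocess
import sys
import venv
from pathlib import Path

REQUIREMENTS = ["requests"]  # placeholder dependency list
VENV_DIR = Path(__file__).resolve().parent / ".script-venv"

def in_venv() -> bool:
    return sys.prefix != sys.base_prefix

if not in_venv():
    bin_dir = VENV_DIR / ("Scripts" if os.name == "nt" else "bin")
    python = bin_dir / ("python.exe" if os.name == "nt" else "python")
    if not VENV_DIR.exists():
        venv.create(VENV_DIR, with_pip=True)
        subprocess.check_call([str(python), "-m", "pip", "install", *REQUIREMENTS])
    os.execv(str(python), [str(python), *sys.argv])  # re-exec inside the venv

import requests  # safe to import: we only get here inside the venv
print(requests.__version__)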
Your version needs Python. uv needs only uv, as it can manage Python installations by itself.
It's also a lot easier to install uv than to manage Python installations (there's a reason pyenv and the like exist).
That's a fair point. It's maybe a different use case. For me, the main goal was to be able to run project scripts on macOS and Linux out of the box, and both have some Python 3 version already available.
But if you need a specific Python version, it probably isn't the best way.
So now I have to make sure uv is installed instead of Python. Why is this better? And Python is available on almost any system.
The whole point of uv is to solve the nightmare that is running a script with the right version of python with the right dependencies
"Just use the system python" gets you right back to the start (oh no! It didn't parse because it used python 3.11 features and I'm still on 3.5)
> And python is available on almost any system
This is actually the root of all Python problems: because Python believes it has a right to be a core part of the operating system, all the package design choices treat their installation as if it were the only thing running on the machine. The reality of Python packages is that they all rely on very specific versioning, from the interpreter down to the relationships between packages, so the idea of having a canonical version of pytorch that all your projects run on just doesn't exist.
Python is not a globally available thing. I was really surprised when I set up Kubuntu 24.04 and found it missing. And now you can do `uv run --python=<version> ...` and have all that automatically handled on a case-by-case basis.
uv can seamlessly get the correct Python binary on demand. It would be a pain to get, say, Python 3.9 or Python 3.14 on my system otherwise. Making sure uv is installed actually seems to be less of an issue, but of course this varies.
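For example (version numbers purely illustrative):

uv python install 3.9            # fetch a managed interpreter once
uv run --python 3.9 script.py    # or let uv resolve it per invocation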