Does the third leading slash do something?

No, it just felt a tad cleaner to have the comment slashes separate from the path leading slash.

That's explicit support rather than using the same // hack. The language is specifically ignoring a shebang even though it doesn't match the usual comment syntax.

True, so less fun, but more practical in that it's supported.

> Oh come on, it's easy: (satire)

If you've checked out a repo or unpacked a tarball without documentation, sure.

If you got it from PyPI or the documentation indicates you can do so, then you just use your tooling of choice.

Also, the pip+venv approach works fine with pyproject.toml, which was designed for interoperability. Poetry is oriented towards your own development, not working with someone else's project.
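
For instance, here's a minimal pyproject.toml (the name and dependency are placeholders) that pip, uv, and any other standards-aware tool can all read:

  [project]
  name = "example"
  version = "0.1.0"
  dependencies = ["requests"]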

Speaking of which, a project that has a Pipfile, environment.yml, uv.lock, etc. but doesn't have a pyproject.toml is not being seriously distributed. If this is something internal to your team, then you should already know what to do anyway.


Any time you have to resort to the no true scotsman fallacy you are telling me everything I need to know to run in the other direction.

It is not "no true scotsman" to point out that tons of projects are put on GitHub etc. without caring about whether others will actually be able to download and "install" and use the code locally, and that it's unreasonable to expect ecosystems to handle those cases by magic. To the extent that a Python ecosystem exists and people understand development within that ecosystem, the expectations for packaging are clear and documented and standard.

Acting as if these projects using whatever custom tool (and its associated config, by which the tool can be inferred), where that tool often isn't even advertised as an end-user package installer, are legitimate distributions is dishonest; and acting as if it reflects poorly on Python that this is possible, far more so. Nothing prevents anyone from creating a competitor to npm or Cargo etc.


Yes, many things can use inline script metadata.

But a script only has one shebang.


Perhaps a case for standardizing on an executable name like `python-script-runner` that will invoke uv, pipx, etc. as available and preferred by the user. Scripts with inline metadata can put it in the shebang line.
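
A sketch of what such a script could look like (`python-script-runner` being the hypothetical standardized name; the dependency is just a placeholder):

  #!/usr/bin/env python-script-runner
  # /// script
  # requires-python = ">=3.11"
  # dependencies = ["requests"]
  # ///
  import requests

  print(requests.get("https://example.com").status_code)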

I see it has been proposed: https://discuss.python.org/t/standardized-shebang-for-pep-72....


I get the impression that others didn't really understand your / the OP's idea there. You mean that the user should locally configure the machine to ensure that the standardized name points at something that can solve the problem, and then accepts the quirks of that choice, yes?

A lot of people seem to describe a PEP 723 use case where the recipient maybe doesn't even know what Python is (or how to check for a compatible version), but could be instructed to install uv and then copy and run the script. This idea would definitely add friction to that use case. But I think in those cases you really want to package a standalone (using PyInstaller, pex, Briefcase or any of countless other options) anyway.


> You mean that the user should locally configure the machine to ensure that the standardized name points at something that can solve the problem, and then accepts the quirks of that choice, yes?

I was thinking that until I read the forum thread and Stephen Rosen's comments. Now I'm thinking the most useful meta-runner would just try popular runners in order.

I have put up a prototype at https://github.com/dbohdan/python-script-runner.
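
The core of it is roughly this (a minimal sketch, not the prototype itself; which runners to try, and that both accept a script path after `run`, are assumptions about the user's setup):

  #!/usr/bin/env python3
  """Meta-runner: delegate a PEP 723 script to the first available runner."""
  import shutil
  import subprocess
  import sys

  # Candidate runners, in order of preference.
  CANDIDATES = [["uv", "run"], ["pipx", "run"]]

  for cmd in CANDIDATES:
      if shutil.which(cmd[0]):  # first one found on PATH wins
          sys.exit(subprocess.run(cmd + sys.argv[1:]).returncode)
  sys.exit("no PEP 723-capable runner found on PATH")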


Neat. Of course it doesn't have much value unless it's accepted as a standard and ships with Python ;) But I agree with your reasoning. Might be worth reviving that thread to talk about it.

> I just hate trying to install other people’s scripts.

This notion is still strange to me. Just... incompatible with how I understand the term "script", I guess.


You don't understand the concept of people running software written by other people?

One of my biggest problems with Python happens to be caused by the fact that a lot of FreeCAD is written in Python, and python3 writes __pycache__ directories everywhere a script executes (which means everywhere, including all over the inside of all my git repos, so I have to add __pycache__ to every .gitignore), and the env variable that is supposed to disable that STUPID behavior has no effect because FreeCAD is an AppImage and my env variable doesn't propagate to the environment FreeCAD sets up for itself.

That is me "trying to install other people's scripts". The other people's script is just a little old thing called FreeCAD, no big.


> That is me "trying to install other people's scripts". The other people's script is just a little old thing called FreeCAD, no big.

What I don't understand is why you call it a "script".

> and python3 writes __pycache__ directories everywhere a script executes (which means everywhere, including all over the inside of all my git repos, so I have to add __pycache__ to every .gitignore)

You're expected to do that anyway; it's part of the standard "Python project" .gitignore files offered by many sources (including GitHub).

But you mean that the repo contains plugins that FreeCAD will import? Because otherwise I can't fathom why it's executing .py files that are within your repo.

Anyway, this seems like a very tangential rant. And this is essentially the same thing as Java producing .class files; I can't say I run into a lot of people who are this bothered by it.


This is 99% of the complaints in these threads. "I had this very specific problem and I refuse to handle it by using best practice, and I have not used Python for anything else, but I have very strong opinions."

I've had many problems with many Python scripts over many years. Problems that I did not have with scripts from other ecosystems. And that's just sticking to scripts, not languages in general.

A random sh script from 40 years ago usually works, or works with only the tiniest adjustment, on any current unix-like system.

A random py script from 6 months ago has a good chance of not working even on the same system, let alone another system of the same platform, let alone a system of a different platform.

Now please, by all means, next assume that I am probably only complaining about the 2->3 transition and about nothing that has actually applied in the last 15 years.


>A random py script from 6 months ago has a good chance of not working even on the same system

This just isn't true, and it's not something I have experienced over many years of Python programming.

Maybe your problems with Python scripts (is it scripts or programs?) are an issue with the code?


I wouldn't use "script" to describe FreeCAD. Regardless, this problem is much more with FreeCAD than with Python.

> I have to add __pycache__ to every .gitignore

I just add that, once, in my global gitignore.
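
That is, something like this (assuming git's default location for the global excludes file; `core.excludesFile` can point it elsewhere):

  # ~/.config/git/ignore
  __pycache__/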


> All I had to do was embed a primitive AI generated TOML parser in it.

The standard recommendation for this is `tomli`, which became the basis of the standard library `tomllib` in 3.11.
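
The usual compatibility shim looks like this (a sketch; `pyproject.toml` stands in for whatever file is being parsed):

  import sys

  if sys.version_info >= (3, 11):
      import tomllib  # standard library since 3.11
  else:
      import tomli as tomllib  # third-party backport: pip install tomli

  with open("pyproject.toml", "rb") as f:  # both require binary mode
      data = tomllib.load(f)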


Hard links, in fact. It's not hard to do: just (the Rust equivalent of) `os.link` in place of `shutil.copy`, pretty much. The actually clever part is that the package cache contains files that can be used this way, instead of just having wheels and unpacking them from scratch each time.

For pip to do this, first it would have to organize its cache in a sensible manner, such that it could work as an actual download cache. Currently it is an HTTP cache (except for locally-built wheels), where it uses a vendored third-party library to simulate the connection to files.pythonhosted.org (in the common PyPI case). But it still needs to connect to pypi.org to figure out the URI that the third-party library will simulate accessing.
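
In Python terms, the trick is roughly this (a sketch; the fallback matters because hard links only work within a single filesystem):

  import os
  import shutil

  def install_file(cached: str, dest: str) -> None:
      """Link a cached file into an environment instead of copying it."""
      try:
          os.link(cached, dest)  # instant; no extra disk space used
      except OSError:
          shutil.copy2(cached, dest)  # e.g. cache and env on different devices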


> you might find someone saying to use uv, but also potentially venv, poetry or hatch.

This is sort of like saying "You might find someone saying to drive a Ford, but also potentially internal combustion engine, Nissan or Hyundai".


Only to those already steeped in Python. To an outsider they're all equally arbitrary non-descriptive words and there's not even obvious proper noun capitalization to tell apart a component from a tool brand.

It's always rather irritating to me that people make these complaints without trying to understand any of the under-the-hood stuff, because the ultimate conclusion is that it's somehow a bad thing that, on a FOSS project, multiple people tried to solve a problem concurrently.

That’s especially ironic given that, inside Python, part of the philosophy is “There should be one-- and preferably only one --obvious way to do it.” So why does Python’s external environment seem more like something that escaped from a Perl zoo?

Because a lot of people have no clue about packaging or how to write compatible software, software that is actually installable as a normal application. I suspect a lot of them learned stuff in the node.js or Ruby ecosystem first, and this is the result. Same as requiring Docker to install or build an application. It isn't cool, funny, or the right way to do stuff. I still don't get what was so wrong about venv that anyone needed uv. I have never needed to even try it, and I've been writing Python stuff for so long that I can't even estimate how long. To me it feels like reinvention for the sake of a rewrite in Rust. If it is so good, OK, I get it, it might be - and all that good stuff needs to go back into Python, as Python.

> I still don't get what was so wrong about venv that anyone needed uv.

Pip is slow, far slower than it needs to be in almost everything that it does, even allowing for being written in Python. It's "standard" but not part of the standard library (so that it can be developed independently), and was never designed to install cross-environment properly (the current best approach, available since 22.3, is a hack that incurs a significant delay and expects everyone to move in lock-step with the CPython EOL schedule). It wastes disk space, both by re-copying packages into new environments (rather than hard-linking them as uv does) and by spawning copies of itself in those environments (the original work-around to avoid needing cross-environment installation support, which a few people have also come to rely on in other ways).

> If it is so good, OK, I get it, it might be - and all that good stuff needs to go back into Python, as Python.

I like these threads because they encourage me to work on my main project.


The one obvious way is the underlying virtualenv abstraction. Everything else just makes that part easier or more transparent.

What kstrauser said.

But with much more detail: it seems complicated because

* People refuse to learn basic concepts that are readily explained by many sources; e.g. https://chriswarrick.com/blog/2018/09/04/python-virtual-envi... [0].

* People cling to memories of long-obsolete issues. When people point to XKCD 1987 they overlook that Python 2.x has been EOL for almost six years (and 3.6 for over four, but whatever)[1]; only Mac users have to worry about "homebrew" (which I understand was directly interfering with stuff back in the day) or "framework builds" of Python; easy_install is similarly a long-deprecated dinosaur that you also would never need once you have pip set up; and fewer and fewer people actually need Anaconda for anything[2][3].

* There is never just one way to do it, depending on your understanding of "do". Everyone will always imagine that the underlying functionality can be wrapped in a more user-friendly way, and they will have multiple incompatible ideas about what is the most user-friendly.

But there is one obvious "way to do it", which is to set up the virtual environment and then launch the virtual environment's Python executable. Literally everything else is window dressing on top of that. The only thing that "activating" the environment does is configure environment variables so that `python` means the virtual environment's Python executable. All your various alternative tools are just presenting different ways to ensure that you run the correct Python (under the assumption that you don't want to remember a path to it, I guess) and to bundle up the virtual environment creation with some other development task.
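
As a sketch of that baseline, using only the standard library (the package and script names are placeholders):

  import subprocess
  import venv

  # The one obvious way: create the environment...
  venv.create(".venv", with_pip=True)

  # ...then run its interpreter directly; no "activation" needed.
  # (On Windows the interpreter is at .venv\Scripts\python.exe instead.)
  subprocess.run([".venv/bin/python", "-m", "pip", "install", "requests"])
  subprocess.run([".venv/bin/python", "my_script.py"])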

The Python community did explicitly provide for multiple people to provide such wrappers. This was not by providing the "15th competing standard". It was by providing the standard (really a set of standards designed to work together: the virtual environment support in the standard library, the PEPs describing `pyproject.toml`, and so on), which replaced a Wild West (where Setuptools was the sheriff and pip its deputy).

[0]: By the way, this is by someone who doesn't like virtual environments and was one of the biggest backers of PEP 582.

[1]: Of course, this is not Randall Munroe's fault. The comic dates to 2018, right in the middle of the period where the community was trying to sort things out and figure out how to not require the often problematic `setup.py` configuration for every project including pure-Python ones.

[2]: The SciPy stack has been installable from wheels for almost everyone for quite some time and they were even able to get 3.12 wheels out promptly despite being hamstrung by the standard library `distutils` removal.

[3]: Those who do need it, meanwhile, can generally live within that environment entirely.


I imagine by this they meant `python -m venv` specifically, using that interface directly, rather than through another wrapper CLI tool.

Fair.

The way I teach, I would start there; then you always have it as a fallback, and understand the system better.

I generally sort users into aspirants who really should learn those things (and will benefit from it), vs. complete end users who just want the code to run (for whom the developer should be expected to provide, if they expect to gain such a following).


This isn't really "alternatively"; it's pointing out that in addition to the shebang you can add a PEP 723 dependency specification that `uv run` (like pipx, and some other tools) can take into account.

There are really so many things about this point that I don't get.

First off, in my mind the kinds of things that are "scripts" don't have dependencies outside the standard library, or if they do are highly specific to my own needs on my own system. (It's also notable that one of the advantages the author cites for Go in this niche is a standard library that avoids the need for dependencies in quick scripts! Is this not one of Python's major selling points since day 1?)

Second, even if you have dependencies you don't have to learn differences between these tools. You can pick one and use it.

Third, virtual environments are literally just a place on disk for those dependencies to be installed, that contains a config file and some stubs that are automatically set up by a one-liner provided by the standard library. You don't need to go into them and inspect anything if you don't want to. You don't need to use the activation script; you can just specify the venv's executable instead if you prefer. None of it is conceptually difficult.

Fourth, sharing an environment for these quick scripts actually just works fine an awful lot of the time. I got away with it for years before proper organization became second nature, and I would usually still be fine with it (except that having an isolated environment for the current project is the easiest way to be sure that I've correctly listed its dependencies). In my experience it's just not a thing for your quick throwaway scripts to be dependent on incompatible Numpy versions or whatever.

... And really, to avoid ever having to think about the dependencies you provide dynamically, you're going to switch to a compiled language? If it were such a good idea, nobody would have thought of making languages like Python in the first place.

And uh...

> As long as the receiving end has the latest version of go, the script will run on any OS for tens of years in the future. Anyone who's ever tried to get python working on different systems knows what a steep annoying curve it is.

The pseudo-shebang trick here isn't going to work on Windows any more than a conventional one is. And no, when I switched from Windows to Linux, getting my Python stuff to work was not a "steep annoying curve" at all. It came more or less automatically with acclimating to Linux in general.

(I guess referring to ".pyproject" instead of the actually-meaningful `pyproject.toml` is just part of the trolling.)


> Third, virtual environments are literally just a place on disk for those dependencies

I had a recent conversation with a colleague. I said how nice it is using uv now. They said they were glad because they hated messing with virtualenvs so much that preferred TypeScript now. I asked them what node_modules is, they paused for a moment, and replied “point taken”.

Uv still uses venvs because it’s the official way Python stores all the project packages in one place. Node/npm, Go/go, and Rust/cargo all do similar things, but I only really hear people grousing about Python’s version, which, as you say, you can totally ignore and never ever look at.


From my experience, it seems like a lot of the grousing is from people who don't like the "activation script" workflow and mistakenly think it's mandatory. Though I've also seen aesthetic objections to the environment actually having internal structure rather than just being another `site-packages` folder (okay; and what are the rules for telling Python to use it?)

The very long discussion (https://discuss.python.org/t/pep-582-python-local-packages-d...) of PEP 582 (https://peps.python.org/pep-0582/ ; the "__pypackages__" folder proposal) seems relevant here.


I've heard those objections, too. I do get that specific complaint: it's another step you have to do. That said, things like direnv and mise make that disappear. I personally like the activation workflow and how explicit it is, as you're activating that specific venv, or maybe one in a different location if you want to use that instead. I don't like sprinkling "uv run ..." all over the place. But the nice part is that both of those work, and you can pick whichever one you prefer.

It'll be interesting to see how this all plays out with __pypackages__ and friends.


> But the nice part is that both of those work, and you can pick whichever one you prefer.

Yep. And so does the pyenv approach (which I understand involves permanently adding a relative path to $PATH, wherein the system might place a stub executable that invokes the venv associated with the current working directory).

And so do hand-made subshell-based approaches, etc. etc.

In "development mode" I use my activation-script-based wrappers. When just hacking around I generally just give the path to the venv's python explicitly.


I use your "hacking around" method for things like cron jobs, with command lines like:

  * * * * * /path/to/project/.venv/bin/python /path/to/project/foo.py
It's more typing one time, but avoids a whole lot of fragility later.
