> When a package says it requires python<4.0, uv ignores the upper bound and only checks the lower. This reduces resolver backtracking dramatically since upper bounds are almost always wrong. Packages declare python<4.0 because they haven’t tested on Python 4, not because they’ll actually break. The constraint is defensive, not predictive.

This is kind of fascinating. I've never considered runtime upper-bound requirements. I can think of compelling reasons for lower bounds (dropping support for old versions) or exact runtime version requirements (a release that only works on one specific CPython version). But now that I think about it, upper bounds seem to solve a hypothetical problem that you'd never run into in practice.

If the PSF announced v4 and declared a set of specific changes, I think this would be reasonable. In the 2/3 era it was definitely reasonable (even necessary). Today, though, it doesn't actually save you any trouble.
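
Concretely, the behavior the quote describes amounts to something like this (a sketch using the packaging library; uv itself is written in Rust, and relax_upper_bound is just a made-up name for illustration):

    from packaging.specifiers import SpecifierSet

    def relax_upper_bound(requires_python: str) -> SpecifierSet:
        # Keep lower bounds and exact pins; drop defensive upper
        # bounds like <4.0 before handing the spec to the resolver.
        spec = SpecifierSet(requires_python)
        kept = [s for s in spec if s.operator not in ("<", "<=")]
        return SpecifierSet(",".join(str(s) for s in kept))

    print(relax_upper_bound(">=3.8,<4.0"))  # >=3.8

Note this sketch drops every upper bound; whether uv drops all of them or only ones like <4.0 is exactly the ambiguity discussed below.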



I think the article is being careful not to say uv ignores _all_ upper-bound checks, just the 4.0 ones. If a package says it requires python < 3.0, that's still super relevant, and I'd hope uv would still notice and prevent you from trying to import code that won't work on Python 3. Not sure what it actually does.
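
(For what it's worth, a <3.0 bound is cheap to evaluate against a concrete interpreter version; a quick sketch with the packaging library:)

    from packaging.specifiers import SpecifierSet
    from packaging.version import Version

    spec = SpecifierSet("<3.0")      # a Python-2-only package
    print(Version("3.12") in spec)   # False: should be rejected
    print(Version("2.7") in spec)    # True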


I read the article as saying it ignores all upper bounds, and 4.0 is just an example. I could be wrong though - it seems ambiguous to me.

But if we accept that it currently ignores any upper-bound check above v3, that's interesting. Does that imply that once Python 4 is available, uv will slow down because it needs to actually run those checks?


Are there any plans to actually make a 4.0 ever? I remember hearing a few years ago that after the transition to 3.0, the core devs kind of didn't want to repeat that mess ever again.

That said, even if it does happen, I highly doubt those checks are the main part of the speedup compared to pip.


I think there's a future where we get a 4.0, but it's not any time soon. I think they'd want an incredibly compelling backwards-incompatible feature before ripping that band-aid off. It would be setting up for a decade of transition, which shouldn't be taken lightly.


There are indeed no such plans.


That would deliver a blow to the integrity of the rest of that section, because upper-bound constraints that are immediately reducible to "true" (every released interpreter satisfies <4.0) cannot cause backtracking of any kind, so skipping them can't explain the speedup.


uv doesn't support Python <3.0 (I think the minimum is 3.8?), so it would be difficult for that to be relevant. But for pip, obviously yes.


uv supports PyPI, which still has packages that are Python-2-only. So even if you're running Python 3.8, it seems possible to declare a dependency on some <3.0 code from PyPI, and that's an error uv should detect.


The problem: the specification is binary. Are you compatible or not?

Whether a Python package will be compatible with a version that hasn't been released yet is unanswerable.

Having an enum like [compatible, incompatible, untested] would at least fix this.
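
A sketch of what that could look like (PySupport and the ranges shown are hypothetical, not an existing packaging standard):

    from enum import Enum

    class PySupport(Enum):  # hypothetical tri-state, not a real standard
        COMPATIBLE = "compatible"      # tested, known to work
        INCOMPATIBLE = "incompatible"  # known to break
        UNTESTED = "untested"          # no claim either way

    # Hypothetical per-range declarations a resolver could use,
    # e.g. treating UNTESTED as installable-with-warning rather
    # than a hard block.
    support = {
        ">=3.8,<3.13": PySupport.COMPATIBLE,
        ">=3.13": PySupport.UNTESTED,
        "<3.8": PySupport.INCOMPATIBLE,
    }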



