
Why are you starting a separate Python process for each dependency?


Real threads in Python are very recent and didn't exist when uv was created, so you needed multiple processes.


No, I mean why are you starting them for each dependency, rather than having a few workers pulling build requests from a queue?


At least one worker for each virtual CPU core you have. I've got 16 on my laptop; my servers have many more.

If I have 64 cores and 20 dependencies, I do want all 20 of them to be decompressed in parallel. That's faster, and if I'm installing something, I wanna prioritize that workload.

But it doesn't have to be 20. Even with, say, 5 workers and a queue, each extra wave is another ~100ms. It adds up.
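For what it's worth, the worker-pool alternative being debated here is easy to sketch. This is a minimal, hypothetical illustration (the archive names and the `extract` stub are made up, not uv's actual code): a fixed pool of processes sized to the machine pulls decompression jobs from a shared queue, instead of spawning one process per dependency.

```python
import os
from concurrent.futures import ProcessPoolExecutor


def extract(archive: str) -> str:
    # Stand-in for the real decompression work on one archive.
    return f"extracted {archive}"


def extract_all(archives):
    # Fixed pool sized to the CPU count; ProcessPoolExecutor feeds
    # the workers from an internal queue, so 20 archives on 16 cores
    # run in two waves rather than as 20 simultaneous processes.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(extract, archives))


if __name__ == "__main__":
    results = extract_all([f"dep-{i}.tar.gz" for i in range(20)])
```

With 64 cores and 20 archives the pool degenerates into "all in parallel" anyway, so the argument is really only about machines with fewer cores than dependencies.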



