~/project/vllm
uv pip install vllm --torch-backend=auto
Resolved 142 packages in 1.54s
× Failed to download `torch==2.8.0+cu129`
├─ Failed to extract archive: torch-2.8.0+cu129-cp312-cp312-manylinux_2 ...
Since packages from different indexes can share the same name, installing two different packages under the same name silently overrides one with the other. uv should fail when this occurs to prevent unexpected ...
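One way to avoid this cross-index shadowing today is to pin a package to a specific index in `pyproject.toml` and mark that index as `explicit`, so it is only consulted for the packages that name it. A minimal sketch, assuming uv's `[[tool.uv.index]]` and `[tool.uv.sources]` tables and an example PyTorch CUDA index URL:

```toml
# Declare a named index; `explicit = true` means uv will only use it
# for packages that are pinned to it below, never for general resolution.
[[tool.uv.index]]
name = "pytorch-cu129"
url = "https://download.pytorch.org/whl/cu129"
explicit = true

# Pin torch to that index, so a same-named package on another index
# cannot silently take its place.
[tool.uv.sources]
torch = { index = "pytorch-cu129" }
```

With this in place, resolution for `torch` never falls through to another index, which sidesteps the silent-override behavior described above.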