gorgeous-winter-99296
05/04/2023, 8:03 AM

gorgeous-winter-99296
05/04/2023, 8:08 AM

gorgeous-winter-99296
05/04/2023, 9:48 AM

gorgeous-winter-99296
05/04/2023, 9:48 AM

gorgeous-winter-99296
05/04/2023, 9:49 AM

bitter-ability-32190
05/04/2023, 10:39 AM

bitter-ability-32190
05/04/2023, 10:42 AM
`sed` the version to the CUDA one, relock 🎉

gorgeous-winter-99296
05/04/2023, 10:46 AM

bitter-ability-32190
05/04/2023, 10:59 AM

gorgeous-winter-99296
05/04/2023, 11:01 AM

bitter-ability-32190
05/04/2023, 11:01 AM

gorgeous-winter-99296
05/04/2023, 11:04 AM

bitter-ability-32190
05/04/2023, 11:05 AM

gorgeous-winter-99296
05/04/2023, 11:09 AM
`pants generate-lockfiles`?

bitter-ability-32190
05/04/2023, 11:09 AM

bitter-ability-32190
05/04/2023, 11:11 AM
• `generate-lockfiles` -> Linux/Mac CPU lockfile
• Take the list of packages with pinned versions from that lockfile (choose your favorite way to do this)
• Pass that into a `pex lock create` with the torch version pinned to CUDA, and the torch PyPI as an additional index -> Linux GPU lockfile

bitter-ability-32190
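The "take the pinned versions from the CPU lockfile, re-pin torch to CUDA" step above can be sketched in Python. Everything here is illustrative: the JSON shape (`locked_resolves` containing `locked_requirements` entries with `project_name` and `version`) is an assumption based on the Pex lockfile excerpt shown later in this thread, and `cu117` is just an example local version tag.

```python
import json

# Hypothetical helper: pull "name==version" pins out of a Pex JSON
# lockfile and swap torch's local version segment to a CUDA one.
def cuda_pins(lockfile_text: str, cuda_tag: str = "cu117") -> list[str]:
    lock = json.loads(lockfile_text)
    pins = []
    for req in lock["locked_resolves"][0]["locked_requirements"]:
        name, version = req["project_name"], req["version"]
        if name == "torch":
            # Drop any existing local segment (e.g. +cpu), pin the CUDA one.
            version = f"{version.split('+')[0]}+{cuda_tag}"
        pins.append(f"{name}=={version}")
    return pins
```

The resulting pin list would then be fed to `pex lock create` as described above, keeping every dependency except torch identical between the two lockfiles.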
05/04/2023, 11:11 AM

gorgeous-winter-99296
05/04/2023, 11:13 AM
pants generate-lockfiles --resolve=cpu
pants generate-lockfiles --python-indexes='["pypi.org/simple", "download.torch.org/cu117"]' --resolve=gpu
bitter-ability-32190
05/04/2023, 11:13 AM
`pants generate-lockfiles` tries to do a universal lock 🤮

gorgeous-winter-99296
05/04/2023, 11:13 AM
`torch==...` statements

bitter-ability-32190
05/04/2023, 11:13 AM

bitter-ability-32190
05/04/2023, 11:14 AM
The important thing is your GPU lockfile has EXACTLY the same deps for everything (but torch) ☝️

bitter-ability-32190
05/04/2023, 11:14 AM

gorgeous-winter-99296
05/04/2023, 11:18 AM

bitter-ability-32190
05/04/2023, 11:19 AM

bitter-ability-32190
05/04/2023, 11:20 AM
`cat` the one and then `sed` it

gorgeous-winter-99296
05/04/2023, 11:20 AM

average-balloon-31442
05/04/2023, 11:46 AM

average-balloon-31442
05/04/2023, 11:49 AM

gorgeous-winter-99296
05/04/2023, 11:50 AM

gorgeous-winter-99296
05/04/2023, 12:05 PM
`+cpu` (or some other!) gets picked for local tags, it still doesn't work for Mac. Otherwise using markers would be interesting.

gorgeous-winter-99296
05/04/2023, 12:08 PM
torch==1.11.0+cpu ; sys_platform != "darwin"
torch==1.11.0 ; sys_platform == "darwin"
will always use +cpu anyways, since they're meant to be compatible. Just that there's no 1.11.0+cpu for Mac.

gorgeous-winter-99296
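The marker idea above can be illustrated with a toy filter that picks the applicable requirement line per platform. This is not real PEP 508 marker evaluation (that is what `packaging.markers` does); it only handles the two `sys_platform` forms used in the snippet, and the version pins just follow the thread's torch 1.11.0 example.

```python
# Toy sketch: keep requirement lines whose sys_platform marker matches
# the target platform. Handles only the two marker forms shown above.
def applicable(requirements: list[str], platform: str) -> list[str]:
    keep = []
    for line in requirements:
        req, _, marker = (part.strip() for part in line.partition(";"))
        if not marker:
            keep.append(req)                      # unconditional requirement
        elif marker == f'sys_platform == "{platform}"':
            keep.append(req)                      # positive match
        elif marker.startswith("sys_platform != ") and f'"{platform}"' not in marker:
            keep.append(req)                      # negative marker, other platform
    return keep

reqs = [
    'torch==1.11.0+cpu ; sys_platform != "darwin"',
    'torch==1.11.0 ; sys_platform == "darwin"',
]
```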
05/04/2023, 12:11 PM

bitter-ability-32190
05/04/2023, 12:49 PM

bitter-ability-32190
05/04/2023, 12:50 PM
> Local version identifiers SHOULD NOT be used when publishing upstream projects to a public index server,
gorgeous-winter-99296
05/04/2023, 12:56 PM
> Local versions sort differently, this PEP requires that they sort as greater than the same version without a local
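The sorting rule quoted above is why a resolver given both `1.11.0` and `1.11.0+cpu` prefers the local one. A toy comparison key makes the point; this is a deliberate simplification of PEP 440 (release digits plus an optional local segment only, no pre/post/dev releases):

```python
# Simplified PEP 440 ordering: a present local segment sorts above an
# absent one; numeric local segments compare as ints and outrank
# alphanumeric ones, which compare lexically.
def pep440_key(version: str):
    public, _, local = version.partition("+")
    release = tuple(int(p) for p in public.split("."))
    local_key = tuple(
        (1, int(seg)) if seg.isdigit() else (0, seg)
        for seg in local.split(".")
    ) if local else ()
    return (release, local_key)
```

So `max(["1.11.0", "1.11.0+cpu"], key=pep440_key)` picks `1.11.0+cpu`, which is exactly the behavior being discussed.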
gorgeous-winter-99296
05/04/2023, 12:56 PM

bitter-ability-32190
05/04/2023, 1:04 PM

enough-analyst-54434
05/04/2023, 2:13 PM

enough-analyst-54434
05/04/2023, 2:14 PM

enough-analyst-54434
05/04/2023, 2:18 PM

bitter-ability-32190
05/04/2023, 3:01 PM
> FWICT torch should be using extras, say [CPU] or [GPU] to accomplish this. That would require they change how they package a bit and it's likely a sailed ship.
Yeah, that's my hot take as well. I forgot to mention it to the Meta folks I met at PyCon 😕
gorgeous-winter-99296
05/04/2023, 3:35 PM

bitter-ability-32190
05/04/2023, 3:37 PM
`mxnet` uses different package names. So that's a pro here, but then there's other cons 😛

gorgeous-winter-99296
05/04/2023, 3:42 PM
torch==1.11.0 and repos like:
[python-repos]
indexes = [
  "https://pypi.org/simple/",
  "https://download.pytorch.org/whl/cpu/"
]
and do a lock via pants and get
{
  "artifacts": [
    {
      "algorithm": "sha256",
      "hash": "544c13ef120531ec2f28a3c858c06e600d514a6dfe09b4dd6fd0262088dd2fa3",
      "url": "https://download.pytorch.org/whl/cpu/torch-1.11.0%2Bcpu-cp39-cp39-linux_x86_64.whl"
    }
  ],
  "project_name": "torch",
  "requires_dists": [
    "typing-extensions"
  ],
  "requires_python": ">=3.7.0",
  "version": "1.11.0+cpu"
}
Which seems... like the wrong thing to do. So it's locking a different, actually incompatible version than the one specified. If it was truly universal there I'd expect it to fall back to the PyPI version, which would work for both.

gorgeous-winter-99296
05/04/2023, 5:45 PM
`pants generate-lockfiles`. The indexes are always in the pants.toml. The way I did it was to add three extra requirements for the torch dependencies, one per resolve.
python_requirement(
    name="torch",
    requirements=["torch==1.11.0,!=1.11.0+cpu,!=1.11.0+cu115"],
    resolve="reqs",
)
python_requirement(
    name="torch_cpu",
    requirements=["torch==1.11.0+cpu"],
    resolve="cpu",
)
python_requirement(
    name="torch_gpu",
    requirements=["torch==1.11.0+cu115"],
    resolve="gpu",
)
I've not dug deeper but this at least lets me get three correct resolves and at least one that works on Mac.

bitter-ability-32190
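The `!=1.11.0+cpu` / `!=1.11.0+cu115` clauses in the first resolve above work around a PEP 440 rule: a plain `==1.11.0` specifier also matches candidates that differ only by a local version label, so without the exclusions the "reqs" resolve could still pick up a local build. A toy matcher (not pip's implementation) shows the asymmetry:

```python
# Simplified PEP 440 "==" semantics: a spec without a local segment
# ignores the candidate's local segment; a spec with one is exact.
def matches_eq(spec_version: str, candidate: str) -> bool:
    if "+" in spec_version:
        return candidate == spec_version
    return candidate.split("+")[0] == spec_version
```

This is why `==1.11.0` alone is not enough to keep `+cpu` out of the plain resolve.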
05/04/2023, 5:48 PM

average-balloon-31442
05/04/2023, 7:13 PM
`torch`. How do you manage to use the right resolve in CI and the right one for mac/linux users?

gorgeous-winter-99296
05/04/2023, 7:14 PM

average-balloon-31442
05/04/2023, 7:15 PM

enough-analyst-54434
05/04/2023, 10:12 PM

enough-analyst-54434
05/04/2023, 10:13 PM

gorgeous-winter-99296
05/05/2023, 7:25 AM

gorgeous-winter-99296
05/05/2023, 7:34 AM
> The universal mode creates 1 lock that must work for the complete range of interpreters and machines implied by --interpreter-constraint and any --target-system specified (Pants always passes --target-system mac --target-system linux).
enough-analyst-54434
05/05/2023, 1:52 PM

gorgeous-winter-99296
05/05/2023, 1:53 PM

gorgeous-winter-99296
05/08/2023, 8:15 AM

enough-analyst-54434
05/08/2023, 11:58 PM
`--target-system`. That just limits what artifacts Pex locks, i.e. it does not try to lock Windows artifacts. It locks any Linux and Mac artifacts available though. This does not mean it somehow tests the locks will work on Linux and Mac. In fact, universal locks are conceptually broken this way in general. The "universal" claim is best effort. If your universe includes, say, Mac arm and there are no wheels released for Mac arm for some set of your transitive requirements, the end result is yolo. Who knows, some of those sdists may never build on Mac arm and the lock never work for those. So, if a lock grabs some pin - all this torch shenanigans with local versions aside - and that pin has just a - say - Linux x86_64 wheel available with no sdist, the universal lock will succeed.

enough-analyst-54434
05/08/2023, 11:59 PM
`--platform` or `--complete-platform`. In that case Pex would still perform the lock as-is, but then attempt to resolve a subset of the lock for each supplied platform and fail if it could not.

enough-analyst-54434
05/09/2023, 12:00 AM

enough-analyst-54434
05/09/2023, 12:01 AM

gorgeous-winter-99296
05/09/2023, 1:34 PM
`+cpu` wouldn't be lockable instead. If I understand correctly, adding `--platform` or the complete ditto would still only be validation, right? It wouldn't actually constrain the resolve process. I know from attempting to work around this issue in both PDM and Poetry that it's equally borked everywhere, and the hacks are equally ugly. If I understand you correctly, if Pants supported a multi-platform lock this would be solvable (albeit maybe quite complex). Is there a philosophical reason Pants does not support this, or has just no-one been hurt enough by it to fix it? It doesn't seem like an insurmountable contribution to make if it'd be accepted.
OTOH, I now have a documented workable approach that'll continue working for the foreseeable future. I expect, though, that when we upgrade torch to 1.13 or 2.0, when it starts pulling in CUDA, we'll have another fun set of problems to solve... which might necessitate the multi-platform lock anyways.

enough-analyst-54434
05/09/2023, 1:47 PM

bitter-ability-32190
05/09/2023, 1:48 PM

gorgeous-winter-99296
05/09/2023, 1:51 PM

gorgeous-winter-99296
05/09/2023, 1:51 PM

enough-analyst-54434
05/09/2023, 1:54 PM
`===` may work for your non-local-version resolve instead of having to add the `!=` clauses. But sticking to what you came up with may be your best bet. I think either way takes a book to explain.

bitter-ability-32190
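The `===` suggestion refers to PEP 440 arbitrary equality, which compares the candidate's version string verbatim, so `torch===1.11.0` would never match `1.11.0+cpu` and the `!=` exclusions become unnecessary. A one-line sketch of the difference:

```python
# PEP 440 arbitrary equality ("===") is a verbatim string comparison,
# so a spec of "1.11.0" excludes "1.11.0+cpu" with no "!=" clauses.
def matches_arbitrary_eq(spec_version: str, candidate: str) -> bool:
    return candidate == spec_version
```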
05/09/2023, 1:55 PM

gorgeous-winter-99296
05/09/2023, 1:56 PM

gorgeous-winter-99296
05/10/2023, 12:28 PM