# general
b
anyone else had issues using torch and transformers in a Pants project? All my dependencies work as expected, but for Torch I'm getting this error:
```
File "/home/delucca/.cache/pants/named_caches/pex_root/venvs/s/17fb5414/venv/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1495, in requires_backends
    raise ImportError("".join(failed))
                              └ ['\nAutoModelForTokenClassification requires the PyTorch library but it was not found in your environment. Checkout the instr...

ImportError:
AutoModelForTokenClassification requires the PyTorch library but it was not found in your environment. Checkout the instructions on the
installation page: <https://pytorch.org/get-started/locally/> and follow the ones that match your environment.
Please note that you may need to restart your runtime after installation.
```
(I've already generated lockfiles after adding it to my `pyproject.toml`.)
This is my `pyproject.toml`:
```toml
[project]
name = "item-matcher"
version = "0.1.0"
dependencies = [
    "fastapi>=0.11.0",
    "uvicorn>=0.30.1",
    "loguru>=0.7.2",
    "google-cloud-storage>=2.17.0",
    "google-auth>=2.31.0",
    "google-cloud-secret-manager>=2.20.1",
    "msgpack>=1.0.8",
    "dependency-injector>=4.41.0",
    "typing-extensions>=4.12.2",
    "transformers>=4.42.4",
    "numpy>=1.26.4",
    "torch>=2.0.0",
    "torchvision>=0.15.0",
    "torchaudio>=2.0.0"
]

[tool.pants]
pants_version = "2.21.0"

[tool.pants.backend.python]
interpreter_constraints = ["==3.11.*"]

[project.optional-dependencies]
dev = [
    "pytest>=8.2.2",
    "pytest-docker>=3.1.1",
    "httpx>=0.27.0",
]
```
it turns out this is the root cause: for some reason, `importlib.util.find_spec` returns `None` even if I try to pick `pytest` (which is 100% installed in my env, since I use it), and the `transformers` lib uses it to decide if `torch` is installed
is this expected? does `importlib.util.find_spec` not work when running dependencies from Pants?
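For context, `transformers` gates its torch-backed classes behind a check along these lines (a simplified sketch of the pattern, not the actual library code):

```python
import importlib.util

def backend_available(name):
    # find_spec returns None when the module cannot be located on the
    # current sys.path -- it checks availability without importing it.
    return importlib.util.find_spec(name) is not None

# Stdlib modules are always found; a made-up name is not.
print(backend_available("json"))            # True
print(backend_available("no_such_mod_xyz")) # False
```

The catch under Pants: a sandboxed run only materializes the third-party distributions that dep inference (or explicit `dependencies`) pulled in, so `find_spec("torch")` can legitimately return `None` there even though `torch` is in your lockfile.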
it seems if I add `import torch` anywhere in the code it fixes the issue. So Pants doesn't resolve the dependencies a given dependency is using? 🤔
I'm checking the `transformers` lib code; it seems they always import `torch` at runtime, during a function execution
is there any way to solve this without having to add `import torch` manually on my side?
s
I use dependency overrides, something like this in BUILD:
```python
poetry_requirements(
    name="poetry",
    # pip is unable to infer these transitive dependencies
    overrides={
        "transformers": {"dependencies": [":poetry#torch"]},
    },
)
```
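Since the `pyproject.toml` above uses PEP 621 `[project]` dependencies rather than Poetry, the analogous target should be `python_requirements`, which (in recent Pants versions, I believe) can also parse a PEP 621 `pyproject.toml` and takes the same `overrides` field. A sketch, with the target name being an assumption:

```python
# BUILD (sketch): "reqs" is a hypothetical target name
python_requirements(
    name="reqs",
    source="pyproject.toml",
    overrides={
        # declare torch as a dep of transformers so any file that
        # imports transformers also pulls torch into the sandbox
        "transformers": {"dependencies": [":reqs#torch"]},
    },
)
```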
b
@steep-eve-20716 this is pretty clever! I'll test it, thanks
h
Yeah, when behavior differs depending on the presence or absence of an explicit `import`, it usually means Pants dep inference had no way to infer a dep (without that explicit `import`), so you need to add it manually
s
@happy-kitchen-89482 What metadata is missing in the direct dependency that prevents Pants/pex from identifying the transitive dep?
h
Presumably, since `transformers` works with or without `torch` (albeit with different behavior in each case), it does not require it?
So it is missing that from its `Requires-Dist` wheel metadata?
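You can confirm this from the installed wheel metadata with the stdlib `importlib.metadata`: torch-style requirements only appear guarded by an `extra == ...` marker, not as unconditional requirements. A small sketch (the helper name is made up; it returns `None` when the distribution isn't installed):

```python
from importlib.metadata import requires, PackageNotFoundError

def extra_only_requirements(dist):
    """Requirements of `dist` that are only pulled in via an extra,
    or None if the distribution is not installed."""
    try:
        reqs = requires(dist) or []
    except PackageNotFoundError:
        return None
    return [r for r in reqs if "extra ==" in r]

# If transformers is installed, entries like
# 'torch; extra == "torch"' would show up here.
print(extra_only_requirements("transformers"))
```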
s
HF has it as an optional dep in <https://github.com/huggingface/transformers/blob/main/setup.py>
bc yeah, transformers works without torch
related question to @steep-eve-20716's last one about missing metadata - i assume pants doesn't do dep inference for runtime imports like the above very well?
h
Yeah, there is no sensible way to do that
๐Ÿ‘ 2
There are various ways to get it to understand runtime imports from your own first-party code
Or writing some custom code
but Pants doesn't introspect third-party code at all, it assumes that the metadata is correct
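For first-party code, the usual escape hatch is an explicit `dependencies` field on the owning target, so no inference is needed. A sketch (target names and the `:poetry#torch` address are assumptions matching the earlier BUILD snippet):

```python
# BUILD (sketch): force the dep that inference cannot see
python_sources(
    name="lib",
    # transformers imports torch only at runtime, so Pants cannot
    # infer this edge from our source; declare it explicitly.
    dependencies=[":poetry#torch"],
)
```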
👀 1
Which is the right behavior in this case
✅ 1
s
yeah, makes sense
s
Yea totally makes sense
h
It would arguably be incorrect to infer a dep from transformers to torch
s
agreed
h
The semantics of transformers is - if you want torch, install it yourself
s
@steep-eve-20716 - lots of people have had various issues with torch as a dep in pants (and elsewhere); trying to sort out my own right now too
s
I guess the best Pants could do is understand that if I have `transformers` plus the `transformers[torch]` extra, it could automatically infer the relationship between the two instead of me manually declaring it?
b
if I can, the only thing I would like to suggest is adding a better explanation specifically about this in the docs. I can't suggest where to place it (the site is down in my region), but it was really complicated for me (not having much Pants experience) to understand this detail
but, indeed, it makes total sense why it doesn't work; I just think it would be nice to have an explanation with the suggested way to handle this 😄
s
i saw someone mention adding a more cookbook / faq-style docs page; could be a really nice way of consolidating these sorts of notes somewhere a bit more loose
e.g., having a whole section dedicated to torch
h
That would be great, since a lot of people have torch-related questions
Feel free to create that as a GitHub Discussion, I think that was where we planned to have cookbook style stuff