# general
a
Hi all - I just ran into some trouble with requirements. I'm getting an ABI error with greenlet - which seems pretty common when the build platform differs from the target platform for some reason - and the obvious solution is to build it yourself. I can do this normally with:
requirements.in
...
gevent>=21.1.0 --no-binary greenlet
...
Then:
pip-compile -o requirements.txt requirements.in
generates something like
# this file is generated blah blah blah
--no-binary greenlet
...
gevent==21.1.2
...
If you
pip install -r requirements.txt
it will correctly build greenlet for you. When I'm going through the
PythonRequirementLibrary
s that are generated for my project, however, I can't find anything referencing that I can't rely on the greenlet wheel. Does that all make sense?
e
I think it does. Can you clarify the target interpreter you want greenlet to be run under vs the wheels out at PyPI? (assuming you're hooked up to PyPI and not a custom index)
Put another way, if you
pip wheel --no-deps --no-binary greenlet greenlet
what is the file name of the generated wheel?
I get
greenlet-1.1.0-cp39-cp39-linux_x86_64.whl
and I find:
$ auditwheel show greenlet-1.1.0-cp39-cp39-linux_x86_64.whl 

greenlet-1.1.0-cp39-cp39-linux_x86_64.whl is consistent with the
following platform tag: "manylinux_2_17_x86_64".

The wheel references external versioned symbols in these
system-provided shared libraries: libc.so.6 with versions
{'GLIBC_2.2.5', 'GLIBC_2.4', 'GLIBC_2.14'}, libgcc_s.so.1 with
versions {'GCC_3.0'}, libstdc++.so.6 with versions {'CXXABI_1.3'}

This constrains the platform tag to "manylinux_2_17_x86_64". In order
to achieve a more compatible tag, you would need to recompile a new
wheel from source on a system with earlier versions of these
libraries, such as a recent manylinux image.
Last question: assuming you get the ABI error at runtime, meaning you have a built PEX file in hand, what is the name of the greenlet distribution inside the PEX file?:
pex-tools your.pex info | jq '.distributions | to_entries[] | .key | match("^greenlet.*") | .string'
Or without pex-tools:
unzip -qc your.pex PEX-INFO | jq '.distributions | to_entries[] | .key | match("^greenlet.*") | .string'
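If jq isn't handy, here's a rough stdlib-Python sketch of the same filter (the PEX-INFO JSON layout is assumed from the outputs in this thread; the sample hashes are truncated placeholders):

```python
# Pull the greenlet wheel names out of a PEX-INFO JSON blob, like the
# jq match("^greenlet.*") filter above.
import json

def greenlet_distributions(pex_info_json):
    info = json.loads(pex_info_json)
    return sorted(name for name in info.get("distributions", {})
                  if name.startswith("greenlet"))

sample = json.dumps({"distributions": {
    "gevent-20.6.2-cp39-cp39-manylinux2010_x86_64.whl": "bf57...",
    "greenlet-0.4.16-cp39-cp39-linux_x86_64.whl": "8a34...",
}})
print(greenlet_distributions(sample))
# -> ['greenlet-0.4.16-cp39-cp39-linux_x86_64.whl']
```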
a
Hi! Thanks for getting back to me -
(.venv) [nate@ragin-cajun pants-docker]$ pip wheel --no-deps --no-binary greenlet greenlet
Collecting greenlet
  Using cached greenlet-1.1.0.tar.gz (85 kB)
Building wheels for collected packages: greenlet
  Building wheel for greenlet (setup.py) ... done
  Created wheel for greenlet: filename=greenlet-1.1.0-cp38-cp38-linux_x86_64.whl size=149860 sha256=3289eb703bc2faf2f06aeb8ddb72a5bc62d9c2df1517e28f237eeaf8ed965e77
  Stored in directory: /home/nate/.cache/pip/wheels/2f/e7/47/b6a893203ccff7cb1cb81576f22066a93657a6429a68a912d0
(so:
greenlet-1.1.0-cp38-cp38-linux_x86_64.whl
) From the generated pex:
(.venv) [nate@ragin-cajun pants-docker]$ unzip -qc dist/test_docker/test_greenlet.pex PEX-INFO | jq '.distributions | to_entries[] | .key | match("^greenlet.*") | .string'
"greenlet-1.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"
"greenlet-1.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"
e
Can you
pip install auditwheel
and run
auditwheel show
against the greenlet wheel you built? For the PEX embedded
greenlet-1.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
wheel not to work, auditwheel would have to report that your system does not support the glibc 2.17 spec.
a
(.venv) [nate@ragin-cajun pants-docker]$ auditwheel show greenlet-1.1.0-cp38-cp38-linux_x86_64.whl

greenlet-1.1.0-cp38-cp38-linux_x86_64.whl is consistent with the
following platform tag: "manylinux_2_17_x86_64".

The wheel references external versioned symbols in these
system-provided shared libraries: libc.so.6 with versions
{'GLIBC_2.2.5', 'GLIBC_2.4', 'GLIBC_2.14'}, libgcc_s.so.1 with
versions {'GCC_3.0'}, libstdc++.so.6 with versions {'CXXABI_1.3'}

This constrains the platform tag to "manylinux_2_17_x86_64". In order
to achieve a more compatible tag, you would need to recompile a new
wheel from source on a system with earlier versions of these
libraries, such as a recent manylinux image.
OH - huh, I was working on this more and figured out how to fix it
which is totally unrelated
e
Ok - if you can share what you did to fix this, that might be helpful. I'm a bit lost. That auditwheel output says the wheel inside the PEX file should work fine on your machine / it should be equivalent to the wheel pip wheel built.
(Pex uses Pip wheel internally!)
a
a) i had
gevent==20.6.2
greenlet==0.4.16
in both my
requirements.txt
&
constraints.txt
with a python library that used gevent. When creating a distribution, it would correctly limit the gevent version to 20.6.2, but it would not limit the greenlet version, even though it was included in the requirements/constraints files. So if I make a python distribution from a library that depends on just gevent, I get:
(.venv) [nate@ragin-cajun remit-srv]$ pip install dist/remit_srv-0-py3-none-any.whl
...
(.venv) [nate@ragin-cajun remit-srv]$ pip freeze | grep gevent
gevent==20.6.2
(.venv) [nate@ragin-cajun remit-srv]$ pip freeze | grep greenlet
greenlet==1.1.0
(.venv) [nate@ragin-cajun remit-srv]$ python -m app
<frozen importlib._bootstrap>:219: RuntimeWarning: greenlet.greenlet size changed, may indicate binary incompatibility. Expected 144 from C header, got 152 from PyObject
^ that's what I don't want
yeah, sorry for the delay - was collecting info
e
No problem - thanks for that. What version of the Pip resolver are you using? That sounds like legacy behavior: https://www.pantsbuild.org/docs/reference-python-setup#section-resolver-version
a
if I add
"//:greenlet"
as an explicit dependency on my distribution and then:
(.venvtwo) [nate@ragin-cajun remit-srv]$ pip install dist/remit_srv-0-py3-none-any.whl
(.venvtwo) [nate@ragin-cajun remit-srv]$ pip freeze | grep gevent
gevent==20.6.2
(.venvtwo) [nate@ragin-cajun remit-srv]$ pip freeze | grep greenlet
greenlet==0.4.16
e
Aha - ok. So this is all down to the fact that gevent activates greenlet only when it's found on the sys.path - it has no declared dep.
Yeah - you need to let Pants know in that case.
We can infer deps from imports in first party code and from reading wheel Requires-Dist metadata, but that's it. We do not read imports in 3rdparty code. Anything that uses magic we can't help with.
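In BUILD-file terms, that manual dependency might look something like this (target and address names here are hypothetical - match them to your repo layout):

```python
# BUILD for the library that (indirectly) needs greenlet at runtime.
python_library(
    name="app",
    dependencies=[
        # gevent only activates greenlet when it's already on the sys.path,
        # so Pants can't infer this from imports - declare it explicitly.
        "//:greenlet",
    ],
)
```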
So - to make sure I've got this all straight: You're good to go now after adding an explicit manual dependency in a BUILD file on greenlet?
a
The gevent setup.py does install_requires greenlet? Is that not used when figuring out dependencies?
yeah my problem is solved!
now just trying to understand what was goin' on
e
The gevent setup.py does install_requires greenlet? Is that not used when figuring out dependencies?
It is. So .... digging more. But do you have an answer on the Pip resolver question? Do you configure that for Pants? Which Pants version?
a
oh sorry missed that!
uh, I am not changing the pip resolver as far as I know. pants.toml:
pants_version = "2.5.0.dev3"
level = "info"
colors = true
files_not_found_behavior = "error"
#process_execution_local_parallelism=36
rule_threads_core=12
plugins=["compyman-pants-docker @ file:///home/nate/wave/pants-docker/dist/compyman_pants_docker-0.1a0-py3-none-any.whl"]
backend_packages = [
    "pants.backend.python",
    "pants.backend.python.lint.docformatter",
    "pants.backend.python.lint.black",
    "sendwave.pants_docker"
]

[source]
root_patterns = [
    "/", "/src"
]

[pytest]
pytest_plugins.add = ["celery==4.4.7", "requests-mock", "faker==4.1.1"]

[python-setup]
interpreter_constraints = ["CPython==3.8.*"]
requirement_constraints = "contraints.txt"

[python-infer]
inits = true
oh - wait, let me check; that might have been a problem with my specific pinned version of gevent
e
I find:
$ unzip -qc ~/downloads/geve* gevent-20.6.2.dist-info/METADATA | grep Requires | grep greenlet
Requires-Dist: greenlet (>=0.4.16) ; platform_python_implementation == "CPython"
Which is legit. Pants / Pex / Pip should be honoring that and using your pin from constraints.txt.
a
oh yeah I forgot about the constraints - I'm not sure why greenlet then would not be resolved when calculating the third-party dependencies?
e
so, what are the lines in your requirements.txt and constraints.txt that mention gevent and greenlet? Can you share those 4 lines verbatim?
a
yeah
requirements.txt
gevent==20.6.2
    # via -r requirements.in
greenlet==0.4.16
    # via gevent
constraints.txt
gevent==20.6.2
greenlet==0.4.16
e
Do you have pex installed? If so can you run
pex -V
and then
pex --constraints constraints.txt gevent==20.6.2 -onate.pex
?
For example, I find:
$ pex -V
2.1.42
$ cat constraints.txt 
gevent==20.6.2
greenlet==0.4.16

$ pex --constraints constraints.txt gevent==20.6.2 -onate.pex
$ pex-tools nate.pex info | jq .distributions
{
  "gevent-20.6.2-cp39-cp39-manylinux2010_x86_64.whl": "bf57b6087e29a0bb652e029522e30cf5a794edd6",
  "greenlet-0.4.16-cp39-cp39-linux_x86_64.whl": "8a34135ee4a0f738cf8eff15aabb139e72d894b2",
  "setuptools-56.2.0-py3-none-any.whl": "4c58b9c155902d9c63742e1862a7ef4ba4886751",
  "zope.event-4.5.0-py2.py3-none-any.whl": "29e480254c9d32c09e95f9e58b759ecc1d309629",
  "zope.interface-5.4.0-cp39-cp39-manylinux2010_x86_64.whl": "94f09dbe0ee11d80ac374261764597ebd489cea7"
}
a
pex-tools nate.pex info | jq .distributions
{
  "gevent-20.6.2-cp39-cp39-manylinux2010_x86_64.whl": "bf57b6087e29a0bb652e029522e30cf5a794edd6",
  "greenlet-0.4.16-cp39-cp39-linux_x86_64.whl": "8a34135ee4a0f738cf8eff15aabb139e72d894b2",
  "setuptools-56.2.0-py3-none-any.whl": "4c58b9c155902d9c63742e1862a7ef4ba4886751",
  "zope.event-4.5.0-py2.py3-none-any.whl": "29e480254c9d32c09e95f9e58b759ecc1d309629",
  "zope.interface-5.4.0-cp39-cp39-manylinux2010_x86_64.whl": "94f09dbe0ee11d80ac374261764597ebd489cea7"
}
I get the same thing
e
Well, that's not apples-to-apples since your case above involved python3.8. That'd be
pex --python=/path/to/same/python ...
Just for sanity's sake.
Oh - is "creating a distribution" a key phrase here I glossed over? Do you have the same problem when you create an app / PEX via
./pants package
or is just when building a distribution / wheel?
a
ah ha, yeah - so I just built a pex package of our app and the dependency is included correctly (like above in nate.pex). But when building a distribution, or iterating through the python requirements targets in a plugin, I only get the direct third-party dependencies. I.e. my install_requires is this:
'install_requires': (
...
        'gevent==20.6.2',
...
    ),
no greenlet at all - so then when I pip install that package, it only follows gevent's own constraint on greenlet (>=0.4.16) and picks up the most recent version (and its binary incompatibility)
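A toy illustration of that (this is not pip's real resolver - just max-version selection under gevent's own floor, with made-up version lists):

```python
# With only gevent's "greenlet>=0.4.16" floor in play, the newest available
# release wins; the ==0.4.16 pin has to travel in install_requires (or a
# constraints file pip is told about) to hold it back.
def pick(available, floor):
    key = lambda v: tuple(int(p) for p in v.split("."))
    return max((v for v in available if key(v) >= key(floor)), key=key)

print(pick(["0.4.16", "0.4.17", "1.1.0"], "0.4.16"))  # -> 1.1.0
```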
e
Ok. Explained then. So - yeah, we do not try to include your constraints in any way in distributions we help you package up, and this really is not a thing anyway. I think you've simply hit a problem that is best solved with extras? Would that work? Your distribution would have a greenlet extra that only gets installed with
yourdist[greenlet]
and you can use that extra to express your constraint.
I'll hang back since I think you have a full grasp of the issue now. If you need further guidance, speak up.
a
I... am not really sure how extras work so I'd have to do a little more looking!
it does seem like a package should be able to express that it only works with some versions of upstream dependencies, even if its direct dependencies are more forgiving about what versions they work with (although I'm not sure how this is, in general, handled in python)
but what I'm actually trying to do here is really what the pex implementation is doing: get a full list of transitive dependencies + versions (in a plugin). Do you have any advice about how to achieve that?
e
Well, the problem in this case is the dependency is soft - it's optional. So extras are for optional dependencies, and they must be specifically asked for by a user in a requirement for your dist in square brackets: https://setuptools.readthedocs.io/en/latest/userguide/dependency_management.html#optional-dependencies
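A setup.py fragment along those lines (all names here are hypothetical; see the linked setuptools docs for the real mechanics):

```python
# setup.py sketch: the greenlet pin rides in an extra, installed only when
# a consumer asks for yourdist[greenlet].
from setuptools import setup

setup(
    name="yourdist",
    version="0",
    install_requires=["gevent==20.6.2"],
    extras_require={
        # `pip install yourdist[greenlet]` pulls in exactly this pin
        "greenlet": ["greenlet==0.4.16"],
    },
)
```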
a
wait is the dependency optional in this case? we require gevent and gevent requires greenlet
e
And you require greenlet but are not saying so.
You really need to say so.
So you must add a requirement on it - with your constraints
a
ahhhhh ok yes that makes more sense
e
Ok. Excellent. Otherwise, your code that uses greenlet when present needs to trap the error you were showing and fail more gracefully.
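A sketch of that graceful-degradation option (the flag name is ours; the escalate-to-error handling is one way to surface the size-changed warning you saw):

```python
# Treat greenlet as optional: escalate the binary-incompatibility
# RuntimeWarning to an error at import time, and fall back cleanly if the
# module is missing or ABI-mismatched.
import warnings

try:
    with warnings.catch_warnings():
        warnings.simplefilter("error", RuntimeWarning)
        import greenlet  # noqa: F401
    HAVE_GREENLET = True
except (ImportError, RuntimeWarning):
    HAVE_GREENLET = False
```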
a
Do you have any info about how to do.. the thing I actually end up needing to do?
but what I'm actually trying to do here is really what the pex implementation is doing: get a full list of transitive dependencies + versions (in a plugin)
e
Ah. Sorry - missed that. So, what does the plugin operate on? Just distributions or also first party code?
Basically, what is your current or planned input set to this rule?
And what is the intended end result? names and versions or a graph or urls pointing to where to find these or ...?
a
@enough-analyst-54434 I'm trying to produce a docker image w/ all dependencies installed. I just had a think on it, and the easy + good solution is to copy in the constraints.txt and use it when installing all dependencies
since I get all the inferred/explicit dependencies easily
e
Ok. So instead of building a PEX and copying that or those into the image, you want a single venv with all the deps installed and you'll do that with Pip?
a
yeah, I think so (I'm generating a dockerfile from the target parameters and it generates a command
pip install {dependency line}
in the dockerfile for each python library dependency)
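That generation step might sketch out like this (the helper name and paths are hypothetical - the point is one constrained pip install per requirement):

```python
# Emit Dockerfile lines: copy constraints.txt in once, then constrain every
# pip install with it so transitive pins (like the greenlet one) are honored.
def docker_pip_lines(requirements):
    lines = ["COPY constraints.txt /constraints.txt"]
    lines += ["RUN pip install --constraint /constraints.txt '%s'" % req
              for req in requirements]
    return lines

for line in docker_pip_lines(["gevent==20.6.2"]):
    print(line)
```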
e
OK. Sounds like you have a path. Just in case you're not aware, pex_binary supports execution_mode="venv" and those PEX files will turn themselves into a venv on 1st run and forevermore execute through that. You can also say
PEX_TOOLS=1 /venv.mode.pex venv right/here
to build the venv up front - say in a Dockerfile
RUN
.
a
hmmm, I was not going to package everything through a PEX since it will already be isolated in the docker container
or are you saying build the pex to build the venv, copy the venv & then throw away the rest of the pex?
e
Yup, you could do exactly that with the RUN step.
RUN PEX_TOOLS=1 ./my.pex venv right/here && rm my.pex
a
I think my idea will cache (on docker) better, so I'm gonna give that a shot and see if it works
e
Oh yeah - like I said - sounds like you have a path. I just wanted to make sure you knew about Pants & Pex venv support.
a
rad thanks
I'll keep y'all updated with how it goes