# pex
r
👋 , it's me again 😁 I've encountered a pickling error when spawning multiple processes in pex. The context:
1. I'm trying to create a binary of this script: https://github.com/facebookresearch/detectron2/blob/master/tools/train_net.py
2. The script spawns as many processes as there are GPUs on the machine, using the `launch` function
3. When running the script to spawn 2 processes, for example, I get the following error:
```
File "/home/ubuntu/git/repo/.pants.d/pyprep/requirements/CPython-3.7.4/98abeb07f7c533df9da32c92acfed24c8711ff91/.deps/detectron2-0.1-cp37-cp37m-linux_x86_64.whl/detectron2/engine/launch.py", line 49, in launch
    daemon=False,
  File "/home/ubuntu/git/repo/.pants.d/pyprep/requirements/CPython-3.7.4/98abeb07f7c533df9da32c92acfed24c8711ff91/.deps/torch-1.4.0-cp37-cp37m-manylinux1_x86_64.whl/torch/multiprocessing/spawn.py", line 162, in spawn
    process.start()
  File "/home/ubuntu/.conda/envs/py37/lib/python3.7/multiprocessing/process.py", line 112, in start
    self._popen = self._Popen(self)
  File "/home/ubuntu/.conda/envs/py37/lib/python3.7/multiprocessing/context.py", line 284, in _Popen
    return Popen(process_obj)
  File "/home/ubuntu/.conda/envs/py37/lib/python3.7/multiprocessing/popen_spawn_posix.py", line 32, in __init__
    super().__init__(process_obj)
  File "/home/ubuntu/.conda/envs/py37/lib/python3.7/multiprocessing/popen_fork.py", line 20, in __init__
    self._launch(process_obj)
  File "/home/ubuntu/.conda/envs/py37/lib/python3.7/multiprocessing/popen_spawn_posix.py", line 47, in _launch
    reduction.dump(process_obj, fp)
  File "/home/ubuntu/.conda/envs/py37/lib/python3.7/multiprocessing/reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <function main at 0x7fd2ffb1edd0>: attribute lookup main on __main__ failed
```
w
IIRC, use of `multiprocessing` might require building the pex with `zip_safe=False`
e
From the 1st line of the backtrace above it looks like this is an unzipped PEX. I think you're running afoul of the "Safe importing of main module" guidance: https://docs.python.org/3.7/library/multiprocessing.html#multiprocessing-programming Probably something to do with the fact that a PEX has `__main__.py` as its entry point for bootstrapping, which then hands off to your entry point. We do take pains to delegate to your entry point as if it were `__main__` though: https://github.com/pantsbuild/pex/blob/70d810045be9270cd9ab0e30916b2380a9da04f2/pex/pex.py#L538-L542
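(For context, a minimal sketch of the pattern that trips this — a hypothetical standalone script, not the detectron2 code. Python's pickle saves a function by its module and qualified name and verifies that lookup when dumping; a function defined in the entry-point script is recorded as `__main__.main`, and inside a PEX the module registered as `__main__` is the bootstrap rather than your script, so the lookup fails as in the traceback above.)

```python
# Hypothetical minimal example of the pattern that breaks under a PEX.
# torch.multiprocessing.spawn (used by detectron2's launch) pickles the target
# function by reference; when it is defined in the entry-point script that
# reference is "__main__.main", which a PEX's bootstrap __main__ cannot satisfy.
import torch.multiprocessing as mp


def main(rank):
    # placeholder worker body
    print(f"worker {rank} started")


if __name__ == "__main__":
    # With the "spawn" start method each child re-imports the main module and
    # resolves the pickled "__main__.main" reference before calling it.
    mp.spawn(main, nprocs=2, daemon=False)
```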
r
should I create an issue in pex then?
or maybe try to define a different entry point instead of `__main__`?
e
This doesn't really sound like an issue solvable by PEX, since its central enabling mechanism is its `__main__.py`.
Yeah, you should probably read up on those multiprocessing criteria and try to rework the Facebook main example. It looks like just moving the multiprocessing code into a module that main delegates to might do it, as sketched below.
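(A hedged sketch of that rework, using made-up file and module names rather than the real detectron2 layout: keep the entry point as a thin shim and move `main` plus the `launch(...)` call into a regular importable module, so the worker function pickles as `trainer.main` instead of `__main__.main`.)

```python
# trainer.py -- hypothetical module holding the multiprocessing code.
# Because `main` now lives in an importable module, torch.multiprocessing can
# pickle it by its real qualified name instead of "__main__.main".
from detectron2.engine import default_argument_parser, launch


def main(args):
    # ... the actual training setup from tools/train_net.py would go here ...
    pass


def run():
    args = default_argument_parser().parse_args()
    launch(
        main,
        args.num_gpus,
        num_machines=args.num_machines,
        machine_rank=args.machine_rank,
        dist_url=args.dist_url,
        args=(args,),
    )
```

```python
# train_net.py -- the PEX entry point, now just a delegating shim; nothing
# defined here is handed across process boundaries.
import trainer

if __name__ == "__main__":
    trainer.run()
```

With that split, spawned children import `trainer` by name regardless of what module happens to be `__main__` inside the PEX.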
r
thx, i'll do that
fixed, thx @enough-analyst-54434:
> moving the multiprocessing code into a module that main delegates to might do it