I've found that a zip created by python_aws_lambda...
# general
b
I've found that a zip created by python_aws_lambda_function has strange permissions on the directories: they're 756 instead of 755. Notably, that missing o+x bit has caused me a lot of headaches! My Python imports are failing since it can't traverse the subdirectories.
b
Hm, sorry for the trouble. Can you provide a bit more detail about what you're seeing? As yet, I can't reproduce. When I look at one of the python_aws_lambda_function-built lambdas in my work codebase, I see directories with permissions like
drwxr-xr-x      0   1-Jan-1980 00:00:00  aws_helpers/
which I think is 755, as desired (that's unzip -l ... output)
b
I'll see if I can get a minimal repro. But for example (excerpted from unzip -Z lambda.zip):
drwxr-xrw-  2.0 unx        0 b- stor 80-Jan-01 00:00 zipp/
drwxr-xrw-  2.0 unx        0 b- stor 80-Jan-01 00:00 zipp-3.18.2.dist-info/
-rw-r--r--  2.0 unx        4 b- defN 80-Jan-01 00:00 zipp-3.18.2.dist-info/INSTALLER
-rw-r--r--  2.0 unx     1023 b- defN 80-Jan-01 00:00 zipp-3.18.2.dist-info/LICENSE
-rw-r--r--  2.0 unx     3539 b- defN 80-Jan-01 00:00 zipp-3.18.2.dist-info/METADATA
-rw-r--rw-  2.0 unx      756 b- defN 80-Jan-01 00:00 zipp-3.18.2.dist-info/RECORD
-rw-r--r--  2.0 unx       92 b- defN 80-Jan-01 00:00 zipp-3.18.2.dist-info/WHEEL
-rw-r--r--  2.0 unx        5 b- defN 80-Jan-01 00:00 zipp-3.18.2.dist-info/top_level.txt
-rw-r--r--  2.0 unx    11361 b- defN 80-Jan-01 00:00 zipp/__init__.py
drwxr-xrw-  2.0 unx        0 b- stor 80-Jan-01 00:00 zipp/compat/
-rw-r--r--  2.0 unx        0 b- defN 80-Jan-01 00:00 zipp/compat/__init__.py
-rw-r--r--  2.0 unx      219 b- defN 80-Jan-01 00:00 zipp/compat/py310.py
-rw-r--r--  2.0 unx     3082 b- defN 80-Jan-01 00:00 zipp/glob.py
4748 files, 44644983 bytes uncompressed, 21482250 bytes compressed:  51.9%
the directories show up as drwxr-xrw-, which is surprising
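for reference, a quick filter over that unzip -Z lambda.zip output (just a sketch, not anything pants provides) lists the offending directories:
# print each directory entry whose "other" permission bits are missing x
unzip -Z lambda.zip | awk '$1 ~ /^d/ && substr($1, 8, 3) !~ /x/ {print "missing o+x:", $NF}'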
b
yeah, that is surprising
b
my umask looks normal (0022)
it happened in our GitHub Actions CI/CD also, so I doubt it's my local env
b
sorry, that appears to be empty
b
ah, I just saw that too. I'll get it in a proper repo (I ran into Codespaces permission problems before, but I'm just doing it locally now)
b
Hm, this is what I see if I run pants package :: && unzip -Z dist/lambda.zip:
...
Zip file size: 690 bytes, number of entries: 5
-rw-r--r--  2.0 unx       91 b- defN 80-Jan-01 00:00 lambda.py
-rw-r--r--  2.0 unx       37 b- defN 80-Jan-01 00:00 lambda_function.py
drwxr-xr-x  2.0 unx        0 b- stor 80-Jan-01 00:00 sub/
drwxr-xr-x  2.0 unx        0 b- stor 80-Jan-01 00:00 sub/sub/
-rw-r--r--  2.0 unx       63 b- defN 80-Jan-01 00:00 sub/sub/something.py
5 files, 191 bytes uncompressed, 170 bytes compressed:  11.0%
b
huh. I've been running in GitHub Codespaces
I just started looking at the code, I'm in pex_venv now
wondering if Codespaces does something weird setting up temp dirs or something? Haven't ever seen any symptoms like this before
b
The zip creation is done by the pex tool. Using --keep-sandboxes=always, then jumping into the sandbox that creates the zip and playing with the __run.sh in there, will theoretically eliminate some layers here
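roughly this loop (the sandbox path below is a placeholder; pants prints the real one in its log output):
pants --keep-sandboxes=always package ::
# look for a log line like "Preserving local process execution dir /tmp/pants-sandbox-XXXXXX ..."
cd /tmp/pants-sandbox-XXXXXX         # placeholder; use the path from your log
find . -type d -exec ls -ld {} +     # inspect the directory perms that will end up in the zip
./__run.sh                           # replay the exact zip-building invocation outside of pants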
b
I'll try that
where's the sandbox?
b
It'll be a temporary directory that pants prints out, e.g. on my macOS machine, it's the path in the output like:
14:52:45.44 [INFO] Preserving local process execution dir /private/var/folders/sv/vd266m4d4lvctgs2wpnhjs9w0000gn/T/pants-sandbox-1vUD9B for Build python_aws_lambda_function artifact for //:lambda
b
mine must be all cached; it's just saying
04:53:34.54 [INFO] Wrote dist/lambda.zip
b
ah, yeah. pants --no-local-cache --keep-sandboxes=always ... might do the trick
b
I just found the page describing --no-pantsd; trying that first
yep
so yeah, the sandbox directory has drwxr-xrw-+, and /tmp itself looks like:
drwxr-xrwt+ 13 root root 4096 May 29 04:55 /tmp/
b
Do the individual directories within the sandbox pants creates have the same permissions?
b
yes:
codespace@codespaces-d61058:/tmp/pants-sandbox-gsvAIj$ ls -al lambda
total 20
drwxr-xrw-+ 3 codespace codespace 4096 May 29 04:55 ./
drwxr-xrw-+ 6 codespace codespace 4096 May 29 04:55 ../
-rw-r--r--  1 codespace codespace   91 May 29 04:55 lambda.py
-rw-r--r--  1 codespace codespace   37 May 29 04:55 lambda_function.py
drwxr-xrw-+ 3 codespace codespace 4096 May 29 04:55 sub/
codespace@codespaces-d61058:/tmp/pants-sandbox-gsvAIj$ find lambda -ls
   131126      4 drwxr-xrw-   3 codespace codespace     4096 May 29 04:55 lambda
   131127      4 drwxr-xrw-   3 codespace codespace     4096 May 29 04:55 lambda/sub
   131130      4 drwxr-xrw-   2 codespace codespace     4096 May 29 04:55 lambda/sub/sub
   131131      4 -rw-r--r--   1 codespace codespace       63 May 29 04:55 lambda/sub/sub/something.py
   131129      4 -rw-r--r--   1 codespace codespace       37 May 29 04:55 lambda/lambda_function.py
   131128      4 -rw-r--r--   1 codespace codespace       91 May 29 04:55 lambda/lambda.py
this matches what ends up in my zipfile!
I don't fully understand why the sandboxes are 756 instead of 755
b
That is weird. Just to confirm: what are the permissions of the files on disk outside the sandbox?
b
/tmp just has the odd behavior; looks "normal" (755) elsewhere
codespace@codespaces-d61058:~$ mkdir /tmp/foo ~/foo
codespace@codespaces-d61058:~$ ls -ald /tmp/foo ~/foo
drwxr-sr-x  2 codespace codespace 4096 May 29 05:04 /home/codespace/foo/
drwxr-xrw-+ 2 codespace codespace 4096 May 29 05:04 /tmp/foo/
I could involve GitHub support if useful
I could probably work around this by setting a TMPDIR outside of /tmp, or similar
b
I wonder if it would also make sense to explicitly set the tmp dirs to 0755 after creating them though?
hm, looks like that won't necessarily be good enough. I tried creating a /tmp/pants-local-execution-root, chmodded it to 0755, and used that as the flag value. The sandboxes inside there still have 756
seems to be the ACL:
codespace@codespaces-d61058:/workspaces/pants-perms-bug$ getfacl /tmp
getfacl: Removing leading '/' from absolute path names
# file: tmp
# owner: root
# group: root
# flags: --t
user::rwx
group::r-x
other::rwx
default:user::rwx
default:group::r-x
default:other::rw-
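that default:other::rw- entry seems to be what makes new subdirectories come out as 756 regardless of umask; e.g. it reproduces on a scratch directory (the path is just an example):
mkdir ~/acl-demo && setfacl -d -m o::rw- ~/acl-demo   # mimic the default ACL Codespaces puts on /tmp
mkdir ~/acl-demo/child
ls -ld ~/acl-demo/child    # drwxr-xrw-+ ..., i.e. 756, even with umask 0022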
just using --local-execution-root-dir=/var/tmp works around it, so at least I can set that in my project's pants.toml
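i.e. roughly this, assuming the toml key mirrors the flag name and [GLOBAL] isn't already defined elsewhere in the file:
cat >> pants.toml <<'EOF'
[GLOBAL]
local_execution_root_dir = "/var/tmp"
EOF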
b
phew, a workaround! annoying that the ACLs infect the built artefacts
b
yeah, I think it's mostly a GitHub Codespaces problem: the default ACL they put on /tmp. That said, I do think it's worth considering checking the perms on pants' sandbox directories as they're created
FYI, no activity there in 2 weeks, so I also filed an enterprise support ticket for this issue