full-window-78652 — 05/13/2024, 11:00 PM
requirements.txt in Cloud Function?
Update:
• Packaging: if the packages are located in the same repository, then best practice is to let Pants handle the dependencies and skip Artifact Registry. This works best if all packages are in a single resolve.
• Local development: the only thing that worked for me was creating a custom shell script to:
  ◦ update PYTHONPATH to enable first-party imports; and
  ◦ create venvs using requirements.txt for each resolve.
    ▪︎ Note: I had to use standard python and pip to create the venvs, as I had issues with the virtualenvs exported using Pants.

broad-processor-92400 — 05/14/2024, 11:58 PM

full-window-78652 — 05/15/2024, 12:30 AM
pants.toml
libs/
  lib1/
  lib2/
projects/
  cloud_function/
    src/
      main.py
  another_cloud_function/
    src/
      main.py

And cloud_function/src/main.py has an import like:
from lib2 import function_2
And the main lib2 python file has an import like:
from lib1 import function_1
Assuming, based on your question, that we can skip Artifact Registry if in the same repo?

broad-processor-92400 — 05/15/2024, 12:49 AM

broad-processor-92400 — 05/15/2024, 12:49 AM

full-window-78652 — 05/15/2024, 1:34 AM

broad-processor-92400 — 05/15/2024, 1:41 AM

full-window-78652 — 05/15/2024, 1:44 AM

full-window-78652
05/17/2024, 4:54 AM
$ functions-framework --source=src/main.py --target=example_endpoint --debug
but I cannot find a good way to do this with Pants.
What I tried:
• Changing the directory structure of the cloud function
• pants run
• run_shell_command
• Activating a virtual env exported using Pants, which works for third-party deps but not first-party deps (e.g. lib1)
• Using pants package to create the zip file, then extracting the files, cd'ing to the directory, activating the virtual env exported using Pants, and running functions-framework
  ◦ This technically 'works', but would be a pain to do on a regular basis while developing.
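[Editor's note] One workaround consistent with what eventually worked later in this thread is to put the source roots on PYTHONPATH and invoke functions-framework directly against the working tree. This is an editorial sketch, not a command from the thread: the root paths are assumptions taken from the layout above, and the final invocation is shown as a dry run.

```shell
# Editorial sketch (hypothetical paths): run functions-framework against the
# working tree instead of a packaged zip, with first-party roots on PYTHONPATH.
REPO_ROOT="$(pwd)"
export PYTHONPATH="$REPO_ROOT/libs:$REPO_ROOT/projects/cloud_function/src"

# Dry run: echo the command. To actually serve, run $CMD from
# projects/cloud_function inside a venv that has functions-framework installed.
CMD='functions-framework --source=src/main.py --target=example_endpoint --debug'
echo "$CMD"
```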
Question #2: The only way I could get tests working for the cloud function was to place test_main.py next to main.py. Is this necessary?
We generally place tests in a separate tests directory, like below:
cloud_function/
  src/
    main.py
  tests/
    unit/
      test_main.py
When using Pants, I tried adding this line to test_main.py:
from cloud_function.src.main import get_data
This is the error when using the tests/ directory structure:
Command:
pants test projects/cloud_function/tests/unit/test_main.py
Error:
ModuleNotFoundError: No module named 'cloud_function'
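[Editor's note] The error can be reproduced outside Pants entirely; it is a plain sys.path question. A minimal sketch (hypothetical files in a temp directory, assuming cloud_function/src is the directory on the import path, as a Pants source root would be):

```shell
# Editorial sketch: when cloud_function/src is the directory on the import
# path, main.py is importable only as `import main`.
tmp="$(mktemp -d)"
mkdir -p "$tmp/projects/cloud_function/src"
printf 'def get_data():\n    return "data"\n' > "$tmp/projects/cloud_function/src/main.py"

# Works, prints "data":
PYTHONPATH="$tmp/projects/cloud_function/src" python3 -c "from main import get_data; print(get_data())"

# Fails: no directory on sys.path contains a `cloud_function` package, so this
# prints "ModuleNotFoundError: No module named 'cloud_function'":
PYTHONPATH="$tmp/projects/cloud_function/src" python3 -c "import cloud_function" 2>&1 | tail -1
```

This matches the answer below: the import path is determined by which directory is the source root, not by where the test file lives.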
broad-processor-92400 — 05/17/2024, 6:37 AM
1. a. Setting PYTHONPATH based on the pants roots output will allow Python to find the first-party modules: https://www.pantsbuild.org/2.20/docs/using-pants/setting-up-an-ide#first-party-sources
   b. For the package, there's a new layout="flat" field coming in 2.22 (not yet released) that packages to just loose files, not a zip file; see the notes https://github.com/pantsbuild/pants/blob/main/docs/notes/2.22.x.md#python or the docs https://www.pantsbuild.org/2.22/reference/targets/python_google_cloud_function#layout. This would slightly simplify things.
2. This one depends on your source root configuration; see pants roots. A source root defines a location that contains libraries (i.e. what can be imported). If cloud_function/src is a root, then main.py would be expected to be loaded as just import main. If you want the import path to be cloud_function.src.main, you'll need to have the parent directory of cloud_function be the source root instead. https://www.pantsbuild.org/2.20/docs/using-pants/key-concepts/source-roots

full-window-78652 — 05/17/2024, 4:22 PM

full-window-78652
05/20/2024, 9:44 PM
1. a. Using a .env file with pants roots does allow VS Code to trace references, but it didn't help with running locally using functions-framework.
      i. Tried updating VS Code and reinstalling extensions etc.; also found this thread, which was somewhat helpful re: VS Code settings.
   b. Ended up creating a custom shell script to update the PYTHONPATH variable to include the pants roots, so you can access first-party deps within any subdirectory of the repo.
   c. Tried at first using the virtualenvs created with pants export, but then couldn't import first-party deps using the PYTHONPATH config above.
   d. Ended up creating a custom shell script to create separate venvs and install packages from the requirements.txt files; then in any directory you can just activate the exported venv.
2. Testing
   a. Using import main works!
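[Editor's note] The two scripts from steps (b) and (d) might look roughly like this. This is an editorial sketch, not the author's actual scripts: the root list is stubbed (a real repo would read it from pants roots), and the .venvs directory and requirements file names are assumptions. The venv/pip commands are echoed as a dry run.

```shell
# Sketch of step (b): prepend every Pants source root to PYTHONPATH so
# first-party imports resolve from any subdirectory of the repo.
REPO_ROOT="$(pwd)"
ROOTS="libs projects/cloud_function/src"   # in a real repo: ROOTS="$(pants roots | tr '\n' ' ')"
PP=""
for root in $ROOTS; do
  PP="${PP:+$PP:}$REPO_ROOT/$root"         # join roots with ':'
done
export PYTHONPATH="$PP${PYTHONPATH:+:$PYTHONPATH}"
echo "$PYTHONPATH"

# Sketch of step (d): one venv per resolve, built with plain python/pip from a
# requirements.txt (file locations are assumptions). Shown as a dry run;
# drop the echo to actually create the venvs.
for resolve in cloud_function another_cloud_function; do
  echo "python3 -m venv .venvs/$resolve"
  echo ".venvs/$resolve/bin/pip install -r requirements-$resolve.txt"
done
```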