# welcome
g
Hi all! I’m Zeyuan, working with Python, C++, and Go. I found Pants while searching for code-analysis and test-automation tools for Python. Besides building and testing code, I’m evaluating Pants because I’m also looking for scalable, general-purpose dependency management and caching for CI/CD.
h
FWIW, I wrote this a few months ago on the topic of CI/CD caching: https://dev.to/benjyw/better-cicd-caching-with-new-gen-build-systems-3aem. And my company, https://toolchain.com, provides remote caching as a service, if that’s of interest.
Re dependency management, @curved-television-6568 is working on a very robust feature that lets you “lint your dependencies” and limit who can depend on what. And because Pants relies on dependency inference rather than requiring dependencies to be declared in BUILD files, it will enforce your “real” dependencies (based on your imports) as well as any declared ones.
g
Thanks @happy-kitchen-89482! My feeling is that as you go bottom-up from the source-code level to the service/deployment level, dependencies become less inferable and more complicated and manually specified. It would be good to have a single tool that covers both; otherwise you end up picking build tools for the lower level and orchestration tools for the upper level. Happy to see there are ongoing efforts to make dependencies a core concept and extend them to fit different granularities.
h
I’m interested in what kind of dependency management use-cases you had in mind. Pants can do things like “Which binaries do I need to redeploy based on git changes since commit X?”, “Which tests are impacted by this file?”, etc.
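For concreteness, those queries map onto Pants’ change-detection flags roughly like this (a hedged sketch; exact flag spellings vary by Pants version, e.g. `--changed-dependents` was previously `--changed-dependees`):

```shell
# Which targets changed since the main branch?
pants --changed-since=main list

# Which tests are impacted, directly or transitively, by those changes?
pants --changed-since=main --changed-dependents=transitive test

# Which deployable binaries are affected? (filtering to binary targets)
pants --changed-since=main --changed-dependents=transitive \
  --filter-target-type=pex_binary list
```

These only work inside a Pants-enabled repo, since they rely on Pants’ inferred dependency graph.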
g
Actually I’m not fully on a monorepo yet, so the workflow is a little messy. For example: if git changes happen in a “core” repo, a “deployment” repo will use the “core” repo’s source code to build some Docker images, then use those images to run integration tests, etc. Right now there are many bash scripts in between the dependent components.
The current state is that I build the dependency graph by declaring components in one file and specifying how their dependencies are resolved (mostly via bash scripts) in a separate file.
If I instead had docker build, integration testing, etc. as separate services and used an orchestration framework to define the workflows, it would be much better, but it might also lose some opportunities for reuse/caching.
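That “declare components in one file, resolve their dependencies separately” setup can be sketched as a tiny graph query (component names here are hypothetical, not from a real repo):

```python
from collections import deque

# component -> components it depends on, as one might declare in a file
DEPS = {
    "core/module": [],
    "deployment/image": ["core/module"],
    "integration-tests": ["deployment/image"],
}

def dependents_of(changed: str) -> list[str]:
    """Return everything that transitively depends on `changed`."""
    # Invert the graph: component -> components that depend on it.
    rdeps = {c: [] for c in DEPS}
    for comp, deps in DEPS.items():
        for d in deps:
            rdeps[d].append(comp)
    # Breadth-first walk from the changed component.
    affected, queue, seen = [], deque([changed]), {changed}
    while queue:
        cur = queue.popleft()
        for dep in rdeps[cur]:
            if dep not in seen:
                seen.add(dep)
                affected.append(dep)
                queue.append(dep)
    return affected

print(dependents_of("core/module"))  # → ['deployment/image', 'integration-tests']
```

This is essentially the question (“what must rebuild/retest when X changes?”) that Pants answers from inferred imports instead of a hand-maintained file.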
h
I see. How would “deployment” consume the code in “core”? Is “core” publishing a new wheel or something?
g
It’s simple: just copy-paste some of “core”’s source code, and “deployment” will use that to build Docker images, e.g. in “deployment”’s Dockerfile:

```dockerfile
COPY core/module /workspace
RUN pip install .
```