# general
a
I’m developing a microservice using TensorFlow and PyTorch in a monorepo. Building this directory consumes roughly 100 GB of disk. When I checked the details, most of that seems to be cache. I’d like to limit the cache, since disk space on GitHub Actions runners is limited: https://docs.github.com/en/actions/using-github-hosted-runners/about-github-hosted-runners#supported-runners-and-hardware-resources. Is it possible to configure such a setting?
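For reference, I checked the usage with something like this (assuming pip's default cache location on Linux):

```sh
pip cache dir        # where pip keeps its cache (typically ~/.cache/pip on Linux)
pip cache info       # cached wheels and total cache size (pip >= 20.1)
du -sh ~/.cache/pip  # raw size of the cache directory on disk
```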
b
Some of those GB are for your libraries themselves. They can get HUGE. Additionally, I think part of this is the multiple copies of libraries, which AFAIK is how `pip` resolves/installs packages. The GH VMs claim to have 14 GB of space. Not sure whether that's even going to be possible with those libraries 🤔
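If `pip`'s cache is what's eating the disk, a minimal sketch of keeping it off the runner (the `requirements.txt` path is just an assumption about your setup):

```sh
# Skip caching downloaded wheels/sdists entirely during install
pip install --no-cache-dir -r requirements.txt

# Or clear whatever has already accumulated (pip >= 20.1)
pip cache purge

# In a GitHub Actions workflow you could instead set this once for the job:
#   env:
#     PIP_NO_CACHE_DIR: "1"
```

Note this only covers pip's cache, not build caches from compiling TensorFlow/PyTorch themselves, so it's worth confirming which directory is actually the 100 GB one first.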