# general
j
Hi all, another question: I have a few large files I want to add to a `files` target, some of them 3 GB+, and I'm getting this error:
Snapshot failed: Failed to digest inputs: Throw { val: Error storing Digest { hash: Fingerprint<4728718212c19c56cdcb11074aa5ef1e939461df8de8e5998130854af496d427>, size_bytes: 3395621743 }: Invalid argument, python_traceback:
Not sure what’s the best way to handle large file with pants
w
Larger files will be supported in 2.17.x (and should already be on `main`), but aren't before that.
j
Got it. If I don't use pex or archive, but just want to COPY some large files in the Dockerfile directly, that should be feasible, right?
# <root>/my_project/BUILD
docker_image(
    name="docker",
    dependencies=[":pkg"],
    # context_root="",
)

# the large files are at <root>/models

# Dockerfile
COPY models /
For some reason, the files can't be found when building the image.
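A likely cause (a sketch, assuming default Pants behavior, not a confirmed diagnosis of this repo): Pants builds the Docker context only from the `docker_image` target's dependencies, so files that are not depended on never reach the context, and `COPY` can't find them. The models would need their own `files` target, and the `docker_image` would need to depend on it; the target names and globs below are illustrative assumptions:

```python
# <root>/models/BUILD (hypothetical)
files(
    name="models",
    sources=["**/*"],
)

# <root>/my_project/BUILD
docker_image(
    name="docker",
    # depend on both the package and the model files so they land in the context
    dependencies=[":pkg", "//models:models"],
)
```

Since the context mirrors the repo layout (with the default `context_root`), the Dockerfile instruction would then be `COPY models /models` rather than copying from a path outside the context.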