# general
b
Hey folks! I’m trying to debug a particularly odd issue with typechecking using Pyright in GitHub Actions CI jobs. The Pyright run fails in CI when remote caching is enabled, but succeeds when remote caching is disabled. I set up the remote cache using bazel-remote, similar to the implementation here: https://github.com/pantsbuild/pants/pull/19144. Code to set up remote caching (currently commented out):
#    - name: Setup Pants remote cache
#      env:
#        AWS_ACCESS_KEY_ID: ${{ secrets.PANTS_AWS_ACCESS_KEY_ID }}
#        AWS_SECRET_ACCESS_KEY: ${{ secrets.PANTS_AWS_SECRET_ACCESS_KEY }}
#        AWS_S3_BUCKET: ${{ secrets.PANTS_AWS_S3_BUCKET }}
#      run: |
#        mkdir -p ~/bazel-remote
#        docker run --detach -u 1001:1000 \
#                -v ~/bazel-remote:/data  \
#                -p 9092:9092 \
#                buchgr/bazel-remote-cache:v2.4.1 \
#                --s3.auth_method=access_key \
#                --s3.access_key_id="${AWS_ACCESS_KEY_ID}" \
#                --s3.secret_access_key="${AWS_SECRET_ACCESS_KEY}" \
#                --s3.bucket="${AWS_S3_BUCKET}" \
#                --s3.endpoint=s3.us-east-2.amazonaws.com \
#                --max_size 100
#        echo "PANTS_REMOTE_STORE_ADDRESS=<grpc://localhost:9092>" >> "$GITHUB_ENV"
#        echo "PANTS_REMOTE_CACHE_READ=true" >> "$GITHUB_ENV"
#        echo "PANTS_REMOTE_CACHE_WRITE=true" >> "$GITHUB_ENV"
Is there a way to bust the remote cache / start over? Alternatively, is there some obvious issue with Pyright + remote caching that I am missing?
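One untested idea for starting over without deleting anything: switch to a fresh cache namespace by setting the remote_instance_name global option to a new value. This assumes bazel-remote partitions entries by REAPI instance name, which I haven't verified:
# hypothetical: a new instance name should behave like an empty cache
echo "PANTS_REMOTE_INSTANCE_NAME=ci-cache-v2" >> "$GITHUB_ENV"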
Here are some examples of the errors - they all involve failing to resolve specific libraries that are present in the 3rdparty lockfile:
/tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/spiders/tickers.py
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/spiders/tickers.py:5:6 - error: Import "scrapy.exceptions" could not be resolved (reportMissingImports)
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/spiders/tickers.py:6:6 - error: Import "scrapy.http" could not be resolved (reportMissingImports)
/tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/utils/edgar_parser.py
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/utils/edgar_parser.py:10:6 - error: Import "charset_normalizer" could not be resolved (reportMissingImports)
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/utils/edgar_parser.py:11:6 - error: Import "loguru" could not be resolved (reportMissingImports)
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/utils/edgar_parser.py:9:8 - warning: Import "dateutil.parser" could not be resolved from source (reportMissingModuleSource)
/tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/utils/filing_processor.py
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/utils/filing_processor.py:4:6 - warning: Import "bs4" could not be resolved from source (reportMissingModuleSource)
/tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/scripts/process_company_facts.py
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/scripts/process_company_facts.py:8:6 - error: Import "loguru" could not be resolved (reportMissingImports)
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/scripts/process_company_facts.py:11:6 - error: Import "pydantic" could not be resolved (reportMissingImports)
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/scripts/process_company_facts.py:9:8 - warning: Import "psycopg2" could not be resolved from source (reportMissingModuleSource)
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/scripts/process_company_facts.py:10:6 - warning: Import "psycopg2.extras" could not be resolved from source (reportMissingModuleSource)
/tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/scripts/run_downloader_ecs_tasks.py
  /tmp/pants-sandbox-lZY9ut/scraper/src/py/scraper/scripts/run_downloader_ecs_tasks.py:3:8 - error: Import "boto3" could not be resolved (reportMissingImports)
A thoroughly unsatisfying answer, but nuking the entire contents of the cache S3 bucket worked 🤷
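For anyone who hits this later: "nuking" just means emptying the bucket, e.g. with the AWS CLI (same bucket as in the secrets above). The cache repopulates on subsequent CI runs:
# delete every cached object in the bucket
aws s3 rm "s3://${AWS_S3_BUCKET}" --recursive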
a
This just popped up for me, I think. Did you ever get a recurrence?