# general
b
So we are running into an issue where the latest 2.18.0rc1 pex artifacts are linked on GitHub as a release under the download/release path. The current URLs are on the github.com domain (https://github.com/pantsbuild/pants/releases/download/release_2.18.0rc0/pants.cp39-darwin_x86_64.pex) but redirect to https://objects.githubusercontent.com/github-production-release-asset-2e65be/7209075/a0373f[…].pex&response-content-type=application%2Foctet-stream. We are using an artifact caching tool to proxy requests to GitHub for security reasons (AV scanning, checksums, etc), and the redirects are currently breaking our ability to cache the content. Is this particular set of artifacts (and others moving forward) going to keep being redirected?
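For reference, a minimal sketch (assuming the `requests` package is available) of what the caching proxy sees when it fetches one of these release URLs directly, without following redirects:

```python
# Observe the redirect GitHub issues for a release asset.
import requests

url = (
    "https://github.com/pantsbuild/pants/releases/download/"
    "release_2.18.0rc0/pants.cp39-darwin_x86_64.pex"
)

# Don't follow the redirect, so we can inspect the 302 response itself.
resp = requests.get(url, allow_redirects=False)
print(resp.status_code)              # Expected: 302
print(resp.headers.get("Location"))  # Signed objects.githubusercontent.com URL
```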
e
Presumably yes, at GitHub's whim. We're an open source project / non-profit with a limited bank account, so we try to use free services when possible.
In the older FTP-driven world, many OSS projects had friendlies that set up mirrors, mostly to take load off, but that sort of friendly action would also alleviate this issue, assuming the mirror itself did not use redirects.
b
cool. I'll pass it along to our devops team that this is a GitHub choice and not a project-specific choice. This will be important, as it sounds like this choice is going to affect a large swath of projects hosted on GitHub.
e
Yeah - lots of folks use GitHub Releases - that's the product feature here.
If push comes to shove and you must work around it, there is a way to specify alternate URLs, with a bit of work per release you wish to use; see: https://github.com/pantsbuild/scie-pants?tab=readme-ov-file#firewall-support
b
yeah, that's what we've been doing: the ptex mappings. I'm just not sure whether the URLs we are redirected to are static for the object store, or whether the 3xx redirects are generated with specific auth tokens, etc.
I'll do some investigation and let you know.
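One way to do that investigation, as a sketch (again assuming `requests`): request the redirect Location twice and compare, which shows whether the object-store URL is static or a per-request signed URL with time-limited auth parameters in the query string.

```python
# Check whether the redirect target is stable across requests.
import requests
from urllib.parse import parse_qs, urlsplit

url = (
    "https://github.com/pantsbuild/pants/releases/download/"
    "release_2.18.0rc0/pants.cp39-darwin_x86_64.pex"
)

locations = [
    requests.get(url, allow_redirects=False).headers["Location"] for _ in range(2)
]
print("identical:", locations[0] == locations[1])
# The query parameters typically carry the signature/expiry information.
print(sorted(parse_qs(urlsplit(locations[0]).query)))
```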
e
Well, I'd think you'd not use GitHub at that point. Since you write down the URL, can you not have the process that does that also just download the blob and store it at a local URL? Whether that process of editing the ptex mapping is a human or a script.
IIRC the URLs can even be file:// if you use NFS at your org.
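A rough sketch of that idea (the mirror path is hypothetical, and `requests` is assumed): download the asset once and store it somewhere a file:// URL, e.g. on an NFS mount, can point at.

```python
# Mirror a release asset to a shared path and print the file:// URL to record.
import pathlib
import requests

url = (
    "https://github.com/pantsbuild/pants/releases/download/"
    "release_2.18.0rc0/pants.cp39-darwin_x86_64.pex"
)
mirror_root = pathlib.Path("/mnt/shared/pants-mirror")  # hypothetical NFS mount
dest = mirror_root / url.rsplit("/", 1)[-1]

dest.parent.mkdir(parents=True, exist_ok=True)
with requests.get(url, stream=True) as resp:  # redirects are followed here
    resp.raise_for_status()
    with open(dest, "wb") as f:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            f.write(chunk)

print(f"file://{dest}")  # the URL to put in the alternate-URL mapping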
b
yeah, the only option we would have is to check it in (or store it via git LFS) in a repo to use file URLs.
e
Ok, so you're ~totally outsourced with no infra - no internal web servers.
b
yeah, we are dark in our build environment and all of the dev tools must be proxied. We have artifactory, but nothing else.
Also note,
When you download a release asset from GitHub, the link redirects from the one you provided, using an HTTP 302, to objects.githubusercontent.com, using a special, time-limited, signed URL. By the time the download failed, the link was no longer valid. When your program retried the connection, it should have retried using the original URL because an HTTP 302 is a _temporary_ redirect, and therefore the user-agent (your browser or download tool) is supposed to access the _original_ URL since the redirect might change (which, in this case, it did).
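A minimal sketch of that retry behaviour (assuming `requests`): on a failed or interrupted download, restart from the original GitHub URL so a fresh signed redirect is issued, rather than reusing an expired Location.

```python
# Download with retries that always restart from the original URL.
import requests

def download(original_url: str, dest: str, attempts: int = 3) -> None:
    for attempt in range(1, attempts + 1):
        try:
            # Each attempt starts from the original URL; requests follows the
            # temporary 302 to a freshly signed objects.githubusercontent.com URL.
            with requests.get(original_url, stream=True, timeout=60) as resp:
                resp.raise_for_status()
                with open(dest, "wb") as f:
                    for chunk in resp.iter_content(chunk_size=1 << 20):
                        f.write(chunk)
            return
        except requests.RequestException:
            if attempt == attempts:
                raise
```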
e
So, I have had admin access to an Artifactory instance in the past and you can definitely manually upload artifacts. Perhaps that feature is not available to you, or is banned by security, though?
b
yes, we can do that, and it's going to be our last resort. But I don't manage the workload for our devops team (who is crushed with work), so the latency of getting things done is measured in hours or days, not minutes.
e
That term has so warped. I thought DevOps meant devs do ops. Sounds more like dedicated ops. Good luck!
b
👍
looks like our team did some black magic to make it work. Thanks for the prompt support on this.