build, provenance, publish workflow #346
Merged
When a new tag is pushed, build the sdist and all platform wheels, and generate a signed SLSA provenance document. A new GitHub release draft is created with all the files, then the workflow waits for approval before uploading to Test PyPI and then PyPI. The tag's commit date is used as the build timestamp to avoid date issues with reproducible builds.
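The overall shape might look something like this minimal sketch (the workflow, job, and step names here are assumptions for illustration, not the actual file):

```yaml
# Hypothetical outline of the tag-triggered build job.
name: Publish
on:
  push:
    tags: ["*"]

jobs:
  build:                     # one of several build jobs (sdist + per-platform wheels)
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # Use the tag's commit date as the build timestamp for reproducibility.
      - run: echo "SOURCE_DATE_EPOCH=$(git log -1 --pretty=%ct)" >> $GITHUB_ENV
      - run: python -m build
      - uses: actions/upload-artifact@v3
        with:
          path: ./dist
```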
Thanks @sethmlarson for the examples in https://github.com/sethmlarson/secure-python-package-template and https://github.com/urllib3/urllib3/, and for answering my questions. It may look like one commit, but I've been developing and debugging this in a personal fork for a week.
This differs a bit from the SLSA example and the linked repos, since it deals with the sdist and wheels separately. There are 4 build jobs that all upload artifacts, then subsequent jobs download the artifacts to generate hashes and provenance and perform the uploads, instead of passing hashes around as outputs. It also uses the GitHub CLI to make a draft release rather than letting the SLSA generator workflow create a release; the generator still uploads a build artifact.
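The hash step that the SLSA generator consumes can be sketched roughly like this: base64-encode the sha256sum output of everything in the downloaded dist/ directory (the artifact filename below is a hypothetical stand-in, not a real release file):

```shell
# Build a base64-encoded subjects string from the downloaded artifacts.
mkdir -p dist
printf 'demo' > dist/example-1.0.tar.gz   # stand-in for a real sdist
hashes=$(cd dist && sha256sum -- * | base64 -w0)
echo "$hashes"
```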
Instead of tying both the release and PyPI uploads to the manually approved "publish" environment, they're split into separate jobs and only PyPI is gated. This allows reviewing the generated draft release and files before taking the final permanent step.
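A rough sketch of that split, assuming hypothetical job names and a manually approved environment called "publish":

```yaml
  release:                  # runs without approval: creates the draft release
    needs: [provenance]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v3
      - run: gh release create "$GITHUB_REF_NAME" dist/* --draft
        env:
          GH_TOKEN: ${{ github.token }}

  publish-pypi:             # only this job is gated on the environment
    needs: [release]
    runs-on: ubuntu-latest
    environment: publish    # requires manual approval before running
    steps:
      - uses: actions/download-artifact@v3
      - run: twine upload --skip-existing dist/*
```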
This also supports building new wheels for an existing release when a new version of Python comes out. When that happens, update cibuildwheel to support the new Python version, then run the workflow manually. You'll enter the existing tag to check out and build, and the Python version to build for, like "cp311". This will generate the new wheels (but not an sdist), generate a new provenance file (with a name like "cp311.intoto.jsonl"), and upload the new files to the existing release. I'm not 100% sure that uploading to an existing release works consistently; sometimes it created a new draft instead. If that happens, you can manually download the artifacts and upload them to the existing release.
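The manual trigger described above might be sketched like this (input names and descriptions are illustrative assumptions):

```yaml
on:
  workflow_dispatch:
    inputs:
      tag:
        description: "Existing tag to check out and build"
        required: true
      python:
        description: "Python version to build wheels for, e.g. cp311"
        required: true

jobs:
  wheels:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          ref: ${{ inputs.tag }}          # build from the existing tag
      - run: python -m cibuildwheel
        env:
          CIBW_BUILD: "${{ inputs.python }}-*"   # only the new Python version
```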
I have been using Twine/PyPI's unadvertised feature of GPG signing each uploaded file. That won't happen automatically in the workflow since I'd need to put my GPG private key as a GitHub secret, which is not a good idea.
But luckily, I can download the artifacts and use twine upload -s --skip-existing locally to only generate and upload the signatures after the fact. Eventually PyPI will support OIDC from GitHub (and presumably provenance), so GPG signatures shouldn't be required at that point.