Python Packages in Artifact Registry

Today, Google announced a public preview of support for Python packages (using the PyPI repository format) in Artifact Registry. With this feature, you can create a private repository in your project where you can publish your packages instead of pushing them to the public repository at pypi.org.

Uploading Python Packages with Cloud Build

Google recommends authenticating to Artifact Registry PyPI repositories using the Python keyring library. Here is an example of how you can do this in your cloudbuild.yaml:
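
The following is a minimal sketch of what such a cloudbuild.yaml could look like, assuming a setup.py-based package (the substitution names _REPOSITORY and _LOCATION are my placeholders, described below):

    steps:
      - name: python:3.9
        entrypoint: bash
        args:
          - '-c'
          - |
            # install twine, keyring, wheel, and the Artifact Registry keyring backend
            pip install twine keyring wheel keyrings.google-artifactregistry-auth
            # create the dist directory
            mkdir -p dist
            # populate the .pypirc with the private repository URL
            cat > /workspace/.pypirc <<PYPIRC
            [distutils]
            index-servers = ${_REPOSITORY}

            [${_REPOSITORY}]
            repository = https://${_LOCATION}-python.pkg.dev/${PROJECT_ID}/${_REPOSITORY}/
            PYPIRC
            # build the wheels and sdists
            python setup.py sdist bdist_wheel
            # upload to the registry
            python -m twine upload --config-file /workspace/.pypirc -r ${_REPOSITORY} dist/*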

This Cloud Build config handles the following steps:

  • install twine, keyring, wheel, and the Artifact Registry keyring backend (keyrings.google-artifactregistry-auth)
  • create the dist directory
  • populate the .pypirc
  • build the wheels and sdists
  • upload to the registry

and it assumes the project of the PyPI repository is the same project where Cloud Build is running ($PROJECT_ID above). It depends on the following substitutions:

  • _REPOSITORY: the name of the PyPI repository (my-pypi-repo, for example)
  • _LOCATION: the region of the repository

Your Cloud Build GitHub Trigger could look something like this:
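
For example, created with gcloud (the repository, owner, and substitution values are placeholders):

    gcloud beta builds triggers create github \
      --repo-owner=my-github-user \
      --repo-name=my-package-repo \
      --branch-pattern='^main$' \
      --included-files='setup.py,**/*.py' \
      --build-config=cloudbuild.yaml \
      --substitutions=_REPOSITORY=my-pypi-repo,_LOCATION=us-east1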

This way, when you push a new commit to the main branch in GitHub with changes to the setup.py file (since you need to update the version any time you publish to the repository), the Cloud Build job will run and update the package in the Artifact Registry repository.

Using Python Packages with Cloud Functions

What if you want to use these private Artifact Registry packages in your Cloud Functions? Google provides instructions for how to install a package, so you could use those to install the package in your local environment or to include it in one of your container images, which you could then use with Cloud Run. But what about Cloud Functions?

The first undocumented trick you need to know is that there is support for authenticating with an access token by adding the following line to the beginning of your requirements.txt.
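
(The format below follows Artifact Registry's basic-auth convention: oauth2accesstoken as the literal username, paired with an OAuth2 access token as the password.)

    --extra-index-url https://oauth2accesstoken:ACCESS_TOKEN@LOCATION-python.pkg.dev/PROJECT/REPOSITORY/simple/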

where ACCESS_TOKEN is replaced by the output of:
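
    gcloud auth print-access-token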

and PROJECT, REPOSITORY, and LOCATION describe the location of your Artifact Registry PyPI repository.

In order to get this working in a Cloud Function, I did a few other things.

  1. Renamed my requirements.txt to requirements-template.txt and added the templated line above with {{ACCESS_TOKEN}} in place of the token.
  2. Added requirements.txt to my .gitignore so that the file with the access token never gets committed to GitHub.
  3. Added !requirements.txt to my .gcloudignore so that the file does get pushed when I run gcloud functions deploy (both ignore files are sketched after this list).
  4. Added a shell script that prepares the requirements.txt file (see below).
  5. Added a step to the cloudbuild.yaml that deploys my Cloud Function.
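
For steps 2 and 3, the two ignore files could look something like this (a sketch; #!include: is .gcloudignore syntax that pulls in the .gitignore rules before re-including requirements.txt):

    # .gitignore: never commit the rendered file containing the access token
    requirements.txt

    # .gcloudignore: start from the .gitignore rules, then re-include
    # requirements.txt so it is uploaded with the function source
    #!include:.gitignore
    !requirements.txt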

Here is an example of the shell script that turns the requirements-template.txt into the requirements.txt with the access token populated.
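
A minimal version, assuming the script is named prepare_requirements.sh (the file names here are my placeholders):

    #!/bin/bash
    set -euo pipefail

    # fetch a short-lived OAuth2 access token for the active account
    TOKEN="$(gcloud auth print-access-token)"

    # substitute the token into the template; use | as the sed delimiter
    # since the index URL contains slashes
    sed "s|{{ACCESS_TOKEN}}|${TOKEN}|" requirements-template.txt > requirements.txt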

Then, in my cloudbuild.yaml for deploying the Cloud Function, I added a step at the beginning to prepare the requirements.txt.
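
Sketched with the cloud-sdk builder (the function name, runtime, and trigger flags are assumptions):

    steps:
      # render requirements.txt from the template with a fresh access token
      - name: gcr.io/google.com/cloudsdktool/cloud-sdk
        entrypoint: bash
        args: ['./prepare_requirements.sh']
      # deploy the function; the rendered requirements.txt ships with the source
      - name: gcr.io/google.com/cloudsdktool/cloud-sdk
        args:
          - gcloud
          - functions
          - deploy
          - my-function
          - --runtime=python39
          - --trigger-http
          - --region=us-east1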

Finally, I created a Cloud Build GitHub Trigger to deploy the Cloud Function:
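
Again as a gcloud sketch with placeholder names, filtering on the function's source directory:

    gcloud beta builds triggers create github \
      --repo-owner=my-github-user \
      --repo-name=my-functions-repo \
      --branch-pattern='^main$' \
      --included-files='my-function/**' \
      --build-config=my-function/cloudbuild.yaml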

Now, when I push a new commit to the main branch in GitHub, if it includes changes to this Cloud Function, the Cloud Build job will run, populate the requirements.txt, and deploy the function.

Conclusion

Support for PyPI packages in Artifact Registry is currently in the Preview launch stage, so it is very possible that this process will change, almost certainly to become easier than the current keyring approach. In the meantime, you should be able to start using private PyPI repositories with your Cloud Functions today!

Check back here for updates as things change!
