Python Packages in Artifact Registry (Updated)

Lukas Karlsson
4 min read · Jun 8, 2021

Update: Google released an update to Cloud Functions on February 16th, 2022, that makes it very easy to use Artifact Registry, so the content below regarding Cloud Functions is no longer accurate. Check out the Cloud Functions documentation for using private dependencies from Artifact Registry.

Today, Google announced a public preview of support for Python packages (using the PyPI repository format) in Artifact Registry. With this feature, you can create a private repository in your project where you can publish your packages, instead of pushing them to the public repository at PyPI.org.
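
If you have not created one yet, a repository with the Python format can be created with a single gcloud command. A minimal sketch, assuming a repository named mypypi in us-east4:

gcloud artifacts repositories create mypypi \
    --repository-format=python \
    --location=us-east4 \
    --description="Private Python packages"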

Uploading Python Packages with Cloud Build

Google recommends authenticating to Artifact Registry PyPI repositories using the Python keyring library. Here is an example of how you can do this in your cloudbuild.yaml:

---
steps:
- name: python:3.9
  entrypoint: /bin/bash
  dir: package
  args:
  - -c
  - |
    pip install --upgrade pip setuptools twine wheel
    pip install keyring
    pip install keyrings.google-artifactregistry-auth
    mkdir -p ~/.config/pip
    printf "[distutils]\nindex-servers =\n    ${_REPO}\n\n[${_REPO}]\nrepository: https://${_REGION}-python.pkg.dev/${PROJECT_ID}/${_REPO}/\n" > ~/.pypirc
    printf "[global]\nindex-url = https://${_REGION}-python.pkg.dev/${PROJECT_ID}/${_REPO}/simple/\n" > ~/.config/pip/pip.conf
    python setup.py sdist bdist_wheel
    twine upload --repository-url https://${_REGION}-python.pkg.dev/${PROJECT_ID}/${_REPO}/ dist/*

This Cloud Build config handles the following steps:

  • install pip, setuptools, twine, wheel, keyring, and the Artifact Registry keyring backend
  • create the ~/.config/pip directory
  • populate ~/.pypirc and pip.conf with the repository URLs
  • build the wheel and sdist
  • upload them to the registry

and it assumes the PROJECT_ID of the PyPI repository is the same project where Cloud Build is running. It depends on the following substitutions:

  • _REPO: the name of the PyPI repository (mypypi for example)
  • _REGION: the region of the repository (us-east4 for example)
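
The build step above also assumes a conventional setup.py in the package/ directory. A minimal sketch, where the package name and metadata are hypothetical:

from setuptools import find_packages, setup

setup(
    name="my-package",       # hypothetical package name
    version="0.1.0",         # bump this for every upload
    packages=find_packages(),
    install_requires=[],
)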

Your Cloud Build GitHub Trigger could look something like this:

---
description: Upload Package to Artifact Registry PyPI Repository
filename: package/cloudbuild.yaml
github:
  name: my-package
  owner: my-org
  push:
    branch: ^main$
includedFiles:
- package/setup.py
name: upload-package
substitutions:
  _REGION: us-east4
  _REPO: mypypi

This way, when you push a new commit to the main branch in GitHub that changes the package/setup.py file (you need to bump the version in setup.py every time you publish), the Cloud Build job will run and update the package in the Artifact Registry repository.
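
Once the trigger has run, you can confirm the upload by listing the packages in the repository; for example, with the example repository from above:

gcloud artifacts packages list \
    --repository=mypypi \
    --location=us-east4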

Using Python Packages with Cloud Functions

What if you want to use these private Artifact Registry packages in your Cloud Functions? Google provides instructions for how to install a package, so you could use that to install the package in your local environment or to include it in one of your container images, which you could then use with Cloud Run. But what about Cloud Functions?
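
For reference, those install instructions boil down to installing the keyring backend and pointing pip at the repository's simple index. A sketch, using the example repository from above (your-project and my-package are placeholders):

pip install keyring keyrings.google-artifactregistry-auth
pip install --index-url https://us-east4-python.pkg.dev/your-project/mypypi/simple/ my-package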

The first undocumented trick you need to know is that there is support for authenticating with an oauth2accesstoken user, by adding the following line to the beginning of your requirements.txt:

--extra-index-url https://oauth2accesstoken:{{ACCESS_TOKEN}}@{{REGION}}-python.pkg.dev/{{PROJECT}}/{{REPO}}/simple/

where {{ACCESS_TOKEN}} is replaced by the output of:

gcloud auth print-access-token

and {{REGION}}, {{PROJECT}}, and {{REPO}} describe the location of your Artifact Registry PyPI repository.
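
Put together, a requirements.txt.tmpl might look like this, where my-private-package is a hypothetical dependency from the private repository (everything else still resolves from the public index):

--extra-index-url https://oauth2accesstoken:{{ACCESS_TOKEN}}@{{REGION}}-python.pkg.dev/{{PROJECT}}/{{REPO}}/simple/
my-private-package==0.1.0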

In order to get this working in a cloudbuild.yaml, I did a few other things.

  1. Renamed my requirements.txt to requirements.txt.tmpl and added the templated line above, with the {{...}} placeholders intact.
  2. Added requirements.txt to my .gitignore so that the file with the access token never gets committed to GitHub (see the ignore-file sketch after this list).
  3. Added !requirements.txt to my .gcloudignore so that the requirements.txt file does get pushed when I run gcloud functions deploy.
  4. Added a shell script called prepare-requirements.sh to prepare the requirements.txt file (see below).
  5. Added a step to the cloudbuild.yaml that deploys my Cloud Function.
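
The relevant ignore-file entries from steps 2 and 3 look something like this (note that a generated .gcloudignore pulls in your .gitignore by default, which is why the negation pattern is needed):

# .gitignore: never commit the file containing the access token
requirements.txt

# .gcloudignore: re-include it so gcloud uploads it during deploy
!requirements.txt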

Here is an example of the prepare-requirements.sh shell script that turns the requirements.txt.tmpl into the requirements.txt with the --extra-index-url populated.

#!/bin/bash
ACCESS_TOKEN="$(gcloud auth print-access-token)"
REGION="${REGION:="us-east4"}"
REPO="${REPO:="mypypi"}"
PROJECT="${PROJECT:="broad-gaia-karlsson"}"
sed -e "s/{{ACCESS_TOKEN}}/${ACCESS_TOKEN}/" \
    -e "s/{{PROJECT}}/${PROJECT}/" \
    -e "s/{{REGION}}/${REGION}/" \
    -e "s/{{REPO}}/${REPO}/" \
    requirements.txt.tmpl > requirements.txt
echo "Created requirements.txt from requirements.txt.tmpl"

Then, in my cloudbuild.yaml for deploying the Cloud Function, I added a step at the beginning to prepare the requirements.txt.

---
steps:
- name: gcr.io/google.com/cloudsdktool/cloud-sdk:alpine
  dir: functions/my-fancy-function
  entrypoint: bash
  args:
  - ../../scripts/prepare-requirements.sh
  env:
  - PROJECT=${PROJECT_ID}
  - REGION=${_REGION}
  - REPO=${_REPO}
- name: gcr.io/google.com/cloudsdktool/cloud-sdk:alpine
  dir: functions/my-fancy-function
  entrypoint: gcloud
  args:
  - functions
  - deploy
  - my-fancy-function
  - --memory=1024MB
  - --project=${PROJECT_ID}
  - --region=us-east4
  - --runtime=python38
  - --timeout=300
  - --trigger-http
  - -q

Finally, I created a Cloud Build GitHub Trigger to deploy the Cloud Function:

---
description: Deploy my-fancy-function
filename: functions/my-fancy-function/cloudbuild.yaml
github:
  name: my-package
  owner: my-org
  push:
    branch: ^main$
includedFiles:
- functions/my-fancy-function/**
name: deploy-my-fancy-function
substitutions:
  _REGION: us-east4
  _REPO: mypypi

Now, when I push a new commit to the main branch in GitHub that includes changes to this Cloud Function, the Cloud Build job will run, populate the requirements.txt, and deploy the function.

Conclusion

Support for PyPI packages in Artifact Registry is currently in the Preview launch stage, so it is very possible that this process will change, almost certainly to become easier than the current keyring approach. In the meantime, you should be able to start using private PyPI repositories with your Cloud Functions today!

Check back here for updates as things change!


Lukas Karlsson

Google Developer Expert, Cloud Platform; Google Certified Cloud Architect. Somerville, MA.