While working on publishing a new Python package, I wanted to publish the documentation built from the source code to GitHub Pages. This process has become much simpler and more streamlined than the last time I set up a workflow to do so (and that was building OpenAPI docs).
Disclaimer: as of the time of this post, the method I'm using is still labeled as Beta, so ymmv.
This is a Python project and I'm using Sphinx for my docs. Below is the complete GitHub Actions YAML file that builds and publishes documentation for the project. I'm going to break down each section and describe the entire workflow.
```yaml
name: Publish Documentation

on:
  push:
    branches:
      - 'main'
    paths:
      - 'docs/**'
      - 'src/**'

permissions:
  contents: read
  pages: write
  id-token: write

concurrency:
  group: "pages"
  cancel-in-progress: false

jobs:
  build:
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.9'
      - name: Install Dependencies
        run: python3 -m pip install --editable '.[dev]'
      - name: Build Sphinx Docs
        run: sphinx-build -b html docs/ build/docs/
      - name: Upload Artifact
        uses: actions/upload-pages-artifact@v2
        with:
          path: 'build/docs/'
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v2
```
## Repository Setup

First, GitHub Pages needs to be set up for the repository. This is done in Settings -> Pages (in sidebar) -> Build and deployment. I am using GitHub Actions (Beta) rather than deploying from a branch, which is the old method.

There also needs to be an Environment specifically for the GitHub Pages build. This was created by default for me in a new repo. Environments let you define additional rules and secrets around associated builds. The key thing here is that only the `main` branch is allowed for deployments of the docs.

Check Settings -> Environments (in sidebar) to see if you have a `github-pages` environment. If not, create one and add the `main` branch as the only deployment branch. For the workflow I'm describing here, no other environment settings are required.
## Trigger (On)

```yaml
on:
  push:
    branches:
      - 'main'
    paths:
      - 'docs/**'
      - 'src/**'
```
In this project I made the decision that docs are always built on any push to `main` where changes have been made in the `docs/` and `src/` directories. Any other changes (like to tests, the README, etc.) that don't impact the docs skip this workflow.
The filtering options for branches and paths are powerful tools for preventing workflows from running when they shouldn't.
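As a sketch of the inverse approach, GitHub Actions also supports a `paths-ignore` filter (note that `paths` and `paths-ignore` can't be combined for the same event). The globs below are hypothetical and would run the workflow on every push to `main` except changes that only touch the ignored paths:

```yaml
on:
  push:
    branches:
      - 'main'
    paths-ignore:
      - 'tests/**'
      - 'README.md'
```

Which filter to use depends on whether it's easier to enumerate what should trigger a build or what shouldn't.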
## Permissions

```yaml
permissions:
  contents: read
  pages: write
  id-token: write
```
The workflow needs to be granted permissions for certain actions it performs. The possible permissions are documented here. There are three permissions I've granted to this workflow:

- `contents` - Grants read access to the repository's contents. This was only required while the repository was set to private. Once switched to public, this permission is no longer needed.
- `pages` - Allows the workflow to request a GitHub Pages build.
- `id-token` - The workflow will fetch an OIDC token. This permission is used in a lot of other GitHub Actions workflows (like PyPI publishing, or AWS IAM role assumption). Here it is used to verify the deployment originates from an appropriate source.

I've defined permissions at a global level for the workflow, but you can also define them at the job level for more fine-grained access control.
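For example, a minimal sketch of the same grants scoped to a single job instead of the whole workflow (assuming the same `build` job):

```yaml
jobs:
  build:
    permissions:
      contents: read
      pages: write
      id-token: write
```

Job-level permissions override the workflow-level defaults for that job, which is useful when only one job in a larger workflow needs elevated access.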
## Concurrency

```yaml
concurrency:
  group: "pages"
  cancel-in-progress: false
```
This is a very handy control for any workflow where you want to ensure only one instance runs at any given moment. I don't want multiple commits to `main` to trigger multiple parallel docs builds. By setting a group label for the workflow and `cancel-in-progress` to `false`, I now have a queue where each commit will build and publish in order.

Alternatively, if I set `cancel-in-progress` to `true`, each subsequent commit to `main` will cancel any builds in progress in favor of the new one.
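As a general pattern beyond Pages, the concurrency group can be keyed on the ref so that each branch gets its own queue; a sketch (the `docs-` group name is my own choice, not from the workflow above):

```yaml
concurrency:
  group: "docs-${{ github.ref }}"
  cancel-in-progress: true
```

With this, pushes to one branch cancel only that branch's in-flight runs and never interfere with builds for other branches.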
## The Job

The workflow's job is composed of a number of steps. The job itself has two keys set for configuration.

```yaml
jobs:
  build:
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
```
- `environment` - Defines which environment is used for the job.
  - `name` - The name of the environment. I've set `github-pages` to use the environment from earlier in setup.
  - `url` - The URL to the output of the job. As this is publishing to GitHub Pages, the URL will be the link to the final docs (a static github.io URL). This syntax references a value from the outputs of a step with the ID `deployment`.
- `runs-on` - The type of host the job will execute on. This is usually `ubuntu-latest` in most examples, and you should leave it as-is unless you have specific build needs.
## Job Setup

```yaml
steps:
  - name: Checkout
    uses: actions/checkout@v3
  - name: Setup Python
    uses: actions/setup-python@v4
    with:
      python-version: '3.9'
```
The first two steps set up the job's environment. The checkout action will check out the repository at the triggering ref. The setup-python action will set up the desired Python runtime. My package supports Python 3.9+, so I'm targeting the minimum version for my build environments.
## Docs Build

```yaml
- name: Install Dependencies
  run: python3 -m pip install --editable '.[dev]'
- name: Build Sphinx Docs
  run: sphinx-build -b html docs/ build/docs/
```
The next two steps don't have an action associated. They only run two commands: a `pip install` of all the dev requirements defined for the package, and the `sphinx-build` to generate the HTML docs.
Splitting out certain commands into their own steps can help with troubleshooting. If this workflow fails I will know if it failed during dependency installs or if it failed during the build.
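One optional hardening I'd consider here (not part of the workflow above) is Sphinx's `-W` flag, which turns warnings into errors so that a broken cross-reference or malformed directive fails the build instead of silently publishing; `--keep-going` makes Sphinx report all warnings before exiting:

```yaml
- name: Build Sphinx Docs
  run: sphinx-build -W --keep-going -b html docs/ build/docs/
```

The tradeoff is that pre-existing warnings have to be cleaned up first, or every docs build will fail.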
## Publishing

```yaml
- name: Upload Artifact
  uses: actions/upload-pages-artifact@v2
  with:
    path: 'build/docs/'
- name: Deploy to GitHub Pages
  id: deployment
  uses: actions/deploy-pages@v2
```
The last two steps complete the process by uploading the artifact for the docs (the `build/docs/` directory that `sphinx-build` outputted to) and then using the deploy-pages action.

The `id` field has the value `deployment`, which was used earlier in the `environment` section:

```yaml
url: ${{ steps.deployment.outputs.page_url }}
```
The action for uploading the artifact doesn't do too much, but it takes care of all the nuance around GitHub Pages artifacts specifically. You can view the action's source here. It will `tar` the path (provided by the `with` option) and then call the upload-artifact action. The artifact is named `github-pages` and has a 1-day expiration. This artifact has the name and format required by the deploy action. It all just works (so far).
## Success

The result is a fresh build of the documentation on every commit to `main`, available at the project's GitHub Pages URL.
My workflow is for a Python package's docs, but there are only two parts that are specific to that: the runtime setup, and the step that installs dependencies and builds the docs. You could swap those out for any other runtime and/or documentation framework and the rest should be pretty much the same.
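For instance, a hypothetical MkDocs-based project could swap just the build steps for something like the sketch below (the package name and output path are assumptions for illustration) and keep the checkout, upload, and deploy steps unchanged:

```yaml
- name: Install Dependencies
  run: python3 -m pip install mkdocs
- name: Build MkDocs Site
  run: mkdocs build --site-dir build/docs
```

As long as the final HTML lands in the directory passed to the upload step, the Pages deployment works the same way.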