Add GitHub Actions documentation-deployment pipeline (backport #10610) (#10638)

* Add GitHub Actions documentation-deployment pipeline (#10610)

* Add GitHub Actions documentation-deployment pipeline

This brings a documentation-deployment pipeline into the Qiskit/Terra
repository, allowing it to fully deploy its documentation to
`qiskit.org`, a task previously only the metapackage could perform.
This does not fully unify the documentation with files from the
metapackage; it just adds a pipeline to do the final deployment.

This includes a revitalised translatable-strings pipeline, which had been
broken on the metapackage for the last month or two and had accumulated a
fair amount of legacy weight that was no longer relevant.

* Add missing secret insertions

* Improve logic for deployments

This changes the logic for the deployments so that pushes to 'stable/*'
no longer trigger any deployment to qiskit.org.  Instead, tag events
trigger a deployment to the relevant stable documentation path, and a tag
event for the _latest_ tag also triggers a deployment to the documentation
root.

The translatables logic is modified to push translatable strings only for
the latest full-release tag.
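
As a rough illustration of that routing (a sketch only: the variables stand in for values the workflow computes, and the shipped 'choose' step below currently hard-codes the stable path, as explained further down):

    # Hypothetical sketch of the push-event routing, assuming ref_type,
    # ref_name and latest_tag have already been determined as in the workflow.
    prefixes=()
    if [[ "$ref_type" == "branch" && "$ref_name" == "main" ]]; then
      prefixes+=( "dev" )                    # development documentation
    elif [[ "$ref_type" == "tag" ]]; then
      prefixes+=( "stable/${ref_name%.*}" )  # e.g. tag 0.45.1 -> stable/0.45
      if [[ "$ref_name" == "$latest_tag" ]]; then
        prefixes+=( "" )                     # the latest release also refreshes the root
      fi
    fi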

(cherry picked from commit 80e95d1)

# Conflicts:
#	docs/conf.py

* Fix bad merge conflict

* Hardcode publish path from tags

For any tag we publish from the `stable/0.25` branch, the publish path
should be both `stable/0.44` (which matches the qiskit package version)
and the root, since it is the current release. Once we stop publishing
from the stable/0.25 branch after the release of 0.45.0rc1, the existing
logic will work and there will be no further 0.25.x releases. For the
0.25.x release series only, though, the version numbers are not unified,
so we need to manually ensure we're publishing to the right paths.
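
In other words, the generic rule would derive the stable prefix from the tag being published, but while Terra's own version (0.25.x) differs from the metapackage version (0.44.x) that prefix has to be pinned by hand. A minimal sketch of the difference (the `tag` variable is hypothetical):

    # Generic rule, once the version numbers are unified (e.g. tag 0.45.1):
    stable_prefix="stable/${tag%.*}"   # -> stable/0.45

    # Temporary hard-coding for tags cut from stable/0.25, whose published
    # documentation lives under the metapackage's version number instead:
    stable_prefix="stable/0.44"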

---------

Co-authored-by: Jake Lishman <jake.lishman@ibm.com>
Co-authored-by: Luciano Bello <bel@zurich.ibm.com>
Co-authored-by: Eric Arellano <14852634+Eric-Arellano@users.noreply.github.com>
Co-authored-by: Matthew Treinish <mtreinish@kortar.org>
5 people committed Aug 16, 2023
1 parent 48a7b82 commit 67061ae
Showing 11 changed files with 323 additions and 32 deletions.
12 changes: 5 additions & 7 deletions .azure/docs-linux.yml
@@ -19,16 +19,14 @@ jobs:
versionSpec: '${{ parameters.pythonVersion }}'
displayName: 'Use Python ${{ parameters.pythonVersion }}'

- bash: |
set -e
python -m pip install --upgrade pip setuptools wheel
python -m pip install -U "tox<4.4.0"
sudo apt-get update
sudo apt-get install -y graphviz pandoc
- bash: tools/install_ubuntu_docs_dependencies.sh
displayName: 'Install dependencies'

- bash: |
tox -edocs
set -e
tox -e docs
# Clean up Sphinx detritus.
rm -rf docs/_build/html/{.doctrees,.buildinfo}
displayName: 'Run Docs build'
- task: ArchiveFiles@2
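
The inline commands deleted above are presumably what the new `tools/install_ubuntu_docs_dependencies.sh` helper wraps; a plausible sketch of such a script, reconstructed only from the lines removed in this diff (the real script may differ):

    #!/bin/bash
    # Hypothetical contents of tools/install_ubuntu_docs_dependencies.sh.
    set -e
    python -m pip install --upgrade pip setuptools wheel
    python -m pip install -U "tox<4.4.0"
    sudo apt-get update
    sudo apt-get install -y graphviz pandoc
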
8 changes: 2 additions & 6 deletions .azure/tutorials-linux.yml
@@ -16,14 +16,10 @@ jobs:
versionSpec: '${{ parameters.pythonVersion }}'
displayName: 'Use Python ${{ parameters.pythonVersion }}'

- bash: |
set -e
python -m pip install --upgrade pip setuptools wheel
python -m pip install -U "tox<4.4.0"
sudo apt-get update
sudo apt-get install -y graphviz pandoc
- bash: tools/install_ubuntu_docs_dependencies.sh
displayName: 'Install dependencies'

# Sync with '.github/workflows/docs_deploy.yml'
- bash: tools/prepare_tutorials.bash algorithms circuits circuits_advanced operators
displayName: 'Download current tutorials'

258 changes: 258 additions & 0 deletions .github/workflows/docs_deploy.yml
@@ -0,0 +1,258 @@
name: Documentation
on:
push:
branches:
- main
tags:
# Only match non-prerelease tags.
- '[0-9]+.[0-9]+.[0-9]+'
workflow_dispatch:
inputs:
deploy_prefix:
description: "Deployment prefix (leave blank for the root): https://qiskit.org/documentation/<prefix>."
required: false
type: string
do_deployment:
description: "Push to qiskit.org?"
required: false
type: boolean
do_translatables:
description: "Push translatable strings?"
required: false
type: boolean

jobs:
build:
if: github.repository_owner == 'Qiskit'
name: Build
runs-on: ubuntu-latest

outputs:
latest_tag: ${{ steps.latest_tag.outputs.latest_tag }}

steps:
- uses: actions/checkout@v3
with:
# We need to fetch the whole history so 'reno' can do its job and we can inspect tags.
fetch-depth: 0

- name: Determine latest full release tag
id: latest_tag
run: |
set -e
latest_tag=$(git tag --list --sort=-version:refname | sed -n '/^[0-9]\+\.[0-9]\+\.[0-9]\+$/p' | head -n 1)
echo "Latest release tag: '$latest_tag'"
echo "latest_tag=$latest_tag" >> "$GITHUB_OUTPUT"
- uses: actions/setup-python@v4
name: Install Python
with:
# Sync with 'documentationPythonVersion' in 'azure-pipelines.yml'.
python-version: '3.9'

- name: Install dependencies
run: tools/install_ubuntu_docs_dependencies.sh

# Sync with '.azure/tutorials-linux.yml'.
- name: Download current tutorials
run: tools/prepare_tutorials.bash algorithms circuits circuits_advanced operators
shell: bash

# This is just to have tox create the environment, so we can use it to execute the tutorials.
# We want to re-use it later for the build, hence 'tox run --notest' instead of 'tox devenv'.
- name: Prepare Python environment
run: tox run -e docs --notest

# The reason to use the custom script rather than letting 'nbsphinx' do its thing normally
# within the Sphinx build is so that the execution process is the same as in the test CI.
- name: Execute tutorials in place
run: .tox/docs/bin/python tools/execute_tutorials.py docs/tutorials
env:
QISKIT_CELL_TIMEOUT: "300"

- name: Build documentation
# We can skip re-installing the package, since we just did it a couple of steps ago.
run: tox run -e docs --skip-pkg-install
env:
QISKIT_ENABLE_ANALYTICS: "true"
# We've already built them.
QISKIT_DOCS_BUILD_TUTORIALS: "never"

- name: Build translatable strings
run: tox -e gettext
env:
# We've already built them.
QISKIT_DOCS_BUILD_TUTORIALS: "never"

- name: Store built documentation artifact
uses: actions/upload-artifact@v3
with:
name: qiskit-docs
path: |
./docs/_build/html/*
!**/.doctrees
!**/.buildinfo
if-no-files-found: error

- name: Store translatable strings artifact
uses: actions/upload-artifact@v3
with:
name: qiskit-translatables
path: ./docs/locale/en/*
if-no-files-found: error

deploy:
if: github.event_name != 'workflow_dispatch' || inputs.do_deployment
name: Deploy to qiskit.org
needs: [build]
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
with:
path: qiskit

- uses: actions/download-artifact@v3
with:
name: qiskit-docs
path: deploy

- id: choose
name: Choose deployment location(s)
run: |
set -e
declare -a prefixes
case ${{ github.event_name }} in
push)
case ${{ github.ref_type }} in
branch)
if [[ "$GITHUB_REF_NAME" != "main" ]]; then
echo "Push to unhandled branch '$GITHUB_REF_NAME'" >&2
exit 1
fi
prefixes+=( "dev" )
;;
tag)
prefixes+=( "stable/0.44" )
prefixes+=( "" )
;;
*)
echo "Unhandled reference type '${{ github.ref_type }}'" >&2
exit 1
;;
esac
;;
workflow_dispatch)
prefixes+=( "$WORKFLOW_DISPATCH_PREFIX" )
;;
*)
echo "Unhandled GitHub event ${{ github.event_name }}" >&2
exit 1
;;
esac
# Join the array of prefixes into a colon-delimited list for
# serialisation. This includes a trailing colon, so we can detect
# the presence of the empty string, even if it's the only prefix.
if [[ "${#prefixes[@]}" -gt 0 ]]; then
joined_prefixes=$(printf "%s:" "${prefixes[@]}")
echo "Chosen deployment prefixes: '$joined_prefixes'"
echo "joined_prefixes=$joined_prefixes" >> "$GITHUB_OUTPUT"
else
echo "Nothing to deploy to."
fi
env:
LATEST_TAG: ${{ needs.build.outputs.latest_tag }}
GITHUB_REF_NAME: ${{ github.ref_name }}
WORKFLOW_DISPATCH_PREFIX: ${{ inputs.deploy_prefix }}

- name: Install rclone
run: |
set -e
curl https://downloads.rclone.org/rclone-current-linux-amd64.deb -o rclone.deb
sudo apt-get install -y ./rclone.deb
- name: Deploy to qiskit.org
if: ${{ steps.choose.outputs.joined_prefixes != '' }}
run: |
set -e
RCLONE_CONFIG=$(rclone config file | tail -1)
openssl aes-256-cbc -K "$RCLONE_KEY" -iv "$RCLONE_IV" -in qiskit/tools/rclone.conf.enc -out "$RCLONE_CONFIG" -d
IFS=: read -ra prefixes <<< "$JOINED_PREFIXES"
for prefix in "${prefixes[@]}"; do
# The 'documentation' bit of the prefix is hard-coded in this step
# rather than being chosen during the prefix-choosing portion
# because we don't want to allow the 'workflow_dispatch' event
# trigger to accidentally allow a deployment to a dodgy prefix that
# wipes out _everything_ on qiskit.org.
location=documentation/$prefix
echo "Deploying to 'qiskit.org/$location'"
rclone sync --progress --exclude-from qiskit/tools/docs_exclude.txt deploy "IBMCOS:qiskit-org-web-resources/$location"
done
env:
JOINED_PREFIXES: ${{ steps.choose.outputs.joined_prefixes }}
RCLONE_KEY: ${{ secrets.ENCRYPTED_RCLONE_KEY }}
RCLONE_IV: ${{ secrets.ENCRYPTED_RCLONE_IV }}

deploy_translatables:
if: (github.event_name == 'workflow_dispatch' && inputs.do_translatables) || (github.event_name == 'push' && github.ref_type == 'tag' && github.ref_name == needs.build.outputs.latest_tag)
name: Push translatable strings
needs: [build]
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
with:
path: 'qiskit'

- uses: actions/download-artifact@v3
with:
name: qiskit-translatables
path: 'deploy'

- name: Decrypt SSH secret key
id: ssh_key
run: |
set -e
ssh_key=$(openssl enc -aes-256-cbc -d -in qiskit/tools/github_poBranch_update_key.enc -K $SSH_UPDATE_KEY -iv $SSH_UPDATE_IV)
echo "::add-mask::${ssh_key}"
echo "ssh_key=${ssh_key}" >> "$GITHUB_OUTPUT"
env:
SSH_UPDATE_KEY: ${{ secrets.ENCRYPTED_DEPLOY_PO_BRANCH_KEY }}
SSH_UPDATE_IV: ${{ secrets.ENCRYPTED_DEPLOY_PO_BRANCH_IV }}

- uses: actions/checkout@v3
with:
repository: 'qiskit-community/qiskit-translations'
path: 'qiskit-translations'
ssh-key: '${{ steps.ssh_key.outputs.ssh_key }}'

- name: Remove ignored documents
run: rm -r LC_MESSAGES/{apidocs,stubs}
working-directory: 'deploy'

- name: Push changes to translations repository
run: |
set -e
shopt -s failglob
# Bring the new `.po` target files into the repository.
git rm -r --ignore-unmatch docs/locale/en
mv "${{ github.workspace }}/deploy" docs/locale/en
# Update the ways to recreate the build.
cp "${{ github.workspace }}/qiskit/"{setup.py,requirements-*.txt,constraints.txt} .
git add .
cat > COMMIT_MSG << EOF
Automated documentation update to add .po files from ${{ github.repository }}
skip ci
Commit: ${{ github.sha }}
GitHub Actions run: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}
EOF
git config user.name "Qiskit Autodeploy"
git config user.email "qiskit@qiskit.org"
git commit -F COMMIT_MSG
git push origin
working-directory: 'qiskit-translations'
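
Because the workflow also listens for `workflow_dispatch`, a maintainer can start a manual documentation deployment; for example, with the GitHub CLI (the prefix below is made up, while the input names match the ones declared at the top of the workflow):

    # Deploy a one-off build to https://qiskit.org/documentation/dev-preview
    # without pushing translatable strings.
    gh workflow run docs_deploy.yml --ref main \
      -f deploy_prefix="dev-preview" \
      -f do_deployment=true \
      -f do_translatables=false
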
13 changes: 5 additions & 8 deletions .gitignore
@@ -76,8 +76,6 @@ instance/
# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/
@@ -119,10 +117,6 @@ tutorial/rst/_build/*

test/python/test_qasm_python_simulator.pdf

doc/_build/*

doc/**/_autodoc

qiskit/bin/*

test/python/test_save.json
@@ -143,8 +137,11 @@ src/qasm-simulator-cpp/test/qubit_vector_tests
qiskit/transpiler/passes/**/cython/**/*.cpp
qiskit/quantum_info/states/cython/*.cpp

docs/stubs/*
executed_tutorials/
# Sphinx documentation
/docs/_build
/docs/stubs
/docs/locale
/executed_tutorials

# Notebook testing images
test/visual/mpl/circuit/circuit_results/*.png
1 change: 1 addition & 0 deletions azure-pipelines.yml
@@ -60,6 +60,7 @@ parameters:
type: string
default: "3.8"

# Sync with 'python-version' in '.github/workflows/docs_deploy.yml'.
- name: "documentationPythonVersion"
displayName: "Version of Python to use to build Sphinx documentation"
type: string
29 changes: 25 additions & 4 deletions docs/conf.py
@@ -26,7 +26,16 @@
# The short X.Y version
version = "0.25"
# The full version, including alpha/beta/rc tags
release = "0.25.0"
release = "0.45.0"

# The language for content autogenerated by Sphinx or the default for gettext content translation.
language = "en"

# For 'qiskit_sphinx_theme' tells it we're based at 'https://qiskit.org/<docs_url_prefix>'.
# Should not include the subdirectory for the stable version.
docs_url_prefix = "documentation"

rst_prolog = f".. |version| replace:: {version}"

# For 'qiskit_sphinx_theme' tells it we're based at 'https://qiskit.org/<docs_url_prefix>'.
# Should not include the subdirectory for the stable version.
@@ -57,8 +66,20 @@
# Available keys are 'figure', 'table', 'code-block' and 'section'. '%s' is the number.
numfig_format = {"table": "Table %s"}

# The language for content autogenerated by Sphinx or the default for gettext content translation.
language = "en"
# Translations configuration.
translations_list = [
("en", "English"),
("bn_BN", "Bengali"),
("fr_FR", "French"),
("de_DE", "German"),
("ja_JP", "Japanese"),
("ko_KR", "Korean"),
("pt_UN", "Portuguese"),
("es_UN", "Spanish"),
("ta_IN", "Tamil"),
]
locale_dirs = ["locale/"]
gettext_compact = False

# Relative to source directory, affects general discovery, and html_static_path and html_extra_path.
exclude_patterns = ["_build", "**.ipynb_checkpoints"]
@@ -113,7 +134,7 @@
html_favicon = "images/favicon.ico"
html_last_updated_fmt = "%Y/%m/%d"
html_context = {
"analytics_enabled": os.getenv("QISKIT_ENABLE_ANALYTICS", False)
"analytics_enabled": bool(os.getenv("QISKIT_ENABLE_ANALYTICS", ""))
} # enable segment analytics for qiskit.org/documentation
html_static_path = ["_static"]

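
The environment variables that now drive `conf.py` are the same ones the deployment workflow sets, so a local build can mirror the CI configuration; for example (assuming tox and the docs dependencies are installed):

    # Analytics enabled and tutorial execution skipped, as in the deploy job:
    QISKIT_ENABLE_ANALYTICS=true QISKIT_DOCS_BUILD_TUTORIALS=never tox -e docs

    # Leaving QISKIT_ENABLE_ANALYTICS unset (or empty) keeps analytics off,
    # since conf.py now converts the empty string to False.
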
3 changes: 3 additions & 0 deletions tools/docs_exclude.txt
@@ -0,0 +1,3 @@
/stable/**
/locale/**
/dev/**
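
These patterns are passed to `rclone sync --exclude-from` in the deploy step, so a sync aimed at the documentation root leaves the per-version, translated, and development trees on qiskit.org untouched; roughly (paths as used in the workflow above):

    # Excluded paths (stable/**, locale/**, dev/**) are neither uploaded nor
    # deleted on the destination when syncing to the documentation root.
    rclone sync --progress --exclude-from qiskit/tools/docs_exclude.txt deploy \
      "IBMCOS:qiskit-org-web-resources/documentation/"
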
Binary file added tools/github_poBranch_update_key.enc
