Drop in MetPy docs workflow #805

Merged (6 commits, Nov 4, 2024)
90 changes: 22 additions & 68 deletions .github/workflows/docs-conda.yml
@@ -3,12 +3,15 @@ name: Build Docs (Conda)
# We don't want pushes (or PRs) to gh-pages to kick anything off
on:
pull_request:
branches: [ main ]
branches:
- main
- '[0-9]+.[0-9]+.x'

concurrency:
group: ${{ github.workflow}}-${{ github.head_ref }}
cancel-in-progress: true

jobs:
#
# Build our docs on macOS and Windows on Python 3.6 and 3.8, respectively.
#
Docs:
name: ${{ matrix.os }} ${{ matrix.python-version }}
runs-on: ${{ matrix.os }}-latest
@@ -19,78 +22,29 @@ jobs:
fail-fast: false
matrix:
include:
- python-version: 3.8
os: macOS
- python-version: 3.7
- python-version: '3.10'
os: Windows
- python-version: 3.11
os: macOS
- python-version: 3.12
os: macOS

steps:
# We check out only a limited depth and then pull tags to save time
- name: Checkout source
uses: actions/checkout@v3
with:
fetch-depth: 100

- name: Get tags
run: git fetch --depth=1 origin +refs/tags/*:refs/tags/*

- name: Setup conda caching
uses: actions/cache@v3
uses: actions/checkout@v4
with:
path: ~/conda_pkgs_dir
key: conda-docs-${{ runner.os }}-${{ matrix.python-version}}-${{ hashFiles('ci/*') }}
restore-keys: |
conda-docs-${{ runner.os }}-${{ matrix.python-version}}
conda-docs-${{ runner.os }}
conda-docs-
fetch-depth: 150
fetch-tags: true

- name: Set up Python ${{ matrix.python-version }}
uses: conda-incubator/setup-miniconda@v2
- name: Install from Conda
uses: Unidata/MetPy/.github/actions/install-conda@main
with:
miniconda-version: "latest"
type: 'doc'
python-version: ${{ matrix.python-version }}
channel-priority: strict
channels: conda-forge
show-channel-urls: true
# Needed for caching
use-only-tar-bz2: true

- name: Install dependencies
run: conda install --quiet --yes --file ci/doc_requirements.txt --file ci/extra_requirements.txt --file ci/requirements.txt

# This imports CartoPy to find its map data cache directory
- name: Get CartoPy maps dir
id: cartopy-cache
run: echo "::set-output name=dir::$(python -c 'import cartopy;print(cartopy.config["data_dir"])')"

- name: Setup mapdata caching
uses: actions/cache@v3
env:
# Increase to reset cache of map data
CACHE_NUMBER: 0
with:
path: ${{ steps.cartopy-cache.outputs.dir }}
key: docs-cartopy-${{ env.CACHE_NUMBER }}
restore-keys: docs-cartopy-

- name: Install
# For some reason on Windows 3.7 building the wheel fails to properly include our extra
# stuff. Executing the egg_info beforehand for some reason fixes it. No idea why. We're
# deep in territory where googling for answers helps not at all.
run: |
python setup.py egg_info
python -m pip install --no-deps .
need-cartopy: true

- name: Build docs
run: |
pushd docs
make html O=-W
popd

- name: Upload docs as artifact
uses: actions/upload-artifact@v3
uses: Unidata/MetPy/.github/actions/build-docs@main
with:
name: ${{ matrix.os }}-${{ matrix.python-version }}-docs
path: |
docs/build/html
!docs/_static/*.pdf
key: ${{ matrix.os }}-${{ matrix.python-version }}
make-targets: ''
136 changes: 45 additions & 91 deletions .github/workflows/docs.yml
@@ -10,128 +10,82 @@ on:
- v[0-9]+.[0-9]+.[0-9]+
pull_request:

permissions:
contents: write

concurrency:
group: ${{ github.workflow}}-${{ github.head_ref }}
cancel-in-progress: true

jobs:
#
# Build our docs on Linux against multiple Pythons, including pre-releases
# Build our docs on Linux against multiple Pythons
#
Docs:
name: ${{ matrix.python-version }} ${{ matrix.dep-versions }}
runs-on: ubuntu-20.04
continue-on-error: ${{ matrix.experimental }}
env:
DOC_VERSION: dev
name: "Linux ${{ matrix.python-version }}"
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
python-version: ['3.10', 3.11]
check-links: [false]
include:
- python-version: 3.8
check-links: false
dep-versions: requirements.txt
experimental: false
- python-version: 3.9
- python-version: 3.12
check-links: true
dep-versions: requirements.txt
experimental: false
- python-version: 3.9
check-links: false
dep-versions: Prerelease
experimental: true
outputs:
doc-version: ${{ steps.build-docs.outputs.doc-version }}

steps:
# We check out only a limited depth and then pull tags to save time
- name: Checkout source
uses: actions/checkout@v3
uses: actions/checkout@v4
with:
fetch-depth: 100
fetch-depth: 150

- name: Get tags
run: git fetch --depth=1 origin +refs/tags/*:refs/tags/*

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
- name: Install using PyPI
uses: Unidata/MetPy/.github/actions/install-pypi@main
with:
type: 'doc'
python-version: ${{ matrix.python-version }}

# This uses pip to find the right cache dir and then sets up caching for it
- name: Get pip cache dir
id: pip-cache
run: echo "::set-output name=dir::$(pip cache dir)"

- name: Setup pip cache
uses: actions/cache@v3
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: pip-docs-${{ runner.os }}-${{ matrix.python-version }}-${{ hashFiles('ci/*') }}
restore-keys: |
pip-docs-${{ runner.os }}-${{ matrix.python-version }}-${{ hashFiles('ci/*') }}
pip-docs-${{ runner.os }}-${{ matrix.python-version }}-
pip-docs-${{ runner.os }}-
pip-docs-

# This installs the stuff needed to build and install Shapely and CartoPy from source.
# Need to install numpy first to make CartoPy happy.
- name: Install dependencies (PyPI)
if: ${{ runner.os == 'Linux' }}
run: |
sudo apt-get install libgeos-dev libproj-dev proj-bin
python -m pip install --upgrade pip setuptools
python -m pip install -c ci/${{ matrix.dep-versions }} numpy
python -m pip install -r ci/doc_requirements.txt -r ci/extra_requirements.txt -c ci/${{ matrix.dep-versions }}

# This imports CartoPy to find its map data cache directory
- name: Get CartoPy maps dir
id: cartopy-cache
run: echo "::set-output name=dir::$(python -c 'import cartopy;print(cartopy.config["data_dir"])')"

- name: Setup mapdata caching
uses: actions/cache@v3
env:
# Increase to reset cache of map data
CACHE_NUMBER: 0
with:
path: ${{ steps.cartopy-cache.outputs.dir }}
key: docs-cartopy-${{ env.CACHE_NUMBER }}
restore-keys: docs-cartopy-

- name: Install self
run: python -m pip install -c ci/${{ matrix.dep-versions }} .
need-extras: true
need-cartopy: true

- name: Build docs
run: |
pushd docs
make html O=-W
popd

- name: Enable linkchecker for PRs
# Doing the linkchecker separately so that we avoid problems with vendored LICENSE
# files in the build directory
if: ${{ github.event_name == 'pull_request' && matrix.check-links == true }}
run: |
pushd docs
find build/html/_static -name LICENSE.md -delete
make linkcheck
popd
id: build-docs
uses: Unidata/MetPy/.github/actions/build-docs@main
with:
run-linkchecker: ${{ github.event_name == 'pull_request' && matrix.check-links == true }}
key: ${{ runner.os }}-${{ matrix.python-version }}
make-targets: ''

Deploy:
if: ${{ github.event_name != 'pull_request' }}
needs: Docs
environment:
name: github-pages
runs-on: ubuntu-latest
env:
DOC_VERSION: dev

- name: Upload docs as artifact
if: ${{ github.event_name == 'pull_request' }}
uses: actions/upload-artifact@v3
steps:
- name: Download doc build
uses: actions/download-artifact@v4
with:
name: ${{ matrix.python-version }}-${{ matrix.dep-versions }}-docs
path: |
docs/build/html
!docs/_static/*.pdf
name: Linux-3.11-docs
path: ./docs/build/html

# This overrides the version "dev" with the proper version if we're building off a
# branch that's not main (which is confined to n.nn.x above) or on a tag.
- name: Set doc version
if: ${{ github.event_name != 'push' || !contains(github.ref, 'main') }}
run: echo "DOC_VERSION=v$(python -c 'import siphon; print(siphon.__version__.rsplit(".", maxsplit=2)[0])')" >> $GITHUB_ENV
run: echo "DOC_VERSION=v${{ needs.Docs.outputs.doc-version }}" >> $GITHUB_ENV

- name: Upload to GitHub Pages
if: ${{ github.event_name != 'pull_request' && matrix.experimental == false }}
uses: peaceiris/[email protected]
uses: peaceiris/actions-gh-pages@v4
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
deploy_key: ${{ secrets.GHPAGES_DEPLOY_KEY }}
publish_dir: ./docs/build/html
exclude_assets: '.buildinfo,_static/jquery-*.js,_static/underscore-*.js'
destination_dir: ./${{ env.DOC_VERSION }}
6 changes: 3 additions & 3 deletions ci/doc_requirements.txt
@@ -1,7 +1,7 @@
matplotlib==3.5.3
cartopy==0.22
matplotlib==3.8
metpy==1.2.0
pint==0.18
scipy==1.7.3
pint==0.19
sphinx==5.2.3
sphinx_rtd_theme==1.0.0
sphinx-gallery==0.11.1
26 changes: 26 additions & 0 deletions ci/download_cartopy_maps.py
@@ -0,0 +1,26 @@
#!/usr/bin/env python
# Copyright (c) 2021 MetPy Developers.
"""Explicitly download needed Cartopy maps."""
from cartopy.io import config, Downloader

AWS_TEMPLATE = ('https://naturalearth.s3.amazonaws.com/{resolution}_'
'{category}/ne_{resolution}_{name}.zip')


def grab_ne(category, feature, res):
"""Download the correct Natural Earth feature using Cartopy."""
download = Downloader.from_config(('shapefiles', 'natural_earth'))
download.path({'category': category, 'name': feature, 'resolution': res, 'config': config})


if __name__ == '__main__':
# Need to override the pre-Cartopy 0.20 URL to use S3
config['downloaders'][('shapefiles', 'natural_earth')].url_template = AWS_TEMPLATE

for feat in ['admin_0_boundary_lines_land', 'admin_1_states_provinces_lakes']:
for r in ['110m', '50m', '10m']:
grab_ne('cultural', feat, r)

for feat in ['coastline', 'lakes', 'land', 'ocean', 'rivers_lake_centerlines']:
for r in ['110m', '50m', '10m']:
grab_ne('physical', feat, r)
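As an aside (not part of the PR), one way to sanity-check the pre-fetched data is to ask Cartopy for the local path of one of the shapefiles; natural_earth() only downloads when a file is missing, so after running the script above it should resolve straight from the local cache:

# Illustrative check, not from this PR: confirm a pre-fetched Natural Earth
# shapefile resolves from Cartopy's local data directory without a download.
from cartopy.io.shapereader import natural_earth

# Returns the cached path; a download would only be attempted if the file were missing.
print(natural_earth(resolution='110m', category='physical', name='coastline'))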
45 changes: 45 additions & 0 deletions ci/filter_links.py
@@ -0,0 +1,45 @@
#!/usr/bin/env python

# Copyright (c) 2021 MetPy Developers.
"""Filter links from Sphinx's linkcheck."""
import json
import subprocess
import sys


def get_failing_links(fname):
"""Yield links with problematic statuses."""
with open(fname) as linkfile:
links = json.loads('[' + ','.join(linkfile) + ']')
for link in links:
if link['status'] not in {'working', 'ignored', 'unchecked'}:
yield link


def get_added():
"""Get all lines added in the most recent merge."""
revs = subprocess.check_output(['git', 'rev-list', '--parents', '-n', '1', 'HEAD'])
merge_commit, target, _ = revs.decode('utf-8').split()
diff = subprocess.check_output(['git', 'diff', f'{target}...{merge_commit}'])
return '\n'.join(line for line in diff.decode('utf-8').split('\n')
if line.startswith('+') and not line.startswith('+++'))


if __name__ == '__main__':
# If the second argument is true, then we only want links in the most recent merge,
# otherwise we print all failing links.
if sys.argv[2] in ('true', 'True'):
print('Checking only links in the diff')
added = get_added()
check_link = lambda link: link['uri'] in added
else:
print('Checking all links')
check_link = lambda link: True

ret = 0
for link in get_failing_links(sys.argv[1]):
if check_link(link):
ret = 1
print(f'{link["filename"]}:{link["lineno"]}: {link["uri"]} -> '
f'{link["status"]} {link["info"]}')

sys.exit(ret)
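For context on the input this script expects: Sphinx's linkcheck builder writes one JSON object per line to output.json in its build directory, which is why get_failing_links() wraps the lines in brackets before parsing. Below is a minimal sketch with fabricated sample records (the filenames, line numbers, and URIs are made up; in the workflow the real file would come from the docs linkcheck build, e.g. docs/build/linkcheck/output.json):

# Minimal sketch, not part of the PR: filter two fabricated linkcheck records
# the same way get_failing_links() and the main loop above do.
import json

sample_lines = [
    '{"filename": "index.rst", "lineno": 5, "status": "working", "uri": "https://example.com/", "info": ""}',
    '{"filename": "api.rst", "lineno": 12, "status": "broken", "uri": "https://example.com/missing", "info": "404"}',
]
links = json.loads('[' + ','.join(sample_lines) + ']')
for link in links:
    if link['status'] not in {'working', 'ignored', 'unchecked'}:
        print(f'{link["filename"]}:{link["lineno"]}: {link["uri"]} -> '
              f'{link["status"]} {link["info"]}')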