Feature/sphinx docs #5

Merged (5 commits) on Aug 9, 2024
6 changes: 5 additions & 1 deletion .github/workflows/check.yml
@@ -76,4 +76,8 @@ jobs:
      - name: Run mypy
        run: |
          poetry install --with mypy
          poetry run mypy .
          poetry run mypy .
      - name: Run sphinx
        run: |
          poetry install --with docs
          poetry run sphinx-build -M dummy ./docs ./docs/_build -W -a
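A note on the check added above: the `-M dummy` builder parses every documentation source without writing output, and `-W` turns warnings into errors, so the step fails on any documentation problem. The CI runs it through `poetry run sphinx-build`; purely as an illustrative sketch (not part of this PR), the same check could be reproduced from Python, assuming Sphinx is installed in the active environment:

import sys

from sphinx.cmd import build

# Mirror the CI step: a "dummy" build over ./docs that writes nothing,
# rebuilds everything (-a), and promotes warnings to errors (-W).
exit_status = build.main(["-M", "dummy", "./docs", "./docs/_build", "-W", "-a"])
sys.exit(exit_status)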
67 changes: 67 additions & 0 deletions .github/workflows/sphinx.yaml
@@ -0,0 +1,67 @@
# Sample workflow for building and deploying a Sphinx site to GitHub Pages
name: Deploy Sphinx site to Pages

on:
  # Runs on pushes targeting the default branch
  push:
    branches: [$default-branch]

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages
permissions:
  contents: read
  pages: write
  id-token: write

# Allow only one concurrent deployment, skipping runs queued between the run in-progress and latest queued.
# However, do NOT cancel in-progress runs as we want to allow these production deployments to complete.
concurrency:
  group: "pages"
  cancel-in-progress: false

# Default to bash
defaults:
  run:
    shell: bash

jobs:
  # Build job
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          submodules: recursive
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Install Poetry
        run: |
          pipx install poetry
      - name: Install dependencies
        run: poetry install --with docs
      - name: Setup Pages
        id: pages
        uses: actions/configure-pages@v5
      - name: Build docs
        run: |
          poetry run sphinx-build -M html ./docs ./docs/_build -q -a
      - name: Upload artifact
        uses: actions/upload-pages-artifact@v3
        with:
          path: ./docs/_build

  # Deployment job
  deploy:
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v4
5 changes: 5 additions & 0 deletions .pre-commit-config.yaml
@@ -52,3 +52,8 @@ repos:
        entry: poetry run mypy --no-namespace-packages --exclude esgf_playground_utils/models/__init__.py
        language: system
        types: [ file, python ]
      - id: docs
        name: docs
        entry: bash -c "poetry run sphinx-build -M dummy ./docs ./docs/_build -W -q -a"
        language: system
        types: [ file, python ]
3 changes: 2 additions & 1 deletion CONTRIBUTING.md
@@ -36,6 +36,7 @@ When you commit, the following checks will be run:
- mypy (python static type analysis)
- bandit (python SAST analysis)
- xenon (McCabe cyclomatic complexity analysis)
- sphinx (dry-run documentation build)

You can disable the `pre-commit hooks` per commit with the flag `--no-verify`; however, all checks will be performed in the CI.

@@ -49,7 +50,7 @@ foo@bar:~$ poetry run pre-commit run -a

The CI Environment runs the same checks as the `pre-commit hooks` on push[^2], plus the following additional checks:

- audit (checks all dependencies for vulnerabilities)
- audit (checks all dependencies for vulnerabilities)

[^1]: These pre-commit hooks will attempt to fix the issue in place
[^2]: On CI, no checks perform code changes in place
20 changes: 20 additions & 0 deletions docs/Makefile
@@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = .
BUILDDIR = _build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
Empty file added docs/_static/.keep
Empty file added docs/_templates/.keep
34 changes: 34 additions & 0 deletions docs/conf.py
@@ -0,0 +1,34 @@
# Configuration file for the Sphinx documentation builder.
#
# For the full list of built-in configuration values, see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html

# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information

project = "ESGF Playground Utils"
copyright = "2024, David Poulter"
author = "David Poulter"
release = "0.3.1-alpha.2"

# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration

extensions = [
    "autoapi.extension",
    "sphinx-pydantic",
    "sphinx_rtd_theme",
]

templates_path = ["_templates"]
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]


# -- Options for HTML output -------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output

html_theme = "sphinx_rtd_theme"
html_static_path = ["_static"]

# Autoapi Configuration
autoapi_dirs = ["../esgf_playground_utils"]
17 changes: 17 additions & 0 deletions docs/index.rst
@@ -0,0 +1,17 @@
.. ESGF Playground Utils documentation master file, created by
   sphinx-quickstart on Wed Jul 24 15:04:46 2024.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

ESGF Playground Utils documentation
===================================

Add your content using ``reStructuredText`` syntax. See the
`reStructuredText <https://www.sphinx-doc.org/en/master/usage/restructuredtext/index.html>`_
documentation for details.


.. toctree::
   :maxdepth: 2
   :caption: Contents:

35 changes: 35 additions & 0 deletions docs/make.bat
@@ -0,0 +1,35 @@
@ECHO OFF

pushd %~dp0

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build

%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.https://www.sphinx-doc.org/
exit /b 1
)

if "%1" == "" goto help

%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end

:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%

:end
popd
78 changes: 78 additions & 0 deletions esgf_playground_utils/models/kafka.py
@@ -11,60 +11,138 @@


class _Payload(BaseModel):
    """
    Base model for payloads in a Kafka message, providing the required ``collection_id`` attribute.

    .. warning::

        This model should not be used directly.

    """

    collection_id: str


class CreatePayload(_Payload):
    """
    Model describing a ``CREATE`` payload. This must be sent as a ``POST`` request.
    """

    method: Literal["POST"]
    item: Item


class RevokePayload(_Payload):
    """
    Model describing a ``REVOKE`` payload. This must be sent as a ``PATCH`` or ``DELETE`` request.

    .. note::

        It is intended that the ``PATCH`` request is interpreted as a "soft" delete, simply updating the item to
        signify that it is revoked.

        The ``DELETE`` request will operate as a "hard" delete, strictly removing the item from the STAC index.

    .. danger::

        The behaviour of either of these actions is not yet fully defined.
    """

    method: Literal["PATCH", "DELETE"]
    item_id: str


class UpdatePayload(_Payload):
    """
    Model describing an ``UPDATE`` payload. This must be sent as a ``PATCH`` or ``PUT`` request.

    .. danger::

        The behaviour of the ``PATCH`` request is not yet fully defined. Specifically, it is not clear
        if the ``PATCH`` request will require a specific ``item_id``, and as such will require an alternative
        model.
    """

    method: Literal["PUT", "PATCH"]
    item: Item


class Data(BaseModel):
    """
    Model describing the ``DATA`` component of a Kafka message. This contains the payload itself.

    .. note::

        Whilst the ``type`` and ``version`` attributes are available, it is not expected that these will change for
        a significant length of time.
    """

    type: Literal["STAC"]
    version: Literal["1.0.0"]
    payload: Union[CreatePayload, RevokePayload, UpdatePayload]


class Auth(BaseModel):
    """
    Model describing the ``AUTH`` component of a Kafka message.

    .. note::

        This is not an authorisation token or other verified identity. It is simply an indication of the institute
        providing the message.
    """

    client_id: str
    server: str


class Publisher(BaseModel):
    """
    Model describing the ``PUBLISHER`` component of a Kafka message. This is the name and version of the software used
    to publish the Kafka message.
    """

    package: str
    version: str


class Metadata(BaseModel):
    """
    Multiple metadata attributes required for ESGF but not part of the STAC payload.
    """

    auth: Auth
    publisher: Publisher
    time: datetime
    schema_version: str


class KafkaEvent(BaseModel):
    """
    The full content of a Kafka message, containing the STAC payload, the request description, and the
    ESGF-mandated metadata.
    """

    metadata: Metadata
    data: Data


class ErrorType(str, Enum):
    """
    Enum describing the source of the error that occurred.
    """

    payload = "payload"
    stac_server = "stac_server"
    kafka = "kafka"
    unknown = "unknown"


class Error(BaseModel):
    """
    Error event published to the Kafka error queue.
    """

    original_payload: str
    node: str
    traceback: str
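To illustrate how the models documented above fit together, here is a minimal sketch (not part of the PR) of building and serialising a `KafkaEvent` carrying a hard-delete `RevokePayload`, chosen because it does not need a STAC `Item`. All concrete values are illustrative only, and the serialisation call assumes Pydantic v2.

from datetime import datetime, timezone

from esgf_playground_utils.models.kafka import (
    Auth,
    Data,
    KafkaEvent,
    Metadata,
    Publisher,
    RevokePayload,
)

# A "hard delete" event: DELETE removes the item from the STAC index outright.
# Every value here is illustrative, not taken from a real ESGF deployment.
event = KafkaEvent(
    metadata=Metadata(
        auth=Auth(client_id="example-client", server="esgf.example.org"),
        publisher=Publisher(package="esgf-playground-utils", version="0.3.1-alpha.2"),
        time=datetime.now(timezone.utc),
        schema_version="1.0.0",
    ),
    data=Data(
        type="STAC",
        version="1.0.0",
        payload=RevokePayload(
            collection_id="example-collection",
            method="DELETE",
            item_id="example-item-id",
        ),
    ),
)

# Serialise for the Kafka producer; with Pydantic v1 this would be event.json().
print(event.model_dump_json(indent=2))

A consumer could round-trip the same message with `KafkaEvent.model_validate_json`, again assuming Pydantic v2.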