Welcome to the Jupyter Accessibility testing tools repository. 👋🏽 This repository is a place for accessibility testing within Jupyter.
> **Important:** To learn more about the broader accessibility initiatives within Jupyter, check the jupyter/accessibility repository.
Automated accessibility tests cannot address accessibility issues on their own, but used correctly they can be a useful tool.
Work in this repository is modelled after the JupyterLab Benchmarks repo.
As described in the Jupyter Accessibility Roadmap, the plan is to start by adding tests for JupyterLab: first the web app UI, then the documentation. (The rationale for that sequence is to tackle the harder problem first.)
Extending this testing to other parts of the Jupyter ecosystem beyond JupyterLab is not within the scope of the grant driving the roadmap, but we hope to do so after the JupyterLab work.
This repository is organized as follows:
```
.
├── .github
│   ├── .github/workflows                         # GitHub Actions workflows that run the accessibility tests based on certain types of triggers
│   └── .github/actions                           # composable actions that perform specific tasks (not to be used on their own but as part of a GitHub Actions workflow)
├── testing                                       # root folder for the testing tools
│   ├── testing/jupyterlab                        # testing tools and scripts for JupyterLab
│   │   ├── testing/jupyterlab/manual-testing-scripts  # "recipes" that explain in plain language how the automated tests can also be carried out manually
│   │   ├── testing/jupyterlab/tests              # set of Playwright automated tests
│   │   ├── testing/jupyterlab/environment.yml    # conda environment file to install the dependencies for the automated tests
│   │   ├── testing/jupyterlab/README.md          # documentation for the JupyterLab tests - start here to learn how to run the tests locally or in GitHub Actions
│   │   ├── testing/jupyterlab/package.json       # npm package file to install the dependencies for the automated tests
│   │   └── testing/jupyterlab/playwright.config.ts    # Playwright configuration file
│   ├── testing/notebooks                         # set of reference Jupyter notebooks used in the automated tests
│   └── testing/scripts                           # set of manual testing scripts for JupyterLab (include relevant WCAG success criteria and step-by-step audit guides)
├── .pre-commit-config.yaml                       # configuration file for the pre-commit hooks
├── README.md                                     # this file
└── LICENSE                                       # license file
```
👉🏽 To learn how to run and inspect the JupyterLab accessibility tests, check the JupyterLab testing README.
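At a high level, the Playwright tests load a page, run an accessibility engine such as axe-core against it, and assert that no blocking violations were reported. As a hedged sketch (the `Violation` shape is simplified and the `seriousViolations` helper is hypothetical, not taken from this repository), filtering a result set down to the violations a test should fail on might look like:

```typescript
// Simplified shape of an axe-core-style violation report entry.
// Real results carry more fields (nodes, help URLs, etc.).
interface Violation {
  id: string;
  impact: 'minor' | 'moderate' | 'serious' | 'critical';
  description: string;
}

// Hypothetical helper: keep only the violations a test should fail on.
function seriousViolations(violations: Violation[]): Violation[] {
  return violations.filter(
    (v) => v.impact === 'serious' || v.impact === 'critical'
  );
}

// Illustrative data; in a real test these would come from running the
// accessibility engine against a live JupyterLab page.
const sample: Violation[] = [
  { id: 'color-contrast', impact: 'serious', description: 'Insufficient color contrast' },
  { id: 'region', impact: 'moderate', description: 'Content outside landmarks' },
];

// Logs the ids of the violations that would fail the test.
console.log(seriousViolations(sample).map((v) => v.id));
```

A real test would then assert that the filtered list is empty, failing the build when serious or critical issues are detected.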
This repository uses several pre-commit hooks to standardize our codebase. Follow these steps to install the hooks:
- Before you can run the hooks, you need to install the pre-commit package manager:

  ```shell
  # using pip
  pip install pre-commit

  # if you prefer using conda
  conda install -c conda-forge pre-commit
  ```

- From the root of this project, install the git hook scripts:

  ```shell
  # install the pre-commit hooks
  pre-commit install
  ```

- Optional - run the hooks against all the files in this repository:

  ```shell
  # run the pre-commit hooks
  pre-commit run --all-files
  ```
Once installed, the pre-commit hooks will run automatically when you make a commit in version control.
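For reference, a `.pre-commit-config.yaml` follows this general shape (the repository and hook ids below are common examples from the pre-commit documentation, not necessarily the hooks this repository uses; see `.pre-commit-config.yaml` at the project root for the actual configuration):

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0                   # pin hooks to a specific tag
    hooks:
      - id: trailing-whitespace   # strip trailing whitespace
      - id: end-of-file-fixer     # ensure files end with a single newline
```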
Jupyter uses a shared copyright model that enables all contributors to maintain the copyright on their contributions. All code is licensed under the terms of the revised BSD license.