Jenkins: make it possible to run notebooks from external repos programmatically #138

Open · wants to merge 18 commits into base: master

Changes from 10 commits

Commits (18)
e594d24
downloadrepos: allow to be used as library as well
tlvu May 14, 2024
5216bca
downloadrepos: only download repos that will be tested
tlvu May 14, 2024
57936f5
testall: make processing here available to other repos via downloadrepos
tlvu May 14, 2024
cccae70
testall: make available all functions in downloadrepos to CONFIG_PARA…
tlvu May 14, 2024
b84da18
Jenkins: all artifacts are now under buildout/ so easier to add new n…
tlvu May 14, 2024
cfe5f95
testall: git clean before CONFIG_PARAMETERS_SCRIPT_URL to avoid wipin…
tlvu May 14, 2024
7253e6a
runtest: allow to override --nbval-sanitize-with from CONFIG_OVERRIDE…
tlvu May 14, 2024
d93b229
downloadrepos: avoid set +x because it hides all subsequent commands …
tlvu May 14, 2024
9160053
jenkins sample override: demo runnings notebooks from an external repo
tlvu May 14, 2024
25c0abb
runtest: CONFIG_PARAMETERS_SCRIPT_URL can override DEFAULT_PRODUCTION…
tlvu May 14, 2024
0182490
runtest: add repo and branch name to archived nbs name to id which re…
tlvu May 16, 2024
1cf434e
Merge remote-tracking branch 'origin/master' into make-it-easier-to-a…
tlvu Jun 5, 2024
a20ef26
Merge remote-tracking branch 'origin/master' into make-it-easier-to-a…
tlvu Oct 31, 2024
36fba3d
testall: try to avoid delete_files_confusing_pytest by using py.test …
tlvu Oct 31, 2024
5f13ec9
runtest: allow saved files under buildout/ to have hierarchy
tlvu Oct 31, 2024
c9c8d50
tests: allow to override artifact nb filename format archived by Jenkins
tlvu Oct 31, 2024
7fce738
runtest: allow overrideable post-processing steps
tlvu Oct 31, 2024
d0265f3
test artifacts: shorter filename format, name clash already handled
tlvu Oct 31, 2024
11 changes: 1 addition & 10 deletions Jenkinsfile
@@ -144,16 +144,7 @@ Note this is another run, will double the time and no guaranty to have same erro

post {
always {
archiveArtifacts(artifacts: 'notebooks/*.ipynb', fingerprint: true)
archiveArtifacts(artifacts: 'pavics-sdi-*/docs/source/notebooks/*.ipynb', fingerprint: true)
archiveArtifacts(artifacts: 'pavics-sdi-*/docs/source/notebook-components/*.ipynb', fingerprint: true)
archiveArtifacts(artifacts: 'finch-*/docs/source/notebooks/*.ipynb', fingerprint: true)
archiveArtifacts(artifacts: 'raven-*/docs/source/notebooks/*.ipynb', fingerprint: true)
archiveArtifacts(artifacts: 'RavenPy-*/docs/notebooks/*.ipynb', fingerprint: true)
archiveArtifacts(artifacts: 'RavenPy-*/docs/notebooks/paper/*.ipynb', fingerprint: true)
archiveArtifacts(artifacts: 'esgf-compute-api-*/examples/*.ipynb', fingerprint: true)
archiveArtifacts(artifacts: 'PAVICS-landing-*/content/notebooks/climate_indicators/*.ipynb', fingerprint: true)
archiveArtifacts(artifacts: 'buildout/*.output.ipynb', fingerprint: true, allowEmptyArchive: true)
archiveArtifacts(artifacts: 'buildout/*.ipynb', fingerprint: true, allowEmptyArchive: true)
Contributor:

Why not use a buildout/**/*.ipynb pattern and leave the directory structure as originally defined by each repository? That would avoid all the directory renaming manipulations and the potential side effects of removing pytest configs.

Contributor Author:

I previously thought that too. But in actual real-life usage, a flat directory is much easier to work with when downloading the notebooks I want. In a nested layout, if I want multiple notebooks under multiple different sub-folders, I have to click a lot more. It's tiring.

Now, with the option to archive only the notebooks enabled for the run, the chance of a name clash is even lower.

Contributor Author:

That would avoid all the directory renaming manipulations and potential side effects of removing pytest configs.

Not sure what you mean. I do not perform any directory renaming manipulations. The notebooks are run from the original location of the checkout because some use relative paths to get their test data in the same checkout.

I only copy the notebooks themselves, not the test data, to the buildout/ folder so they are archived by Jenkins, alongside their matching "output" notebook, so we can download both and diff them manually if the diff presented in the Jenkins console output is impossible to understand.

As for "potential side effects of removing pytest configs", I also do not understand how that relates to the Jenkins archiving config change.

Contributor:

All the notebooks are copied as a flat list, which could lead to conflicts between repositories. I was under the impression that the repository name was used as a prefix to the archived notebook names to avoid this, but it is not the case. With the results archived this way, it is very hard to trace back the original source of the notebook.

[screenshot: flat list of archived notebook artifacts]

As for "potential side effects of removing pytest configs", I also do not understand how that relates to the Jenkins archiving config change.

It does not affect the archiving, but it affects how the notebooks are executed by pytest in order to generate their outputs. Therefore, if any pytest configuration options are applied to obtain certain effects needed by the tests, this could have bad side effects.

Contributor Author:

it is very hard to trace back the original source of the notebook.

I never had trouble knowing which nb is from which repo, but I agree not everyone knows this. Just so it is clear, it was already like this before (a flat list of *.output.ipynb); I am just adding the original alongside the corresponding output.ipynb.

However, I'll add the repo name as a prefix to the nb filename to help differentiate them and avoid name clashes.

By the way, I did preserve the already existing name clash prevention:

# prevent name clash
filename="${filename}_`date '+%s'`"

It does not affect the archiving

But you had put your comment in this Jenkins archiving section, so that's why I was mixed up.

Like I said here #138 (comment), it's weird, but I had to do it; otherwise the test run does not work at all. The only hypothesis I have is mentioned in that comment.

By the way, deleting the extra files is existing behavior; it is not something I just added.

This PR is maximally backward-compatible. I did not add any new processing steps; I only made the existing processing steps available to external repos.

Contributor:

"look at the output in the artifact"? You mean look at the output in the "Console Output"?

No, I open the notebook directly from the artifacts page and look at its contents from the browser.

Because if you only look at the list of saved notebooks in the artifacts, you won't know which notebooks actually failed.

If I went through the artifact to look for a notebook, it's because I already know which test case I'm investigating to debug.

The listing of notebooks in the "Console Output", where any errors are also found, uses the original structure, not a flat one. This gives you exactly the repo and path of the notebook you want to run manually.

This is exactly why I would rather have the notebooks with the generated outputs located in the same locations as the current artifacts, using the same structure as displayed in the console output.

Contributor Author:

This is exactly why I would rather have the notebooks with the generated outputs located in the same locations as the current artifacts, using the same structure as displayed in the console output.

@fmigneault
Okay, how about I make all the files (original and output) keep the same folder structure as a prefix? Then you have your structure to easily search for your file, and I have my flat list to easily switch between multiple repos.

Something like buildout/pavics-sdi-master--docs--source--notebooks--WCS_example{,-output}.ipynb. Is that a good enough compromise so we can merge this?
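A rough sketch of that proposed naming, just to make the idea concrete (this is not code from the PR; the notebook path is only an example):

# Flatten the checkout path into the archived filename by replacing '/' with '--'.
nb="pavics-sdi-master/docs/source/notebooks/WCS_example.ipynb"
flat="$(echo "$nb" | sed 's@/@--@g' | sed 's/\.ipynb$//')"
cp "$nb" "buildout/${flat}.ipynb"
# the executed copy would be saved alongside as buildout/${flat}.output.ipynb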

Contributor:

I don't think it is a matter of compromise. The artifact structure should simply reflect the source repository. There is no need for an alternate naming convention and there is no need to define more code to handle it. The outputs should simply be saved where the notebooks are naturally collected in artifacts.

Contributor Author:

I don't think it is a matter of compromise.

Agreed, no compromise for usability.

Let me repeat comment #138 (comment) again if it was not clear enough: if "notebooks are under different nested folders! You'll have to click to get into the first folder, back out, then again into the 2nd folder! The more folders, the more back and forth it is!!!"

Just a reminder: this artifact archiving is just for immediate troubleshooting; it has no long-term persistence value. I'd rather focus on usability and productivity than follow some standard for something that is ephemeral and not important. Once you manage a lot of notebook repos, a flat list is much faster to deal with.

Contributor:

Even then, I do not agree. If you know the path of the notebook you are interested in after getting an error, you have no reason to go back and forth between folders.

I see the addition of code, the special-character handling, and the deletion of repo-specific configs as signs of a bigger maintenance burden and potential side effects.

archiveArtifacts(artifacts: 'buildout/env-dump/', fingerprint: true)
}
unsuccessful { // Run if the current builds status is "Aborted", "Failure" or "Unstable"
101 changes: 88 additions & 13 deletions downloadrepos
@@ -1,4 +1,6 @@
#!/bin/sh
# This file can be used both as executable script or library to be sourced.
# To use as library to be sourced, set DOWNLOADREPOS_AS_LIB=1 env var.

downloadrepos() {
github_repo="$1"; shift
@@ -13,25 +15,98 @@ downloadgithubrepos() {
repo_owner="`echo "$owner_and_repo_name" | sed "s@/.*\\$@@g"`"
repo_name="`echo "$owner_and_repo_name" | sed "s@^.*/@@g"`"
repo_branch="$1"; shift
set -x
# clean up other previously downloaded branches of the same repo as well
rm -rf ${repo_name}-*
ls | grep $repo_name
downloadrepos https://github.com/$repo_owner/$repo_name "$repo_branch"
ls | grep $repo_name
set +x
}

. ./default_build_params
# USAGE: VAR_TO_LOWER="$(lowercase "$VAR_TO_LOWER")"
lowercase() {
echo "$1" | tr '[:upper:]' '[:lower:]'
}

lowercase_boolean_build_params() {
TEST_MAGPIE_AUTH="$(lowercase "$TEST_MAGPIE_AUTH")"
TEST_PAVICS_SDI_REPO="$(lowercase "$TEST_PAVICS_SDI_REPO")"
TEST_PAVICS_SDI_WEAVER="$(lowercase "$TEST_PAVICS_SDI_WEAVER")"
TEST_FINCH_REPO="$(lowercase "$TEST_FINCH_REPO")"
TEST_PAVICS_LANDING_REPO="$(lowercase "$TEST_PAVICS_LANDING_REPO")"
TEST_RAVEN_REPO="$(lowercase "$TEST_RAVEN_REPO")"
TEST_RAVENPY_REPO="$(lowercase "$TEST_RAVENPY_REPO")"
TEST_ESGF_COMPUTE_API_REPO="$(lowercase "$TEST_ESGF_COMPUTE_API_REPO")"
TEST_LOCAL_NOTEBOOKS="$(lowercase "$TEST_LOCAL_NOTEBOOKS")"
}

# Replace all slash (/) by dash (-) because (/) is illegal in folder name
# for branch name of the format "feature/my_wizbang-feature".
# Github does the same when downloading repo archive by downloadrepos above.
# USAGE: export BRANCH_NAME="$(sanitize_branch_name "$BRANCH_NAME")"
sanitize_branch_name() {
echo "$1" | sed "s@/@-@g"
}

# Ex: extract 'pavics-sdi' from 'Ouranosinc/pavics-sdi'.
# USAGE: REPO_NAME_ONLY="$(extract_repo_name "$REPO_NAME")"
extract_repo_name() {
echo "$1" | sed "s@^.*/@@g"
}

# Branches that have allowed characters such as '+' other than alphanum, '-', '_' and '.' are converted to '-' in archives.
# USAGE: FOLDER_NAME="$(sanitize_extracted_folder_name "$FOLDER_NAME")"
sanitize_extracted_folder_name() {
echo "$1" | sed "s@[^a-zA-Z0-9_\-\.]@-@g"
}

# Presence of setup.cfg, tox.ini, pyproject.toml files confuse py.test execution rootdir discovery.
# USAGE: delete_files_confusing_pytest "$CHECKOUT_DIR"
delete_files_confusing_pytest() {
for afile in setup.cfg tox.ini pyproject.toml; do
if [ -f "$1/$afile" ]; then
rm -v "$1/$afile"
fi
done
}
Contributor:

If the repository defines pytest options, such as necessary plugins or marker definitions, this will break their execution.
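For instance, a minimal, hypothetical example of the kind of configuration that would be dropped (not taken from any of these repos):

# Hypothetical setup.cfg content a repo might rely on for its notebook tests;
# deleting the file silently drops these options for the py.test run.
cat <<'EOF'
[tool:pytest]
addopts = --strict-markers
markers =
    online: tests that need network access
EOF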

Contributor Author:

I agree this is weird. I have no other hypothesis than this: because I run multiple notebooks from multiple repos at the same time, if each repo has its own config, it breaks the full test run.

Collaborator:

In that case, we should be running a separate py.test process for each repo instead of all at once.

I agree with @fmigneault that there could be crucial definitions in the setup.cfg, tox.ini, and pyproject.toml files that we do not want to lose, or the tests may fail unexpectedly.
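A minimal sketch of that alternative (not implemented in this PR), reusing the per-repo notebook globs from testall:

# One py.test process per repo checkout, so each repo keeps its own
# setup.cfg / tox.ini / pyproject.toml and its own pytest rootdir.
EXIT_CODE=0
for nb_glob in \
    "${FINCH_DIR}/docs/source/notebooks/*.ipynb" \
    "${RAVEN_DIR}/docs/source/notebooks/*.ipynb" \
    "${RAVENPY_DIR}/docs/notebooks/*.ipynb"
do
    py.test --nbval $nb_glob \
        --nbval-sanitize-with notebooks/output-sanitize.cfg \
        $PYTEST_EXTRA_OPTS || EXIT_CODE=1
done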

Contributor Author:

Is it possible those pytest definitions are for the unit tests, while here we are running notebook tests, so they do not apply? For the current list of repos hardcoded in this Jenkins config, missing those pytest definitions has been absolutely fine since forever. Keeping them actually causes trouble.

The step to remove the setup.cfg, tox.ini, and pyproject.toml files is not hardcoded, so external repos can choose to keep them if needed. See the example in jenkins-params-external-repos.include.sh: just do not include the call to delete_files_confusing_pytest.


downloadrepos_main() {
. ./default_build_params

lowercase_boolean_build_params

if [ -z "$DOWNLOAD_ALL_DEFAULT_REPOS" ]; then
# Back-compat with old default behavior, used in binder/reorg-notebook
# and other external scripts that autodeploy tutorial notebooks (see
# https://github.com/bird-house/birdhouse-deploy/blob/444a7c35a31aa8ad351e47f659383ba5c2919705/birdhouse/deployment/trigger-deploy-notebook#L64-L75)
DOWNLOAD_ALL_DEFAULT_REPOS=true
fi
Comment on lines +67 to +72
Contributor:

I think I would rather have the other script set DOWNLOAD_ALL_DEFAULT_REPOS=true if it needs all of them, and have the default behavior be to skip unnecessary downloads.

Contributor Author:

No, that's the point. If the other scripts have to set this new DOWNLOAD_ALL_DEFAULT_REPOS=true, it means I break backward compatibility with those scripts.

I want no changes to other scripts outside this repo.
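Concretely, the back-compat split this relies on looks like this (mirroring the downloadrepos and testall changes in this PR):

# External callers outside this repo keep the old behavior unchanged:
# no new variable set, all default repos are still downloaded.
./downloadrepos

# Only testall in this repo opts in to the new selective behavior:
DOWNLOADREPOS_AS_LIB=1
. ./downloadrepos
DOWNLOAD_ALL_DEFAULT_REPOS=false
downloadrepos_main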

Contributor:

Instead, every CI needs to inject the new variable to take advantage of the more intelligent download, rather than getting it for free. Since birdhouse-deploy does a wget to retrieve those scripts, it makes more sense IMO that it updates along with new features, or uses an explicit commit hash or tag to ensure the behavior remains consistent.
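For example, something like the following (the raw URL and commit hash are placeholders, not taken from this PR):

# Fetch the helper script at a pinned commit instead of the moving master branch.
PINNED_COMMIT="0123456789abcdef0123456789abcdef01234567"   # placeholder hash
wget "https://raw.githubusercontent.com/EXAMPLE_ORG/EXAMPLE_REPO/${PINNED_COMMIT}/downloadrepos"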

Contributor Author:

every CI needs to inject the new variable to take advantage of the more intelligent download

Huh, the only one is this Jenkins, or more precisely the testall script. If someone deploys this job on another server, the entry point is still the testall script, not downloadrepos directly.

use an explicit commit hash or tag to ensure the behavior remains consistent.

Using an exact commit hash is like pinning everything in your requirements.txt. We do not do that because updates would be too tedious; same here.


if [ -z "$1" ]; then
if [ x"$DOWNLOAD_ALL_DEFAULT_REPOS" = xtrue ] || [ x"$TEST_PAVICS_SDI_REPO" = xtrue ]; then
downloadgithubrepos $PAVICS_SDI_REPO $PAVICS_SDI_BRANCH
fi
if [ x"$DOWNLOAD_ALL_DEFAULT_REPOS" = xtrue ] || [ x"$TEST_FINCH_REPO" = xtrue ]; then
downloadgithubrepos $FINCH_REPO $FINCH_BRANCH
fi
if [ x"$DOWNLOAD_ALL_DEFAULT_REPOS" = xtrue ] || [ x"$TEST_PAVICS_LANDING_REPO" = xtrue ]; then
downloadgithubrepos $PAVICS_LANDING_REPO $PAVICS_LANDING_BRANCH
fi
if [ x"$DOWNLOAD_ALL_DEFAULT_REPOS" = xtrue ] || [ x"$TEST_RAVEN_REPO" = xtrue ]; then
downloadgithubrepos $RAVEN_REPO $RAVEN_BRANCH
fi
if [ x"$DOWNLOAD_ALL_DEFAULT_REPOS" = xtrue ] || [ x"$TEST_RAVENPY_REPO" = xtrue ]; then
downloadgithubrepos $RAVENPY_REPO $RAVENPY_BRANCH
fi
if [ x"$DOWNLOAD_ALL_DEFAULT_REPOS" = xtrue ] || [ x"$TEST_ESGF_COMPUTE_API_REPO" = xtrue ]; then
downloadgithubrepos $ESGF_COMPUTE_API_REPO $ESGF_COMPUTE_API_BRANCH
fi
else
set -x
downloadrepos "$@"
fi
}

if [ -z "$1" ]; then
downloadgithubrepos $PAVICS_SDI_REPO $PAVICS_SDI_BRANCH
downloadgithubrepos $FINCH_REPO $FINCH_BRANCH
downloadgithubrepos $PAVICS_LANDING_REPO $PAVICS_LANDING_BRANCH
downloadgithubrepos $RAVEN_REPO $RAVEN_BRANCH
downloadgithubrepos $RAVENPY_REPO $RAVENPY_BRANCH
downloadgithubrepos $ESGF_COMPUTE_API_REPO $ESGF_COMPUTE_API_BRANCH
else
set -x
downloadrepos "$@"
if [ -z "$DOWNLOADREPOS_AS_LIB" ]; then
# Script mode, not library mode.
downloadrepos_main "$@"
fi
38 changes: 24 additions & 14 deletions runtest
@@ -1,6 +1,11 @@
#!/bin/sh

DEFAULT_PRODUCTION_HOST="pavics.ouranos.ca"
# Load shared functions, make available to CONFIG_OVERRIDE_SCRIPT_URL.
DOWNLOADREPOS_AS_LIB=1
. ./downloadrepos

# CONFIG_PARAMETERS_SCRIPT_URL can override DEFAULT_PRODUCTION_HOST.
DEFAULT_PRODUCTION_HOST="${DEFAULT_PRODUCTION_HOST:=pavics.ouranos.ca}"

NOTEBOOKS="$1"
if [ -z "$NOTEBOOKS" ]; then
@@ -63,11 +68,12 @@ if [ -n "$CONFIG_OVERRIDE_SCRIPT_URL" ]; then
fi
fi

py.test --nbval $NOTEBOOKS --nbval-sanitize-with notebooks/output-sanitize.cfg $PYTEST_EXTRA_OPTS
# CONFIG_OVERRIDE_SCRIPT_URL can override NBVAL_SANITIZE_CFG_FILE.
py.test --nbval $NOTEBOOKS --nbval-sanitize-with "${NBVAL_SANITIZE_CFG_FILE:=notebooks/output-sanitize.cfg}" $PYTEST_EXTRA_OPTS
EXIT_CODE="$?"

# lowercase SAVE_RESULTING_NOTEBOOK string
SAVE_RESULTING_NOTEBOOK="`echo "$SAVE_RESULTING_NOTEBOOK" | tr '[:upper:]' '[:lower:]'`"
SAVE_RESULTING_NOTEBOOK="$(lowercase "$SAVE_RESULTING_NOTEBOOK")"


# save notebooks resulting from the run
@@ -79,24 +85,28 @@ SAVE_RESULTING_NOTEBOOK="`echo "$SAVE_RESULTING_NOTEBOOK" | tr '[:upper:]' '[:lower:]'`"
# work-around as nbval can not save the result of the run
# see https://github.com/computationalmodelling/nbval/issues/112

if [ x"$SAVE_RESULTING_NOTEBOOK" = xtrue ]; then
mkdir -p buildout
for nb in $NOTEBOOKS; do
filename="`basename "$nb"`"
filename="`echo "$filename" | sed "s/.ipynb$//"`" # remove .ipynb ext
if [ -e "buildout/${filename}.output.ipynb" ]; then
# prevent name clash
filename="${filename}_`date '+%s'`"
fi
mkdir -p buildout/
for nb in $NOTEBOOKS; do
filename="$(basename "$nb")"
filename="$(echo "$filename" | sed "s/.ipynb$//")" # remove .ipynb ext
if [ -e "buildout/${filename}.ipynb" ]; then
# prevent name clash
filename="${filename}_$(date '+%s')"
fi

# Save original notebooks that we sed replace the PAVICS_HOST.
cp "$nb" "buildout/${filename}.ipynb"

if [ x"$SAVE_RESULTING_NOTEBOOK" = xtrue ]; then
# Timeout must not be more than 240s (4 mins). Default in Jenkinsfile.
# Tutorial notebooks should be fast so user do not lose patience waiting
# for them to run. If more than 4 mins, in addition to simplifying the
# notebook, should also check machine performance.
jupyter nbconvert --to notebook --execute \
--ExecutePreprocessor.timeout=${SAVE_RESULTING_NOTEBOOK_TIMEOUT:=240} --allow-errors \
--output-dir buildout --output "${filename}.output.ipynb" "$nb"
done
fi
fi
done

# exit with return code from py.test
exit $EXIT_CODE
47 changes: 47 additions & 0 deletions test-override/jenkins-params-external-repos.include.sh
@@ -0,0 +1,47 @@
#!/bin/sh
#
# Sample Jenkins params override script to demonstrate running new notebooks
# from an external repo and on-the-fly CONFIG_OVERRIDE_SCRIPT_URL file creation.
#
# This script is intended for param CONFIG_PARAMETERS_SCRIPT_URL.

# Scenario: we want to run notebooks from an external repo, unknown to current Jenkins config.
# https://github.com/roocs/rook/tree/master/notebooks/*.ipynb

# Disable all existing default repos to avoid downloading them and running them.
TEST_PAVICS_SDI_REPO="false"
TEST_FINCH_REPO="false"
TEST_PAVICS_LANDING_REPO="false"
TEST_LOCAL_NOTEBOOKS="false"

# Set new external repo vars. Need 'export' so CONFIG_OVERRIDE_SCRIPT_URL can see them.
export ROOK_REPO="roocs/rook"
export ROOK_BRANCH="master"

# Not checking for expected output, just checking whether the code can run without errors.
PYTEST_EXTRA_OPTS="$PYTEST_EXTRA_OPTS --nbval-lax"

# Create CONFIG_OVERRIDE_SCRIPT_URL file on-the-fly to run the notebooks from
# our external repo.

CONFIG_OVERRIDE_SCRIPT_URL="/tmp/custom-repos.include.sh"

# Populate the content of our CONFIG_OVERRIDE_SCRIPT_URL.
echo '
#!/bin/sh
# Sample config override script to run new notebooks from new external repo.

# Replicate processing steps in 'testall' script.

# Download the external repo.
downloadgithubrepos $ROOK_REPO $ROOK_BRANCH

# Prep vars for including new nbs in nb list to test.
ROOK_REPO_NAME="$(extract_repo_name "$ROOK_REPO")"
ROOK_DIR="$(sanitize_extracted_folder_name "${ROOK_REPO_NAME}-${ROOK_BRANCH}")"

delete_files_confusing_pytest "$ROOK_DIR"

# Set new nbs as nb list to test.
NOTEBOOKS="$ROOK_DIR/notebooks/*.ipynb"
' > "$CONFIG_OVERRIDE_SCRIPT_URL"
82 changes: 33 additions & 49 deletions testall
@@ -2,8 +2,15 @@

. ./default_build_params

# Load shared functions, make available to CONFIG_PARAMETERS_SCRIPT_URL.
DOWNLOADREPOS_AS_LIB=1
. ./downloadrepos

set -x

# emulate "clean after checkout" of single branch pipeline
git clean -fdx

# Allow full override of ALL Jenkins params before running test suite.
# Intended to overrride all params in Jenkinsfile.
#
@@ -33,40 +40,33 @@ if [ -n "$CONFIG_PARAMETERS_SCRIPT_URL" ]; then
. "$TMP_PARAMS_OVERRIDE"
fi

# emulate "clean after checkout" of single branch pipeline
git clean -fdx

# download all additional repos containing extra notebooks to test
./downloadrepos
DOWNLOAD_ALL_DEFAULT_REPOS=false
downloadrepos_main

# 'export' useful vars so they can be used by the CONFIG_OVERRIDE_SCRIPT_URL in runtest.

# replace all slash (/) by dash (-) because (/) is illegal in folder name
# for branch name of the format "feature/my_wizbang-feature"
# github does the same when downloading repo archive by downloadrepos above
export PAVICS_SDI_BRANCH="`echo "$PAVICS_SDI_BRANCH" | sed "s@/@-@g"`"
export PAVICS_SDI_REPO_NAME="`echo "$PAVICS_SDI_REPO" | sed "s@^.*/@@g"`"
export FINCH_BRANCH="`echo "$FINCH_BRANCH" | sed "s@/@-@g"`"
export FINCH_REPO_NAME="`echo "$FINCH_REPO" | sed "s@^.*/@@g"`"
export PAVICS_LANDING_BRANCH="`echo "$PAVICS_LANDING_BRANCH" | sed "s@/@-@g"`"
export PAVICS_LANDING_REPO_NAME="`echo "$PAVICS_LANDING_REPO" | sed "s@^.*/@@g"`"
export RAVEN_BRANCH="`echo "$RAVEN_BRANCH" | sed "s@/@-@g"`"
export RAVEN_REPO_NAME="`echo "$RAVEN_REPO" | sed "s@^.*/@@g"`"
export RAVENPY_BRANCH="`echo "$RAVENPY_BRANCH" | sed "s@/@-@g"`"
export RAVENPY_REPO_NAME="`echo "$RAVENPY_REPO" | sed "s@^.*/@@g"`"
export ESGF_COMPUTE_API_BRANCH="`echo "$ESGF_COMPUTE_API_BRANCH" | sed "s@/@-@g"`"
export ESGF_COMPUTE_API_REPO_NAME="`echo "$ESGF_COMPUTE_API_REPO" | sed "s@^.*/@@g"`"

# branches that have allowed characters such as '+' other than alphanum, '-' and '_' are converted to '-' in archives
export PAVICS_SDI_DIR=`echo "${PAVICS_SDI_REPO_NAME}-${PAVICS_SDI_BRANCH}" | sed "s@[^a-zA-Z0-9_\-\.]@-@g"`
export FINCH_DIR=`echo "${FINCH_REPO_NAME}-${FINCH_BRANCH}" | sed "s@[^a-zA-Z0-9_\-\.]@-@g"`
export PAVICS_LANDING_DIR=`echo "${PAVICS_LANDING_REPO_NAME}-${PAVICS_LANDING_BRANCH}" | sed "s@[^a-zA-Z0-9_\-\.]@-@g"`
export RAVEN_DIR=`echo "${RAVEN_REPO_NAME}-${RAVEN_BRANCH}" | sed "s@[^a-zA-Z0-9_\-\.]@-@g"`
export RAVENPY_DIR=`echo "${RAVENPY_REPO_NAME}-${RAVENPY_BRANCH}" | sed "s@[^a-zA-Z0-9_\-\.]@-@g"`
export ESGF_COMPUTE_API_DIR=`echo "${ESGF_COMPUTE_API_REPO_NAME}-${ESGF_COMPUTE_API_BRANCH}" | sed "s@[^a-zA-Z0-9_\-\.]@-@g"`
export PAVICS_SDI_BRANCH="$(sanitize_branch_name "$PAVICS_SDI_BRANCH")"
export PAVICS_SDI_REPO_NAME="$(extract_repo_name "$PAVICS_SDI_REPO")"
export FINCH_BRANCH="$(sanitize_branch_name "$FINCH_BRANCH")"
export FINCH_REPO_NAME="$(extract_repo_name "$FINCH_REPO")"
export PAVICS_LANDING_BRANCH="$(sanitize_branch_name "$PAVICS_LANDING_BRANCH")"
export PAVICS_LANDING_REPO_NAME="$(extract_repo_name "$PAVICS_LANDING_REPO")"
export RAVEN_BRANCH="$(sanitize_branch_name "$RAVEN_BRANCH")"
export RAVEN_REPO_NAME="$(extract_repo_name "$RAVEN_REPO")"
export RAVENPY_BRANCH="$(sanitize_branch_name "$RAVENPY_BRANCH")"
export RAVENPY_REPO_NAME="$(extract_repo_name "$RAVENPY_REPO")"
export ESGF_COMPUTE_API_BRANCH="$(sanitize_branch_name "$ESGF_COMPUTE_API_BRANCH")"
export ESGF_COMPUTE_API_REPO_NAME="$(extract_repo_name "$ESGF_COMPUTE_API_REPO")"

export PAVICS_SDI_DIR="$(sanitize_extracted_folder_name "${PAVICS_SDI_REPO_NAME}-${PAVICS_SDI_BRANCH}")"
export FINCH_DIR="$(sanitize_extracted_folder_name "${FINCH_REPO_NAME}-${FINCH_BRANCH}")"
export PAVICS_LANDING_DIR="$(sanitize_extracted_folder_name "${PAVICS_LANDING_REPO_NAME}-${PAVICS_LANDING_BRANCH}")"
export RAVEN_DIR="$(sanitize_extracted_folder_name "${RAVEN_REPO_NAME}-${RAVEN_BRANCH}")"
export RAVENPY_DIR="$(sanitize_extracted_folder_name "${RAVENPY_REPO_NAME}-${RAVENPY_BRANCH}")"
export ESGF_COMPUTE_API_DIR="$(sanitize_extracted_folder_name "${ESGF_COMPUTE_API_REPO_NAME}-${ESGF_COMPUTE_API_BRANCH}")"

# lowercase VERIFY_SSL string
VERIFY_SSL="`echo "$VERIFY_SSL" | tr '[:upper:]' '[:lower:]'`"
VERIFY_SSL="$(lowercase "$VERIFY_SSL")"
if [ x"$VERIFY_SSL" = xfalse ]; then
# if Env var DISABLE_VERIFY_SSL is present, notebook should disable ssl
# cert verification
@@ -76,26 +76,6 @@ if [ x"$VERIFY_SSL" = xfalse ]; then
echo "setting env var DISABLE_VERIFY_SSL for notebooks"
fi

# presence of setup.cfg, tox.ini, pyproject.toml files confuse py.test execution rootdir discovery
rm -v $FINCH_REPO_NAME-$FINCH_BRANCH/setup.cfg
rm -v $RAVEN_REPO_NAME-$RAVEN_BRANCH/setup.cfg
rm -v $RAVEN_REPO_NAME-$RAVEN_BRANCH/pyproject.toml
rm -v $RAVENPY_REPO_NAME-$RAVENPY_BRANCH/setup.cfg
rm -v $RAVENPY_REPO_NAME-$RAVENPY_BRANCH/tox.ini
rm -v $RAVENPY_REPO_NAME-$RAVENPY_BRANCH/pyproject.toml
rm -v $ESGF_COMPUTE_API_REPO_NAME-$ESGF_COMPUTE_API_BRANCH/setup.cfg
rm -v $ESGF_COMPUTE_API_REPO_NAME-$ESGF_COMPUTE_API_BRANCH/tox.ini

# lowercase
TEST_MAGPIE_AUTH="`echo "$TEST_MAGPIE_AUTH" | tr '[:upper:]' '[:lower:]'`"
TEST_PAVICS_SDI_REPO="`echo "$TEST_PAVICS_SDI_REPO" | tr '[:upper:]' '[:lower:]'`"
TEST_PAVICS_SDI_WEAVER="`echo "$TEST_PAVICS_SDI_WEAVER" | tr '[:upper:]' '[:lower:]'`"
TEST_FINCH_REPO="`echo "$TEST_FINCH_REPO" | tr '[:upper:]' '[:lower:]'`"
TEST_PAVICS_LANDING_REPO="`echo "$TEST_PAVICS_LANDING_REPO" | tr '[:upper:]' '[:lower:]'`"
TEST_RAVEN_REPO="`echo "$TEST_RAVEN_REPO" | tr '[:upper:]' '[:lower:]'`"
TEST_RAVENPY_REPO="`echo "$TEST_RAVENPY_REPO" | tr '[:upper:]' '[:lower:]'`"
TEST_ESGF_COMPUTE_API_REPO="`echo "$TEST_ESGF_COMPUTE_API_REPO" | tr '[:upper:]' '[:lower:]'`"
TEST_LOCAL_NOTEBOOKS="`echo "$TEST_LOCAL_NOTEBOOKS" | tr '[:upper:]' '[:lower:]'`"

NOTEBOOKS_TO_TEST=""
if [ x"$TEST_MAGPIE_AUTH" = xtrue ]; then
@@ -108,6 +88,7 @@ if [ x"$TEST_PAVICS_SDI_REPO" = xtrue ]; then
fi
fi
if [ x"$TEST_FINCH_REPO" = xtrue ]; then
delete_files_confusing_pytest "$FINCH_DIR"
NOTEBOOKS_TO_TEST="$NOTEBOOKS_TO_TEST ${FINCH_DIR}/docs/source/notebooks/*.ipynb"
fi

@@ -126,13 +107,16 @@ if [ x"$TEST_PAVICS_LANDING_REPO" = xtrue ]; then
fi

if [ x"$TEST_RAVEN_REPO" = xtrue ]; then
delete_files_confusing_pytest "$RAVEN_DIR"
NOTEBOOKS_TO_TEST="$NOTEBOOKS_TO_TEST ${RAVEN_DIR}/docs/source/notebooks/*.ipynb"
fi
if [ x"$TEST_RAVENPY_REPO" = xtrue ]; then
delete_files_confusing_pytest "$RAVENPY_DIR"
NOTEBOOKS_TO_TEST="$NOTEBOOKS_TO_TEST ${RAVENPY_DIR}/docs/notebooks/*.ipynb"
NOTEBOOKS_TO_TEST="$NOTEBOOKS_TO_TEST ${RAVENPY_DIR}/docs/notebooks/paper/*.ipynb"
fi
if [ x"$TEST_ESGF_COMPUTE_API_REPO" = xtrue ]; then
delete_files_confusing_pytest "$ESGF_COMPUTE_API_DIR"
NOTEBOOKS_TO_TEST="$NOTEBOOKS_TO_TEST ${ESGF_COMPUTE_API_DIR}/examples/*.ipynb"
fi
