This Python server launches Jupyter notebooks (`*.ipynb`) and collects results from them.
- `notebooks` (default: `/home/jupyter-notebook/`) - path to the directory with notebooks. j-sp searches for files with the `ipynb` extension recursively in the specified folder.
- `results` (default: `/home/jupyter-notebook/results`) - path to the directory for run results. j-sp resolves result files with the `jsonl` extension against the specified folder.
- `logs` (default: `/home/jupyter-notebook/logs`) - path to the directory for run logs. j-sp puts run logs into the specified folder.
- `out-of-use-engine-time` (default: `3600`) - out-of-use time interval in seconds. j-sp unregisters the engine related to a notebook when the user hasn't run the notebook for longer than this time.
- `/home/json-stream/` - home folder for j-sp. It can contain installed Python libraries, `pip.conf`, and other files useful for running notebooks.
- `/home/jupyter-notebook/` - shared folder between this tool and any source of notebooks. In general, j-sp should be run together with jupyter notebook/lab/hub. A user can develop and debug a notebook in Jupyter and run it via j-sp.
Since version `5.2.7-TH2-5142-9348403860`, th2-rpt-viewer can interact with j-sp via the `http://<cluster>:<port>/th2-<schema>/json-stream-provider/` URL.

j-sp uses pod resources to run notebooks. Please calculate the required resources according to the issues being solved.
```yaml
apiVersion: th2.exactpro.com/v2
kind: Th2Box
metadata:
  name: json-stream-provider
spec:
  imageName: ghcr.io/th2-net/th2-json-stream-provider-py
  imageVersion: 0.0.2
  type: th2-rpt-data-provider
  customConfig:
    notebooks: /home/jupyter-notebook/
    results: /home/jupyter-notebook/j-sp/results/
    logs: /home/jupyter-notebook/j-sp/logs/
    out-of-use-engine-time: 3600
  mounting:
    - path: /home/jupyter-notebook/
      pvcName: jupyter-notebook
    - path: /home/json-stream/
      pvcName: json-stream-provider
  resources:
    limits:
      memory: 1000Mi
      cpu: 1000m
    requests:
      memory: 100Mi
      cpu: 100m
  service:
    enabled: true
    ingress:
      urlPaths:
        - '/json-stream-provider/'
    clusterIP:
      - name: backend
        containerPort: 8080
        port: 8080
```
- Cell tagged `parameters` (required) - this cell should contain parameters only.
  - Parameters can have typing and a value, but the value must be a constant of a primitive type such as boolean, number, or string.
  - Required parameters:
    - `output_path` - path to a JSONL file. The server considers the content of this file as run results.
- Cell with dependencies (optional) - the server doesn't include third-party packages by default.
  You can install / uninstall packages required for your code in one of the cells. All installed packages are shared between runs of any notebook.
  Installation example:
  ```python
  import sys
  !{sys.executable} -m pip install <package_name>==<package_version>
  ```
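A `parameters` cell satisfying the rules above might look like the following sketch. Only `output_path` is a documented requirement; `batch_size` and `verbose` are hypothetical user-defined parameters, and the value of `output_path` is made up for illustration.

```python
# Contents of the cell tagged `parameters`.
# Every value must be a constant of a primitive type (boolean, number, string).
output_path: str = "results/run-results.jsonl"  # required: JSONL file read back as run results
batch_size: int = 100     # hypothetical user parameter
verbose: bool = False     # hypothetical user parameter
```

papermill overrides these constants with the values supplied at run time, which is why expressions or non-primitive values are not allowed here.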
You can put files required for your Jupyter notebooks into the `local-run/with-jupyter-notebook/user_data` folder. Please note that this folder is read-only for containers.
Alternatively, you can mount your own folder by changing the value of the `USER_DATA_DIR` environment variable in the `local-run/with-jupyter-notebook/.env` file, or by changing the `local-run/with-jupyter-notebook/compose.yml` file. Please note you should mount the same directory at the same path to the `jupyter_notebook` and `json_stream_provider` services.
`jupyter-notebook` and `json-stream-provider` use a user from the default Linux `users` group. It means that:
- internal folders of the `user_data` folder should have `rwx` permission for the `users` group.
- files in the `user_data` folder should have `rw` permission for the `users` group.
You may need sudo permission for the next commands:
```shell
cd local-run/with-jupyter-notebook
chgrp -R users user_data/
chmod -R g=u user_data/
```
```shell
cd local-run/with-jupyter-notebook
docker compose up --build
```
```shell
cd local-run/with-jupyter-notebook
docker compose rm --force --volumes --stop
docker compose down --volumes
docker compose build
```
- http://localhost - th2-rpt-viewer
- http://localhost/jupyter - jupyter-notebook. You can authorize via the token printed into the `jupyter_notebook` logs:
  ```shell
  cd local-run/with-jupyter-notebook
  docker compose logs jupyter_notebook | grep 'jupyter/lab?token=' | tail -1 | cut -d '=' -f 2
  ```
- Custom engine holds a separate papermill notebook client for each file.
- Added a papermill custom engine to reuse it for notebook execution. A separate engine is registered for each notebook and unregistered after 1 hour of out-of-use time by default.
- update local run with jupyter-notebook:
- updated th2-rpt-viewer:
  - `JSON Reader` page pulls execution status every 50 ms instead of 1 sec
  - `JSON Reader` page now uses virtuoso for rendering lists
  - `JSON Reader` page now has search; its values can be loaded from a `json` file containing an array of objects with `pattern` and `color` fields for searching content. Execution of a notebook can create such a file, and it will be loaded into the UI if it is created in the path given by the `customization_path` parameter.
  - Added ability to create multiple `JSON Reader` pages.
  - `JSON Reader` page now has compare mode.
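As a sketch of how a notebook might produce the search-customization file described above: the entry values and the file name below are made-up illustrations, assuming the notebook's `customization_path` parameter points at `search.json`.

```python
import json

# Assumed value of the `customization_path` notebook parameter.
customization_path = "search.json"

# An array of objects with `pattern` and `color` fields, as expected
# by the JSON Reader search (values here are invented for illustration).
search_entries = [
    {"pattern": "ERROR", "color": "#ff0000"},
    {"pattern": "timeout", "color": "#ffa500"},
]

with open(customization_path, "w") as f:
    json.dump(search_entries, f)
```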
- updated th2-rpt-viewer:
  - added `umask 0007` to the `~/.bashrc` file to provide rw file access for the `users` group
  - added `/file` request for loading the content of a single jsonl file
  - removed ability to get any file from the machine via the `/file` REST API
  - added sorting on requests `/files/notebooks` and `/files/results`
  - added `/files/all` request to list all files in the `/notebooks` and `/results/` directories
  - added `convert_parameter` function for parsing a parameter depending on its type
- update local run with jupyter-notebook:
- updated th2-rpt-viewer:
  - added option to change default view type of result group
  - added display of #display-table field in Table view type
  - added option to view last N results of Notebook
  - added validation of Notebook's parameters
  - added timestamp and file path parameter types
  - fixed clearing of Notebook's parameters on run
  - increased width of parameters' inputs
- updated compose:
  - changed user data access from `ro` to `rw`
- updated th2-rpt-viewer:
  - added `${HOME}/python/lib` into the `PYTHONPATH` environment variable
- update local run with jupyter-notebook:
  - updated jupyter-notebook Dockerfile:
    - used `jupyter/datascience-notebook:python-3.9`
    - defined `PYTHONPATH`, `PIP_TARGET` environment variables
  - updated compose:
    - added `python_lib` volume
- added saving of current tasks
  - a task contains a status (success, failed, in progress) and an id using which the task can be stopped
- added end-point `/stop` for stopping the requested task
- updated end-point `/result`: it now requests a task by id and returns the file, the reason for a failed run, or informs that the task is 'in progress', depending on the task status
- Added `json-stream` user to the `users` group
- Added docker compose for local run
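The task lifecycle above (`/stop` cancels a task by id, `/result` returns the result file, the failure reason, or an 'in progress' status) can be sketched with a minimal URL builder. The end-point names come from the notes above, but the base URL and the `task_id` query-parameter name are assumptions, not part of the documented API.

```python
from urllib.parse import urlencode, urljoin

BASE_URL = "http://localhost:8080/"  # assumption: a local j-sp instance

def stop_url(task_id: str) -> str:
    """URL for the /stop end-point that stops the task with the given id."""
    return urljoin(BASE_URL, "stop") + "?" + urlencode({"task_id": task_id})

def result_url(task_id: str) -> str:
    """URL for the /result end-point that returns the result file,
    the failure reason, or an 'in progress' status for the task."""
    return urljoin(BASE_URL, "result") + "?" + urlencode({"task_id": task_id})

print(stop_url("42"))    # http://localhost:8080/stop?task_id=42
print(result_url("42"))  # http://localhost:8080/result?task_id=42
```

A real client would issue these requests with an HTTP library and branch on the three task statuses (success, failed, in progress) described above.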