Version 0.1.0 (#6)
* Added definitions for data and analysis steps

* Fix misplaced quotation mark in string

* Removed Elephant individual

* Fixed base.owl and added remaining descriptions

* Main ontology source, Base and Parameters modules

* Added missing class to the description in the Base module

* Added label to Parameters module

* Added catalog files for Steps and Data modules

* Source code moved to src and all OWL definitions added

* Added documentation build and test visualization workflow

* Updated README.md

* Fixed documentation build script to enforce output folder

* Proofreading

* Revised normalization and class disjointness

* Revised normalization and class disjointness

* Proofreading

* Cleanup

* Added normalization

* Updated documentation build workflow

* Added authors and updated README

* Top NEAO OWL file importing all modules

* Revised namespaces

* Added WIDOCO configuration files to tracking

* Added WIDOCO configuration files to tracking

* Updated documentation build workflow

* Changed URI in configuration file

* Updated documentation build workflow

* Added missing domain and ranges for data properties

* Auto pruning of dangling docker images

* Fixed ORCID links

* Auto prune of stopped docker containers

* Fixed docker command errors

* Updated missing domains

* Updated ROBOT merge command

* Various items from code review for 0.1.0 release (#5)

Review of version 0.1.0, by mdenker

* Release build
* Parameter editing
* Minor fixes README
* Some changes to the top level classes.
* Refine comment descriptions in parameters.owl
* Updated the descriptions for `LowPassFrequencyCutoff`, `NumberFFTSamples`, and `WindowLengthSamples` for clarity and precision. Additionally, updated the OWL API version metadata to reflect the latest generator version.
* Update rdfs:comments for clarity and modernize OWL API version
* Improved descriptions of `StatisticalAnalysis`, `DispersionStatisticalAnalysis`, and `TriggeredAverageAnalysis` for better clarity. Also updated the OWL API version to the latest.
* Revised various RDF comments to enhance clarity and precision in the descriptions of spike train synchrony analyses and spike waveform analyses. Minor wording changes aim to improve the readability and accuracy of definitions.
* Simplify rdfs:comment in multiple owl classes
* Corrected grammar and wording in rdfs:comment for several classes to improve readability and clarity. This includes minor changes such as "described by" instead of "described in" and adjusted verb forms. These changes make the descriptions more precise and easier to understand.
* Fixed base.owl with proposed changes

* Added missing descriptions to parameters

* Added LICENSE.txt file

---------

Co-authored-by: Cristiano Köhler <[email protected]>
Co-authored-by: Michael Denker <[email protected]>
3 people authored Dec 6, 2024
1 parent a229e59 commit 0f637c2
Showing 42 changed files with 7,902 additions and 3 deletions.
4 changes: 3 additions & 1 deletion .gitignore
@@ -1 +1,3 @@
*.properties
src/**/*.properties
doc/docker/doc/

14 changes: 14 additions & 0 deletions AUTHORS.md
@@ -0,0 +1,14 @@
# Authors and Contributors

The following people have contributed code and/or ideas to the current version of the Neuroelectrophysiology Analysis Ontology.

* Cristiano Köhler (orcid: 0000-0003-0503-5264) [1, 2]
* Michael Denker (orcid: 0000-0003-1255-7300) [1]

1. Institute for Advanced Simulation (IAS-6), Jülich Research Centre, Jülich, Germany
2. RWTH Aachen University, Aachen, Germany

# Contact data

If you have questions regarding the content of this repository, please send an
e-mail to [[email protected]](mailto:[email protected]).
396 changes: 396 additions & 0 deletions LICENSE.txt


67 changes: 65 additions & 2 deletions README.md
@@ -1,6 +1,69 @@
# Neuroelectrophysiology Analysis Ontology (NEAO)


The NEAO is a collection of controlled vocabularies and concepts to describe the typical processes involved in the analysis of neural activity data acquired using electrophysiology techniques.

The goal of NEAO is to provide a common framework for annotating neuroelectrophysiology data analysis workflows. This results in the unambiguous identification of the processes and data elements involved, and provides extended semantic information that can be used to reason about the analyses.

The published ontology is accessible through the main namespace http://purl.org/neao. This repository contains the sources and centralizes the development efforts within the community.

For requests or additions to NEAO, please first open an issue in the [issue tracker](https://github.com/INM-6/neuroephys_analysis_ontology/issues).

## Table of contents

- [Code repository](#code-repository)
- [Acknowledgments](#acknowledgments)
- [License](#license)

## Code repository

### OWL sources

NEAO was developed using [Protégé 5.5](https://protege.stanford.edu/software.php).

The ontology source code (OWL serialized as Turtle) is organized into subfolders inside the `/src` folder:

- `base`: the top-level classes of the NEAO model that are imported by other modules. It defines the three main classes, the software implementation description, and all the related properties.
- `steps`: module to extend the **AnalysisStep** base class in order to define the specific analysis steps and their semantic groupings.
- `data`: module to extend the **Data** base class in order to define the specific data entities and their semantic groupings.
- `parameters`: module to extend the **AnalysisParameter** base class in order to define the specific parameters and their semantic groupings.
- `bibliography`: module that defines individuals with the bibliographic references used to annotate **AnalysisStep** classes.

Each folder contains the OWL file with the source and an XML catalog file that is used to resolve local imports correctly when loading into Protégé. A top-level OWL file `neao.owl` is contained in `/src` and imports all the submodules to produce the final NEAO OWL source. This file should be edited only for the NEAO ontology metadata.

### Documentation

The documentation source is found in the `/doc` folder:

- The HTML pages are built using [WIDOCO](https://github.com/dgarijo/Widoco). The build relies on the WIDOCO JAR release, which must be downloaded to the local system ([available here](https://github.com/dgarijo/WIDOCO/releases/latest)). The recommended path is `~/opt/widoco/widoco.jar`.
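
A minimal download sketch (the release version and asset name below are illustrative assumptions; check the releases page for the current ones):

```bash
# Download the WIDOCO release JAR to the recommended path. The version and
# asset name are assumptions -- verify them on the WIDOCO releases page.
mkdir -p ~/opt/widoco
curl -L -o ~/opt/widoco/widoco.jar \
    "https://github.com/dgarijo/Widoco/releases/download/v1.4.25/widoco-1.4.25-jar-with-dependencies.jar"
java -jar ~/opt/widoco/widoco.jar --version   # sanity check
```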

For the documentation build, `xml_grep` from [xmltwig](https://github.com/mirod/xmltwig/tree/master) is needed to post-process the generated files. On Ubuntu or other Debian-based Linux distributions, it can be installed using the package manager:

`apt install xml-twig-tools`

Also, as part of the documentation and release process, the code from all modules is merged into a single OWL source using [ROBOT](https://robot.obolibrary.org/). ROBOT must be installed and accessible in the shell, i.e., by running the `robot` command ('Getting Started' instructions [here](https://robot.obolibrary.org/)). The recommended path for the JAR file and shell script is `/usr/local/bin` (but you can also add the folder containing the ROBOT files to the `PATH`).
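
For reference, a hedged install sketch following the ROBOT 'Getting Started' steps (the download URLs are assumptions based on the ontodev/robot GitHub project; verify them before use):

```bash
# Install the ROBOT wrapper script and JAR into /usr/local/bin so the
# `robot` command resolves on PATH (URLs are assumptions -- verify first).
sudo curl -L -o /usr/local/bin/robot \
    "https://raw.githubusercontent.com/ontodev/robot/master/bin/robot"
sudo curl -L -o /usr/local/bin/robot.jar \
    "https://github.com/ontodev/robot/releases/latest/download/robot.jar"
sudo chmod +x /usr/local/bin/robot
robot --version
```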

The complete build is done by running the `build_doc.sh` bash script. The version must be passed as a script argument:

`./build_doc.sh x.x.x`

If the WIDOCO JAR is stored in a path other than `~/opt/widoco/widoco.jar`, it must be passed as a second argument:

`./build_doc.sh x.x.x local/path/to/widoco.jar`

- The subfolder `/doc/source` contains the configuration files, images, and plain HTML files used with WIDOCO. The documentation sources are organized per ontology submodule, as WIDOCO generates a separate documentation page for each one. After WIDOCO builds the standard documentation page, the generated HTML files are edited to remove sections that are not needed, and the custom HTML text (for the Introduction, Description, and Acknowledgements sections) is inserted.

- The subfolder `/doc/releases` contains the built HTML documentation and final OWL code of each version of NEAO released. Each release is stored in a subfolder named after the version (as defined when calling the `build_doc.sh` script).

- To visualize the documentation after the build, the generated files can be copied into an Apache Docker image (this is required for the [WebVOWL](http://vowl.visualdataweb.org/webvowl.html) visualizations). The Docker files are located in the `/doc/docker` subfolder. Docker must be installed and running on your system ([instructions here](https://docs.docker.com/engine/install/)).

First, the image must be built with the `build.sh` bash script, which bundles the latest documentation files from `/doc/releases`.

After a successful build, the local server can be started by running the image with `run.sh`. The documentation will be accessible at [http://localhost:8080](http://localhost:8080).
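
In summary, the full visualization workflow looks like this (starting from the repository root):

```bash
# Build the documentation for a release, then serve it locally via Apache.
cd doc
./build_doc.sh 0.1.0    # outputs to ./releases/0.1.0
cd docker
./build.sh              # bundles the latest release into the neao_doc image
./run.sh                # starts the container; open http://localhost:8080
```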

## Acknowledgments

This work was performed as part of the [Helmholtz School for Data Science in Life, Earth and Energy (HDS-LEE)](https://hds-lee.de) and received funding from the Helmholtz Association of German Research Centres. This project has received funding from the European Union’s Horizon 2020 Framework Programme for Research and Innovation under Specific Grant Agreement No. 945539 (Human Brain Project SGA3), the European Union’s Horizon Europe Programme under the Specific Grant Agreement No. 101147319 ([EBRAINS](https://ebrains.eu) 2.0 Project), the Ministry of Culture and Science of the State of North Rhine-Westphalia, Germany (NRW-network "[iBehave](https://ibehave.nrw)", grant number: NW21-049), and the Joint Lab "Supercomputing and Modeling for the Human Brain."

## License

CC BY 4.0 (see [LICENSE.txt](LICENSE.txt))
215 changes: 215 additions & 0 deletions doc/build_doc.sh
@@ -0,0 +1,215 @@
#!/bin/bash
set -e

log_message () {
printf "\n\n[$(date)] $1\n\n"
}


insert_html () {
DEST_FILE=$1/index-en.html
DOC_FILES=$2
log_message "Inserting custom HTML text in '$DOC_FILES' into '$DEST_FILE'"
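# WIDOCO leaves the literal tokens "introduction.html" and "description.html" in the
# generated page; replace each token with the contents of the matching custom HTML
# fragment. For the acknowledgements, RS is set to a string that never occurs, so
# gawk reads the whole file as one record and can match the multi-line default
# WIDOCO paragraph (note: this fragment comes from the global $DOC_SRC_FOLDER).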
gawk -i inplace -v r="$(cat $DOC_FILES/introduction.html)" '{gsub(/introduction.html/,r)}1' $DEST_FILE
gawk -i inplace -v r="$(cat $DOC_FILES/description.html)" '{gsub(/description.html/,r)}1' $DEST_FILE
gawk -i inplace -v RS='{ "is multiline" }' -v r="<span class=\"markdown\">$(cat $DOC_SRC_FOLDER/acknowledgements.html)</span>" '{gsub(/(<p>\nThe authors would like to thank.*documentation\.<\/p>)/,r)}1' $DEST_FILE
}


clean_section () {
DEST_FILE=$1/index-en.html
log_message "Cleaning '$2' from '$DEST_FILE'"
CONDITION="div[@id=\"$2\"]"
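# Drop the whole <div> with the given id from the page; xml_grep writes the
# filtered document to a temporary file that then replaces the original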
xml_grep --html --exclude "$CONDITION" $DEST_FILE > $DEST_FILE.tmp
mv $DEST_FILE.tmp $DEST_FILE
}


clean_annotation () {
DEST_FILE=$1
log_message "Cleaning information for annotation property '$2' from '$DEST_FILE'"

# List headers
CONDITION="a[@title=\"$2\"]"
xml_grep --html --exclude "$CONDITION" $DEST_FILE > $DEST_FILE.tmp
mv $DEST_FILE.tmp $DEST_FILE

# Property panel
CONDITION="div[@id=\"$2\"]"
xml_grep --html --root 'div[@id="annotationproperties"]' --exclude "$CONDITION" $DEST_FILE > $DEST_FILE.tmp
mv $DEST_FILE.tmp $DEST_FILE
}

clean_annotation_properties () {
DEST_FILE=$1/index-en.html
echo "Cleaning annotation_properties from '$DEST_FILE'"
clean_annotation "$DEST_FILE" "http://purl.org/dc/terms/creator"
clean_annotation "$DEST_FILE" "http://purl.org/dc/terms/created"
clean_annotation "$DEST_FILE" "http://purl.org/dc/terms/license"
clean_annotation "$DEST_FILE" "http://www.w3.org/2004/02/skos/core#prefLabel"
clean_annotation "$DEST_FILE" "http://purl.org/vocab/vann/preferredNamespacePrefix"
clean_annotation "$DEST_FILE" "http://purl.org/vocab/vann/preferredNamespaceUri"
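# Remove the empty <li> entries left behind by the exclusions above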
gawk -i inplace -v RS='{ "is multiline" }' -v r="" '{gsub(/(<li>\n\s+\n\s+<\/li>)/,r)}1' $DEST_FILE
}


rename_files () {
BASE_FOLDER=$1
mv $BASE_FOLDER/index-en.html $BASE_FOLDER/index.html
mv $BASE_FOLDER/ontology.ttl $BASE_FOLDER/$2.ttl
mv $BASE_FOLDER/ontology.nt $BASE_FOLDER/$2.nt
mv $BASE_FOLDER/ontology.owl $BASE_FOLDER/$2.xml
mv $BASE_FOLDER/ontology.jsonld $BASE_FOLDER/$2.jsonld
}

remove_files () {
BASE_FOLDER=$1
rm $BASE_FOLDER/$2.ttl
rm $BASE_FOLDER/$2.nt
rm $BASE_FOLDER/$2.xml
rm $BASE_FOLDER/$2.jsonld
}

# This script requires xml_grep
if ! command -v xml_grep &> /dev/null
then
printf "\nxml_grep is needed. Please install it using your package manager (e.g., 'apt install xml-twig-tools').\n\n"
exit 1
fi

# This script requires ROBOT
if ! command -v robot &> /dev/null
then
printf "\nROBOT is needed. Please install it from 'https://robot.obolibrary.org'.\n\n"
exit 1
fi

# Get release version from command line
CURRENT_RELEASE=$1
if [ -z "$CURRENT_RELEASE" ]
then
printf "\nProvide the current release version to identify which documentation will be built.\n"
printf "The files of the requested release will be built and stored in './releases/<version>'.\n\n"
exit 1
fi

# Get WIDOCO JAR path. It can be passed by command line as the second argument
# Otherwise it is assumed to be stored in ~/opt/widoco/widoco.jar
WIDOCO_JAR=$2
if [ -z "$WIDOCO_JAR" ]
then
WIDOCO_JAR="$HOME/opt/widoco/widoco.jar"
fi

# Check if WIDOCO is working
if ! java -jar $WIDOCO_JAR --version
then
printf "\nWIDOCO JAR file '$WIDOCO_JAR' not found or does not run.\n\n"
printf "1. Please, provide the path to the JAR file via the CLI:\n\n"
printf " build_doc.sh <build version> <path to widoco jar>\n\n"
printf "2. Check that Java is configured correctly.\n\n"
exit 1
fi

# Create release folder if needed
mkdir -p ./releases

# Define folders
DOC_FOLDER=$(realpath ./releases/$CURRENT_RELEASE)
SRC_FOLDER=$(realpath ../src)
DOC_SRC_FOLDER=$(realpath ./source)
log_message "Using NEAO sources in '$SRC_FOLDER'"
log_message "Using documentation configuration and source files in '$DOC_SRC_FOLDER'"

# Clean current build
log_message "Cleaning any previous build in '$DOC_FOLDER'"
rm -rf $DOC_FOLDER
mkdir -p $DOC_FOLDER

NEAO_MERGED_SRC=$SRC_FOLDER/neao_merge.owl
rm -f $NEAO_MERGED_SRC

# Build merged source with ROBOT
log_message "Generating merged OWL source"
robot merge --input $SRC_FOLDER/neao.owl --collapse-import-closure true --output $NEAO_MERGED_SRC

# Run WIDOCO to build the documentation
# A WIDOCO run is done for each module
# The HTML output can be post-processed to tweak the visualization

# Main page
log_message "Building main documentation page"
java -jar $WIDOCO_JAR -ontFile $NEAO_MERGED_SRC -outFolder $DOC_FOLDER -uniteSections -rewriteAll -confFile $DOC_SRC_FOLDER/config.properties
mv $DOC_FOLDER/doc/* $DOC_FOLDER # Docs are actually put into a subfolder
rmdir $DOC_FOLDER/doc
clean_section "$DOC_FOLDER" "abstract"
clean_section "$DOC_FOLDER" "namespacedeclarations"
clean_section "$DOC_FOLDER" "overview"
clean_section "$DOC_FOLDER" "crossref"
clean_section "$DOC_FOLDER" "references"
insert_html "$DOC_FOLDER" "$DOC_SRC_FOLDER"
rename_files "$DOC_FOLDER" "neao"


# Base module
log_message "Building base module documentation"
java -jar $WIDOCO_JAR -ontFile $SRC_FOLDER/base/base.owl -outFolder $DOC_FOLDER/base -webVowl -uniteSections -rewriteAll -includeAnnotationProperties -doNotDisplaySerializations -confFile $DOC_SRC_FOLDER/base/config-base.properties
clean_section "$DOC_FOLDER/base" "abstract"
clean_section "$DOC_FOLDER/base" "references"
clean_annotation_properties "$DOC_FOLDER/base"
insert_html "$DOC_FOLDER/base" "$DOC_SRC_FOLDER/base"
rename_files "$DOC_FOLDER/base" "base"
remove_files "$DOC_FOLDER/base" "base"


# Steps module
log_message "Building steps module documentation"
java -jar $WIDOCO_JAR -ontFile $SRC_FOLDER/steps/steps.owl -outFolder $DOC_FOLDER/steps -webVowl -uniteSections -rewriteAll -ignoreIndividuals -doNotDisplaySerializations -confFile $DOC_SRC_FOLDER/steps/config-steps.properties
clean_section "$DOC_FOLDER/steps" "abstract"
clean_section "$DOC_FOLDER/steps" "references"
insert_html "$DOC_FOLDER/steps" "$DOC_SRC_FOLDER/steps"
rename_files "$DOC_FOLDER/steps" "steps"
remove_files "$DOC_FOLDER/steps" "steps"


# Data module
log_message "Building data module documentation"
java -jar $WIDOCO_JAR -ontFile $SRC_FOLDER/data/data.owl -outFolder $DOC_FOLDER/data -webVowl -uniteSections -rewriteAll -ignoreIndividuals -doNotDisplaySerializations -confFile $DOC_SRC_FOLDER/data/config-data.properties
clean_section "$DOC_FOLDER/data" "abstract"
clean_section "$DOC_FOLDER/data" "references"
insert_html "$DOC_FOLDER/data" "$DOC_SRC_FOLDER/data"
rename_files "$DOC_FOLDER/data" "data"
remove_files "$DOC_FOLDER/data" "data"


# Parameters module
log_message "Building parameters module documentation"
java -jar $WIDOCO_JAR -ontFile $SRC_FOLDER/parameters/parameters.owl -outFolder $DOC_FOLDER/parameters -webVowl -uniteSections -ignoreIndividuals -rewriteAll -doNotDisplaySerializations -confFile $DOC_SRC_FOLDER/parameters/config-parameters.properties
clean_section "$DOC_FOLDER/parameters" "abstract"
clean_section "$DOC_FOLDER/parameters" "references"
insert_html "$DOC_FOLDER/parameters" "$DOC_SRC_FOLDER/parameters"
rename_files "$DOC_FOLDER/parameters" "parameters"
remove_files "$DOC_FOLDER/parameters" "parameters"


# Bibliography module
log_message "Building bibliography module documentation"
java -jar $WIDOCO_JAR -ontFile $SRC_FOLDER/bibliography/bibliography.owl -outFolder $DOC_FOLDER/bibliography -uniteSections -rewriteAll -doNotDisplaySerializations -confFile $DOC_SRC_FOLDER/bibliography/config-bibliography.properties
clean_section "$DOC_FOLDER/bibliography" "abstract"
clean_section "$DOC_FOLDER/bibliography" "references"
insert_html "$DOC_FOLDER/bibliography" "$DOC_SRC_FOLDER/bibliography"
rename_files "$DOC_FOLDER/bibliography" "bibliography"
remove_files "$DOC_FOLDER/bibliography" "bibliography"


# Copy images
log_message "Copying images"
cp -r $DOC_SRC_FOLDER/images $DOC_FOLDER/images


# Cleanup
log_message "Removing temporary and readme files"
rm -rf ./tmp*
find $DOC_FOLDER -name 'readme.md' -delete

log_message "Documentation for $CURRENT_RELEASE built!"
2 changes: 2 additions & 0 deletions doc/docker/Dockerfile
@@ -0,0 +1,2 @@
FROM httpd:2.4
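# Serve the documentation bundle (copied in by build.sh) from Apache's default docroot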
COPY ./doc/ /usr/local/apache2/htdocs/
26 changes: 26 additions & 0 deletions doc/docker/build.sh
@@ -0,0 +1,26 @@
#!/bin/bash

RELEASE_PATH=$(realpath ../releases/*)

# Get latest version
if ! ls -ad $RELEASE_PATH
then
printf "\n\nNo documentation found in '$RELEASE_PATH'\n"
printf "Please, build the documentation first with $(realpath ../build_doc.sh)\n\n"
exit 1
fi
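# Note: plain 'ls' sorts lexicographically (e.g., 0.10.0 sorts before 0.9.0);
# pipe through 'sort -V' instead if strict version ordering is ever needed.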
LATEST_VERSION=$(ls -ad $RELEASE_PATH | tail -n 1)

# Copy updated documentation files
rm -rf ./doc
cp -r $LATEST_VERSION ./doc

# Remove any running container
docker stop $(docker ps -q --filter "name=neao_doc") >/dev/null 2>&1
docker container rm $(docker ps -a -q --filter "name=neao_doc") >/dev/null 2>&1

# Build docker image with Apache and documentation
docker build --no-cache -t neao_doc .

# Prune old images
docker image prune -f
10 changes: 10 additions & 0 deletions doc/docker/run.sh
@@ -0,0 +1,10 @@
#!/bin/bash

# Remove any running container
docker stop $(docker ps -q --filter "name=neao_doc") >/dev/null 2>&1
docker container rm $(docker ps -a -q --filter "name=neao_doc") >/dev/null 2>&1

# Run image
docker run -dit --name neao_doc -p 8080:80 neao_doc

echo "open http://localhost:8080/ for visualizing the documentation"
3 changes: 3 additions & 0 deletions doc/source/acknowledgements.html
@@ -0,0 +1,3 @@
<p>This work was performed as part of the <a href="https://hds-lee.de">Helmholtz School for Data Science in Life, Earth and Energy (HDS-LEE)</a> and received funding from the Helmholtz Association of German Research Centres. This project has received funding from the European Union’s Horizon 2020 Framework Programme for Research and Innovation under Specific Grant Agreement No. 945539 (Human Brain Project SGA3), the European Union’s Horizon Europe Programme under the Specific Grant Agreement No. 101147319 (<a href="https://ebrains.eu">EBRAINS</a> 2.0 Project), the Ministry of Culture and Science of the State of North Rhine-Westphalia, Germany (NRW-network "<a href="https://ibehave.nrw">iBehave</a>", grant number: NW21-049), and the Joint Lab "Supercomputing and Modeling for the Human Brain."</p>

<p>The authors would like to thank <a href="http://www.essepuntato.it/">Silvio Peroni</a> for developing <a href="http://www.essepuntato.it/lode">LODE</a>, a Live OWL Documentation Environment, which is used for representing the Cross Referencing Section of this document and <a href="https://w3id.org/people/dgarijo">Daniel Garijo</a> for developing <a href="https://github.com/dgarijo/Widoco">Widoco</a>, the program used to create the template used in this documentation.</p>