Commit
add akm_tools, tests, and a simple github workflow config
MohitYadav-codes committed Apr 12, 2024
1 parent 93889ec commit 1028acb
Showing 26 changed files with 1,797 additions and 0 deletions.
29 changes: 29 additions & 0 deletions .devcontainer/Dockerfile
@@ -0,0 +1,29 @@
# Use the official lightweight Python image.
# https://hub.docker.com/_/python
FROM python:3.10.13 AS base

# Install system dependencies required for Poetry
RUN apt-get update \
&& apt-get install -y curl git \
# && apt-get install -y curl build-essential git \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*

# Install Poetry using recommended installer script
ENV POETRY_VERSION=1.7.0 \
# Install Poetry globally
POETRY_HOME="/usr/local" \
POETRY_NO_INTERACTION=1 \
# Ensure that the poetry path is in the PATH
PATH="/usr/local/bin:$PATH"

# Install Poetry - respects $POETRY_VERSION
RUN curl -sSL https://install.python-poetry.org | python3 -

# Create a non-root user and switch to it
RUN useradd --create-home akm_user

# Switch to the non-root user
USER akm_user

CMD ["bash"]
16 changes: 16 additions & 0 deletions .devcontainer/devcontainer.json
@@ -0,0 +1,16 @@
{
"name": "akm-develop",
// "context" is the path that the Codespaces docker build command should be run from, relative to devcontainer.json
"context": "..",
"dockerFile": "Dockerfile",

"customizations": {
"vscode": {
"extensions": ["ms-python.python","redhat.vscode-yaml","ms-vscode.makefile-tools"],
"settings": {
"terminal.integrated.defaultProfile.linux": "bash"
}
}
},
"postCreateCommand": "poetry install && poetry shell"
}
35 changes: 35 additions & 0 deletions .github/workflows/run_pytest.yaml
@@ -0,0 +1,35 @@
name: Pytest Workflow

# Run the workflow on every pull request and on pushes to any branch
on:
pull_request:
push:
branches:
- "**"

jobs:
test:
name: Run Pytest
runs-on: ubuntu-latest

steps:
- name: Checkout repository
uses: actions/checkout@v4

- name: setup python
uses: actions/setup-python@v5
with:
python-version: '3.10.13'

- name: install poetry
run: |
curl -sSL https://install.python-poetry.org | python3 -
- name: install packages
run: |
poetry config virtualenvs.in-project true
poetry install
- name: run pytest
run: |
poetry run pytest tests
159 changes: 159 additions & 0 deletions .gitignore
@@ -0,0 +1,159 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
*,cover

# Translations
*.mo
*.pot

# Django stuff:
*.log
*.log.*
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# Shell script unit test framework
scripts/shunit2

# Sonarqube
.scannerwork/

# Other
_deps
.pytype
.vagrant
site/
.idea
.env-vlab*
.benchmarks
.conan-config
.vscode
mosquitto.conf

# local dbs
/*.db

## local temp files and logs on root level
/*.xlsx
/*.sql
/*.json
/*.log
/*.yaml
17 changes: 17 additions & 0 deletions README.md
@@ -1 +1,18 @@
# Automotive Knowledge Model (AKM)
This repo contains the Automotive Knowledge Model (AKM), an open-source data model and metadata catalog for transmitting vehicle signals in a consistent manner. See `overview.md` in the `documentation` folder for more information; that folder also contains markdown files covering principles and usage guidelines.

The repo has four subfolders:

- **/schema** contains the JSON Schema file(s) that provide the structure and meaning of the automotive metadata files
- **/data** contains the JSON documents that contain the actual automotive metadata
- **/documentation** contains markdown files that explain aspects of the AKM
- **/rdf** contains a turtle file that expresses the structure and metadata in an ontology

The repo is currently in an alpha release state and should be considered a **DRAFT**. It requires the following work to make it generally available:

- The format and structure of the JSON Schema document requires testing, restructuring, and other quality reviews.
- The tooling that converts VSS to other formats must be available to the AKM. (The AKM tools should be simpler because of the many existing JSON Schema libraries, tooling, etc.)
- The processing of VSS overlays should be supported.
- The effect of exposing a DAG that is not necessarily the VSS tree on [VISS](https://www.w3.org/TR/viss2-core/) should be assessed.
- The structure of the data subfolders should be appraised.
- The generation of JSON data documents from RDF and vice versa should be developed.
37 changes: 37 additions & 0 deletions akm_tools/README.md
@@ -0,0 +1,37 @@
# AKM Tools

This project provides a set of tools for parsing, validating, and exporting Automotive Knowledge Model (AKM) data. It reads JSON files from specified directories, validates them against given schemas, and exports the validated data in several formats, including JSON and YAML. The functionality is encapsulated in a Python script that can be run from the command line, which makes it easy to automate and to integrate into larger systems or workflows.

## Features

- **Data Validation**: Validate the combined data against a provided schema and optional extended schemas.
- **Data Exporting**: Export the validated data into various formats such as JSON and YAML. Support for GraphQL export is planned but not yet implemented.
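
The parser's internal API is not shown in this commit, but the "combine JSON files from a directory" step it describes can be sketched with the standard library alone. The function name `load_model_data` is illustrative, not the tool's actual API:

```python
import json
from pathlib import Path


def load_model_data(folder):
    """Combine every JSON document found directly under `folder` into one list.

    Each file may contain either a single object or a list of objects;
    both cases are flattened into the combined result.
    """
    combined = []
    for path in sorted(Path(folder).glob("*.json")):
        with open(path, encoding="utf-8") as fh:
            doc = json.load(fh)
        combined.extend(doc if isinstance(doc, list) else [doc])
    return combined
```

Sorting the paths keeps the combined order deterministic across runs, which makes validation errors reproducible.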


## Usage

The main functionality is accessed through the command line interface (CLI) provided by `akm_parser.py`. Below are the available options and their descriptions:

### Command Line Arguments

- `-d`, `--model_data_folder`: Specifies the directory containing AKM model data in JSON format. Default is `akm/data`.
- `-s`, `--schema`: Specifies the schema file against which the data will be validated. Default is `akm/schema/automotive_knowledge_model.json`.
- `-xs`, `--extended_schema_dir`: Specifies the directory containing extended schema files for validation. Default is `extensions/schema`.
- `-xd`, `--extended_data_dir`: Specifies the directory containing extended data. Default is `extensions/data`.
- `-e`, `--export_format`: Specifies the format for exporting validated data. Options are `json`, `yaml`, and `graphql`.
- `-f`, `--export_file_path`: Specifies the path for the export file. Required if `--export_format` is specified.

### Example Commands

Validate data without exporting:
```
python akm_tools/akm_parser.py -xd your_extended_data_folder
```

Export validated data to JSON:
```
python akm_tools/akm_parser.py -d your_model_data_folder -e json -f path/to/export.json
```
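
The exporter's internals are not part of this diff; conceptually, the `-e`/`-f` pair maps to a dispatch on the format name, roughly like the sketch below. The function name and signature are illustrative, and the YAML branch assumes PyYAML is installed:

```python
import json


def export_data(data, export_format, export_file_path):
    """Write validated data to `export_file_path` in the requested format."""
    if export_format == "json":
        with open(export_file_path, "w", encoding="utf-8") as fh:
            json.dump(data, fh, indent=2)
    elif export_format == "yaml":
        # PyYAML is imported lazily so JSON-only use has no extra dependency.
        import yaml
        with open(export_file_path, "w", encoding="utf-8") as fh:
            yaml.safe_dump(data, fh, sort_keys=False)
    else:
        # GraphQL export is planned but not yet implemented.
        raise ValueError(f"unsupported export format: {export_format}")
```

Raising on unknown formats mirrors the CLI behavior one would expect: an unsupported `-e` value fails loudly instead of silently writing nothing.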
### Logging
Validation errors are logged to `validation_errors.log`, which aids troubleshooting and helps ensure data quality.
Empty file added akm_tools/__init__.py
Empty file.