
Commit e39aa47

Merge remote-tracking branch 'interuss/main' into fix-interuss#244

mickmis committed Oct 20, 2023 · 2 parents 2d94d2e + ba6597c
Showing 235 changed files with 6,012 additions and 2,341 deletions.
50 changes: 37 additions & 13 deletions .github/workflows/ci.yml
@@ -18,7 +18,7 @@ jobs:
echo "Branch: ${{ github.ref }}"
docker images
- name: Checkout
uses: actions/checkout@v2
uses: actions/checkout@v4
with:
submodules: true
- name: Python lint
@@ -60,18 +60,6 @@ jobs:
cd monitoring/uss_qualifier
make test
uss_qualifier_F3411-19-test:
name: uss_qualifier F3411-19 tests
uses: ./.github/workflows/monitoring-test.yml
with:
name: uss_qualifier_F3411-19
script: |
export CONFIG_NAME=configurations.dev.netrid_v19 \
USS_QUALIFIER_STOP_FAST=true
cd monitoring/uss_qualifier
make test
prober-test:
name: prober tests
uses: ./.github/workflows/monitoring-test.yml
@@ -80,3 +68,39 @@
script: |
cd monitoring/prober
make test
publish-gh-pages:
name: Publish GitHub Pages
needs: [hygiene-tests, monitorlib-test, mock_uss-test, uss_qualifier-test, prober-test]
if: ${{ always() && contains(join(needs.*.result, ','), 'success') }}
runs-on: ubuntu-latest
permissions:
contents: write
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
submodules: true
path: monitoring

- name: Get uss_qualifier reports
uses: actions/download-artifact@v3
with:
name: monitoring-test-uss_qualifier-reports
path: ./artifacts

- name: Make site content
run: ./monitoring/github_pages/make_site_content.sh

- name: Deploy
uses: peaceiris/actions-gh-pages@v3
if: github.ref == 'refs/heads/main'
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./public
enable_jekyll: true
user_name: 'github-actions[bot]'
user_email: 'github-actions[bot]@users.noreply.github.com'
commit_message: ${{ github.event.head_commit.message }}
8 changes: 4 additions & 4 deletions .github/workflows/dev-checks.yml
@@ -9,20 +9,20 @@ jobs:
name: Clone on Windows
runs-on: windows-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Checkout on Windows
run: echo "Project successfully cloned on ${{ runner.os }}. See `Set up Job` stage for more details about the Runner."
macos:
name: Clone on Mac
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Checkout on Mac
run: echo "Project successfully cloned on ${{ runner.os }}. See `Set up Job` stage for more details about the Runner."
ubuntu:
name: Clone on Ubuntu
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Checkout on Ubuntu
run: echo "Project successfully cloned on ${{ runner.os }}. See `Set up Job` stage for more details about the Runner."
run: echo "Project successfully cloned on ${{ runner.os }}. See `Set up Job` stage for more details about the Runner."
2 changes: 1 addition & 1 deletion .github/workflows/image-publish.yml
@@ -31,7 +31,7 @@ jobs:
docker images
- name: Checkout
uses: actions/checkout@v2
uses: actions/checkout@v4
with:
fetch-depth: 0

2 changes: 1 addition & 1 deletion .github/workflows/monitoring-test.yml
@@ -26,7 +26,7 @@ jobs:
echo "Branch: ${{ github.ref }}"
docker images
- name: Checkout
uses: actions/checkout@v2
uses: actions/checkout@v4
with:
submodules: true
- name: Run ${{ inputs.name }} test
4 changes: 4 additions & 0 deletions CONTRIBUTING.md
@@ -4,6 +4,10 @@ Welcome to this repository and thank you for your interest in contributing to it

Contributions should follow [the general InterUSS contributions process](https://github.com/interuss/tsc/blob/main/repo_contributions.md). Additional information specific to this repository is provided below.

## Formatting and verification

This repository has a very strict Python linter, as well as strict expected formats for a number of other artifacts such as Markdown files. Correct formatting can be verified with `make lint` from the repository root. In most cases, however, manual fixes are not necessary: `make format` from the repository root automatically reformats Python and resolves most other purely-formatting issues without changing functionality. Because `make lint` is part of the integration tests, `make format` should generally be run before running the integration tests.
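For example, a typical pass from the repository root might look like this (a minimal sketch; `make format` and `make lint` are the targets described above):

```bash
# Auto-format Python and most other formatting issues (no functional changes):
make format

# Verify formatting; this same check runs as part of the integration tests:
make lint
```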

## Integration tests

When [a PR is created](https://github.com/interuss/tsc/blob/main/repo_contributions.md#create-draft-pr-in-interuss-repository), the [continuous integration (CI) tests for this repository](./.github/workflows/CI.md) will run, and the PR will generally not be reviewed until they pass (unless [committer help is requested](https://github.com/interuss/tsc/blob/main/repo_contributions.md#request-committer-help-via-comment-in-pr) to address the failure). See [the continuous integration test documentation](./.github/workflows/CI.md) for how to run these tests on your local system more quickly and efficiently to be confident your PR will pass the CI tests when created (or when updates are made).
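As a rough local sketch of what the CI runs (mirroring the `ci.yml` steps shown earlier in this diff; the configuration name below is taken from that workflow excerpt purely as an illustration, and the CI documentation remains the authoritative reference):

```bash
# Run a uss_qualifier configuration the same way the CI job does:
export CONFIG_NAME=configurations.dev.netrid_v19 \
       USS_QUALIFIER_STOP_FAST=true
cd monitoring/uss_qualifier
make test

# Run the prober (DSS integration) tests:
cd ../prober
make test
```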
5 changes: 2 additions & 3 deletions Makefile
@@ -14,7 +14,7 @@ format: json-schema
cd monitoring && make format

.PHONY: lint
lint:
lint: shell-lint
cd monitoring && make lint
cd schemas && make lint

@@ -35,8 +35,7 @@ validate-uss-qualifier-docs:

.PHONY: shell-lint
shell-lint:
echo "===== Checking DSS shell lint except monitoring =====" && find . -name '*.sh' | grep -v '^./interfaces/astm-utm' | grep -v '^./monitoring' | xargs docker run --rm -v "$(CURDIR):/monitoring" -w /monitoring koalaman/shellcheck
cd monitoring && make shell-lint
find . -name '*.sh' | grep -v '^./interfaces' | xargs docker run --rm -v "$(CURDIR):/monitoring" -w /monitoring koalaman/shellcheck

.PHONY: json-schema
json-schema:
9 changes: 7 additions & 2 deletions README.md
@@ -1,4 +1,4 @@
# Monitoring Tools [![GoDoc](https://godoc.org/github.com/interuss/monitoring?status.svg)](https://godoc.org/github.com/interuss/monitoring)
# Monitoring Tools

<img src="assets/color_logo_transparent.png" width="200">

@@ -15,6 +15,11 @@ The monitoring tools target compliance with the following standards and regulati
- [ASTM F3548-21](https://www.astm.org/f3548-21.html): UAS Traffic Management (UTM) UAS
Service Supplier (USS) Interoperability Specification.
- [F3548-22 OpenAPI interface](./interfaces/astm-utm)
- Useful resources for understanding this standard include these Drone Talk videos:
- [Interoperability standard](https://www.youtube.com/watch?v=ukbjIU_Ojh0)
- [Interoperability standard, part 2](https://www.youtube.com/watch?v=qKW2PkzZ_mE)
- [DSS and ASTM UTM interoperability paradigm](https://youtu.be/Nh53ibxcnBM)
- [Operational intents](https://www.youtube.com/watch?v=lS6tTQTmVO4)

U-Space specific:
- [COMMISSION IMPLEMENTING REGULATION (EU) 2021/664](https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32021R0664&from=EN#d1e32-178-1)
@@ -26,7 +31,7 @@ services such as Remote ID (ASTM F3411-19/22) and Strategic Conflict Detection d
Management (UTM) UAS Service Supplier (USS) Interoperability Specification.

- [Introduction to monitoring, conformance and interoperability testing](./monitoring/README.md)<br>Modules:
- [USS qualifier](./monitoring/uss_qualifier)
- [USS qualifier](./monitoring/uss_qualifier) (automated testing framework)
- [DSS integration test: prober](./monitoring/prober)
- [DSS load test](./monitoring/loadtest)
- [Mock USS](./monitoring/mock_uss), with multiple capabilities
4 changes: 3 additions & 1 deletion build/dev/extract_json_field.py
@@ -18,5 +18,7 @@
try:
obj = obj[field]
except KeyError:
raise ValueError(f"Could not find field '{field}' in '{sys.argv[1]}' for {sys.argv[2]}; available keys: {list(obj.keys())}")
raise ValueError(
f"Could not find field '{field}' in '{sys.argv[1]}' for {sys.argv[2]}; available keys: {list(obj.keys())}"
)
print(obj)
3 changes: 3 additions & 0 deletions github_pages/README.md
@@ -0,0 +1,3 @@
# GitHub Pages tools

This folder contains tools to publish content to [this repository's GitHub Pages site](https://interuss.github.io/monitoring/). Publishing is performed by [the CI](../.github/workflows/ci.yml) at appropriate times, generally when a PR is merged to the main branch. Current site content is pushed to the gh-pages branch of this repository upon publishing, and then that content is deployed to the site by GitHub.
16 changes: 16 additions & 0 deletions github_pages/make_site_content.sh
@@ -0,0 +1,16 @@
#!/usr/bin/env bash

# This script generates the content for this repository's GitHub Pages site. It is invoked by the CI and expects the
# working folder to contain:
# ./monitoring: this repository
# ./artifacts/uss_qualifier/reports: Reports generated by running uss_qualifier
#
# The content placed into ./public by this script will be published to the GitHub Pages site.

mkdir ./public
cp -r ./monitoring/github_pages/static/* ./public

mkdir -p ./public/artifacts/uss_qualifier/reports
cp -r ./artifacts/uss_qualifier/output/sequence_uspace ./public/artifacts/uss_qualifier/reports/sequence_uspace
cp -r ./artifacts/uss_qualifier/output/tested_requirements_uspace ./public/artifacts/uss_qualifier/reports/tested_requirements_uspace
cp -r ./artifacts/uss_qualifier/output/capabilities_uspace.html ./public/artifacts/uss_qualifier/reports/capabilities_uspace.html
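For reference, a hedged sketch of exercising this script outside the CI, assuming the working-folder layout described in the header comments is assembled by hand (the CI normally prepares it with `actions/checkout` and `actions/download-artifact`):

```bash
# Assemble the expected working folder, then generate site content into ./public.
mkdir work && cd work
git clone https://github.com/interuss/monitoring monitoring
mkdir -p artifacts/uss_qualifier/output
# ...copy uss_qualifier outputs (sequence_uspace, tested_requirements_uspace,
# capabilities_uspace.html) into artifacts/uss_qualifier/output/ ...
./monitoring/github_pages/make_site_content.sh
ls ./public
```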
13 changes: 13 additions & 0 deletions github_pages/static/index.md
@@ -0,0 +1,13 @@
# InterUSS [`monitoring`](https://github.com/interuss/monitoring) generated content

This site contains content automatically generated by actions in the [InterUSS](https://interussplatform.org) [monitoring repository](https://github.com/interuss/monitoring).

## uss_qualifier [reports](https://github.com/interuss/monitoring/tree/main/monitoring/uss_qualifier/reports)

These reports were generated during continuous integration for the most recent PR merged to the main branch.

### [U-space developer](https://github.com/interuss/monitoring/blob/main/monitoring/uss_qualifier/configurations/dev/uspace.yaml) [test configuration](https://github.com/interuss/monitoring/tree/main/monitoring/uss_qualifier/configurations)

* [Sequence view](./artifacts/uss_qualifier/reports/sequence_uspace)
* [Tested requirements](./artifacts/uss_qualifier/reports/tested_requirements_uspace)
* [Demonstrated capabilities](./artifacts/uss_qualifier/reports/capabilities_uspace.html)
6 changes: 0 additions & 6 deletions monitoring/Makefile
@@ -12,12 +12,6 @@ python-lint:
cd monitorlib && make python-lint
cd prober && make python-lint

.PHONY: shell-lint
shell-lint:
cd uss_qualifier && make shell-lint
cd mock_uss && make shell-lint
cd prober && make shell-lint

.PHONY: format
format:
cd uss_qualifier && make format
31 changes: 16 additions & 15 deletions monitoring/atproxy/config.py
@@ -5,34 +5,35 @@

from monitoring.monitorlib import auth_validation

ENV_KEY_PREFIX = 'ATPROXY'
ENV_KEY_PUBLIC_KEY = '{}_PUBLIC_KEY'.format(ENV_KEY_PREFIX)
ENV_KEY_TOKEN_AUDIENCE = '{}_TOKEN_AUDIENCE'.format(ENV_KEY_PREFIX)
ENV_KEY_CLIENT_BASIC_AUTH = '{}_CLIENT_BASIC_AUTH'.format(ENV_KEY_PREFIX)
ENV_KEY_QUERY_TIMEOUT = '{}_QUERY_TIMEOUT'.format(ENV_KEY_PREFIX)
ENV_KEY_PREFIX = "ATPROXY"
ENV_KEY_PUBLIC_KEY = "{}_PUBLIC_KEY".format(ENV_KEY_PREFIX)
ENV_KEY_TOKEN_AUDIENCE = "{}_TOKEN_AUDIENCE".format(ENV_KEY_PREFIX)
ENV_KEY_CLIENT_BASIC_AUTH = "{}_CLIENT_BASIC_AUTH".format(ENV_KEY_PREFIX)
ENV_KEY_QUERY_TIMEOUT = "{}_QUERY_TIMEOUT".format(ENV_KEY_PREFIX)

# These keys map to entries in the Config class
KEY_TOKEN_PUBLIC_KEY = 'TOKEN_PUBLIC_KEY'
KEY_TOKEN_AUDIENCE = 'TOKEN_AUDIENCE'
KEY_CLIENT_BASIC_AUTH = 'CLIENT_BASIC_AUTH'
KEY_QUERY_TIMEOUT = 'QUERY_TIMEOUT'
KEY_TOKEN_PUBLIC_KEY = "TOKEN_PUBLIC_KEY"
KEY_TOKEN_AUDIENCE = "TOKEN_AUDIENCE"
KEY_CLIENT_BASIC_AUTH = "CLIENT_BASIC_AUTH"
KEY_QUERY_TIMEOUT = "QUERY_TIMEOUT"

KEY_CODE_VERSION = 'MONITORING_VERSION'
KEY_CODE_VERSION = "MONITORING_VERSION"

workspace_path = os.path.join(os.path.abspath(os.path.dirname(__file__)), 'workspace')
workspace_path = os.path.join(os.path.abspath(os.path.dirname(__file__)), "workspace")


class Config(object):
TOKEN_PUBLIC_KEY = auth_validation.fix_key(
os.environ.get(ENV_KEY_PUBLIC_KEY, '')).encode('utf-8')
TOKEN_AUDIENCE = os.environ.get(ENV_KEY_TOKEN_AUDIENCE, '')
os.environ.get(ENV_KEY_PUBLIC_KEY, "")
).encode("utf-8")
TOKEN_AUDIENCE = os.environ.get(ENV_KEY_TOKEN_AUDIENCE, "")
CLIENT_BASIC_AUTH = os.environ[ENV_KEY_CLIENT_BASIC_AUTH]
QUERY_TIMEOUT = float(os.environ.get(ENV_KEY_QUERY_TIMEOUT, "59"))
CODE_VERSION = os.environ.get(KEY_CODE_VERSION, 'Unknown')
CODE_VERSION = os.environ.get(KEY_CODE_VERSION, "Unknown")


def get_users(basic_auth: str) -> Dict[str, str]:
user_pass = [v.strip() for v in basic_auth.split(':')]
user_pass = [v.strip() for v in basic_auth.split(":")]
if len(user_pass) != 2:
raise ValueError('Expected "username:password", got "{}"'.format(basic_auth))
return {user_pass[0]: generate_password_hash(user_pass[1])}
10 changes: 6 additions & 4 deletions monitoring/atproxy/database.py
@@ -10,9 +10,10 @@
# --- All queries ---
class QueryState(str, enum.Enum):
"""Whether a query is being handled, or has already been handled."""
Queued = 'Queued'
BeingHandled = 'BeingHandled'
Complete = 'Complete'

Queued = "Queued"
BeingHandled = "BeingHandled"
Complete = "Complete"


class Query(ImplicitDict):
@@ -51,4 +52,5 @@ class Database(ImplicitDict):

db = SynchronizedValue(
Database(),
decoder=lambda b: ImplicitDict.parse(json.loads(b.decode('utf-8')), Database))
decoder=lambda b: ImplicitDict.parse(json.loads(b.decode("utf-8")), Database),
)
21 changes: 18 additions & 3 deletions monitoring/atproxy/gunicorn.conf.py
@@ -8,14 +8,29 @@

def pre_request(worker: Worker, req: Request):
"""gunicorn server hook called just before a worker processes the request."""
logger.debug("gunicorn pre_request from worker {} (OS PID {}): {} {}", worker.pid, os.getpid(), req.method, req.path)
logger.debug(
"gunicorn pre_request from worker {} (OS PID {}): {} {}",
worker.pid,
os.getpid(),
req.method,
req.path,
)


def post_request(worker: Worker, req: Request, environ: dict, resp: Response):
"""gunicorn server hook called after a worker processes the request."""
logger.debug("gunicorn post_request from worker {} (OS PID {}): {} {} -> {}", worker.pid, os.getpid(), req.method, req.path, resp.status_code)
logger.debug(
"gunicorn post_request from worker {} (OS PID {}): {} {} -> {}",
worker.pid,
os.getpid(),
req.method,
req.path,
resp.status_code,
)


def worker_abort(worker: Worker):
"""gunicorn server hook called when a worker received the SIGABRT signal."""
logger.debug("gunicorn worker_abort from worker {} (OS PID {})", worker.pid, os.getpid())
logger.debug(
"gunicorn worker_abort from worker {} (OS PID {})", worker.pid, os.getpid()
)
32 changes: 25 additions & 7 deletions monitoring/atproxy/handling.py
@@ -51,29 +51,47 @@ def fulfill_query(req: ImplicitDict, timeout: timedelta) -> Tuple[str, int]:
t_start = datetime.utcnow()
query = Query(type=req.request_type_name(), request=req)
id = str(uuid.uuid4())
logger.debug('Attempting to fulfill {} query {} from worker {}', query.type, id, os.getpid())
logger.debug(
"Attempting to fulfill {} query {} from worker {}", query.type, id, os.getpid()
)

# Add query to be handled to the set of handleable queries
with db as tx:
tx.queries[id] = query
logger.debug('Added {} query {} to handler queue'.format(query.type, id))
logger.debug("Added {} query {} to handler queue".format(query.type, id))

# Frequently check if the query has been fulfilled
while datetime.utcnow() < t_start + timeout:
time.sleep(0.1)
with db as tx:
if tx.queries[id].state == QueryState.Complete:
# Query was successfully fulfilled; return the result
logger.debug('Fulfilling {} query {}'.format(query.type, id))
logger.debug("Fulfilling {} query {}".format(query.type, id))
query = tx.queries.pop(id)
logger.debug('Fulfilled {} query {} with {} from worker {}', query.type, id, query.return_code, os.getpid())
logger.debug(
"Fulfilled {} query {} with {} from worker {}",
query.type,
id,
query.return_code,
os.getpid(),
)
if query.response is not None:
return flask.jsonify(query.response), query.return_code
else:
return '', query.return_code
return "", query.return_code

# Time expired; remove request from queue and indicate error
with db as tx:
tx.queries.pop(id)
logger.debug('Failed to fulfill {} query {} in time (backend handler did not provide a response) from worker {}', query.type, id, os.getpid())
return flask.jsonify({'message': 'Backend handler did not respond within the allotted time'}), 500
logger.debug(
"Failed to fulfill {} query {} in time (backend handler did not provide a response) from worker {}",
query.type,
id,
os.getpid(),
)
return (
flask.jsonify(
{"message": "Backend handler did not respond within the alotted time"}
),
500,
)
3 changes: 2 additions & 1 deletion monitoring/atproxy/oauth.py
@@ -5,4 +5,5 @@

requires_scope = auth_validation.requires_scope_decorator(
webapp.config.get(config.KEY_TOKEN_PUBLIC_KEY),
webapp.config.get(config.KEY_TOKEN_AUDIENCE))
webapp.config.get(config.KEY_TOKEN_AUDIENCE),
)