Merge remote-tracking branch 'interuss/main' into opintentvalidator
BenjaminPelletier committed Dec 27, 2023
2 parents 7d785da + 2230d7b commit 15e3416
Showing 46 changed files with 566 additions and 360 deletions.
56 changes: 40 additions & 16 deletions .github/workflows/CI.md
@@ -1,37 +1,31 @@
# [Continuous integration](ci.yml)

## Overview

Before a pull request can be merged into the main branch, it must pass all automated tests for the repository. This document describes the tests and how to run them locally.

To run the equivalent of the full CI suite locally, simply execute `make presubmit`. This should perform all of the steps below, including bringing up a local interoperability ecosystem and all local USS mocks necessary to conduct tests. Therefore, `make presubmit` can (and should) be run starting without any pre-existing local infrastructure (interoperability ecosystem and/or USS mocks).

The downside of `make presubmit` is that it takes a long time, much of which is spent bringing up and tearing down local infrastructure (interoperability ecosystem and USS mocks). If your changes do not affect this infrastructure, you can save substantial time by bringing up an interoperability ecosystem once and leaving it running while developing, and possibly doing the same with a set of USS mocks ([mock_uss](../../monitoring/mock_uss) instances); note that this will not work when developing changes that affect mock_uss itself. See the uss_qualifier tests section below.

## Repository hygiene (`make check-hygiene`)

This set of tests ensures consistent formatting and performs other hygiene-related tasks that are not usually related to the actual functionality of the repository.
## Checks

### Python lint (`make python-lint`)
### Repository hygiene (`make check-hygiene`)

To maintain consistency across this large codebase, InterUSS employs a very strict Python linter. However, there is no need to satisfy the linter manually; simply run `make format` from the repo root to resolve most mere-formatting issues that this check will detect.
This set of tests, which includes [miscellaneous hygiene checks](../../test/repo_hygiene/README.md), ensures consistent formatting and performs other hygiene-related tasks that are not usually related to the actual functionality of the repository.

### Automated hygiene verification (`make hygiene`)
#### Python lint (`make python-lint`)

This check performs [miscellaneous hygiene checks](../../test/repo_hygiene/README.md). Currently, it ensures that local links in Markdown (*.md) files are not broken.
To maintain consistency across this large codebase, InterUSS employs a very strict Python linter. However, there is no need to satisfy the linter manually; simply run `make format` from the repo root to resolve most mere-formatting issues that this check will detect.

### uss_qualifier documentation validation (`make validate-uss-qualifier-docs`)
#### uss_qualifier documentation validation (`make validate-uss-qualifier-docs`)

uss_qualifier [scenario documentation](../../monitoring/uss_qualifier/scenarios/README.md#documentation) is required and has strict requirements to ensure a strong, consistent, and correct interface between regulators and others concerned only with the concept of what scenarios are doing, and InterUSS contributors concerned with writing the actual code to perform the test. This check validates many of these documentation requirements.

### Shell lint (`make shell-lint`)
#### Shell lint (`make shell-lint`)

This repository contains a large number of shell scripts. This check attempts to have these scripts follow best practices and avoid common pitfalls.

## `monitoring` tests (`make check-monitoring`)

These tests verify functional behavior of the repository content.

### monitorlib tests (`make test` in monitoring/monitorlib)

### mock_uss tests (`make test` in monitoring/mock_uss)

This check runs unit tests for mock_uss.
@@ -56,4 +50,34 @@ uss_qualifier's run_locally.sh ([monitoring/uss_qualifier/run_locally.sh](../../

### prober tests (`make test` in monitoring/prober)

[prober](../../monitoring/prober/README.md) is a legacy test suite dedicated to integration testing of DSS instances. It is being migrated to test scenarios and suites in uss_qualifier, but in the mean time, it is still the standard way to validate DSS functionality. It runs on the DSS provided as part of the local interoperability ecosystem.
[prober](../../monitoring/prober/README.md) is a legacy test suite dedicated to integration testing of DSS instances. It is being migrated to test scenarios and suites in uss_qualifier, but in the meantime, it is still the standard way to validate DSS functionality. It runs on the DSS provided as part of the local interoperability ecosystem.

## Troubleshooting

Whenever CI is run on GitHub, a large amount of information about that test run is generated, and each of the checks is documented separately. When CI is run upon merging a PR to the main branch, the list of checks is accessible by clicking on the green checkmark (or red x) near the top of [the repo homepage](https://github.com/interuss/monitoring):

![Repo homepage checks location](../../assets/ci/checks_repo_homepage.png)

When CI is run for a PR, the list of checks is visible near the bottom of the PR page. If all checks passed, the list of checks must be opened by clicking "Show all checks".

In either case, the details of a particular check can be viewed by clicking on the "Details" link to the right of the check name:

![CI check details](../../assets/ci/check_details.png)

This GitHub Actions job run page shows the console output from the job. If there was an exception during the run, the error message of interest will usually be near the end of the last console output for the job.

### uss_qualifier artifacts

Some of the most useful outputs of uss_qualifier are the artifacts generated from the uss_qualifier test runs that are part of the CI. When a PR is merged to the main branch of the repository, some of these artifacts are published to [https://interuss.github.io/monitoring](https://interuss.github.io/monitoring/). These artifacts (and some additional ones) are available for every CI run. To access them from a GitHub Actions job run page (see just above), first click "Summary" for the run:

![CI Summary location](../../assets/ci/summary_location.png)

Then, scroll down to the bottom of the page to see the list of GitHub Actions artifacts. The GitHub Actions artifact that contains the uss_qualifier artifacts from all uss_qualifier test runs during the CI is the one labeled "monitoring-test-uss_qualifier-reports":

![GitHub Actions artifacts](../../assets/ci/artifacts.png)

Click on this artifact to download a zip file which can then be unzipped to reveal a uss_qualifier/output folder structure. Each subfolder in uss_qualifier/output contains the output of a particular test run of uss_qualifier -- for instance, the `f3548` folder contains the output from running the [`configurations.dev.f3548_self_contained` configuration](../../monitoring/uss_qualifier/configurations/dev/f3548_self_contained.yaml). The sequence view artifact is often particularly useful for debugging.

### Automated testing logs

The logs from all of the containers running during uss_qualifier automated tests (DSS instance, mock_uss instances, etc) can be found in the GitHub Actions artifact labeled "monitoring-test-uss_qualifier-logs"; see above for download instructions.
8 changes: 5 additions & 3 deletions CONTRIBUTING.md
@@ -12,11 +12,13 @@ This repository has a very strict Python linter, as well as very strict expected

When [a PR is created](https://github.com/interuss/tsc/blob/main/repo_contributions.md#create-draft-pr-in-interuss-repository), the [continuous integration (CI) tests for this repository](./.github/workflows/CI.md) will run, and the PR will generally not be reviewed until they pass (unless [committer help is requested](https://github.com/interuss/tsc/blob/main/repo_contributions.md#request-committer-help-via-comment-in-pr) to address the failure). See [the continuous integration test documentation](./.github/workflows/CI.md) for how to run these tests on your local system more quickly and efficiently to be confident your PR will pass the CI tests when created (or when updates are made).

### Failing "uss_qualifier tests" CI check
### Troubleshooting

See [the continuous integration test documentation](./.github/workflows/CI.md) for how to troubleshoot failing CI for a PR.

If `make presubmit` succeeds on a developer's local machine, the GitHub CI actions should succeed as well. [A known issue](https://github.com/interuss/monitoring/issues/28) frequently causes the "uss_qualifier tests" check to fail. If the failed check indicates a query response code of 999 (this is the code InterUSS indicates when no response is received), this is very likely the problem. A committer can rerun the CI check and it is likely to succeed on the second try with no changes.
### Failing "uss_qualifier tests" CI check

If anyone can resolve [issue #28](https://github.com/interuss/monitoring/issues/28) which causes this problem, that help would be enormously appreciated by InterUSS.
If `make presubmit` succeeds on a developer's local machine, the GitHub CI actions should succeed as well. If `make presubmit` succeeds locally but the GitHub CI actions fail, it may be appropriate to request help from an InterUSS committer.

## uss_qualifier test scenarios

8 changes: 4 additions & 4 deletions Makefile
@@ -83,13 +83,13 @@ collect-local-logs:
	-sh -c "build/dev/run_locally.sh logs --timestamps" > logs/local_infra.log 2>&1
	-docker logs mock_uss_scdsc_a > logs/mock_uss_scdsc_a.log 2>&1
	-docker logs mock_uss_scdsc_b > logs/mock_uss_scdsc_b.log 2>&1
	-docker logs mock_uss_geoawareness > logs/mock_uss_geoawareness.log 2>&1
	-docker logs mock_uss_ridsp > logs/mock_uss_ridsp.log 2>&1
	-docker logs mock_uss_ridsp_v22a > logs/mock_uss_ridsp_v22a.log 2>&1
	-docker logs mock_uss_riddp > logs/mock_uss_riddp.log 2>&1
	-docker logs mock_uss_riddp_v22a > logs/mock_uss_riddp_v22a.log 2>&1
	-docker logs mock_uss_geoawareness > logs/mock_uss_geoawareness.log 2>&1
	-docker logs mock_uss_ridsp_v19 > logs/mock_uss_ridsp_v19.log 2>&1
	-docker logs mock_uss_riddp_v19 > logs/mock_uss_riddp_v19.log 2>&1
	-docker logs mock_uss_tracer > logs/mock_uss_tracer.log 2>&1
	-docker logs mock_uss_tracer_v22a > logs/mock_uss_tracer_v22a.log 2>&1
	-docker logs mock_uss_scdsc_interaction_log > logs/mock_uss_scdsc_interaction_log.log 2>&1

.PHONY: stop-locally
stop-locally:
Binary file added assets/ci/artifacts.png
Binary file added assets/ci/check_details.png
Binary file added assets/ci/checks_repo_homepage.png
Binary file added assets/ci/summary_location.png
5 changes: 2 additions & 3 deletions monitoring/deployment_manager/actions/test/hello_world.py
@@ -1,10 +1,9 @@
import time

from monitoring.deployment_manager import deploylib
import monitoring.deployment_manager.deploylib.namespaces
import monitoring.deployment_manager.deploylib.systems
from monitoring.deployment_manager.infrastructure import deployment_action, Context
from monitoring.deployment_manager.systems.test import hello_world
from monitoring.monitorlib.delay import sleep


@deployment_action("test/hello_world/deploy")
@@ -52,7 +51,7 @@ def destroy(context: Context) -> None:
            namespace.metadata.name, context.spec.cluster.name
        )
    )
    time.sleep(15)
    sleep(15, "destruction of hello_world system may take a few seconds")

    deploylib.systems.delete_resources(
        existing_resources, namespace, context.clients, context.log
58 changes: 57 additions & 1 deletion monitoring/mock_uss/f3548v21/routes_scd.py
@@ -1,13 +1,17 @@
from typing import Optional

import flask

from monitoring.mock_uss.f3548v21.flight_planning import op_intent_from_flightrecord
from monitoring.monitorlib import scd
from monitoring.mock_uss import webapp
from monitoring.mock_uss.auth import requires_scope
from monitoring.mock_uss.flights.database import db
from monitoring.mock_uss.flights.database import db, FlightRecord
from uas_standards.astm.f3548.v21.api import (
ErrorResponse,
GetOperationalIntentDetailsResponse,
GetOperationalIntentTelemetryResponse,
OperationalIntentState,
)


@@ -44,6 +48,58 @@ def scdsc_get_operational_intent_details(entityid: str):
    return flask.jsonify(response), 200


@webapp.route(
    "/mock/scd/uss/v1/operational_intents/<entityid>/telemetry", methods=["GET"]
)
@requires_scope(scd.SCOPE_CM_SA)
def scdsc_get_operational_intent_telemetry(entityid: str):
    """Implements getOperationalIntentTelemetry in ASTM SCD API."""

    # Look up entityid in database
    tx = db.value
    flight: Optional[FlightRecord] = None
    for f in tx.flights.values():
        if f and f.op_intent.reference.id == entityid:
            flight = f
            break

    # If requested operational intent doesn't exist, return 404
    if flight is None:
        return (
            flask.jsonify(
                ErrorResponse(
                    message="Operational intent {} not known by this USS".format(
                        entityid
                    )
                )
            ),
            404,
        )

    elif flight.op_intent.reference.state not in {
        OperationalIntentState.Contingent,
        OperationalIntentState.Nonconforming,
    }:
        return (
            flask.jsonify(
                ErrorResponse(
                    message=f"Operational intent {entityid} is not in a state that provides telemetry ({flight.op_intent.reference.state})"
                )
            ),
            409,
        )

    # TODO: implement support for telemetry
    return (
        flask.jsonify(
            ErrorResponse(
                message=f"Operational intent {entityid} has no telemetry data available."
            )
        ),
        412,
    )


@webapp.route("/mock/scd/uss/v1/operational_intents", methods=["POST"])
@requires_scope(scd.SCOPE_SC)
def scdsc_notify_operational_intent_details_changed():
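Until telemetry support is implemented, the new route above always resolves to one of three error responses. A minimal sketch of how a client might exercise it against a locally running mock_uss (the base URL, port, access token, and entity ID below are hypothetical):

```python
import requests

# Hypothetical values for a locally running mock_uss instance
BASE_URL = "http://localhost:8074"
TOKEN = "<access token granting the conformance monitoring scope>"
entity_id = "00000000-0000-4000-8000-000000000000"  # hypothetical entity ID

resp = requests.get(
    f"{BASE_URL}/mock/scd/uss/v1/operational_intents/{entity_id}/telemetry",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
# Expected outcomes per the handler above:
#   404: operational intent not known by this USS
#   409: operational intent not in Contingent or Nonconforming state
#   412: telemetry not (yet) available for this operational intent
print(resp.status_code, resp.json().get("message"))
```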
8 changes: 6 additions & 2 deletions monitoring/mock_uss/flights/planning.py
@@ -3,6 +3,7 @@
from typing import Callable, Optional

from monitoring.mock_uss.flights.database import FlightRecord, db, DEADLOCK_TIMEOUT
from monitoring.monitorlib.delay import sleep


def lock_flight(flight_id: str, log: Callable[[str], None]) -> FlightRecord:
@@ -25,7 +26,7 @@ def lock_flight(flight_id: str, log: Callable[[str], None]) -> FlightRecord:
                    break
        # We found an existing flight but it was locked; wait for it to become
        # available
        time.sleep(0.5)
        sleep(0.5, f"flight {flight_id} is currently already locked")

        if datetime.utcnow() > deadline:
            raise RuntimeError(
@@ -61,7 +62,10 @@ def delete_flight_record(flight_id: str) -> Optional[FlightRecord]:
                # No FlightRecord found
                return None
        # There is a race condition with another handler to create or modify the requested flight; wait for that to resolve
        time.sleep(0.5)
        sleep(
            0.5,
            f"flight {flight_id} is currently already locked while we are trying to delete it",
        )
        if datetime.utcnow() > deadline:
            raise RuntimeError(
                f"Deadlock in delete_flight while attempting to gain access to flight {flight_id} (now: {datetime.utcnow()}, deadline: {deadline})"
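Both `lock_flight` and `delete_flight_record` share the same poll-with-deadline structure. A self-contained sketch of that pattern (the `try_acquire` stand-in and the timeout value are hypothetical):

```python
import random
from datetime import datetime, timedelta

from monitoring.monitorlib.delay import sleep

DEADLOCK_TIMEOUT = timedelta(seconds=30)  # hypothetical value


def try_acquire() -> bool:
    """Hypothetical stand-in for attempting to lock a flight record."""
    return random.random() < 0.3


deadline = datetime.utcnow() + DEADLOCK_TIMEOUT
while not try_acquire():
    # Another handler holds the resource; wait briefly and retry
    sleep(0.5, "resource is currently locked")
    if datetime.utcnow() > deadline:
        raise RuntimeError(f"Deadlock while waiting for resource (deadline: {deadline})")
```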
7 changes: 5 additions & 2 deletions monitoring/mock_uss/ridsp/behavior.py
Original file line number Diff line number Diff line change
@@ -1,6 +1,6 @@
from time import sleep
from typing import Optional

from monitoring.monitorlib.delay import sleep
from monitoring.monitorlib.rid_automated_testing.injection_api import TestFlight
from implicitdict import ImplicitDict
from uas_standards.astm.f3411.v19.api import RIDFlight
@@ -56,6 +56,9 @@ def adjust_reported_flight(
        p.position.alt *= FEET_PER_METER

    if behavior.delay_flight_report_s > 0:
        sleep(behavior.delay_flight_report_s)
        sleep(
            behavior.delay_flight_report_s,
            "specified Service Provider behavior is to delay before reporting flight",
        )

    return adjusted
5 changes: 4 additions & 1 deletion monitoring/monitorlib/clients/flight_planning/client_v1.py
@@ -16,6 +16,7 @@
)
from monitoring.monitorlib.clients.flight_planning.planning import (
    PlanningActivityResponse,
    AdvisoryInclusion,
)
from monitoring.monitorlib.clients.flight_planning.planning import (
    PlanningActivityResult,
@@ -93,7 +94,9 @@ def _inject(
            queries=[query],
            activity_result=resp.planning_result,
            flight_plan_status=resp.flight_plan_status,
            includes_advisories=resp.includes_advisories,
            includes_advisories=resp.includes_advisories
            if "includes_advisories" in resp
            else AdvisoryInclusion.Unknown,
        )

        return response
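The change above guards against flight planner responses that omit the optional `includes_advisories` field. A minimal sketch of the same defensive-default pattern on a plain dict-backed response (enum members other than `Unknown` are hypothetical):

```python
from enum import Enum


class AdvisoryInclusion(str, Enum):
    Unknown = "Unknown"
    AtLeastOneAdvisoryOrCondition = "AtLeastOneAdvisoryOrCondition"  # hypothetical member


resp = {"planning_result": "Completed"}  # hypothetical response missing the optional field
includes_advisories = (
    resp["includes_advisories"]
    if "includes_advisories" in resp
    else AdvisoryInclusion.Unknown
)
assert includes_advisories is AdvisoryInclusion.Unknown
```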
27 changes: 27 additions & 0 deletions monitoring/monitorlib/delay.py
@@ -0,0 +1,27 @@
from datetime import timedelta
import time
from typing import Union

from loguru import logger


MAX_SILENT_DELAY_S = 0.4
"""Number of seconds to delay above which a reasoning message should be displayed."""


def sleep(duration: Union[float, timedelta], reason: str) -> None:
    """Sleep for the specified amount of time, logging the fact that the delay is occurring (when appropriate).

    Args:
        duration: Amount of time to sleep for; interpreted as seconds if float.
        reason: Reason the delay is happening (to be printed to console/log if appropriate).
    """
    if isinstance(duration, timedelta):
        duration = duration.total_seconds()
    if duration <= 0:
        # No need to delay
        return

    if duration > MAX_SILENT_DELAY_S:
        logger.debug(f"Delaying {duration:.1f} seconds because {reason}")
    time.sleep(duration)
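A short usage sketch of the new helper; per `MAX_SILENT_DELAY_S`, only delays longer than 0.4 seconds log their reason:

```python
from datetime import timedelta

from monitoring.monitorlib.delay import sleep

# At or below MAX_SILENT_DELAY_S (0.4 s): sleeps without logging
sleep(0.2, "brief polling interval")

# Above the threshold: logs "Delaying 5.0 seconds because ..." at debug level
sleep(timedelta(seconds=5), "waiting for the ISA of interest to expire")
```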
8 changes: 8 additions & 0 deletions monitoring/monitorlib/errors.py
@@ -4,3 +4,11 @@
def stacktrace_string(e: Exception) -> str:
    """Return a multi-line string containing a stacktrace for the specified exception."""
    return "".join(traceback.format_exception(e))


def current_stack_string(exclude_levels: int = 1) -> str:
    """Return a multi-line string containing a trace of the current execution state."""
    stack = traceback.extract_stack()
    if exclude_levels > 0:
        stack = stack[0:-exclude_levels]
    return "".join(traceback.format_list(stack))
3 changes: 3 additions & 0 deletions monitoring/monitorlib/fetch/__init__.py
@@ -254,6 +254,9 @@ class QueryType(str, Enum):
"interuss.automated_testing.flight_planning.v1.DeleteFlightPlan"
)

def __str__(self):
return self.value

@staticmethod
def flight_details(rid_version: RIDVersion):
if rid_version == RIDVersion.f3411_19:
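The `__str__` override makes members of this `(str, Enum)` class render as their wire-format value rather than as `QueryType.<name>`. A self-contained sketch of the effect (class body abbreviated to the one member visible above):

```python
from enum import Enum


class QueryType(str, Enum):
    DeleteFlightPlan = "interuss.automated_testing.flight_planning.v1.DeleteFlightPlan"

    def __str__(self):
        return self.value


# Without the override, str(member) yields "QueryType.DeleteFlightPlan"
# (and f-strings do too on Python 3.11+); with it, the value is used:
assert str(QueryType.DeleteFlightPlan) == (
    "interuss.automated_testing.flight_planning.v1.DeleteFlightPlan"
)
```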
4 changes: 2 additions & 2 deletions monitoring/prober/rid/v1/test_isa_expiry.py
@@ -1,8 +1,8 @@
"""Test ISAs aren't returned after they expire."""

import datetime
import time

from monitoring.monitorlib.delay import sleep
from monitoring.monitorlib.infrastructure import default_scope
from monitoring.monitorlib import rid_v1
from monitoring.prober.infrastructure import register_resource_type
@@ -64,7 +64,7 @@ def test_valid_immediately(ids, session_ridv1):

def test_sleep_5_seconds():
    # But if we wait 5 seconds it will expire...
    time.sleep(5)
    sleep(5, "if we wait 5 seconds, the ISA of interest will expire")


@default_scope(Scope.Read)
4 changes: 2 additions & 2 deletions monitoring/prober/rid/v2/test_isa_expiry.py
@@ -1,8 +1,8 @@
"""Test ISAs aren't returned after they expire."""

import datetime
import time

from monitoring.monitorlib.delay import sleep
from uas_standards.astm.f3411.v22a.api import OPERATIONS, OperationID
from uas_standards.astm.f3411.v22a.constants import Scope

@@ -68,7 +68,7 @@ def test_valid_immediately(ids, session_ridv2):

def test_sleep_5_seconds():
    # But if we wait 5 seconds it will expire...
    time.sleep(5)
    sleep(5, "if we wait 5 seconds, the ISA of interest will expire")


@default_scope(Scope.DisplayProvider)
