Commit

Merge branch 'main' into mtoff/go-otel-systests

robertomonteromiguel authored Jul 19, 2024
2 parents 0477f08 + e4ed9ed commit d4683db
Showing 126 changed files with 2,081 additions and 1,762 deletions.
4 changes: 2 additions & 2 deletions .github/CODEOWNERS
@@ -7,8 +7,8 @@
/utils/build/docker/php*/ @DataDog/apm-php @DataDog/system-tests-core
/utils/build/docker/python*/ @DataDog/apm-python @DataDog/asm-python @DataDog/system-tests-core
/utils/build/docker/ruby*/ @DataDog/apm-ruby @DataDog/asm-ruby @DataDog/system-tests-core
/parametric/ @Kyle-Verhoog @DataDog/system-tests-core
/tests/parametric/ @Kyle-Verhoog @DataDog/system-tests-core
/parametric/ @Kyle-Verhoog @DataDog/system-tests-core @DataDog/apm-sdk-api
/tests/parametric/ @Kyle-Verhoog @DataDog/system-tests-core @DataDog/apm-sdk-api
/tests/otel_tracing_e2e/ @DataDog/opentelemetry @DataDog/system-tests-core
/tests/remote_config/ @DataDog/system-tests-core @DataDog/remote-config @DataDog/system-tests-core
/tests/appsec/ @DataDog/asm-libraries @DataDog/system-tests-core
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -65,7 +65,7 @@ jobs:
- name: Test idiomatics scenarios
run: |
./run.sh DEFAULT
TEST_LIBRARY=golang ./run.sh PARAMETRIC
TEST_LIBRARY=golang ./run.sh PARAMETRIC --splits=8 --group=1
env:
DD_API_KEY: ${{ secrets.DD_API_KEY }}

4 changes: 2 additions & 2 deletions .github/workflows/run-parametric.yml
@@ -37,7 +37,7 @@ jobs:
runs-on:
group: "APM Larger Runners"
strategy:
fail-fast: true
fail-fast: false
matrix:
job_instance: ${{ fromJson( inputs._experimental_job_matrix ) }}
env:
@@ -64,7 +64,7 @@
RUN_ATTEMPTS=1
while [ $RUN_ATTEMPTS -le 3 ]; do
echo "Running parametric test attempt $RUN_ATTEMPTS"
timeout 720s ./run.sh PARAMETRIC
timeout 720s ./run.sh PARAMETRIC --splits=${{ inputs._experimental_job_count }} --group=${{ matrix.job_instance }}
status=$?
#timeout returns 124 if it times out
#if the return code is not 124, then we exit with the status
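
The retry loop above is collapsed by the diff view. As a rough sketch of the pattern it encodes — up to three attempts, where exit code 124 from `timeout` marks a timed-out run and any other code ends the loop — the same logic could be written as follows (an illustration only, not the workflow's actual script; the command and flag values are placeholders):

```python
import subprocess
import sys

# Illustrative sketch of the CI retry pattern (not the actual workflow script).
# The GNU `timeout` command exits with 124 when the wrapped command times out;
# any other exit code is treated as final and returned as-is.
def run_with_retries(cmd, attempts=3, timeout_s=720):
    status = 124
    for attempt in range(1, attempts + 1):
        print(f"Running parametric test attempt {attempt}")
        status = subprocess.run(["timeout", f"{timeout_s}s", *cmd]).returncode
        if status != 124:  # not a timeout: stop retrying
            break
    return status

if __name__ == "__main__":
    sys.exit(run_with_retries(["./run.sh", "PARAMETRIC", "--splits=4", "--group=1"]))
```
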
95 changes: 70 additions & 25 deletions .gitlab-ci.yml
@@ -7,6 +7,7 @@ stages:
- java_tracer
- python_tracer
- dotnet_tracer
- php_tracer
- parse_results
- before_tests

@@ -42,22 +43,25 @@ onboarding_nodejs:
matrix:
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-nodejs]
SCENARIO: [HOST_AUTO_INJECTION, HOST_AUTO_INJECTION_INSTALL_SCRIPT, SIMPLE_HOST_AUTO_INJECTION_PROFILING, HOST_AUTO_INJECTION_INSTALL_SCRIPT_PROFILING]
SCENARIO: [HOST_AUTO_INJECTION_INSTALL_SCRIPT, HOST_AUTO_INJECTION_INSTALL_SCRIPT_PROFILING]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-shell-script]
SCENARIO: [HOST_AUTO_INJECTION_BLOCK_LIST]
SCENARIO: [INSTALLER_AUTO_INJECTION_BLOCK_LIST]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-nodejs-container,test-app-nodejs-alpine-libgcc]
SCENARIO: [CONTAINER_AUTO_INJECTION, CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT, SIMPLE_CONTAINER_AUTO_INJECTION_PROFILING, CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT_PROFILING]
SCENARIO: [ CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT, CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT_PROFILING]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-nodejs,test-app-nodejs-container,test-app-nodejs-alpine-libgcc]
SCENARIO: [INSTALLER_AUTO_INJECTION]
SCENARIO: [INSTALLER_AUTO_INJECTION,SIMPLE_AUTO_INJECTION_PROFILING]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-nodejs-16, test-app-nodejs-alpine]
SCENARIO: [INSTALLER_NOT_SUPPORTED_AUTO_INJECTION]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-nodejs-alpine]
SCENARIO: [CONTAINER_NOT_SUPPORTED_AUTO_INJECTION]
ONBOARDING_FILTER_WEBLOG: [test-app-nodejs]
SCENARIO: [INSTALLER_AUTO_INJECTION_LD_PRELOAD]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-nodejs-16]
SCENARIO: [HOST_NOT_SUPPORTED_AUTO_INJECTION]
ONBOARDING_FILTER_WEBLOG: [test-app-nodejs]
SCENARIO: [INSTALLER_HOST_AUTO_INJECTION_CHAOS]
script:
- ./build.sh -i runner
- timeout 2700s ./run.sh $SCENARIO --vm-weblog ${ONBOARDING_FILTER_WEBLOG} --vm-env ${ONBOARDING_FILTER_ENV} --vm-library ${TEST_LIBRARY} --vm-provider aws
@@ -75,20 +79,25 @@ onboarding_java:
matrix:
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-java]
SCENARIO: [HOST_AUTO_INJECTION, HOST_AUTO_INJECTION_INSTALL_SCRIPT]
SCENARIO: [HOST_AUTO_INJECTION_INSTALL_SCRIPT]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-shell-script]
SCENARIO: [HOST_AUTO_INJECTION_BLOCK_LIST]
SCENARIO: [INSTALLER_AUTO_INJECTION_BLOCK_LIST]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-java-container,test-app-java-container-jdk15,test-app-java-alpine-libgcc,test-app-java-buildpack]
SCENARIO: [CONTAINER_AUTO_INJECTION, CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT]
SCENARIO: [CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-java,test-app-java-container,test-app-java-alpine-libgcc,test-app-java-buildpack]
ONBOARDING_FILTER_WEBLOG: [test-app-java,test-app-java-container,test-app-java-container-jdk15,test-app-java-alpine-libgcc,test-app-java-buildpack]
SCENARIO: [INSTALLER_AUTO_INJECTION]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-java-alpine,test-app-java-alpine-jdk15,test-app-java-alpine-jdk21]
SCENARIO: [CONTAINER_NOT_SUPPORTED_AUTO_INJECTION]

SCENARIO: [INSTALLER_NOT_SUPPORTED_AUTO_INJECTION]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-java]
SCENARIO: [INSTALLER_AUTO_INJECTION_LD_PRELOAD]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-java]
SCENARIO: [INSTALLER_HOST_AUTO_INJECTION_CHAOS]
script:
- ./build.sh -i runner
- timeout 2700s ./run.sh $SCENARIO --vm-weblog ${ONBOARDING_FILTER_WEBLOG} --vm-env ${ONBOARDING_FILTER_ENV} --vm-library ${TEST_LIBRARY} --vm-provider aws
@@ -106,19 +115,25 @@ onboarding_python:
matrix:
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-python]
SCENARIO: [HOST_AUTO_INJECTION, HOST_AUTO_INJECTION_INSTALL_SCRIPT]
SCENARIO: [HOST_AUTO_INJECTION_INSTALL_SCRIPT]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-shell-script]
SCENARIO: [HOST_AUTO_INJECTION_BLOCK_LIST]
SCENARIO: [INSTALLER_AUTO_INJECTION_BLOCK_LIST]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-python-container,test-app-python-alpine-libgcc]
SCENARIO: [CONTAINER_AUTO_INJECTION, CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT]
SCENARIO: [ CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-python,test-app-python-container,test-app-python-alpine-libgcc]
SCENARIO: [INSTALLER_AUTO_INJECTION]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-python-alpine]
SCENARIO: [CONTAINER_NOT_SUPPORTED_AUTO_INJECTION]
SCENARIO: [INSTALLER_NOT_SUPPORTED_AUTO_INJECTION]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-python]
SCENARIO: [INSTALLER_AUTO_INJECTION_LD_PRELOAD]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-python]
SCENARIO: [INSTALLER_HOST_AUTO_INJECTION_CHAOS]
script:
- ./build.sh -i runner
- timeout 2700s ./run.sh $SCENARIO --vm-weblog ${ONBOARDING_FILTER_WEBLOG} --vm-env ${ONBOARDING_FILTER_ENV} --vm-library ${TEST_LIBRARY} --vm-provider aws
@@ -136,16 +151,22 @@ onboarding_dotnet:
matrix:
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-dotnet]
SCENARIO: [HOST_AUTO_INJECTION, HOST_AUTO_INJECTION_INSTALL_SCRIPT]
SCENARIO: [HOST_AUTO_INJECTION_INSTALL_SCRIPT]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-shell-script]
SCENARIO: [HOST_AUTO_INJECTION_BLOCK_LIST]
SCENARIO: [INSTALLER_AUTO_INJECTION_BLOCK_LIST]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-dotnet-container]
SCENARIO: [ CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-dotnet,test-app-dotnet-container]
SCENARIO: [INSTALLER_AUTO_INJECTION]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-dotnet-container]
SCENARIO: [CONTAINER_AUTO_INJECTION, CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT]
ONBOARDING_FILTER_WEBLOG: [test-app-dotnet]
SCENARIO: [INSTALLER_AUTO_INJECTION_LD_PRELOAD]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-dotnet]
SCENARIO: [INSTALLER_HOST_AUTO_INJECTION_CHAOS]
script:
- ./build.sh -i runner
- timeout 2700s ./run.sh $SCENARIO --vm-weblog ${ONBOARDING_FILTER_WEBLOG} --vm-env ${ONBOARDING_FILTER_ENV} --vm-library ${TEST_LIBRARY} --vm-provider aws
@@ -163,16 +184,40 @@ onboarding_ruby:
matrix:
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-ruby]
SCENARIO: [HOST_AUTO_INJECTION, HOST_AUTO_INJECTION_INSTALL_SCRIPT]
SCENARIO: [HOST_AUTO_INJECTION_INSTALL_SCRIPT]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-shell-script]
SCENARIO: [HOST_AUTO_INJECTION_BLOCK_LIST]
SCENARIO: [INSTALLER_AUTO_INJECTION_BLOCK_LIST]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-ruby-container]
SCENARIO: [CONTAINER_AUTO_INJECTION, CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT]
SCENARIO: [ CONTAINER_AUTO_INJECTION_INSTALL_SCRIPT]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-ruby,test-app-ruby-container]
SCENARIO: [INSTALLER_AUTO_INJECTION]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-ruby]
SCENARIO: [INSTALLER_AUTO_INJECTION_LD_PRELOAD]
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-ruby]
SCENARIO: [INSTALLER_HOST_AUTO_INJECTION_CHAOS]
script:
- ./build.sh -i runner
- timeout 2700s ./run.sh $SCENARIO --vm-weblog ${ONBOARDING_FILTER_WEBLOG} --vm-env ${ONBOARDING_FILTER_ENV} --vm-library ${TEST_LIBRARY} --vm-provider aws

onboarding_php:
extends: .base_job_onboarding_system_tests
stage: php_tracer
allow_failure: true
dependencies: []
only:
- schedules
variables:
TEST_LIBRARY: "php"
parallel:
matrix:
- ONBOARDING_FILTER_ENV: [dev, prod]
ONBOARDING_FILTER_WEBLOG: [test-app-php,test-app-php-container-83]
SCENARIO: [INSTALLER_AUTO_INJECTION]
script:
- ./build.sh -i runner
- timeout 2700s ./run.sh $SCENARIO --vm-weblog ${ONBOARDING_FILTER_WEBLOG} --vm-env ${ONBOARDING_FILTER_ENV} --vm-library ${TEST_LIBRARY} --vm-provider aws
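
Each onboarding job above relies on GitLab CI's `parallel: matrix:` expansion: every matrix entry yields the cartesian product of its variable lists, and each combination becomes one job instance. A hypothetical Python sketch of that expansion for the `onboarding_php` entry (values copied from the YAML above):

```python
from itertools import product

# One matrix entry from the onboarding_php job above.
entry = {
    "ONBOARDING_FILTER_ENV": ["dev", "prod"],
    "ONBOARDING_FILTER_WEBLOG": ["test-app-php", "test-app-php-container-83"],
    "SCENARIO": ["INSTALLER_AUTO_INJECTION"],
}

# GitLab expands the entry into one job per combination of variable values.
jobs = [dict(zip(entry, combo)) for combo in product(*entry.values())]
print(len(jobs))  # 4 job instances: 2 envs x 2 weblogs x 1 scenario
for job in jobs:
    print(job)
```
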
11 changes: 7 additions & 4 deletions conftest.py
@@ -359,11 +359,14 @@ def pytest_sessionfinish(session, exitstatus):

data = session.config._json_report.report # pylint: disable=protected-access

junit_modifyreport(
data, session.config.option.xmlpath, junit_properties=context.scenario.get_junit_properties(),
)
try:
junit_modifyreport(
data, session.config.option.xmlpath, junit_properties=context.scenario.get_junit_properties(),
)

export_feature_parity_dashboard(session, data)
export_feature_parity_dashboard(session, data)
except Exception:
logger.exception("Failed to export reports", exc_info=True)


def export_feature_parity_dashboard(session, data):
54 changes: 35 additions & 19 deletions docs/edit/remote-config.md
@@ -1,14 +1,15 @@
The RC API is the official way to interact with remote config. It lets you build and send RC payloads to the library during the setup phase, and send requests before/after each state change.

## Building RC payload
## Setting RC configuration files

### Example

``` python
from utils import remote_config


command = remote_config.RemoteConfigCommand(version=1)
# will return the global rc state
rc_state = remote_config.rc_state

config = {
"rules_data": [
@@ -20,35 +21,47 @@ config = {
]
}

command.add_client_config(f"datadog/2/ASM_DATA-base/ASM_DATA-base/config", config)
rc_state.set_config(f"datadog/2/ASM_DATA-base/ASM_DATA-base/config", config)
# send the state to the tracer and wait for the result to be validated
rc_state.apply()
```

### API

#### class `remote_config.RemoteConfigCommand`
#### object `remote_config.rc_state`

This class will be serialized as a valid `ClientGetConfigsResponse`.

* constructor `__init__(self, version: int, client_configs=(), expires=None)`
* `version: int`: `version` property of `signed` object
* `client_configs`[optional]: list of configuration path / config object.
* `expires` [optional]: expiration date of the config (default `3000-01-01T00:00:00Z`)
* `add_client_config(self, path, config) -> ClientConfig:`
* `set_config(self, path, config) -> rc_state`
* `path`: configuration path
* `config`: config object
* `send()`: send the command using the `send_command` function (see below)

*add one configuration to the state*
* `del_config(self, path) -> rc_state`
* `path`: configuration path

*delete one configuration from the state*
* `reset(self) -> rc_state`

*delete all configurations in the state*
* `apply() -> tracer_state`

*send the state using the `send_state` function (see below).*

*the return value can be used to check that the state was correctly applied to the tracer.*

Remember that the state is shared among all tests of a scenario.
You need to reset and apply it at the start of each setup, as in the sketch below.
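
For example, a setup method can reset the shared state, add the configuration it needs, and apply it in one chain (a minimal sketch; the config path and payload are placeholders):

```python
from utils import remote_config

rc_state = remote_config.rc_state


class Test_MyFeature:
    def setup_my_case(self):
        # start from a clean shared state, add this test's config, then push it to the tracer
        self.config_states = (
            rc_state.reset()
            .set_config("datadog/2/ASM_DATA-base/ASM_DATA-base/config", {"rules_data": []})
            .apply()
        )
```
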

## Sending command
## Sending states

### Example

Here is an example of a scenario activating/deactivating ASM:

1. the library starts in an initial state where ASM is disabled. This state is validated with an assertion on a request containing an attack: the request should not be caught by ASM
2. Then a RC command is sent to activate ASM
2. Then the RC state is sent to activate ASM
3. another request containing an attack is sent, this one must be reported by ASM
4. A second command is sent to deactivate ASM
4. The state is modified and sent to deactivate ASM
5. a third request containing an attack is sent; this last one should not be seen


@@ -57,6 +70,7 @@ Here is the test code performing that test. Please note variables `activate_ASM_
```python
from utils import weblog, interfaces, scenarios, remote_config

rc_state = remote_config.rc_state

@scenarios.asm_deactivated # in this scenario, ASM is deactivated
class Test_RemoteConfigSequence:
@@ -67,17 +81,19 @@ class Test_RemoteConfigSequence:
self.first_request = weblog.get("/waf/", headers={"User-Agent": "Arachni/v1"})

# this function will send a RC payload to the library, and wait for a confirmation from the library
self.config_states_activation = activate_ASM_command.send()
self.config_states_activation = rc_state.set_config(path, asm_enabled).apply()
self.second_request = weblog.get("/waf/", headers={"User-Agent": "Arachni/v1"})

# now deactivate the WAF, and check that it does not catch anything
self.config_states_deactivation = deactivate_ASM_command.send()
# now deactivate the WAF by deleting the RC file, and check that it does not catch anything
self.config_states_deactivation = rc_state.del_config(path).apply()
self.third_request = weblog.get("/waf/", headers={"User-Agent": "Arachni/v1"})

def test_asm_switch_on_switch_off(self):
# first check that both config state are ok, otherwise, next assertions will fail with cryptic messages
assert self.config_states_activation[remote_config.RC_STATE] == remote_config.ApplyState.ACKNOWLEDGED
assert self.config_states_deactivation[remote_config.RC_STATE] == remote_config.ApplyState.ACKNOWLEDGED
# for non empty config, you can also check for details of files
assert self.config_states_activation["asm_features_activation"]["apply_state"] == remote_config.ApplyState.ACKNOWLEDGED, self.config_states_activation
assert self.config_states_deactivation["asm_features_activation"]["apply_state"] == remote_config.ApplyState.ACKNOWLEDGED, self.config_states_deactivation

interfaces.library.assert_no_appsec_event(self.first_request)
interfaces.library.assert_waf_attack(self.second_request)
@@ -88,7 +104,7 @@ To use this feature, you must use an `EndToEndScenario` with `rc_api_enabled=Tru

### API

#### `send_command(raw_payload, *, wait_for_acknowledged_status: bool = True) -> dict[str, dict[str, Any]]`
#### `send_state(raw_payload, *, wait_for_acknowledged_status: bool = True) -> dict[str, dict[str, Any]]`

Sends a remote config payload to the library and waits for the config to be applied.
Then returns a dictionary with the state of each requested file as returned by the library.
2 changes: 1 addition & 1 deletion docs/execute/requirements.md
@@ -7,7 +7,7 @@ You'll need a bash based terminal, docker and python3.12
```
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install python3.12 python3.12-distutils python3.12-venv
sudo apt install python3.12 python3.12-distutils python3.12-venv python3.12-dev
curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
python3.12 get-pip.py
./build.sh -i runner
10 changes: 9 additions & 1 deletion docs/execute/troubleshooting.md
@@ -39,4 +39,12 @@ sudo chown -R $(whoami) ~/.docker

## NodeJs weblog experiencing segfaults on Mac/Intel

In the Docker dashboard, under Settings > General, untick `Use Virtualization Framework`. See this [Stack overflow thread](https://stackoverflow.com/questions/76735062/segmentation-fault-in-node-js-application-running-in-docker).

## Parametric scenario: `GRPC recvmsg:Connection reset by peer`

The gRPC interface seems to be less stable. No solution other than retrying so far.

## Parametric scenario: `Fail to bind port`

Docker sometimes seems to keep a host port open even after the container has been removed. There is a wait-and-retry mechanism, but it may not be enough. No solution other than retrying so far.
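
To illustrate the wait-and-retry idea (a sketch only, not the framework's actual mechanism), waiting for Docker to release a host port could look like this:

```python
import socket
import time

# Illustrative sketch: poll until a host port that Docker may still be holding becomes bindable.
def wait_for_free_port(port, host="127.0.0.1", attempts=10, delay_s=1.0):
    for _ in range(attempts):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            try:
                sock.bind((host, port))
                return True  # the port is free again
            except OSError:
                time.sleep(delay_s)  # still held, wait and retry
    return False
```
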
7 changes: 7 additions & 0 deletions docs/internals/parametric-life-cycle.md
@@ -0,0 +1,7 @@
The parametric scenario is a scenario that only targets libraries. For each test it spawns a docker network, a container with the tested library behind a custom HTTP interface\*, and a container with a test agent. Those three items are removed at the end of the test.

To keep this scenario reasonably fast, it also uses the `xdist` plugin, which splits the test session across as many workers as there are cores. Here is an example with two cores:

![Output on success](../../utils/assets/parametric_infra.png?raw=true)

Note: [previously a gRPC interface](https://github.com/DataDog/system-tests/issues/1930)
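
The `--splits`/`--group` options used by the CI workflows divide the test session in the same spirit. As a hypothetical sketch (the real option handling lives in the framework's CLI and is not shown here), selecting one group out of N could be implemented like this:

```python
# Hypothetical sketch of splitting a test list into N groups and keeping one,
# similar in spirit to the --splits/--group options used in CI.
def select_group(tests, splits, group):
    """Return the tests belonging to `group` (1-based) out of `splits` groups."""
    return [t for i, t in enumerate(sorted(tests)) if i % splits == group - 1]

tests = [f"test_{i}" for i in range(10)]
print(select_group(tests, splits=4, group=1))  # ['test_0', 'test_4', 'test_8']
```
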
1 change: 1 addition & 0 deletions manifests/dotnet.yml
@@ -215,6 +215,7 @@ tests/:
Test_AppSecRequestBlocking: v2.25.0
test_runtime_activation.py:
Test_RuntimeActivation: v2.16.0
Test_RuntimeDeactivation: v2.16.0
test_shell_execution.py:
Test_ShellExecution: missing_feature
test_traces.py:
1 change: 1 addition & 0 deletions manifests/golang.yml
@@ -317,6 +317,7 @@ tests/:
Test_AppSecRequestBlocking: v1.50.0-rc.1
test_runtime_activation.py:
Test_RuntimeActivation: missing_feature
Test_RuntimeDeactivation: missing_feature
test_shell_execution.py:
Test_ShellExecution: missing_feature
test_traces.py: