Revert "Added content for Creating a custom integration test for upst…
Browse files Browse the repository at this point in the history
…ream"

This reverts commit f49b125.
gtrivedi88 committed May 15, 2024
1 parent 2452079 commit 9c5589d
Showing 2 changed files with 2 additions and 185 deletions.
@@ -27,11 +27,7 @@ NOTE: By default, all integration test scenarios are mandatory and must pass. A

. Select *Add integration test*.

. Start a new build for any component you want to test.

.. For components using the default build pipeline, go to the *Components* tab, select the three dots next to the name of the component, and select *Start new build*.

.. For components with an upgraded build pipeline, make a commit to their GitHub repository.
. To start a new build of a component, either open a pull request (PR) that targets the component's tracked branch in the GitHub repository, or comment '/retest' on an existing PR.

.Verification

181 changes: 1 addition & 180 deletions docs/modules/ROOT/pages/how-tos/testing/integration/creating.adoc
@@ -1,180 +1 @@
= Creating a custom integration test

In {ProductName}, you can create your own integration tests to run on all components of a given application before they are deployed.

.Procedure

To create any custom test, complete the following steps:

. In your preferred IDE, create a Tekton pipeline in a `.yaml` file.

. Within that pipeline, create tasks, which define the actual steps of the test that {ProductName} executes against images before deploying them.

. Commit the `.yaml` file to a GitHub repo and add it as an integration test in {ProductName}.

.Procedure with example

To create a custom test that checks that your app serves the text “Hello world!”, complete the following steps:

. In your preferred IDE, create a new `.yaml` file, with a name of your choosing.

. Define a new Tekton pipeline. The following example is the beginning of a pipeline that uses `curl` to check that the app serves the text “Hello world!”.

+
Example pipeline file:

+
[source]
----
kind: Pipeline
apiVersion: tekton.dev/v1beta1
metadata:
  name: example-pipeline
spec:
  params:
    - description: 'Snapshot of the application'
      name: SNAPSHOT
      default: '{"components": [{"name":"test-app", "containerImage": "quay.io/example/repo:latest"}]}'
      type: string
  tasks:
----

. In the pipeline's `.spec.tasks` path, declare a new task.

+
Example task declaration:

+
[source]
----
tasks:
  - name: task-1
    description: Placeholder task that prints the Snapshot and outputs standard TEST_OUTPUT
    params:
      - name: SNAPSHOT
        value: $(params.SNAPSHOT)
    taskSpec:
      params:
        - name: SNAPSHOT
      results:
        - name: TEST_OUTPUT
          description: Test output
      steps:
        - image: registry.redhat.io/openshift4/ose-cli:latest
          env:
            - name: SNAPSHOT
              value: $(params.SNAPSHOT)
          script: |
            dnf -y install jq
            echo -e "Example test task for the Snapshot:\n ${SNAPSHOT}"
            # Run custom tests for the given Snapshot here
            # After the tests finish, record the overall result in the RESULT variable
            RESULT="SUCCESS"
            # Output the standardized TEST_OUTPUT result in JSON form
            TEST_OUTPUT=$(jq -rc --arg date $(date +%s) --arg RESULT "${RESULT}" --null-input \
              '{result: $RESULT, timestamp: $date, failures: 0, successes: 1, warnings: 0}')
            echo -n "${TEST_OUTPUT}" | tee $(results.TEST_OUTPUT.path)
----
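
+
For reference, the script's standardized TEST_OUTPUT value is a compact JSON string similar to the following. The timestamp is the Unix epoch value from `date +%s`; the number shown here is only an illustration:

+
[source]
----
{"result":"SUCCESS","timestamp":"1715774400","failures":0,"successes":1,"warnings":0}
----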

. Save the `.yaml` file.

.. If you haven’t already, commit this file to a GitHub repository that {ProductName} can access.

+
Complete example file:

+
[source]
----
kind: Pipeline
apiVersion: tekton.dev/v1beta1
metadata:
  name: example-pipeline
spec:
  params:
    - description: 'Snapshot of the application'
      name: SNAPSHOT
      default: '{"components": [{"name":"test-app", "containerImage": "quay.io/example/repo:latest"}]}'
      type: string
    - description: 'Namespace where the application is running'
      name: NAMESPACE
      default: "default"
      type: string
    - description: 'Expected output'
      name: EXPECTED_OUTPUT
      default: "Hello World!"
      type: string
  workspaces:
    - name: cluster-credentials
      optional: true
  tasks:
    - name: task-1
      description: Placeholder task that prints the Snapshot and outputs standard TEST_OUTPUT
      params:
        - name: SNAPSHOT
          value: $(params.SNAPSHOT)
      taskSpec:
        params:
          - name: SNAPSHOT
        results:
          - name: TEST_OUTPUT
            description: Test output
        steps:
          - image: registry.redhat.io/openshift4/ose-cli:latest
            env:
              - name: SNAPSHOT
                value: $(params.SNAPSHOT)
            script: |
              dnf -y install jq
              echo -e "Example test task for the Snapshot:\n ${SNAPSHOT}"
              # Run custom tests for the given Snapshot here
              # After the tests finish, record the overall result in the RESULT variable
              RESULT="SUCCESS"
              # Output the standardized TEST_OUTPUT result in JSON form
              TEST_OUTPUT=$(jq -rc --arg date $(date +%s) --arg RESULT "${RESULT}" --null-input \
                '{result: $RESULT, timestamp: $date, failures: 0, successes: 1, warnings: 0}')
              echo -n "${TEST_OUTPUT}" | tee $(results.TEST_OUTPUT.path)
----
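
+
The complete example above still runs only the placeholder task. To perform the `curl` check described at the start of this procedure, you could append a second task under `spec.tasks`. The following is a minimal sketch: the route name `test-app`, the availability of `curl` in the step image, and the HTTPS route are assumptions for illustration, so adjust them for your application:

+
[source]
----
    - name: task-2
      description: Example task that checks the application response with curl
      params:
        - name: NAMESPACE
          value: $(params.NAMESPACE)
        - name: EXPECTED_OUTPUT
          value: $(params.EXPECTED_OUTPUT)
      taskSpec:
        params:
          - name: NAMESPACE
          - name: EXPECTED_OUTPUT
        steps:
          - image: registry.redhat.io/openshift4/ose-cli:latest
            env:
              - name: NAMESPACE
                value: $(params.NAMESPACE)
              - name: EXPECTED_OUTPUT
                value: $(params.EXPECTED_OUTPUT)
            script: |
              # Look up the route of the deployed application (the route name is an assumed example)
              HOST=$(oc get route test-app -n "${NAMESPACE}" -o jsonpath='{.spec.host}')
              # Fetch the response and compare it with the expected output
              RESPONSE=$(curl -sk "https://${HOST}")
              echo "Response from https://${HOST}: ${RESPONSE}"
              if [ "${RESPONSE}" != "${EXPECTED_OUTPUT}" ]; then
                echo "Expected '${EXPECTED_OUTPUT}' but received '${RESPONSE}'" >&2
                exit 1
              fi
----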

. Add your new custom test as an integration test in {ProductName}.

.. For instructions on adding an integration test, see xref:how-to-guides/testing_applications/proc_adding_an_integration_test.adoc[Adding an integration test].

.Data injected into the PipelineRun of the integration test

When you create a custom integration test, {ProductName} automatically adds certain parameters, workspaces, and labels to the PipelineRun of the integration test. This section explains what those parameters, workspaces, and labels are, and how they can help you.

Parameters:

* *`SNAPSHOT`*: contains the xref:../../glossary/index.adoc#_snapshot[snapshot] of the whole application as a JSON string. This JSON string provides useful information about the test, such as which components {ProductName} is testing, and which git repository and commit {ProductName} uses to build those components. For more information, see link:https://github.com/konflux-ci/integration-examples/blob/main/examples/snapshot_json_string_example[an example snapshot JSON string]. A short sketch of parsing this parameter in a task script follows.
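
A task script can parse this JSON string to find the components under test. The following is a minimal sketch, assuming `jq` is installed in the step image as in the examples above and that you only need the first component:

[source]
----
# Read the first component's name and container image from the SNAPSHOT parameter
COMPONENT_NAME=$(echo "${SNAPSHOT}" | jq -r '.components[0].name')
COMPONENT_IMAGE=$(echo "${SNAPSHOT}" | jq -r '.components[0].containerImage')
echo "Testing component ${COMPONENT_NAME} with image ${COMPONENT_IMAGE}"
----
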
Labels (an example of filtering pipeline runs by label follows this list):

* *`appstudio.openshift.io/application`*: contains the name of the application.
* *`appstudio.openshift.io/component`*: contains the name of the component.
* *`appstudio.openshift.io/snapshot`*: contains the name of the snapshot.
* *`test.appstudio.openshift.io/optional`*: contains the optional flag, which specifies whether components must pass the integration test before deployment.
* *`test.appstudio.openshift.io/scenario`*: contains the name of the integration test (this label ends with "scenario," because each test is technically a custom resource called an `IntegrationTestScenario`).
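
Because {ProductName} applies these labels to each integration test PipelineRun, you can use standard label selectors to filter runs. A minimal sketch, assuming the hypothetical application name `my-app`, the hypothetical test name `my-test`, and access to the namespace where the runs execute:

[source]
----
# List integration test pipeline runs for one application (my-app is a placeholder name)
oc get pipelineruns -l appstudio.openshift.io/application=my-app

# Narrow the list to runs of a specific integration test scenario (my-test is a placeholder name)
oc get pipelineruns -l test.appstudio.openshift.io/scenario=my-test
----
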
.Verification

After adding the integration test to an application, trigger a new build of its components so that {ProductName} runs the test. To trigger a new build, make a commit to the GitHub repository of each component.

When the new build is finished, complete the following steps in the {ProductName} console:

. Go to the *Integration tests* tab and select the highlighted name of your test.

. Go to the *Pipeline runs* tab of that test and select the most recent run.

. On the *Details* page, check whether the test succeeded for that component. Select the other tabs to view more details.

.. If you used our example script, switch to the *Logs* tab and verify that the test printed “Hello world!”.
= Creating a custom integration test
