Modifications for Complete Pipeline with Github Actions #554

Open · wants to merge 38 commits into base: master
Commits (38) · Changes from all commits
1242d2e  modifcations complete pipeline (cam19kab, Nov 30, 2023)
e3948da  Merge pull request #1 from cam19kab/pipeline_complete_modification_gi… (cam19kab, Nov 30, 2023)
f7ce0e7  Add modifications complete pipeline yml and cfg (cam19kab, Dec 1, 2023)
84b70ca  Merge pull request #2 from cam19kab/pipeline_complete_modification_gi… (cam19kab, Dec 1, 2023)
69850f0  modifications complete-pipeline common node-build (cam19kab, Dec 1, 2023)
a27f567  Merge pull request #3 from cam19kab/pipeline_complete_modification_gi… (cam19kab, Dec 1, 2023)
287aec0  new changes file configurations (cam19kab, Dec 1, 2023)
780f4a0  Merge pull request #4 from cam19kab/pipeline_complete_modification_gi… (cam19kab, Dec 1, 2023)
5e4fb3a  changes complete cfg and file sh (cam19kab, Dec 4, 2023)
c361e27  Merge pull request #5 from cam19kab/pipeline_complete_modification_gi… (cam19kab, Dec 4, 2023)
7093bcd  changes complete pipeline yml template (cam19kab, Dec 5, 2023)
a97e028  Merge pull request #6 from cam19kab/pipeline_complete_modification_gi… (cam19kab, Dec 5, 2023)
d170d1d  add modifications build-test-quality (cam19kab, Dec 5, 2023)
d4a4f3a  new changes commit quality and test (cam19kab, Dec 5, 2023)
546c2ea  changes cfg complete-pipeline (cam19kab, Dec 5, 2023)
2217634  Merge pull request #7 from cam19kab/pipeline_complete_modification_gi… (cam19kab, Dec 5, 2023)
b5a9c2f  modifications build-test-quality (cam19kab, Dec 5, 2023)
f1fa8fc  Merge pull request #8 from cam19kab/pipeline_complete_modification_gi… (cam19kab, Dec 5, 2023)
02231ef  changes configuration template (cam19kab, Dec 5, 2023)
b280876  changes configurations (cam19kab, Dec 5, 2023)
77cd50e  changes apply path and files (cam19kab, Dec 5, 2023)
9d54aa2  Merge pull request #9 from cam19kab/pipeline_complete_modification_gi… (cam19kab, Dec 5, 2023)
6a1b3df  changes scripFile cfg (cam19kab, Dec 5, 2023)
711a48c  Merge pull request #10 from cam19kab/pipeline_complete_modification_g… (cam19kab, Dec 5, 2023)
2ad03aa  changes in cfg and yml quality sonar (cam19kab, Dec 5, 2023)
5b50711  Merge pull request #11 from cam19kab/pipeline_complete_modification_g… (cam19kab, Dec 5, 2023)
6df0c66  changes yml template sonar env variables (cam19kab, Dec 5, 2023)
66e91d4  Merge pull request #12 from cam19kab/pipeline_complete_modification_g… (cam19kab, Dec 5, 2023)
dd47ab5  fix: modifications documentation and configuration (cam19kab, Dec 11, 2023)
6b78000  Merge pull request #13 from cam19kab/pipeline_complete_modification_g… (cam19kab, Dec 11, 2023)
86a74e5  fix: new modifications quarkus-quality.sh (cam19kab, Dec 11, 2023)
6f41376  Merge pull request #14 from cam19kab/pipeline_complete_modification_g… (cam19kab, Dec 11, 2023)
d78739c  fix: changes configuration and yml (cam19kab, Dec 12, 2023)
d023c56  Merge pull request #15 from cam19kab/pipeline_complete_modification_g… (cam19kab, Dec 12, 2023)
949ab61  fix: changes complete yml (cam19kab, Dec 12, 2023)
5c16346  Merge pull request #16 from cam19kab/pipeline_complete_modification_g… (cam19kab, Dec 12, 2023)
3523aae  fix: doc and pipeline yml delete comments (cam19kab, Dec 13, 2023)
876dcea  Merge pull request #17 from cam19kab/pipeline_complete_modification_g… (cam19kab, Dec 13, 2023)
74 changes: 74 additions & 0 deletions documentation/github/setup-complete-pipeline.asciidoc
@@ -0,0 +1,74 @@
:provider: GitHub
:pipeline_type: workflow
:trigger_sentence: This workflow will be configured to be executed as a job inside a CI workflow
:pipeline_type2: GitHub action
:path_provider: github
:extra_sentence_ci: Please note that this workflow, although manually triggerable, is designed to be executed as part of a CI workflow, which you can create following this xref:./setup-ci-pipeline.asciidoc[guide].
:openBrowserFlag: -w
= Setting up a Complete {pipeline_type} on {provider}

In this section we will create a complete {pipeline_type} that builds, tests and analyzes the project code. {trigger_sentence}, regardless of the branch the triggering commit is made on.

The creation of the {pipeline_type2} will follow the project workflow, so a new branch named `feature/complete-pipeline` will be created and the YAML file for the {pipeline_type} will be pushed to it.

Then, a Pull Request (PR) will be created in order to merge the new branch into the appropriate branch (provided via the `-b` flag). The PR will be automatically merged if the repository policies are met. If the merge is not possible, the PR URL will be shown as output, or the PR will be opened in your web browser if the `-w` flag is used.

The script located at `/scripts/pipelines/{path_provider}/pipeline_generator.sh` will automatically create this new branch, create a complete {pipeline_type} based on a YAML template appropriate for the project programming language or framework, create the Pull Request and, if it is possible, merge this new branch into the specified branch.
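
For orientation, the steps the script automates are roughly equivalent to the following manual sequence (a simplified, hedged sketch; branch name, paths and PR title are illustrative, not the script's literal implementation):

```
# Simplified sketch of what pipeline_generator.sh automates (illustrative only).
git checkout -b feature/complete-pipeline
cp <rendered complete-pipeline.yml> .github/workflows/complete-pipeline.yml
git add .github/workflows && git commit -m "Add complete pipeline workflow"
git push -u origin feature/complete-pipeline
gh pr create --base <target branch> --title "Add complete pipeline workflow" --body ""
gh pr merge --merge || gh pr view --web   # fall back to the browser when auto-merge is not possible
```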

{extra_sentence_ci}

== Prerequisites

This script will commit and push the corresponding YAML template into your repository, so please be sure your local repository is up to date (i.e. you have pulled the latest changes with `git pull`).

== Creating the {pipeline_type} using provided script

=== Usage
[subs=attributes+]
```
pipeline_generator.sh \
-c <config file path> \
-n <{pipeline_type} name> \
-l <language or framework> \
--sonar-url <sonarqube url> \
--sonar-token <sonarqube token> \
-d <project local path> \
[-b <branch>] \
[-w]
```

NOTE: The config file for the Complete {pipeline_type} is located at `/scripts/pipelines/{path_provider}/templates/complete/complete-pipeline.cfg`.

=== Flags
[subs=attributes+]
```
-c, --config-file               [Required] Configuration file containing {pipeline_type} definition.
-n, --pipeline-name             [Required] Name that will be set to the {pipeline_type}.
-l, --language                  [Required] Language or framework of the project.
--sonar-url                     [Required] SonarQube URL.
--sonar-token                   [Required] SonarQube token.
-d, --local-directory           [Required] Local directory of your project.
-t, --target-directory                     Target directory of the build process. Takes precedence over the language/framework default one.
-b, --target-branch                        Name of the branch to which the Pull Request will target. PR is not created if the flag is not provided.
-w                                         Open the Pull Request on the web browser if it cannot be automatically merged. Requires -b flag.
```

=== Examples

==== Quarkus project

===== Quarkus project using JVM
[subs=attributes+]
```
./pipeline_generator.sh -c ./templates/complete/complete-pipeline.cfg -n quarkus-project-complete -l quarkus-jvm --sonar-url http://1.2.3.4:9000 --sonar-token 6ce6663b63fc02881c6ea4c7cBa6563b8247a04e -d C:/Users/$USERNAME/Desktop/quarkus-project -b develop {openBrowserFlag}
```

==== Node.js project
[subs=attributes+]
```
./pipeline_generator.sh -c ./templates/complete/complete-pipeline.cfg -n node-project-complete -l node --sonar-url http://1.2.3.4:9000 --sonar-token 6ce6663b63fc02881c6ea4c7cBa6563b8247a04e -d C:/Users/$USERNAME/Desktop/node-project -b develop {openBrowserFlag}
```

==== Angular project
[subs=attributes+]
```
./pipeline_generator.sh -c ./templates/complete/complete-pipeline.cfg -n angular-project-complete -l angular --sonar-url http://1.2.3.4:9000 --sonar-token 6ce6663b63fc02881c6ea4c7cBa6563b8247a04e -d C:/Users/$USERNAME/Desktop/angular-project -b develop {openBrowserFlag}
```
@@ -0,0 +1,94 @@
= Setting up a Complete {pipeline_type} on {provider}

In this section we will create a complete {pipeline_type} that builds and tests the project code and analyzes it with SonarQube. {trigger_sentence}, and publishes the build artifact it produces so that later {pipeline_type}s can consume it.

The creation of this {pipeline_type2} will follow the project workflow, so a new branch named `feature/ci-pipeline` will be created and the YAML file for the {pipeline_type} will be pushed to it.

ifndef::no-PR-or-MR[]
Then, a Pull Request (PR) will be created in order to merge the new branch into the appropriate branch (provided via the `-b` flag). The PR will be automatically merged if the repository policies are met. If the merge is not possible, the PR URL will be shown as output, or the PR will be opened in your web browser if the `-w` flag is used.

endif::[]
ifdef::no-PR-or-MR[]
Then, the new branch will be merged into the appropriate branch (provided via the `-b` flag).

endif::[]
The script located at `/scripts/pipelines/{path_provider}/pipeline_generator.sh` will automatically create this new branch, create a complete {pipeline_type} based on a YAML template appropriate for the project programming language or framework, create the Pull Request and, if it is possible, merge this new branch into the specified branch.

ifdef::extra_sentence_ci[]
{extra_sentence_ci}

endif::[]
== Prerequisites

* This script will commit and push the corresponding YAML template into your repository, so please be sure your local repository is up to date (i.e. you have pulled the latest changes with `git pull`).
* Generate a SonarQube https://docs.sonarqube.org/latest/user-guide/user-token/[token] (just follow the section "Generating a token"); a command-line alternative is sketched below.
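
If you prefer the command line, a user token can also be generated through SonarQube's Web API; a hedged sketch (the URL, credentials and token name below are placeholders for your own instance):

```
# Sketch: generate a SonarQube user token via the Web API (placeholder values).
curl -u admin:<your password> -X POST \
  "http://1.2.3.4:9000/api/user_tokens/generate?name=hangar-complete-pipeline"
```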

== Creating the {pipeline_type} using provided script

=== Usage
[subs=attributes+]
```
pipeline_generator.sh \
-c <config file path> \
-n <{pipeline_type} name> \
-l <language or framework> \
--sonar-url <sonarqube url> \
--sonar-token <sonarqube token> \
-d <project local path> \
ifdef::build-pipeline[ --build-pipeline-name <build {pipeline_type} name> \]
ifdef::test-pipeline[ --test-pipeline-name <test {pipeline_type} name> \]
ifeval::["{provider}" == "Google Cloud"]
[--test-pipeline-name <test {pipeline_type} name>] \
endif::[]
[-b <branch>] \
ifndef::no-PR-or-MR[ [-w]]
ifeval::["{provider}" == "Google Cloud"]
[-m <machine type for {pipeline_type} runner>]
[--env-vars <env vars list>]
[--secret-vars <secret vars list>]
endif::[]
```


NOTE: The config file for the complete {pipeline_type} is located at `/scripts/pipelines/{path_provider}/templates/complete/complete-pipeline.cfg`.

=== Flags
[subs=attributes+]
```
-c, --config-file [Required] Configuration file containing {pipeline_type} definition.
-n, --pipeline-name [Required] Name that will be set to the {pipeline_type}.
-l, --language [Required] Language or framework of the project.
--sonar-url [Required] SonarQube URL.
--sonar-token [Required] SonarQube token.
-d, --local-directory [Required] Local directory of your project.
ifdef::build-pipeline[ --build-pipeline-name [Required] Build {pipeline_type} name.]
ifdef::test-pipeline[ --test-pipeline-name [Required] Test {pipeline_type} name.]
ifeval::["{provider}" == "Google Cloud"]
--test-pipeline-name Test {pipeline_type} name.
endif::[]
-b, --target-branch Name of the branch to which the Pull Request will target. PR is not created if the flag is not provided.
ifndef::no-PR-or-MR[-w Open the Pull Request on the web browser if it cannot be automatically merged. Requires -b flag.]
ifeval::["{provider}" == "Google Cloud"]
-m, --machine-type Machine type for {pipeline_type} runner. Accepted values: E2_HIGHCPU_8, E2_HIGHCPU_32, N1_HIGHCPU_8, N1_HIGHCPU_32.
--env-vars List of environment variables to be made available in pipeline. Syntax: "var1=val1 var2=val2 ...".
--secret-vars List of environment variables (saved as secrets in Secret Manager) to be made available in pipeline. Syntax: "var1=val1 var2=val2 ...".
endif::[]
```

=== Examples

==== Quarkus project

[subs=attributes+]
```
./pipeline_generator.sh -c ./templates/complete/complete-pipeline.cfg -n quarkus-project-complete -l quarkus --sonar-url http://1.2.3.4:9000 --sonar-token 6ce666 -d C:/Users/$USERNAME/Desktop/quarkus-project {extra_args_quarkus} -b develop {openBrowserFlag}
```

==== Node.js project

[subs=attributes+]
```
./pipeline_generator.sh -c ./templates/complete/complete-pipeline.cfg -n node-project-complete -l node --sonar-url http://1.2.3.4:9000 --sonar-token 6ce66 -d C:/Users/$USERNAME/Desktop/node-project -b develop {openBrowserFlag}
```


8 changes: 8 additions & 0 deletions documentation/src/github/setup-complete-pipeline.asciidoc
@@ -0,0 +1,8 @@
:provider: GitHub
:pipeline_type: workflow
:trigger_sentence: This workflow will be configured to be executed as a job inside a CI workflow
:pipeline_type2: GitHub action
:path_provider: github
:extra_sentence_ci: Please note that this workflow, although manually triggerable, is designed to be executed as part of a CI workflow, which you can create following this xref:./setup-ci-pipeline.asciidoc[guide].
:openBrowserFlag: -w
include::../common_templates/setup-complete-pipeline.asciidoc[]
7 changes: 7 additions & 0 deletions scripts/pipelines/common/templates/complete/node-complete.sh
@@ -0,0 +1,7 @@
#!/bin/bash
# Install project dependencies plus the jest-junit reporter used to publish test results.
npm install
npm install jest-junit
# Run the test suite with coverage, writing a JUnit report (TEST-junit.xml) to the project root.
JEST_SUITE_NAME="jest tests" JEST_JUNIT_OUTPUT_NAME="TEST-junit.xml" JEST_JUNIT_OUTPUT_DIR="." npm run test -- --ci --coverage --reporters=default --reporters=jest-junit
# Move the LCOV coverage report to the project root so the SonarQube scanner can pick it up.
mv ./coverage/lcov.info ./lcov.info
#projectKey=$(python -c "from json import load; print(load(open('./package.json', 'r'))['name']);")
#npx sonar-scanner -Dsonar.host.url="$SONAR_URL" -Dsonar.login="$SONAR_TOKEN" -Dsonar.projectKey=$projectKey -Dsonar.javascript.lcov.reportPaths=lcov.info -Dsonar.sources="."
@@ -1,2 +1,3 @@
#!/bin/bash
ls -l $PROJECT_PATH/target/classes
mvn sonar:sonar -B -Dsonar.host.url="$SONAR_URL" -Dsonar.login="$SONAR_TOKEN" -Dsonar.java.binaries=$PROJECT_PATH/target/classes
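
The quality script above reads its SonarQube connection details and the project location from environment variables, matching the `SONAR_URL`, `SONAR_TOKEN` and `PROJECT_PATH` values set by the workflow's SonarQube analysis step. A hedged local-run sketch (assuming the script is saved as `quality.sh` and executed from the project root):

```
# Hedged sketch: run the quality script locally (placeholder values).
export SONAR_URL="http://1.2.3.4:9000"
export SONAR_TOKEN="<your sonarqube token>"
export PROJECT_PATH="."   # directory containing target/classes
./quality.sh              # assumed local file name for the script shown above
```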
54 changes: 54 additions & 0 deletions scripts/pipelines/github/templates/complete/complete-pipeline.cfg
@@ -0,0 +1,54 @@
# Mandatory flags.
mandatoryFlags="$pipelineName,$localDirectory,$language,$sonarUrl,$sonarToken,"

# Path to the templates.
templatesPath="scripts/pipelines/github/templates/complete"

# Path to the common templates folder
commonTemplatesPipelinePath=("scripts/pipelines/common/templates/build" "scripts/pipelines/common/templates/test" "scripts/pipelines/common/templates/quality")

# YAML file name.
yamlFile="complete-pipeline.yml"

# Script names.
scriptFile=("build.sh" "test.sh" "quality.sh.template")

# Source branch.
sourceBranch="feature/ci-complete"

# Function that copies the build, test and quality scripts for the application.
function copyScript {
    if [[ $language == "quarkus"* ]]
    then
        language="quarkus"
    fi
    # Copy the build, test and quality scripts.
    cp "${hangarPath}/${commonTemplatesPipelinePath[0]}/${language}-${scriptFile[0]}" "${localDirectory}/${scriptFilePath}/${scriptFile[0]}"
    cp "${hangarPath}/${commonTemplatesPipelinePath[1]}/${language}-${scriptFile[1]}" "${localDirectory}/${scriptFilePath}/${scriptFile[1]}"
    cp "${hangarPath}/${commonTemplatesPipelinePath[2]}/${language}-${scriptFile[2]}" "${localDirectory}/${scriptFilePath}/${scriptFile[2]}"
}

# Function that adds the variables to be used in the pipeline.
function addPipelineVariables {
    # If the user did not specify a custom target directory,
    # default to the language-specific one.
    if test -z "$targetDirectory"
    then
        setTargetDirectory
    fi

    # Store the SonarQube token as a repository secret for GitHub Actions.
    repoURL=$(git config --get remote.origin.url)
    repoNameWithGit="${repoURL/https:\/\/github.com\/}"
    repoName="${repoNameWithGit/.git}"
    gh secret set SONARQUBE_TOKEN -a actions -b "$sonarToken" -R "$repoName"

    # Variables substituted into the YAML template.
    export sonarUrl
    export targetDirectory
    specificEnvSubstList='${sonarUrl} ${targetDirectory}'
}
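
Since `addPipelineVariables` stores the SonarQube token as a GitHub Actions secret via the GitHub CLI, a quick way to confirm the secret exists is sketched below (assumes `gh` is authenticated for the repository; the repository slug is a placeholder):

```
# Verify that the SONARQUBE_TOKEN secret was created (placeholder repository).
gh secret list -R "<owner>/<repository>"
```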
@@ -0,0 +1,66 @@
name: $pipelineName

on:
  workflow_call:
  workflow_dispatch:
    inputs:
      sonarUrl:
        required: false
        default: $sonarUrl
      sonarToken:
        required: false
        default:
      targetPath:
        description: 'Target directory of build process.'
        required: false
        type: string
        default: '$targetDirectory'

# mark to insert additional artifact input #

env:
  targetPath: ${{ github.event.inputs.targetPath || '$targetDirectory' }}
  sonarUrl: ${{ github.event.inputs.sonarUrl || '$sonarUrl' }}
  sonarToken: ${{ github.event.inputs.sonarToken || secrets.SONARQUBE_TOKEN }}

# mark to insert additional artifact env var #

jobs:
  Build:
    name: Build
    runs-on: ubuntu-latest

    steps:
      - name: Checkout the repository
        uses: actions/checkout@v2

      - name: Build the application
        run: .github/workflows/scripts/build.sh

      - name: Archiving artifact
        run: pwd;ls -l;tar -cvf ./BuildOutput.tar -C ${{ env.targetPath }}/ .

      - name: Publish Artifact
        uses: actions/upload-artifact@v3
        with:
          name: BuildOutput
          path: ./BuildOutput.tar

      - name: Checkout repository code
        uses: actions/checkout@v2

      - name: Test
        run: .github/workflows/scripts/test.sh

      - name: Publish Test Results
        uses: EnricoMi/publish-unit-test-result-action@v1
        if: always()
        with:
          files: '**/TEST-*.xml'

      - name: SonarQube analysis
        run: .github/workflows/scripts/quality.sh.template
        env:
          SONAR_TOKEN: ${{ secrets.SONARQUBE_TOKEN }}
          SONAR_URL: ${{ env.sonarUrl }}
          PROJECT_PATH: .
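
Because the template declares `workflow_dispatch` inputs, the generated workflow can also be triggered manually once it is pushed to the repository. A hedged example with the GitHub CLI (branch and input values are placeholders; the file name comes from `yamlFile` in the configuration above):

```
# Manually dispatch the generated workflow (placeholder values).
gh workflow run complete-pipeline.yml \
  --ref develop \
  -f sonarUrl=http://1.2.3.4:9000 \
  -f targetPath=target
```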
9 changes: 6 additions & 3 deletions setup/Dockerfile
@@ -8,9 +8,6 @@ FROM base as cli_install_gcloud
WORKDIR /downloaded_assets

# Install gcloud
-RUN echo "deb [signed-by=/usr/share/keyrings/cloud.google.gpg] http://packages.cloud.google.com/apt cloud-sdk main" | tee -a /etc/apt/sources.list.d/google-cloud-sdk.list && \
-    curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | tee /usr/share/keyrings/cloud.google.gpg && \
-    apt-get update -y && apt-get install google-cloud-sdk -y

FROM cli_install_gcloud as cli_install_aws

@@ -54,6 +51,12 @@ RUN curl -Lo ./firebase_tools https://firebase.tools/bin/linux/v11.16.0 && \

FROM cli_install_firebase as run_env

+ARG KEY_ONE
+ARG KEY_TWO

+ENV ssh_prv_key=$KEY_ONE
+ENV ssh_pub_key=$KEY_TWO

WORKDIR /scripts

ADD scripts .
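
With the new `KEY_ONE` and `KEY_TWO` build arguments, the SSH key material is injected at image build time. A hedged build invocation (image tag and key paths are assumptions):

```
# Pass the SSH key pair into the setup image (placeholder tag and key paths).
docker build \
  --build-arg KEY_ONE="$(cat ~/.ssh/id_rsa)" \
  --build-arg KEY_TWO="$(cat ~/.ssh/id_rsa.pub)" \
  -t hangar-setup ./setup
# Note: the args are exposed as ENV variables in the image, so treat the built image as sensitive.
```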