Add a CI job to build our container images #282

Merged: 10 commits merged on Dec 21, 2023
16 changes: 0 additions & 16 deletions .github/workflows/add-issue-to-project.yml

This file was deleted.

146 changes: 146 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,146 @@
name: "CI"
on:
  push:
    tags:
      - "[0-9]+.[0-9]+.[0-9]+"
      - "[0-9]+.[0-9]+.[0-9]+-rc[0-9]+"
    branches: [main]
  pull_request:
  workflow_dispatch: # Manually
env:
  REGISTRY: ghcr.io/noaa-gsl/vxingest

jobs:
  build-ingest:
    name: Build Ingest image
    runs-on: ubuntu-latest
    permissions:
      packages: write
    steps:
      - uses: actions/checkout@v4
      - name: Generate image metadata
        uses: docker/metadata-action@v4
        id: meta
        with:
          images: |
            ghcr.io/noaa-gsl/vxingest/ingest
          tags: |
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=sha
            # set latest tag for default branch
            type=raw,value=latest,enable={{is_default_branch}}
          labels: |
            org.opencontainers.image.vendor=NOAA's Global Systems Laboratory
      - uses: docker/setup-qemu-action@v3
      - uses: docker/setup-buildx-action@v3
      - name: Login to GHCR
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Build image and push
        uses: docker/build-push-action@v5
        with:
          file: docker/ingest/Dockerfile
          context: .
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          platforms: linux/amd64,linux/arm64
          push: true
          # Use a cache to speed up builds
          # Note: this may cause security issues if the apt updates are cached
          # It may make more sense to build eccodes as its own image instead.
          cache-from: type=registry,ref=ghcr.io/noaa-gsl/vxingest/cache/ingest:buildcache
          cache-to: type=registry,ref=ghcr.io/noaa-gsl/vxingest/cache/ingest:buildcache,mode=max
  build-import:
    name: Build Import image
    runs-on: ubuntu-latest
    permissions:
      packages: write
    steps:
      - uses: actions/checkout@v4
      - name: Generate image metadata
        uses: docker/metadata-action@v4
        id: meta
        with:
          images: |
            ghcr.io/noaa-gsl/vxingest/import
          tags: |
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=sha
            # set latest tag for default branch
            type=raw,value=latest,enable={{is_default_branch}}
          labels: |
            org.opencontainers.image.vendor=NOAA's Global Systems Laboratory
      - uses: docker/setup-qemu-action@v3
      - uses: docker/setup-buildx-action@v3
      - name: Login to GHCR
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Build image and push
        uses: docker/build-push-action@v5
        with:
          file: docker/import/Dockerfile
          context: .
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          platforms: linux/amd64,linux/arm64
          push: true
  scan-ingest:
    runs-on: ubuntu-latest
    needs: build-ingest
    steps:
      - name: Extract SHORT_SHA
        run: echo "SHORT_SHA=${GITHUB_SHA::7}" >> "$GITHUB_ENV"
      - name: Scan image with Trivy
        uses: aquasecurity/[email protected]
        with:
          image-ref: "ghcr.io/noaa-gsl/vxingest/ingest:sha-${{ env.SHORT_SHA }}"
          format: "sarif"
          output: "trivy-ingest-results.sarif"
          ignore-unfixed: true
          severity: "CRITICAL,HIGH"
          limit-severities-for-sarif: true
          exit-code: "1"
        env:
          TRIVY_USERNAME: ${{ github.actor }}
          TRIVY_PASSWORD: ${{ secrets.GITHUB_TOKEN }}
      - name: Upload Trivy scan results to GitHub Security tab
        if: always()
        uses: github/codeql-action/upload-sarif@v2
        with:
          sarif_file: "trivy-ingest-results.sarif"
  scan-import:
    runs-on: ubuntu-latest
    needs: build-import
    steps:
      - name: Extract SHORT_SHA
        run: echo "SHORT_SHA=${GITHUB_SHA::7}" >> "$GITHUB_ENV"
      - name: Scan image with Trivy
        uses: aquasecurity/[email protected]
        with:
          image-ref: "ghcr.io/noaa-gsl/vxingest/import:sha-${{ env.SHORT_SHA }}"
          format: "sarif"
          output: "trivy-import-results.sarif"
          ignore-unfixed: true
          severity: "CRITICAL,HIGH"
          limit-severities-for-sarif: true
          # exit-code: "1" # FIXME: allow failures for now. Couchbase needs to update cbtools
        env:
          TRIVY_USERNAME: ${{ github.actor }}
          TRIVY_PASSWORD: ${{ secrets.GITHUB_TOKEN }}
      - name: Upload Trivy scan results to GitHub Security tab
        if: always()
        uses: github/codeql-action/upload-sarif@v2
        with:
          sarif_file: "trivy-import-results.sarif"
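The two scan jobs look up the image just pushed by the build jobs using a short SHA tag, derived with bash's `${GITHUB_SHA::7}` substring expansion. A minimal local sketch of that step (the commit SHA below is a made-up example value; GitHub Actions normally sets `GITHUB_SHA` for you):

```shell
#!/usr/bin/env bash
# Mimic the workflow's "Extract SHORT_SHA" step outside of Actions.
GITHUB_SHA="a1b2c3d4e5f67890a1b2c3d4e5f67890a1b2c3d4"  # example value, not a real commit
SHORT_SHA="${GITHUB_SHA::7}"                           # first 7 characters of the full SHA
# This matches the "type=sha" tag that docker/metadata-action produces (sha-<short sha>)
echo "image-ref: ghcr.io/noaa-gsl/vxingest/ingest:sha-${SHORT_SHA}"
```

This works because `docker/metadata-action`'s `type=sha` tag defaults to the same 7-character short SHA, so the scan job can reconstruct the tag without passing outputs between jobs.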
8 changes: 6 additions & 2 deletions docker/import/Dockerfile
@@ -30,9 +30,13 @@ RUN groupadd --gid ${ID} amb-verif && \
 
 WORKDIR /app
 
+# Install the couchbase tools
+ENV CB_VERSION=7.2.2
+RUN curl -L https://packages.couchbase.com/releases/${CB_VERSION}/couchbase-server-tools_${CB_VERSION}-linux_$(uname -m).tar.gz | tar xz --directory /usr/local
+
 # Copy the scripts and metadata dirs so the import script can run
-COPY ./scripts/ /app/
-COPY ./mats_metadata_and_indexes /app/
+COPY ./scripts /app/scripts
+COPY ./mats_metadata_and_indexes /app/mats_metadata_and_indexes
 
 # TODO - install the cbtools directly and remove from the git repo
 # See: https://docs.couchbase.com/cloud/reference/command-line-tools.html#download-and-install-the-couchbase-command-line-tools
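The new `RUN` line builds the download URL from the pinned version and the build machine's architecture, so the same Dockerfile works for both `linux/amd64` and `linux/arm64`. A sketch that reproduces just the URL construction (no download is performed):

```shell
#!/usr/bin/env bash
# Reproduce the couchbase-server-tools URL the Dockerfile assembles.
CB_VERSION=7.2.2
ARCH="$(uname -m)"   # e.g. x86_64 on amd64 hosts, aarch64 on arm64 hosts
URL="https://packages.couchbase.com/releases/${CB_VERSION}/couchbase-server-tools_${CB_VERSION}-linux_${ARCH}.tar.gz"
echo "${URL}"
```

Because `uname -m` runs inside the container at build time, QEMU emulation (set up by `docker/setup-qemu-action`) makes the multi-platform build pick the right tarball for each target.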
2 changes: 1 addition & 1 deletion scripts/VXingest_utilities/import_docs.sh
@@ -98,7 +98,7 @@ do_import() {
   sleep 10
   cat ${file_list} | while read f; do
     echo "cbimport json --cluster couchbase://${host} --bucket ${bucket} --scope-collection-exp ${scope}.${collection} --username ${user} --password ${pwd} --format list --generate-key %id% --dataset file:///${f}"
-    ${curdir}/scripts/cbtools/bin/cbimport json --cluster couchbase://${host} --bucket ${bucket} --scope-collection-exp ${scope}.${collection} --username ${user} --password ${pwd} --format list --generate-key %id% --dataset file:///${f}
+    cbimport json --cluster couchbase://${host} --bucket ${bucket} --scope-collection-exp ${scope}.${collection} --username ${user} --password ${pwd} --format list --generate-key %id% --dataset file:///${f}
   done
 }
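With cbtools now installed under `/usr/local` by the Dockerfile, the script can call `cbimport` from `PATH` instead of a vendored `${curdir}/scripts/cbtools/bin` copy. A dry-run sketch of the loop's shape, with made-up host, bucket, and file names (it only echoes the commands, since cbimport and a live cluster are not assumed here):

```shell
#!/usr/bin/env bash
# Dry-run sketch of do_import's per-file loop: print one cbimport command
# per entry in a file list, without contacting a cluster.
file_list="$(mktemp)"
printf '%s\n' /tmp/docs1.json /tmp/docs2.json > "${file_list}"   # placeholder files
host=cb.example.com; bucket=vxdata; scope=_default; collection=METAR  # placeholder values
user=avid; pwd=secret                                                 # placeholder credentials
while read -r f; do
  echo "cbimport json --cluster couchbase://${host} --bucket ${bucket} --scope-collection-exp ${scope}.${collection} --username ${user} --password ${pwd} --format list --generate-key %id% --dataset file:///${f}"
done < "${file_list}"
```

Note that `file:///${f}` with an absolute `${f}` yields four slashes (`file:////tmp/...`), matching the command the original script echoes before running.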

Binary file removed scripts/VXingest_utilities/promql