
Issue with WORKSPACE variable processing #25

Open
igor-ivanov opened this issue Mar 24, 2021 · 9 comments

@igor-ivanov

igor-ivanov commented Mar 24, 2021

This issue was originally raised here: https://github.com/Mellanox-lab/libvma-pro/pull/33#discussion_r598849608

I am using the following:

  - name: Build
    run: |
      env ${WORKSPACE}/contrib/test_jenkins.sh

and periodically observe the following failures.

For bare metal:

[2021-03-24T07:00:33.984Z] + env /home/jenkins/agent/workspace/LIBVMA-PRO/contrib/test_jenkins.sh
[2021-03-24T07:00:33.984Z] env: /home/jenkins/agent/workspace/LIBVMA-PRO/contrib/test_jenkins.sh: No such file or directory

and for container:

[2021-03-24T07:01:18.427Z] + env /scrap/jenkins/workspace/LIBVMA-PRO/contrib/test_jenkins.sh
[2021-03-24T07:01:18.427Z] env: ‘/scrap/jenkins/workspace/LIBVMA-PRO/contrib/test_jenkins.sh’: No such file or directory

For example:
http://hpc-master.lab.mtl.com:8080/blue/organizations/jenkins/LIBVMA-PRO/detail/LIBVMA-PRO/76/pipeline/885
default/x86_64/fc31/1
and
default/x86_64/r-aa-zorro014/1
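
A guard along these lines (a sketch, not part of the original step; the fallback to PWD is an assumption) would at least surface the mismatch in the log instead of failing with "No such file or directory":

  - name: Build
    run: |
      # Sketch: detect a WORKSPACE that does not match this node's checkout
      # and fall back to the step's actual working directory.
      if [ ! -x "${WORKSPACE}/contrib/test_jenkins.sh" ]; then
          echo "script not found under WORKSPACE=${WORKSPACE}; using PWD=${PWD}" >&2
          WORKSPACE="${PWD}"
      fi
      "${WORKSPACE}/contrib/test_jenkins.sh"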

@igor-ivanov
Author

This issue is becoming critical, as it appears regularly:
http://hpc-master.lab.mtl.com:8080/blue/organizations/jenkins/LIBVMA-PRO/detail/LIBVMA-PRO/81/pipeline/1369
See extra/x86_64/r-aa-zorro014/2.

@mike-dubman
Collaborator

What is the issue? I see the workspaces are fine and they differ:
http://hpc-master.lab.mtl.com:8080/job/LIBVMA-PRO/81/ws/

@igor-ivanov
Author

igor-ivanov commented Mar 26, 2021

Is it possible that the CI uses WORKSPACE as a special variable and overwrites mine?

An example of the issue; note abs_path vs. WORKSPACE:

output:

[2021-03-26T11:26:52.581Z] # rel_path ----------------->  ./contrib    
[2021-03-26T11:26:52.581Z] # abs_path ----------------->  /scrap/jenkins/workspace/LIBVMA-PRO/contrib       
[2021-03-26T11:26:52.581Z] 
[2021-03-26T11:26:52.581Z] # WORKSPACE ---------------->  /home/jenkins/agent/workspace/LIBVMA-PRO    
[2021-03-26T11:26:52.581Z] # BUILD_NUMBER ------------->  98 
[2021-03-26T11:26:52.581Z] # TARGET ------------------->  default      

link: http://hpc-master.lab.mtl.com:8080/blue/organizations/jenkins/LIBVMA-PRO/detail/LIBVMA-PRO/98/pipeline/918

The real script, contrib/test_jenkins.sh:

# Resolve the directory this script resides in
rel_path=$(dirname "$0")
abs_path=$(readlink -f "$rel_path")

echo
echo "# rel_path ----------------->  ${rel_path}    "
echo "# abs_path ----------------->  ${abs_path}       "
echo

source ${abs_path}/jenkins_tests/globals.sh

echo
echo "# WORKSPACE ---------------->  ${WORKSPACE}    "
echo "# BUILD_NUMBER ------------->  ${BUILD_NUMBER} "
echo "# TARGET ------------------->  ${TARGET}       "
echo

where jenkins_tests/globals.sh contains:

#!/bin/bash

# ':=' assigns the default only when the variable is unset or empty, so a
# WORKSPACE already exported by Jenkins takes precedence over $abs_path.
WORKSPACE=${WORKSPACE:=$abs_path}
WORKSPACE=${WORKSPACE:=$(pwd)}
BUILD_NUMBER=${BUILD_NUMBER:=0}
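
This is why globals.sh cannot repair the value: ':=' only assigns when the variable is unset or empty, and Jenkins has already exported WORKSPACE before the script runs. A quick shell demonstration (paths are illustrative):

$ abs_path=/tmp/demo
$ unset WORKSPACE
$ WORKSPACE=${WORKSPACE:=$abs_path}; echo "$WORKSPACE"
/tmp/demo
$ export WORKSPACE=/home/jenkins/agent/workspace/LIBVMA-PRO
$ WORKSPACE=${WORKSPACE:=$abs_path}; echo "$WORKSPACE"
/home/jenkins/agent/workspace/LIBVMA-PRO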

@igor-ivanov
Author

This issue appears in 4 out of 5 runs.

@igor-ivanov
Author

igor-ivanov commented Mar 29, 2021

http://hpc-master.lab.mtl.com:8080/blue/organizations/jenkins/LIBVMA-PRO/detail/LIBVMA-PRO/105/pipeline/941/

For example, on bare metal (r-aa-zorro014) the pipeline.log output shows
PWD=/scrap/jenkins/workspace/LIBVMA-PRO
WORKSPACE=/home/jenkins/agent/workspace/LIBVMA-PRO
before running contrib/test_jenkins.sh:

[Pipeline] unstash
[Pipeline] pwd
[Pipeline] sh
[2021-03-31T12:24:58.649Z] pwd=/scrap/jenkins/workspace/LIBVMA-PRO -- ws=/home/jenkins/agent/workspace/LIBVMA-PRO
[2021-03-31T12:24:58.649Z] XXX found
[2021-03-31T12:24:58.876Z] + set -eE
[2021-03-31T12:24:58.876Z] + bash -c 'shopt -s dotglob; rm -rf /home/jenkins/agent/workspace/LIBVMA-PRO/*'
[2021-03-31T12:24:58.919Z] ======================================================
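
The pwd/ws comparison in this log can be reproduced from any Run step with a short diagnostic (a sketch; the step name is illustrative):

  - name: Debug workspace
    run: |
      # Print both views of the workspace; on affected nodes they diverge.
      echo "pwd=${PWD} -- ws=${WORKSPACE}"
      [ "${PWD}" = "${WORKSPACE}" ] || echo "WARNING: PWD and WORKSPACE differ" >&2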

@igor-ivanov
Author

There is a workaround that hides this Jenkins issue: set WORKSPACE=$PWD explicitly in the Run step. However, this workaround cannot be applied to the BlackDuck step. As a result:

[2021-04-06T09:18:51.917Z] + set +x

[2021-04-06T09:18:51.917Z] Cloning into '/home/jenkins/agent/workspace/LIBVMA-PRO/blackduck'...
[2021-04-06T09:18:51.917Z] /home/jenkins/agent/workspace/LIBVMA-PRO/blackduck /home/jenkins/agent/workspace/LIBVMA-PRO
[2021-04-06T09:18:51.917Z] INFO:(run_bd_scan.sh) Using JAVA: /usr/bin/java
[2021-04-06T09:18:52.173Z] INFO:(run_bd_scan.sh) JAVA Version: 1.8.0_282
[2021-04-06T09:18:52.173Z] Required parameters value:
[2021-04-06T09:18:52.173Z] ====================================================
[2021-04-06T09:18:52.173Z] SPRING_APPLICATION_JSON = {"blackduck.url":"https://blackduck.mellanox.com/","blackduck.api.token":"ODMwOWYwMzEtODA2ZC00MzBjLWI1ZDEtNmFiMjBkYzQzMzkwOjNmNjExN2M1LWE2ZmEtNDZlYS1hZjRiLTZlNDgwNjAwOTVjNw=="}
[2021-04-06T09:18:52.173Z] PROJECT_NAME            = libvma
[2021-04-06T09:18:52.173Z] PROJECT_VERSION         = 0.1.0
[2021-04-06T09:18:52.173Z] PROJECT_SRC_PATH        = /scrap/jenkins/workspace/LIBVMA-PRO/src
[2021-04-06T09:18:52.173Z] ----------------------------------------------------
[2021-04-06T09:18:52.173Z] INFO:(run_bd_scan.sh) Running: source scan
[2021-04-06T09:18:52.173Z] INFO:(run_bd_scan.sh) Dry Run: false
[2021-04-06T09:18:52.173Z] ERROR:(run_bd_scan.sh) Source scan failed. PROJECT_SRC_PATH should be directory

Cloning is done into the right path, /home/jenkins/agent/workspace/LIBVMA-PRO/blackduck, but PROJECT_SRC_PATH is set to /scrap/jenkins/workspace/LIBVMA-PRO/src.
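
For reference, the workaround for a regular Run step looks roughly like this (a sketch; as noted above, the BlackDuck step offers no equivalent place to inject it):

  - name: Build
    run: |
      # Workaround: pin WORKSPACE to the directory the step actually runs in,
      # so test_jenkins.sh and globals.sh resolve paths under the right tree.
      export WORKSPACE="${PWD}"
      "${WORKSPACE}/contrib/test_jenkins.sh"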

@mike-dubman
Collaborator

Added a fix as we discussed offline (moved the workspace variable calculation under the stage() context). Please let me know if it helped.


@lennybe
Contributor

lennybe commented Apr 20, 2021

I also faced this issue in a similar scenario, where I had Docker images and physical server usage in the same matrix file (see the sketch below). In my case the Docker usage was not necessary, so after I removed the Docker entries and was left with the server only, the issue was gone.
@mike, when we use agentSelector, WORKSPACE comes from the Jenkins configuration, not from the Docker image or the ci-demo user.
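
For illustration, a matrix that mixes both agent kinds might look roughly like this (a sketch; the runs_on_dockers/runs_on_agents keys and all values here are assumptions about the ci-demo matrix format, not copied from the project):

runs_on_dockers:
  - { name: 'fc31', url: 'harbor.mellanox.com/hpcx/x86_64/fc31' }   # container agent
runs_on_agents:
  - { nodeLabel: 'r-aa-zorro014' }                                  # bare-metal agent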
