HPCC4J-673 Fix Poms and directory structure to include spark-hpcc as a module of the hpcc4j project #780
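For context, making spark-hpcc a module of the hpcc4j build normally means listing it in the parent POM's <modules> section. The sketch below is illustrative only; the sibling module names are assumptions and may not match the PR's actual directory layout.

    <!-- Hypothetical excerpt of the parent hpcc4j pom.xml; the sibling
         module names are assumed, only spark-hpcc comes from this PR. -->
    <modules>
        <module>commons-hpcc</module>
        <module>wsclient</module>
        <module>dfsclient</module>
        <module>spark-hpcc</module> <!-- newly added by this change -->
    </modules>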
Conversation
Michael-Gardner force-pushed from 6fa9fc3 to d78ccdd
Signed-off-by: Michael Gardner <[email protected]>
@Michael-Gardner Just one minor question regarding the Scala-related version variables in the top-level POM. Other than that it looks good to go; please merge.
pom.xml (Outdated)

    @@ -65,6 +67,8 @@
     <opentelemetry.bom.version>1.38.0</opentelemetry.bom.version>
     <opentelemetry.semconv.version>1.25.0-alpha</opentelemetry.semconv.version>
     <opentelemetry.instrumentation.annotations.version>2.6.0</opentelemetry.instrumentation.annotations.version>
    +<scala.binary.version>2.11</scala.binary.version>
I think the Scala version variables are spark-hpcc specific.
Moved both of those Scala references back over to the spark-hpcc POM. Merging.
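For illustration, the move described above might look roughly like this in spark-hpcc/pom.xml. Only scala.binary.version appears in the diff above; the name and value of the second property are assumptions.

    <!-- Hypothetical excerpt of spark-hpcc/pom.xml after moving the Scala
         properties out of the top-level POM. scala.version is an assumed
         name/value for the second of the "both" references mentioned. -->
    <properties>
        <scala.binary.version>2.11</scala.binary.version>
        <scala.version>2.11.12</scala.version>
    </properties>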
@Michael-Gardner looks good to me
@Michael-Gardner It seems like the Javadoc workflow is running the Spark tests. I think this is because we have a profile for remote tests on the HPCC4j side but not the Spark side. It seems like we can simply add -Dmaven.test.skip=true to the Maven command in workflows/javadocTest.yml.
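A minimal sketch of that suggestion, assuming a typical GitHub Actions step in workflows/javadocTest.yml; the step name and Maven goal are placeholders, and only the added flag comes from the comment above.

    # Hypothetical step in .github/workflows/javadocTest.yml; only the
    # -Dmaven.test.skip=true flag is taken from the suggestion above.
    - name: Build Javadocs
      run: mvn javadoc:javadoc -Dmaven.test.skip=true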
Michael-Gardner force-pushed from 372b130 to 3796948
Signed-off-by: Michael Gardner <[email protected]>
Jirabot Action Result:
Type of change:
Checklist:
Testing: