
[HUDI-8371] Fixing column stats index with MDT for few scenarios #36174

Triggered via pull request on October 19, 2024 04:14
Status: Failure
Total duration: 1h 12m 31s

bot.yml

on: pull_request
validate-source (2m 45s)
Matrix: build-spark-java17
Matrix: docker-java17-test
Matrix: integration-tests
Matrix: test-hudi-hadoop-mr-and-hudi-java-client
Matrix: test-spark-java-tests
Matrix: test-spark-java11-17-java-tests
Matrix: test-spark-java11-17-scala-tests
Matrix: test-spark-java17-java-tests
Matrix: test-spark-java17-scala-tests
Matrix: test-spark-scala-tests
Matrix: validate-bundles-java11
Matrix: validate-bundles

Annotations

15 errors and 293 warnings
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
Process completed with exit code 125.
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
The job was canceled because "scala-2_13_flink1_20_spar" failed.
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
The operation was canceled.
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
The job was canceled because "scala-2_13_flink1_20_spar" failed.
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
The operation was canceled.
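Exit code 125 in the first failure above is the Docker CLI's own error code: `docker run` failed before the test command inside the container ever ran (126 and 127 would instead point at the container command itself). A small triage helper, not part of the CI scripts, sketches the distinction:

```shell
# Hypothetical triage helper for docker-based CI steps; not part of the
# repository's scripts. Maps common container exit codes to their usual causes.
explain_exit() {
  case "$1" in
    125) echo "docker run itself failed (daemon/CLI error)" ;;
    126) echo "container command found but not executable" ;;
    127) echo "container command not found" ;;
    *)   echo "command exited with status $1" ;;
  esac
}
explain_exit 125
```

With this reading, the cancelled scala-2.12 jobs are collateral: the matrix aborted once the scala-2.13 job hit the docker-level failure.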
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
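The classloader errors above come from Flink's leaked-classloader detection, and the message itself names the escape hatch. If the leak sits in a third-party dependency that cannot be patched, the check can be disabled in the Flink configuration; a config sketch (this suppresses the check, it does not fix the underlying leak):

```yaml
# flink-conf.yaml: disable Flink's closed-classloader access check.
# Only appropriate when the leak is in a third-party library.
classloader.check-leaked-classloader: false
```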
test-spark-java17-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
Cannot resolve conflicts for overlapping writes
test-spark-java11-17-java-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
Cannot resolve conflicts for overlapping writes
test-spark-java-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
Cannot resolve conflicts for overlapping writes
test-spark-java17-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
Cannot resolve conflicts for overlapping writes
test-spark-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
Cannot resolve conflicts for overlapping writes
test-spark-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
Cannot resolve conflicts for overlapping writes
test-spark-java11-17-java-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
Cannot resolve conflicts for overlapping writes
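The "Cannot resolve conflicts for overlapping writes" failures are raised by Hudi's optimistic concurrency control when two concurrent writers touch the same file group. As a hedged sketch (property names should be verified against the Hudi version under test), multi-writer runs typically carry writer-side settings like:

```properties
# Writer-side properties for Hudi optimistic concurrency control (sketch).
hoodie.write.concurrency.mode=optimistic_concurrency_control
# Defer cleaning of failed writes until conflict resolution.
hoodie.cleaner.policy.failed.writes=LAZY
# In-process lock provider suits single-JVM tests; use an external
# lock provider (e.g. ZooKeeper-based) for real multi-writer deployments.
hoodie.write.lock.provider=org.apache.hudi.client.transaction.lock.InProcessLockProvider
```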
validate-source
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
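The deprecated-Node.js warnings, repeated for nearly every job below, all point at actions/checkout@v3 and actions/setup-java@v3. A minimal sketch of the version bump in bot.yml (the `with:` values here are illustrative, not copied from the actual workflow):

```yaml
steps:
  - uses: actions/checkout@v4      # v3 pins deprecated Node 16
  - uses: actions/setup-java@v4    # v4 runs on Node 20
    with:
      distribution: temurin        # illustrative values
      java-version: '17'
```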
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
docker_test_java17.sh Building Hudi with Java 8
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
docker_test_java17.sh Building Hudi with Java 8
test-flink (flink1.18)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
test-flink (flink1.19)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
test-flink (flink1.17)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
test-flink (flink1.14)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
test-flink (flink1.16)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
build-flink-java17 (scala-2.12, flink1.20)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
test-flink (flink1.15)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
test-spark-scala-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
build-spark-java17 (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
build-spark-java17 (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
build-spark-java17 (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
test-spark-java-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
build-spark-java17 (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh validating spark & hadoop-mr bundle
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
Use default java runtime under /opt/java/openjdk
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh validating utilities slim bundle
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh running deltastreamer
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh done with deltastreamer
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh validating spark & hadoop-mr bundle
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
Use default java runtime under /opt/java/openjdk
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh validating utilities slim bundle
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh running deltastreamer
validate-bundles-java11 (scala-2.13, flink1.20, spark3.5, spark3.5.0)
validate.sh done with deltastreamer
test-hudi-hadoop-mr-and-hudi-java-client (scala-2.12, spark3.5, flink1.20)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh running deltastreamer
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh done with deltastreamer
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh running deltastreamer
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh done with deltastreamer
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh running deltastreamer
validate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
validate.sh done with deltastreamer
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh running deltastreamer
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh done with deltastreamer
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh running deltastreamer
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh done with deltastreamer
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh running deltastreamer
validate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
validate.sh done with deltastreamer
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh running deltastreamer
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh done with deltastreamer
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh running deltastreamer
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh done with deltastreamer
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh running deltastreamer
validate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
validate.sh done with deltastreamer
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh validating spark & hadoop-mr bundle
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh Query and validate the results using Spark SQL
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh Query and validate the results using HiveQL
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
Use default java runtime under /opt/java/openjdk
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh validating utilities slim bundle
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh validating spark & hadoop-mr bundle
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh Query and validate the results using Spark SQL
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh Query and validate the results using HiveQL
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
Use default java runtime under /opt/java/openjdk
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles-java11 (scala-2.12, flink1.20, spark3.5, spark3.5.0)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh Query and validate the results using Spark SQL
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh Query and validate the results using HiveQL
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh Query and validate the results using Spark SQL
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh Query and validate the results using HiveQL
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh Query and validate the results using Spark SQL
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh Query and validate the results using HiveQL
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
validate.sh Query and validate the results using Spark SQL
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
validate.sh Query and validate the results using HiveQL
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.1)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
validate.sh Query and validate the results using Spark SQL
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
validate.sh Query and validate the results using HiveQL
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
validate.sh validating utilities slim bundle
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
validate.sh validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
validate.sh Writing sample data via Spark DataSource and run Hive Sync...
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
validate.sh Query and validate the results using Spark SQL
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
validate.sh Query and validate the results using HiveQL
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
Use default java runtime under /opt/java/openjdk
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
validate.sh spark & hadoop-mr bundles validation was successful.
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
validate.sh done validating spark & hadoop-mr bundle
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
validate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
validate.sh validating utilities slim bundle
The following jobs each emitted the same warning — "The following actions use a deprecated Node.js version and will be forced to run on node20: actions/checkout@v3, actions/setup-java@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/":
integration-tests (spark3.5, spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz)
test-flink (flink1.20)
test-spark-java17-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
test-spark-java17-scala-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
test-spark-java11-17-java-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-spark-java11-17-scala-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-spark-java-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-spark-java11-17-scala-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-spark-java17-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
test-spark-java17-scala-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
test-spark-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
test-spark-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
test-spark-scala-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-spark-scala-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
test-spark-java11-17-java-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
test-spark-scala-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)