SNOW-831250: Create thin jar #1586

Merged: 4 commits, Jan 10, 2024
4 changes: 3 additions & 1 deletion .github/workflows/build-test.yml
@@ -32,14 +32,15 @@ jobs:

test-linux:
needs: build
name: ${{ matrix.cloud }} JDBC ${{ matrix.category }} on ${{ matrix.image }}
name: ${{ matrix.cloud }} JDBC${{ matrix.additionalMavenProfile }} ${{ matrix.category }} on ${{ matrix.image }}
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
image: [ 'jdbc-centos7-openjdk8', 'jdbc-centos7-openjdk11', 'jdbc-centos7-openjdk17' ]
cloud: [ 'AWS' ]
category: ['TestCategoryResultSet,TestCategoryOthers,TestCategoryLoader', 'TestCategoryConnection,TestCategoryStatement', 'TestCategoryArrow,TestCategoryCore', 'TestCategoryFips']
additionalMavenProfile: ['', '-Dthin-jar']
steps:
- uses: actions/checkout@v1
- name: Tests
@@ -49,6 +50,7 @@ jobs:
CLOUD_PROVIDER: ${{ matrix.cloud }}
TARGET_DOCKER_TEST_IMAGE: ${{ matrix.image }}
JDBC_TEST_CATEGORY: ${{ matrix.category }}
ADDITIONAL_MAVEN_PROFILE: ${{ matrix.additionalMavenProfile }}
run: ./ci/test.sh

test-linux-old-driver:
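The new `additionalMavenProfile` axis doubles the test matrix: 3 images × 1 cloud × 4 category groups × 2 profile values now yield 24 `test-linux` jobs. A minimal sketch of the cross product GitHub Actions expands (this is illustrative, not part of the PR):

```java
public class MatrixExpansion {
    // Mirrors the matrix axes declared in .github/workflows/build-test.yml.
    static final String[] IMAGES = {
        "jdbc-centos7-openjdk8", "jdbc-centos7-openjdk11", "jdbc-centos7-openjdk17"
    };
    static final String[] CLOUDS = {"AWS"};
    static final String[] CATEGORIES = {
        "TestCategoryResultSet,TestCategoryOthers,TestCategoryLoader",
        "TestCategoryConnection,TestCategoryStatement",
        "TestCategoryArrow,TestCategoryCore",
        "TestCategoryFips"
    };
    static final String[] PROFILES = {"", "-Dthin-jar"};

    // Number of jobs GitHub Actions spawns for the full cross product.
    static int jobCount() {
        return IMAGES.length * CLOUDS.length * CATEGORIES.length * PROFILES.length;
    }

    public static void main(String[] args) {
        System.out.println("test-linux jobs: " + jobCount()); // 3 * 1 * 4 * 2 = 24
    }
}
```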
11 changes: 11 additions & 0 deletions README.rst
@@ -111,6 +111,17 @@ You may import the coding style from IntelliJ so that the coding style can be applied
- Enable `google-java-format` for the JDBC project.
- In the source code window, select **Code** -> **Reformat** to apply the coding style.

Thin Jar
========

To build a thin jar, run:

.. code-block:: bash

mvn clean verify -Dthin-jar -Dnot-self-contained-jar

- `thin-jar` enables the thin jar profile
- `not-self-contained-jar` turns off the fat jar profile (which is enabled by default)

Tests
=====
Expand Down
2 changes: 1 addition & 1 deletion ci/container/test_component.sh
@@ -104,7 +104,7 @@ for c in "${CATEGORY[@]}"; do
-Djacoco.skip.instrument=false \
-DtestCategory=net.snowflake.client.category.$c \
-Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn \
-Dnot-self-contained-jar \
-Dnot-self-contained-jar $ADDITIONAL_MAVEN_PROFILE \
verify \
--batch-mode --show-version
fi
4 changes: 3 additions & 1 deletion ci/scripts/check_content.sh
@@ -2,11 +2,13 @@

# script used to check that all dependencies are shaded into the snowflake internal path

package_modifier=$1

set -o pipefail

DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null && pwd )"

if jar tvf $DIR/../../target/snowflake-jdbc.jar | awk '{print $8}' | grep -v -E "^(net|com)/snowflake" | grep -v -E "(com|net)/\$" | grep -v -E "^META-INF" | grep -v -E "^mozilla" | grep -v -E "^com/sun/jna" | grep -v com/sun/ | grep -v mime.types; then
if jar tvf $DIR/../../target/snowflake-jdbc${package_modifier}.jar | awk '{print $8}' | grep -v -E "^(net|com)/snowflake" | grep -v -E "(com|net)/\$" | grep -v -E "^META-INF" | grep -v -E "^mozilla" | grep -v -E "^com/sun/jna" | grep -v com/sun/ | grep -v mime.types; then
echo "[ERROR] JDBC jar includes class not under the snowflake namespace"
exit 1
fi
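The grep pipeline above can be read as an allow-list: a jar entry passes if it is under the snowflake namespace or is known metadata; any other entry fails the build as an unshaded leak. A rough Java rendering of the same filter (a sketch for illustration, not part of the CI):

```java
public class ShadedContentCheck {
    // Approximates the grep -v chain in ci/scripts/check_content.sh: an entry is
    // acceptable if any of these patterns matches; otherwise it is flagged.
    static boolean isAllowed(String entry) {
        return entry.startsWith("net/snowflake")
            || entry.startsWith("com/snowflake")
            || entry.endsWith("com/") || entry.endsWith("net/") // bare package directories
            || entry.startsWith("META-INF")
            || entry.startsWith("mozilla")
            || entry.contains("com/sun/")
            || entry.contains("mime.types");
    }

    public static void main(String[] args) {
        // An arrow class must be relocated under the snowflake namespace to pass.
        System.out.println(isAllowed("org/apache/arrow/memory/RootAllocator.class"));     // false
        System.out.println(isAllowed("net/snowflake/client/jdbc/SnowflakeDriver.class")); // true
    }
}
```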
1 change: 1 addition & 0 deletions ci/test.sh
@@ -57,6 +57,7 @@ for name in "${!TARGET_TEST_IMAGES[@]}"; do
-e JOB_NAME \
-e BUILD_NUMBER \
-e JDBC_TEST_CATEGORY \
-e ADDITIONAL_MAVEN_PROFILE \
-e is_old_driver \
--add-host=snowflake.reg.local:${IP_ADDR} \
--add-host=s3testaccount.reg.local:${IP_ADDR} \
198 changes: 196 additions & 2 deletions pom.xml
@@ -10,18 +10,23 @@
<relativePath>./parent-pom.xml</relativePath>
</parent>

<artifactId>snowflake-jdbc</artifactId>
<!-- Maven complains about using a property here, but it makes the install and deploy process easier by allowing the final package name and location to be overridden -->
<artifactId>${artifactId}</artifactId>
<version>3.14.4</version>
<packaging>jar</packaging>

<name>snowflake-jdbc</name>
<name>${artifactId}</name>
<url>https://github.com/snowflakedb/snowflake-jdbc</url>

<scm>
<connection>scm:git:https://github.com/snowflakedb/snowflake-jdbc.git</connection>
<url>https://github.com/snowflakedb/snowflake-jdbc</url>
</scm>

<properties>
<artifactId>snowflake-jdbc</artifactId>
</properties>

<dependencies>
<dependency>
<groupId>org.bouncycastle</groupId>
@@ -578,6 +583,195 @@
</plugins>
</build>
</profile>
<profile>
<id>thin-jar</id>
<activation>
<property>
<name>thin-jar</name>
</property>
</activation>
<properties>
<artifactId>snowflake-jdbc-thin</artifactId>
</properties>
<build>
<plugins>
<plugin>
<!-- we don't want to run japicmp for thin-jar until we release it for the first time -->
<groupId>com.github.siom79.japicmp</groupId>
<artifactId>japicmp-maven-plugin</artifactId>
<executions>
<execution>
<id>japicmp</id>
<goals>
<goal>cmp</goal>
</goals>
<phase>none</phase>
</execution>
</executions>
</plugin>
<plugin>
<!-- relocate the META-INF/versions files manually due to the maven bug -->
<!-- https://issues.apache.org/jira/browse/MSHADE-406 -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>repack</id>
<goals>
<goal>run</goal>
</goals>
<phase>package</phase>
<configuration>
<target>
<unzip dest="${project.build.directory}/relocate" src="${project.build.directory}/${project.build.finalName}.jar"/>
<mkdir dir="${project.build.directory}/relocate/META-INF/versions/9/${relocationBase}"/>
<mkdir dir="${project.build.directory}/relocate/META-INF/versions/11/${relocationBase}"/>
<mkdir dir="${project.build.directory}/relocate/META-INF/versions/15/${relocationBase}"/>
<zip basedir="${project.build.directory}/relocate" destfile="${project.build.directory}/${project.build.finalName}.jar"/>
<delete dir="${project.build.directory}/relocate/META-INF/versions/9/${relocationBase}"/>
<delete dir="${project.build.directory}/relocate/META-INF/versions/11/${relocationBase}"/>
<delete dir="${project.build.directory}/relocate/META-INF/versions/15/${relocationBase}"/>
</target>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<!-- the Google linkage checker doesn't work well with shaded jars; disable the check in this case for now -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<executions>
<execution>
<id>enforce-linkage-checker</id>
<goals>
<goal>enforce</goal>
</goals>
<phase>none</phase>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<configuration/>
<executions>
<execution>
<goals>
<goal>shade</goal>
</goals>
<phase>package</phase>
<configuration>
<artifactSet>
<includes>
<include>net.snowflake:snowflake-common</include>
<include>org.apache.arrow:*</include>
<include>org.apache.tika:tika-core</include>
<include>io.netty:*</include>
</includes>
</artifactSet>
<relocations>
<!-- We list only the packages that we need to include from dependencies + snowflake-common -->
<relocation>
<pattern>net.snowflake.common</pattern>
<shadedPattern>${shadeBase}.snowflake.common</shadedPattern>
</relocation>
<relocation>
<pattern>org.apache.arrow</pattern>
<shadedPattern>${shadeBase}.apache.arrow</shadedPattern>
</relocation>
<relocation>
<pattern>org.apache.tika</pattern>
<shadedPattern>${shadeBase}.apache.tika</shadedPattern>
</relocation>
<!-- io.netty is a dependency of arrow, and arrow packages bundle some io.netty classes internally -->
<relocation>
<pattern>io.netty</pattern>
<shadedPattern>${shadeBase}.io.netty</shadedPattern>
</relocation>
</relocations>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/LICENSE*</exclude>
<exclude>META-INF/NOTICE*</exclude>
<exclude>META-INF/DEPENDENCIES</exclude>
<exclude>META-INF/maven/**</exclude>
<exclude>META-INF/services/com.fasterxml.*</exclude>
<exclude>META-INF/*.xml</exclude>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
<exclude>.netbeans_automatic_build</exclude>
<exclude>git.properties</exclude>
<exclude>arrow-git.properties</exclude>
<exclude>google-http-client.properties</exclude>
<exclude>storage.v1.json</exclude>
<!-- This is just a documentation file, not needed -->
<exclude>pipes-fork-server-default-log4j2.xml</exclude>
<exclude>dependencies.properties</exclude>
</excludes>
</filter>
<filter>
<artifact>org.apache.arrow:arrow-vector</artifact>
<excludes>
<!-- The codegen directory is used to generate Java code for the arrow-vector package. Exclude it since we only need the class files -->
<exclude>codegen/**</exclude>
</excludes>
</filter>
</filters>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"/>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>buildnumber-maven-plugin</artifactId>
<configuration>
<timestampFormat>yyyyMMddHHmmss</timestampFormat>
<timestampPropertyName>buildNumber.timestamp</timestampPropertyName>
<doCheck>false</doCheck>
<revisionOnScmFailure/>
<doUpdate>false</doUpdate>
<!-- Note for those who come later: if you specify "buildNumber" in the items field, it becomes an incrementing build number.
AFAIK (and I spent a lot of time on this) it is impossible to get the SCM revision number and an incrementing build number at the same time -->
</configuration>
<executions>
<execution>
<goals>
<goal>create-timestamp</goal>
</goals>
<phase>package</phase>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<executions>
<execution>
<id>check-shaded-content</id>
<goals>
<goal>exec</goal>
</goals>
<phase>verify</phase>
<configuration>
<executable>${basedir}/ci/scripts/check_content.sh</executable>
<arguments>
<argument>-thin</argument>
</arguments>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
<profile>
<id>self-contained-jar</id>
<activation>
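Conceptually, the shade plugin rewrites every bytecode reference matching a `<pattern>` to the corresponding `<shadedPattern>`. A toy illustration of that renaming rule; the concrete value of `${shadeBase}` is defined outside this diff, so `net.snowflake.client.jdbc.internal` below is an assumption:

```java
public class RelocationSketch {
    // Applies one <relocation> rule: rewrite the package prefix, keep the rest.
    static String relocate(String className, String pattern, String shadedPattern) {
        return className.startsWith(pattern)
            ? shadedPattern + className.substring(pattern.length())
            : className;
    }

    public static void main(String[] args) {
        String shadeBase = "net.snowflake.client.jdbc.internal"; // assumed ${shadeBase} value
        System.out.println(
            relocate("org.apache.arrow.memory.RootAllocator",
                     "org.apache.arrow", shadeBase + ".apache.arrow"));
    }
}
```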
12 changes: 12 additions & 0 deletions src/test/java/net/snowflake/client/SkipOnThinJar.java
@@ -0,0 +1,12 @@
/*
 * Copyright (c) 2012-2024 Snowflake Computing Inc. All rights reserved.
*/
package net.snowflake.client;

/** Skips tests on CI when the thin jar is being tested */
public class SkipOnThinJar implements ConditionalIgnoreRule.IgnoreCondition {
@Override
public boolean isSatisfied() {
return "-Dthin-jar".equals(TestUtil.systemGetEnv("ADDITIONAL_MAVEN_PROFILE"));
}
}
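The condition matches the literal matrix value (`-Dthin-jar`) that ci/test.sh forwards into the container as `ADDITIONAL_MAVEN_PROFILE`. The comparison can be sketched in isolation with the environment lookup injected as a parameter (class and method names here are illustrative, not part of the PR):

```java
public class ThinJarProfileCheck {
    // Same comparison as SkipOnThinJar.isSatisfied(), with the environment
    // lookup passed in so the logic can be exercised directly.
    static boolean isThinJarProfile(String additionalMavenProfile) {
        return "-Dthin-jar".equals(additionalMavenProfile);
    }

    public static void main(String[] args) {
        System.out.println(isThinJarProfile(System.getenv("ADDITIONAL_MAVEN_PROFILE")));
    }
}
```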
@@ -17,6 +17,8 @@
import java.sql.Statement;
import java.time.Instant;
import java.util.*;
import net.snowflake.client.ConditionalIgnoreRule;
import net.snowflake.client.SkipOnThinJar;
import net.snowflake.client.category.TestCategoryArrow;
import net.snowflake.client.jdbc.*;
import net.snowflake.client.jdbc.telemetry.NoOpTelemetryClient;
@@ -41,16 +43,24 @@

@Category(TestCategoryArrow.class)
public class SFArrowResultSetIT {

/** Necessary to conditionally ignore tests */
@Rule public ConditionalIgnoreRule rule = new ConditionalIgnoreRule();

private Random random = new Random();

/** allocator for arrow */
/**
 * allocator for arrow; RootAllocator is shaded, so it cannot be overridden when testing the thin or
 * fat jar
 */
protected BufferAllocator allocator = new RootAllocator(Long.MAX_VALUE);

/** temporary folder to store result files */
@Rule public TemporaryFolder resultFolder = new TemporaryFolder();

/** Test the case that all results are returned in first chunk */
@Test
@ConditionalIgnoreRule.ConditionalIgnore(condition = SkipOnThinJar.class)
public void testNoOfflineData() throws Throwable {
List<Field> fieldList = new ArrayList<>();
Map<String, String> customFieldMeta = new HashMap<>();
Expand Down Expand Up @@ -112,6 +122,7 @@ public void testEmptyResultSet() throws Throwable {

/** Testing the case that all data comes from chunk downloader */
@Test
@ConditionalIgnoreRule.ConditionalIgnore(condition = SkipOnThinJar.class)
public void testOnlyOfflineData() throws Throwable {
final int colCount = 2;
final int chunkCount = 10;
@@ -161,6 +172,7 @@ public void testOnlyOfflineData() throws Throwable {

/** Testing the case that all data comes from chunk downloader */
@Test
@ConditionalIgnoreRule.ConditionalIgnore(condition = SkipOnThinJar.class)
public void testFirstResponseAndOfflineData() throws Throwable {
final int colCount = 2;
final int chunkCount = 10;
@@ -553,6 +565,7 @@ private void writeTimestampStructToField(

/** Test that first chunk containing struct vectors (used for timestamps) can be sorted */
@Test
@ConditionalIgnoreRule.ConditionalIgnore(condition = SkipOnThinJar.class)
public void testSortedResultChunkWithStructVectors() throws Throwable {
Connection con = getConnection();
Statement statement = con.createStatement();
@@ -622,6 +635,7 @@ public void testSortedResultChunkWithStructVectors() throws Throwable {

/** Test that the first chunk can be sorted */
@Test
@ConditionalIgnoreRule.ConditionalIgnore(condition = SkipOnThinJar.class)
public void testSortedResultChunk() throws Throwable {
Connection con = getConnection();
Statement statement = con.createStatement();