Merge remote-tracking branch 'upstream/master' into remove_dependecy_on_protocol
jtjeferreira committed Feb 14, 2024
2 parents d326351 + 557659e commit f780f6d
Showing 143 changed files with 1,778 additions and 330 deletions.
15 changes: 7 additions & 8 deletions .github/pull_request_template.md
Original file line number Diff line number Diff line change
@@ -2,9 +2,12 @@

SNOW-XXXXX

## External contributors - please answer these questions before submitting a pull request. Thanks!
## Pre-review self checklist
- [ ] The code is correctly formatted (run `mvn -P check-style validate`)
- [ ] I don't expose unnecessary new public API (run `mvn verify` and inspect `target/japicmp/japicmp.html`)
- [ ] Pull request name is prefixed with `SNOW-XXXX: `

Please answer these questions before submitting your pull requests. Thanks!
## External contributors - please answer these questions before submitting a pull request. Thanks!

1. What GitHub issue is this PR addressing? Make sure that there is an accompanying issue to your PR.

@@ -18,13 +21,9 @@ Please answer these questions before submitting your pull requests. Thanks!
- [ ] I am modifying authorization mechanisms
- [ ] I am adding new credentials
- [ ] I am modifying OCSP code
- [ ] I am adding a new dependency
- [ ] I am adding a new dependency or upgrading an existing one
- [ ] I am adding new public/protected component not marked with `@SnowflakeJdbcInternalApi` (note that public/protected methods/fields in classes marked with this annotation are already internal)

3. Please describe how your code solves the related issue.

Please write a short description of how your code change solves the related issue.

## Pre-review checklist
- [ ] This change has passed precommit
- [ ] I have reviewed code coverage report for my PR in ([Sonarqube](https://sonarqube.int.snowflakecomputing.com/project/branches?id=snowflake-jdbc))

2 changes: 1 addition & 1 deletion .github/workflows/check-style.yml
@@ -12,4 +12,4 @@ jobs:
- uses: actions/checkout@v1
- name: Check Style
shell: bash
run: mvn clean verify --batch-mode --show-version -P check-style
run: mvn clean validate --batch-mode --show-version -P check-style
4 changes: 2 additions & 2 deletions FIPS/pom.xml
@@ -5,12 +5,12 @@
<parent>
<groupId>net.snowflake</groupId>
<artifactId>snowflake-jdbc-parent</artifactId>
<version>3.14.5</version>
<version>3.14.6-SNAPSHOT</version>
<relativePath>../parent-pom.xml</relativePath>
</parent>

<artifactId>snowflake-jdbc-fips</artifactId>
<version>3.14.5</version>
<version>3.14.6-SNAPSHOT</version>
<packaging>jar</packaging>

<name>snowflake-jdbc-fips</name>
18 changes: 17 additions & 1 deletion README.rst
@@ -138,7 +138,7 @@ Run the maven command to check the coding style.

.. code-block:: bash
mvn -P check-style verify
mvn -P check-style validate
Follow the instructions if any error occurs, or run this command to fix the formatting.

@@ -151,6 +151,10 @@ You may import the coding style from IntelliJ so that the coding style can be ap
- In the **File** -> **Settings/Plugins**, and install `google-java-format` plugin.
- Enable `google-java-format` for the JDBC project.
- In the source code window, select **Code** -> **Reformat** to apply the coding style.
- Additionally, configure the IDE not to use wildcard imports: in **File** -> **Editor** -> **Code Style** -> **Java** set:
- **Use single class import**
- **Class count to use import with '*'** to 1000
- **Names count to use static import with '*'** to 1000
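These IDE settings match the explicit-import style enforced elsewhere in this commit (the new `AvoidStarImport` Checkstyle rule, and the `java.io.*` replacements below). A minimal, hypothetical file in that style:

```java
// ExplicitImportsDemo.java -- hypothetical example, not part of the driver.
// Each class is imported by name; no `import java.io.*;` wildcard.
import java.io.PrintWriter;
import java.io.StringWriter;

public class ExplicitImportsDemo {
  public static void main(String[] args) {
    StringWriter buffer = new StringWriter();
    PrintWriter out = new PrintWriter(buffer);
    out.print("explicit imports");
    out.flush();
    System.out.println(buffer.toString());
  }
}
```

With star imports capped at 1000 classes, the IDE will keep generating single-class imports like these instead of collapsing them to a wildcard.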

Tests
=====
@@ -179,6 +183,18 @@ Run the maven ``verify`` goal.
where ``category`` is the class name under the package ``net.snowflake.client.category``.

Set new version
---------------

1. Run the Maven command, passing the specific version:

.. code-block:: bash
mvn -f parent-pom.xml versions:set -DnewVersion=... -DgenerateBackupPoms=false
2. Manually set the same version in the ``implementVersion`` field in ``src/main/java/net/snowflake/client/jdbc/SnowflakeDriver.java`` when it is a release version, i.e., without the ``-SNAPSHOT`` suffix used between releases
3. Add an entry in ``CHANGELOG.rst`` for release versions

Test Class Naming Convention
----------------------------

8 changes: 7 additions & 1 deletion parent-pom.xml
@@ -5,9 +5,14 @@

<groupId>net.snowflake</groupId>
<artifactId>snowflake-jdbc-parent</artifactId>
<version>3.14.5</version>
<version>3.14.6-SNAPSHOT</version>
<packaging>pom</packaging>

<modules>
<module>.</module>
<module>FIPS</module>
</modules>

<properties>
<apache.commons.compress.version>1.21</apache.commons.compress.version>
<apache.commons.lang3.version>3.12.0</apache.commons.lang3.version>
@@ -74,6 +79,7 @@
<version.maven>3.6.3</version.maven>
<version.plugin.antrun>3.1.0</version.plugin.antrun>
<version.plugin.buildnumber>3.0.0</version.plugin.buildnumber>
<version.plugin.checkstyle>3.3.1</version.plugin.checkstyle>
<version.plugin.clean>3.2.0</version.plugin.clean>
<version.plugin.compiler>3.11.0</version.plugin.compiler>
<version.plugin.dependency>3.5.0</version.plugin.dependency>
56 changes: 38 additions & 18 deletions pom.xml
@@ -6,13 +6,13 @@
<parent>
<groupId>net.snowflake</groupId>
<artifactId>snowflake-jdbc-parent</artifactId>
<version>3.14.5</version>
<version>3.14.6-SNAPSHOT</version>
<relativePath>./parent-pom.xml</relativePath>
</parent>

<!-- Maven complains about using property here, but it makes install and deploy process easier to override final package names and localization -->
<artifactId>${artifactId}</artifactId>
<version>3.14.5</version>
<version>3.14.6-SNAPSHOT</version>
<packaging>jar</packaging>

<name>${artifactId}</name>
@@ -68,6 +68,11 @@
<artifactId>maven-antrun-plugin</artifactId>
<version>${version.plugin.antrun}</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-checkstyle-plugin</artifactId>
<version>${version.plugin.checkstyle}</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-clean-plugin</artifactId>
@@ -185,14 +190,15 @@
<artifactId>japicmp-maven-plugin</artifactId>
<configuration>
<parameter>
<breakBuildBasedOnSemanticVersioning>true</breakBuildBasedOnSemanticVersioning>
<ignoreMissingClasses>true</ignoreMissingClasses>
<breakBuildOnBinaryIncompatibleModifications>true</breakBuildOnBinaryIncompatibleModifications>
<ignoreMissingClasses>false</ignoreMissingClasses>
<oldVersionPattern>\d+\.\d+\.\d+</oldVersionPattern>
<includes>
<include>com.snowflake</include>
<include>net.snowflake</include>
</includes>
<excludes>
<exclude>@net.snowflake.client.core.SnowflakeJdbcInternalApi</exclude>
<exclude>${shadeBase}</exclude>
</excludes>
</parameter>
@@ -573,9 +579,37 @@
<artifactId>fmt-maven-plugin</artifactId>
<executions>
<execution>
<id>fmt</id>
<goals>
<goal>check</goal>
</goals>
<phase>validate</phase>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-checkstyle-plugin</artifactId>
<configuration>
<checkstyleRules>
<module name="Checker">
<module name="TreeWalker">
<module name="AvoidStarImport"/>
</module>
</module>
</checkstyleRules>
<consoleOutput>true</consoleOutput>
<failsOnError>true</failsOnError>
<includeTestSourceDirectory>true</includeTestSourceDirectory>
<violationSeverity>warning</violationSeverity>
</configuration>
<executions>
<execution>
<id>checkstyle</id>
<goals>
<goal>check</goal>
</goals>
<phase>validate</phase>
</execution>
</executions>
</plugin>
@@ -594,20 +628,6 @@
</properties>
<build>
<plugins>
<plugin>
<!-- we don't want to run japicmp for thin-jar until we release it for the first time -->
<groupId>com.github.siom79.japicmp</groupId>
<artifactId>japicmp-maven-plugin</artifactId>
<executions>
<execution>
<id>japicmp</id>
<goals>
<goal>cmp</goal>
</goals>
<phase>none</phase>
</execution>
</executions>
</plugin>
<plugin>
<!-- google linkage checker doesn't work well with shaded jar, disable the check in this case for now -->
<groupId>org.apache.maven.plugins</groupId>
6 changes: 5 additions & 1 deletion src/main/java/net/snowflake/client/core/Event.java
@@ -5,7 +5,11 @@
package net.snowflake.client.core;

import com.google.common.base.Preconditions;
import java.io.*;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.util.zip.GZIPOutputStream;
import net.snowflake.client.log.SFLogger;
import net.snowflake.client.log.SFLoggerFactory;
14 changes: 12 additions & 2 deletions src/main/java/net/snowflake/client/core/EventHandler.java
@@ -6,10 +6,20 @@

import static net.snowflake.client.jdbc.SnowflakeUtil.systemGetProperty;

import java.io.*;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.*;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.TimeZone;
import java.util.TreeSet;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ThreadFactory;
@@ -9,7 +9,14 @@

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.*;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Reader;
import java.io.Writer;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
42 changes: 12 additions & 30 deletions src/main/java/net/snowflake/client/core/HttpUtil.java
@@ -12,8 +12,11 @@
import com.google.common.annotations.VisibleForTesting;
import com.google.common.base.Strings;
import com.microsoft.azure.storage.OperationContext;
import com.snowflake.client.jdbc.SnowflakeDriver;
import java.io.*;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.io.PrintWriter;
import java.io.StringWriter;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.Socket;
@@ -29,6 +32,7 @@
import javax.net.ssl.TrustManager;
import net.snowflake.client.jdbc.ErrorCode;
import net.snowflake.client.jdbc.RestRequest;
import net.snowflake.client.jdbc.SnowflakeDriver;
import net.snowflake.client.jdbc.SnowflakeSQLException;
import net.snowflake.client.jdbc.SnowflakeUtil;
import net.snowflake.client.jdbc.cloud.storage.S3HttpUtil;
@@ -109,14 +113,14 @@ public class HttpUtil {
@SnowflakeJdbcInternalApi
public static Duration getConnectionTimeout() {
return Duration.ofMillis(
convertSystemPropertyToIntValue(
SystemUtil.convertSystemPropertyToIntValue(
JDBC_CONNECTION_TIMEOUT_IN_MS_PROPERTY, DEFAULT_HTTP_CLIENT_CONNECTION_TIMEOUT_IN_MS));
}

@SnowflakeJdbcInternalApi
public static Duration getSocketTimeout() {
return Duration.ofMillis(
convertSystemPropertyToIntValue(
SystemUtil.convertSystemPropertyToIntValue(
JDBC_SOCKET_TIMEOUT_IN_MS_PROPERTY, DEFAULT_HTTP_CLIENT_SOCKET_TIMEOUT_IN_MS));
}

@@ -258,7 +262,7 @@ public static CloseableHttpClient buildHttpClient(
@Nullable HttpClientSettingsKey key, File ocspCacheFile, boolean downloadUnCompressed) {
// set timeout so that we don't wait forever.
// Setup the default configuration for all requests on this client
int timeToLive = convertSystemPropertyToIntValue(JDBC_TTL, DEFAULT_TTL);
int timeToLive = SystemUtil.convertSystemPropertyToIntValue(JDBC_TTL, DEFAULT_TTL);
logger.debug("time to live in connection pooling manager: {}", timeToLive);
long connectTimeout = getConnectionTimeout().toMillis();
long socketTimeout = getSocketTimeout().toMillis();
@@ -326,9 +330,10 @@ public static CloseableHttpClient buildHttpClient(
new PoolingHttpClientConnectionManager(
registry, null, null, null, timeToLive, TimeUnit.SECONDS);
int maxConnections =
convertSystemPropertyToIntValue(JDBC_MAX_CONNECTIONS_PROPERTY, DEFAULT_MAX_CONNECTIONS);
SystemUtil.convertSystemPropertyToIntValue(
JDBC_MAX_CONNECTIONS_PROPERTY, DEFAULT_MAX_CONNECTIONS);
int maxConnectionsPerRoute =
convertSystemPropertyToIntValue(
SystemUtil.convertSystemPropertyToIntValue(
JDBC_MAX_CONNECTIONS_PER_ROUTE_PROPERTY, DEFAULT_MAX_CONNECTIONS_PER_ROUTE);
logger.debug(
"Max connections total in connection pooling manager: {}; max connections per route: {}",
@@ -854,29 +859,6 @@ public Socket createSocket(HttpContext ctx) throws IOException {
}
}

/**
* Helper function to convert system properties to integers
*
* @param systemProperty name of the system property
* @param defaultValue default value used
* @return the value of the system property, else the default value
*/
static int convertSystemPropertyToIntValue(String systemProperty, int defaultValue) {
String systemPropertyValue = systemGetProperty(systemProperty);
int returnVal = defaultValue;
if (systemPropertyValue != null) {
try {
returnVal = Integer.parseInt(systemPropertyValue);
} catch (NumberFormatException ex) {
logger.info(
"Failed to parse the system parameter {} with value {}",
systemProperty,
systemPropertyValue);
}
}
return returnVal;
}

/**
* Helper function to attach additional headers to a request if present. This takes a (nullable)
* map of headers in <name,value> format and adds them to the incoming request using addHeader.
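The helper removed from `HttpUtil` above is not deleted outright: the call sites in this same file switch to `SystemUtil.convertSystemPropertyToIntValue`, so it has been moved. Its behavior can be sketched as a self-contained class (class name hypothetical; the real code reads the property via `SnowflakeUtil.systemGetProperty` and logs through `SFLogger` rather than stderr):

```java
// SystemPropertyIntSketch.java -- standalone sketch of the relocated helper.
public class SystemPropertyIntSketch {

  /** Parses a system property as an int, falling back to the default when absent or malformed. */
  static int convertSystemPropertyToIntValue(String systemProperty, int defaultValue) {
    String systemPropertyValue = System.getProperty(systemProperty);
    int returnVal = defaultValue;
    if (systemPropertyValue != null) {
      try {
        returnVal = Integer.parseInt(systemPropertyValue);
      } catch (NumberFormatException ex) {
        // The real implementation logs via SFLogger; stderr keeps this sketch self-contained.
        System.err.println(
            "Failed to parse the system parameter " + systemProperty
                + " with value " + systemPropertyValue);
      }
    }
    return returnVal;
  }

  public static void main(String[] args) {
    System.setProperty("demo.timeout", "500");
    System.out.println(convertSystemPropertyToIntValue("demo.timeout", 60000)); // parsed value

    System.setProperty("demo.timeout", "not-a-number");
    System.out.println(convertSystemPropertyToIntValue("demo.timeout", 60000)); // falls back

    System.out.println(convertSystemPropertyToIntValue("demo.unset", 300000)); // default
  }
}
```

Centralizing this in a `SystemUtil` class lets `HttpUtil` and other callers share one parsing-with-fallback path instead of duplicating it per class.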
6 changes: 5 additions & 1 deletion src/main/java/net/snowflake/client/core/IncidentUtil.java
@@ -14,7 +14,11 @@
import com.yammer.metrics.core.Clock;
import com.yammer.metrics.core.VirtualMachineMetrics;
import com.yammer.metrics.reporting.MetricsServlet;
import java.io.*;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.util.Map;
import java.util.concurrent.TimeUnit;
import java.util.zip.GZIPOutputStream;
