SNOW-755757: Unwanted INFO messages from arrow memory allocation manager #1095

Closed · hchenatsafe opened this issue Jul 15, 2022 · 11 comments

@hchenatsafe
After updating the Snowflake JDBC driver from 3.13.14 to 3.13.20, I am getting the following INFO messages from Arrow:

INFO: Debug mode disabled.
Jul 12, 2022 2:38:03 PM net.snowflake.client.jdbc.internal.apache.arrow.memory.DefaultAllocationManagerOption getDefaultAllocationManagerFactory
INFO: allocation manager type not specified, using netty as the default type
Jul 12, 2022 2:38:03 PM net.snowflake.client.jdbc.internal.apache.arrow.memory.CheckAllocator reportResult
INFO: Using DefaultAllocationManager at memory/DefaultAllocationManagerFactory.class
@sfc-gh-wfateem self-assigned this Jul 20, 2022
@sfc-gh-wfateem (Collaborator)
@hchenatsafe can you give us a bit more detail on how you're setting up your logging? I just ran a test with 3.13.20 and configured log4j with the net.snowflake.client package set to INFO, and I don't see any messages at all, including the ones you quoted.

@sfc-gh-wfateem (Collaborator)
Closing this since we're unable to reproduce the issue.

@hchenatsafe (Author)
I don't have an isolated repro to share, but these logs seem to have been introduced in v3.13.18, and they are sent to stderr.

@sfc-gh-wfateem (Collaborator) commented Jul 29, 2022
@hchenatsafe The log messages were introduced by a change in the Apache Arrow version, but what confused me about the problem described here is your application's logging configuration at the point this is printed out. Are you saying that you don't have debugging enabled for the JDBC driver and this is printed directly to standard error?
Here's one of the log messages that you're referring to:
https://github.com/apache/arrow/blob/apache-arrow-8.0.0/java/memory/memory-core/src/main/java/org/apache/arrow/memory/BaseAllocator.java#L58
You'll see that this is printed out using an instance of SLF4J's Logger. I'm not sure why those would go out to standard error.
On the other hand, when I configure my application with Log4J, I don't see this message anywhere in my log.
Let me look into this a bit more. In the meantime, could you clarify which logging framework your application is using and how it's configured, please?

@hchenatsafe (Author)
I'm not sure how our logging is currently configured, but this is the first time I've seen log output appear somewhere unexpected. As far as I know, debugging is not enabled for the driver, and the logging is going directly to stderr.

@zhongzheng-instacart commented Sep 2, 2022
Reporting that a similar issue is observed on my side when accessing Snowflake via JDBC on an M1 machine. The stack trace indicates that the exception is thrown when the JDBC driver tries to wrap the response from Snowflake in Arrow format. More details below:

stderr:

Sep 02, 2022 7:26:51 PM net.snowflake.client.jdbc.internal.apache.arrow.memory.DefaultAllocationManagerOption getDefaultAllocationManagerFactory
INFO: allocation manager type not specified, using netty as the default type
Sep 02, 2022 7:26:51 PM net.snowflake.client.jdbc.internal.apache.arrow.memory.CheckAllocator reportResult
INFO: Using DefaultAllocationManager at memory/DefaultAllocationManagerFactory.class
Sep 02, 2022 7:26:51 PM net.snowflake.client.core.SFStatement executeQueryInternal
SEVERE: Exception creating result
java.lang.ExceptionInInitializerError
	at net.snowflake.client.jdbc.internal.apache.arrow.memory.UnsafeAllocationManager.<clinit>(UnsafeAllocationManager.java:27)
	at net.snowflake.client.jdbc.internal.apache.arrow.memory.DefaultAllocationManagerFactory.<clinit>(DefaultAllocationManagerFactory.java:26)
	at java.base/java.lang.Class.forName0(Native Method)
	at java.base/java.lang.Class.forName(Class.java:383)
	at java.base/java.lang.Class.forName(Class.java:376)
	at net.snowflake.client.jdbc.internal.apache.arrow.memory.DefaultAllocationManagerOption.getFactory(DefaultAllocationManagerOption.java:111)
	at net.snowflake.client.jdbc.internal.apache.arrow.memory.DefaultAllocationManagerOption.getDefaultAllocationManagerFactory(DefaultAllocationManagerOption.java:101)

Trace:

2022-09-02 23:11:33.136 n.s.c.jdbc.internal.apache.arrow.memory.BaseAllocator INFO <clinit>:60 - Debug mode disabled.
2022-09-02 23:11:33.139 n.s.c.jdbc.internal.apache.arrow.memory.DefaultAllocationManagerOption INFO getDefaultAllocationManagerFactory:100 - allocation manager type not specified, using netty as the default ty>
2022-09-02 23:11:33.141 n.s.c.jdbc.internal.apache.arrow.memory.CheckAllocator INFO reportResult:73 - Using DefaultAllocationManager at memory/DefaultAllocationManagerFactory.class
2022-09-02 23:11:33.144 n.s.c.core.SFStatement SEVERE executeQueryInternal:247 - Exception creating result
2022-09-02 23:11:33.148 n.s.c.core.EventHandler FINE triggerIncident:87 - New incident V2 triggered, flushing event buffer
2022-09-02 23:11:33.148 n.s.c.core.EventHandler FINE triggerIncident:90 - Incidents disabled by Snowflake, creation failed
2022-09-02 23:11:33.148 n.s.c.jdbc.SnowflakeSQLException FINE <init>:108 - Snowflake exception: JDBC driver internal error: exception creating result java.lang.ExceptionInInitializerError at net.snowflake.clie>
2022-09-02 23:11:33.148 n.s.c.jdbc.SnowflakeStatementV1 FINE close:821 - close()
...

@sfc-gh-wfateem (Collaborator)
Hi @zhongzheng-instacart
Those are actually two different and unrelated issues. The exception you're seeing in stderr is due to the JDK version; you're likely on JDK 16 or higher, and if you're on an M1 machine you're probably on JDK 17. Please refer to issues #589 and #533.
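(For context: the workaround usually cited for Arrow's memory code on newer JDKs is to re-open java.nio to the allocator with a JVM flag. The flag below is the commonly referenced form, not something stated in this thread, so treat the linked issues as the authoritative guidance.)

--add-opens=java.base/java.nio=ALL-UNNAMED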

This issue is mainly about the INFO messages you pointed out in the trace log. Those are printed by the Apache Arrow memory library and were introduced starting with Arrow 7.0, the version of Apache Arrow that the JDBC driver upgraded to in 3.13.18.

You could suppress those messages using the JDK logger configuration:
-Djava.util.logging.config.file=logging.properties

Then add the following entry in the logging.properties file:
java.util.logging.ConsoleHandler.level = SEVERE

If you don't pass a logging.properties file, the default configuration is taken from the logging.properties file under <JAVA_HOME>/jre/lib.
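To make that concrete, a minimal logging.properties along those lines might look like the following. Only the ConsoleHandler line comes from the suggestion above; the handlers and root-level entries are ordinary java.util.logging defaults added here for completeness:

handlers = java.util.logging.ConsoleHandler
.level = INFO
java.util.logging.ConsoleHandler.level = SEVERE

Start the JVM with -Djava.util.logging.config.file=logging.properties pointing at that file.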

Theoretically, you should be able to adjust the logging level for the apache.arrow.memory package using log4j, but that doesn't seem to be working as expected. From my perspective, I believe that's the real issue that should be addressed.
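For illustration, this is the kind of entry one would expect to work in a log4j2 XML configuration, using the shaded package name visible in the log output above; whether it actually takes effect is exactly the open question raised here:

<Logger name="net.snowflake.client.jdbc.internal.apache.arrow.memory" level="WARN"/>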

@sfc-gh-igarish changed the title "Unwanted INFO messages from arrow memory allocation manager" to "SNOW-755757: Unwanted INFO messages from arrow memory allocation manager" Mar 9, 2023
@sfc-gh-skumbham (Contributor)
Closing this issue as it is no longer reproducible with the latest JDBC driver (3.13.27). It is only reproducible with versions 3.13.18 through 3.13.22, which use Arrow 9.0.0; JDBC versions newer than 3.13.22 use Arrow 10.0.1, and those emit no such logs.

Related change: https://github.com/snowflakedb/snowflake-jdbc/pull/1202/files#diff-e67a1908492d15b4743e498e89ff01f892a9dd035b66a2afc5820db93039885f

@cnotin commented Nov 5, 2024
Hello! I think I have the same issue with the most recent 3.19.0 version of snowflake-jdbc:

nov. 05, 2024 12:17:20 PM net.snowflake.client.jdbc.internal.apache.arrow.memory.BaseAllocator <clinit>
INFO: Debug mode disabled. Enable with the VM option -Darrow.memory.debug.allocator=true.
nov. 05, 2024 12:17:20 PM net.snowflake.client.jdbc.internal.apache.arrow.memory.DefaultAllocationManagerOption getDefaultAllocationManagerFactory
INFO: allocation manager type not specified, using netty as the default type
nov. 05, 2024 12:17:20 PM net.snowflake.client.jdbc.internal.apache.arrow.memory.CheckAllocator reportResult
INFO: Using DefaultAllocationManager at memory/unsafe/DefaultAllocationManagerFactory.class

Even though I'm using SLF4J with this JVM arg:

-Dnet.snowflake.jdbc.loggerImpl=net.snowflake.client.log.SLF4JLogger

which is working properly, because this logger config in logback.xml has visible effects, except on these Arrow messages:

<logger name="net.snowflake" level="WARN"/>

Also, I noticed that the Arrow messages are printed to stderr, whereas other messages go to stdout.
It looks like they don't go through SLF4J. Is this perhaps a regression?
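(If these messages are indeed emitted through java.util.logging rather than SLF4J, one common generic workaround, not something confirmed by the maintainers in this thread, is to route JUL through SLF4J with the jul-to-slf4j bridge so that the logback.xml levels apply to them as well. A minimal sketch, assuming the org.slf4j:jul-to-slf4j artifact is on the classpath:)

import org.slf4j.bridge.SLF4JBridgeHandler;

public class JulBridgeSetup {
    public static void main(String[] args) {
        // Remove any existing java.util.logging console handlers so messages are not printed twice.
        SLF4JBridgeHandler.removeHandlersForRootLogger();
        // Register the SLF4J bridge as the JUL handler; JUL records then flow through SLF4J/logback.
        SLF4JBridgeHandler.install();
        // ... open the Snowflake JDBC connection and run queries after this point.
    }
}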

@sfc-gh-dszmolka (Contributor)
Hi there, could you please open a new issue with the details if it still happens for you? Using one that has been closed for 1.5 years now doesn't get the issue the necessary attention. Thank you in advance!

@cnotin commented Nov 5, 2024

You're right! Here it is: #1956
