Error handling VARIANT type #205
Comments
Hi, this is not a question about the JDBC driver but about the COPY command. Can you please reach out to the support team? Thanks.
I tried to use VARIANT with the JDBC driver and got an error.
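For context, a minimal sketch of the kind of statement that triggers this class of error, assuming a hypothetical EVENTS table with a PAYLOAD VARIANT column and placeholder connection details:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class VariantBindFails {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details, for illustration only.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:snowflake://<account>.snowflakecomputing.com/", "<user>", "<password>");
             // Binding the JSON text straight into a VALUES clause sends it as
             // VARCHAR, which Snowflake will not implicitly store in a VARIANT
             // column (see the error message quoted further down the thread).
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO EVENTS (PAYLOAD) VALUES (?)")) {
            ps.setString(1, "{\"name\":\"value\"}");
            ps.executeUpdate();
        }
    }
}
```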
Hi, I am trying to write a VARIANT value using the snowflake-jdbc driver, and it throws a "Datatype unsupported" exception. Let me know the correct way to insert a VARIANT using the JDBC driver.
I have the same problem as @IRus: "Expression type does not match column data type, expecting VARIANT but got VARCHAR(2)". Is there a way for us to insert into a VARIANT column? I also tried using PARSE_JSON(), but we can't perform any evaluations in the VALUES section of a batch statement.
Was struggling with this once again yesterday, and finally found a solution:
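Going by the follow-up comments and the driver test linked below, the working pattern is to bind the JSON as a plain string and convert it with PARSE_JSON inside an INSERT ... SELECT rather than a VALUES clause. A minimal sketch, again assuming the hypothetical EVENTS(PAYLOAD VARIANT) table and placeholder connection details:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class VariantBindWorks {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details, for illustration only.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:snowflake://<account>.snowflakecomputing.com/", "<user>", "<password>");
             // The JSON is bound as an ordinary string and converted server-side;
             // PARSE_JSON goes in a SELECT because, as noted above, it cannot be
             // evaluated inside a VALUES list.
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO EVENTS (PAYLOAD) SELECT PARSE_JSON(?)")) {
            ps.setString(1, "{\"name\":\"value\"}");
            ps.executeUpdate();
        }
    }
}
```

As the next comments describe, this form has a downside for batch inserts: each row ends up running as its own query.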
Thanks @IRus, I found this test case that shows something similar to your suggestion: https://github.com/snowflakedb/snowflake-jdbc/blob/5339cfb4531457fb9904b61eaaafcfba01b97c1a/src/test/java/net/snowflake/client/jdbc/SnowflakeDriverIT.java
I was able to insert data successfully using both of the suggested approaches. The only problem in my case is that I'm trying to perform a batch insert, and switching from a plain parameterized INSERT to the INSERT ... SELECT form means the rows no longer go in as one batch.
@ftenaglia Yes, right now I'm observing the same behavior from Snowflake: before, it was a fast batch insert and 100 rows took a few seconds; now it takes minutes, because each individual insert becomes its own query. For batches that contain thousands of records it is not usable at all. The obvious solution would be to use COPY INTO with some stage, but that makes things more complicated and I want to avoid it. Any suggestions @sfc-gh-stakeda?
An approach with a temporary table, where the variant data is stored as STRING and then copied from the temp table into the actual table, works much faster, but it requires an additional table and moving data between tables:

CREATE TABLE "DATABASE"."SCHEMA"."TABLE_TEMP" (
    DATA_FIELD STRING,
    ANOTHER_FIELD STRING
);

INSERT INTO "DATABASE"."SCHEMA"."TABLE" (DATA_FIELD, ANOTHER_FIELD)
SELECT parse_json(t2.DATA_FIELD), t2.ANOTHER_FIELD
FROM "DATABASE"."SCHEMA"."TABLE_TEMP" AS t2;
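The batch-load step into the temporary table is implied rather than shown; a sketch of how it might look over JDBC, reusing the table and column names from the SQL above (everything else is assumed):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.List;

class TempTableLoader {
    // Batch-insert the raw JSON text into the STRING columns of the temp table;
    // the INSERT ... SELECT parse_json(...) statement above then moves the rows
    // into the VARIANT column of the real table.
    static void load(Connection conn, List<String[]> rows) throws Exception {
        String sql = "INSERT INTO \"DATABASE\".\"SCHEMA\".\"TABLE_TEMP\" "
                   + "(DATA_FIELD, ANOTHER_FIELD) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (String[] row : rows) {
                ps.setString(1, row[0]); // raw JSON, still a plain string here
                ps.setString(2, row[1]);
                ps.addBatch();
            }
            // A plain VALUES insert can be batched efficiently by the driver,
            // unlike the INSERT ... SELECT PARSE_JSON(?) form discussed above.
            ps.executeBatch();
        }
    }
}
```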
@IRus That's what I'm trying to do; I guess it still requires less effort than handling a file for COPY. Thanks for the input.
To clean up and re-prioritize more pressing bugs and feature requests, we are closing all issues older than 6 months as of March 1, 2023. If there are any issues or feature requests that you would like us to address, please create them according to the new templates we have created. For urgent issues, opening a support case through the Snowflake Community is the fastest way to get a response.
I would love to see better handling of VARIANT.
Hi,
I created a table as follows:
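A minimal sketch of such a table, with made-up names; the only essential detail is the VARIANT column:

```java
import java.sql.Connection;
import java.sql.Statement;

class CreateVariantTable {
    static void create(Connection conn) throws Exception {
        try (Statement stmt = conn.createStatement()) {
            // Hypothetical table and column names.
            stmt.execute("CREATE OR REPLACE TABLE PERSON (DATA VARIANT)");
        }
    }
}
```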
And I tried to insert a record using the COPY command with internal staging. The COPY command ultimately invoked is:
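A minimal sketch of such a load through the driver, assuming the file is first PUT to the table's internal stage and read with a CSV file format whose field delimiter is disabled; the file name, path, and format options are all assumptions, not the command from the report:

```java
import java.sql.Connection;
import java.sql.Statement;

class CopyFromInternalStage {
    static void load(Connection conn) throws Exception {
        try (Statement stmt = conn.createStatement()) {
            // Upload the local file to the table's internal stage.
            stmt.execute("PUT file:///tmp/person.json @%PERSON AUTO_COMPRESS = FALSE");
            // Load each line of the staged file into the VARIANT column.
            stmt.execute("COPY INTO PERSON FROM @%PERSON "
                    + "FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = NONE)");
        }
    }
}
```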
If the file content is just as follows, it works without any problem:
{"name":"syed", "role":"dev"}
However, if I enclose the entire value in the file in quotes, as below, it fails:
'{"name":"syed", "role":"dev"}'
The error is:
net.snowflake.client.jdbc.SnowflakeSQLException: Error parsing JSON ........
In order to overcome the above error, I had to modify the COPY command by adding a few more options:
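A sketch of what such a modified command could look like; only the EMPTY_FIELD_AS_NULL and NULL_IF options mentioned in the next paragraph are taken from the report, the rest mirrors the earlier hypothetical sketch:

```java
import java.sql.Connection;
import java.sql.Statement;

class CopyWithExtraOptions {
    static void load(Connection conn) throws Exception {
        try (Statement stmt = conn.createStatement()) {
            // Same hypothetical stage and table as before, with the two extra
            // file format options added.
            stmt.execute("COPY INTO PERSON FROM @%PERSON "
                    + "FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = NONE "
                    + "EMPTY_FIELD_AS_NULL = FALSE NULL_IF = ())");
        }
    }
}
```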
My question is: why do I have to modify the COPY command? If I change the data type in the table from VARIANT to VARCHAR, then the first COPY command (without the EMPTY_FIELD_AS_NULL and NULL_IF options) works for the quoted value in the file.
Can someone clarify whether this could be a bug on the Snowflake end?