SNOW-966003: Unexpected arrow table behaviour when query returns zero rows #1800

Closed
thomasaarholt opened this issue Nov 9, 2023 · 2 comments · Fixed by feast-dev/feast#4008

Comments

@thomasaarholt
Contributor

thomasaarholt commented Nov 9, 2023

Python version

Python 3.11.4 (main, Jun 20 2023, 09:03:38) [Clang 14.0.3 (clang-1403.0.22.14.1)]

Operating system and processor architecture

macOS-14.0-arm64-arm-64bit

Installed packages

agate==1.7.1
appdirs==1.4.4
appnope==0.1.3
asn1crypto==1.5.1
asttokens==2.4.1
attrs==23.1.0
Babel==2.13.0
certifi==2023.7.22
cffi==1.16.0
cfgv==3.4.0
chardet==5.2.0
charset-normalizer==3.3.0
click==8.1.7
colorama==0.4.6
comm==0.2.0
contourpy==1.1.0
cryptography==41.0.4
cycler==0.11.0
dbt-core==1.6.6
dbt-extractor==0.4.1
dbt-semantic-interfaces==0.2.2
dbt-snowflake==1.6.4
debugpy==1.8.0
decorator==5.1.1
diff_cover==8.0.0
distlib==0.3.7
executing==2.0.1
filelock==3.12.4
fonttools==4.42.0
hologram==0.0.16
identify==2.5.30
idna==3.4
importlib-metadata==6.8.0
iniconfig==2.0.0
ipykernel==6.26.0
ipython==8.17.2
isodate==0.6.1
jaraco.classes==3.3.0
jedi==0.19.1
Jinja2==3.1.2
jsonschema==4.19.1
jsonschema-specifications==2023.7.1
jupyter_client==8.6.0
jupyter_core==5.5.0
keyring==24.2.0
kiwisolver==1.4.4
leather==0.3.4
Logbook==1.5.3
MarkupSafe==2.1.3
mashumaro==3.8.1
matplotlib==3.7.2
matplotlib-inline==0.1.6
minimal-snowplow-tracker==0.0.2
more-itertools==8.14.0
msgpack==1.0.7
nest-asyncio==1.5.8
networkx==3.2
nodeenv==1.8.0
numpy==1.25.2
oscrypto==1.3.0
packaging==23.1
pandas==2.0.3
parsedatetime==2.6
parso==0.8.3
pathspec==0.11.2
pexpect==4.8.0
Pillow==10.0.0
platformdirs==3.11.0
pluggy==1.3.0
polars==0.18.12
pre-commit==3.3.3
prompt-toolkit==3.0.39
protobuf==4.24.4
psutil==5.9.6
ptyprocess==0.7.0
pure-eval==0.2.2
pyarrow==10.0.1
pycparser==2.21
pycryptodomex==3.19.0
pydantic==1.10.13
Pygments==2.16.1
PyJWT==2.8.0
pyOpenSSL==23.2.0
pyparsing==3.0.9
pytest==7.4.2
python-dateutil==2.8.2
python-slugify==8.0.1
pytimeparse==1.1.8
pytz==2023.3.post1
PyYAML==6.0.1
pyzmq==25.1.1
referencing==0.30.2
regex==2023.10.3
requests==2.31.0
rpds-py==0.10.6
six==1.16.0
snowflake-connector-python==3.3.1
sortedcontainers==2.4.0
sqlfluff==2.3.1
sqlparse==0.4.4
stack-data==0.6.3
tblib==2.0.0
text-unidecode==1.3
tomlkit==0.12.1
tornado==6.3.3
tqdm==4.66.1
traitlets==5.13.0
typing_extensions==4.8.0
tzdata==2023.3
urllib3==1.26.18
virtualenv==20.24.5
wcwidth==0.2.9
yamllint==1.32.0
zipp==3.17.0

What did you do?

I use cursor.fetch_arrow_all() to return Snowflake data in Arrow format. Sometimes a WHERE filter results in a table with no rows. Currently, instead of returning an Arrow table with the correct schema and zero rows, snowflake-connector-python returns None.

This is quite undesirable: one might, for example, be left-joining this table to another table in Python. With the current behavior the entire schema (column names and dtypes) is lost, so one has to build a workaround that regenerates the schema whenever a query happens to return zero rows, along the lines of the sketch below.
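For illustration, here is a minimal sketch of that kind of workaround, under two assumptions: that snowflake.connector.constants.FIELD_ID_TO_NAME maps the DB-API type codes in cursor.description back to Snowflake type names, and that a hand-maintained mapping to Arrow types covers every type the query can return. fetch_arrow_all_or_empty is a hypothetical helper, not part of the connector:

import pyarrow as pa
from snowflake.connector.constants import FIELD_ID_TO_NAME

# Illustrative mapping from Snowflake type names to Arrow types; a real
# workaround would need an entry for every type a query can return.
ARROW_TYPES = {
    "FIXED": pa.int64(),
    "REAL": pa.float64(),
    "TEXT": pa.string(),
}

def fetch_arrow_all_or_empty(cursor):
    """Return the result as a pyarrow.Table even when there are zero rows."""
    table = cursor.fetch_arrow_all()
    if table is not None:
        return table
    # fetch_arrow_all() returned None: rebuild the schema from the cursor
    # metadata so downstream joins keep the column names and dtypes.
    fields = [
        pa.field(meta.name, ARROW_TYPES[FIELD_ID_TO_NAME[meta.type_code]])
        for meta in cursor.description
    ]
    return pa.schema(fields).empty_table()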

The pandas counterpart, fetch_pandas_all, does not return None; it returns an empty DataFrame with each column cast to the highest bit-depth of its dtype (e.g. float64 for the float type).

The following example reproduces the behavior. First install the connector with the pandas extra:

pip install 'snowflake-connector-python[pandas]'

import snowflake.connector
con = snowflake.connector.connect(...)

### Example returning rows
query = "select 1::int as foo, 2::float as bar"
cursor = con.cursor().execute(query)
table = cursor.fetch_arrow_all()
print(table)
### CURRENT OUTPUT
# pyarrow.Table
# FOO: int8 not null
# BAR: double not null
# ----
# FOO: [[1]]
# BAR: [[2]]

### Example returning no rows
cursor = con.cursor().execute(query + " limit 0")
table = cursor.fetch_arrow_all()
print(table)
### CURRENT OUTPUT
# None                              # <------------------------ This is the issue

### DESIRED OUTPUT
# pyarrow.Table
# FOO: int64
# BAR: double
# ----
# FOO: [[]]
# BAR: [[]]

### Example using fetch_pandas_all returning no rows
cursor = con.cursor().execute(query + " limit 0")
df = cursor.fetch_pandas_all()
print(df)
print(df.dtypes)
### CURRENT OUTPUT
# Empty DataFrame
# Columns: [FOO, BAR]
# Index: []
# FOO      int64
# BAR    float64
# dtype: object

What did you expect to see?

As per the example above, I would like a table with as correct a schema as possible. I am submitting a PR with a suggested solution; in that case we return a table whose schema uses the highest bit-depth for each dtype, since that is what Snowflake provides us (see the upcoming PR).
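To illustrate the widening (the mapping below is an assumption for the two types in the example, not code from the PR): with rows present the connector can pick the narrowest physical type from the data (FOO: int8 above), but for an empty result only the logical Snowflake type is known, so each column falls back to the widest Arrow type for that logical type, mirroring fetch_pandas_all:

import pyarrow as pa

# Assumed widening rules for the logical types in the example above.
WIDEST_ARROW_TYPE = {
    "FIXED": pa.int64(),   # Snowflake NUMBER/INT -> int64, not int8
    "REAL": pa.float64(),  # Snowflake FLOAT -> double
}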

If necessary, I am happy to go through our company's Snowflake support to see this prioritized.

Can you set logging to DEBUG and collect the logs?

2023-11-09 19:41:52,326 - MainThread connection.py:749 - cursor() - DEBUG - cursor
2023-11-09 19:41:52,327 - MainThread cursor.py:765 - execute() - DEBUG - executing SQL/command
2023-11-09 19:41:52,327 - MainThread cursor.py:621 - _preprocess_pyformat_query() - DEBUG - binding: [select 1::int as foo, 2::float as bar] with input=[None], processed=[{}]
2023-11-09 19:41:52,327 - MainThread cursor.py:833 - execute() - INFO - query: [select 1::int as foo, 2::float as bar]
2023-11-09 19:41:52,327 - MainThread connection.py:1484 - _next_sequence_counter() - DEBUG - sequence counter: 1
2023-11-09 19:41:52,328 - MainThread cursor.py:496 - _execute_helper() - DEBUG - Request id: a1d21078-f385-4eb4-ac9a-0778f661a0d9
2023-11-09 19:41:52,328 - MainThread cursor.py:498 - _execute_helper() - DEBUG - running query [select 1::int as foo, 2::float as bar]
2023-11-09 19:41:52,328 - MainThread cursor.py:505 - _execute_helper() - DEBUG - is_file_transfer: True
2023-11-09 19:41:52,329 - MainThread connection.py:1149 - cmd_query() - DEBUG - _cmd_query
2023-11-09 19:41:52,329 - MainThread _query_context_cache.py:155 - serialize_to_dict() - DEBUG - serialize_to_dict() called
2023-11-09 19:41:52,329 - MainThread connection.py:1176 - cmd_query() - DEBUG - sql=[select 1::int as foo, 2::float as bar], sequence_id=[1], is_file_transfer=[False]
2023-11-09 19:41:52,329 - MainThread network.py:1220 - _use_requests_session() - DEBUG - Session status for SessionPool 'mv21709.west-europe.azure.snowflakecomputing.com', SessionPool 1/1 active sessions
2023-11-09 19:41:52,330 - MainThread network.py:881 - _request_exec_wrapper() - DEBUG - remaining request timeout: None, retry cnt: 1
2023-11-09 19:41:52,330 - MainThread network.py:863 - add_request_guid() - DEBUG - Request guid: 20b01ae1-e09c-4260-884e-e39ada49b795
2023-11-09 19:41:52,330 - MainThread network.py:1072 - _request_exec() - DEBUG - socket timeout: 60
2023-11-09 19:41:52,884 - MainThread connectionpool.py:456 - _make_request() - DEBUG - https://mv21709.west-europe.azure.snowflakecomputing.com:443 "POST /queries/v1/query-request?requestId=a1d21078-f385-4eb4-ac9a-0778f661a0d9&request_guid=20b01ae1-e09c-4260-884e-e39ada49b795 HTTP/1.1" 200 None
2023-11-09 19:41:52,886 - MainThread network.py:1098 - _request_exec() - DEBUG - SUCCESS
2023-11-09 19:41:52,886 - MainThread network.py:1225 - _use_requests_session() - DEBUG - Session status for SessionPool 'mv21709.west-europe.azure.snowflakecomputing.com', SessionPool 0/1 active sessions
2023-11-09 19:41:52,887 - MainThread network.py:741 - _post_request() - DEBUG - ret[code] = None, after post request
2023-11-09 19:41:52,887 - MainThread network.py:765 - _post_request() - DEBUG - Query id: 01b03821-0203-050a-0000-024d5406606e
2023-11-09 19:41:52,888 - MainThread _query_context_cache.py:189 - deserialize_json_dict() - DEBUG - deserialize_json_dict() called: data from server: {'entries': [{'id': 0, 'timestamp': 160108752093169, 'priority': 0}]}
2023-11-09 19:41:52,888 - MainThread _query_context_cache.py:230 - deserialize_json_dict() - DEBUG - deserialize {'id': 0, 'timestamp': 160108752093169, 'priority': 0}
2023-11-09 19:41:52,888 - MainThread _query_context_cache.py:101 - _sync_priority_map() - DEBUG - sync_priority_map called priority_map size = 0, new_priority_map size = 1
2023-11-09 19:41:52,889 - MainThread _query_context_cache.py:127 - trim_cache() - DEBUG - trim_cache() called. treeSet size is 1 and cache capacity is 5
2023-11-09 19:41:52,889 - MainThread _query_context_cache.py:136 - trim_cache() - DEBUG - trim_cache() returns. treeSet size is 1 and cache capacity is 5
2023-11-09 19:41:52,890 - MainThread _query_context_cache.py:269 - deserialize_json_dict() - DEBUG - deserialize_json_dict() returns
2023-11-09 19:41:52,890 - MainThread _query_context_cache.py:274 - log_cache_entries() - DEBUG - Cache Entry: (0, 160108752093169, 0)
2023-11-09 19:41:52,890 - MainThread cursor.py:840 - execute() - DEBUG - sfqid: 01b03821-0203-050a-0000-024d5406606e
2023-11-09 19:41:52,891 - MainThread cursor.py:846 - execute() - INFO - query execution done
2023-11-09 19:41:52,891 - MainThread cursor.py:860 - execute() - DEBUG - SUCCESS
2023-11-09 19:41:52,891 - MainThread cursor.py:879 - execute() - DEBUG - PUT OR GET: False
2023-11-09 19:41:52,892 - MainThread cursor.py:974 - _init_result_and_meta() - DEBUG - Query result format: arrow
2023-11-09 19:41:52,892 - MainThread cursor.py:988 - _init_result_and_meta() - INFO - Number of results in first chunk: 1
2023-11-09 19:41:52,895 - MainThread result_batch.py:92 - _create_nanoarrow_iterator() - DEBUG - Using nanoarrow as the arrow data converter
2023-11-09 19:41:52,895 - MainThread CArrowIterator.cpp:53 - CArrowIterator() - DEBUG - Arrow BatchSize: 1
2023-11-09 19:41:52,895 - MainThread nanoarrow_arrow_iterator.cpython-311-darwin.so:0 - __cinit__() - DEBUG - Batches read: 1
2023-11-09 19:41:52,896 - MainThread result_set.py:59 - result_set_iterator() - DEBUG - beginning to schedule result batch downloads
2023-11-09 19:41:52,896 - MainThread connection.py:749 - cursor() - DEBUG - cursor
2023-11-09 19:41:52,897 - MainThread cursor.py:765 - execute() - DEBUG - executing SQL/command
2023-11-09 19:41:52,897 - MainThread cursor.py:621 - _preprocess_pyformat_query() - DEBUG - binding: [select 1::int as foo, 2::float as bar limit 0] with input=[None], processed=[{}]
2023-11-09 19:41:52,898 - MainThread cursor.py:833 - execute() - INFO - query: [select 1::int as foo, 2::float as bar limit 0]
2023-11-09 19:41:52,898 - MainThread connection.py:1484 - _next_sequence_counter() - DEBUG - sequence counter: 2
2023-11-09 19:41:52,898 - MainThread cursor.py:496 - _execute_helper() - DEBUG - Request id: 35499a0e-bcdd-41e2-9c91-44cf7011bca2
2023-11-09 19:41:52,899 - MainThread cursor.py:498 - _execute_helper() - DEBUG - running query [select 1::int as foo, 2::float as bar limit 0]
2023-11-09 19:41:52,899 - MainThread cursor.py:505 - _execute_helper() - DEBUG - is_file_transfer: True
2023-11-09 19:41:52,899 - MainThread connection.py:1149 - cmd_query() - DEBUG - _cmd_query
2023-11-09 19:41:52,900 - MainThread _query_context_cache.py:155 - serialize_to_dict() - DEBUG - serialize_to_dict() called
2023-11-09 19:41:52,900 - MainThread _query_context_cache.py:274 - log_cache_entries() - DEBUG - Cache Entry: (0, 160108752093169, 0)
2023-11-09 19:41:52,900 - MainThread _query_context_cache.py:178 - serialize_to_dict() - DEBUG - serialize_to_dict(): data to send to server {'entries': [{'id': 0, 'timestamp': 160108752093169, 'priority': 0, 'context': {}}]}
2023-11-09 19:41:52,900 - MainThread connection.py:1176 - cmd_query() - DEBUG - sql=[select 1::int as foo, 2::float as bar limit 0], sequence_id=[2], is_file_transfer=[False]
2023-11-09 19:41:52,901 - MainThread network.py:1220 - _use_requests_session() - DEBUG - Session status for SessionPool 'mv21709.west-europe.azure.snowflakecomputing.com', SessionPool 1/1 active sessions
2023-11-09 19:41:52,901 - MainThread network.py:881 - _request_exec_wrapper() - DEBUG - remaining request timeout: None, retry cnt: 1
2023-11-09 19:41:52,901 - MainThread network.py:863 - add_request_guid() - DEBUG - Request guid: b5ae98d1-371f-46e8-9506-fb4abd271b38
2023-11-09 19:41:52,902 - MainThread network.py:1072 - _request_exec() - DEBUG - socket timeout: 60
2023-11-09 19:41:53,191 - MainThread connectionpool.py:456 - _make_request() - DEBUG - https://mv21709.west-europe.azure.snowflakecomputing.com:443 "POST /queries/v1/query-request?requestId=35499a0e-bcdd-41e2-9c91-44cf7011bca2&request_guid=b5ae98d1-371f-46e8-9506-fb4abd271b38 HTTP/1.1" 200 None
2023-11-09 19:41:53,192 - MainThread network.py:1098 - _request_exec() - DEBUG - SUCCESS
2023-11-09 19:41:53,192 - MainThread network.py:1225 - _use_requests_session() - DEBUG - Session status for SessionPool 'mv21709.west-europe.azure.snowflakecomputing.com', SessionPool 0/1 active sessions
2023-11-09 19:41:53,192 - MainThread network.py:741 - _post_request() - DEBUG - ret[code] = None, after post request
2023-11-09 19:41:53,193 - MainThread network.py:765 - _post_request() - DEBUG - Query id: 01b03821-0203-0706-0000-024d540659c6
2023-11-09 19:41:53,193 - MainThread _query_context_cache.py:189 - deserialize_json_dict() - DEBUG - deserialize_json_dict() called: data from server: {'entries': [{'id': 0, 'timestamp': 160108752421251, 'priority': 0}]}
2023-11-09 19:41:53,193 - MainThread _query_context_cache.py:274 - log_cache_entries() - DEBUG - Cache Entry: (0, 160108752093169, 0)
2023-11-09 19:41:53,194 - MainThread _query_context_cache.py:230 - deserialize_json_dict() - DEBUG - deserialize {'id': 0, 'timestamp': 160108752421251, 'priority': 0}
2023-11-09 19:41:53,194 - MainThread _query_context_cache.py:101 - _sync_priority_map() - DEBUG - sync_priority_map called priority_map size = 0, new_priority_map size = 1
2023-11-09 19:41:53,194 - MainThread _query_context_cache.py:127 - trim_cache() - DEBUG - trim_cache() called. treeSet size is 1 and cache capacity is 5
2023-11-09 19:41:53,194 - MainThread _query_context_cache.py:136 - trim_cache() - DEBUG - trim_cache() returns. treeSet size is 1 and cache capacity is 5
2023-11-09 19:41:53,195 - MainThread _query_context_cache.py:269 - deserialize_json_dict() - DEBUG - deserialize_json_dict() returns
2023-11-09 19:41:53,195 - MainThread _query_context_cache.py:274 - log_cache_entries() - DEBUG - Cache Entry: (0, 160108752421251, 0)
2023-11-09 19:41:53,195 - MainThread cursor.py:840 - execute() - DEBUG - sfqid: 01b03821-0203-0706-0000-024d540659c6
2023-11-09 19:41:53,196 - MainThread cursor.py:846 - execute() - INFO - query execution done
2023-11-09 19:41:53,196 - MainThread cursor.py:860 - execute() - DEBUG - SUCCESS
2023-11-09 19:41:53,196 - MainThread cursor.py:879 - execute() - DEBUG - PUT OR GET: False
2023-11-09 19:41:53,196 - MainThread cursor.py:974 - _init_result_and_meta() - DEBUG - Query result format: arrow
2023-11-09 19:41:53,197 - MainThread cursor.py:988 - _init_result_and_meta() - INFO - Number of results in first chunk: 0
2023-11-09 19:41:53,197 - MainThread result_set.py:59 - result_set_iterator() - DEBUG - beginning to schedule result batch downloads
2023-11-09 19:41:53,197 - MainThread connection.py:749 - cursor() - DEBUG - cursor
2023-11-09 19:41:53,197 - MainThread cursor.py:765 - execute() - DEBUG - executing SQL/command
2023-11-09 19:41:53,198 - MainThread cursor.py:621 - _preprocess_pyformat_query() - DEBUG - binding: [select 1::int as foo, 2::float as bar limit 0] with input=[None], processed=[{}]
2023-11-09 19:41:53,198 - MainThread cursor.py:833 - execute() - INFO - query: [select 1::int as foo, 2::float as bar limit 0]
2023-11-09 19:41:53,198 - MainThread connection.py:1484 - _next_sequence_counter() - DEBUG - sequence counter: 3
2023-11-09 19:41:53,199 - MainThread cursor.py:496 - _execute_helper() - DEBUG - Request id: 1fa73205-799b-4a60-9683-f81089d96161
2023-11-09 19:41:53,199 - MainThread cursor.py:498 - _execute_helper() - DEBUG - running query [select 1::int as foo, 2::float as bar limit 0]
2023-11-09 19:41:53,199 - MainThread cursor.py:505 - _execute_helper() - DEBUG - is_file_transfer: True
2023-11-09 19:41:53,199 - MainThread connection.py:1149 - cmd_query() - DEBUG - _cmd_query
2023-11-09 19:41:53,200 - MainThread _query_context_cache.py:155 - serialize_to_dict() - DEBUG - serialize_to_dict() called
2023-11-09 19:41:53,200 - MainThread _query_context_cache.py:274 - log_cache_entries() - DEBUG - Cache Entry: (0, 160108752421251, 0)
2023-11-09 19:41:53,200 - MainThread _query_context_cache.py:178 - serialize_to_dict() - DEBUG - serialize_to_dict(): data to send to server {'entries': [{'id': 0, 'timestamp': 160108752421251, 'priority': 0, 'context': {}}]}
2023-11-09 19:41:53,200 - MainThread connection.py:1176 - cmd_query() - DEBUG - sql=[select 1::int as foo, 2::float as bar limit 0], sequence_id=[3], is_file_transfer=[False]
2023-11-09 19:41:53,200 - MainThread network.py:1220 - _use_requests_session() - DEBUG - Session status for SessionPool 'mv21709.west-europe.azure.snowflakecomputing.com', SessionPool 1/1 active sessions
2023-11-09 19:41:53,201 - MainThread network.py:881 - _request_exec_wrapper() - DEBUG - remaining request timeout: None, retry cnt: 1
2023-11-09 19:41:53,201 - MainThread network.py:863 - add_request_guid() - DEBUG - Request guid: 56f0a2be-4245-4c21-814d-034579170867
2023-11-09 19:41:53,202 - MainThread network.py:1072 - _request_exec() - DEBUG - socket timeout: 60
2023-11-09 19:41:53,311 - MainThread connectionpool.py:456 - _make_request() - DEBUG - https://mv21709.west-europe.azure.snowflakecomputing.com:443 "POST /queries/v1/query-request?requestId=1fa73205-799b-4a60-9683-f81089d96161&request_guid=56f0a2be-4245-4c21-814d-034579170867 HTTP/1.1" 200 None
2023-11-09 19:41:53,312 - MainThread network.py:1098 - _request_exec() - DEBUG - SUCCESS
2023-11-09 19:41:53,312 - MainThread network.py:1225 - _use_requests_session() - DEBUG - Session status for SessionPool 'mv21709.west-europe.azure.snowflakecomputing.com', SessionPool 0/1 active sessions
2023-11-09 19:41:53,313 - MainThread network.py:741 - _post_request() - DEBUG - ret[code] = None, after post request
2023-11-09 19:41:53,313 - MainThread network.py:765 - _post_request() - DEBUG - Query id: 01b03821-0203-050a-0000-024d54066072
2023-11-09 19:41:53,313 - MainThread _query_context_cache.py:189 - deserialize_json_dict() - DEBUG - deserialize_json_dict() called: data from server: {'entries': [{'id': 0, 'timestamp': 160108752569756, 'priority': 0}]}
2023-11-09 19:41:53,313 - MainThread _query_context_cache.py:274 - log_cache_entries() - DEBUG - Cache Entry: (0, 160108752421251, 0)
2023-11-09 19:41:53,314 - MainThread _query_context_cache.py:230 - deserialize_json_dict() - DEBUG - deserialize {'id': 0, 'timestamp': 160108752569756, 'priority': 0}
2023-11-09 19:41:53,314 - MainThread _query_context_cache.py:101 - _sync_priority_map() - DEBUG - sync_priority_map called priority_map size = 0, new_priority_map size = 1
2023-11-09 19:41:53,314 - MainThread _query_context_cache.py:127 - trim_cache() - DEBUG - trim_cache() called. treeSet size is 1 and cache capacity is 5
2023-11-09 19:41:53,314 - MainThread _query_context_cache.py:136 - trim_cache() - DEBUG - trim_cache() returns. treeSet size is 1 and cache capacity is 5
2023-11-09 19:41:53,314 - MainThread _query_context_cache.py:269 - deserialize_json_dict() - DEBUG - deserialize_json_dict() returns
2023-11-09 19:41:53,315 - MainThread _query_context_cache.py:274 - log_cache_entries() - DEBUG - Cache Entry: (0, 160108752569756, 0)
2023-11-09 19:41:53,315 - MainThread cursor.py:840 - execute() - DEBUG - sfqid: 01b03821-0203-050a-0000-024d54066072
2023-11-09 19:41:53,315 - MainThread cursor.py:846 - execute() - INFO - query execution done
2023-11-09 19:41:53,315 - MainThread cursor.py:860 - execute() - DEBUG - SUCCESS
2023-11-09 19:41:53,316 - MainThread cursor.py:879 - execute() - DEBUG - PUT OR GET: False
2023-11-09 19:41:53,316 - MainThread cursor.py:974 - _init_result_and_meta() - DEBUG - Query result format: arrow
2023-11-09 19:41:53,316 - MainThread cursor.py:988 - _init_result_and_meta() - INFO - Number of results in first chunk: 0
2023-11-09 19:41:53,320 - MainThread result_set.py:59 - result_set_iterator() - DEBUG - beginning to schedule result batch downloads
@github-actions github-actions bot changed the title Unexpected arrow table behaviour when query returns zero rows SNOW-966003: Unexpected arrow table behaviour when query returns zero rows Nov 9, 2023
@sfc-gh-aling
Collaborator

Thanks @thomasaarholt for reaching out.
What you described makes sense to me; we will review the PR.
cc: @sfc-gh-mkeller

@sfc-gh-aling
Collaborator

Closing the issue, as the PR is merged and will be released in 3.7.0:

Added a new boolean parameter force_return_table to SnowflakeCursor.fetch_arrow_all to force returning pyarrow.Table in case of zero rows.
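With 3.7.0 or later the zero-row case can therefore be handled directly; a minimal usage sketch based on the line above, reusing the connection from the original example:

# Requires snowflake-connector-python >= 3.7.0.
cursor = con.cursor().execute("select 1::int as foo, 2::float as bar limit 0")
# force_return_table=True returns an empty pyarrow.Table with the schema
# instead of None.
table = cursor.fetch_arrow_all(force_return_table=True)
assert table is not None and table.num_rows == 0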
