I am running the latest version (0.22) but I am getting errors when trying to use the BigQuery cache. With `source-facebook-marketing`, `source-quickbooks`, and `source-xero` I get the following trace:
```
Traceback (most recent call last):
  File "/Users/andrei/Work/cedara/pyairbyte/export_quickbooks.py", line 59, in <module>
    result = source.read(cache=cache)
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/sources/base.py", line 659, in read
    result = self._read_to_cache(
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/sources/base.py", line 743, in _read_to_cache
    cache._write_airbyte_message_stream(  # noqa: SLF001  # Non-public API
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/caches/base.py", line 325, in _write_airbyte_message_stream
    cache_processor.process_airbyte_messages(
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/shared/sql_processor.py", line 315, in process_airbyte_messages
    self._write_all_stream_data(
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/shared/sql_processor.py", line 329, in _write_all_stream_data
    self.write_stream_data(
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/shared/sql_processor.py", line 748, in write_stream_data
    temp_table_name = self._write_files_to_new_table(
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/_processors/sql/bigquery.py", line 178, in _write_files_to_new_table
    load_job = client.load_table_from_file(  # Make an API request
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/google/cloud/bigquery/client.py", line 2601, in load_table_from_file
    raise exceptions.from_http_response(exc.response)
google.api_core.exceptions.BadRequest: 400 POST https://bigquery.googleapis.com/upload/bigquery/v2/projects/REDACTED/jobs?uploadType=resumable: Invalid value for type: DECIMAL(38, 9) is not a valid value
```
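My guess from the 400 is that a rendered SQL type string (`DECIMAL(38, 9)`) is being passed as a BigQuery schema field type, while the jobs API only accepts bare type tokens such as `NUMERIC` or `BIGNUMERIC`. A minimal sketch that should reproduce the same error outside PyAirbyte (the project, dataset, table, and column names are placeholders, and it assumes default application credentials):

```python
import io

from google.cloud import bigquery

client = bigquery.Client()  # assumes default credentials and project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    # "DECIMAL(38, 9)" is a rendered SQL type string, not a BigQuery type
    # token, so the API rejects it. Swapping in "NUMERIC" or "BIGNUMERIC"
    # makes the same load succeed.
    schema=[bigquery.SchemaField("amount", "DECIMAL(38, 9)")],
)

data = io.BytesIO(b"123.45\n")

# Raises google.api_core.exceptions.BadRequest with the same
# "Invalid value for type" message as in the trace above.
load_job = client.load_table_from_file(
    data,
    "my-project.my_dataset.repro_decimal",  # placeholder table reference
    job_config=job_config,
)
load_job.result()
```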
It works correctly when I use `source-faker` (with both the `users` and `purchases` streams).
Any idea what's going on?
@aaronsteers I've done a bit more testing and identified that this broke in 0.17.9; it works fine with 0.17.8 and lower. Hopefully this intel gives some clues as to what the problem could be.
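In case it helps anyone else hitting this, pinning to the last working release is a usable workaround for now (assuming you installed PyAirbyte as the `airbyte` package from PyPI):

```
pip install "airbyte==0.17.8"
```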