Cache BigQuery: Invalid value for type: DECIMAL(38, 9) is not a valid value #573

Open
andreibaragan opened this issue Jan 14, 2025 · 1 comment

andreibaragan commented Jan 14, 2025

I am running the latest version (0.22), but I am getting errors when trying to use the BigQuery cache. With source-facebook-marketing, source-quickbooks, and source-xero I get the following trace:

```
Traceback (most recent call last):
  File "/Users/andrei/Work/cedara/pyairbyte/export_quickbooks.py", line 59, in <module>
    result = source.read(cache=cache)
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/sources/base.py", line 659, in read
    result = self._read_to_cache(
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/sources/base.py", line 743, in _read_to_cache
    cache._write_airbyte_message_stream(  # noqa: SLF001 # Non-public API
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/caches/base.py", line 325, in _write_airbyte_message_stream
    cache_processor.process_airbyte_messages(
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/shared/sql_processor.py", line 315, in process_airbyte_messages
    self._write_all_stream_data(
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/shared/sql_processor.py", line 329, in _write_all_stream_data
    self.write_stream_data(
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/shared/sql_processor.py", line 748, in write_stream_data
    temp_table_name = self._write_files_to_new_table(
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/airbyte/_processors/sql/bigquery.py", line 178, in _write_files_to_new_table
    load_job = client.load_table_from_file(  # Make an API request
  File "/Users/andrei/Work/cedara/pyairbyte/python310/lib/python3.10/site-packages/google/cloud/bigquery/client.py", line 2601, in load_table_from_file
    raise exceptions.from_http_response(exc.response)
google.api_core.exceptions.BadRequest: 400 POST https://bigquery.googleapis.com/upload/bigquery/v2/projects/REDACTED/jobs?uploadType=resumable: Invalid value for type: DECIMAL(38, 9) is not a valid value
```
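
For reference, this is roughly the script that triggers it (a minimal sketch: the cache settings, credentials path, and connector config below are placeholders, not my real values):

```python
import airbyte as ab
from airbyte.caches import BigQueryCache

# Placeholder BigQuery cache settings (real project/dataset/credentials redacted).
cache = BigQueryCache(
    project_name="my-gcp-project",
    dataset_name="pyairbyte_cache",
    credentials_path="/path/to/service_account.json",
)

# Same pattern as export_quickbooks.py; the real connector config is redacted.
source = ab.get_source(
    "source-quickbooks",
    config={},  # real connector config omitted here
    install_if_missing=True,
)
source.check()
source.select_all_streams()

# This is the call that ends in the 400 from the BigQuery load job.
result = source.read(cache=cache)
```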

It's working correctly when I use source-faker (with both users and purchases streams).

Any idea what's going on?
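
My (unverified) guess from the error text is that the literal SQL type string DECIMAL(38, 9) ends up as the column type in the load job schema, whereas the BigQuery client expects the base type with precision and scale supplied separately. A small sketch of the difference using google-cloud-bigquery (the amount field name is made up):

```python
from google.cloud import bigquery

# What the 400 error suggests is being sent: the parameterized SQL type name
# used directly as the field type.
# bigquery.SchemaField("amount", "DECIMAL(38, 9)")  # rejected: "Invalid value for type"

# What the load API accepts for a parameterized decimal: the base type plus
# separate precision/scale.
amount_field = bigquery.SchemaField("amount", "NUMERIC", precision=38, scale=9)

job_config = bigquery.LoadJobConfig(
    schema=[amount_field],
    source_format=bigquery.SourceFormat.CSV,
)
```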

@andreibaragan (Author) commented

@aaronsteers I've done a bit more testing and identified that this broke in 0.17.9; it works fine with 0.17.8 and lower. Hopefully this gives some clues as to what the problem could be.
