
Feat: Support Snowflake's travel time #414

Merged

Conversation

yohannj
Contributor

@yohannj yohannj commented Oct 7, 2024

Summary

Update sql_processor to allow appending "custom" extra clauses to the end of the CREATE TABLE statement.
Update the Snowflake processor to add an extra DATA_RETENTION_TIME_IN_DAYS clause to configure Time Travel.


This is a follow-up to a conversation on Slack to reach feature parity with Airbyte (PR)
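These two pieces can be sketched as follows. This is a minimal sketch, not the library's actual code: the class and method names come from the PR's change summary, while the `__init__` signature and the `int | None = None` default are assumptions based on the field being described as optional.

```python
from __future__ import annotations


class SqlConfig:
    """Base config: by default, no extra clauses are appended to CREATE TABLE."""

    def get_create_table_extra_clauses(self) -> list[str]:
        return []


class SnowflakeConfig(SqlConfig):
    """Snowflake config with an optional Time Travel retention setting."""

    def __init__(self, data_retention_time_in_days: int | None = None) -> None:
        self.data_retention_time_in_days = data_retention_time_in_days

    def get_create_table_extra_clauses(self) -> list[str]:
        clauses: list[str] = []
        if self.data_retention_time_in_days is not None:
            # Snowflake clause controlling how long Time Travel data is retained.
            clauses.append(
                f"DATA_RETENTION_TIME_IN_DAYS = {self.data_retention_time_in_days}"
            )
        return clauses
```

When the field is unset, the base behavior (no extra clauses) is preserved, so other SQL processors are unaffected.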

Summary by CodeRabbit

  • New Features

    • Introduced an optional field for data retention time in Snowflake configurations.
    • Enhanced SQL table creation with the ability to include additional clauses and dynamic primary key definitions.
  • Bug Fixes

    • Improved validation of SQL command generation for data retention settings in Snowflake.
  • Tests

    • Added unit tests to verify data retention time handling in Snowflake table creation.


coderabbitai bot commented Oct 7, 2024

📝 Walkthrough

Walkthrough

The changes introduce enhancements to the SnowflakeConfig and SqlConfig classes by adding a new optional field for data retention and methods to generate additional SQL clauses for table creation. The SnowflakeSqlProcessor class is tested to ensure correct SQL command generation based on these configurations. Unit tests validate both scenarios: with and without specified data retention time.

Changes

  • airbyte/_processors/sql/snowflake.py — Added optional field `data_retention_time_in_days`.
  • airbyte/shared/sql_processor.py — Added method `get_create_table_extra_clauses(self) -> list[str]` in SqlConfig; updated the `_create_table` method signature to accept `table_name`, `column_definition_str`, and `primary_keys`.
  • tests/unit_tests/test_processors.py — Added functions: test_snowflake_cache_config_data_retention_time_in_days, test_snowflake_cache_config_no_data_retention_time_in_days, and _build_mocked_snowflake_processor.
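To illustrate how the extra clauses might be spliced into the generated statement, here is a sketch of the assembly step. The function name and parameters are illustrative, not the library's exact internals.

```python
def build_create_table_sql(
    full_name: str,
    column_definition_str: str,
    extra_clauses: list[str],
) -> str:
    # Append each extra clause (e.g. DATA_RETENTION_TIME_IN_DAYS = 1) on its
    # own line after the closing parenthesis of the column definitions.
    extra = ("\n" + "\n".join(extra_clauses)) if extra_clauses else ""
    return (
        f"CREATE TABLE {full_name} (\n"
        f"{column_definition_str}\n"
        f"){extra}"
    )
```

With no extra clauses the output is an ordinary CREATE TABLE statement, which matches the "without data retention time" scenario exercised by the unit tests.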

Possibly related PRs

  • Chore: Bump to Sqlalchemy 2.0 #396: The changes in airbyte/_processors/sql/snowflake.py regarding SQL command construction and execution are related to the main PR, as both involve modifications to the SnowflakeConfig class and its methods.

Wdyt?




@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🧹 Outside diff range and nitpick comments (6)
tests/unit_tests/test_processors.py (4)

13-31: How about improving test isolation and SQL comparison?

The test looks good overall! A couple of suggestions to make it even better:

  1. Instead of using a global variable for actual_cmd, we could use a local variable and return it from the _execute_sql function. This would improve test isolation.

  2. The indentation in the expected_cmd string might cause comparison issues. We could use a heredoc-style string or textwrap.dedent() to make it cleaner.

What do you think about these changes? Here's a possible refactor:

import textwrap

def test_snowflake_cache_config_data_retention_time_in_days(
    mocker: pytest_mock.MockFixture,
):
    expected_cmd = textwrap.dedent("""
        CREATE TABLE airbyte_raw."table_name" (
            col_name type
        )
        DATA_RETENTION_TIME_IN_DAYS = 1
        """).strip()

    actual_cmd = None
    def _execute_sql(cmd):
        nonlocal actual_cmd
        actual_cmd = cmd
        
    mocker.patch.object(SnowflakeSqlProcessor, "_execute_sql", side_effect=_execute_sql)
    config = _build_mocked_snowflake_processor(mocker, data_retention_time_in_days=1)
    config._create_table(table_name="table_name", column_definition_str="col_name type")

    assert actual_cmd.strip() == expected_cmd

WDYT? This approach might make the test more robust and easier to maintain.


34-51: Shall we apply similar improvements here and address the extra newline?

Great to see a test for the case without data retention time! A couple of suggestions:

  1. We could apply the same improvements suggested for the previous test (using a local variable instead of global, and using textwrap.dedent()).

  2. There's an extra newline character at the end of the expected_cmd string. We might want to remove it to avoid potential whitespace comparison issues.

Here's a possible refactor:

import textwrap

def test_snowflake_cache_config_no_data_retention_time_in_days(
    mocker: pytest_mock.MockFixture,
):
    expected_cmd = textwrap.dedent("""
        CREATE TABLE airbyte_raw."table_name" (
            col_name type
        )
        """).strip()

    actual_cmd = None
    def _execute_sql(cmd):
        nonlocal actual_cmd
        actual_cmd = cmd
        
    mocker.patch.object(SnowflakeSqlProcessor, "_execute_sql", side_effect=_execute_sql)
    config = _build_mocked_snowflake_processor(mocker)
    config._create_table(table_name="table_name", column_definition_str="col_name type")

    assert actual_cmd.strip() == expected_cmd

What do you think about these changes? They should make the test more consistent with the previous one and avoid potential whitespace issues.


54-75: Looks good! How about adding a quick comment for the mock?

This helper function is well-structured and nicely parameterized. Good job on using SecretString for the password too!

One small suggestion: It might be helpful to add a quick comment explaining why we're mocking the _ensure_schema_exists method. Something like:

# Mock _ensure_schema_exists to isolate the test from actual schema creation
mocker.patch.object(
    SnowflakeSqlProcessor, "_ensure_schema_exists", return_value=None
)

What do you think? This could help future developers understand the test setup more quickly.


1-75: Great job on these tests! How about some additional scenarios?

These unit tests are well-structured and cover the main scenarios for the SnowflakeSqlProcessor's handling of data retention time. The use of a helper function for creating the mocked processor is a nice touch for code reuse.

To make the test suite even more robust, we might consider adding a few more test cases. Some ideas:

  1. Test with a large data retention time (e.g., 365 days)
  2. Test with a zero or negative data retention time (if these should be handled differently)
  3. Test error scenarios, like invalid data types for data_retention_time_in_days

What do you think about adding these? They could help catch edge cases and ensure the processor behaves correctly under various conditions.

airbyte/_processors/sql/snowflake.py (1)

47-55: New method looks great! Small suggestion for readability.

The get_create_table_extra_clauses method is well-implemented and aligns perfectly with the PR objective. It correctly adds the DATA_RETENTION_TIME_IN_DAYS clause when needed. Great job!

A tiny suggestion: wdyt about using a list comprehension for a more concise implementation? Something like:

def get_create_table_extra_clauses(self) -> list[str]:
    return [f"DATA_RETENTION_TIME_IN_DAYS = {self.data_retention_time_in_days}"] if self.data_retention_time_in_days is not None else []

This could make the method even more readable at a glance. What do you think? 🤔

airbyte/shared/sql_processor.py (1)

127-129: Would adding unit tests for the new method enhance test coverage?

Adding unit tests for get_create_table_extra_clauses and its integration within _create_table could help ensure that the extra clauses are handled correctly in different scenarios. Do you think it would be beneficial to include such tests? Wdyt?

Also applies to: 660-666

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📥 Commits

Files that changed between commits 53246a3 and 2f9de3e.

📒 Files selected for processing (3)
  • airbyte/_processors/sql/snowflake.py (1 hunks)
  • airbyte/shared/sql_processor.py (2 hunks)
  • tests/unit_tests/test_processors.py (1 hunks)
🧰 Additional context used
🔇 Additional comments (2)
airbyte/_processors/sql/snowflake.py (2)

45-45: LGTM! New field for data retention looks good.

The new data_retention_time_in_days field aligns perfectly with the PR objective to support Snowflake's Time Travel feature. The type and default value seem appropriate for an optional configuration. Nicely done! 👍


Line range hint 1-255: Verify usage of new configuration in SnowflakeSqlProcessor

The changes look good and are well-contained within the SnowflakeConfig class. However, I noticed that the SnowflakeSqlProcessor class doesn't seem to be using the new data_retention_time_in_days configuration yet.

Should we update the SnowflakeSqlProcessor to utilize this new configuration when creating tables? For example, in the _write_files_to_new_table method? What are your thoughts on this? 🤔

To help verify this, we can run the following script:

This will help us confirm if and where we might need to update the SnowflakeSqlProcessor to use the new configuration.

@aaronsteers
Contributor

aaronsteers commented Oct 21, 2024

/test-pr

PR test job started... Check job output.

❌ Tests failed.

Contributor

@aaronsteers aaronsteers left a comment


Thanks @yohannj for contributing! We might refactor the internals in the future (to reduce potential SQL injection footprint), but this looks solid to me and a great increment. The config input from the user won't be affected by future refactoring, so I'm happy to approve this.

Thanks so much for the contribution - and for your patience in us getting this merged.

@aaronsteers aaronsteers merged commit 3b50f3c into airbytehq:main Oct 22, 2024
9 checks passed
@yohannj yohannj deleted the support_snowflake_travel_time_config branch October 22, 2024 08:57