feat: Add stream on directory table #3129

Open · wants to merge 9 commits into base: main
48 changes: 45 additions & 3 deletions MIGRATION_GUIDE.md
@@ -7,15 +7,56 @@ across different versions.
> [!TIP]
> We highly recommend upgrading the versions one by one instead of bulk upgrades.
## v0.97.0 ➞ v0.98.0

### *(new feature)* snowflake_stream_on_directory_table resource
Continuing the changes made in v0.96 to enhance clarity and functionality, the new resource `snowflake_stream_on_directory_table` has been introduced to replace the previous `snowflake_stream`. Because the old resource carried multiple responsibilities within a single entity, we opted to split it into more specialized resources.
The newly introduced resources are aligned with the latest Snowflake documentation at the time of implementation, and adhere to our [new conventions](#general-changes).
This segregation was based on the object on which the stream is created. The mapping between SQL statements and the resources is the following:
- `ON TABLE <table_name>` -> `snowflake_stream_on_table` (added in v0.96)
- `ON EXTERNAL TABLE <external_table_name>` -> `snowflake_stream_on_external_table` (this was previously not supported, added in v0.96)
- `ON STAGE <stage_name>` -> `snowflake_stream_on_directory_table`

To use the new `stream_on_directory_table`, change the old `stream` from
```terraform
resource "snowflake_stream" "stream" {
  name     = "stream"
  schema   = "schema"
  database = "database"
  on_stage = snowflake_stage.stage.fully_qualified_name
  comment  = "A stream."
}
```

to

```terraform
resource "snowflake_stream_on_directory_table" "stream" {
  name     = "stream"
  schema   = "schema"
  database = "database"
  stage    = snowflake_stage.stage.fully_qualified_name
  comment  = "A stream."
}
```

Then, follow our [Resource migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md).
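In practice, the migration boils down to removing the old resource from Terraform state and importing the existing Snowflake object under its new resource address. A hypothetical shell sketch (the resource addresses and identifier below are illustrative; the linked migration guide is the authoritative procedure):

```shell
# Remove the old resource from state only; the stream in Snowflake is untouched.
terraform state rm snowflake_stream.stream

# After updating the configuration to snowflake_stream_on_directory_table,
# import the existing stream under its new resource address.
terraform import snowflake_stream_on_directory_table.stream '"database"."schema"."stream"'

# Verify that the resulting plan is empty.
terraform plan
```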


## v0.96.0 ➞ v0.97.0
Collaborator: This is already present at line 10.

Author: Fixed.
### *(new feature)* snowflake_stream_on_table, snowflake_stream_on_external_table resources

To enhance clarity and functionality, the new resources `snowflake_stream_on_table`, `snowflake_stream_on_external_table` and `snowflake_stream_on_directory_table` have been introduced to replace the previous `snowflake_stream`. Recognizing that the old resource carried multiple responsibilities within a single entity, we opted to divide it into more specialized resources.
The newly introduced resources are aligned with the latest Snowflake documentation at the time of implementation, and adhere to our [new conventions](#general-changes).
This segregation was based on the object on which the stream is created. The mapping between SQL statements and the resources is the following:
- `ON TABLE <table_name>` -> `snowflake_stream_on_table`
- `ON EXTERNAL TABLE <external_table_name>` -> `snowflake_stream_on_external_table` (this was previously not supported)
- `ON STAGE <stage_name>` -> `snowflake_stream_on_directory_table`

The resource for streams on views will be implemented in future releases.

@@ -35,7 +76,7 @@ resource "snowflake_stream" "stream" {

to

```terraform
resource "snowflake_stream_on_table" "stream" {
  name     = "stream"
  schema   = "schema"
@@ -48,6 +89,7 @@ resource "snowflake_stream_on_table" "stream" {
}
```


Then, follow our [Resource migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/resource_migration.md).

### *(new feature)* new snowflake_service_user and snowflake_legacy_service_user resources
@@ -83,7 +125,7 @@ resource "snowflake_user" "service_user" {
  lifecycle {
    ignore_changes = [user_type]
  }
  name       = "Snowflake Service User"
  login_name = "service_user"
  email      = "[email protected]"
126 changes: 126 additions & 0 deletions docs/resources/stream_on_directory_table.md
@@ -0,0 +1,126 @@
---
page_title: "snowflake_stream_on_directory_table Resource - terraform-provider-snowflake"
subcategory: ""
description: |-
Resource used to manage streams on directory tables. For more information, check stream documentation https://docs.snowflake.com/en/sql-reference/sql/create-stream.
---

!> **V1 release candidate** This resource was reworked and is a release candidate for the V1. We do not expect significant changes in it before the V1. We will welcome any feedback and adjust the resource if needed. Any errors reported will be resolved with a higher priority. We encourage checking this resource out before the V1 release. Please follow the [migration guide](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/MIGRATION_GUIDE.md#v0970--v0980) to use it.

# snowflake_stream_on_directory_table (Resource)

Resource used to manage streams on directory tables. For more information, check [stream documentation](https://docs.snowflake.com/en/sql-reference/sql/create-stream).

## Example Usage

```terraform
resource "snowflake_stage" "example_stage" {
  name        = "EXAMPLE_STAGE"
  url         = "s3://com.example.bucket/prefix"
  database    = "EXAMPLE_DB"
  schema      = "EXAMPLE_SCHEMA"
  credentials = "AWS_KEY_ID='${var.example_aws_key_id}' AWS_SECRET_KEY='${var.example_aws_secret_key}'"
}

# basic resource
resource "snowflake_stream_on_directory_table" "stream" {
  name     = "stream"
  schema   = "schema"
  database = "database"
  stage    = snowflake_stage.example_stage.fully_qualified_name
}

# resource with more fields set
resource "snowflake_stream_on_directory_table" "stream" {
  name        = "stream"
  schema      = "schema"
  database    = "database"
  copy_grants = true
  stage       = snowflake_stage.example_stage.fully_qualified_name
  comment     = "A stream."
}
```
-> **Note** Instead of using `fully_qualified_name`, you can reference objects managed outside Terraform by constructing the correct ID manually; consult the [identifiers guide](https://registry.terraform.io/providers/Snowflake-Labs/snowflake/latest/docs/guides/identifiers#new-computed-fully-qualified-name-field-in-resources).
<!-- TODO(SNOW-1634854): include an example showing both methods-->
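As a hypothetical sketch of the manually constructed ID method (all names below are placeholders, not taken from this PR), a stage that is not managed by this Terraform configuration could be referenced like so:

```terraform
resource "snowflake_stream_on_directory_table" "stream" {
  name     = "stream"
  schema   = "schema"
  database = "database"

  # Manually constructed fully qualified identifier of a stage
  # managed outside of this Terraform configuration.
  stage = "\"EXAMPLE_DB\".\"EXAMPLE_SCHEMA\".\"EXAMPLE_STAGE\""
}
```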

<!-- schema generated by tfplugindocs -->
## Schema

### Required

- `database` (String) The database in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `(`, `)`, `"`
- `name` (String) Specifies the identifier for the stream; must be unique for the database and schema in which the stream is created. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `(`, `)`, `"`
- `schema` (String) The schema in which to create the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `(`, `)`, `"`
- `stage` (String) Specifies an identifier for the stage the stream will monitor. Due to Snowflake limitations, the provider cannot read the stage's database and schema; for stages, Snowflake returns only a partially qualified name instead of a fully qualified one. Please use stages located in the same schema as the stream. Due to technical limitations (read more [here](https://github.com/Snowflake-Labs/terraform-provider-snowflake/blob/main/docs/technical-documentation/identifiers_rework_design_decisions.md#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `(`, `)`, `"`

### Optional

- `comment` (String) Specifies a comment for the stream.
- `copy_grants` (Boolean) Retains the access permissions from the original stream when the stream is recreated using the OR REPLACE clause. This is sometimes used when the provider detects changes to fields that cannot be changed by ALTER. This value has no effect when creating a new stream.

### Read-Only

- `describe_output` (List of Object) Outputs the result of `DESCRIBE STREAM` for the given stream. (see [below for nested schema](#nestedatt--describe_output))
- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution).
- `id` (String) The ID of this resource.
- `show_output` (List of Object) Outputs the result of `SHOW STREAMS` for the given stream. (see [below for nested schema](#nestedatt--show_output))
- `stale` (Boolean) Indicates whether the stream is stale. When Terraform detects that the stream is stale, the stream is recreated with `CREATE OR REPLACE`. Read more on stream staleness in the Snowflake [docs](https://docs.snowflake.com/en/user-guide/streams-intro#data-retention-period-and-staleness).

<a id="nestedatt--describe_output"></a>
### Nested Schema for `describe_output`

Read-Only:

- `base_tables` (List of String)
- `comment` (String)
- `created_on` (String)
- `database_name` (String)
- `invalid_reason` (String)
- `mode` (String)
- `name` (String)
- `owner` (String)
- `owner_role_type` (String)
- `schema_name` (String)
- `source_type` (String)
- `stale` (String)
- `stale_after` (String)
- `table_name` (String)
- `type` (String)


<a id="nestedatt--show_output"></a>
### Nested Schema for `show_output`

Read-Only:

- `base_tables` (List of String)
- `comment` (String)
- `created_on` (String)
- `database_name` (String)
- `invalid_reason` (String)
- `mode` (String)
- `name` (String)
- `owner` (String)
- `owner_role_type` (String)
- `schema_name` (String)
- `source_type` (String)
- `stale` (String)
- `stale_after` (String)
- `table_name` (String)
- `type` (String)

## Import

Import is supported using the following syntax:

```shell
terraform import snowflake_stream_on_directory_table.example '"<database_name>"."<schema_name>"."<stream_name>"'
```
1 change: 1 addition & 0 deletions docs/resources/stream_on_external_table.md
@@ -89,6 +89,7 @@ resource "snowflake_stream_on_external_table" "stream" {
- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution).
- `id` (String) The ID of this resource.
- `show_output` (List of Object) Outputs the result of `SHOW STREAMS` for the given stream. (see [below for nested schema](#nestedatt--show_output))
- `stale` (Boolean) Indicates whether the stream is stale. When Terraform detects that the stream is stale, the stream is recreated with `CREATE OR REPLACE`. Read more on stream staleness in the Snowflake [docs](https://docs.snowflake.com/en/user-guide/streams-intro#data-retention-period-and-staleness).

<a id="nestedblock--at"></a>
### Nested Schema for `at`
1 change: 1 addition & 0 deletions docs/resources/stream_on_table.md
@@ -74,6 +74,7 @@ resource "snowflake_stream_on_table" "stream" {
- `fully_qualified_name` (String) Fully qualified name of the resource. For more information, see [object name resolution](https://docs.snowflake.com/en/sql-reference/name-resolution).
- `id` (String) The ID of this resource.
- `show_output` (List of Object) Outputs the result of `SHOW STREAMS` for the given stream. (see [below for nested schema](#nestedatt--show_output))
- `stale` (Boolean) Indicates whether the stream is stale. When Terraform detects that the stream is stale, the stream is recreated with `CREATE OR REPLACE`. Read more on stream staleness in the Snowflake [docs](https://docs.snowflake.com/en/user-guide/streams-intro#data-retention-period-and-staleness).

<a id="nestedblock--at"></a>
### Nested Schema for `at`
@@ -0,0 +1 @@
terraform import snowflake_stream_on_directory_table.example '"<database_name>"."<schema_name>"."<stream_name>"'
33 changes: 33 additions & 0 deletions examples/resources/snowflake_stream_on_directory_table/resource.tf
@@ -0,0 +1,33 @@
resource "snowflake_stage" "example_stage" {
  name        = "EXAMPLE_STAGE"
  url         = "s3://com.example.bucket/prefix"
  database    = "EXAMPLE_DB"
  schema      = "EXAMPLE_SCHEMA"
  credentials = "AWS_KEY_ID='${var.example_aws_key_id}' AWS_SECRET_KEY='${var.example_aws_secret_key}'"
}

# basic resource
resource "snowflake_stream_on_directory_table" "stream" {
  name     = "stream"
  schema   = "schema"
  database = "database"

  stage = snowflake_stage.example_stage.fully_qualified_name
}

# resource with more fields set
resource "snowflake_stream_on_directory_table" "stream" {
  name     = "stream"
  schema   = "schema"
  database = "database"

  copy_grants = true
  stage       = snowflake_stage.example_stage.fully_qualified_name

  comment = "A stream."
}
@@ -63,7 +63,11 @@ func (s *StreamAssert) HasBaseTables(expected ...sdk.SchemaObjectIdentifier) *St
	}
	var errs []error
	for _, wantId := range expected {
		if !slices.ContainsFunc(o.BaseTables, func(gotName string) bool {
			gotId, err := sdk.ParseSchemaObjectIdentifier(gotName)
			if err != nil {
				errs = append(errs, err)
				return false
			}
			return wantId.FullyQualifiedName() == gotId.FullyQualifiedName()
		}) {
			errs = append(errs, fmt.Errorf("expected id: %s, to be in the list ids: %v", wantId.FullyQualifiedName(), o.BaseTables))
@@ -74,6 +78,23 @@
	return s
}

func (s *StreamAssert) HasBaseTablesPartiallyQualified(expected ...string) *StreamAssert {
	s.AddAssertion(func(t *testing.T, o *sdk.Stream) error {
		t.Helper()
		if len(o.BaseTables) != len(expected) {
			return fmt.Errorf("expected base tables length: %v; got: %v", len(expected), len(o.BaseTables))
		}
		var errs []error
		for _, wantName := range expected {
			if !slices.Contains(o.BaseTables, wantName) {
				errs = append(errs, fmt.Errorf("expected name: %s, to be in the list of names: %v", wantName, o.BaseTables))
			}
		}
		return errors.Join(errs...)
	})
	return s
}

func (s *StreamAssert) HasMode(expected sdk.StreamMode) *StreamAssert {
	s.AddAssertion(func(t *testing.T, o *sdk.Stream) error {
		t.Helper()
@@ -65,4 +65,8 @@ var allResourceSchemaDefs = []ResourceSchemaDef{
		name:   "StreamOnExternalTable",
		schema: resources.StreamOnExternalTable().Schema,
	},
	{
		name:   "StreamOnDirectoryTable",
		schema: resources.StreamOnDirectoryTable().Schema,
	},
}