Add support for the import_table block (#120)
* feat: support table imports

* chore: update the import_table to a block

* chore: improve null handling and fix s3_bucket_source values

* chore: fix import_table var refs

* docs: update readme

* chore: fmt

* docs: readme update

* chore: update required TF version to 1.0

* docs: readme updates
johncblandii authored Feb 14, 2024
1 parent 1a11bea commit 78293c3
Showing 6 changed files with 144 additions and 194 deletions.
281 changes: 91 additions & 190 deletions README.md


3 changes: 2 additions & 1 deletion docs/terraform.md
@@ -3,7 +3,7 @@

| Name | Version |
|------|---------|
- | <a name="requirement_terraform"></a> [terraform](#requirement\_terraform) | >= 0.13.0 |
+ | <a name="requirement_terraform"></a> [terraform](#requirement\_terraform) | >= 1.0.0 |
| <a name="requirement_aws"></a> [aws](#requirement\_aws) | >= 4.59 |
| <a name="requirement_null"></a> [null](#requirement\_null) | >= 2.0 |

@@ -59,6 +59,7 @@
| <a name="input_hash_key"></a> [hash\_key](#input\_hash\_key) | DynamoDB table Hash Key | `string` | n/a | yes |
| <a name="input_hash_key_type"></a> [hash\_key\_type](#input\_hash\_key\_type) | Hash Key type, which must be a scalar type: `S`, `N`, or `B` for (S)tring, (N)umber or (B)inary data | `string` | `"S"` | no |
| <a name="input_id_length_limit"></a> [id\_length\_limit](#input\_id\_length\_limit) | Limit `id` to this many characters (minimum 6).<br>Set to `0` for unlimited length.<br>Set to `null` to keep the existing setting, which defaults to `0`.<br>Does not affect `id_full`. | `number` | `null` | no |
| <a name="input_import_table"></a> [import\_table](#input\_import\_table) | Import Amazon S3 data into a new table. | <pre>object({<br> # Valid values are GZIP, ZSTD and NONE<br> input_compression_type = optional(string, null)<br> # Valid values are CSV, DYNAMODB_JSON, and ION.<br> input_format = string<br> input_format_options = optional(object({<br> csv = object({<br> delimiter = string<br> header_list = list(string)<br> })<br> }), null)<br> s3_bucket_source = object({<br> bucket = string<br> bucket_owner = optional(string)<br> key_prefix = optional(string)<br> })<br> })</pre> | `null` | no |
| <a name="input_label_key_case"></a> [label\_key\_case](#input\_label\_key\_case) | Controls the letter case of the `tags` keys (label names) for tags generated by this module.<br>Does not affect keys of tags passed in via the `tags` input.<br>Possible values: `lower`, `title`, `upper`.<br>Default value: `title`. | `string` | `null` | no |
| <a name="input_label_order"></a> [label\_order](#input\_label\_order) | The order in which the labels (ID elements) appear in the `id`.<br>Defaults to ["namespace", "environment", "stage", "name", "attributes"].<br>You can omit any of the 6 labels ("tenant" is the 6th), but at least one must be present. | `list(string)` | `null` | no |
| <a name="input_label_value_case"></a> [label\_value\_case](#input\_label\_value\_case) | Controls the letter case of ID elements (labels) as included in `id`,<br>set as tag values, and output by this module individually.<br>Does not affect values of tags passed in via the `tags` input.<br>Possible values: `lower`, `title`, `upper` and `none` (no transformation).<br>Set this to `title` and set `delimiter` to `""` to yield Pascal Case IDs.<br>Default value: `lower`. | `string` | `null` | no |
2 changes: 1 addition & 1 deletion examples/complete/versions.tf
@@ -1,5 +1,5 @@
terraform {
-  required_version = ">= 0.13.0"
+  required_version = ">= 1.0.0"

required_providers {
aws = {
26 changes: 26 additions & 0 deletions main.tf
@@ -93,6 +93,32 @@ resource "aws_dynamodb_table" "default" {
}
}

dynamic "import_table" {
for_each = var.import_table != null ? [1] : []

content {
input_compression_type = var.import_table.input_compression_type
input_format = var.import_table.input_format

dynamic "input_format_options" {
for_each = lookup(var.import_table, "input_format_options", null) != null ? [1] : []

content {
csv {
delimiter = var.import_table.input_format_options.csv.delimiter
header_list = var.import_table.input_format_options.csv.header_list
}
}
}

s3_bucket_source {
bucket = var.import_table.s3_bucket_source.bucket
bucket_owner = var.import_table.s3_bucket_source.bucket_owner
key_prefix = var.import_table.s3_bucket_source.key_prefix
}
}
}

dynamic "local_secondary_index" {
for_each = var.local_secondary_index_map
content {
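The `dynamic "import_table"` block above is driven entirely by the `import_table` variable: when it is `null`, `for_each` receives an empty list and the block is omitted. As a hypothetical sketch (not part of this commit; the source path, names, and bucket are placeholders), invoking the module with a CSV import could look like:

```hcl
# Illustrative only: source path, names, and bucket are placeholders.
module "dynamodb_table" {
  source = "../.." # assumes this file lives under the module's examples/ directory

  namespace = "eg"
  stage     = "test"
  name      = "imported"
  hash_key  = "id" # hash_key_type defaults to "S"

  # Populate the new table from a gzipped CSV export in S3.
  import_table = {
    input_compression_type = "GZIP"
    input_format           = "CSV"

    input_format_options = {
      csv = {
        delimiter   = ","
        header_list = ["id", "name", "price"]
      }
    }

    s3_bucket_source = {
      bucket     = "my-export-bucket"
      key_prefix = "exports/"
    }
  }
}
```

Leaving `import_table` at its `null` default skips the block entirely, so existing tables managed by the module are unaffected.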
24 changes: 23 additions & 1 deletion variables.tf
@@ -179,4 +179,26 @@ variable "deletion_protection_enabled" {
type = bool
default = false
description = "Enable/disable DynamoDB table deletion protection"
}

variable "import_table" {
type = object({
# Valid values are GZIP, ZSTD and NONE
input_compression_type = optional(string, null)
# Valid values are CSV, DYNAMODB_JSON, and ION.
input_format = string
input_format_options = optional(object({
csv = object({
delimiter = string
header_list = list(string)
})
}), null)
s3_bucket_source = object({
bucket = string
bucket_owner = optional(string)
key_prefix = optional(string)
})
})
default = null
description = "Import Amazon S3 data into a new table."
}
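Because `input_compression_type` and `input_format_options` are declared `optional(..., null)`, a DynamoDB JSON import needs only the format and the bucket source. A sketch with illustrative values (the bucket name and account ID are placeholders):

```hcl
# terraform.tfvars -- illustrative values only
import_table = {
  input_format = "DYNAMODB_JSON"

  s3_bucket_source = {
    bucket       = "my-ddb-exports"
    bucket_owner = "111111111111" # placeholder AWS account ID
    key_prefix   = "AWSDynamoDB/data/"
  }
}
```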
2 changes: 1 addition & 1 deletion versions.tf
@@ -1,5 +1,5 @@
terraform {
-  required_version = ">= 0.13.0"
+  required_version = ">= 1.0.0"

required_providers {
aws = {
