[Internal] Use Plugin Framework types internally in generated TF SDK structures #4291
Looks great to me! The tests and comments are really helpful. Awesome job!
```go
SliceStructPtr types.List   `tfsdk:"slice_struct_ptr" tf:"optional"`
Irrelevant     types.String `tfsdk:"-"`
Object         types.Object `tfsdk:"object" tf:"optional"`
ObjectPtr      types.Object `tfsdk:"object_ptr" tf:"optional"`
```
Remove `optional`?
If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:

Trigger:
Inputs:

Checks will be approved automatically on success.

Test Details: go/deco-tests/12259752305
### New Features and Improvements
 * Add `databricks_app` resource and data source ([#4099](#4099)).

### Documentation
 * Add a warning that attribute should be used in `databricks_permissions` for `databricks_vector_search_endpoint` ([#4312](#4312)).

### Internal Changes
 * Added TF Plugin Framework checkbox item to PR template and removed checkbox for integration tests passing ([#4227](#4227)).
 * Expose several integration test helpers for use in plugin framework integration tests ([#4310](#4310)).
 * Fix ReadOnly() for ListNestedAttribute and Validators for ListNestedBlock ([#4313](#4313)).
 * Panic if the provided path is invalid ([#4309](#4309)).
 * Simplify `databricks_storage_credential` code ([#4301](#4301)).
 * Use Attributes by default for List Objects ([#4315](#4315)).
 * Use Plugin Framework types internally in generated TF SDK structures ([#4291](#4291)).
## Changes
Under some circumstances, the generated TF SDK structures may crash when trying to read the plan during create or update. This happens specifically when the generated structure includes a plain Go type, such as a slice or map, and the corresponding attribute is null or unknown in the plan: plain slices and maps have no way to represent these states, so reading the plan simply fails.
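For illustration, here is a minimal sketch of the difference, using the standard `terraform-plugin-framework` `types` package; the struct and field names are hypothetical, not from this PR:

```go
package main

import (
	"fmt"

	"github.com/hashicorp/terraform-plugin-framework/types"
)

// Before: a plain Go slice cannot distinguish an empty list from a null or
// unknown plan value, so decoding a null/unknown attribute into it fails.
type clusterBefore struct {
	NodeTypes []string `tfsdk:"node_types"`
}

// After: types.List carries null and unknown states explicitly.
type clusterAfter struct {
	NodeTypes types.List `tfsdk:"node_types" tf:"optional"`
}

func main() {
	l := types.ListNull(types.StringType)  // a null list of strings
	fmt.Println(l.IsNull(), l.IsUnknown()) // prints: true false
}
```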
Terraform recommends using the plugin framework types for all fields, including lists, objects, and maps. This PR changes the generated structures to use `types.List` and `types.Map` for all list, object, and map attributes in all generated code. Because these types do not include compile-time metadata about the type of the contained element, the contained element types are exposed at runtime through a new `GetComplexFieldTypes()` method. This returns a map from resource field name (specifically, the value of the `tfsdk` tag) to the `reflect.Type` instance of the contained type, which must be either a primitive type from the plugin framework type system or a TF SDK struct type. In this PR, I added support for only 4 primitive types: `types.String`, `types.Bool`, `types.Int64`, and `types.Float64`.
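As a non-authoritative sketch of the shape of this method (the `App` and `ComputeSpec` structs and their fields are hypothetical, and the actual generated code may differ):

```go
import (
	"reflect"

	"github.com/hashicorp/terraform-plugin-framework/types"
)

// Hypothetical TF SDK struct with one primitive-element list and one
// struct-element list.
type App struct {
	Tags    types.List `tfsdk:"tags" tf:"optional"`
	Compute types.List `tfsdk:"compute" tf:"optional"`
}

type ComputeSpec struct {
	Name types.String `tfsdk:"name" tf:"optional"`
}

// GetComplexFieldTypes maps each complex field's tfsdk tag to the
// reflect.Type of its contained element: either a supported plugin
// framework primitive or another TF SDK struct.
func (a App) GetComplexFieldTypes() map[string]reflect.Type {
	return map[string]reflect.Type{
		"tags":    reflect.TypeOf(types.String{}), // list of strings
		"compute": reflect.TypeOf(ComputeSpec{}),  // list of TF SDK structs
	}
}
```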
Additional methods are also added via code generation, which accounts for most of the lines of code in this PR (a sketch follows this list):

* `Type(context.Context) attr.Type` returns the Terraform type for the current object. This is always a `basetypes.ObjectType`. It should recursively call the same method on other TF SDK structures that are contained in list, map, or object fields.
* `ToObjectValue(context.Context) types.ObjectValue` converts the TF SDK object to a `types.ObjectValue` instance. This makes it simpler to construct other `attr.Value`s, such as lists and maps, from a TF SDK object.
* `Get...(context.Context)` and `Set...(context.Context, ...)` are convenience getters and setters for list, map, and object fields within your structure.

`GoSdkToTfSdkStruct`, `TfSdkToGoSdkStruct`, and `ResourceStructToSchema` are all updated to handle the new structure of these TF SDK structs.
This PR does not change the default behavior of treating nested structs & pointers to structs as list blocks. However, it does add support for `types.Object`, so when we decide to change such fields to use it later, it should work as expected. Additionally, support for list attributes is not implemented here, but the change should be fairly easy (a small tweak in `typeToSchema`).

Note: I did need to make a manual change to one autogenerated file: `ComplianceSecurityProfile` references `ComplianceStandards` from the settings package. This is an enum, so it should be treated as a string, but our current API spec doesn't provide enough metadata for our autogeneration templates to determine this, so it is treated as an object. This should be fixed after @renaudhartert-db's work to add duplicated structs in the API spec to eliminate cross-package dependencies.
## Tests
Existing tests should continue to pass. Added a test case covering `types.Object` fields in conventions and `ResourceStructToSchema`. I have been running the integration tests from a feature branch implementing the `databricks_app` resource; they have succeeded locally.