[Feature] Add datasource databricks_users #4028

Open

dgomez04 wants to merge 34 commits into main from feature/3468-databricks_users

Commits (34)
05f70c1
#3468: added initial data_users.go resource
dgomez04 Sep 13, 2024
f2dd1c2
#3468: made changes to data_users.go to explicitly define schema, and…
dgomez04 Sep 16, 2024
2d594a9
#3468: added 'user_name_contains' attribute to data_users.go to allow…
dgomez04 Sep 17, 2024
0d6f280
#3468: added acceptance tests for data_users
dgomez04 Sep 17, 2024
b6380f8
#3468: added documentation for databricks_users data source.
dgomez04 Sep 17, 2024
1c703b4
renamed resource func to DataSourceUsers, removed various acceptance …
dgomez04 Sep 20, 2024
dbd79d8
added correct reference to data resource 'databricks_users' on the ac…
dgomez04 Sep 20, 2024
4ed08d4
name format changes to acceptance tests
dgomez04 Sep 20, 2024
072123a
modified acceptance test
dgomez04 Oct 2, 2024
bffcf54
fixed integration test to point to correct import
dgomez04 Oct 2, 2024
7fa5823
#3468: started migrating to the plugin framework, adding support for …
dgomez04 Oct 23, 2024
1c58eaa
migrated data_users to plugin framework and added support for both pr…
dgomez04 Oct 24, 2024
c2d70b9
deleted sdkv2 'data_users' and removed it from sdkv2.go
dgomez04 Oct 24, 2024
4ab1e27
migrated test
dgomez04 Oct 24, 2024
8db0188
added acceptance tests and added the data source to pluginfw.go
dgomez04 Oct 24, 2024
2e46921
modified docs to reflect changes
dgomez04 Oct 24, 2024
ed587ee
Merge branch 'main' into feature/3468-databricks_users
dgomez04 Oct 24, 2024
6e4b8ed
added correct attributes to the docs
dgomez04 Oct 24, 2024
5604c68
added correct attributes to the docs
dgomez04 Oct 24, 2024
f285db5
Merge remote-tracking branch 'origin/feature/3468-databricks_users' i…
dgomez04 Oct 24, 2024
a628819
Merge branch 'main' into feature/3468-databricks_users
dgomez04 Oct 31, 2024
d5ac8b4
Fix `fmt` on pluginfw.go
dgomez04 Oct 31, 2024
32e79f6
merged upstream
dgomez04 Oct 31, 2024
6f6d19e
merged with upstream
dgomez04 Nov 1, 2024
88608c1
removed unnecessary tests, switched to use a filter field to leverage …
dgomez04 Nov 1, 2024
26c69fb
added resource to rollout utils
dgomez04 Nov 1, 2024
9e54137
updated docs to reflect changes
dgomez04 Nov 1, 2024
43b66ec
added requested changes
dgomez04 Nov 20, 2024
27f6d32
merged upstream and resolved conflicts
dgomez04 Nov 20, 2024
7a47c2a
Merge remote-tracking branch 'upstream/main' into feature/3468-databr…
dgomez04 Nov 25, 2024
7949b7d
merged latest changes from main branch and fixed a test definition
dgomez04 Nov 25, 2024
03fad81
made requested changes
dgomez04 Nov 27, 2024
a642e74
Merge remote-tracking branch 'upstream/main' into feature/3468-databr…
dgomez04 Nov 27, 2024
99d7d2c
added files into the products folder and fixed tests
dgomez04 Nov 27, 2024
99 changes: 99 additions & 0 deletions docs/data-sources/users.md
@@ -0,0 +1,99 @@
---
subcategory: "Security"
---

# databricks_users Data Source

-> This data source works with both the account-level and workspace-level provider.

-> If you have a fully automated setup with workspaces created by [databricks_mws_workspaces](../resources/mws_workspaces.md) or [azurerm_databricks_workspace](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/databricks_workspace), please make sure to add [depends_on attribute](../guides/troubleshooting.md#data-resources-and-authentication-is-not-configured-errors) in order to prevent _default auth: cannot configure default credentials_ errors.
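
A minimal sketch of such a configuration (the `databricks_mws_workspaces.this` reference is illustrative and depends on how the workspace is provisioned):

```hcl
data "databricks_users" "all" {
  depends_on = [databricks_mws_workspaces.this]
}
```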

Retrieves information about multiple [databricks_user](../resources/user.md) resources.

## Example Usage

Adding a subset of users to a group

```hcl
data "databricks_users" "company_users" {
filter = "userName co \"@domain.org\""
}

resource "databricks_group" "data_users_group" {
display_name = "Data Users"
}

resource "databricks_group_member" "add_users_to_group" {
for_each = { for user in data.databricks_users.company_users.users : user.id => user }
group_id = databricks_group.data_users_group.id
member_id = each.value.id
}
```

## Argument Reference

This data source allows you to filter the list of users using the following optional arguments:

-> Attribute names and operators used in filters are case-insensitive. Find more information [here](https://datatracker.ietf.org/doc/html/rfc7644#section-3.4.2.2).

- `filter` - (Optional) Query used to filter the results. If not specified, all users are returned. Supported operators are equals (`eq`), contains (`co`), starts with (`sw`), and not equals (`ne`). Simple expressions can be combined using the logical operators `and` and `or`.

**Examples:**
- User whose `displayName` equals "john":
```hcl
filter = "displayName eq \"john\""
```
- User whose `displayName` contains "john" or `userName` contains "@domain.org":
```hcl
filter = "displayName co \"john\" or userName co \"@domain.org\""
```

- `extra_attributes` - (Optional) A comma-separated list of additional user attributes to include in the results. By default, the data source returns the following attributes: `id`, `userName`, `displayName`, and `externalId`. Use this argument to request additional attributes as needed. The list of all available attributes can be found in the [API reference](https://docs.databricks.com/api/workspace/users/list).
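
For example, to also return each user's group memberships alongside the default attributes (the data source name here is illustrative):

```hcl
data "databricks_users" "with_groups" {
  filter           = "userName co \"@domain.org\""
  extra_attributes = "groups"
}
```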

## Attribute Reference

This data source exposes the following attributes:

- `users` - A list of users matching the specified criteria. Each user has the following attributes:
  - `id` - The ID of the user.
  - `userName` - The username of the user.
  - `emails` - All the emails associated with the Databricks user.
  - `name` - The name of the user, consisting of:
    - `givenName` - Given name of the Databricks user.
    - `familyName` - Family name of the Databricks user.
  - `displayName` - The display name of the user.
  - `groups` - The groups the user belongs to. Each entry contains:
    - `$ref`
    - `value`
    - `display`
    - `primary`
    - `type`
  - `entitlements` - Entitlements assigned to the user. Each entry contains:
    - `$ref`
    - `value`
    - `display`
    - `primary`
    - `type`
  - `roles` - The roles assigned to the user. Each entry contains:
    - `$ref`
    - `value`
    - `display`
    - `primary`
    - `type`
  - `schemas` - The SCIM schemas of the user record.
  - `externalId` - Reserved for future use.
  - `active` - Boolean indicating whether the user is active.
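
The `users` list can then be referenced elsewhere in a configuration, for example to collect the IDs of all matched users from the example above:

```hcl
output "company_user_ids" {
  value = [for u in data.databricks_users.company_users.users : u.id]
}
```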

## Related Resources

The following resources are used in the same context:

- [**databricks_user**](../resources/user.md): Resource to manage individual users in Databricks.

- [**databricks_group**](../resources/group.md): Resource to manage groups in Databricks.

- [**databricks_group_member**](../resources/group_member.md): Resource to manage group memberships by adding users to groups.

- [**databricks_permissions**](../resources/permissions.md): Resource to manage access control in the Databricks workspace.

- [**databricks_current_user**](current_user.md): Data source to retrieve information about the user or service principal that is calling the Databricks REST API.
2 changes: 2 additions & 0 deletions internal/providers/pluginfw/pluginfw_rollout_utils.go
@@ -20,6 +20,7 @@ import (
"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/products/registered_model"
"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/products/serving"
"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/products/sharing"
"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/products/user"
"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/products/volume"
"github.com/hashicorp/terraform-plugin-framework/datasource"
"github.com/hashicorp/terraform-plugin-framework/resource"
@@ -48,6 +49,7 @@ var pluginFwOnlyDataSources = []func() datasource.DataSource{
registered_model.DataSourceRegisteredModel,
registered_model.DataSourceRegisteredModelVersions,
notificationdestinations.DataSourceNotificationDestinations,
user.DataSourceUsers,
catalog.DataSourceFunctions,
// TODO: Add DataSourceCluster into migratedDataSources after fixing unit tests.
cluster.DataSourceCluster, // Using the staging name (with pluginframework suffix)
104 changes: 104 additions & 0 deletions internal/providers/pluginfw/products/user/data_users.go
@@ -0,0 +1,104 @@
package user

import (
"context"

"github.com/databricks/databricks-sdk-go/service/iam"
"github.com/databricks/terraform-provider-databricks/common"
pluginfwcommon "github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/common"
"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/converters"
"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/tfschema"
"github.com/databricks/terraform-provider-databricks/internal/service/iam_tf"
"github.com/hashicorp/terraform-plugin-framework/datasource"
"github.com/hashicorp/terraform-plugin-framework/datasource/schema"
"github.com/hashicorp/terraform-plugin-framework/types"
)

const dataSourceName = "users"

func DataSourceUsers() datasource.DataSource {
return &UsersDataSource{}
}

var _ datasource.DataSourceWithConfigure = &UsersDataSource{}

type UsersDataSource struct {
Client *common.DatabricksClient
}

type UsersInfo struct {
Filter types.String `json:"filter,omitempty"`
ExtraAttributes types.String `json:"extra_attributes,omitempty"`
Users []iam_tf.User `json:"users,omitempty" tf:"computed"`
}

func (d *UsersDataSource) Metadata(ctx context.Context, req datasource.MetadataRequest, resp *datasource.MetadataResponse) {
resp.TypeName = pluginfwcommon.GetDatabricksProductionName(dataSourceName)
}

func (d *UsersDataSource) Schema(ctx context.Context, req datasource.SchemaRequest, resp *datasource.SchemaResponse) {
attrs, blocks := tfschema.DataSourceStructToSchemaMap(UsersInfo{}, nil)
resp.Schema = schema.Schema{
Attributes: attrs,
Blocks: blocks,
}
}

func (d *UsersDataSource) Configure(_ context.Context, req datasource.ConfigureRequest, resp *datasource.ConfigureResponse) {
if d.Client == nil {
d.Client = pluginfwcommon.ConfigureDataSource(req, resp)
}
}

func (d *UsersDataSource) Read(ctx context.Context, req datasource.ReadRequest, resp *datasource.ReadResponse) {
var usersInfo UsersInfo
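// Default SCIM attributes requested for every user; extra_attributes appends to this comma-separated list.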
attributes := "id,userName,displayName,externalId"

resp.Diagnostics.Append(req.Config.Get(ctx, &usersInfo)...)
if resp.Diagnostics.HasError() {
return
}

if !usersInfo.ExtraAttributes.IsNull() {
// ValueString yields the raw attribute list without surrounding quotes.
attributes += "," + usersInfo.ExtraAttributes.ValueString()
}

var users []iam.User
var err error

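// Use the account-level SCIM API when the provider is configured at the account level; otherwise use the workspace-level API.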
if d.Client.Config.IsAccountClient() {
a, diags := d.Client.GetAccountClient()
resp.Diagnostics.Append(diags...)
if resp.Diagnostics.HasError() {
return
}

users, err = a.Users.ListAll(ctx, iam.ListAccountUsersRequest{Filter: usersInfo.Filter.ValueString(), Attributes: attributes})
if err != nil {
resp.Diagnostics.AddError("Error listing account users", err.Error())
}
} else {
w, diags := d.Client.GetWorkspaceClient()
resp.Diagnostics.Append(diags...)
if resp.Diagnostics.HasError() {
return
}

users, err = w.Users.ListAll(ctx, iam.ListUsersRequest{Filter: usersInfo.Filter.ValueString(), Attributes: attributes})
if err != nil {
resp.Diagnostics.AddError("Error listing workspace users", err.Error())
}
}

for _, user := range users {
var tfUser iam_tf.User
resp.Diagnostics.Append(converters.GoSdkToTfSdkStruct(ctx, user, &tfUser)...)
if resp.Diagnostics.HasError() {
return
}
usersInfo.Users = append(usersInfo.Users, tfUser)
}

resp.Diagnostics.Append(resp.State.Set(ctx, usersInfo)...)
}
124 changes: 124 additions & 0 deletions internal/providers/pluginfw/products/user/data_users_acc_test.go
@@ -0,0 +1,124 @@
package user_test

import (
"testing"

"github.com/databricks/terraform-provider-databricks/internal/acceptance"
"github.com/hashicorp/terraform-plugin-testing/terraform"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
)

const dataSourceTemplate = `
resource "databricks_user" "user1" {
user_name = "tf-{var.STICKY_RANDOM}[email protected]"
}

resource "databricks_user" "user2" {
user_name = "tf-{var.STICKY_RANDOM}[email protected]"
}

data "databricks_users" "this" {
filter = "userName co \"testuser\""
depends_on = [databricks_user.user1, databricks_user.user2]
}
`

const dataSourceTemplateExtraAttributes = `
resource "databricks_group" "admins" {
display_name = "admins-{var.STICKY_RANDOM}"
}

resource "databricks_user" "user1" {
user_name = "tf-{var.STICKY_RANDOM}[email protected]"
}

resource "databricks_group_member" "membership" {
group_id = databricks_group.admins.id
member_id = databricks_user.user1.id
}

data "databricks_users" "this" {
filter = "userName eq \"me-{var.STICKY_RANDOM}@example.com\""
extra_attributes = "groups"
depends_on = [databricks_group_member.membership]
}
`

func checkUsersDataSourcePopulated(t *testing.T) func(s *terraform.State) error {
return func(s *terraform.State) error {
ds, ok := s.Modules[0].Resources["data.databricks_users.this"]
require.True(t, ok, "data.databricks_users.this has to be there")

usersCount := ds.Primary.Attributes["users.#"]
require.Equal(t, "2", usersCount, "expected two users")

userIds := []string{
ds.Primary.Attributes["users.0.id"],
ds.Primary.Attributes["users.1.id"],
}

expectedUserIDs := []string{
s.Modules[0].Resources["databricks_user.user1"].Primary.ID,
s.Modules[0].Resources["databricks_user.user2"].Primary.ID,
}

assert.ElementsMatch(t, expectedUserIDs, userIds, "expected user ids to match")

return nil
}
}

func checkUsersDataSourceWithGroups(t *testing.T) func(s *terraform.State) error {
return func(s *terraform.State) error {
ds, ok := s.Modules[0].Resources["data.databricks_users.this"]
require.True(t, ok, "data.databricks_users.this must be present")

usersCount := ds.Primary.Attributes["users.#"]
require.Equal(t, "1", usersCount, "expected one user")

userPrefix := "users.0."

groupsCountAttr := userPrefix + "groups.#"
groupsCount, exists := ds.Primary.Attributes[groupsCountAttr]
require.True(t, exists, "attribute groups.# should be present")
require.Equal(t, "1", groupsCount, "expected one group membership")

groupIdAttr := userPrefix + "groups.0.value"
groupId, exists := ds.Primary.Attributes[groupIdAttr]
require.True(t, exists, "attribute group.0.value should be present")

expectedGroupId := s.Modules[0].Resources["databricks_group.admins"].Primary.ID
assert.Equal(t, expectedGroupId, groupId, "group id should match the admins group id")

return nil
}
}

func TestAccDataSourceDataUsers(t *testing.T) {
acceptance.AccountLevel(t, acceptance.Step{
Template: dataSourceTemplate,
Check: checkUsersDataSourcePopulated(t),
})
}

func TestWorkspaceDataSourceDataUsers(t *testing.T) {
acceptance.WorkspaceLevel(t, acceptance.Step{
Template: dataSourceTemplate,
Check: checkUsersDataSourcePopulated(t),
})
}

func TestAccDataSourceUsers_WithGroups(t *testing.T) {
acceptance.AccountLevel(t, acceptance.Step{
Template: dataSourceTemplateExtraAttributes,
Check: checkUsersDataSourceWithGroups(t),
})
}

func TestWorkspaceDataSourceUsers_WithGroups(t *testing.T) {
acceptance.WorkspaceLevel(t, acceptance.Step{
Template: dataSourceTemplateExtraAttributes,
Check: checkUsersDataSourceWithGroups(t),
})
}