---
subcategory: "Compute"
---
-> **Note** If you have a fully automated setup with workspaces created by databricks_mws_workspaces or azurerm_databricks_workspace, please make sure to add a depends_on attribute in order to prevent _default auth: cannot configure default credentials_ errors.

Retrieves a list of databricks_cluster ids that were created by Terraform or manually, with or without databricks_cluster_policy.
Retrieve all clusters on this workspace on AWS or GCP:
data "databricks_clusters" "all" {
depends_on = [databricks_mws_workspaces.this]
}
Retrieve all clusters with "Shared" in their cluster name on this Azure Databricks workspace:
data "databricks_clusters" "all_shared" {
depends_on = [azurerm_databricks_workspace.this]
cluster_name_contains = "shared"
}
The following arguments are supported:

- `cluster_name_contains` - (Optional) Only return databricks_cluster ids that match the given name string.
This data source exports the following attributes:
- `ids` - list of databricks_cluster ids.
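The returned ids can be consumed by other resources or exposed as an output. Below is a minimal sketch, assuming the all_shared data source from the example above; the output name, the databricks_permissions resource label, the "users" group, and the CAN_RESTART permission level are illustrative choices, not part of this data source:

```hcl
# Expose the matching cluster ids as an output value
output "shared_cluster_ids" {
  value = data.databricks_clusters.all_shared.ids
}

# Illustrative only: grant the "users" group restart rights on every returned cluster
resource "databricks_permissions" "shared_cluster_restart" {
  for_each   = toset(data.databricks_clusters.all_shared.ids)
  cluster_id = each.value

  access_control {
    group_name       = "users"
    permission_level = "CAN_RESTART"
  }
}
```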
The following resources are used in the same context:
- End to end workspace management guide.
- databricks_cluster to create Databricks Clusters.
- databricks_cluster_policy to create a databricks_cluster policy, which limits the ability to create clusters based on a set of rules.
- databricks_instance_pool to manage instance pools to reduce cluster start and auto-scaling times by maintaining a set of idle, ready-to-use instances.
- databricks_job to manage Databricks Jobs to run non-interactive code in a databricks_cluster.
- databricks_library to install a library on a databricks_cluster.
- databricks_pipeline to deploy Delta Live Tables.