---
subcategory: "Workspace"
---
This resource allows you to manage global init scripts, which are run on all databricks_cluster and databricks_job resources.

You can declare a Terraform-managed global init script by specifying the `source` attribute of a corresponding local file.
resource "databricks_global_init_script" "init1" {
source = "${path.module}/init.sh"
name = "my init script"
}
You can also create a managed global init script with inline source through the `content_base64` attribute.
resource "databricks_global_init_script" "init2" {
content_base64 = base64encode(<<-EOT
#!/bin/bash
echo "hello world"
EOT
)
name = "hello script"
}
-> Note A global init script in the Databricks workspace is only changed when the Terraform state changes. This means that any manual changes to a managed global init script won't be overwritten by Terraform if there is no local change to its source.
The size of a global init script's source code must not exceed 64KB. The following arguments are supported:
- `name` (string, required) - the name of the script. It should be unique.
- `source` - path to the script's source code on the local filesystem. Conflicts with `content_base64`.
- `content_base64` - the base64-encoded source code of the global init script. Conflicts with `source`. Use of `content_base64` is discouraged, as it increases the memory footprint of the Terraform state and should only be used in exceptional circumstances.
- `enabled` (bool, optional, default: `false`) - specifies whether the script is enabled for execution (see the example after this list).
- `position` (integer, optional, default: `null`) - the position of the global init script, where `0` represents the first global init script to run, `1` is the second global init script to run, and so on. When omitted, the script gets the last position.
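As an illustration only, here is a minimal sketch combining `enabled` and `position` with the `source` attribute from the first example; the resource name `init3` and the script name are placeholders:

```hcl
resource "databricks_global_init_script" "init3" {
  name     = "ordered init script"      # placeholder name
  source   = "${path.module}/init.sh"   # local script file, as in the first example
  enabled  = true                       # script will run on cluster startup
  position = 0                          # run before any other global init script
}
```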
In addition to all arguments above, the following attributes are exported:

- `id` - the ID assigned to the global init script by the API.
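If you need the script ID elsewhere in your configuration, you can reference the exported attribute as usual; a small sketch (the output name is illustrative):

```hcl
output "init_script_id" {
  # ID assigned by the API to the script defined in the first example
  value = databricks_global_init_script.init1.id
}
```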
Global init scripts are available only to administrators, so you can't change permissions for them.
The global init script resource can be imported using the script ID:

```bash
$ terraform import databricks_global_init_script.this script_id
```
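On Terraform 1.5 or later, a configuration-driven import block can be used instead of the CLI command; this is a generic Terraform sketch, and `script_id` is the same placeholder as above:

```hcl
import {
  to = databricks_global_init_script.this
  id = "script_id"
}
```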
The following resources are often used in the same context:
- End to end workspace management guide.
- databricks_cluster to create Databricks Clusters.
- databricks_cluster_policy to create a databricks_cluster policy, which limits the ability to create clusters based on a set of rules.
- databricks_dbfs_file to manage relatively small files on Databricks File System (DBFS).
- databricks_ip_access_list to allow access from predefined IP ranges.
- databricks_mount to mount your cloud storage on `dbfs:/mnt/name`.