Administration & Architecture

Databricks cluster pool deployed through Terraform does not have UC enabled

erigaud
Honored Contributor

Hello everyone,

We have a workspace with UC enabled and a couple of catalogs attached, and when using our personal compute we are able to read and write tables in those catalogs.

However, for our jobs we deployed a cluster pool using Terraform, and the clusters created from that pool don't seem to have UC enabled, so they cannot access our catalogs. Has anyone run into this issue?

[Screenshot attachment: erigaud_1-1736874136257.png]

Thank you 

4 REPLIES

Alberto_Umana
Databricks Employee

Hi @erigaud,

Could you please share your cluster pool configuration?

saurabh18cs
Valued Contributor

Try adding this to your Terraform code:

data_security_mode      = "SINGLE_USER"
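
For context, data_security_mode is a cluster-level setting rather than an instance pool setting, so it goes on the cluster (or job cluster) that draws from the pool. A minimal sketch of what that looks like on an all-purpose cluster; the resource names and Spark version here are illustrative:

resource "databricks_cluster" "uc_cluster" {
  cluster_name            = "uc-enabled-cluster"
  spark_version           = "15.4.x-scala2.12" # any UC-capable runtime
  instance_pool_id        = databricks_instance_pool.instance_pool.id
  autotermination_minutes = 30
  num_workers             = 2

  # SINGLE_USER = dedicated access mode; use "USER_ISOLATION" for shared clusters
  data_security_mode      = "SINGLE_USER"
}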

erigaud
Honored Contributor

Hello @Alberto_Umana @saurabh18cs, thank you for replying. Here is the configuration we're using in the Terraform code:

resource "databricks_instance_pool" "instance_pool" {
  instance_pool_name = "instance-pool-${var.environment}-${var.project}"
  min_idle_instances = 0
  max_capacity       = 30
  node_type_id       = "Standard_DS3_v2"
  idle_instance_autotermination_minutes = 30
  disk_spec {
    disk_size  = 80
    disk_count = 1
    disk_type {
      azure_disk_volume_type = "STANDARD_LRS"
    }
  }
}

Should we add something to this configuration so it works?

Thanks!

saurabh18cs
Valued Contributor

Hi @erigaud,

The instance pool itself has no access-mode setting; data_security_mode belongs on the cluster definition that consumes the pool, so you should add it to the databricks_job resource that uses your instance pool.

Example:

resource "databricks_job" "this" {
  new_cluster {
    instance_pool_id        = databricks_instance_pool.executor[each.key].id
    driver_instance_pool_id = databricks_instance_pool.driver[each.key].id
    spark_version           = each.value.new_cluster.spark_version
    spark_conf              = each.value.new_cluster.spark_conf
    # custom_tags           = each.value.new_cluster.custom_tags # moved tags from job level to instance pool level (tags can't be set at both levels at once)
    runtime_engine          = can(each.value.new_cluster.runtime_engine) ? each.value.new_cluster.runtime_engine == "PHOTON" ? "PHOTON" : null : null
    spark_env_vars          = each.value.new_cluster.spark_env_vars
    data_security_mode      = "SINGLE_USER" # this is what enables Unity Catalog on the job cluster
    azure_attributes {
      availability = can(each.value.new_cluster.azure_attributes.availability) ? each.value.new_cluster.azure_attributes.availability : null
    }

    num_workers = can(each.value.new_cluster.num_workers) ? each.value.new_cluster.num_workers : null
    dynamic "autoscale" {
      for_each    = can(each.value.new_cluster.num_workers) ? [] : [each.value.new_cluster.autoscale]
      content {
        min_workers = autoscale.value.min_workers
        max_workers = autoscale.value.max_workers
      }
    }
  }
}
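
For completeness, the snippet above is written with for_each over a map of job definitions, which is why it references each.key and each.value. An illustrative shape for that map (optional attributes need Terraform 1.3+):

variable "jobs" {
  type = map(object({
    new_cluster = object({
      spark_version    = string
      spark_conf       = optional(map(string), {})
      spark_env_vars   = optional(map(string), {})
      runtime_engine   = optional(string)
      num_workers      = optional(number)
      azure_attributes = optional(object({ availability = string }))
      autoscale = optional(object({
        min_workers = number
        max_workers = number
      }))
    })
  }))
}

Note the example also assumes separate executor and driver pools keyed per job (databricks_instance_pool.executor / databricks_instance_pool.driver); with a single shared pool like the one you posted, both instance_pool_id and driver_instance_pool_id can point at the same pool's id.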
