How to resolve this error "Error: cannot create global init script: default auth: cannot configure default credentials"

apatel
New Contributor III

I'm trying to set the global init script via my Terraform deployment. I did a thorough Google search and can't find guidance on this.

I'm using a very generic call to set these scripts in my TF Deployment.

terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

provider "databricks" {
  profile = var.databricks_profile
  alias   = "databricks_provider"
  host    = var.databricks_workspace_url
  token   = var.databricks_workspace_token
}

resource "databricks_global_init_script" "init_script" {
  source  = "${path.module}/databricks_init_scripts.sh"
  name    = "mdp_init_script"
  enabled = true
}

The bash script file databricks_global_init_script.sh is basically empty; it contains just "apt-get update". I don't believe the script is even being called.

The DB Workspace has a personal access token setup.

Full error message:

│ Error: cannot create global init script: default auth: cannot configure default credentials

│ 

│  with module.databricks_cluster_setup.databricks_global_init_script.init_script,

│  on ../modules/core-databricks/main.tf line 41, in resource "databricks_global_init_script" "init_script":

│  41: resource "databricks_global_init_script" "init_script" {
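One detail in the configuration above is worth calling out for anyone hitting the same message: the provider block declares alias = "databricks_provider", but the resource never references that alias, so Terraform resolves it against an implicit default databricks provider that has no credentials. That mismatch alone can produce "default auth: cannot configure default credentials", because the default provider falls back to ambient credentials such as a local ~/.databrickscfg. A hedged sketch of the explicit binding, using the same resource as above:

```hcl
# Sketch only: bind the resource to the aliased provider explicitly.
# Without this line, Terraform uses an implicit, unconfigured
# "databricks" provider and default auth fails.
resource "databricks_global_init_script" "init_script" {
  provider = databricks.databricks_provider

  source  = "${path.module}/databricks_init_scripts.sh"
  name    = "mdp_init_script"
  enabled = true
}
```

Alternatively, dropping the alias from the provider block makes that configuration the default, and the resource picks it up automatically.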

1 ACCEPTED SOLUTION


apatel
New Contributor III

OK, in case this helps anyone else: I've managed to resolve it.

I confirmed in this documentation that the Databricks CLI is required locally, wherever Terraform is being executed: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/terraform/cluster-notebook-job
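For reference, the legacy Databricks CLI stores its credentials in ~/.databrickscfg, which is also the file the Terraform provider's profile argument reads. A minimal sketch with placeholder values:

```ini
# ~/.databrickscfg (values are placeholders)
[DEFAULT]
host  = https://adb-1234567890123456.7.azuredatabricks.net
token = dapi0123456789abcdef
```

Running `databricks configure --token` prompts for the host and token and writes this file for you.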

I managed to debug the init script issues by viewing the output of the script in DBFS.

https://docs.databricks.com/dev-tools/cli/dbfs-cli.html#

This way I can get to the STDOUT. One thing to keep in mind is to use commands in the script that are headless (non-interactive) in nature. For example, with apt-get, run apt-get -y so the action is auto-approved.
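To make both points concrete, here is a hedged sketch. First, pulling the script's output with the legacy DBFS CLI; the cluster ID and paths are placeholders, and this assumes cluster log delivery to DBFS is enabled in your workspace:

```shell
# Placeholders throughout: list the init script log directory, then read a log.
dbfs ls dbfs:/cluster-logs/<cluster-id>/init_scripts/
dbfs cat "dbfs:/cluster-logs/<cluster-id>/init_scripts/<subdir>/<script-name>.stderr.log"
```

And a minimal headless-safe init script (the installed package is illustrative):

```shell
#!/bin/bash
# Fail fast and echo each command so problems show up in the logs.
set -euxo pipefail
# Non-interactive package operations: -y auto-approves prompts.
export DEBIAN_FRONTEND=noninteractive
apt-get update -y
apt-get install -y --no-install-recommends jq
```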

Thanks to anyone who spent time reading / investigating this with me!


2 REPLIES 2

apatel
New Contributor III

Some follow-up context.

I installed the Databricks CLI and set the target Azure Databricks host and PAT on a whim, then re-ran the Terraform. It looks like the cluster is now being provisioned, but it ultimately fails due to the init script:

Error: cannot create cluster: 0328-212215-kql46zgp is not able to transition from TERMINATED to RUNNING: An admin configured global init script failed. instance_id: 8ba71c27fbe54f3aa60799595954c3ce, databricks_error_message: Global init script mdp_init_script failed: Script exit status is non-zero, Termination info: code: GLOBAL_INIT_SCRIPT_FAILURE, type: CLIENT_ERROR, parameters: map[databricks_error_message:Global init script mdp_init_script failed: Script exit status is non-zero instance_id:8ba71c27fbe54f3aa60799595954c3ce]. Please see https://docs.databricks.com/dev-tools/api/latest/clusters.html#clusterclusterstate for more details

I would not expect the CLI to be a requirement, nor the manual step of setting the host and PAT, because I already do that in the Terraform code. Is this expected behavior? If not, how do I resolve the credential issue?

