Azure Databricks cluster driver config

ShellyXiao
New Contributor II

Hi there,

I am trying to set up Databricks storage account access in a global init script. Following the Azure Databricks documentation on configuring driver settings for all clusters (https://learn.microsoft.com/en-us/azure/databricks/archive/compute/configure#spark-configuration), I wrote the code below, but I am not sure whether the format is correct or how to reference a secret.

I tested the init script and got this error:

ERROR ProjectConf$: Failed to parse conf file '/databricks/driver/conf/00-custom-spark-driver-defaults.conf', skipping...

com.typesafe.config.ConfigException$Parse: File: /databricks/driver/conf/00-custom-spark-driver-defaults.conf: 8: in value for key '"fs.azure.account.oauth2.client.secret.adlssaprdraw.dfs.core.windows.net"': expecting a close brace or a field name here, got '{'

   at com.typesafe.config.impl.Parser$ParseContext.parseError(Parser.java:435)

Could you help?

Thanks!

My code:

spark_defaults_conf="/databricks/driver/conf/00-custom-spark-driver-defaults.conf"

cat << EOF > $spark_defaults_conf

[driver] {

"fs.azure.account.auth.type.adlssadevraw.dfs.core.windows.net" = "OAuth"

"fs.azure.account.oauth.provider.type.adlssadevraw.dfs.core.windows.net" = "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider"

"fs.azure.account.oauth2.client.id.adlssadevraw.dfs.core.windows.net" = "444aef64-8f39-41c0-b769-e312d20be27f"

"fs.azure.account.oauth2.client.secret.adlssadevraw.dfs.core.windows.net" = {{secrets/dev-kv-01-scope/databricks-dev-01-sp}}

"fs.azure.account.oauth2.client.endpoint.adlssadevraw.dfs.core.windows.net" = "https://login.microsoftonline.com/c499ec336-2375-432e-92f5-63cbbc442ad57/oauth2/token"

}

EOF


Kaniz
Community Manager

Hi @Shelly Xiao, Thank you for reaching out regarding your issue with setting up Databricks storage account access in a global init script. It seems that the error is related to the format of your configuration file and the way you reference the secret.

First, the configuration file uses the HOCON format, not JSON, so double quotes around keys and values are optional.

Additionally, it would be best to use the dbutils library to fetch the secret value instead of referencing it directly in the configuration file.

Here's the modified version of your code:

spark_defaults_conf="/databricks/driver/conf/00-custom-spark-driver-defaults.conf"
 
# Get the secret value using dbutils
client_secret=$(dbutils.secrets.get(scope = "dev-kv-01-scope", key = "databricks-dev-01-sp"))
 
cat << EOF > $spark_defaults_conf
[driver] {
fs.azure.account.auth.type.adlssadevraw.dfs.core.windows.net = OAuth
fs.azure.account.oauth.provider.type.adlssadevraw.dfs.core.windows.net = org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.adlssadevraw.dfs.core.windows.net = 444aef64-8f39-41c0-b769-e312d20be27f
fs.azure.account.oauth2.client.secret.adlssadevraw.dfs.core.windows.net = $client_secret
fs.azure.account.oauth2.client.endpoint.adlssadevraw.dfs.core.windows.net = https://login.microsoftonline.com/c499ec336-2375-432e-92f5-63cbbc442ad57/oauth2/token
}
EOF

Please note that to use dbutils, the script needs to run in a Python environment.

You can create a Python script and run it as an init script to access the secret and complete the configuration file.

I hope this helps resolve your issue. If you have any further questions or need additional assistance, please get in touch with us.

ShellyXiao
New Contributor II

Thank you for your response, Kaniz.

I don't think the $(dbutils.secrets.get(...)) call in your example will work. dbutils is a notebook utility and is not available inside a shell init script.
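For reference, the parse error in the original script points at the unquoted `{{secrets/...}}` reference: HOCON hits the bare `{` and fails. A minimal sketch of the init script with that reference quoted as a string is below. Two assumptions to verify on your own workspace: first, that Databricks applies `{{secrets/scope/key}}` substitution to conf files written by init scripts (the documented place for this syntax is the cluster's Spark configuration); second, the target directory on a real cluster is `/databricks/driver/conf`, while a local default is used here so the sketch can be dry-run.

```shell
#!/bin/bash
# Sketch of a corrected global init script (account name, client id, and
# secret scope/key are taken from the question above).
# On a real cluster, set CONF_DIR=/databricks/driver/conf; the local
# default below is only so the script can be dry-run outside Databricks.
CONF_DIR="${CONF_DIR:-/tmp/databricks-driver-conf-demo}"
mkdir -p "$CONF_DIR"
spark_defaults_conf="$CONF_DIR/00-custom-spark-driver-defaults.conf"

# The heredoc delimiter is quoted ('EOF') so the {{...}} reference is
# written to the file literally, with no shell expansion.
# The key fix: the secret reference is wrapped in double quotes, making it
# a valid HOCON string instead of an unparseable bare '{'.
cat << 'EOF' > "$spark_defaults_conf"
[driver] {
  "fs.azure.account.auth.type.adlssadevraw.dfs.core.windows.net" = "OAuth"
  "fs.azure.account.oauth.provider.type.adlssadevraw.dfs.core.windows.net" = "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider"
  "fs.azure.account.oauth2.client.id.adlssadevraw.dfs.core.windows.net" = "444aef64-8f39-41c0-b769-e312d20be27f"
  "fs.azure.account.oauth2.client.secret.adlssadevraw.dfs.core.windows.net" = "{{secrets/dev-kv-01-scope/databricks-dev-01-sp}}"
  "fs.azure.account.oauth2.client.endpoint.adlssadevraw.dfs.core.windows.net" = "https://login.microsoftonline.com/c499ec336-2375-432e-92f5-63cbbc442ad57/oauth2/token"
}
EOF
```

If substitution turns out not to apply to init-script conf files in your workspace, the documented fallback is to set these same keys (with the quoted `{{secrets/...}}` value) directly in the cluster's Spark config instead of in a conf file.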
