Hi @Shelly Xiao, thank you for reaching out about setting up storage account access in a Databricks global init script. The error appears to be related to the format of your configuration file and the way you reference the secret.
Firstly, the configuration file should be in the HOCON format, not JSON. In HOCON, keys and simple values don't need double quotes, though values containing special characters such as ':' (for example, the OAuth endpoint URL) should stay quoted.
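As a quick illustration, using one of the keys from your file, here is the same property in JSON style versus HOCON style:

```
# JSON style (quotes required)
"fs.azure.account.auth.type.adlssadevraw.dfs.core.windows.net": "OAuth"

# HOCON style (quotes optional for simple values)
fs.azure.account.auth.type.adlssadevraw.dfs.core.windows.net = OAuth
```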
Additionally, it would be best not to embed the secret value directly in the configuration file. Note, however, that the dbutils library is only available from notebooks; a global init script runs as plain Bash before the Spark and Python environments start, so dbutils.secrets.get(...) cannot be called from it. A common alternative is a secret-backed environment variable: in the cluster configuration (Advanced options > Spark > Environment variables), add
CLIENT_SECRET={{secrets/dev-kv-01-scope/databricks-dev-01-sp}}
Databricks resolves the {{secrets/<scope>/<key>}} reference when the cluster starts, so the init script can read the secret as an ordinary shell variable. Here's the modified version of your code:
#!/bin/bash
spark_defaults_conf="/databricks/driver/conf/00-custom-spark-driver-defaults.conf"
# CLIENT_SECRET is injected via the secret-backed environment variable above
cat << EOF > "$spark_defaults_conf"
[driver] {
  fs.azure.account.auth.type.adlssadevraw.dfs.core.windows.net = OAuth
  fs.azure.account.oauth.provider.type.adlssadevraw.dfs.core.windows.net = org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
  fs.azure.account.oauth2.client.id.adlssadevraw.dfs.core.windows.net = 444aef64-8f39-41c0-b769-e312d20be27f
  fs.azure.account.oauth2.client.secret.adlssadevraw.dfs.core.windows.net = "$CLIENT_SECRET"
  fs.azure.account.oauth2.client.endpoint.adlssadevraw.dfs.core.windows.net = "https://login.microsoftonline.com/c499ec336-2375-432e-92f5-63cbbc442ad57/oauth2/token"
}
EOF
Please note that because the heredoc delimiter (EOF) is unquoted, Bash expands $CLIENT_SECRET before writing the file, so the generated conf file on the driver will contain the resolved secret value.
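If you want to sanity-check the heredoc expansion before deploying, you can run the same pattern locally with a dummy value (the path and variable name below are just for illustration):

```shell
#!/bin/bash
# Simulate the init script's heredoc expansion with a dummy secret value.
CLIENT_SECRET="dummy-secret-for-testing"
conf_file="/tmp/00-custom-spark-driver-defaults.conf"

cat << EOF > "$conf_file"
[driver] {
  fs.azure.account.oauth2.client.secret.adlssadevraw.dfs.core.windows.net = "$CLIENT_SECRET"
}
EOF

# Because the EOF delimiter is unquoted, Bash expands $CLIENT_SECRET,
# so the written file should contain the literal dummy value.
grep "dummy-secret-for-testing" "$conf_file" && echo "expansion OK"
```

If the final line prints "expansion OK", the variable was expanded into the file as expected; on a real cluster the same mechanism substitutes the resolved secret.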
I hope this helps resolve your issue. If you have any further questions or need additional assistance, please get in touch with us.