Hello Team,
I am quite new to Databricks and am currently learning PySpark.
I am trying to mount an ADLS Gen2 storage account in Databricks. As part of that I created an app registration, granted it access to the Data Lake, created a client secret, and added the client ID, client secret and tenant ID to Azure Key Vault. I also created a Key Vault-backed secret scope on top of that vault.
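To rule out a problem with the scope itself, I believe the keys in it can be listed like this (just a quick sketch; "testscope" and the key names are the same ones I use in the mount script further down):

# List the keys available in the Key Vault-backed secret scope
for secret in dbutils.secrets.list("testscope"):
    print(secret.key)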
However, when I run the mount I get the error below. Please advise what I am doing wrong.
ExecutionError: An error occurred while calling o376.mount.
: java.lang.NullPointerException: authEndpoint
at shaded.databricks.v20180920_b33d810.com.google.common.base.Preconditions.checkNotNull(Preconditions.java:204)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.oauth2.AzureADAuthenticator.getTokenUsingClientCreds(AzureADAuthenticator.java:84)
at com.databricks.backend.daemon.dbutils.DBUtilsCore.verifyAzureOAuth(DBUtilsCore.scala:751)
at com.databricks.backend.daemon.dbutils.DBUtilsCore.verifyAzureFileSystem(DBUtilsCore.scala:762)
at com.databricks.backend.daemon.dbutils.DBUtilsCore.mount(DBUtilsCore.scala:720)
at sun.reflect.GeneratedMethodAccessor291.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
at py4j.Gateway.invoke(Gateway.java:295)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:251)
at java.lang.Thread.run(Thread.java:748)
Here is the script I ran to mount the storage:
# Read the service principal credentials from the Key Vault-backed secret scope
applicationId = dbutils.secrets.get(scope="testscope", key="clientid")
authenticationKey = dbutils.secrets.get(scope="testscope", key="clientsecret")
tenantId = dbutils.secrets.get(scope="testscope", key="tenantid")

# AAD token endpoint and the abfss source path (built from the container, account and folder name variables)
endpoint = "https://login.microsoftonline.com/" + tenantId + "/oauth2/token"
source = "abfss://" + adlsContainerName + "@" + adlsAccountName + ".dfs.core.windows.net/" + adlsFolderName
# Connect using the service principal (OAuth client credentials)
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id": applicationId,
           "fs.azure.account.oauth2.client.secret": authenticationKey,
           "fs.azure.account.oauth2.client.endoint": endpoint}
## Mount the ADLS storage in Databricks. Mount only if the directory is not already mounted
## (the check I have in mind for that is sketched after this block)
dbutils.fs.mount(
    source = source,
    mount_point = "/mnt/adls",
    extra_configs = configs)
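For reference, this is the "mount only if not already mounted" check I mentioned in the comment above. It is just a sketch built around dbutils.fs.mounts(), reusing the same source, mount point and configs as the script:

# Skip the mount when /mnt/adls already appears in the cluster's mount list
mount_point = "/mnt/adls"

if not any(m.mountPoint == mount_point for m in dbutils.fs.mounts()):
    dbutils.fs.mount(
        source = source,
        mount_point = mount_point,
        extra_configs = configs)
else:
    print(mount_point + " is already mounted, skipping")

I went with checking dbutils.fs.mounts() rather than wrapping the mount in try/except, so that re-running the notebook does not fail on an already existing mount.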