I was mounting Azure Data Lake Storage Gen1 to Databricks to access and process files. The code below has worked fine for the past year, but all of a sudden I'm getting an error:
configs = {
    "dfs.adl.oauth2.access.token.provider.type": "ClientCredential",
    "dfs.adl.oauth2.client.id": dbutils.secrets.get(scope = "scope1", key = "scope1ID"),
    "dfs.adl.oauth2.credential": dbutils.secrets.get(scope = "scope1", key = "scope1Secret"),
    "dfs.adl.oauth2.refresh.url": "https://login.microsoftonline.com/####REPLACED FOR SECURITY###/oauth2/token"
}
dbutils.fs.mount(
    source = "adl://company.azuredatalakestore.net/sandbox/",
    mount_point = "/mnt/testFolder",
    extra_configs = configs
)

But now the mount fails with the error below:

Error: ExecutionError: An error occurred while calling o334.mount. : com.microsoft.azure.datalake.store.ADLException: Error creating directory / Error fetching access token Operation null failed with exception java.io.IOException : AADToken: HTTP connection failed for getting token from AzureAD due to timeout. Client Request Id :##### MASKED FOR SECURITY## Latency(ns) : 73677066
I can't figure out the error. The only thing that has changed is that the service principal secret had expired, and I've updated the service principal secret in Key Vault.
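The exception says the HTTP connection to Azure AD timed out while fetching the token, which points at the cluster's network path to the token endpoint rather than the rotated secret itself. As a first check (a sketch of my own; the `can_reach` helper and the `<tenant-id>` placeholder are not from the original code), one could verify from a notebook cell that the cluster can open a connection to the AAD endpoint at all:

```python
import socket
from urllib.parse import urlparse

def can_reach(url: str, timeout_s: float = 5.0) -> bool:
    """Attempt a plain TCP connection to the URL's host on port 443."""
    host = urlparse(url).hostname
    try:
        with socket.create_connection((host, 443), timeout=timeout_s):
            return True
    except OSError:
        # Covers DNS failures, refused connections, and timeouts.
        return False

# Hypothetical placeholder; substitute the tenant ID from your refresh URL.
refresh_url = "https://login.microsoftonline.com/<tenant-id>/oauth2/token"
print("AAD endpoint reachable:", can_reach(refresh_url))
```

If this returns False, the problem is firewall/VNet/DNS configuration on the cluster, not the credential; if it returns True, the secret and config keys are the next suspects.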