09-13-2022 02:42 AM
09-13-2022 07:35 AM
Hi Deepak, thanks for reaching out to community.databricks.com.
Job clusters do not have an option to enable credential passthrough.
09-13-2022 11:23 PM
Hi Debayan,
Thank you for your response. I am working on a project that requires reading and processing an Excel file. I am using the Maven spark-excel library, and it reads the file successfully on an All-Purpose cluster with the credential passthrough option enabled. However, we now need to schedule this notebook using Workflows, and because Job clusters lack the credential passthrough option, we get the error below.
Error: Failure to initialize configuration Invalid configuration value detected for fs.azure.account.key
Is there a workaround to run this notebook on a Job cluster? We want to run all workflows using a service account.
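For context, a minimal read with the spark-excel library looks roughly like this in a Databricks notebook (the storage account, container, path, and options are placeholders, not the actual project values):

```python
# Sketch: reading an Excel file from ADLS Gen2 with the spark-excel
# Maven library (com.crealytics:spark-excel). Requires a cluster that
# can authenticate to the storage account; `spark` is the notebook's
# preconfigured SparkSession.
df = (
    spark.read
    .format("com.crealytics.spark.excel")
    .option("header", "true")        # first row holds column names
    .option("inferSchema", "true")   # let Spark infer column types
    .load("abfss://<container>@<storage-account>.dfs.core.windows.net/<path>/file.xlsx")
)
df.show()
```

On a cluster without a valid authentication method for the storage account, the `.load()` call is where the `fs.azure.account.key` configuration error surfaces.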
Regards,
Deepak
09-15-2022 12:25 PM
Assuming your Excel file is located on ADLS, you can add a service principal to the cluster configuration; see: https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/azure-storage#--access-azu...
With Unity Catalog you could probably also use authorization based on the external path.
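As a sketch, the service principal setup is a set of Spark config entries on the job cluster (these are the documented ABFS OAuth keys; the storage account, tenant ID, and secret scope/key names below are placeholders you would fill in):

```
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net OAuth
fs.azure.account.oauth.provider.type.<storage-account>.dfs.core.windows.net org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.<storage-account>.dfs.core.windows.net <application-id>
fs.azure.account.oauth2.client.secret.<storage-account>.dfs.core.windows.net {{secrets/<scope>/<key>}}
fs.azure.account.oauth2.client.endpoint.<storage-account>.dfs.core.windows.net https://login.microsoftonline.com/<tenant-id>/oauth2/token
```

Storing the client secret in a Databricks secret scope (the `{{secrets/...}}` syntax) keeps it out of plain-text cluster config.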