I'm using an Azure Databricks notebook to read an Excel file from a folder inside a mounted Azure Blob Storage account.
The mounted Excel file's path is "/mnt/2023-project/dashboard/ext/Marks.xlsx", where 2023-project is the mount point and dashboard is the name of the container.
When I run dbutils.fs.ls, I can see all the files inside the ext folder. The code uses a lot of os functions, as it was developed in a different environment.
When I run os.listdir on the ext folder, I get the error "No such file or directory". When I run os.listdir on the dashboard container, I get mount.err as the output. Reading the Excel file with pandas or openpyxl also fails with "No such file or directory".
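To illustrate, this is the kind of call that fails (the same error appears on any machine where the mount path is not visible to local file APIs, since os.listdir is a plain POSIX directory listing):

```python
import os

# os.listdir uses the local filesystem, not DBFS, so a path that only
# exists as a DBFS mount raises FileNotFoundError ("No such file or directory").
try:
    os.listdir("/mnt/2023-project/dashboard/ext")
except FileNotFoundError as e:
    print(e)
```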
I have tried the path both with and without /dbfs at the beginning of the mount point.
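Concretely, I was switching between the two path forms with a small helper like the one below (the helper name is my own, just to show the two variants I tried; on Databricks, /dbfs is the FUSE prefix that local file APIs such as os, pandas, and openpyxl normally use):

```python
def to_local_path(dbfs_path: str) -> str:
    """Prefix a /mnt/... mount path with /dbfs, the form local
    file APIs expect on Databricks; leave it unchanged if the
    prefix is already there."""
    if dbfs_path.startswith("/dbfs/"):
        return dbfs_path
    return "/dbfs" + dbfs_path

print(to_local_path("/mnt/2023-project/dashboard/ext/Marks.xlsx"))
# /dbfs/mnt/2023-project/dashboard/ext/Marks.xlsx
```

Neither form worked for me.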
I'm using DBR 12.1 (includes Apache Spark 3.3.1, Scala 2.12). I mounted the Azure storage using the credential passthrough method:
configs = {
'fs.azure.account.auth.type': 'CustomAccessToken',
'fs.azure.account.custom.token.provider.class': spark.conf.get('spark.databricks.passthrough.adls.gen2.tokenProviderClassName')
}
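For completeness, the mount was created by passing those configs to the standard dbutils.fs.mount call, roughly as below (the storage account name is a placeholder, not my real one):

```python
configs = {
    'fs.azure.account.auth.type': 'CustomAccessToken',
    'fs.azure.account.custom.token.provider.class':
        spark.conf.get('spark.databricks.passthrough.adls.gen2.tokenProviderClassName')
}

# <storage-account> is a placeholder for the real ADLS Gen2 account name.
dbutils.fs.mount(
    source='abfss://dashboard@<storage-account>.dfs.core.windows.net/',
    mount_point='/mnt/2023-project',
    extra_configs=configs
)
```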
Please help with this; I'm relatively new to Databricks.