I am planning to introduce custom logging to a Databricks workload. To achieve this, I am using the Python `logging` module. I store the logs on the driver's local disk under `file:/tmp/` before moving them to blob storage. In my personal Databricks account, I can list the files in the `file:/tmp/` directory, see my log file, and read its contents with `dbutils.fs.head("path_to_file")`. But when I try to implement the same logic in the client workspace, I get an exception:
"java.lang.SecurityException: User does not have permission SELECT on any file."
How can I resolve this issue?
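For reference, a minimal sketch of the setup described above (logger name, log file name, and blob storage path are illustrative, not my actual values):

```python
import logging

# Write logs to the driver's local disk; this path corresponds to
# "file:/tmp/" when accessed through dbutils.fs.
log_path = "/tmp/my_app.log"  # illustrative file name

logger = logging.getLogger("my_app")  # hypothetical logger name
logger.setLevel(logging.INFO)
handler = logging.FileHandler(log_path)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(handler)

logger.info("job started")
handler.flush()

# In a Databricks notebook, dbutils is available as a built-in global.
# These are the calls that work in my personal workspace but raise the
# SecurityException in the client workspace:
#
# dbutils.fs.ls("file:/tmp/")
# dbutils.fs.head("file:/tmp/my_app.log")
# dbutils.fs.cp("file:/tmp/my_app.log", "wasbs://<container>@<account>.blob.core.windows.net/logs/")
```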