We are building a platform where we automatically execute Databricks jobs using Python packages delivered by our end-users.
We want to create a mount point so that we can deliver the cluster's driver logs to external storage. However, we don't want the client code to have access to this mount point, because otherwise we cannot:
- guarantee isolation between jobs (the code of one end-user project can read the logs of another project)
- ensure the immutability of the logs (users could overwrite or delete them)
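For context, this is roughly how we configure log delivery today (a simplified sketch; the mount path, node type, and Spark version are just examples, not our actual values):

```python
# Sketch of the job cluster spec we submit via the Databricks Clusters/Jobs API.
# "cluster_log_conf" tells Databricks to periodically deliver driver and
# executor logs to the given DBFS destination.
cluster_spec = {
    "spark_version": "13.3.x-scala2.12",   # example value
    "node_type_id": "Standard_DS3_v2",     # example value
    "num_workers": 2,
    "cluster_log_conf": {
        # This DBFS path is a mount point backed by our external storage
        # account -- the same mount the user's code can currently also reach.
        "dbfs": {"destination": "dbfs:/mnt/cluster-logs"}
    },
}
```

The problem is that any code running on the cluster can read from and write to `dbfs:/mnt/cluster-logs` as well, not just the log delivery service.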
Is it possible to set up access control so that the cluster can only write driver logs there, while user code can neither read nor modify them?