Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Disable access to mount point for client code

Yatoom
New Contributor II

We are building a platform where we automatically execute Databricks jobs using Python packages delivered by our end-users.

We want to create a mount point so that we can deliver the cluster's driver logs to external storage. However, we don't want the client code to have access to this mount point, because then we cannot:

  • guarantee isolation between jobs (the code of one end-user's project could read the logs of another project)
  • ensure immutability of the logs (users could overwrite them)

Is it possible to set up some access control so that the cluster can only write the driver logs there?

2 REPLIES

daniel_sahal
Esteemed Contributor

It depends on which cloud provider you're using. For AWS S3, you'll need to create an IAM role and a bucket policy that grants access to that role.
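As a rough sketch of what that write-only setup could look like on AWS (the bucket name, role ARN, and log prefix below are placeholders, not values from this thread): the bucket policy grants the cluster's instance-profile role `s3:PutObject` only, and the cluster is configured to deliver logs itself via the `cluster_log_conf` field of the Databricks Clusters API, so client code never needs a mount point to the log bucket.

```python
import json

# Placeholder names -- substitute your own bucket and IAM role ARN.
BUCKET = "driver-logs-bucket"
CLUSTER_ROLE_ARN = "arn:aws:iam::123456789012:role/databricks-cluster-log-writer"

# Bucket policy sketch: the cluster role may only write under the log prefix.
# Read/delete access is simply never granted, so client code cannot read or
# overwrite logs from other projects.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowClusterLogWrites",
            "Effect": "Allow",
            "Principal": {"AWS": CLUSTER_ROLE_ARN},
            "Action": ["s3:PutObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/driver-logs/*",
        }
    ],
}

# Log delivery sketch for the cluster spec (Databricks Clusters API):
# with an S3 destination, the driver/executor logs are shipped by the
# cluster itself using its instance profile.
cluster_log_conf = {
    "s3": {
        "destination": f"s3://{BUCKET}/driver-logs",
        "region": "us-west-2",
        "canned_acl": "bucket-owner-full-control",
    }
}

print(json.dumps(bucket_policy, indent=2))
```

This is only a sketch of the permissions shape, not a drop-in policy; you'd still scope it to your own account and prefix layout.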

For Azure, Databricks no longer recommends mounting external data locations to the Databricks Filesystem (https://docs.databricks.com/external-data/azure-storage.html#deprecated-patterns-for-storing-and-accessing-data-from-databricks), and there is no way to manage permissions when using mounts.

Aviral-Bhardwaj
Esteemed Contributor III

Check with your cloud provider.

