01-17-2023 06:42 AM
Hi Databricks community,
I have searched the internet for quite a while but did not find an answer. If I have configured the Azure Data Lake connection in Unity Catalog, is it possible to grant users access to a specific file or folder? The examples I have seen so far cover structured data only.
Thanks.
01-17-2023 07:20 AM
No.
Unity Catalog enforces permissions at the table level (and catalog, schema, etc.), but not at the storage level.
Unity Catalog itself uses a managed identity or service principal for storage access, by the way; that identity needs access to the data lake.
What you can do is create dynamic views for a row-level security setup.
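For illustration, a dynamic view for row-level security could look something like this (a minimal sketch; the catalog, table, column, and group names are placeholders):

```sql
-- Minimal sketch of a dynamic view for row-level security.
-- Catalog, table, column, and group names are placeholders.
CREATE VIEW main.default.orders_filtered AS
SELECT *
FROM main.default.orders
WHERE
  CASE
    WHEN is_account_group_member('admins')     THEN TRUE
    WHEN is_account_group_member('sales_emea') THEN region = 'EMEA'
    ELSE FALSE
  END;
```

Users query the view; the underlying table stays restricted.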
01-17-2023 12:31 PM
As @werners said, the service principal needs to have access at the file level.
In Unity Catalog, you can use the READ FILES / WRITE FILES privileges to give someone the ability to read files at the storage level (but through Databricks); see the sketch below.
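For example (a minimal sketch; the external location name and the principals are placeholders):

```sql
-- READ FILES on an external location lets the grantee read files from
-- that storage path via Databricks. All names below are placeholders.
GRANT READ FILES ON EXTERNAL LOCATION my_datalake_location TO `user_a@example.com`;

-- WRITE FILES similarly allows writing files to the location.
GRANT WRITE FILES ON EXTERNAL LOCATION my_datalake_location TO `etl_jobs`;
```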
01-26-2023 03:26 AM
Hi @Hubert Dudek @Werner Stinckens, thank you for the idea. In our scenario, we need to share files that sit inside the same folder in the Azure data lake.
Imagine we have a folder ORDER001 containing file1, file2, and file3. Can we use Databricks to grant user A access to file1 and file2, but user B access only to file3?
Some people have suggested copying the files out into a separate container. However, this unavoidably creates duplication, which we would like to avoid.
Do you have an idea how access control in this scenario could be achieved through Databricks?
Many thanks
01-26-2023 09:08 AM
It is messy, as I don't know what the files are. Two options:
- Unstructured data can be included in a Delta / metastore table (as an array or binary column); see the sketch after this list.
- You could also put these files outside of Databricks and manage access separately.
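As a rough sketch of the first option (all table, file, and group names here are hypothetical), the file contents could be stored in a Delta table and exposed per group through a dynamic view, which would cover the ORDER001 scenario above:

```sql
-- Hypothetical sketch: store file contents as binary in a Delta table
-- and control per-file access with a dynamic view.
CREATE TABLE main.default.order_files (
  order_id  STRING,
  file_name STRING,
  content   BINARY
);

-- Grant users access to the view only, not to the underlying table.
CREATE VIEW main.default.order_files_secured AS
SELECT *
FROM main.default.order_files
WHERE
  (is_account_group_member('group_a') AND file_name IN ('file1', 'file2'))
  OR (is_account_group_member('group_b') AND file_name = 'file3');
```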
@Werner Stinckens, is it possible to have Unity Catalog and mount another storage container under the DBFS path using credential passthrough?
01-27-2023 01:05 AM
I am not sure. Someone at Databricks once told me that mounts and Unity Catalog are not friends.
The easiest way to achieve this at the file level is to manage access on the storage side itself (e.g. with ADLS ACLs).
Frankly, using ACLs always gets on my nerves. Hard to maintain.