Delta Live Tables INSUFFICIENT_PERMISSIONS
10-02-2024 11:33 AM
I have a Delta Live Tables pipeline that reads from a Delta table, then applies three layers of transformations before merging the legs of the pipeline and writing the output. I am getting an INSUFFICIENT_PERMISSIONS error when I run the pipeline against Unity Catalog.

In the first step, the pipeline applies filter and transformation logic that is configured from an Azure SQL database. In the second step, it joins against data in the same Azure SQL database.

The pipeline is not using a cluster policy, and it has Photon enabled.
Steps I have taken to debug:
1. I shut off the join logic and passed the input straight through to table 2 as the output; the pipeline works fine.
2. I ran the exact join code against the input table data inside a Databricks notebook with no errors.
3. I ran the exact pipeline against the Hive metastore with no errors.
4. I compared the query execution plans from the pipeline's log4j logs to the query plan from my notebook, and nothing stands out. I have also meticulously combed the logs for any other clues, to no avail.
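For context, debugging step 1 amounted to replacing the second step's join with a simple pass-through. A minimal DLT SQL sketch of that isolation technique (the table and column names here are hypothetical placeholders, not taken from my actual pipeline):

```sql
-- Original second step (simplified): enrich table_1 by joining reference data.
-- CREATE OR REFRESH LIVE TABLE table_2 AS
--   SELECT t.*, r.enrichment_col
--   FROM LIVE.table_1 t
--   JOIN LIVE.reference_data r ON t.key = r.key;

-- Debugging variant: pass the input straight through, keeping the
-- pipeline graph intact so that only the join logic is removed.
CREATE OR REFRESH LIVE TABLE table_2 AS
SELECT * FROM LIVE.table_1;
```

With this variant in place the pipeline succeeds, which is why the join (or the permissions it exercises) is the prime suspect.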
Also, I have gotten this error before, and at the time it was masking other issues that were not directly related to permissions.
- Labels:
  - Delta Lake
  - Spark
  - Workflows
10-02-2024 11:47 PM
Hi @dadrake3
You are encountering this error only when you run against Unity Catalog. In Unity Catalog, table access control is enabled by default, so you must grant the SELECT permission on the files so that the user who is running the DLT pipeline can access them.

You can follow the KB article below:
https://kb.databricks.com/en_US/data/user-does-not-have-permission-select-on-any-file
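The fix described in that KB article boils down to a grant along these lines. This is a sketch assuming the `SELECT ON ANY FILE` table-ACL grant the article covers; the principal is a placeholder, and the statement should be run by a workspace admin:

```sql
-- Grant read access on files to the user (or service principal)
-- that runs the DLT pipeline. The email below is a placeholder.
GRANT SELECT ON ANY FILE TO `pipeline_user@example.com`;
```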
Please leave a like if it is helpful. Follow-ups are appreciated.
Kudos,
Sai Kumar
10-03-2024 06:57 AM
I don't see how that can be the underlying issue because:
1. The first step of the pipeline, which reads from Unity Catalog and Azure SQL, works just fine.
2. When I remove the enrichment logic from the second step of the pipeline and just pass the table input through as its output, it also works just fine.

I have also seen this error mask the true error multiple times in the past.

