Data Governance
Join discussions on data governance practices, compliance, and security within the Databricks Community. Exchange strategies and insights to ensure data integrity and regulatory compliance.

Issue with Delta Live Tables access from a workflow (DLT pipelines)

New Contributor III

Hi all,

I am trying to write data from an external S3 bucket to Delta Live Tables in Unity Catalog (not DBFS) from a workflow. I am seeing an error saying the catalog namespace is not supported. Please see the full error below.

Error details:

py4j.Py4JException: An exception was raised by the Python Proxy. Return Message: Traceback (most recent call last):
  File "/databricks/spark/python/lib/", line 617, in _call_proxy
    return_value = getattr(self.pool[obj_id], method)(*params)
  File "/databricks/spark/python/dlt/", line 26, in call
    res = self.func()
  File "<command--1>", line 4, in demo_purpose
    return base64.b64decode("L1VzZXJzL3NyaWthbnRoQG1vdmlzdGEuY29tL3Rlc3R2ZXJzaW9uX2RsdF8y").decode("utf-8")
  File "<command--1>", line 30, in dlt_table_fn
  File "/databricks/spark/python/pyspark/sql/", line 1120, in table
    return DataFrame(self._jsparkSession.table(tableName), self)
  File "/databricks/spark/python/lib/", line 1321, in __call__
    return_value = get_return_value(
  File "/databricks/spark/python/pyspark/sql/", line 202, in deco
    raise converted from None
pyspark.sql.utils.AnalysisException: Catalog namespace is not supported.
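The AnalysisException above arises from passing a three-level Unity Catalog table reference (catalog.schema.table) to an API that, at the time of this thread, only resolved two-level Hive metastore names (schema.table). A minimal sketch of that distinction (the table names are hypothetical, invented for illustration):

```python
def uses_catalog_namespace(table_name: str) -> bool:
    """Return True if table_name is a three-level (Unity Catalog) reference.

    A name like "main.sales.orders" addresses catalog.schema.table and, in
    DLT pipelines of this era, triggered "Catalog namespace is not
    supported"; a two-level "sales.orders" resolved in the Hive metastore.
    """
    return table_name.count(".") >= 2


# e.g. spark.read.table("main.sales.orders") would fail inside a DLT
# pipeline, while spark.read.table("sales.orders") would not.
print(uses_catalog_namespace("main.sales.orders"))  # three-level: True
print(uses_catalog_namespace("sales.orders"))       # two-level: False
```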


Honored Contributor III

Hi @Srikanth Garik​ 

DLT is currently not supported with Unity Catalog. The latest ETA I've heard is the end of November.

Please see here for more details:

Referencing Unity Catalog tables from Delta Live Tables pipelines is currently not supported.

You might try joining the Databricks Office Hours tomorrow to ask the question, or to see if there is any update.
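Until Unity Catalog support lands, a DLT pipeline can still publish its tables to the Hive metastore via the `target` setting in the pipeline configuration. A minimal sketch of such pipeline settings; all names and paths here are hypothetical:

```json
{
  "name": "example_dlt_pipeline",
  "storage": "s3://example-bucket/dlt/storage",
  "target": "my_hive_schema",
  "libraries": [
    { "notebook": { "path": "/Repos/example/dlt_pipeline_notebook" } }
  ],
  "continuous": false
}
```

With `target` set, the pipeline's tables should appear under that schema in the Hive metastore, so downstream workflows can query them with a two-level name rather than a three-level catalog reference.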



New Contributor III

Thanks @Pat Sienkiewicz​ for the information, that helped a lot. We will see if they release a new version of Databricks that enables communication between DLT and Unity Catalog.

Community Manager

Hi @Srikanth Garik​, we haven't heard from you since @Pat Sienkiewicz​'s last response, and I was checking back to see if the suggestions helped you.

If you have found a solution, please share it with the community, as it can be helpful to others.

Also, please don't forget to click the "Select As Best" button whenever a reply resolves your question.

New Contributor III

Hi @Kaniz, it seems like DLT doesn't talk to Unity Catalog currently. So we are thinking of developing the whole warehouse either in DLT or in Unity Catalog. But I guess DLT doesn't have a data lineage option, and Unity Catalog doesn't have change data feed (CDC, change data capture), and we are leveraging Kafka for CDC. I guess for our environment it is better to develop the data warehouse layers (bronze, silver, gold) in Unity Catalog, as we can leverage data lineage and other data governance capabilities.
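The bronze/silver/gold layout described above maps naturally onto three-level Unity Catalog names, one schema per layer. A minimal sketch of that naming scheme; the catalog and table names are invented for illustration:

```python
# Medallion layers modeled as Unity Catalog schemas (hypothetical names).
LAYERS = ("bronze", "silver", "gold")


def qualified_name(catalog: str, layer: str, table: str) -> str:
    """Build a three-level Unity Catalog table reference: catalog.schema.table."""
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer!r}")
    return f"{catalog}.{layer}.{table}"


# e.g. spark.read.table(qualified_name("lakehouse", "silver", "orders"))
print(qualified_name("lakehouse", "bronze", "orders_raw"))  # lakehouse.bronze.orders_raw
```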

Not applicable

Hi @Srikanth Garik​ 

Hope all is well!

Just wanted to check in: were you able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.


New Contributor III

Hi Vidula, since DBFS and Unity Catalog can communicate within workflows, we might have to use a different approach for orchestration, or just use notebooks for all the code.


How long should we wait to use DLT on top of Unity Catalog?
