Data Governance

Issue with Delta Live Tables access from a workflow (DLT pipelines)

Garik
New Contributor III

Hi all,

I am trying to write data from an external S3 bucket to Delta Live Tables in Unity Catalog (not DBFS) from a workflow. I am seeing the following error saying the catalog namespace is not supported (please check the full error below).
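For context, a minimal sketch of the kind of pipeline definition that hits this error. All table and bucket names here are hypothetical, and this only runs inside a Databricks DLT pipeline (the `dlt` module and the ambient `spark` session are not available elsewhere):

```python
# Runs only inside a Databricks DLT pipeline; `dlt` is not a standalone package.
import dlt

@dlt.table(name="bronze_events", comment="Raw events landed from S3")
def bronze_events():
    # Ingest raw JSON files from an external S3 bucket via Auto Loader
    # (bucket path is hypothetical).
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("s3://my-bucket/events/")
    )

@dlt.table(name="silver_events")
def silver_events():
    # This is the kind of call that fails: spark.table() with a three-level
    # catalog.schema.table name, which DLT could not resolve at the time
    # of this thread ("Catalog namespace is not supported").
    dim = spark.table("main.analytics.reference_dim")  # hypothetical UC table
    return dlt.read("bronze_events").join(dim, "event_id", "left")
```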

Error Details:

py4j.Py4JException: An exception was raised by the Python Proxy. Return Message: Traceback (most recent call last):
  File "/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/clientserver.py", line 617, in _call_proxy
    return_value = getattr(self.pool[obj_id], method)(*params)
  File "/databricks/spark/python/dlt/helpers.py", line 26, in call
    res = self.func()
  File "<command--1>", line 4, in demo_purpose
    return base64.b64decode("L1VzZXJzL3NyaWthbnRoQG1vdmlzdGEuY29tL3Rlc3R2ZXJzaW9uX2RsdF8y").decode("utf-8")
  File "<command--1>", line 30, in dlt_table_fn
  File "/databricks/spark/python/pyspark/sql/session.py", line 1120, in table
    return DataFrame(self._jsparkSession.table(tableName), self)
  File "/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py", line 1321, in __call__
    return_value = get_return_value(
  File "/databricks/spark/python/pyspark/sql/utils.py", line 202, in deco
    raise converted from None
pyspark.sql.utils.AnalysisException: Catalog namespace is not supported.
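The "Catalog namespace is not supported" message comes from resolving a three-level table name. Two-level names (schema.table) resolve against the Hive metastore, while three-level names (catalog.schema.table) require Unity Catalog, which DLT could not target at the time of this thread. A small hypothetical helper to illustrate the naming difference:

```python
def split_table_name(name: str) -> dict:
    """Split a dotted table identifier into catalog/schema/table parts.

    Two-level names (schema.table) resolve against the Hive metastore;
    three-level names (catalog.schema.table) require Unity Catalog.
    """
    parts = name.split(".")
    if len(parts) == 2:
        return {"catalog": None, "schema": parts[0], "table": parts[1]}
    if len(parts) == 3:
        return {"catalog": parts[0], "schema": parts[1], "table": parts[2]}
    raise ValueError(f"expected schema.table or catalog.schema.table, got {name!r}")
```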

6 REPLIES

Pat
Honored Contributor III

Hi @Srikanth Garik,

Currently DLT is not supported with Unity Catalog. The latest ETA I've heard is the end of November.

Please see here for more details:

https://docs.databricks.com/release-notes/unity-catalog/20220825.html#unity-catalog-ga-limitations

Referencing Unity Catalog tables from Delta Live Tables pipelines is currently not supported.

You might try to join the Databricks Office Hours tomorrow and ask the question or see if there is any update:

https://community.databricks.com/s/feed/0D58Y00009NH0xZSAT

thanks,

Pat.

Garik
New Contributor III

Thanks @Pat Sienkiewicz for the information; that helped a lot. We will see if they release a new version of Databricks that enables communication between DLT and Unity Catalog.

Garik
New Contributor III

Hi @Kaniz, it seems DLT doesn't talk to Unity Catalog currently. So we are thinking of developing the whole warehouse either in DLT or in Unity Catalog. But I guess DLT doesn't have a data lineage option, and Unity Catalog doesn't have change data feed (CDC, change data capture). As we are leveraging Kafka for CDC, I guess for our environment it is better to develop the data warehouse layers (bronze, silver, gold) in Unity Catalog, as we can leverage data lineage and other data governance capabilities.

Anonymous
Not applicable

Hi @Srikanth Garik​ 

Hope all is well!

Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

Garik
New Contributor III

Hi Vidula, as DBFS and Unity Catalog can communicate within workflows, we might have to use a different approach for orchestration, or just use notebooks for all the code.

Sulfikkar
Contributor

How long should we wait before we can use DLT on top of Unity Catalog?
