
Issue with Delta Live Tables access from workflows (DLT pipelines)

Garik
New Contributor III

Hi all,

I am trying to write data from an external S3 bucket to Delta Live Tables in Unity Catalog (not DBFS) from a workflow. I am seeing the following error saying the catalog namespace is not supported; please see the full error below.

Error details:

py4j.Py4JException: An exception was raised by the Python Proxy. Return Message: Traceback (most recent call last):
  File "/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/clientserver.py", line 617, in _call_proxy
    return_value = getattr(self.pool[obj_id], method)(*params)
  File "/databricks/spark/python/dlt/helpers.py", line 26, in call
    res = self.func()
  File "<command--1>", line 4, in demo_purpose
    return base64.b64decode("L1VzZXJzL3NyaWthbnRoQG1vdmlzdGEuY29tL3Rlc3R2ZXJzaW9uX2RsdF8y").decode("utf-8")
  File "<command--1>", line 30, in dlt_table_fn
  File "/databricks/spark/python/pyspark/sql/session.py", line 1120, in table
    return DataFrame(self._jsparkSession.table(tableName), self)
  File "/databricks/spark/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py", line 1321, in __call__
    return_value = get_return_value(
  File "/databricks/spark/python/pyspark/sql/utils.py", line 202, in deco
    raise converted from None
pyspark.sql.utils.AnalysisException: Catalog namespace is not supported.
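For context on the last frame: Unity Catalog tables are referenced by three-level identifiers (catalog.schema.table), while the DLT runtime at the time only resolved two-level Hive-metastore names, so any three-level reference passed to `spark.table(...)` inside a pipeline raised the AnalysisException above. A minimal sketch of the distinction (the table names below are hypothetical examples, not from the original post):

```python
def uses_catalog_namespace(table_name: str) -> bool:
    """True if the identifier is a three-level Unity Catalog name
    (catalog.schema.table) rather than a two-level Hive-metastore one.
    Simplified sketch: ignores backtick-quoted identifiers."""
    return table_name.count(".") == 2

# Hypothetical identifiers:
print(uses_catalog_namespace("main.sales.orders"))  # True  -> rejected by pre-UC DLT
print(uses_catalog_namespace("default.orders"))     # False -> resolvable via hive_metastore
```

Two-level names continued to work in DLT pipelines because they resolve through the legacy Hive metastore rather than a Unity Catalog namespace.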

7 REPLIES

Pat
Honored Contributor III

Hi @Srikanth Garik

DLT is currently not supported with Unity Catalog. The latest ETA I've heard is the end of November.

Please see here for more details:

https://docs.databricks.com/release-notes/unity-catalog/20220825.html#unity-catalog-ga-limitations

Referencing Unity Catalog tables from Delta Live Tables pipelines is currently not supported.

You might try to join the Databricks Office Hours tomorrow and ask the question or see if there is any update:

https://community.databricks.com/s/feed/0D58Y00009NH0xZSAT

thanks,

Pat.

Garik
New Contributor III

Thanks @Pat Sienkiewicz for the information, that helped a lot. We will see when Databricks releases a new version that lets DLT pipelines talk to Unity Catalog.

Kaniz
Community Manager
Community Manager

Hi @Srikanth Garik, we haven't heard from you since the last response from @Pat Sienkiewicz, and I was checking back to see if the suggestions helped you.

If you have found a solution, please share it with the community, as it can be helpful to others.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.

Garik
New Contributor III

Hi @Kaniz, it seems like DLT can't talk to Unity Catalog currently, so we are deciding whether to build the whole warehouse in DLT or in Unity Catalog. As I understand it, DLT does not have a data lineage option, and Unity Catalog does not have change data feed (CDC, change data capture). Since we are leveraging Kafka for CDC, I think for our environment it is better to build the data warehouse layers (bronze, silver, gold) in Unity Catalog, where we can leverage data lineage and other data governance capabilities.

Anonymous
Not applicable

Hi @Srikanth Garik

Hope all is well!

Just wanted to check in to see if you were able to resolve your issue. Would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help.

We'd love to hear from you.

Thanks!

Garik
New Contributor III

Hi Vidula, as DBFS and Unity Catalog can communicate within workflows, we might have to use a different approach for orchestration, or just use notebooks for all the code.

Sulfikkar
Contributor

How long should we wait to use DLT on top of the Unity catalog?
