
DLT with unity catalog and ML

oteng
New Contributor III

We are currently using DLT with Unity Catalog. The DLT tables are created as materialized views in a schema inside a catalog.

When we try to access these materialized views using an ML runtime cluster (e.g. 13.0 ML), it says that we must use Single User security mode. However, Single User security mode cannot access materialized views; it throws the error [MATERIALIZED_VIEW_OPERATION_NOT_ALLOWED.REQUIRES_SHARED_COMPUTE].

Is there any way to use DLT, Unity Catalog, and ML all combined? We could create a notebook that copies the DLT materialized views into Delta tables, but then there doesn't seem to be much point in using DLT.
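
For illustration, a minimal sketch of what such a copy notebook might look like. All catalog/schema/table names here are hypothetical, and the copy itself has to run on a cluster in Shared access mode (which can read the materialized views); the resulting Delta table is then readable from a Single User ML cluster:

```python
# Hypothetical identifiers -- replace with your own catalog/schema/table names.
src = "main.dlt_schema.my_materialized_view"   # DLT-managed materialized view
dst = "main.ml_schema.my_training_table"       # plain Delta table for ML clusters

# Full snapshot copy: read the materialized view (requires Shared access mode)
# and overwrite the Delta table that the ML cluster will read.
(
    spark.table(src)
    .write
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable(dst)
)
```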

Are we using DLT with Unity Catalog incorrectly? Should it only be used for bronze ingestion and silver-layer transformations, with plain Delta tables used for the gold layer?

5 REPLIES

pg5
New Contributor III

I recently hit the same issue.
Seems like this is a limitation of DLT with Unity Catalog.

Did you find a workaround @oteng? Otherwise I will try copying the materialized views to a table before doing the ML work.

oteng
New Contributor III

No workaround was found. We are just copying all the tables to do the ML work. We haven't looked at this for a while though, so we are not aware of any new features that might have changed this.

pg5
New Contributor III

I managed to get some information from a friend at Databricks. Copying the tables in a separate workflow seems to be the best workaround for now.

MarkusFra
New Contributor III

Is there any update on this? Basically, you cannot access materialized views with ML clusters. Copying all tables for our data scientists seems like a really unnecessary step. They also cannot benefit from incremental table updates the way users on shared clusters or SQL warehouses can.

pg5
New Contributor III

No updates as far as I am aware.
You could make the workflow that copies the data smarter and only apply incremental updates, but that seems like a lot of effort.
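
For what it's worth, a rough sketch of such an incremental copy, assuming the table has a stable key column (here called `id`, purely illustrative) and the job runs on shared compute that can read the materialized view:

```python
from delta.tables import DeltaTable

src = "main.dlt_schema.my_materialized_view"   # hypothetical names, as above
dst = "main.ml_schema.my_training_table"

source_df = spark.table(src)  # must run on Shared access mode compute

if not spark.catalog.tableExists(dst):
    # First run: take a full snapshot.
    source_df.write.saveAsTable(dst)
else:
    # Later runs: upsert changed rows on the key instead of rewriting everything.
    (
        DeltaTable.forName(spark, dst)
        .alias("t")
        .merge(source_df.alias("s"), "t.id = s.id")  # 'id' is an assumed stable key
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
```

Note that a plain upsert like this won't propagate deletes from the source view; handling those (e.g. with whenNotMatchedBySourceDelete in newer Delta releases) adds complexity, which is part of why a full overwrite is often the simpler choice.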
