Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Reading a materialised view locally or via the Databricks API

ashraf1395
Contributor II

Hi there, 

This was my previous approach:
- I had a Databricks notebook with a bronze-level streaming table reading data from Volumes, which created two downstream tables:

- first, a gold-level materialised view; second, a table storing ingestion metadata such as the most_recent_timestamp of events.

- The ingestion metadata table was shared using open Delta Sharing.

Then I would run an ingestion script from AWS ECS or locally: read the ingestion_metadata table via Delta Sharing, find the most_recent_timestamp, fetch further data from that timestamp onward, and then do the rest of the processing. This was working well; the only issue was that I had to run the Databricks notebook manually.
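The watermark lookup in that script can be sketched roughly like this; the column name, share coordinates, and sample data are placeholders, not details from the post:

```python
# Minimal sketch of the watermark lookup, assuming the shared
# ingestion_metadata table has a most_recent_timestamp column.
import pandas as pd

def latest_watermark(metadata_df: pd.DataFrame) -> pd.Timestamp:
    """Return the most recent event timestamp recorded in the metadata."""
    return metadata_df["most_recent_timestamp"].max()

# In the real script the DataFrame comes from Delta Sharing, e.g.:
#   import delta_sharing  # pip install delta-sharing
#   metadata_df = delta_sharing.load_as_pandas(
#       "config.share#my_share.my_schema.ingestion_metadata")
# Here a stub DataFrame stands in for the shared table:
metadata_df = pd.DataFrame({
    "most_recent_timestamp": pd.to_datetime(
        ["2024-01-01 00:00:00", "2024-01-02 12:30:00"]
    )
})
print(latest_watermark(metadata_df))  # everything after this point gets fetched
```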

 

So I shifted to a DLT pipeline.

Things are otherwise the same, but in DLT I cannot create a normal table: everything must be a streaming table, a materialised view, or a view,

and materialised views and views cannot be Delta-shared. I then tried to access the ingestion_metadata materialised view created by the DLT pipeline using the Databricks API, but I cannot read the actual data inside it.

How can I do this, or is there another way to solve this case?

1 ACCEPTED SOLUTION

Accepted Solutions

ashraf1395
Contributor II

I used this approach:
- querying the materialised view through a Databricks serverless SQL endpoint, connecting with the Databricks SQL connector.
It's working right now. If I face any issues, I will write the metadata into a normal table and Delta-share it.
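For reference, a minimal sketch of that approach using the databricks-sql-connector package; the environment variables, HTTP path, and table name are placeholders, not details from the post:

```python
import os

def watermark_query(table: str, ts_col: str = "most_recent_timestamp") -> str:
    """Build the query that reads the ingestion watermark from the MV."""
    return f"SELECT MAX({ts_col}) AS watermark FROM {table}"

def fetch_watermark(table: str):
    """Run the watermark query against a serverless SQL warehouse."""
    from databricks import sql  # pip install databricks-sql-connector

    with sql.connect(
        server_hostname=os.environ["DATABRICKS_HOST"],
        http_path=os.environ["DATABRICKS_HTTP_PATH"],  # /sql/1.0/warehouses/<id>
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(watermark_query(table))
            return cur.fetchone()[0]

print(watermark_query("my_catalog.my_schema.ingestion_metadata"))
```

Because the query runs on a SQL warehouse, it can read the materialised view directly, which plain Delta Sharing of the MV cannot.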

Thanks for your reply, @Kaniz_Fatma; as always, very helpful.


2 REPLIES

Kaniz_Fatma
Community Manager

Hi @ashraf1395,

  • As you’ve noticed, materialized views and views created in DLT cannot be directly shared using Delta Sharing. This is a limitation of the current implementation.
  • One workaround is to create a separate Delta table that stores the ingestion metadata. You can update this table within your DLT pipeline and then share it using Delta Sharing. This way, you can still access the most recent timestamp and other metadata.
  • If you need to access the data inside the materialized view created in the DLT pipeline, you can use the Databricks SQL API to query the materialized view and then write the results to a Delta table that can be shared.
  • To avoid running the Databricks notebook manually, you can schedule your DLT pipeline using Databricks Jobs. This will automate the ingestion process and ensure that your pipeline runs at regular intervals.
  • You can create a job in Databricks and set it to run your DLT pipeline. This way, the ingestion metadata table will be updated automatically, and you can fetch the most recent timestamp without manual intervention.
  • Another approach could be to use a combination of DLT for the main data processing and a separate Databricks notebook or job to handle the ingestion metadata. This notebook can read the materialized view, update the Delta table, and share it using Delta Sharing.

 

I hope these suggestions help you resolve your issues. If you have any more questions or need further assistance, feel free to ask!
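The "separate notebook or job" workaround above could look roughly like this; the table names are placeholders, and `spark` is the session available inside a Databricks notebook:

```python
def mirror_metadata(spark, source_view: str, target_table: str) -> None:
    """Copy the DLT-managed materialised view into a plain Delta table
    that can then be registered for Delta Sharing."""
    (spark.table(source_view)      # read the MV produced by the DLT pipeline
          .write
          .format("delta")
          .mode("overwrite")       # full refresh on each scheduled run
          .saveAsTable(target_table))

# Scheduled via a Databricks Job, e.g.:
# mirror_metadata(spark,
#                 "my_catalog.my_schema.ingestion_metadata_mv",
#                 "my_catalog.my_schema.ingestion_metadata_shared")
```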

