Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Data is not loaded when creating two different streaming tables from one Delta Live Tables pipeline

zero234
New Contributor III

I am trying to create two streaming tables in one DLT pipeline. Both read JSON data from different locations and have different schemas. The pipeline executes, but no data is inserted into either table, whereas when I run each table individually it executes perfectly.

Is it because DLT cannot process two different streaming tables at once?

DF = spark.readStream.format("json") \
      .schema(schema) \
      .option("header", True) \
      .option("nullValue", "") \
      .load(source_path + "/*.json")
1 REPLY

Mounika_Tarigop
Databricks Employee

Delta Live Tables (DLT) can indeed process multiple streaming tables within a single pipeline.

Here are a few things to check:

1) Verify that each streaming table has a unique checkpoint location. Checkpointing is crucial for maintaining the state of streaming queries, and conflicts can arise if the same location is used for multiple streams.

2) Ensure that first_schema and second_schema are defined correctly and match the structure of the JSON data in first_source_path and second_source_path, respectively.
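For reference, here is a minimal sketch of a single DLT pipeline that defines two streaming tables, each reading JSON from its own location with its own schema. The table names, schema fields, and source paths are illustrative placeholders only; the names first_schema / second_schema and first_source_path / second_source_path are reused from the points above, not taken from the original pipeline.

import dlt
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

# Illustrative schemas -- replace with the actual structure of each JSON source.
first_schema = StructType([
    StructField("id", IntegerType(), True),
    StructField("name", StringType(), True),
])
second_schema = StructType([
    StructField("event_id", StringType(), True),
    StructField("ts", StringType(), True),
])

# Illustrative source locations.
first_source_path = "/mnt/raw/first_source"
second_source_path = "/mnt/raw/second_source"

@dlt.table(name="first_streaming_table")
def first_streaming_table():
    # DLT maintains separate streaming state for each table it manages.
    return (
        spark.readStream.format("json")
        .schema(first_schema)
        .option("nullValue", "")
        .load(first_source_path + "/*.json")
    )

@dlt.table(name="second_streaming_table")
def second_streaming_table():
    return (
        spark.readStream.format("json")
        .schema(second_schema)
        .option("nullValue", "")
        .load(second_source_path + "/*.json")
    )

If each table loads data when defined alone but not when both are in the pipeline, comparing each declared schema against the output of spark.read.json(...).printSchema() on a small sample of the corresponding source can help confirm point 2.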
