Data Engineering

DLT pipeline error key not found: user

ksenija
Contributor

When I try to create a DLT pipeline from a foreign catalog (BigQuery), I get this error: java.util.NoSuchElementException: key not found: user.

I used the same script to copy Salesforce data and that worked completely fine.

8 REPLIES

lucasrocha
Databricks Employee

Hello @ksenija ,

Are you able to query the data from this foreign catalog outside of the DLT pipeline?
If so, which channel are you using in your DLT pipeline?
If it is the CURRENT channel, could you please try running the pipeline on the PREVIEW channel and let me know the results?

Lakehouse Federation (query federation): https://docs.databricks.com/en/query-federation/index.html
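
For the first check, something along these lines in a regular notebook (outside the pipeline) should confirm the federated source is readable. The three-level name below is a placeholder:

# Read the federated table directly through Unity Catalog, outside of DLT.
# Replace the placeholder with your BigQuery foreign catalog, schema and table.
df = spark.table("my_bq_catalog.my_schema.my_table")
df.printSchema()
display(df.limit(10))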

Best regards,
Lucas Rocha

Hello @lucasrocha,

Yes, I'm able to query my foreign catalog, and I'm already using the PREVIEW channel since CURRENT didn't work with Serverless.

Do you know if there is some setup that I'm missing?

Best regards,

Ksenija

lucasrocha
Databricks Employee

Hey @ksenija ,

Can you share more details? Any source code example?

Best regards,
Lucas Rocha

Hello @lucasrocha,

This is the part of my code that creates the DLT table:

import dlt
from pyspark.sql.functions import col

table = 'my_table'
schema = 'my_schema'

# The @dlt.view decorator wraps a function that returns the source DataFrame
@dlt.view(name=table + "_tmp_view")
def source_view():
    return spark.readStream.table(schema + '.' + table)

dlt.create_streaming_table(name=table)

dlt.apply_changes(
    target=table,
    source=table + "_tmp_view",
    keys=["id"],
    sequence_by=col("updated_at"),
    stored_as_scd_type="1"
)
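
One thing that may be worth double-checking (an assumption, not a confirmed cause): the pipeline's default catalog is usually the target catalog rather than the BigQuery foreign catalog, so a two-level name like schema.table may not resolve to the federated table. A variant of the view using the fully qualified three-level name, with a placeholder catalog name:

catalog = 'my_bq_catalog'  # placeholder: name of the BigQuery foreign catalog

# If used, this replaces the view definition above.
@dlt.view(name=table + "_tmp_view")
def source_view():
    return spark.readStream.table(catalog + '.' + schema + '.' + table)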

Best regards,

Ksenija

Hello @lucasrocha ,

Did you have time to check my code?

Thanks in advance,

Ksenija

lucasrocha
Databricks Employee

Hey @ksenija , 

Could you please share the full error stack trace so that we can further check?

Best regards,
Lucas Rocha

Hi @lucasrocha ,

Here is the whole error.

Best regards,

Ksenija

ksenija
Contributor

Hi @lucasrocha ,

Any luck with this error? I guess it's something with the connection to BigQuery, but I didn't find anything.
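
For reference, a quick way to rule out the connection itself would be to browse the foreign catalog directly (placeholder names again):

# If these succeed, the BigQuery connection and foreign catalog are reachable,
# and the problem is more likely specific to the DLT streaming read.
spark.sql("SHOW SCHEMAS IN my_bq_catalog").show()
spark.sql("SHOW TABLES IN my_bq_catalog.my_schema").show()
spark.sql("SELECT COUNT(*) FROM my_bq_catalog.my_schema.my_table").show()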

Best regards,

Ksenija
