05-27-2024 06:13 AM - edited 05-27-2024 06:14 AM
When I try to create a DLT pipeline from a foreign catalog (BigQuery), I get this error: java.util.NoSuchElementException: key not found: user.
I used the same script to copy Salesforce data and that worked fine.
05-27-2024 01:04 PM
Hello @ksenija ,
Are you able to query the data from this foreign catalog outside of the DLT pipeline?
If so, which channel is your DLT pipeline using?
If it is the CURRENT channel, could you please try running the pipeline on the PREVIEW channel and let me know the results?
Lakehouse Federation: https://docs.databricks.com/en/query-federation/index.html
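For reference, the channel can be set in the pipeline's settings JSON. A minimal sketch, assuming a serverless pipeline; the pipeline name and notebook path are placeholders:

```json
{
  "name": "my_dlt_pipeline",
  "channel": "PREVIEW",
  "serverless": true,
  "libraries": [
    { "notebook": { "path": "/path/to/pipeline_notebook" } }
  ]
}
```

The same setting is available in the pipeline UI under Settings > Advanced > Channel.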
Best regards,
Lucas Rocha
05-27-2024 01:13 PM
Hello @lucasrocha,
Yes, I'm able to query my foreign catalog and I'm already using PREVIEW since CURRENT didn't work with Serverless.
Do you know if there is some setup that I'm missing?
Best regards,
Ksenija
05-28-2024 12:21 AM
Hello @lucasrocha,
This is the part of my code that defines the DLT tables:
import dlt
from pyspark.sql.functions import col

table = 'my_table'
schema = 'my_schema'

# Temporary view streaming from the foreign (BigQuery) catalog table
@dlt.view(name=table + "_tmp_view")
def source_view():
    return spark.readStream.table(schema + '.' + table)

dlt.create_streaming_table(name=table)

# Upsert changes from the view into the target table as SCD type 1
dlt.apply_changes(
    target=table,
    source=table + "_tmp_view",
    keys=["id"],
    sequence_by=col("updated_at"),
    stored_as_scd_type="1"
)
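As a sanity check outside the pipeline, the same source table can be read directly with its full three-level name (catalog.schema.table), which is how Lakehouse Federation exposes foreign tables. A minimal sketch; the catalog name `bigquery_catalog` is a placeholder, and the Spark read is shown commented out since it needs an active Databricks session:

```python
catalog = "bigquery_catalog"  # placeholder: use the actual foreign catalog name
schema = "my_schema"
table = "my_table"

# Fully qualified three-level name used by Unity Catalog / Lakehouse Federation
qualified_name = f"{catalog}.{schema}.{table}"
print(qualified_name)

# In a Databricks notebook with an active SparkSession, this checks whether the
# federated table is readable at all, independent of DLT:
# df = spark.read.table(qualified_name)
# df.printSchema()
```

If the direct read also fails, the problem is in the federation connection rather than the pipeline code.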
Best regards,
Ksenija
06-05-2024 01:36 PM
Hey @ksenija ,
Could you please share the full error stack trace so that we can further check?
Best regards,
Lucas Rocha
06-19-2024 08:36 AM
Hi @lucasrocha ,
Any luck with this error? I suspect it's something with the connection to BigQuery, but I haven't found anything.
Best regards,
Ksenija