05-27-2024 06:13 AM - edited 05-27-2024 06:14 AM
When I try to create a DLT pipeline from a foreign catalog (BigQuery), I get this error: java.util.NoSuchElementException: key not found: user.
I used the same script to copy Salesforce data, and that worked fine.
05-27-2024 01:04 PM
Hello @ksenija ,
Are you able to query the data from this foreign catalog outside of the DLT pipeline?
If so, which channel are you using in your DLT pipeline?
If it is the CURRENT channel, could you please try running the pipeline on the PREVIEW channel and let me know the results?
Lakehouse Query Federation: https://docs.databricks.com/en/query-federation/index.html
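For reference, the channel can be set in the pipeline's JSON settings. A minimal sketch is below; the pipeline name, catalog, target schema, and notebook path are all illustrative placeholders, not values from your workspace:

```json
{
  "name": "example-pipeline",
  "channel": "PREVIEW",
  "serverless": true,
  "catalog": "main",
  "target": "my_schema",
  "libraries": [
    { "notebook": { "path": "/path/to/dlt_notebook" } }
  ]
}
```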
Best regards,
Lucas Rocha
05-27-2024 01:13 PM
Hello @lucasrocha,
Yes, I'm able to query my foreign catalog, and I'm already using the PREVIEW channel since CURRENT didn't work with Serverless.
Do you know if there is some setup that I'm missing?
Best regards,
Ksenija
05-28-2024 12:21 AM
Hello @lucasrocha,
This is the part of my code that defines the DLT tables:
import dlt
from pyspark.sql.functions import col

table = 'my_table'
schema = 'my_schema'

@dlt.view(name = table + "_tmp_view")
def source_view():
    return spark.readStream.table(schema + '.' + table)

dlt.create_streaming_table(name = table)

dlt.apply_changes(
    target = table,
    source = table + "_tmp_view",
    keys = ["id"],
    sequence_by = col("updated_at"),
    stored_as_scd_type = "1"
)
Best regards,
Ksenija
3 weeks ago
Hey @ksenija ,
Could you please share the full error stack trace so that we can further check?
Best regards,
Lucas Rocha
a week ago
Hi @lucasrocha ,
Any luck with this error? I suspect it's something with the connection to BigQuery, but I haven't found anything.
Best regards,
Ksenija