05-26-2025 06:47 AM
I have one Customer table and one temp view that I create from the incremental file and use as the source in a MERGE command. The notebook was working fine from the ADF pipeline, but for the past few days I have been getting an error stating that my target table is not a Delta table, even though the MERGE command executed properly before.
Is this a Databricks glitch, or is it really an issue on my side?
I am picking the catalog name from Key Vault, and it works fine for other tables.
Error - The feature is not supported: Table `[REDACTED]`.`schema-we-001`.`Customer` does not support MERGE. Please check the current catalog and namespace to make sure the qualified table name is expected, and also check the catalog implementation which is configured by "spark.sql.catalog". SQLSTATE: 0A000
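For context, the merge logic in the notebook is roughly the following (the secret scope, key, file path, and join column below are placeholders, not my exact values):
python:
# Catalog name is read from Key Vault via a secret scope (placeholder scope/key names)
catalog = dbutils.secrets.get(scope="kv-scope", key="catalog-name")

# Build a temp view from the incremental file (placeholder path and format)
incremental_df = spark.read.format("parquet").load("/mnt/landing/customer_incremental/")
incremental_df.createOrReplaceTempView("customer_updates")

# Merge the temp view into the target Customer table
spark.sql(f"""
    MERGE INTO {catalog}.`schema-we-001`.Customer AS tgt
    USING customer_updates AS src
    ON tgt.CustomerId = src.CustomerId
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")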
05-26-2025 09:54 AM
Hi @Sahil0007, is it a masked table or a normal table?
Which DBR version and access mode are you using?
Did you change the code or cluster configuration recently?
05-26-2025 08:23 PM - edited 05-26-2025 08:26 PM
Sorry, my bad, it was a normal table before. The DBR version I am currently using is 15.4 LTS, single user mode, on a job cluster.
05-26-2025 10:57 PM
Hi @Sahil0007,
Please switch to a DBR 16.2 single user mode cluster.
05-27-2025 12:32 AM
Should I use an interactive cluster in single user mode, or a job cluster with 16.2?
I tried using a 16.4 LTS job cluster but got the same error.
05-27-2025 01:27 AM
You can go with the job cluster.
Did you use DBR 16.4 in single user mode for the interactive cluster?
05-27-2025 01:49 AM
I have tried with a job cluster but am getting the same error. I can't use a single user mode interactive cluster because of the cost.
05-28-2025 04:55 AM - edited 05-28-2025 04:57 AM
Hi @Sahil0007 ,
The [REDACTED] value you're seeing is the catalog name retrieved from Key Vault; because it comes from a secret, it is being redacted instead of resolving to your actual catalog name.
Here is the workaround: you can reverse the value twice to get back the original string, or slice the string into two parts and concatenate them to reconstruct the actual value. Here's an example:
python:
# Rebuild the catalog name from two slices of the secret value
_catalog = key_vault_value[:-1] + key_vault_value[-1]
You can then use _catalog when querying the table.
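For completeness, a minimal sketch of both variants (key_vault_value is the string returned from your secret scope, and the split point in the slice example is arbitrary):
python:
# Reverse the value twice: reversing a string twice yields the original string
catalog_from_reverse = key_vault_value[::-1][::-1]

# Or split the value into two parts and concatenate them back together
catalog_from_slices = key_vault_value[:3] + key_vault_value[3:]

# Either reconstructed name can then be used to qualify the target table, e.g.
# spark.sql(f"MERGE INTO {catalog_from_slices}.`schema-we-001`.Customer AS tgt USING ...")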