3 weeks ago
Hello community,
We're cloning (deep clones) data objects of the production catalog to our non-production catalog weekly. The non-production catalog is used to run our DBT transformation to ensure we're not breaking any production models.
Lately, we have seen several cases where, for certain schemas, every table and view in the schema throws the error class TABLE_OR_VIEW_NOT_FOUND. As mentioned, we only started facing these issues very recently.
As a workaround we have dropped and re-cloned the affected tables, but this is not a viable solution for much longer.
Is anyone experiencing, or has anyone experienced, similar issues with clones?
Thanks
3 weeks ago
Hi, @adisalj. Facing issues with TABLE_OR_VIEW_NOT_FOUND errors after cloning data objects can be frustrating.
Let's explore some potential reasons and solutions:
- Schema Mismatch
- Dependency Order
- Permissions and Access
- Metadata Refresh
- Logging and Debugging
- Automated Validation (see the sketch after this list)
- Database Engine Specifics
- Version Control and Rollbacks
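For the Metadata Refresh and Automated Validation points, a minimal sketch like the one below can surface broken tables in the cloned catalog before your DBT run. The catalog name is a placeholder, not taken from your setup, and `spark` is the built-in SparkSession of a Databricks notebook.
```python
# Hypothetical validation sketch for a Databricks notebook (placeholder names).
target_catalog = "staging"  # placeholder for your non-production catalog

# SHOW SCHEMAS returns one row per schema; index by position to stay
# independent of the result column name across runtime versions.
schemas = [r[0] for r in spark.sql(f"SHOW SCHEMAS IN {target_catalog}").collect()
           if r[0] != "information_schema"]

broken = []
for schema in schemas:
    for r in spark.sql(f"SHOW TABLES IN {target_catalog}.{schema}").collect():
        fqn = f"{target_catalog}.{schema}.{r.tableName}"
        try:
            # A trivial read is enough to trigger TABLE_OR_VIEW_NOT_FOUND early.
            spark.sql(f"SELECT 1 FROM {fqn} LIMIT 1").collect()
        except Exception as exc:
            broken.append((fqn, str(exc)))

for fqn, err in broken:
    print(f"Cannot read {fqn}: {err}")
```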
Remember that debugging such issues often involves a combination of trial and error, thorough investigation, and collaboration with your database administrators or platform support. If the problem persists, contact the Databricks support team for further assistance.
Wednesday
Hi Kaniz,
The issue only persisted for a certain timeframe, and everything is now working as expected. What worked was a full refresh of the clones instead of the CREATE OR REPLACE approach.
I will investigate in detail if this error occurs again.
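For anyone who lands on this thread later: roughly, the difference between the two approaches looks like this. The catalog and table names are placeholders, not our exact notebook.
```python
# Rough sketch of the two approaches (placeholder names, not our exact notebook).
source = "prod_catalog.sales.orders"      # placeholder
target = "staging_catalog.sales.orders"   # placeholder

# What we ran originally: replace the clone in place.
spark.sql(f"CREATE OR REPLACE TABLE {target} DEEP CLONE {source}")

# The "full refresh" that resolved it for us: drop the target first,
# then re-clone from scratch instead of replacing it in place.
spark.sql(f"DROP TABLE IF EXISTS {target}")
spark.sql(f"CREATE TABLE {target} DEEP CLONE {source}")
```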
Best,
Adis
2 weeks ago
@adisalj I have a small question about how you are handling the deep-cloned data in the target: are you creating managed tables with the data that is cloned into the target? Can you please post a sample query that you use between your catalogs to do the deep clone?
I am facing an issue while trying to map the data I got from the deep clone within the target (e.g., using the same source table DDL in the target): it only creates an empty table with no data.
Wednesday
Hi kathrik_p,
We have a Python notebook that iterates over the schemas in the production catalog and excludes certain schemas (such as information_schema) from the iteration.
The actual deep clone command looks like this: `CREATE OR REPLACE TABLE {target} DEEP CLONE {source}`.
We use deep clones since we use the staging catalogs for testing, and without the data the DBT transformations don't work. Have a look at the Databricks documentation about Delta clones.
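Roughly, the notebook does something like this. The catalog names and the exclusion list are placeholders, not our exact code, and `spark` is the notebook's built-in SparkSession.
```python
# Rough sketch of the clone notebook (placeholder names, not our exact code).
source_catalog = "prod"       # placeholder
target_catalog = "staging"    # placeholder
excluded_schemas = {"information_schema"}

schemas = [r[0] for r in spark.sql(f"SHOW SCHEMAS IN {source_catalog}").collect()]

for schema in schemas:
    if schema in excluded_schemas:
        continue
    spark.sql(f"CREATE SCHEMA IF NOT EXISTS {target_catalog}.{schema}")
    # Note: SHOW TABLES also lists views; views cannot be DEEP CLONEd and
    # would need to be recreated separately in a real run.
    for r in spark.sql(f"SHOW TABLES IN {source_catalog}.{schema}").collect():
        src = f"{source_catalog}.{schema}.{r.tableName}"
        tgt = f"{target_catalog}.{schema}.{r.tableName}"
        spark.sql(f"CREATE OR REPLACE TABLE {tgt} DEEP CLONE {src}")
```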
Best,
Adis