Issue in Databricks Cluster Configuration in Pre Prod Environment
08-05-2025 02:50 AM
Hi team,
Hope you are doing well!
I want to share an incident concerning a difficulty we are facing with a UC-enabled cluster (both interactive and job clusters) in our pre-prod environment: the data in our DataFrames and tables is not being refreshed properly even after a fresh execution triggered from ADF, and the table still shows old data. After one or two further executions, the output becomes correct. So some unexpected internal caching seems to be happening in PPD, even though the PPD cluster has the same configuration as the Dev cluster and the code is identical. Please suggest what we can do to get rid of this issue.
Since this is happening in a client environment, we cannot share any code, but please let us know if there is any additional input we can provide for better understanding.
Regards,
Subhrajyoti Chatterjee
Technology Specialist at Azure Data Engineering Community,
Guild & Community (GnC), Analytics & AI
Cognizant Technology Solutions
Kolkata, India
08-22-2025 01:13 AM
Hi @Subhrajyoti,
Can you please try running REFRESH TABLE table_name when you encounter this issue?
Can you also try disabling Delta caching and check whether it then returns the correct result (spark.databricks.io.cache.enabled false)?
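For reference, the two checks above can be run in a notebook cell on the affected cluster. This is only a sketch; the three-level table name is a placeholder, and the SET command only affects the current session:

```sql
-- Invalidate the cached data and metadata for the table so the next read
-- fetches the latest snapshot (placeholder Unity Catalog table name):
REFRESH TABLE my_catalog.my_schema.my_table;

-- Temporarily disable the Databricks disk (Delta) cache for this session,
-- to rule it out as the source of the stale reads:
SET spark.databricks.io.cache.enabled = false;
```

If the output is correct with the cache disabled, the same setting can instead be applied cluster-wide via the cluster's Spark config to confirm the behavior across ADF-triggered runs.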