shan-databricks
New Contributor III
since 02-13-2025
08-31-2025

User Stats

  • 9 Posts
  • 1 Solution
  • 0 Kudos given
  • 1 Kudos received

User Activity

How can I load only the previous day's data into a newly added column of an existing Delta table? Is there an option to do this without writing any custom logic?
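A minimal sketch of one way to backfill a newly added column with the previous day's data, assuming the values come from a separate source table joined on a key; the table names, key column, date column, and new column below are hypothetical placeholders, not the poster's actual objects.

from pyspark.sql import functions as F
from delta.tables import DeltaTable

# Add the new column to the existing Delta table
spark.sql("ALTER TABLE catalog.schema.target_table ADD COLUMNS (new_col STRING)")

# Keep only the previous day's rows from the source
source_df = (
    spark.table("catalog.schema.source_table")
         .filter(F.col("event_date") == F.date_sub(F.current_date(), 1))
)

# Merge those rows into the target, updating only the newly added column
target = DeltaTable.forName(spark, "catalog.schema.target_table")
(target.alias("t")
       .merge(source_df.alias("s"), "t.id = s.id")
       .whenMatchedUpdate(set={"new_col": "s.new_col"})
       .execute())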
I have one file with 100 rows, of which two rows are bad data and the remaining 98 rows are good data. When I use badRecordsPath, it moves the entire file to the bad records path, good data included, but it should move ...
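A minimal sketch of an alternative to badRecordsPath, assuming the goal is to keep the 98 good rows and isolate only the two bad ones: read in PERMISSIVE mode with a corrupt-record column and split the DataFrame. The schema and file path here are hypothetical placeholders.

from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

schema = StructType([
    StructField("id", IntegerType()),
    StructField("name", StringType()),
    StructField("_corrupt_record", StringType()),  # raw text of any malformed row
])

df = (spark.read
      .schema(schema)
      .option("mode", "PERMISSIVE")
      .option("columnNameOfCorruptRecord", "_corrupt_record")
      .csv("/mnt/raw/input_file.csv"))

df.cache()  # cache before filtering on the corrupt-record column
good_rows = df.filter(F.col("_corrupt_record").isNull()).drop("_corrupt_record")
bad_rows = df.filter(F.col("_corrupt_record").isNotNull())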
Need help resolving this error: com.databricks.sql.transaction.tahoe.DeltaAnalysisException: [_LEGACY_ERROR_TEMP_DELTA_0007] A schema mismatch detected when writing to the Delta table. I am using the below code and my JSON is dynamically changi...
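A minimal sketch of one common fix for this schema-mismatch error, assuming the JSON gains new fields over time and the intent is to let the Delta table's schema evolve on append; the source path and table name are hypothetical placeholders.

df = spark.read.json("/mnt/raw/events/")  # dynamically changing JSON

(df.write
   .format("delta")
   .mode("append")
   .option("mergeSchema", "true")  # let new columns be added to the table schema
   .saveAsTable("catalog.schema.events"))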
I have 50 tables, and the count will grow gradually, so I want to create a single workflow that orchestrates the job and runs it table by table. Is there an option to do this in Databricks Workflows?
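A minimal sketch of one way to fan the same job out across many tables from a single driver notebook, assuming a parameterized child notebook; the notebook path and table list are hypothetical placeholders, and the "For each" task type in Databricks Workflows is another way to achieve this.

# Table list could also come from a config table instead of being hard-coded
tables = ["sales", "customers", "orders"]  # extend to all 50 tables

for table_name in tables:
    # Run the same processing notebook once per table, passing the table name as a parameter
    dbutils.notebook.run("/Workspace/etl/process_table", 3600, {"table_name": table_name})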