Almost the same advice as Hubert: I use the history of the Delta table: df_history.select(F.col('operationMetrics')).collect()[0].operationMetrics['numOutputRows']. You can also find other 'operationMetrics' values, like 'numTargetRowsDeleted'.
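A minimal sketch of that approach, assuming a Delta table at a hypothetical path '/mnt/data/my_table' and an existing SparkSession named spark:

from delta.tables import DeltaTable
import pyspark.sql.functions as F

delta_table = DeltaTable.forPath(spark, "/mnt/data/my_table")  # hypothetical path

# history() returns one row per table version, newest first
df_history = delta_table.history()

# operationMetrics is a map column; grab the metrics of the latest operation
latest_metrics = df_history.select(F.col("operationMetrics")).collect()[0]["operationMetrics"]
print(latest_metrics.get("numOutputRows"))

# Depending on the operation type, other keys are present as well,
# e.g. 'numTargetRowsDeleted' for MERGE operations
print(latest_metrics.get("numTargetRowsDeleted"))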
Here is the resolution for my particular case; it should help you with the first topic issue. We were using those 2 lines of code in a library (a whl file library installed via a Databricks cluster global init script) that we created ourselves (we needed to do that in v9.1 bec...
Hi, we have had the same issue for a few days when executing notebooks from Azure Data Factory: AnalysisException: Undefined function: count. This function is neither a built-in/temporary function, nor a persistent function that is qualified as spark_cat...