Hi @mangosta, the System Tables docs have been updated with additional context on the `statement_text` column here: https://docs.databricks.com/en/admin/system-tables/query-history.html#using-the-query-history-table
Thanks again for letting us know.
Hi @DineshOjha,
You can reference task-level parameters in your notebook code using `my_python_var = dbutils.widgets.get('my-task-level-param')`.
More information on task values can be found here: https://docs.databricks.com/en/jobs/task-values.html#set-task-values.
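For example, here is a minimal sketch meant to run inside a Databricks notebook task; the parameter name `my-task-level-param` is just a placeholder for whatever your job defines:

```python
# Runs inside a Databricks notebook task, where `dbutils` is available.
# Assumes the job passes a task-level parameter named "my-task-level-param".
my_python_var = dbutils.widgets.get("my-task-level-param")
print(f"Received parameter value: {my_python_var}")

# If you need to share a value with a downstream task, task values can be
# used as well (see the task-values docs linked above), e.g.:
# dbutils.jobs.taskValues.set(key="my_key", value=my_python_var)
```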
...
Were you able to review the documentation provided here: https://docs.databricks.com/en/compute/serverless/dependencies.html#install-notebook-dependencies?
Sure, I'll assume you are using OSS Apache Spark outside of Azure Databricks. If so, you can use the delta-spark library at these Maven coordinates: https://mvnrepository.com/artifact/io.delta/delta-spark_2.12.
Further information is available in the Delta ...
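If you're working from PySpark, a minimal sketch looks like the following; it assumes the `delta-spark` pip package is installed (which pulls in the same Maven artifact), and the table path `/tmp/delta-demo` is just an example:

```python
# Assumes OSS Apache Spark with the delta-spark package installed
# (e.g. `pip install delta-spark`).
from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip

# Enable Delta Lake's SQL extensions and catalog on a plain Spark session.
builder = (
    SparkSession.builder.appName("delta-oss-example")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Write and read back a small Delta table to verify the setup.
spark.range(5).write.format("delta").mode("overwrite").save("/tmp/delta-demo")
spark.read.format("delta").load("/tmp/delta-demo").show()
```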
Seems like a duplicate: https://community.databricks.com/t5/data-engineering/urgent-need-information-details-and-reference-link-on-below-two/td-p/107260