Hi, what is the maximum size that can be read using dbutils.fs.head()? Is there a limit? An AI assistant says 10 MB, but I couldn't find anything useful in the documentation, while when I tried it in an actual workspace it seemed to be limited only by driver memory. Thanks in advance.
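For reference, dbutils.fs.head(path, maxBytes) takes an explicit maxBytes argument (the documented default is 65536 bytes), so the practical cap is whatever you pass plus driver memory. Since dbutils only exists inside a Databricks runtime, here is a plain-Python sketch of the equivalent behavior (the file and sizes are illustrative, not from the original post):

```python
import os
import tempfile

def head(path, max_bytes=65536):
    """Read at most max_bytes from the start of a file,
    mimicking what dbutils.fs.head(path, maxBytes) returns."""
    with open(path, "rb") as f:
        return f.read(max_bytes).decode("utf-8", errors="replace")

# Demo with a throwaway 100 KB file.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("x" * 100_000)
    path = f.name

print(len(head(path)))      # capped at the 65536-byte default
print(len(head(path, 10)))  # explicit 10-byte cap
os.remove(path)
```

Passing a very large maxBytes shifts the limit to driver memory, which matches what the post observed.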
Hi, I want to set a default ACL that applies to all created jobs and clusters, for example via a cluster policy, but currently I need to apply my ACL to every created job/cluster separately. Is there a way to do that? BR,
Hi, is there a way to give an asset bundle file a custom name and pass that to databricks bundle deploy? Right now I must use databricks.yml, so my question is whether there is a way to pass a custom file name. Note that I don't want to embed a file...
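One possible workaround, offered as an assumption rather than an official feature (the bundle CLI looks for databricks.yml in the bundle root, and I am not aware of a flag to rename it): keep the config under the custom name and link or copy it to databricks.yml just before deploying. A sketch with a made-up file name:

```shell
# Hypothetical custom-named bundle config.
printf 'bundle:\n  name: demo\n' > my_bundle.yml

# Point the expected name at the custom file (a plain copy also works
# if symlink handling in your CLI version is unclear).
ln -sf my_bundle.yml databricks.yml

# databricks bundle deploy -t dev   # then deploy as usual
ls -l databricks.yml
```

This keeps the custom name as the source of truth while satisfying the CLI's fixed file-name expectation.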
This has happened a lot in the previous weeks, although both Azure and Databricks showed no issues at the time the error was received by both the Databricks Python SDK and the Java SDK. I have now started building a retry mechanism to retry those errors selectively...
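A selective retry mechanism like the one described can be sketched with the Python stdlib alone; the exception types, attempt count, and backoff delays below are assumptions to adapt to whatever the SDKs actually raise, not Databricks-specific:

```python
import functools
import time

def retry_on(exceptions, attempts=4, base_delay=0.5):
    """Retry only the listed exception types, with exponential backoff;
    any other exception propagates immediately (selective retry)."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except exceptions:
                    if attempt == attempts - 1:
                        raise  # retries exhausted
                    time.sleep(base_delay * (2 ** attempt))
        return wrapper
    return decorator

# Hypothetical flaky call: fails twice with a retryable error, then succeeds.
calls = {"n": 0}

@retry_on((ConnectionError, TimeoutError), attempts=5, base_delay=0.01)
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(flaky())  # succeeds on the third attempt
```

The key design point is the tuple of retryable exceptions: anything outside it (for example an auth error) fails fast instead of wasting retries.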
I am using dbutils.fs.mv() on Databricks clusters and the move operation is very slow. I move files in UC Volumes or to ADLS storage via abfss links, which works but takes hours to transfer files that used to take...
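One generic way to speed up bulk moves is to parallelize per-file transfers instead of moving sequentially. The sketch below uses plain Python with local throwaway directories (the paths are illustrative; on Databricks you would point this at Volume mount paths, or reach for a dedicated tool such as azcopy for ADLS-to-ADLS transfers):

```python
import shutil
import tempfile
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def parallel_move(src_dir, dst_dir, workers=8):
    """Move every file directly under src_dir to dst_dir,
    using a thread pool so transfers overlap."""
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    files = [p for p in src.iterdir() if p.is_file()]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(lambda p: shutil.move(str(p), str(dst / p.name)), files))
    return len(files)

# Demo with throwaway directories and five small files.
src = Path(tempfile.mkdtemp())
dst = Path(tempfile.mkdtemp()) / "out"
for i in range(5):
    (src / f"f{i}.txt").write_text("data")

moved = parallel_move(src, dst)
print(moved)  # 5
```

Threads help here because each move is I/O-bound; for object stores the real win usually comes from issuing many concurrent requests rather than one serial stream.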