Hi @aliz, my suggestion would be not to rely on this. When you repair a run, Databricks creates a new job cluster and only re-runs the selected subset of tasks, so your setup task is skipped and the fresh cluster is missing its dependencies.
Some quick c...
Hi @srijan1881, what do you mean by logs here? If you mean tracing step-by-step invocations etc. on the model serving side, you need to add these environment variables to the served model (Serving > your endpoint > Edit endpoint > Environment variabl...
System tables are a Databricks‑hosted, read‑only analytical store shared to your workspace via Delta Sharing; they aren’t modifiable (no indexes you can add), and the first read can have extra overhead on a very small warehouse. This can make “simple...
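As a rough illustration of the kind of "simple" query this affects (a sketch only; which system schemas exist depends on what your account admin has enabled), a first read of a billing system table on a small warehouse might look like:

```sql
-- Hedged sketch: system tables are read-only and served via Delta Sharing,
-- so the first scan can be slower than a comparable query on a local Delta table.
SELECT
  usage_date,
  sku_name,
  SUM(usage_quantity) AS total_usage
FROM system.billing.usage
WHERE usage_date >= current_date() - INTERVAL 7 DAYS
GROUP BY usage_date, sku_name
ORDER BY usage_date;
```

Running the same query a second time is typically noticeably faster once the warehouse has warmed up.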
@hobrob_ex, yes, this is possible, but not via HTML; instead, you will have to use the markdown rendering formats.
Add `# Heading 1`, `## Heading 2`, and so on in a text cell (the +Text button) of the notebook. Once these headings/sections that you want are con...
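For example, a text cell containing something like the following (heading texts are placeholders) renders as nested, collapsible sections in the notebook:

```md
# Data loading
## Read raw files
# Transformations
## Join and aggregate
```

Note the space after each `#` — without it, markdown will not render the line as a heading.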
+1 to all the above comments. Mixing the %run magic with other commands in the same cell will confuse the REPL execution. So put the %run notebook_b3 command alone in a new cell, ideally the first cell of notebook_a, which will resolve the issue, and your...
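A minimal sketch of the intended cell layout (notebook names and the helper variable are placeholders):

```
# Cell 1 of notebook_a — %run must be the only command in its cell:
%run ./notebook_b3

# Cell 2 — anything defined in notebook_b3 is now in scope:
print(value_defined_in_b3)
```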