- 3638 Views
- 7 replies
- 3 kudos
I am trying to call run-now with notebook_params in the Azure Databricks CLI, following https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/jobs-cli, and escape the quotes as stated in the documentation https://learn.microsoft.com/en-us/azure/d...
Latest Reply
I have the latest Databricks CLI set up and configured on my Ubuntu VM. When I tried to run a job using the JSON template I generated with databricks jobs get 'xxxjob_idxxx' > orig.json, it throws an unknown error. Databricks CLI v0.216.0. databricks job...
6 More Replies
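A frequent cause of quoting failures with run-now is the shell consuming one layer of quotes before the CLI ever sees the JSON. A minimal sketch (the parameter names are hypothetical) that uses Python's json.dumps to produce a correctly escaped notebook_params string:

```python
import json

# Hypothetical notebook parameters for the run-now call
params = {"env": "dev", "input_path": "/mnt/raw/input"}

# json.dumps emits valid JSON, including any escaping the values need
arg = json.dumps(params)
print(arg)

# The resulting string can then be passed to the CLI, e.g. in bash:
#   databricks jobs run-now --job-id 123 --notebook-params '{"env": "dev", "input_path": "/mnt/raw/input"}'
# Single quotes keep the shell from consuming the inner double quotes.
```

Building the string programmatically and pasting it into the command avoids hand-counting backslashes.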
- 509 Views
- 0 replies
- 0 kudos
I know that there is already the Databricks (technically Spark) integration for DataDog. Unfortunately, that integration only covers the cluster execution itself and that means only Cluster Metrics and Spark Jobs and Tasks. I'm looking for somethin...
- 968 Views
- 2 replies
- 2 kudos
Graviton instances do not support Container services on paper (https://docs.databricks.com/clusters/graviton.html#unsupported-features) but if you try to build Docker ARM image and run it on Graviton, it will work. Does anyone use this combination in...
Latest Reply
Graviton is not supported by Databricks Container Services. How are you planning to run it on Databricks? Please tag @Debayan with your next comment so that I will get notified. Thank you!
1 More Replies
- 1416 Views
- 2 replies
- 0 kudos
I copied my question from a very old question/post that I responded to, and decided to move it here. Context: I have a JAR (Scala) using scala pureconfig (a wrapper of Typesafe Config) and uploaded an application.conf file to a path which is mounted to the wor...
Latest Reply
We had to put the conf in the root folder of the mounted path, and that works. Maybe the mounted storage account being Blob instead of ADLS Gen2 is causing the issues.
1 More Replies
- 2741 Views
- 11 replies
- 2 kudos
We are trying to execute Databricks jobs with the dbt task type, but they are failing to authenticate to Git. The problem is that the job is created using a service principal, but the service principal doesn't seem to have access to the repo. A few questions we have: 1) can we giv...
Latest Reply
Hi @Rahul Samant, I'm sorry you could not find a solution to your problem in the answers provided. Our community strives to provide helpful and accurate information, but sometimes an immediate solution may only be available for some issues. I suggest p...
10 More Replies
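One direction worth checking (an assumption based on the Git Credentials API, not a confirmed fix for this thread) is registering a Git credential while authenticated as the service principal itself, via POST /api/2.0/git-credentials. A sketch of the payload; the username and token values are placeholders:

```python
import json

# Hypothetical payload for POST /api/2.0/git-credentials, sent while
# authenticated as the service principal that owns the job
payload = {
    "git_provider": "azureDevOpsServices",   # assumption: an Azure DevOps repo
    "git_username": "ci-service-principal",  # placeholder
    "personal_access_token": "<token>",      # placeholder
}
print(json.dumps(payload))
```

The service principal needs its own credential record; credentials registered by a human user do not carry over to jobs the principal runs.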
- 1482 Views
- 1 reply
- 2 kudos
Hello, I want to execute custom code onApplicationEnd. Outside Databricks, I have used the Spark listener onApplicationEnd without problems, but it is not working on Databricks (I tried the listener onJobEnd and that one worked). I have also tried Spark ...
Latest Reply
Did you find any solution?
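As a driver-side workaround (this is Python's atexit hook, not the Spark listener API, and only a sketch), cleanup code can be registered to run when the driver's Python process shuts down:

```python
import atexit

def on_application_end():
    # Cleanup or logging that would otherwise live in
    # SparkListener.onApplicationEnd
    print("application ending: flushing logs")

# Registered callbacks run when the Python interpreter exits normally
atexit.register(on_application_end)
```

This fires on interpreter exit rather than on Spark application end, so it is only an approximation; a Scala SparkListener remains the canonical route when it works.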
- 1319 Views
- 1 reply
- 1 kudos
In Databricks jobs on Azure you can use the {{run_id}} and {{parent_run_id}} variables for a specific run: https://docs.databricks.com/workflows/jobs/jobs.html For Databricks jobs with two or more tasks, {{run_id}} seems to correspond to task...
Latest Reply
@Kasper H: Yes, you are correct in your understanding that in Databricks jobs with multiple tasks, the {{run_id}} variable corresponds to the task_run_id and the {{parent_run_id}} variable corresponds to the job_run_id. For Databricks jobs with only ...
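For reference, a sketch of how these variables are typically wired into a task's base_parameters (the notebook path and parameter names here are hypothetical):

```json
{
  "notebook_task": {
    "notebook_path": "/Shared/example_notebook",
    "base_parameters": {
      "task_run_id": "{{run_id}}",
      "job_run_id": "{{parent_run_id}}"
    }
  }
}
```

Databricks substitutes the variables at run time, so each task sees its own run_id while all tasks in the run share the same parent_run_id.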
- 1265 Views
- 5 replies
- 2 kudos
Currently we are creating and monitoring jobs using the API, which results in a lot of polling for job status. Is there a Kafka stream we could listen to for job updates, to significantly reduce the number of calls to the Databricks jobs...
Latest Reply
Hi @Ryan Hager, hope everything is going great. Just wanted to check in if you were able to resolve your issue. If yes, would you be happy to mark an answer as best so that other members can find the solution more quickly? If not, please tell us so we...
4 More Replies
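I am not aware of a public Kafka feed of job events; until push-style delivery is available, one way to cut call volume is exponential backoff between polls. A runnable sketch, with a stub standing in for a call to the Jobs API (GET /api/2.1/jobs/runs/get):

```python
import time

def poll_until_done(get_state, initial_delay=1.0, max_delay=60.0, backoff=2.0):
    """Poll get_state() with exponential backoff until a terminal state."""
    delay = initial_delay
    while True:
        state = get_state()
        if state in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
            return state
        time.sleep(delay)
        delay = min(delay * backoff, max_delay)

# Stub standing in for the Jobs API call; a real get_state would fetch
# the run's life_cycle_state over HTTP
_states = iter(["PENDING", "RUNNING", "RUNNING", "TERMINATED"])
print(poll_until_done(lambda: next(_states), initial_delay=0.01))
```

Doubling the delay between polls keeps short jobs responsive while long-running jobs generate only a handful of API calls per hour.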
- 449 Views
- 1 reply
- 3 kudos
Updated UI for Create, run, and manage Databricks Jobs
Latest Reply
Hi Ajay, Could you please clarify the ask here? Thanks.
- 2290 Views
- 7 replies
- 6 kudos
Hi, We currently leverage Azure DevOps to source control our notebooks and use CICD to publish the notebooks to different environments and this works very well. We do not have the same functionality available for Databricks jobs (the ability to sourc...
Latest Reply
My team is currently looking at establishing repos for source control to start. I know I've seen some documentation for auto-updating the main branch in the DBX remote repo when a MERGE is completed. Does anyone have a template and/or best practices ...
6 More Replies
- 1120 Views
- 2 replies
- 0 kudos
Hi, is there any way to share the run_id from a task_A to a task_B within the same job when task_A is a dbt task?
Latest Reply
Hi, you can pass {{job_id}} and {{run_id}} in the job arguments, print that information, and save it wherever it is needed. Please find the documentation for this below: https://docs.databricks.com/data-engineering/jobs/jobs.html#task-parameter-varia...
1 More Replies
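Since both tasks run inside the same job, one hedged option is to key off {{parent_run_id}}, which is identical for every task in the run. A sketch of the job settings (task keys, paths, and the dbt var name are hypothetical):

```json
{
  "tasks": [
    {
      "task_key": "task_A",
      "dbt_task": {
        "commands": ["dbt run --vars '{\"job_run_id\": \"{{parent_run_id}}\"}'"]
      }
    },
    {
      "task_key": "task_B",
      "notebook_task": {
        "notebook_path": "/Shared/task_b",
        "base_parameters": {"job_run_id": "{{parent_run_id}}"}
      }
    }
  ]
}
```

Both tasks then receive the same identifier without the dbt task having to export anything itself.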
- 1211 Views
- 1 reply
- 1 kudos
Recently my Databricks jobs have failed with the error message: Failure starting repl. Try detaching and re-attaching the notebook.
java.lang.Exception: Python repl did not start in 30 seconds seconds.
at com.databricks.backend.daemon.driver.Ipyker...
Latest Reply
Yes, you can use a retry. If it still isn't resolved, raise a support ticket with Databricks.
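A runnable sketch of the retry idea, wrapping the flaky step in a bounded retry loop (the exception type, attempt count, and delay are placeholders):

```python
import time

def with_retries(fn, attempts=3, delay=1.0):
    """Call fn(), retrying up to `attempts` times on failure."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            if attempt == attempts:
                raise
            print(f"attempt {attempt} failed ({exc}); retrying in {delay}s")
            time.sleep(delay)

# Stub standing in for the step that intermittently fails to start the repl
calls = {"n": 0}
def flaky_start():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("Python repl did not start")
    return "started"

print(with_retries(flaky_start, attempts=5, delay=0.01))
```

For job tasks, the built-in max_retries setting in the task definition achieves the same effect without custom code.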
- 706 Views
- 1 reply
- 0 kudos
I am configuring a Databricks job using multiple notebooks that depend on each other. All the notebooks are parameterized and use similar parameters. How can I configure the parameters at a global level so that all the notebooks can consume...
Latest Reply
Actually, it is very hard, but as an alternative you can change your code and use the widgets feature of Databricks. Maybe this is not the right option, but you can still explore this doc for testing purposes: https://docs.databric...
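A sketch of the widgets pattern, wrapped so every notebook reads the same parameter names with shared defaults (the parameter name and default are hypothetical; dbutils is provided by the Databricks runtime, not imported):

```python
def get_param(name, default):
    """Read a job/notebook parameter via dbutils.widgets, with a fallback default."""
    try:
        return dbutils.widgets.get(name)  # noqa: F821 - defined by the Databricks runtime
    except Exception:
        # dbutils is undefined outside Databricks, and the widget may not exist
        return default

env = get_param("env", "dev")
print(env)
```

Keeping this helper in a shared notebook that the others %run gives a single place to define the "global" parameter names and defaults.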
- 4760 Views
- 15 replies
- 39 kudos
I have a Databricks job running in Azure Databricks. A similar job is also running in Databricks on GCP. I would like to compare the costs. If I assign a custom tag to the job cluster running in Azure Databricks, I can see the cost incurred by that job i...
Latest Reply
In Azure, you can use Cost Management to track the expenses incurred by your Databricks instance.
14 More Replies
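For reference, custom tags sit under the job cluster spec; a sketch (the tag names and values are hypothetical, and the node type is Azure-specific):

```json
{
  "new_cluster": {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2,
    "custom_tags": {
      "cost-center": "team-data-eng",
      "job-name": "nightly-etl"
    }
  }
}
```

On GCP, custom tags should propagate to the underlying resources as labels, so the same tag key can serve as the grouping dimension for cost breakdowns on both clouds.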
- 1073 Views
- 2 replies
- 0 kudos
We have used the following example to successfully create a distributed deep learning training notebook (https://www.databricks.com/blog/2022/09/07/accelerating-your-deep-learning-pytorch-lightning-databricks.html), and it works as expected. We now want to...
Latest Reply
Hi @Sergii Ivakhno, we haven't heard from you since my last response, and I was checking back to see if my suggestions helped you. If you have a solution, please share it with the community, as it can be helpful to others. Als...
1 More Replies