Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

User16826990884
by New Contributor III
  • 3669 Views
  • 3 replies
  • 0 kudos

Version control jobs

How do engineering teams out there version control their jobs? If there is a production issue, can I revert to an older version of the job?

Latest Reply
Rom
New Contributor III
  • 0 kudos

You can use version-controlled source code for your Databricks job; each time you need to roll back to an older version of the job, you just move to the older version of the code. For version-controlled source code you have multiple choices: Use a noteb...

2 More Replies
Serhii
by Contributor
  • 1868 Views
  • 3 replies
  • 1 kudos

Could not launch jobs due to node_type_id (instance) unavailability

I am running an hourly job on a cluster using a p3.2xlarge GPU instance, but sometimes the cluster couldn't start due to instance unavailability. I wonder if there is any fallback mechanism to, for example, try a different instance type if one is not availabl...

Latest Reply
abagshaw
New Contributor III
  • 1 kudos

(AWS only) For anyone experiencing capacity-related cluster launch failures on non-GPU instance types, AWS Fleet instance types are now GA and available for clusters and instance pools. They help improve the chance of a successful cluster launch by allowi...

2 More Replies
thib
by New Contributor III
  • 5484 Views
  • 3 replies
  • 2 kudos

Can we use multiple git repos for a job running multiple tasks?

I have a job running multiple tasks: Task 1 runs a machine learning pipeline from git repo 1. Task 2 runs an ETL pipeline from git repo 1. Task 2 is actually a generic pipeline and should not be checked in repo 1, and will be made available in another re...

Latest Reply
trijit
New Contributor II
  • 2 kudos

The way to go about this would be to create Databricks Repos in the workspace and then use those in the task definitions. This way we can refer to multiple repos in different tasks.

2 More Replies
RJB
by New Contributor II
  • 11148 Views
  • 6 replies
  • 0 kudos

Resolved! How to pass outputs from a python task to a notebook task

I am trying to create a job which has 2 tasks as follows: A Python task which accepts a date and an integer from the user and outputs a list of dates (say, a list of 5 dates in string format). A notebook which runs once for each of the dates from the d...

Latest Reply
BilalAslamDbrx
Databricks Employee
  • 0 kudos

Just a note that this feature, Task Values, has been generally available for a while.

5 More Replies
swetha
by New Contributor III
  • 2788 Views
  • 3 replies
  • 4 kudos

Resolved! Retrieving the job IDs of notebooks running inside tasks

I have created a job. Inside the job I have created independent tasks, and I have used concurrent futures to achieve parallelism; in each task there are a couple of notebooks running (which are independent). Each notebook running ha...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @swetha kadiyala​, hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. Th...

2 More Replies
sawya
by New Contributor II
  • 2425 Views
  • 3 replies
  • 0 kudos

Migrate workspaces to another AWS account

Hi everyone, I have a Databricks workspace in an AWS account that I have to migrate to a new AWS account. Do you know how I can do it? Or is it better to recreate a new one and move all the notebooks? And if I choose to create a new one, how can I export ...

Latest Reply
Abishek
Databricks Employee
  • 0 kudos

@AMADOU THIOUNE​ Can you check the link below to export the job runs? https://docs.databricks.com/jobs.html#export-job-runs. Try to reuse the same job_id with the /update and /reset endpoints; it should give you much better access to previous run re...

2 More Replies
Sunny
by New Contributor III
  • 7751 Views
  • 6 replies
  • 1 kudos

Using Thread.sleep in Scala

We need to hit a REST web service every 5 minutes until a success message is received. The Scala object is inside a JAR file and gets invoked by a Databricks task within a workflow. Thread.sleep(5000) is working fine, but I'm not sure if it is safe practice or is t...

Latest Reply
Vartika
Databricks Employee
  • 1 kudos

Hey there @Sundeep P​, hope all is well! Just wanted to check in to see if you were able to resolve your issue. If so, would you be happy to share the solution or mark an answer as best? Otherwise, please let us know if you need more help. We'd love to hear from you. C...

5 More Replies
Serhii
by Contributor
  • 3107 Views
  • 1 reply
  • 1 kudos

Resolved! Behaviour of cluster launches in multi-task jobs

We are adapting the multi-task workflow example from the dbx documentation for our pipelines: https://dbx.readthedocs.io/en/latest/examples/python_multitask_deployment_example.html. As part of the configuration we specify the cluster configuration and provide ...

Latest Reply
User16873043099
Contributor
  • 1 kudos

Tasks within the same multi-task job can reuse clusters. A shared job cluster allows multiple tasks in the same job to use the cluster. The cluster is created and started when the first task using the cluster starts and terminates after the last ...

MadelynM
by Databricks Employee
  • 5358 Views
  • 2 replies
  • 3 kudos

How do I move existing workflows and jobs running on an all-purpose cluster to a shared jobs cluster?

A Databricks cluster is a set of computation resources that performs the heavy lifting of all of the data workloads you run in Databricks. Databricks provides a number of options when you create and configure clusters to help you get the best perform...

[Screenshots: left navigation bar with Data Science & Engineering selected; Workflows selected in the left nav]
Latest Reply
Anonymous
Not applicable
  • 3 kudos

@Doug Harrigan​ Thanks for your question! @Prabakar Ammeappin​ linked above to our Docs page that mentions a bit more about the recent (April) version update/change: "This release fixes an issue that removed the Swap cluster button from the Databrick...

1 More Reply
Sunny
by New Contributor III
  • 7492 Views
  • 7 replies
  • 4 kudos

Resolved! Retrieve job ID and run ID from Scala

I need to retrieve the job ID and run ID of the job from a JAR file in Scala. When I try to compile the code below in IntelliJ, the error below is shown.

import com.databricks.dbutils_v1.DBUtilsHolder.dbutils

object MainSNL {

  @throws(classOf[Exception])
  de...

Latest Reply
Mohit_m
Valued Contributor II
  • 4 kudos

Maybe it's worth going through the Task Parameter variables section of the doc below: https://docs.databricks.com/data-engineering/jobs/jobs.html#task-parameter-variables

6 More Replies
Mohit_m
by Valued Contributor II
  • 4840 Views
  • 1 reply
  • 2 kudos

Resolved! Databricks jobs create API throws unexpected error

The Databricks jobs create API throws an unexpected error. Error response:
{"error_code": "INVALID_PARAMETER_VALUE", "message": "Cluster validation error: Missing required field: settings.cluster_spec.new_cluster.size"}
Any idea on this?

Latest Reply
Mohit_m
Valued Contributor II
  • 2 kudos

Could you please specify num_workers in the JSON body and try the API again? Also, another recommendation: configure what you want in the UI, then press the "JSON" button, which should show the corresponding JSON that you can use with the API.

Maverick1
by Valued Contributor II
  • 6036 Views
  • 2 replies
  • 8 kudos

Resolved! How to get the list of all jobs available for a particular user?

As of now, if I try to list the jobs via the "list job" API, there is a limit of 25 jobs only. Is there a way to list all the jobs available/visible to a user?

Latest Reply
User16764241763
Honored Contributor
  • 8 kudos

Hello @Saurabh Verma​ Can the user generate the API token in the workspace and try to use the API?

1 More Reply
Robbie
by New Contributor III
  • 3170 Views
  • 2 replies
  • 4 kudos

Resolved! Why can't I create new jobs? ("You are not entitled to run this type of task...")

This morning I encountered an issue when trying to create a new job using the Workflows UI (in browser). I never had this issue before. The error message that appears is: "You are not entitled to run this type of task, please contact your Databricks admi...

[Screenshot including the error message]
Latest Reply
Robbie
New Contributor III
  • 4 kudos

@Kaniz Fatma​ @Philip Nord​, thanks! I was able to do what I needed by cloning an existing job and modifying it. It's fine as a temporary fix for now. Thanks again for the response; good to know you're aware of it and that this isn't anything on my end.

1 More Reply
Sunny
by New Contributor III
  • 1164 Views
  • 1 reply
  • 0 kudos

Update task status from external application

I have a workflow with a task that depends on an external application's execution (the application does not reside in Databricks). After the external application finishes, how do I update the status of the task to complete? Currently, the Jobs API doesn't support status updat...

Latest Reply
Sunny
New Contributor III
  • 0 kudos

Any inputs on this one, please?

JBear
by New Contributor III
  • 4151 Views
  • 4 replies
  • 4 kudos

Resolved! Can't find the reason, but suddenly new jobs are getting huge job ID numbers, e.g. 945270539673815

Created job IDs have suddenly started to be huge numbers, and that is now causing problems in the Terraform plan because the int is too big: Error: strconv.ParseInt: parsing "945270539673815": value out of range. I'm new on the board and pretty new with Databricks ...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Jere Karhu​, in case you are using the job/run ID in the API, please be advised that you will need to change the client-side logic to process int64/long and expect a random number. In some cases, you just need to change the declared type in their so...

3 More Replies