Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Deepak_Goldwyn
by New Contributor III
  • 3381 Views
  • 5 replies
  • 2 kudos

Resolved! DLT Pipeline and Job Cluster

We have written a few Python functions (methods within a class) and packaged them as a wheel library. In the as-is situation we used to install that wheel library on an All-Purpose cluster that we had already created. It works fine. In the to-be situation (D...

Latest Reply
tomasz
Databricks Employee
  • 2 kudos

Does it give you an error when running the DLT pipeline specifically on the %pip command, or does it not work in some other way? If it's the former, could you share the path format that you're using with the %pip command?

4 More Replies
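The resolution in this thread hinges on installing the packaged wheel from the DLT pipeline notebook itself rather than on a pre-configured All-Purpose cluster. Below is a minimal sketch of that pattern; the wheel path, package name, imported helper, and source path are placeholders for illustration, not the poster's actual library, and the %pip line would sit in its own cell at the top of the notebook.

```python
# Hypothetical DLT pipeline notebook sketch (placeholder names and paths).
# The %pip install runs before any imports or table definitions.
%pip install /dbfs/FileStore/libs/my_helpers-0.1.0-py3-none-any.whl

import dlt
from my_helpers import clean_columns  # helper packaged in the wheel (assumed name)

@dlt.table(name="cleaned_events")
def cleaned_events():
    raw = spark.read.json("/mnt/raw/events")  # example source path
    return clean_columns(raw)
```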
rbarrero
by New Contributor III
  • 6230 Views
  • 9 replies
  • 7 kudos

Resolved! Error saving changes on Job Cluster

Hello all, and thanks. After enabling serving for a model, I go to edit the corresponding Job Cluster to configure its init_script, but when I try to save the changes (Confirm and restart) it throws the following error: Error: Cannot edit cluster 0503-141315-hu3wd4i...

Latest Reply
rbarrero
New Contributor III
  • 7 kudos

Sorry for the delay in responding. A partner was finally able to fix the problem; he can now edit the cluster and add the init_script without issues. Thank you!

8 More Replies
Deepak_Bhutada
by Contributor III
  • 2264 Views
  • 3 replies
  • 3 kudos

Retrieve workspace instance name on E2 architecture (multi-tenant) in notebook running on job cluster

I have a Databricks job on the E2 architecture in which I want to retrieve the workspace instance name within a notebook running in a job cluster context, so that I can use it further in my use case. While the call dbutils.notebook.entry_point.getDbutils(...

Latest Reply
Thomas_B_
New Contributor II
  • 3 kudos

Found a workaround for the Azure Databricks question above: dbutils.notebook.getContext().apiUrl will return the regional URI, but this forwards to the workspace-specific one if the workspace ID is specified with o=.

2 More Replies
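For readers hitting the same question, here is a small sketch of the workaround described in the reply, written for a Python notebook on a job cluster. Both calls are illustrative and their behavior can differ by cloud and runtime version; the spark.conf tag is a commonly used alternative and is not something taken from this thread.

```python
# Sketch: recovering the workspace URL from inside a job-cluster notebook.

# The reply's approach: the API URL from the notebook context. On Azure this is
# the regional URI, which forwards to the workspace when ?o=<workspace-id> is added.
api_url = (
    dbutils.notebook.entry_point.getDbutils()
    .notebook()
    .getContext()
    .apiUrl()
    .getOrElse(None)
)

# Commonly used alternative (not from this thread): the workspace host exposed
# as a Spark conf tag on recent runtimes.
workspace_host = spark.conf.get("spark.databricks.workspaceUrl", None)

print(api_url, workspace_host)
```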
Jin_Kim
by New Contributor II
  • 5781 Views
  • 2 replies
  • 4 kudos

Resolved! How to use multiple Spark streaming jobs connecting to one job cluster

Hi, we have a scenario where we need to deploy 15 Spark streaming applications on Databricks, reading from Kafka, onto a single job cluster. We tried the following approach: 1. create job 1 with a new job cluster (C1) 2. create job 2 pointing to C1... 3. create job15...

Latest Reply
Jin_Kim
New Contributor II
  • 4 kudos

@Hubert Dudek, thanks a lot for responding. When we have a setup like this, if one task fails, it will not terminate the entire job, right? Since the job is continuously running as a streaming app, is it possible to add a new task to the job (while i...

1 More Replies
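For context, the shape of the suggested setup is a single multi-task job (Jobs API 2.1) whose tasks all reference one shared job cluster via job_cluster_key, instead of 15 separate jobs. The sketch below is illustrative only; the cluster size, runtime, notebook paths, and task names are invented, not the poster's configuration.

```python
# Hypothetical Jobs API 2.1 job spec: 15 streaming notebook tasks sharing one job cluster.
job_spec = {
    "name": "streaming-apps",
    "job_clusters": [
        {
            "job_cluster_key": "shared_streaming_cluster",
            "new_cluster": {
                "spark_version": "11.3.x-scala2.12",  # placeholder runtime
                "node_type_id": "i3.xlarge",          # placeholder node type
                "num_workers": 8,
            },
        }
    ],
    "tasks": [
        {
            "task_key": f"stream_{i:02d}",
            "job_cluster_key": "shared_streaming_cluster",  # every task reuses the same cluster
            "notebook_task": {"notebook_path": f"/Repos/streaming/app_{i:02d}"},  # placeholder paths
        }
        for i in range(1, 16)
    ],
}
```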
Manoj
by Contributor II
  • 1709 Views
  • 2 replies
  • 5 kudos

Resolved! Does a job cluster help jobs that are fighting for resources on an all-purpose cluster?

Hi Team, does a job cluster help jobs that are fighting for resources on an all-purpose cluster? With a job cluster, the drawback I see is the creation of a cluster every time the job starts; it's taking 2 minutes to spin up the cluster. Instead of...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 5 kudos

@Manoj Kumar Rayalla, in the job you can set it to use an all-purpose cluster (that feature was added recently). You can also use a pool to limit job cluster start-up time (though it can still take a moment).

1 More Replies
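As a concrete illustration of the pool suggestion in the reply, a job's new_cluster can point at a pre-warmed instance pool so the run picks up idle VMs instead of provisioning fresh ones. This is a hedged sketch with placeholder values, not configuration from the thread.

```python
# Hypothetical job cluster definition that draws its nodes from an instance pool.
new_cluster = {
    "spark_version": "11.3.x-scala2.12",        # placeholder runtime
    "instance_pool_id": "1234-567890-pool123",  # placeholder pool ID
    "num_workers": 4,
}
```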
Jed
by New Contributor II
  • 4558 Views
  • 2 replies
  • 0 kudos

Enabled the Jobs API 2.1 feature and unable to create a shared job cluster

Hello, we enabled the Jobs API 2.1 feature, and when I attempt to create a "shared" job cluster in the configuration I always get this response: {'error_code': 'FEATURE_DISABLED', 'message': 'Shared job cluster feature is not enabled.'} Please could you...

Latest Reply
Jed
New Contributor II
  • 0 kudos

I am able to access it now. To summarize the problem from my perspective: the shared cluster API did not work as expected, and some direct manual intervention by Databricks support was required.

1 More Replies
MattM
by New Contributor III
  • 2315 Views
  • 3 replies
  • 2 kudos

Resolved! Pricing Spot Instance vs New Job Cluster

We are running multiple Databricks jobs via ADF. I was wondering which of the options below is the cheaper route for Databricks notebook processing from ADF. When I create an ADF linked service, which should I use to lower my cost? New Job Cluster opti...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

The instance pool will be cheaper if you use spot instances, but only if you size your instance pool correctly (number of workers and scale-down time). AFAIK you cannot use spot instances for job clusters in ADF.

2 More Replies
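To make the sizing advice above concrete, below is a sketch of an instance pool backed by spot instances (AWS fields). All values are placeholders and not a cost recommendation; actual savings depend on the workload and on sizing the pool correctly.

```python
# Hypothetical Instance Pools API payload: spot-backed pool with a small warm
# minimum and an idle auto-termination window (the "scale down time" mentioned above).
pool_spec = {
    "instance_pool_name": "adf-jobs-spot-pool",
    "node_type_id": "i3.xlarge",
    "min_idle_instances": 2,
    "max_capacity": 20,
    "idle_instance_autotermination_minutes": 15,
    "aws_attributes": {"availability": "SPOT"},
}
```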
User16752239289
by Databricks Employee
  • 1701 Views
  • 1 replies
  • 1 kudos

Resolved! Failed to add S3 init script in job cluster

I use the payload below to submit my job, which includes an init script saved on S3. The instance profile and init script worked on an interactive cluster, but when I move to a job cluster the init script cannot be configured. { "new_cluster": { "spar...

Latest Reply
User16752239289
Databricks Employee
  • 1 kudos

It is because the region is missing. For an init script saved in S3, the region field is required. The init script section should look like the below: "init_scripts": [ { "s3": { "destination": "s3://<my bucket>...

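For completeness, here is a sketch of a job cluster definition whose S3 init script includes the required region field, as the reply describes. The bucket, script path, region, runtime, and instance profile are placeholders rather than the original poster's values.

```python
# Hypothetical new_cluster block with an S3 init script; note the "region" key,
# which is what the truncated reply above says was missing.
new_cluster = {
    "spark_version": "11.3.x-scala2.12",  # placeholder runtime
    "node_type_id": "i3.xlarge",          # placeholder node type
    "num_workers": 2,
    "aws_attributes": {
        "instance_profile_arn": "arn:aws:iam::123456789012:instance-profile/my-profile"  # placeholder
    },
    "init_scripts": [
        {
            "s3": {
                "destination": "s3://my-bucket/init/install-libs.sh",  # placeholder script
                "region": "us-east-1",                                 # required for S3 init scripts
            }
        }
    ],
}
```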