Data Engineering
Forum Posts

Praveen2609
by New Contributor
  • 1181 Views
  • 2 replies
  • 0 kudos

dbfs access for job clusters and interactive cluster

Hi All, I am new to Databricks and need some understanding for my requirement. Our requirement: (a) we have a zip file in Azure Blob Storage; we bring that file to DBFS, unzip it, and execute our transformations in multiple steps (3 steps...
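The first step the poster describes (land a zip on DBFS, then unzip it) can be sketched with the standard-library `zipfile` module. This is a minimal sketch, not the poster's actual code; the paths are hypothetical, and on Databricks a DBFS path such as `dbfs:/landing/archive.zip` is visible to local-file APIs as `/dbfs/landing/archive.zip`.

```python
import zipfile
from pathlib import Path


def unzip_archive(zip_path: str, target_dir: str) -> list[str]:
    """Extract a zip archive into target_dir and return the member names.

    On Databricks, pass the /dbfs/... form of a DBFS path so Python's
    local-file APIs can read it (paths here are placeholders).
    """
    target = Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(target)
        return zf.namelist()
```

The subsequent transformation steps would then read the extracted files from the same DBFS directory.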

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @praveen rajak​, does @Debayan Mukherjee​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

1 More Replies
kjoth
by Contributor II
  • 8701 Views
  • 10 replies
  • 5 kudos

Resolved! Where are the cluster logs of Databricks Jobs stored?

I'm running a scheduled job on job clusters. I didn't specify a log location for the cluster. Where are the logs stored? Yes, I can see the logs in the runs, but I need the storage location of the logs.
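Log delivery for a job cluster is controlled by the `cluster_log_conf` field of the cluster spec in the Databricks Clusters/Jobs APIs; without it, logs are only visible through the run pages. A minimal sketch of the relevant payload fields (the destination path and node/runtime values are placeholders, not a recommendation):

```python
# Sketch of a job's new_cluster spec with log delivery enabled.
# Field names follow the Databricks Clusters API; the destination
# is a hypothetical DBFS path.
new_cluster = {
    "spark_version": "10.4.x-scala2.12",   # placeholder runtime
    "node_type_id": "Standard_DS3_v2",     # placeholder node type
    "num_workers": 2,
    "cluster_log_conf": {
        # Driver and executor logs are delivered under
        # <destination>/<cluster-id>/ a few minutes after events occur.
        "dbfs": {"destination": "dbfs:/cluster-logs/my-job"}
    },
}
```

With this set, the logs land under the configured destination per cluster ID, so they survive after the run's UI pages are no longer convenient to use.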

Latest Reply
kjoth
Contributor II
  • 5 kudos

Hi @Sai Kalyani P​, yes, it helped. Thanks!

9 More Replies
dmayi
by New Contributor
  • 2232 Views
  • 2 replies
  • 1 kudos

Resolved! Setting up custom tags (JobName, JobID, UserId) on an all-purpose cluster

Hi, I want to set up custom tags on an all-purpose cluster for cost breakdown and chargeback purposes. Specifically, I want to capture JobName, JobID, and the UserId of whoever ran the job. I can set other custom tags such as Business Unit, Owner... However,...
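Static tags like Business Unit or Owner go in the `custom_tags` field of the cluster spec (a real Clusters API field; the tag keys and values below are illustrative). Note the limitation implicit in the question: Databricks applies JobId/RunName tags automatically to job clusters, whereas an all-purpose cluster only carries the static custom tags you attach, so per-run values cannot be captured this way. A minimal sketch:

```python
# Sketch of an all-purpose cluster spec with static custom tags.
# "custom_tags" is the actual Clusters API field name; the cluster
# name and tag values are hypothetical.
cluster_spec = {
    "cluster_name": "shared-analytics",
    "custom_tags": {
        "BusinessUnit": "Finance",   # propagated to the cloud VMs
        "Owner": "dmayi",            # usable in Azure cost analysis
    },
}
```

These tags propagate to the underlying cloud resources, where the cost-management tooling can group spend by them.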

Latest Reply
Vidula
Honored Contributor
  • 1 kudos

Hey there @DIEUDONNE MAYI​, does @Kaniz Fatma​'s response answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

1 More Replies
NicolasEscobar
by New Contributor II
  • 6806 Views
  • 8 replies
  • 5 kudos

Resolved! Job fails after runtime upgrade

I have a job running with no issues on Databricks Runtime 7.3 LTS. When I upgraded to 8.3, it fails with the error: An exception was thrown from a UDF: 'pyspark.serializers.SerializationError'... SparkContext should only be created and accessed on the driv...

Latest Reply
User16873042682
New Contributor II
  • 5 kudos

Adding to @Sean Owen​'s comments: the only reason this works is that the optimizer evaluates it locally rather than creating a context on the executors and evaluating it there.

7 More Replies
nolanlavender00
by New Contributor
  • 2520 Views
  • 3 replies
  • 1 kudos

Resolved! How to stop a Streaming Job based on time of the week

I have an always-on job cluster triggering Spark Streaming jobs. I would like to stop this streaming job once a week to run table maintenance. I was looking to leverage the foreachBatch function to check a condition and stop the job accordingly.
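The condition check the poster wants inside `foreachBatch` can be kept as plain Python: test whether the current time falls in a weekly maintenance window and, if so, stop the streaming query. This is a sketch under assumptions (the window below, Sunday 02:00-04:00 UTC, is invented; `query.stop()` is the standard Structured Streaming call to halt a query gracefully between batches).

```python
from datetime import datetime

# Hypothetical maintenance window: Sunday 02:00-04:00 UTC.
MAINTENANCE_DAY = 6          # Monday=0 ... Sunday=6
MAINTENANCE_START_HOUR = 2
MAINTENANCE_END_HOUR = 4


def in_maintenance_window(now: datetime) -> bool:
    """Return True while the weekly maintenance window is open."""
    return (now.weekday() == MAINTENANCE_DAY
            and MAINTENANCE_START_HOUR <= now.hour < MAINTENANCE_END_HOUR)


# In the streaming job, the batch function would look roughly like:
#
# def process_batch(df, batch_id):
#     if in_maintenance_window(datetime.utcnow()):
#         query.stop()   # halt the stream; run table maintenance, then restart
#         return
#     df.write.format("delta").mode("append").save(target_path)
```

An external scheduler that restarts the job after maintenance completes would pair naturally with this check.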

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Nolan Lavender​, how is it going? Were you able to resolve your problem?

2 More Replies
irfanaziz
by Contributor II
  • 3909 Views
  • 4 replies
  • 0 kudos

Resolved! If two Data Factory pipelines run at the same time, or share a window of execution, do they share the Databricks Spark cluster (if both use the same linked service)? (Job clusters are those created on the fly, as defined in the linked service.)

Continuing the above case: if I have several (say 5) ADF pipelines scheduled regularly at the same time, is it better to use an existing cluster, since all of the ADF pipelines would share the same cluster and the cost would therefore be lower?

Latest Reply
Atanu
Esteemed Contributor
  • 0 kudos

For ADF or job runs we generally prefer job clusters, but for streaming you may consider using an interactive cluster. Either way, you need to monitor the cluster load: if the load is high, there is a chance of job slowness as well as failures. Also, data siz...

3 More Replies
Junee
by New Contributor III
  • 3067 Views
  • 7 replies
  • 3 kudos

Resolved! What happens to the clusters whose jobs are canceled or terminated due to failures? (Jobs triggered through Jobs API 2.1 using runs/submit)

I am using the Databricks Jobs API 2.1 to trigger and run my jobs. The "jobs/runs/submit" API helps in starting the cluster, as well as creating the job and running it. This API works great for normal jobs, as it also cleans up the cluster once the job has finished suc...

Latest Reply
User16871418122
Contributor III
  • 3 kudos

@Junee, Anytime! It is crisply mentioned in the doc too. https://docs.databricks.com/clusters/index.html

6 More Replies
krishnachaitany
by New Contributor II
  • 3286 Views
  • 3 replies
  • 4 kudos

Resolved! Spot instance in Azure Databricks

When I run a job with spot instances enabled, I would like to know how many workers are using spot instances and how many are using on-demand instances for a given job run. In order to identify the spot instances we got for any...

Latest Reply
Prabakar
Esteemed Contributor III
  • 4 kudos

You can do it in the Azure Portal using the Virtual Machines list: filter by either the JobId tag or the RunName tag (job name), and group by Azure spot eviction policy or Azure spot eviction type; the VMs under Stop/Deallocate and Capacity (using the 2...

2 More Replies
User16826992666
by Valued Contributor
  • 1179 Views
  • 1 reply
  • 0 kudos

Can you restrict the type of clusters users are allowed to create?

I would like to make it so users can only create job clusters and not interactive clusters. Is it possible to do this in a workspace?

Latest Reply
User16826992666
Valued Contributor
  • 0 kudos

This can be accomplished with cluster policies. You can use a policy similar to this example to restrict certain users or groups to only have permission to create job clusters.
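Cluster policies support a virtual `cluster_type` attribute that can be pinned to `"job"` so the policy only permits job clusters. A minimal sketch of such a policy definition (treat the exact attribute names as something to verify against the cluster-policy docs; the JSON here is an assumption-labeled example, not the policy the reply linked to):

```python
import json

# Sketch of a cluster-policy definition that fixes the cluster type
# to "job", so users granted only this policy cannot create
# all-purpose (interactive) clusters.
policy_definition = {
    "cluster_type": {
        "type": "fixed",
        "value": "job",
    }
}

# This JSON is what gets pasted into the policy editor (or sent via
# the Cluster Policies API) and then granted to the restricted users.
policy_json = json.dumps(policy_definition, indent=2)
```

Combined with revoking the unrestricted cluster-creation entitlement, granting only this policy achieves the job-clusters-only restriction the question asks about.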
