- 2602 Views
- 5 replies
- 1 kudos
I want to know what happened to my cluster and whether I can recover it. I logged into my Databricks account and couldn't find my jobs or my cluster. I couldn't find any log of the deleted cluster either, because the logs live inside the cluster interface. I entered t...
Latest Reply
Dear folks, when the table has been deleted, why am I unable to create a table with the same name? It continuously gives me the error "DeltaAnalysisException: Cannot create table ('`spark_catalog`.`default`.`Customer_Data`'). The associated location ('...
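This error usually means the Delta data files are still sitting at the table's old storage location even though the metastore entry is gone. A minimal cleanup sketch, assuming the default warehouse location from the error message (the path and columns below are placeholders):

```python
# Hedged sketch: clear the orphaned files so the table name can be reused.
# The location below is a placeholder -- use the path from the error message.
table_location = "dbfs:/user/hive/warehouse/customer_data"

# Drop any metastore entry that may still point at the location.
spark.sql("DROP TABLE IF EXISTS spark_catalog.default.Customer_Data")

# Remove the leftover Delta files at the old location.
dbutils.fs.rm(table_location, True)

# Recreating the table should now succeed (columns are illustrative).
spark.sql("""
    CREATE TABLE spark_catalog.default.Customer_Data (id INT, name STRING)
    USING DELTA
""")
```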
4 More Replies
- 2282 Views
- 3 replies
- 1 kudos
Latest Reply
The above code will create two jobs. JOB 1: dataframe: DataFrame = spark.createDataFrame(data=data, schema=schema). The createDataFrame function is responsible for inferring the schema from the provided data or using the specified schema. Depending on the...
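As a small, self-contained illustration of the point above (data and schema are made up): passing an explicit schema spares Spark the inference pass over the data, and a subsequent action kicks off job execution.

```python
from pyspark.sql import DataFrame
from pyspark.sql.types import IntegerType, StringType, StructField, StructType

# Explicit schema: createDataFrame can skip inferring column types from the data.
schema = StructType([
    StructField("name", StringType(), True),
    StructField("age", IntegerType(), True),
])
data = [("alice", 34), ("bob", 29)]

dataframe: DataFrame = spark.createDataFrame(data=data, schema=schema)
dataframe.show()  # an action like show() or collect() triggers job execution
```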
2 More Replies
- 2030 Views
- 3 replies
- 0 kudos
How do engineering teams out there version control their jobs? If there is a production issue, can I revert to an older version of the job?
Latest Reply
You can use version-controlled source code for your Databricks job, and each time you need to roll back to an older version of the job, you simply move to an older version of the code. For version-controlled source code you have multiple choices: - Use a noteb...
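For example, if the job is already configured with a git source, a rollback can be as small as pointing it back at an older tag. A hedged sketch with the Databricks Python SDK (job ID and tag are placeholders):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Fetch the job's current settings (job ID is a placeholder).
job = w.jobs.get(job_id=123)
settings = job.settings

# Point the job back at an older, known-good tag; tag and branch are
# mutually exclusive, so clear the branch reference.
settings.git_source.git_tag = "v1.4"
settings.git_source.git_branch = None

# Overwrite the job definition with the rolled-back settings.
w.jobs.reset(job_id=123, new_settings=settings)
```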
2 More Replies
by Ludo • New Contributor III
- 3109 Views
- 7 replies
- 2 kudos
Hello, this is a question about our platform on `Databricks Runtime 11.3 LTS`. I'm running a job with multiple tasks in parallel using a shared cluster. Each task runs a dedicated Scala class within a JAR library attached as a dependency. One of the tasks fails (c...
Latest Reply
Hi, this actually should not be marked as solved. We are having the same problem: whenever a shared job cluster crashes for some reason (generally OOM), all tasks keep failing forever, with the error message as described above. This is ac...
6 More Replies
- 383 Views
- 1 reply
- 0 kudos
Hi! When I use `databricks jobs list --version=2.0` I get all jobs deployed using the 2.0 and 2.1 APIs; however, when I use `databricks jobs list --version=2.1` I only get jobs deployed using the 2.1 API. This is behaviour that we've only experienced recent...
Latest Reply
Hi @Guillermo Sanchez, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer for you. Thanks.
by Eelke • New Contributor II
- 969 Views
- 2 replies
- 2 kudos
This seems impossible with the cron syntax that Databricks uses, but maybe I am wrong? If it is indeed not possible, that seems like a missing feature to me, and I would therefore like to suggest it as a feature.
Latest Reply
Hi @Eelke van Foeken, hope all is well! Just wanted to check in on whether you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. T...
1 More Replies
- 508 Views
- 1 reply
- 3 kudos
I am trying to start the same job multiple times using the Python SDK's "run_now" command. If the number of requests exceeds the maximum concurrent runs, the status of the run will be Skipped and the run will not be executed. Is there any way to queue...
Latest Reply
Hi, we do have a private preview feature for queueing which will be enabled shortly. Please tag me (@Debayan Mukherjee) in your next update so that I will get notified.
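Until queueing lands, one workaround is to poll for a free run slot before each `run_now` call. A rough sketch with the Python SDK (job ID, slot count, and poll interval are placeholders):

```python
import time

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

def run_when_slot_free(job_id: int, max_concurrent: int = 1, poll_seconds: int = 30):
    """Block until fewer than max_concurrent runs are active, then trigger one."""
    while True:
        active = list(w.jobs.list_runs(job_id=job_id, active_only=True))
        if len(active) < max_concurrent:  # match your job's max_concurrent_runs
            return w.jobs.run_now(job_id=job_id)
        time.sleep(poll_seconds)

run = run_when_slot_free(job_id=123)
```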
by j_al • New Contributor II
- 1945 Views
- 9 replies
- 5 kudos
The Jobs API 2.1 OpenAPI specification seems broken; the Swagger file appears to be invalid: https://docs.databricks.com/_extras/api-refs/jobs-2.1-aws.yaml
Latest Reply
@Debayan Mukherjee, are you suggesting reverting the OpenAPI version specified in https://docs.databricks.com/_extras/api-refs/jobs-2.1-aws.yaml from 3.1.0 to 3.0.3?
8 More Replies
- 2536 Views
- 3 replies
- 3 kudos
Hi all, I need to migrate just notebooks & jobs from one workspace to another. Is there a utility to do so?
Latest Reply
@NOOR BASHA SHAIK, you can also try Databricks Connect in addition to the information @Artem Sheiko provided.
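For the notebooks-and-jobs-only case, a rough SDK-based sketch of the export/import loop (hosts, tokens, and paths are placeholders) might look like this:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import ExportFormat, ImportFormat, Language

# Two clients, one per workspace (hosts and tokens are placeholders).
src = WorkspaceClient(host="https://src.cloud.databricks.com", token="...")
dst = WorkspaceClient(host="https://dst.cloud.databricks.com", token="...")

# Copy one notebook: export returns base64 content that import_ accepts as-is.
nb = src.workspace.export("/Shared/my_notebook", format=ExportFormat.SOURCE)
dst.workspace.import_(
    "/Shared/my_notebook",
    content=nb.content,
    format=ImportFormat.SOURCE,
    language=Language.PYTHON,
)

# Recreate each job in the target workspace from its source settings.
for job in src.jobs.list():
    s = src.jobs.get(job_id=job.job_id).settings
    dst.jobs.create(
        name=s.name,
        tasks=s.tasks,
        job_clusters=s.job_clusters,
        schedule=s.schedule,
        max_concurrent_runs=s.max_concurrent_runs,
    )
```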
2 More Replies
by cblock • New Contributor III
- 1186 Views
- 3 replies
- 3 kudos
So, in this case our jobs are deployed from our development workspace to our isolated testing workspace via an automated Azure DevOps pipeline. As such, they are created (and thus run as) a service account user. Recently we made the switch to using gi...
Latest Reply
Hi @Chris Block, hope all is well! Just wanted to check in on whether you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thanks...
2 More Replies
by Thijs • New Contributor III
- 1137 Views
- 3 replies
- 4 kudos
Hi all, we are building custom Databricks containers (https://docs.databricks.com/clusters/custom-containers.html). During the container build process we install dependencies and also Python source code scripts. We now want to run some of these scrip...
Latest Reply
Hi @Thijs van den Berg, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best an...
2 More Replies
- 969 Views
- 2 replies
- 2 kudos
Graviton instances do not support Container Services on paper (https://docs.databricks.com/clusters/graviton.html#unsupported-features), but if you build a Docker ARM image and run it on Graviton, it will work. Does anyone use this combination in...
Latest Reply
Graviton is not supported by Databricks Container Services. How are you planning to run it on Databricks? Please tag @Debayan with your next comment so that I will get notified. Thank you!
1 More Replies
- 862 Views
- 2 replies
- 0 kudos
Hi all, I have created two job flows: one for the transaction layer and another for the datamart layer. I need to specify the job dependency between job 1 and job 2, and to trigger job 2 after job 1 completes, without using any other orchestration tool o...
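One option that needs no external orchestrator is to fold both flows into a single multi-task job, where the datamart task declares a dependency on the transaction task. A hedged sketch with the Python SDK (the job name, notebook paths, and cluster ID are placeholders):

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import NotebookTask, Task, TaskDependency

w = WorkspaceClient()

w.jobs.create(
    name="transaction_then_datamart",
    tasks=[
        Task(
            task_key="transaction_layer",
            notebook_task=NotebookTask(notebook_path="/Jobs/transaction_layer"),
            existing_cluster_id="1234-567890-abcde123",
        ),
        Task(
            task_key="datamart_layer",
            notebook_task=NotebookTask(notebook_path="/Jobs/datamart_layer"),
            existing_cluster_id="1234-567890-abcde123",
            # Runs only after transaction_layer completes successfully.
            depends_on=[TaskDependency(task_key="transaction_layer")],
        ),
    ],
)
```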
by SK21 • New Contributor II
- 979 Views
- 3 replies
- 1 kudos
I have created jobs to trigger the respective notebooks in Databricks Workflows. Now I need to move them to further environments. Would you please help me with a CI/CD process to promote jobs to further environments?
Latest Reply
Please use Jobs API 2.1. You can get a job and save its JSON settings to git. In git, then set variables defining the Databricks workspaces (URL and token), and after a push, have the pipeline trigger the API call with your JSON stored in git.
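A sketch of the export half of that flow, dumping a job's settings to JSON for the git repo (job ID and file path are placeholders):

```python
import json

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Export the job's settings (job ID is a placeholder) as plain JSON.
settings = w.jobs.get(job_id=123).settings
with open("jobs/my_job.json", "w") as f:
    json.dump(settings.as_dict(), f, indent=2)

# In the target workspace, the pipeline can then push this JSON back via the
# Jobs 2.1 API (POST /api/2.1/jobs/create or /api/2.1/jobs/reset).
```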
2 More Replies
- 1416 Views
- 2 replies
- 0 kudos
I copied my question from a very old question/post that I responded to, and decided to move it here. Context: I have a JAR (Scala) using Scala pureconfig (a wrapper of Typesafe Config). I uploaded an application.conf file to a path which is mounted to the wor...
Latest Reply
We had to put the conf file in the root folder of the mounted path, and that works. Maybe the mounted storage account being Blob instead of ADLS Gen2 is causing the issues.
1 More Replies