Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

digitalinstitut
by New Contributor
  • 584 Views
  • 0 replies
  • 0 kudos

www.amritsardigitalacademy.in

Amritsar Digital Academy is the best digital marketing institute in Punjab (https://www.amritsardigitalacademy.in/). If you want to do a digital marketing course, you can enroll now!

Mirko
by Contributor
  • 12497 Views
  • 12 replies
  • 2 kudos

Resolved! strange error with dbutils.notebook.run(...)

The situation is as follows: I have a scheduled job which uses dbutils.notebook.run(path, timeout). During the last week everything worked smoothly. During the weekend the job began to fail at the dbutils.notebook.run(path, timeout) command. I get th...

Latest Reply
User16753724663
Valued Contributor
  • 2 kudos

Hi @Florent POUSSEROT​ Apologies for the delay. Could you please confirm if you are still facing the issue?

11 More Replies
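
For readers landing here from search: the call in question runs a child notebook synchronously and raises if the child fails or times out. A minimal sketch of the pattern, assuming a Databricks notebook context (dbutils is available) and a hypothetical child notebook path, arguments, and retry count; the retry wrapper is just one common way to make scheduled runs more resilient to transient failures, not the fix from this thread.

# Run a child notebook and retry on transient failures; the path, timeout, and
# arguments below are hypothetical placeholders, not values from the thread.
def run_notebook_with_retry(path, timeout_seconds, arguments, max_retries=2):
    for attempt in range(max_retries + 1):
        try:
            # Returns whatever the child passes to dbutils.notebook.exit()
            return dbutils.notebook.run(path, timeout_seconds, arguments)
        except Exception as err:
            if attempt == max_retries:
                raise  # surface the original failure to the scheduled job
            print(f"Attempt {attempt + 1} failed ({err}); retrying...")

result = run_notebook_with_retry("/path/to/child_notebook", 600, {"run_date": "2021-06-01"})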
BorislavBlagoev
by Valued Contributor III
  • 4960 Views
  • 9 replies
  • 3 kudos

Resolved! Trying to create an incremental pipeline but it fails when I try to use outputMode "update"

def upsertToDelta(microBatchOutputDF, batchId): microBatchOutputDF.createOrReplaceTempView("updates")   microBatchOutputDF._jdf.sparkSession().sql(""" MERGE INTO old o USING updates u ON u.id = o.id WHEN MATCHED THEN UPDATE SE...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

Delta table/file version is too old. Please try to upgrade it as described here https://docs.microsoft.com/en-us/azure/databricks/delta/versioning​

8 More Replies
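
For readers comparing notes: the same upsert is often written with foreachBatch and the DeltaTable merge API instead of going through _jdf. A minimal sketch, assuming a Databricks notebook (spark in scope), a target Delta table named old with an id key as in the excerpt above, and a placeholder streaming DataFrame called source_stream:

from delta.tables import DeltaTable

def upsert_to_delta(micro_batch_df, batch_id):
    # Merge each micro-batch into the target; table name "old" and key "id"
    # mirror the excerpt above and stand in for the real table.
    target = DeltaTable.forName(spark, "old")
    (target.alias("o")
           .merge(micro_batch_df.alias("u"), "u.id = o.id")
           .whenMatchedUpdateAll()
           .whenNotMatchedInsertAll()
           .execute())

# outputMode("update") works here because foreachBatch owns how rows are applied.
(source_stream.writeStream
              .foreachBatch(upsert_to_delta)
              .outputMode("update")
              .start())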
pjp94
by Contributor
  • 2191 Views
  • 4 replies
  • 9 kudos

Databricks Job - Notebook Execution

Question - When you set a recurring job to simply update a notebook, does Databricks clear the state of the notebook prior to executing it? If not, can I configure it to make sure it clears the state before running?

Latest Reply
Anonymous
Not applicable
  • 9 kudos

@Paras Patel - Would you be happy to mark Hubert's answer as best so that other members can find the solution more easily? Thanks!

3 More Replies
MadelynM
by Databricks Employee
  • 706 Views
  • 0 replies
  • 0 kudos

vimeo.com

A job is a way of running a notebook either immediately or on a scheduled basis. Here's a quick video (4:04) on how to schedule a job and automate a workflow for Databricks on AWS. To follow along with the video, import this notebook into your worksp...

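
Alongside the UI flow shown in the video, the same schedule can be set up through the Jobs API. A rough sketch using the Python requests library; the workspace URL, token, cluster ID, notebook path, and cron expression are all placeholders:

import requests

host = "https://<your-workspace>.cloud.databricks.com"  # placeholder workspace URL
token = "<personal-access-token>"                        # placeholder PAT

job_spec = {
    "name": "nightly-notebook-run",
    "tasks": [{
        "task_key": "main",
        "existing_cluster_id": "<cluster-id>",           # placeholder cluster
        "notebook_task": {"notebook_path": "/path/to/notebook"},
    }],
    # Quartz cron: every day at 04:00 in the given timezone.
    "schedule": {"quartz_cron_expression": "0 0 4 * * ?", "timezone_id": "UTC"},
}

resp = requests.post(f"{host}/api/2.1/jobs/create",
                     headers={"Authorization": f"Bearer {token}"},
                     json=job_spec)
resp.raise_for_status()
print(resp.json())  # contains the new job_id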
Siddhesh2525
by New Contributor III
  • 5765 Views
  • 2 replies
  • 6 kudos

How to pass a dynamic value in Databricks

I have a separate column value defined in 13 different notebooks, and I want to merge them into one Databricks notebook and pass a dynamic parameter, so that everything can run in a single Databricks notebook.

Latest Reply
Prabakar
Databricks Employee
  • 6 kudos

Hi @siddhesh Bhavar you can use widgets with the %run command to achieve this: https://docs.databricks.com/notebooks/widgets.html#use-widgets-with-run
%run /path/to/notebook $X="10" $Y="1"

1 More Replies
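
To make the reply above concrete: the child notebook declares widgets and reads them, and the caller overrides them on the %run line. A minimal sketch; the widget names X and Y mirror the reply, and the notebook path is a placeholder.

# Cell in the child notebook (e.g. /path/to/notebook): declare and read widgets.
dbutils.widgets.text("X", "0")  # widget name, default value
dbutils.widgets.text("Y", "0")

x = dbutils.widgets.get("X")
y = dbutils.widgets.get("Y")
print(f"Running with X={x}, Y={y}")

In the calling notebook, a separate cell containing only %run /path/to/notebook $X="10" $Y="1" passes the values in, so the logic from the 13 notebooks can be merged and parameterized from one place.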
MadelynM
by Databricks Employee
  • 2728 Views
  • 2 replies
  • 1 kudos

2021-08-Best-Practices-for-Your-Data-Architecture-v3-OG-1200x628

Thanks to everyone who joined the Best Practices for Your Data Architecture session on Getting Workloads to Production using CI/CD. You can access the on-demand session recording here, and the code in the Databricks Labs CI/CD Templates Repo. Posted ...

Latest Reply
MadelynM
Databricks Employee
  • 1 kudos

Here's the embedded links list!
Jobs scheduling and orchestration
  • Built-in job scheduling: https://docs.databricks.com/jobs.html#schedule-a-job
    Periodic scheduling of the jobs
    Execute notebook / jar / Python script / Spark-submit
  • Multitask Jobs
    Execute no...

1 More Replies
kpendergast
by Contributor
  • 3239 Views
  • 3 replies
  • 3 kudos

Resolved! How do I create a job for a notebook not in the /Users/ directory?

I am setting up a job to load data from S3 into Delta using Auto Loader. I can do this fine in interactive mode. When trying to create a job in the UI, I can select the notebook in the root directory I created for the project within the create jo...

Latest Reply
User16844513407
New Contributor III
  • 3 kudos

Hi @Ken Pendergast, you are supposed to be able to reference any notebook you have the right permissions on, so it looks like you are running into a bug. Can you please reach out to support or email me directly with your workspace ID? My email is jan...

2 More Replies
Mohit_m
by Valued Contributor II
  • 1243 Views
  • 1 reply
  • 4 kudos

Enabling the Task Orchestration feature in Jobs via the API as well: Databricks supports the ability to orchestrate multiple tasks within a job. You must en...

Enabling the Task Orchestration feature in Jobs via the API as well: Databricks supports the ability to orchestrate multiple tasks within a job. You must enable this feature in the admin console. Once enabled, this feature cannot be disabled. To enable orch...

Latest Reply
Prabakar
Databricks Employee
  • 4 kudos

@Mohit Miglani this will be really helpful for those who prefer the CLI / API over the UI.

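
The excerpt above is cut off before the actual API call. As a loose sketch only: workspace-level toggles of this kind typically go through the workspace-conf endpoint, and the setting key below is a hypothetical placeholder rather than a confirmed name, so check the original post or the admin console for the real key.

import requests

host = "https://<your-workspace>.cloud.databricks.com"  # placeholder workspace URL
token = "<admin-personal-access-token>"                  # placeholder admin PAT

# NOTE: "enableTasksInJobs" is a hypothetical key used purely for illustration.
resp = requests.patch(f"{host}/api/2.0/workspace-conf",
                      headers={"Authorization": f"Bearer {token}"},
                      json={"enableTasksInJobs": "true"})
resp.raise_for_status()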
Anonymous
by Not applicable
  • 1943 Views
  • 2 replies
  • 4 kudos

Multi-task Job Run starting point

Hi community! I would like to know if it is possible to start a Multi-task Job Run from a specific task. The use case is as follows: I have a 17-task Job. A task in the middle, let's say a task after 2 dependencies, fails. I found the error and now it i...

Latest Reply
BilalAslamDbrx
Databricks Employee
  • 4 kudos

+1 to what @Dan Zafar said. We're working hard on this. Looking forward to bringing this to you in the near future.

1 More Replies
eq
by New Contributor III
  • 4837 Views
  • 7 replies
  • 7 kudos

Resolved! Multi-task Jobs orchestration - simulating onComplete status

Currently, we are investigating how to effectively incorporate Databricks' latest feature for orchestration of tasks - Multi-task Jobs. The default behaviour is that a downstream task would not be executed if the previous one has failed for some reason...

Latest Reply
User16844513407
New Contributor III
  • 7 kudos

Hi @Stefan V, my name is Jan and I'm a product manager working on job orchestration. Thank you for your question. At the moment this is not something directly supported yet; it is, however, on our radar. If you are interested in having a short conve...

6 More Replies
marchello
by New Contributor III
  • 6493 Views
  • 8 replies
  • 3 kudos

Resolved! error on connecting to Snowflake

Hi team, I'm getting a weird error in one of my jobs when connecting to Snowflake. All my other jobs (I've got plenty) work fine. The current one also works fine when I have only one coding step (except installing needed libraries in my very first step...

Latest Reply
Dan_Z
Databricks Employee
  • 3 kudos

@marchello​ I suggest you contact Snowflake to move forward on this one.

7 More Replies
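
For context on the kind of step that usually sits behind a job like this, a typical Databricks-to-Snowflake read goes through the Snowflake Spark connector. A minimal sketch, assuming a Databricks notebook (spark and dbutils in scope) and placeholder account, secret scope, and table names:

# Connection options for the Snowflake Spark connector; every value is a placeholder.
sf_options = {
    "sfUrl": "<account>.snowflakecomputing.com",
    "sfUser": dbutils.secrets.get("snowflake", "user"),       # hypothetical secret scope/keys
    "sfPassword": dbutils.secrets.get("snowflake", "password"),
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "COMPUTE_WH",
}

df = (spark.read
           .format("snowflake")
           .options(**sf_options)
           .option("dbtable", "MY_TABLE")
           .load())
display(df)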
User16856693631
by New Contributor II
  • 6023 Views
  • 1 reply
  • 0 kudos
Latest Reply
User16856693631
New Contributor II
  • 0 kudos

Yes, you can. Databricks maintains a history of your job runs for up to 60 days. If you need to preserve job runs, Databricks recommends that you export results before they expire. For more information, see https://docs.databricks.com/jobs.html#export...

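
A rough sketch of exporting a run before the retention window lapses, following the export endpoint referenced in the reply; the host, token, and run ID are placeholders:

import json
import requests

host = "https://<your-workspace>.cloud.databricks.com"  # placeholder workspace URL
token = "<personal-access-token>"                        # placeholder PAT
run_id = 12345                                           # placeholder run ID

resp = requests.get(f"{host}/api/2.1/jobs/runs/export",
                    headers={"Authorization": f"Bearer {token}"},
                    params={"run_id": run_id, "views_to_export": "ALL"})
resp.raise_for_status()

# Persist the exported views (rendered notebook content) locally.
with open(f"run_{run_id}_export.json", "w") as f:
    json.dump(resp.json(), f)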