Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

KVNARK
by Honored Contributor II
  • 3111 Views
  • 2 replies
  • 1 kudos

Resolved! Notebook activity is getting timed out in ADF pipeline.

The notebook activity times out after running for a certain amount of time (5 hours) in an ADF pipeline and returns a timeout error. It is simply a timeout error. The problem is that this will process TBs of data daily. Does anyone have an idea how to fix this?

Latest Reply
KVNARK
Honored Contributor II
  • 1 kudos

@Daniel Sahal - Noted. Thanks, Daniel!

1 More Reply
sreedata
by New Contributor III
  • 3576 Views
  • 4 replies
  • 7 kudos

Resolved! Getting status of "If Condition" Activity into a variable

"If Condition" has lot of activities that can succeeded or fail. If any activity fails then whole "If Condition" fails. I have to get the status of the "If Condition" activity (pass or fail) so that i can use it for processing in the next notebook t...

Latest Reply
UmaMahesh1
Honored Contributor III
  • 7 kudos

In your ADF pipeline, set two different pipeline activities coming out of the If Condition activity based on success or failure (the green and red arrows). Then inside each pipeline activity, you can add a Set Variable, Get Variable and your ADB note...
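
On the notebook side, a rough sketch of how the downstream notebook could read that status once it is passed along as a base parameter; the parameter name if_condition_status is a made-up example, not from this thread:

# Sketch: read a status value passed from ADF as a base parameter.
# The widget name "if_condition_status" is an assumed example.
dbutils.widgets.text("if_condition_status", "Succeeded")
status = dbutils.widgets.get("if_condition_status")

if status == "Succeeded":
    print("Upstream If Condition succeeded, continuing processing")
else:
    # Raising marks the Notebook activity (and thus the pipeline run) as failed
    raise Exception(f"Upstream If Condition reported status: {status}")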

3 More Replies
Prototype998
by New Contributor III
  • 3169 Views
  • 4 replies
  • 4 kudos

Resolved! Databricks notebook run

How can I run a Databricks notebook through ADF?

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 4 kudos

Hi @Punit Chauhan, you can use the Databricks Notebook activity in ADF to trigger your Databricks notebook via ADF.
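
As a rough illustration of that flow, the notebook called by the ADF Notebook activity can read whatever is passed in baseParameters through widgets; the parameter names below are made up for the example:

# Sketch of a notebook intended to be triggered by the ADF Notebook activity.
# "input_path" and "run_date" are hypothetical base parameters defined in ADF.
dbutils.widgets.text("input_path", "")
dbutils.widgets.text("run_date", "")

input_path = dbutils.widgets.get("input_path")
run_date = dbutils.widgets.get("run_date")

# Use the pipeline-supplied parameters in the actual processing
df = spark.read.format("parquet").load(input_path)
print(f"Processing {df.count()} rows for {run_date}")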

3 More Replies
g96g
by New Contributor III
  • 5206 Views
  • 8 replies
  • 0 kudos

Resolved! ADF pipeline fails when passing the parameter to databricks

I have a project where I have to read data from NETSUITE using an API. The Databricks notebook runs perfectly when I manually insert the table names I want to read from the source. I have a dataset (CSV) file in ADF with all the table names that I need to r...

Latest Reply
mcwir
Contributor
  • 0 kudos

Have you tried to debug the JSON payload of the ADF trigger? Maybe it conveys the table names incorrectly.
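
One way to check what actually arrives in the notebook is to print the raw parameter before using it. A small sketch, assuming the table names come in as a single comma-separated string under a widget called table_names (both assumptions, not from the thread):

# Debugging sketch: inspect the table-name parameter passed from ADF.
dbutils.widgets.text("table_names", "")
raw_value = dbutils.widgets.get("table_names")
print(f"Raw value received from ADF: {raw_value!r}")

table_names = [name.strip() for name in raw_value.split(",") if name.strip()]
for table_name in table_names:
    print(f"Would read NetSuite table: {table_name}")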

7 More Replies
kkumar
by New Contributor III
  • 17582 Views
  • 3 replies
  • 7 kudos

Resolved! can we update a Parquet file??

I have copied a table into a Parquet file. Can I now update a row or a column in the Parquet file without rewriting all the data (the data is huge), using Databricks or ADF? Thank you.

Latest Reply
youssefmrini
Honored Contributor III
  • 7 kudos

You can only append data with Parquet, which is why you need to convert your Parquet table to Delta. It will be much easier.
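
A minimal sketch of that approach (the path and column names are placeholders): convert the Parquet data to Delta once, after which row-level updates are possible without rewriting the whole dataset.

# Convert an existing Parquet directory to Delta, then update rows in place.
# The path and the columns in the UPDATE are placeholder examples.
spark.sql("CONVERT TO DELTA parquet.`/mnt/data/my_table`")

spark.sql("""
    UPDATE delta.`/mnt/data/my_table`
    SET status = 'inactive'
    WHERE customer_id = 42
""")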

2 More Replies
g96g
by New Contributor III
  • 4683 Views
  • 1 reply
  • 1 kudos

Resolved! how can I pass the df columns as a parameter

I'm doing self-study and want to pass a df column name as a parameter. I have defined the widget column_name = dbutils.widgets.get('column_name'), which executes successfully (giving me a column name). Then I'm reading the df and doing some transformations and ...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 1 kudos

df2.select([column_name]).write or df2.select(column_name).write
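
Put together with the widget from the question, a minimal end-to-end sketch could look like this (the source and target paths are placeholders):

# Sketch: select a column whose name arrives as a widget parameter and write it out.
dbutils.widgets.text("column_name", "")
column_name = dbutils.widgets.get("column_name")

df2 = spark.read.format("delta").load("/mnt/source/table")   # placeholder source

(df2.select(column_name)            # df2.select([column_name]) works as well
    .write
    .format("delta")
    .mode("overwrite")
    .save("/mnt/target/table"))     # placeholder target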

SailajaB
by Valued Contributor III
  • 10979 Views
  • 10 replies
  • 10 kudos

Resolved! Is there a way to capture the notebook logs from ADF pipeline?

Hi, I would like to capture notebook custom log exceptions (Python) from an ADF pipeline; based on those exceptions, the pipeline should succeed or fail. Is there any mechanism to implement this? In my testing, the ADF pipeline is successful irrespective of the log...

Latest Reply
GurpreetSethi
New Contributor III
  • 10 kudos

Hi SailajaB, try this out. A notebook, once executed successfully, returns a long JSON-formatted output. We need to specify the appropriate nodes to fetch the output. In the screenshot below, we can see that when the notebook ran it returned empName & empCity as output....
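
On the notebook side, a minimal sketch of returning values that ADF can later read from the Notebook activity's output (empName/empCity follow the reply's example; the values are placeholders):

import json

# Return custom values to ADF; the pipeline can read them from the
# Notebook activity output (e.g. the runOutput node) in later activities.
result = {"empName": "Alice", "empCity": "Seattle"}
dbutils.notebook.exit(json.dumps(result))

If the pipeline should actually fail on Python exceptions, letting the exception propagate (rather than catching it and exiting with a payload) is what marks the ADF Notebook activity as failed.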

9 More Replies
Vibhor
by Contributor
  • 5495 Views
  • 5 replies
  • 13 kudos

Resolved! ADF Pipeline - Notebook Run time

In an ADF pipeline, can we specify to exit a notebook and proceed to another notebook after some threshold value, like 15 minutes? For example, I have a pipeline with notebooks scheduled in sequence and want the pipeline to keep running that notebook for a cert...

Latest Reply
jose_gonzalez
Moderator
  • 13 kudos

Hi @Vibhor Sethi, there is a global timeout in Azure Data Factory (ADF) that you can use to stop the pipeline. In addition, you can use the notebook timeout if you want to control it from your Databricks job.
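
If the control is kept on the Databricks side, one option is a small wrapper notebook that runs each child notebook with its own timeout. A sketch, with a placeholder notebook path:

# Sketch: enforce a per-notebook timeout from a Databricks wrapper notebook.
# "/Repos/project/child_notebook" is a placeholder path.
try:
    result = dbutils.notebook.run("/Repos/project/child_notebook", 900)  # 15-minute cap
    print(f"Child notebook finished with: {result}")
except Exception as e:
    # On timeout (or failure), decide whether to continue with the next notebook
    print(f"Child notebook did not finish in time: {e}")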

4 More Replies