Data Engineering

Forum Posts

joao_vnb
by New Contributor III
  • 21083 Views
  • 7 replies
  • 11 kudos

Resolved! Automate the Databricks workflow deployment

Hi everyone, do you know if it's possible to automate the Databricks workflow deployment through Azure DevOps (like what we do with the deployment of notebooks)?

Latest Reply
asingamaneni
New Contributor II
  • 11 kudos

Did you get a chance to try Brickflow? https://github.com/Nike-Inc/brickflow You can find the documentation here: https://engineering.nike.com/brickflow/v0.11.2/ Brickflow uses Databricks Asset Bundles (DAB) under the hood but provides a Pythonic w...

6 More Replies
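The reply above points to Brickflow and Databricks Asset Bundles. If you want to stay with plain Azure DevOps, one common alternative (sketched here, not taken from the thread) is to keep the job definition as JSON in the repo and push it with the Jobs REST API 2.1 from a pipeline step. All names, paths, and environment variables below are illustrative:

```python
import json
import os

import requests

# Assumed to be injected by the Azure DevOps pipeline (variable group / Key Vault);
# the variable names are illustrative.
HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<id>.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]
HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def deploy_job(job_spec_path: str, job_id: int | None = None) -> int:
    """Create the job when job_id is None, otherwise overwrite its settings."""
    with open(job_spec_path) as f:
        settings = json.load(f)

    if job_id is None:
        resp = requests.post(f"{HOST}/api/2.1/jobs/create", headers=HEADERS, json=settings)
    else:
        resp = requests.post(
            f"{HOST}/api/2.1/jobs/reset",
            headers=HEADERS,
            json={"job_id": job_id, "new_settings": settings},
        )
    resp.raise_for_status()
    return resp.json().get("job_id", job_id)


if __name__ == "__main__":
    # jobs/nightly_etl.json is a hypothetical file in the repo holding the same JSON
    # shown under a job's "View JSON" option in the workflows UI.
    deploy_job("jobs/nightly_etl.json")
```

The same script can run as a step after the notebook deployment stage in the Azure DevOps YAML, or be replaced by `databricks bundle deploy` if you adopt Asset Bundles directly.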
PearceR
by New Contributor III
  • 1038 Views
  • 1 reply
  • 2 kudos

try and except in DLT pipelines

Good morning, I am having some issues with my DLT pipeline. I have a scenario where I am loading bronze/silver tables programmatically from a SQL database (each row corresponds to a table to create). This leaves me in a situation where sometimes onl...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Robert Pearce, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer for you. Thanks.

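Since the thread was left without a technical answer, here is a sketch of the pattern the question describes: generate one bronze table per configured source row and guard the part that can fail, so a missing source table is skipped instead of failing the whole update. Everything below (JDBC URL, table list, probe query) is illustrative, and `spark` is the session the DLT runtime provides:

```python
import dlt

# Illustrative connection string; in practice read it from a secret scope.
JDBC_URL = "jdbc:sqlserver://<host>:1433;databaseName=<db>"

# Illustrative driving list; the original post reads one row per table from SQL.
source_tables = ["customers", "orders", "payments"]


def register_bronze_table(source_table: str) -> None:
    # Define the DLT table lazily; DLT materializes it during the pipeline update.
    @dlt.table(name=f"bronze_{source_table}")
    def bronze():
        return (
            spark.read.format("jdbc")
            .option("url", JDBC_URL)
            .option("dbtable", source_table)
            .load()
        )


for source_table in source_tables:
    try:
        # Probe the source up front so a missing table is caught here,
        # not later when DLT materializes the flow.
        (spark.read.format("jdbc")
            .option("url", JDBC_URL)
            .option("query", f"SELECT 1 AS probe FROM {source_table} WHERE 1 = 0")
            .load())
        register_bronze_table(source_table)
    except Exception as exc:  # broad on purpose: skip any unreachable source
        print(f"Skipping {source_table}: {exc}")
```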
riccostamendes
by New Contributor II
  • 1824 Views
  • 2 replies
  • 0 kudos

Just a doubt: can we develop a Kedro project in Databricks?

I am asking because up to now I have only seen examples of deploying a pre-existing Kedro project to Databricks in order to run some pipelines...

Latest Reply
riccostamendes
New Contributor II
  • 0 kudos

Yes, you can deploy a pre-existing Kedro project in Databricks, but as far as I know you cannot create it there. You have to create it somewhere else and then deploy it to Databricks.

1 More Reply
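To make the answer above concrete: the usual flow is to scaffold the project locally with `kedro new`, bring it into the workspace (for example through Databricks Repos), and trigger it from a notebook or job. A minimal sketch, assuming Kedro 0.18+ (session APIs differ between versions) and an illustrative repo path:

```python
from kedro.framework.session import KedroSession
from kedro.framework.startup import bootstrap_project

# Illustrative path to a project synced via Databricks Repos.
project_path = "/Workspace/Repos/<user>/my-kedro-project"

# Load the project's settings and pipeline registry.
bootstrap_project(project_path)

# Run the default pipeline; pass pipeline_name=... to run a specific one.
with KedroSession.create(project_path=project_path) as session:
    session.run()
```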
espenol
by New Contributor III
  • 1460 Views
  • 4 replies
  • 1 kudos

How to work with DLT pipelines? Best practices?

So I'm used to developing notebooks interactively: write some code, run it to see if I made an error, and if there is no error, filter and display the dataframe to see that I did what I intended. With DLT pipelines, however, I can't run interactively. Is my understan...

Latest Reply
Rishabh264
Honored Contributor II
  • 1 kudos

Yes, exactly. I am also working with DLT, and what I have found is that if we want to check our errors we have to run the pipeline again and again to debug it, but this is not the best practice, so the other metho...

3 More Replies
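One pattern that often comes up for this, offered as a sketch rather than an official recommendation: keep the transformation logic in plain PySpark functions that you can run and display interactively on a normal cluster (or unit test), and keep the @dlt.table wrappers thin, so the pipeline itself rarely has to be re-run just to debug logic. Table names below are illustrative:

```python
import dlt
from pyspark.sql import DataFrame
from pyspark.sql import functions as F


# Plain function: testable interactively in a notebook, no DLT runtime required.
def clean_orders(raw: DataFrame) -> DataFrame:
    return (
        raw.filter(F.col("order_id").isNotNull())
           .withColumn("order_ts", F.to_timestamp("order_ts"))
    )


# Thin DLT wrapper: the only part that actually needs a pipeline update to run.
@dlt.table(name="silver_orders")
def silver_orders():
    return clean_orders(dlt.read("bronze_orders"))
```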
nolanreilly
by New Contributor
  • 386 Views
  • 0 replies
  • 0 kudos

Impossible to read a custom pipeline? (Scala)

I have created a custom transformer to be used in an ML pipeline. I was able to write the pipeline to storage by extending the transformer class with DefaultParamsWritable. Reading the pipeline back in, however, does not seem possible in Scala. I have...
