- 21083 Views
- 7 replies
- 11 kudos
Hi everyone, does anyone know if it's possible to automate Databricks workflow deployment through Azure DevOps (like what we do with the deployment of notebooks)?
Latest Reply
Did you get a chance to try Brickflow? https://github.com/Nike-Inc/brickflow You can find the documentation here: https://engineering.nike.com/brickflow/v0.11.2/ Brickflow uses Databricks Asset Bundles (DAB) under the hood but provides a Pythonic w...
6 More Replies
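Since the reply points at Databricks Asset Bundles as the underlying mechanism, a minimal Azure DevOps pipeline sketch for the DAB route might look like the following. This assumes a `databricks.yml` bundle config already exists in the repo, that `DATABRICKS_HOST` and `DATABRICKS_TOKEN` are defined as pipeline variables, and that a `prod` target is configured in the bundle; the target name and variable names are illustrative, not from the thread.

```yaml
# azure-pipelines.yml (sketch) -- deploy a Databricks Asset Bundle from Azure DevOps.
trigger:
  - main

steps:
  - script: |
      # Official Databricks CLI install script
      curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh
    displayName: Install Databricks CLI
  - script: |
      # Validate the bundle config, then deploy it to the (assumed) prod target
      databricks bundle validate
      databricks bundle deploy -t prod
    displayName: Validate and deploy bundle
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)
      DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
```

Brickflow wraps the same deploy step behind its own Python project layout, so either path ends in a bundle deployment.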
- 1038 Views
- 1 reply
- 2 kudos
Good morning, I am having some issues with my DLT pipeline. I have a scenario where I am loading bronze-silver tables programmatically from a SQL database (each row corresponds to a table to create). This leaves me in a situation where sometimes onl...
Latest Reply
Hi @Robert Pearce, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.
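For the "one table definition per metadata row" setup the question describes, the usual pure-Python pattern is a function factory so each generated function binds its own table name. The metadata rows and table names below are hypothetical stand-ins; in a real DLT notebook each generated function would additionally be registered with the `@dlt.table` decorator, which is omitted here so the pattern can be run anywhere.

```python
# Sketch of generating one loader function per metadata row.

def make_loader(table_name: str):
    """Factory that binds table_name at creation time, avoiding the
    late-binding pitfall of defining closures directly inside a loop."""
    def load():
        # Stand-in for e.g. spark.read.table(...) in a real pipeline.
        return f"bronze_{table_name}"
    # Give each function a distinct name, as DLT derives table names from it.
    load.__name__ = f"bronze_{table_name}"
    return load

# Rows as they might come back from the metadata SQL query (hypothetical).
metadata_rows = [{"table": "orders"}, {"table": "customers"}]

loaders = [make_loader(row["table"]) for row in metadata_rows]
```

Each call to `make_loader` creates a fresh scope, so a plain `for` loop that defines `load()` inline would instead leave every function pointing at the last row.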
- 1824 Views
- 2 replies
- 0 kudos
I am asking this because up to now I have only seen examples of deploying a pre-existing Kedro project in Databricks in order to run some pipelines...
Latest Reply
Yes, you can deploy a pre-existing Kedro project in Databricks, but as far as I know you cannot create one there. You have to create it somewhere else and then deploy it to Databricks.
1 More Replies
- 1460 Views
- 4 replies
- 1 kudos
So I'm used to developing notebooks interactively: write some code, run to see if I made an error, and if there's no error, filter and display the dataframe to check that I did what I intended. With DLT pipelines, however, I can't run interactively. Is my understan...
Latest Reply
Yes, exactly. I am also working on DLT, and what I've learned is that if we want to check our errors, we have to run the pipeline again and again to debug it. But this is not best practice, so the other metho...
3 More Replies
- 386 Views
- 0 replies
- 0 kudos
I have created a custom transformer to be used in an ML pipeline. I was able to write the pipeline to storage by extending the transformer class with DefaultParamsWritable. Reading the pipeline back in, however, does not seem possible in Scala. I have...