yesterday
Hi Team,
While creating a Declarative ETL pipeline in Databricks, I tried to configure a notebook using the "Add existing assets" option by providing the notebook path. However, I received a warning message:
"Legacy configuration detected. Use files instead of notebooks for an optimal experience."
My question is:
Will Declarative Pipelines continue to support notebooks in the future? As a developer, I primarily use notebooks to build pipelines. If notebooks are going to be deprecated or unsupported, I would prefer to start development using .py files instead.
Please help me understand the meaning of this warning message.
yesterday
Hello @karuppusamy!
That warning appears because the new editor is optimized for file-based workflows rather than notebooks. Notebooks still work, but they don't behave like regular interactive notebooks; for example, you can't run cells individually once they're part of a pipeline. Splitting your code into files gives a smoother experience and makes your pipeline code easier to organize and maintain.
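For reference, here's a minimal sketch of what a file-based pipeline source file can look like, using the classic `dlt` Python API. The file path, table names, and source location are made up for illustration:

```python
# transformations/orders.py -- a plain .py source file for a declarative pipeline
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested from a landing zone (illustrative path)")
def orders_raw():
    # `spark` is provided by the pipeline runtime
    return spark.read.format("json").load("/Volumes/main/landing/orders")

@dlt.table(comment="Orders filtered to non-null order ids")
def orders_clean():
    return dlt.read("orders_raw").filter(F.col("order_id").isNotNull())
```

Because each dataset is just a decorated function in a plain .py file, you can split a large pipeline across multiple files and organize them like any other Python project.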
yesterday
Thank you @Advika! Just to clarify, can I continue using notebooks for development? Will they be supported and work fine in the long term without running into legacy issues?
yesterday - last edited yesterday
Hi @karuppusamy ,
The declarative pipelines documentation recommends two default formats: .sql and .py files.
When you read about the available formats for Databricks notebooks, you will see that you have two options:
1. Source
2. IPYNB
The first option, Source, saves the notebook as a .py file.
In other words, you can still use notebooks if you want. They should work because you can save them as a .py file. The IPYNB format should also work, but if support for it is ever dropped, you can easily convert your notebooks to the Source format.
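To illustrate, a notebook exported in Source format is just a .py file with special comment markers, so the same pipeline code works either way. A sketch (the table and source names are made up for illustration):

```python
# Databricks notebook source
import dlt

# COMMAND ----------

@dlt.table(comment="Illustrative table built from a sample dataset")
def trips():
    return spark.read.table("samples.nyctaxi.trips")
```

The `# Databricks notebook source` header on the first line and the `# COMMAND ----------` cell separators are all that distinguish it from a plain .py file.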
yesterday
Thank you @szymon_dybczak, that gives me the clarification I needed.