13 hours ago
Hi Team,
While creating a Declarative ETL pipeline in Databricks, I tried to configure a notebook using the "Add existing assets" option by providing the notebook path. However, I received a warning message:
"Legacy configuration detected. Use files instead of notebooks for an optimal experience."
My question is:
Will Declarative Pipelines continue to support notebooks in the future? As a developer, I primarily use notebooks to build pipelines. If notebooks are going to be deprecated or unsupported, I would prefer to start development using .py files instead.
Please help me understand the meaning of this warning message.
11 hours ago
Hello @karuppusamy!
That warning appears because the new editor is optimized for file-based workflows rather than notebooks. Notebooks still work, but they don't behave like regular interactive notebooks. For example, you can't run cells individually once they're part of a pipeline. Splitting your code into files gives a smoother experience and makes your pipeline code easier to organize and maintain.
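To make the file-based workflow concrete, here's a rough sketch of what a pipeline definition looks like as a plain .py file. This assumes the `dlt` Python module that Databricks pipelines provide at runtime; the table names, the sample source table, and the filter column are all made up for illustration, so treat this as a shape, not a recipe:

```python
# Sketch of a file-based declarative pipeline definition (.py file).
# Assumes the Databricks pipeline runtime, which provides `dlt` and `spark`.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested from a sample source table")
def orders_raw():
    # Hypothetical source table; replace with your own input.
    return spark.read.table("samples.tpch.orders")

@dlt.table(comment="Orders with null statuses filtered out")
def orders_clean():
    # Read from the upstream table defined in this same pipeline.
    return (
        dlt.read("orders_raw")
        .filter(F.col("o_orderstatus").isNotNull())
    )
```

Because each table is just a decorated function in a regular .py file, you can split a large pipeline across several files and organize them like any other Python project.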
10 hours ago - last edited 10 hours ago
Hi @karuppusamy ,
The Declarative Pipelines documentation says the two recommended default formats are .sql and .py files.
If you look at the available formats for Databricks notebooks, you'll see there are two options:
1. Source
2. IPYNB
The first option, Source, saves the notebook as a .py file.
In other words, you can still use notebooks if you want. They should work because you can save them as .py files. The IPYNB format should also work, but if for some reason support for it is ever dropped, you can always easily convert a notebook to the Source format.
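To illustrate the point above: a notebook exported in Source format is just a plain .py file with special comment markers, so it satisfies the ".py file" recommendation. This is a sketch assuming the Databricks `dlt` module; the table name is made up:

```python
# Databricks notebook source
# ^ This first-line marker is what identifies the .py file as a
#   Source-format notebook; cells are separated by COMMAND markers.
import dlt

# COMMAND ----------

@dlt.table(comment="Illustrative table definition inside a notebook cell")
def my_table():
    # `spark` is provided by the Databricks runtime.
    return spark.range(10)
```

Strip the marker comments and you're left with an ordinary .py pipeline file, which is why converting between the two is straightforward.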
11 hours ago
Thank you @Advika! Just to clarify: can I continue using notebooks for development? Will they be supported and work fine in the long term without running into legacy issues?
10 hours ago
Thank you @szymon_dybczak, that clears things up for me.