Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Getting a warning message in Declarative Pipelines

karuppusamy
Visitor

Hi Team,

While creating a Declarative ETL pipeline in Databricks, I tried to configure a notebook using the "Add existing assets" option by providing the notebook path. However, I received a warning message:
"Legacy configuration detected. Use files instead of notebooks for an optimal experience."

My question is:
Will Declarative Pipelines continue to support notebooks in the future? As a developer, I primarily use notebooks to build pipelines. If notebooks are going to be deprecated or unsupported, I would prefer to start development using .py files instead.
Please help me understand the meaning of this warning message.

2 ACCEPTED SOLUTIONS (shown in the replies below)

4 REPLIES

Advika
Databricks Employee

Hello @karuppusamy!

That warning appears because the new editor is optimized for file-based workflows rather than notebooks. Notebooks still work, but they don't behave like regular interactive notebooks; for example, you can't run cells individually once they're part of a pipeline. Splitting your code into files gives a smoother experience and makes your pipeline code easier to organize and maintain.
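To make the file-based approach concrete, here is a minimal sketch (all file and table names are hypothetical) of what a .py pipeline source file looks like: each table is just a decorated function. The real `dlt` module is only importable inside a Databricks pipeline run, so this sketch defines a tiny stand-in decorator purely to make the shape readable outside that environment; in a real pipeline you would replace it with `import dlt`.

```python
# Hypothetical file: transformations/orders.py
# In a real Databricks pipeline you would start with `import dlt`;
# the pipeline runtime provides that module. Outside Databricks the
# stand-in below only exists so this sketch can be read and run.

class _DltStandIn:
    """Illustration-only stand-in for the real `dlt` module."""
    @staticmethod
    def table(**kwargs):
        # The real dlt.table registers the function as a pipeline table;
        # this stand-in just returns the function unchanged.
        def wrap(func):
            return func
        return wrap

dlt = _DltStandIn()  # in a real pipeline: `import dlt`

@dlt.table(comment="hypothetical cleaned orders table")
def orders_clean():
    # In a real pipeline this body would return a Spark DataFrame, e.g.:
    #   return spark.read.table("raw.orders").where("amount > 0")
    return [{"order_id": 1, "amount": 42.0}]
```

Because each table lives in a plain .py file, the same code can sit in version control and be organized into folders like any other source tree.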

karuppusamy
Visitor

Thank you @Advika! Just to clarify: can I continue using notebooks for development? Will they be supported and work fine in the long term without running into legacy issues?

szymon_dybczak
Esteemed Contributor III

Hi @karuppusamy,

The documentation for declarative pipelines recommends two default source formats: .sql and .py files.

If you look at the available formats for Databricks notebooks, you will see that you have two options:

1. Source

2. IPYNB

The first option, Source, supports .py files.

In other words, you can still use notebooks if you want. They should keep working because you can save them as .py files, and the IPYNB format should also work. And if support for IPYNB is ever dropped, you can always easily convert a notebook to the Source format.
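The conversion szymon mentions is straightforward even without any Databricks tooling, because an IPYNB notebook is just JSON: extracting its code cells yields a plain .py source file. Below is a small self-contained sketch of that idea (the function name is illustrative, not part of any Databricks API).

```python
import json

def ipynb_to_source(ipynb_path: str, py_path: str) -> None:
    """Extract the code cells of a Jupyter/IPYNB notebook into a plain .py file.

    Markdown cells are skipped; code cells are separated by a blank line.
    """
    with open(ipynb_path, encoding="utf-8") as f:
        nb = json.load(f)

    # Each cell's "source" is a list of line strings in the nbformat schema.
    code_cells = [
        "".join(cell["source"])
        for cell in nb.get("cells", [])
        if cell.get("cell_type") == "code"
    ]

    with open(py_path, "w", encoding="utf-8") as f:
        f.write("\n\n".join(code_cells) + "\n")
```

Note that Databricks' own Source-format export also records cell boundaries as special comment markers, so for workspace notebooks the built-in export is the safer route; the sketch above is only the minimal version of the idea.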

karuppusamy
Visitor

Thank you @szymon_dybczak, that clears things up for me.
