Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

In databricks deployment .py files getting converted to notebooks

amit_jbs
New Contributor II

A critical issue has arisen that is impacting our deployment planning for our client. We have encountered a challenge with our Azure CI/CD pipeline integration, specifically concerning the deployment of Python files (.py): despite our best efforts, when these files are deployed through the pipeline they are converted into regular notebooks. As a result, we are forced to manually recreate the .py files and copy-paste the code in order to run them.

We have extensively searched for a solution to this issue across various resources, but unfortunately, we have not found clear and comprehensive documentation that provides a step-by-step guide to address this specific situation.

If someone can provide guidance or a solution it would be greatly appreciated.

5 REPLIES

-werners-
Esteemed Contributor III

What does your pipeline look like? We propagate notebooks using Azure DevOps Repos with PRs and merges; that way, files do not get converted.
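For illustration, here is a minimal sketch of what that Repos-based promotion can look like from a pipeline step, assuming the project is already linked to the workspace as a Databricks Repo. The repo path and branch name are placeholders, not details from this thread:

```python
# Rough sketch (not the exact pipeline from this thread): after a PR merges,
# point an existing Databricks Repo at the merged branch via the Repos API.
# Files inside a Repo keep their real .py extension, so nothing is converted.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]      # e.g. https://adb-....azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]
HEADERS = {"Authorization": f"Bearer {TOKEN}"}
REPO_PATH = "/Repos/deploy/my-project"    # hypothetical workspace repo path

# Look up the repo ID by its workspace path.
resp = requests.get(f"{HOST}/api/2.0/repos", headers=HEADERS,
                    params={"path_prefix": REPO_PATH}, timeout=30)
resp.raise_for_status()
repo_id = resp.json()["repos"][0]["id"]

# Pull the workspace repo to the head of the release branch.
resp = requests.patch(f"{HOST}/api/2.0/repos/{repo_id}", headers=HEADERS,
                      json={"branch": "main"}, timeout=30)
resp.raise_for_status()
print(f"Repo {REPO_PATH} updated to branch 'main'")
```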

Dazza
New Contributor II

We are experiencing a similar issue that we are looking to resolve, except our files are .sql. We have a process with one orchestration notebook calling multiple .sql files. These .sql files are being converted to regular Databricks notebooks when deploying, so right now we have to promote them to production manually, which is not ideal. Any insight/support on this same issue would be appreciated.

amarantevitor94
New Contributor II

Did you manage to find a solution? If so, could you share it?

I have the same problem, and if I find the solution, I'll share it here.

AGivenUser
New Contributor II

We had the same problem, but you can deploy the Python files as 'RAW' file types with the Databricks CLI.
Ref on usage from ADO Pipelines: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/auth-with-azure-devops
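To make the 'RAW' suggestion concrete, here is a minimal sketch of the underlying Workspace Import API call that a pipeline step could make; the file paths are placeholders, and the exact format values should be checked against the docs for your CLI/API version:

```python
# Rough sketch: upload a local .py file to the workspace as a plain file rather
# than a notebook by importing it with format="RAW" (formats such as SOURCE or
# AUTO are what typically turn .py sources into notebooks). Paths are placeholders.
import base64
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
TOKEN = os.environ["DATABRICKS_TOKEN"]

LOCAL_FILE = "src/etl_job.py"                          # hypothetical file in the repo
TARGET = "/Workspace/Shared/deployments/etl_job.py"    # hypothetical workspace path

with open(LOCAL_FILE, "rb") as f:
    payload = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"{HOST}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "path": TARGET,
        "format": "RAW",       # keep the bytes as-is instead of creating a notebook
        "content": payload,
        "overwrite": True,
    },
    timeout=30,
)
resp.raise_for_status()
print(f"Imported {LOCAL_FILE} -> {TARGET}")
```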

AGivenUser
New Contributor II

Another option is Databricks Asset Bundles.
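For anyone exploring that route, below is a minimal databricks.yml sketch; the bundle name, paths, cluster settings, and workspace host are all placeholders, not taken from this thread. A .py file referenced by a spark_python_task is synced by the bundle as a file rather than a notebook.

```yaml
# Minimal sketch of a Databricks Asset Bundle (databricks.yml); all names,
# paths, and the workspace host are placeholders.
bundle:
  name: my_project

resources:
  jobs:
    etl_job:
      name: etl_job
      tasks:
        - task_key: run_etl
          spark_python_task:
            python_file: ./src/etl_job.py   # stays a .py file when the bundle is deployed
          new_cluster:
            spark_version: 13.3.x-scala2.12
            node_type_id: Standard_DS3_v2
            num_workers: 1

targets:
  prod:
    workspace:
      host: https://adb-1234567890123456.7.azuredatabricks.net
```

Deployment from the pipeline is then `databricks bundle validate` followed by `databricks bundle deploy -t prod`.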
