In Databricks deployment, .py files getting converted to notebooks

amit_jbs
New Contributor II

A critical issue has arisen that is impacting our deployment planning for our client. We have encountered a challenge with our Azure CI/CD pipeline integration, specifically concerning the deployment of Python files (.py). Despite our best efforts, when these files are deployed through the pipeline, they are converted into regular notebooks. As a result, we are forced to manually recreate the .py files and copy-paste the code in order to get them running.

We have extensively searched for a solution to this issue across various resources, but unfortunately, we have not found clear and comprehensive documentation that provides a step-by-step guide to address this specific situation.

If someone can provide guidance or a solution, it would be greatly appreciated.
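For reference, the behaviour seems to come down to the import format used by the deployment step: the Workspace Import API (which the databricks workspace import CLI command wraps) creates a notebook whenever the format is SOURCE, while format AUTO keeps a .py that lacks the "# Databricks notebook source" header as a plain workspace file. The legacy CLI's workspace import_dir command, as far as I know, always imports .py sources as notebooks, which would match this behaviour. Below is a minimal sketch of an import step that preserves files, assuming the pipeline calls the REST API directly; the host, token and both paths are placeholders:

# Minimal sketch: importing a .py file via the Workspace Import API.
# DATABRICKS_HOST, DATABRICKS_TOKEN and both paths are placeholders.
import base64
import os
import requests

host = os.environ["DATABRICKS_HOST"]            # e.g. https://adb-xxxx.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]

local_path = "src/jobs/etl_job.py"              # hypothetical file in the repo
workspace_path = "/Shared/deploy/etl_job.py"    # hypothetical target path

with open(local_path, "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": workspace_path,
        "content": content,
        "language": "PYTHON",
        "overwrite": True,
        # "SOURCE" always produces a notebook; "AUTO" keeps a .py without the
        # "# Databricks notebook source" header as a plain workspace file.
        "format": "AUTO",
    },
)
resp.raise_for_status()

Plain workspace files need a workspace and runtime that support them, so treat this as a sketch rather than a drop-in fix.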

2 REPLIES

-werners-
Esteemed Contributor III

What is your pipeline? We propagate notebooks using Azure DevOps Repos with PRs and merges. That way, files do not get converted.
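A rough sketch of what that Repos-based deployment step can look like, using the Repos API to pull the merged branch into the workspace Git folder; the host, token, repo id and branch name are placeholders:

# Minimal sketch: after the PR is merged, point the production Repo (Git folder)
# at the merged branch instead of importing individual files.
import os
import requests

host = os.environ["DATABRICKS_HOST"]
token = os.environ["DATABRICKS_TOKEN"]
repo_id = os.environ["DATABRICKS_REPO_ID"]   # numeric Repo id, e.g. from GET /api/2.0/repos

resp = requests.patch(
    f"{host}/api/2.0/repos/{repo_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={"branch": "main"},                 # placeholder release branch
)
resp.raise_for_status()

Because the Repo is a Git checkout, .py and .sql files keep their original format; only sources with the notebook header are rendered as notebooks.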

Dazza
New Contributor

Experiencing a similar issue that we are looking to resolve, except the files are .sql. We have a process with one orchestration notebook calling multiple .sql files. These .sql files are being converted to regular Databricks notebooks when deploying, so we are having to promote them to production manually right now, which is not ideal. Any insight/support on this same issue would be appreciated.
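If the .sql files are deployed as plain workspace files (for example inside a Repo) rather than notebooks, the orchestration notebook can read and run them directly. A rough sketch, assuming a runtime where /Workspace paths are readable from cluster code; the folder and file layout are placeholders:

# Minimal sketch: an orchestration notebook executing deployed .sql files.
# `spark` is the SparkSession provided by the Databricks notebook environment.
from pathlib import Path

sql_dir = Path("/Workspace/Shared/deploy/sql")   # hypothetical deployment folder

for sql_file in sorted(sql_dir.glob("*.sql")):
    statement = sql_file.read_text()
    # spark.sql runs a single statement; split on ";" first if a file holds several.
    print(f"Running {sql_file.name}")
    spark.sql(statement)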
