Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Asset Bundle .py files being converted to notebooks when deployed to Databricks

briancuster63
New Contributor II

Hi everyone, I'm running into a particularly frustrating issue whenever I try to run Python code from an asset bundle in my workspace. The code and notebooks deploy fine, but once deployed, the code files get converted to notebooks and I can no longer import the module from the code file in my notebook. Databricks thinks the code file is a notebook for some reason, so everything breaks. Does anyone know a workaround for this issue? #asset bundles #python code files #deployment issues

3 REPLIES

SP_6721
Contributor

Hi @briancuster63 ,

From what I understand, Databricks may treat .py files as notebooks if they include notebook-style headers like # Databricks notebook source or if the upload format is set incorrectly.
Here's what you can try:

  • Remove any headers like # Databricks notebook source from your .py files; that's often what triggers the conversion.
  • Ensure the files are in a folder that won't get auto-converted to notebooks during deployment.
  • After deployment, in your notebook, add the bundle's code path to sys.path.
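The last step above can be sketched as follows. This is a minimal example, assuming a deployed bundle layout; the `bundle_src` path and the `holidays.main` import are placeholders you would adjust to your own workspace:

```python
import sys

# Hypothetical path: replace with wherever your bundle actually deploys
# its source files (typically visible under a .bundle folder in the workspace).
bundle_src = "/Workspace/Users/me@example.com/.bundle/my_bundle/dev/files/src"

# Prepend so modules in the bundle take precedence over same-named ones.
if bundle_src not in sys.path:
    sys.path.insert(0, bundle_src)

# After this, a plain .py module deployed by the bundle imports normally, e.g.:
# from holidays.main import build_calendar  # hypothetical module/function
```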

briancuster63
New Contributor II

Thanks for your reply. The .py code file in question contains no headers or other information; it's a standard Python code file. I'm adjusting the path before executing the code on Databricks so it knows where to find the file. The exact error I'm getting is this:
NotebookImportException: Unable to import holidays.main. Importing notebooks directly is not supported. Use dbutils.import_notebook("holidays.main") instead.

By the way, I tried using the dbutils method and it didn't work either. I've never run into this before, but I'm using a bundle this time. The bundle must be marking the file as a notebook somehow; I just haven't figured out where in the configuration it specifies that the .py file is a notebook.
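One quick way to confirm whether the deployed copy of the file (rather than the local one) picked up the notebook marker is to read its first line. A minimal sketch, assuming the deployed file is readable from a `/Workspace` path:

```python
NOTEBOOK_MARKER = "# Databricks notebook source"

def looks_like_notebook_source(path: str) -> bool:
    """Return True if the file's first line is the Databricks notebook
    marker, which is what makes the workspace treat a .py file as a
    notebook on import."""
    with open(path, encoding="utf-8") as f:
        return f.readline().rstrip("\n") == NOTEBOOK_MARKER
```

Running this against the deployed file (not the local source) tells you whether the marker was added somewhere in the deployment pipeline.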

It sounds like the issue is related to how your .py files are being interpreted during the bundle deployment process. If your databricks.yml configuration or deployment pipeline doesn't explicitly define the .py files as raw Python files (by including them as artifacts or resources with the correct type), Databricks may mistakenly treat them as notebooks.
To prevent this, make sure your .py files are listed under the appropriate section in your databricks.yml.
You can find detailed syntax and examples in the Databricks Asset Bundle documentation.
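As a hedged sketch of what that might look like (the bundle name, folder layout, and host are placeholders, and the exact keys should be verified against the Asset Bundle documentation), a databricks.yml that explicitly syncs a source folder could resemble:

```yaml
# Illustrative databricks.yml fragment, not a verified fix.
bundle:
  name: my_bundle  # hypothetical bundle name

sync:
  include:
    - src/**  # deploy plain-Python modules under src/ alongside notebooks

targets:
  dev:
    workspace:
      host: https://<your-workspace>.cloud.databricks.com  # placeholder
```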
