06-07-2025 03:02 PM
Hi everyone, I'm running into a particularly frustrating issue whenever I try to run some Python code deployed with an asset bundle on my workspace. The code and notebooks deploy fine, but once deployed, the code files get converted to notebooks and I'm no longer able to import the module from the code file in my notebook. Databricks thinks the code file is a notebook for some reason, so everything breaks. Does anyone know a workaround for this issue? #asset bundles #python code files #deployment issues
06-09-2025 06:08 AM - edited 06-09-2025 06:10 AM
Hi @briancuster63,
From what I understand, Databricks may treat .py files as notebooks if they include notebook-style headers like # Databricks notebook source or if the upload format is set incorrectly.
Here's what you can try: open the deployed file in the workspace and check whether that header was added during deployment, and verify that the bundle is uploading the file as a workspace file rather than importing it as a notebook.
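One way to confirm how the file actually landed in the workspace is to check its object type with the Databricks Python SDK. This is just a sketch, and the path below is a placeholder for wherever your bundle deploys its files:

```python
# Sketch only: check whether the deployed file is a workspace FILE or a NOTEBOOK.
# The path is a placeholder -- replace it with your bundle's deployed file path.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from the notebook/cluster context or environment
info = w.workspace.get_status(
    "/Users/<you>/.bundle/<bundle_name>/dev/files/holidays/main.py"
)
print(info.object_type)  # ObjectType.FILE is what you want; NOTEBOOK reproduces the error
```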
06-09-2025 06:54 AM
Thanks for your reply. The .py code file in question contains no headers or other information in it; it is a standard Python code file. I am adjusting the path prior to executing the code on Databricks so it knows where to get the file. The exact error I'm getting is this:
NotebookImportException: Unable to import holidays.main. Importing notebooks directly is not supported. Use dbutils.import_notebook("holidays.main") instead.
By the way, I tried using the dbutils method and it didn't work either. I've never run into this before, but I'm using a bundle this time. The bundle must be indicating that the file is a notebook somehow; I just haven't figured out where in the configuration it specifies that the .py file is a notebook.
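To be concrete, even a minimal repro like this hits the same error once the file has been turned into a notebook (the deployed path below is just a placeholder for where the bundle puts its files):

```python
# Minimal repro sketch: point sys.path at the bundle's deployed files folder and
# import the module as plain Python. The workspace path is a placeholder.
import sys

sys.path.append("/Workspace/Users/<you>/.bundle/<bundle_name>/dev/files")

from holidays import main  # raises NotebookImportException when main.py was imported as a notebook
```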
06-10-2025 02:08 AM
It sounds like the issue is related to how your .py files are being interpreted during the bundle deployment process. If your databricks.yml configuration or deployment pipeline doesn't explicitly define the .py files as raw Python files (by including them as artifacts or resources with the correct type), Databricks may mistakenly treat them as notebooks.
To prevent this, make sure your .py files are listed under the appropriate section in your databricks.yml.
You can find detailed syntax and examples in the Databricks Asset Bundle documentation.
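As a rough sketch only (the exact schema depends on your CLI version, so treat the names and paths here as placeholders and check the docs), the relevant sections look something like this:

```yaml
# databricks.yml -- illustrative sketch; bundle name, paths, and artifact key are placeholders.
bundle:
  name: holidays_bundle

# Ensure the plain .py sources are included in what gets synced to the workspace.
sync:
  include:
    - "src/**"

# Or package the module as a wheel artifact so it is installed as a library
# instead of being imported from workspace files.
artifacts:
  holidays_wheel:
    type: whl
    build: python -m build --wheel
    path: .
```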
06-30-2025 12:11 PM
I came here looking for a solution to the opposite problem: I was hoping to have my .py files available as notebooks (without adding extra headers). Unfortunately, this does not seem to be possible with DABs.
@facebiranhari if you have not solved your problem, you can also take a look at Laktory (www.laktory.ai). It gives you full control over how you deploy files (regardless of their filename extension), as it uses Terraform to deploy each file. It also lets you deploy any Databricks resource, like warehouses, schemas, secrets, metastores, etc.