06-15-2024 11:36 PM - edited 06-15-2024 11:38 PM
Could you please provide guidance on the correct way to dynamically import a Python module from a user-specific path in Databricks Repos? Any advice on resolving the ModuleNotFoundError would be greatly appreciated.
udf_check_table_exists notebook:
Thank you for your assistance.
Accepted Solutions
06-17-2024 09:03 AM
It works for me now 🙂 after following this solution:
[Errno 95] Operation not supported · Issue #113823 · MicrosoftDocs/azure-docs · GitHub
By creating a "file" instead of a "notebook" and moving the code from the notebook into that file, I was able to use the "import" statement.
my import code:
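The actual import code isn't shown in the post, but a dynamic import of a .py file from a user-specific path (as in the original question) can be sketched with the standard-library importlib; the helper name and the example path below are hypothetical, not from the thread:

```python
import importlib.util
import sys

def import_module_from_path(module_name, file_path):
    """Dynamically load a .py file as a module from an arbitrary path."""
    spec = importlib.util.spec_from_file_location(module_name, file_path)
    module = importlib.util.module_from_spec(spec)
    sys.modules[module_name] = module  # register so nested imports resolve
    spec.loader.exec_module(module)
    return module

# Hypothetical user-specific path; adjust to your own workspace layout:
# mod = import_module_from_path(
#     "udf_check_table_exists",
#     "/Workspace/Repos/<user>/<repo>/udf_check_table_exists.py",
# )
# mod.table_exists(...)
```

Note that this only works when the target is a workspace *file*, not a notebook, which is exactly the distinction the linked issue describes.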
06-16-2024 10:03 PM
Hi,
There are two ways to import functions from another notebook:
- %run ../notebook_path : This command runs the entire notebook, so the function is imported along with all its variable names. [This approach should ideally be used only when the other notebook contains nothing but function definitions.]
- The second method works in Repos: in Repos you can easily import static .py files with a plain import statement:
from folder_name import function
Refer to this documentation for a more detailed answer: https://www.databricks.com/blog/2021/10/07/databricks-repos-is-now-generally-available.html
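The second (Repos) method above can be sketched as follows; the repo layout and the body of table_exists are assumptions for illustration, since the thread doesn't show them (the body assumes PySpark's Catalog.tableExists API):

```python
# Hypothetical repo layout for the static-import method:
#
#   my_repo/
#   ├── utils/
#   │   ├── __init__.py
#   │   └── helpers.py      <- defines table_exists
#   └── my_notebook         <- does: from utils.helpers import table_exists

def table_exists(spark, table_name: str) -> bool:
    """Return True if table_name is registered in the Spark catalog.

    Assumed implementation; relies on PySpark's Catalog.tableExists.
    """
    return spark.catalog.tableExists(table_name)
```

Because a Repo's root is on sys.path automatically, the notebook can then import with `from utils.helpers import table_exists` and no path manipulation.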
Data engineer at Rsystema
06-17-2024 01:23 AM
Thanks @Hkesharwani for your reply.
Since DLT doesn't support the magic command %run, I'm trying the import approach instead.
My layout looks like this:
Other notebook:
06-17-2024 04:28 AM - edited 06-17-2024 04:29 AM
Hey @tramtran,
I didn't try it with Repos, but it works the following way with a Workspace path (we do it like this ourselves, so I validated it for you). Since Repos also live inside /Workspace, I expect it works the same way.
Let's use the default /Workspace/Shared folder for this example.
1. Add the .py file with your table_exists function to the /Workspace/Shared folder. Let's call the file function_file.py for this example.
2. Create an __init__.py file in this /Workspace/Shared directory as well, so Python treats the directory as a package.
3. In the notebook where you want to import the function from the .py file, add this:
import sys
sys.path.append("/Workspace/Shared")
then import the function like this:
from function_file import table_exists
Hope this helps, good luck!
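The three steps above can be sketched end to end as follows; the body of table_exists is an assumption (the thread doesn't show it), and the file names match the example:

```python
import os
import sys

def write_module(shared_dir: str) -> None:
    """Steps 1 and 2: create function_file.py plus an empty __init__.py."""
    os.makedirs(shared_dir, exist_ok=True)
    with open(os.path.join(shared_dir, "function_file.py"), "w") as f:
        # Assumed stand-in body; the real table_exists isn't shown here.
        f.write(
            "def table_exists(spark, name):\n"
            "    return spark.catalog.tableExists(name)\n"
        )
    open(os.path.join(shared_dir, "__init__.py"), "w").close()

def import_table_exists(shared_dir: str):
    """Step 3: put the folder on sys.path, then import as usual."""
    if shared_dir not in sys.path:
        sys.path.append(shared_dir)
    from function_file import table_exists
    return table_exists
```

On Databricks you would skip write_module (the files already live in /Workspace/Shared) and only run the sys.path.append plus the import.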
06-17-2024 05:18 AM
Thanks @jacovangelder,
I followed your suggestion, but another error appeared:
OSError: [Errno 95] Operation not supported: '/Workspace/Shared/function_file.py'.
Have you faced this issue before?
06-17-2024 05:46 AM
I haven't. Have you followed exactly the steps outlined? It looks like perhaps you're not allowed to read from the shared location, but I'm not 100% sure.
06-17-2024 09:07 AM
Thank you all again

