Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Import Python file into notebook doesn't work

Jennifer
New Contributor III

I followed the documentation here, under the section "Import a file into a notebook", to import a Python file shared among the notebooks used by Delta Live Tables. But it sometimes finds the module and sometimes doesn't, returning the exception No module named '***'.

I wonder if this is a bug in Databricks.
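For context, the pattern from that documentation section looks roughly like this; a minimal sketch, assuming a shared file named helpers.py in the same workspace folder as the notebook (the file and module names here are hypothetical):

import os
import sys

# Make the folder containing helpers.py importable. For workspace files the
# notebook's own folder is usually on sys.path already; appending it
# explicitly is the documented fallback for files in other folders.
sys.path.append(os.path.abspath("."))

import helpers  # this is the import that intermittently fails under DLT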

1 ACCEPTED SOLUTION


Jennifer
New Contributor III

It is still a problem when using DLT. But I put my functions in another notebook and scheduled a job with multiple tasks, not using DLT anymore, and then it works. I guess it might be an issue with DLT.
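As a sketch of that workaround (the notebook name shared_functions is hypothetical), the consuming notebooks pull the shared functions in with %run instead of a Python import:

# First cell of each consuming notebook. %run executes the other notebook
# in this notebook's context, so its functions become available here.
%run ./shared_functions

Each task in the multi-task job then points at one of the consuming notebooks.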


5 REPLIES

Debayan
Databricks Employee

Hi,

Whenever you get the error, could you please try to list the file from DBFS in the notebook? A screenshot of the error and the output would also be helpful.
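For example, a quick check from a notebook cell might look like this (the path is a placeholder; point it at wherever the shared file actually lives):

import sys

# Confirm the folder holding the module is actually on the import path.
print(sys.path)

# List the suspect location. dbutils.fs.ls works for DBFS paths; use
# %sh ls or os.listdir for workspace and repo paths.
display(dbutils.fs.ls("dbfs:/FileStore/"))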

Please let us know if this helps. 

Also, please tag @Debayan in your next response, which will notify me. Thank you!

Vartika
Databricks Employee

Hey there @Jennifer MJ

Hope everything is going great.

Just wanted to check in: were you able to resolve your issue, or do you need more help? We'd love to hear from you.

Thanks!


Vartika
Databricks Employee

Thank you so much for getting back to us, @Jennifer MJ. It's really great of you to send in the solution. Would you be happy to mark the answer as best so other community members can find the solution quickly and easily?

We really appreciate your time.

Wish you a great Databricks journey ahead!

java.lang.RuntimeException: Failed to execute python command for notebook '/Repos/cflowers@trend.community/treNLP/src/pipeline' with id RunnableCommandId(5385449474169222390) and error AnsiResult(---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
File <command--1>:5
      3 import dlt
      4 from pyspark.sql import SparkSession
----> 5 from clients.pipeline_client import DatabricksPipelineClient
      6 from clients.TreNLPModel import MLFlowModel
      7 from clients.databricks_client import DatabricksHttpClient

File /databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py:171, in _create_import_patch.<locals>.import_patch(name, globals, locals, fromlist, level)
    166 thread_local._nest_level += 1
    168 try:
    169     # Import the desired module. If you’re seeing this while debugging a failed import,
    170     # look at preceding stack frames for relevant error information.
--> 171     original_result = python_builtin_import(name, globals, locals, fromlist, level)
    173     is_root_import = thread_local._nest_level == 1
    174     # `level` represents the number of leading dots in a relative import statement.
    175     # If it's zero, then this is an absolute import.

ModuleNotFoundError: No module named 'clients',None,Map(),Map(),List(),List(),Map())

I actually don't think the accepted solution should be accepted, as the problem involving DLT was not addressed.

I'm having a similar issue while trying to import from Repos in a DLT pipeline.
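In case it is useful to others hitting this, one commonly suggested workaround for DLT (a sketch only; the repo path below is an assumption based on the traceback, so substitute your own layout) is to put the repo's source folder on sys.path before the package imports:

import sys

# DLT does not necessarily start the notebook with the repo root as its
# working directory, so packages like 'clients' are not importable by default.
# Appending the source folder explicitly makes the import resolvable.
sys.path.append("/Workspace/Repos/<user>/treNLP/src")

from clients.pipeline_client import DatabricksPipelineClient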
