03-07-2023 06:00 AM
I followed the documentation here under the section "Import a file into a notebook" to import a Python file shared among the notebooks used by a Delta Live Tables pipeline. Sometimes the module is found, and sometimes the import fails with the exception No module named '***'.
I wonder if this is a bug on Databricks.
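For context, the setup looks roughly like this; a minimal sketch, with the file and function names made up for illustration:

# shared_utils.py sits in the same Repo folder as the DLT notebook (names are illustrative)
# In the DLT notebook:
import dlt
from shared_utils import clean_column_names  # sometimes works, sometimes raises ModuleNotFoundError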
03-07-2023 10:40 PM
Hi,
Whenever you get the error, could you please try to list the file from DBFS in the notebook? Also, a screenshot of the error and the output would be helpful.
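For example, a minimal sketch, assuming the shared file lives under dbfs:/FileStore/shared (adjust the path to yours):

# List the directory that should contain the shared file (the path is an assumption)
display(dbutils.fs.ls("dbfs:/FileStore/shared"))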
Please let us know if this helps.
Also, please tag @Debayan in your next response so that I'm notified. Thank you!
03-31-2023 01:37 AM
Hey there @Jennifer MJ
Hope everything is going great.
Just wanted to check in to see whether you were able to resolve your issue or whether you need more help. We'd love to hear from you.
Thanks!
03-31-2023 01:41 AM
It is still a problem when using DLT. But I put my functions in another notebook and scheduled a job with multiple tasks instead of using DLT, and that works. I guess it might be an issue with DLT.
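A minimal sketch of that workaround, assuming the functions live in a notebook named shared_functions in the same folder (the name is an assumption). In the consuming notebook, a cell containing only the line below pulls in everything defined there:

%run ./shared_functions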
03-31-2023 01:49 AM
Thank you so much for getting back to us @Jennifer MJ. It's really great of you to share the solution. Would you be happy to mark the answer as best so other community members can find the solution quickly and easily?
We really appreciate your time.
Wish you a great Databricks journey ahead!
03-08-2024 01:19 PM
java.lang.RuntimeException: Failed to execute python command for notebook '/Repos/cflowers@trend.community/treNLP/src/pipeline' with id RunnableCommandId(5385449474169222390) and error AnsiResult(
---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
File <command--1>:5
      3 import dlt
      4 from pyspark.sql import SparkSession
----> 5 from clients.pipeline_client import DatabricksPipelineClient
      6 from clients.TreNLPModel import MLFlowModel
      7 from clients.databricks_client import DatabricksHttpClient

File /databricks/python_shell/dbruntime/PythonPackageImportsInstrumentation/__init__.py:171, in _create_import_patch.<locals>.import_patch(name, globals, locals, fromlist, level)
    166 thread_local._nest_level += 1
    168 try:
    169     # Import the desired module. If you're seeing this while debugging a failed import,
    170     # look at preceding stack frames for relevant error information.
--> 171     original_result = python_builtin_import(name, globals, locals, fromlist, level)
    173 is_root_import = thread_local._nest_level == 1
    174 # `level` represents the number of leading dots in a relative import statement.
    175 # If it's zero, then this is an absolute import.

ModuleNotFoundError: No module named 'clients',None,Map(),Map(),List(),List(),Map())
I actually don't think the accepted solution should have been accepted, since the problem involving DLT was not addressed.
I'm having a similar issue while trying to import from repos in a DLT pipeline.
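One workaround worth trying is to append the repo's source folder to sys.path before the import; a minimal sketch, assuming the modules live under /Workspace/Repos/cflowers@trend.community/treNLP/src (the exact path is an assumption):

# Make the repo's source folder importable inside the DLT pipeline (path is an assumption)
import sys
sys.path.append("/Workspace/Repos/cflowers@trend.community/treNLP/src")

from clients.pipeline_client import DatabricksPipelineClient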