Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Caveats when importing functions from REPO stored .py files

Craig_
New Contributor III

The ability to import .py files into notebooks looked like a clean and easy way to reuse code and to ensure all notebooks use the same version of that code. However, two items remain unclear after scouring the documentation and forums.

Are these the right solutions to these problems, or should we revert to %run and notebooks instead of .py files? Thanks!

Code within the .py file does not have access to the spark session by default.

Outcome: NameError: name 'spark' is not defined

Solution: add the following to the .py file:

from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()

Are there any implications to this?

Do the notebook code and the .py code share the same session, or does this create two separate sessions?
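On the session question: SparkSession.builder.getOrCreate() returns the already-active session when one exists, so the notebook and the .py module end up sharing one session rather than creating two. A rough stand-in (not real pyspark) illustrating that singleton behavior:

```python
class FakeSessionBuilder:
    """Stand-in mimicking SparkSession.builder.getOrCreate() semantics."""
    _active = None

    @classmethod
    def getOrCreate(cls):
        if cls._active is None:       # first call creates the session
            cls._active = object()
        return cls._active            # later calls hand back the same one

notebook_session = FakeSessionBuilder.getOrCreate()  # "notebook" startup
module_session = FakeSessionBuilder.getOrCreate()    # the .py file's call
assert notebook_session is module_session            # one shared session
```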

display() and displayHTML() functions are not available to the .py code by default

Outcome: NameError: name 'displayHTML' is not defined when displayHTML() is called from within the .py file

Solution: add the following to the .py file and use display(HTML()) instead of displayHTML():

from IPython.display import display, HTML  # the older IPython.core.display path is deprecated
display(HTML("your content"))

Is there a better way to get displayHTML() working inside the .py file?

What about all of the other databricks specific functions?
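For displayHTML() specifically, one workaround in the same spirit as passing spark is to pass the notebook's displayHTML callable into the module. The module and function names below are hypothetical:

```python
# report_utils.py: hypothetical shared module stored in the Repo.
# Accept the notebook's displayHTML (or any callable) as a parameter
# instead of relying on a notebook-only global.
def render_heading(display_fn, title):
    display_fn(f"<h1>{title}</h1>")

# In the notebook: render_heading(displayHTML, "Daily load report")
```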

1 ACCEPTED SOLUTION

Accepted Solutions

Hubert-Dudek
Esteemed Contributor III
  • "does not have access to the spark session by default" — yes, that is correct. You need to pass a reference to the spark variable into the class or function, for example by calling function_from_file(spark=spark) from the notebook.
  • displayHTML() is designed to work from a notebook.
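A minimal sketch of that injection pattern; the module name, function body, and table name are hypothetical, with a stand-in query for illustration:

```python
# shared_utils.py: a hypothetical module stored in the Repo.
# The notebook's SparkSession is passed in rather than created here,
# so the module runs in the caller's session by construction.
def function_from_file(spark, table="events"):
    return spark.table(table)
```

Called from a notebook as function_from_file(spark=spark).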


4 REPLIES

Debayan
Esteemed Contributor III

Hi, in Databricks the display function comes from PythonShell:

import PythonShell as pi

pi.PythonShell.display(temps)

This works in Databricks, but it is not whitelisted for outside use.

Craig_
New Contributor III

This didn't work for me.

import PythonShell as pi

results in "ImportWarning: PythonShell.py is a launch script, which shall not be imported directly. You might want to import PythonShellImpl instead."

and then

pi.PythonShell.display("test")

results in "AttributeError: 'str' object has no attribute 'display'"

I tried different variations of import PythonShellImpl which didn't work either.


Craig_
New Contributor III

Thanks Hubert!

Passing the spark variable is a great idea.

The display(HTML()) solution is working well. Thanks for confirming displayHTML() isn't designed to be used outside of notebooks.