01-23-2023 01:09 PM
The ability to import .py files into notebooks looked like a clean, easy way to reuse code and to ensure all notebooks use the same version of that code. However, two items remain unclear after scouring documentation and forums.
Are the workarounds below the right (or best) solutions to these problems, or should we revert to %run and notebooks instead of .py files? Thanks!
1. Code within the .py file does not have access to the Spark session by default.
Outcome: NameError: name 'spark' is not defined
Solution: add the following to the .py file:
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
Are there any implications to this? Do the notebook code and the .py code share the same session, or does this create a separate session?
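For what it's worth, here is a quick way to check the shared-session question yourself (a minimal sketch; session_check.py is a hypothetical file name). getOrCreate() reuses the already-active session when one exists, so inside a Databricks notebook the module should see the notebook's own session:

```python
# session_check.py — hypothetical module; import it from a notebook to test
from pyspark.sql import SparkSession

# getOrCreate() reuses the already-active session if one is running,
# so inside a Databricks notebook this should be the notebook's own session
spark = SparkSession.builder.getOrCreate()
print(spark is SparkSession.getActiveSession())  # expected: True
```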
2. The display() and displayHTML() functions are not available to the .py code by default.
Outcome: NameError: name 'displayHTML' is not defined when displayHTML() is called from within the .py file
Solution: add the following to the .py file and use display(HTML()) instead of displayHTML():
from IPython.display import display, HTML  # How to use: display(HTML("your content")); IPython.core.display also works but is deprecated in newer IPython
Is there a better way to get displayHTML() working inside the .py file?
What about all of the other Databricks-specific functions?
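For context, the workaround in a module looks roughly like this (a sketch; html_utils.py and show_banner are hypothetical names):

```python
# html_utils.py — hypothetical module name; sketch of the IPython workaround
from IPython.display import display, HTML

def show_banner(text: str) -> None:
    # Renders HTML in the importing notebook's cell output,
    # standing in for the notebook-only displayHTML()
    display(HTML(f"<h3>{text}</h3>"))
```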
Labels: Files In Repos, Py File
Accepted Solutions
01-24-2023 10:34 AM
- "does not have access to the spark session by default" yes, that is correct you need to pass the reference to the spark variable inside Class or Function, something like when you call from notebook function_from_file(spark=spark)
- displayHTML() is designed to be work from notebook
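A minimal sketch of that pattern (helpers.py, table_row_count, and my_table are hypothetical names):

```python
# helpers.py — hypothetical module; takes the SparkSession as a parameter
# rather than relying on a global `spark` inside the file
def table_row_count(spark, table_name: str) -> int:
    # Uses whatever session the caller (the notebook) hands in
    return spark.table(table_name).count()
```

From the notebook you would then call it as table_row_count(spark=spark, table_name="my_table").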
01-23-2023 10:53 PM
Hi, in Databricks the display function comes from PythonShell:
import PythonShell as pi
pi.PythonShell.display(temps)
This works inside Databricks, but it is not whitelisted for outside use.
01-24-2023 11:19 AM
This didn't work for me.
import PythonShell as pi
results in "ImportWarning: PythonShell.py is a launch script, which shall not be imported directly. You might want to import PythonShellImpl instead."
and then
pi.PythonShell.display("test")
results in "AttributeError: 'str' object has no attribute 'display'"
I tried different variations of importing PythonShellImpl, which didn't work either.
01-24-2023 11:06 AM
Thanks Hubert!
Passing the spark variable is a great idea.
The display(HTML()) solution is working well. Thanks for confirming displayHTML() isn't designed to be used outside of notebooks.