09-11-2021 02:31 AM
If I have a dict created in Python in a Scala notebook (using the %python magic command, of course):
%python
d1 = {1: "a", 2:"b", 3:"c"}
Can I access this d1 in Scala?
I tried the following, but it fails with an error saying d1 is not found:
%scala
println(d1)
- Labels: Python, Scala, Scala notebook
Accepted Solutions
09-11-2021 03:44 AM
Variables are not shared between language contexts.
An ugly workaround: you can pass your Python variable between languages through the Spark conf:
%python
d1 = {1: "a", 2:"b", 3:"c"}
spark.conf.set('d1', str(d1))
%scala
println( spark.conf.get("d1") )
However, you lose the variable's type; you can only pass strings.
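Building on that workaround: since spark.conf only carries strings, one way to keep the dict's structure intact is to serialize it to JSON on the Python side and parse it on the receiving side. A minimal sketch of the Python half (the spark.conf call is commented out since it only works inside a notebook; note that JSON turns the integer keys into strings, so they must be converted back after parsing):

```python
import json

d1 = {1: "a", 2: "b", 3: "c"}

# Serialize to a JSON string; integer keys become strings ("1", "2", "3").
payload = json.dumps(d1)

# In a Databricks notebook you would now hand the string to the Spark conf:
# spark.conf.set("d1", payload)

# On the receiving side (simulated here in the same process), parse the
# string back and restore the integer keys.
restored = {int(k): v for k, v in json.loads(payload).items()}
assert restored == d1
```

On the Scala side you would read the string with spark.conf.get("d1") and parse it with a JSON library of your choice; either way, what travels through the conf is still just a string.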
09-11-2021 07:38 AM
Martin is correct. You can only share external files and objects between languages, not in-memory variables. In most of our cases, we just use temporary views to pass data between R and Python.
https://docs.databricks.com/notebooks/notebooks-use.html#mix-languages

