Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to access data objects from different languages [R/SQL/Spark/Python]

fs
New Contributor III

Hi, sorry, I'm new to Spark and Databricks. Could someone summarise the options for moving data between these different languages? I'm especially interested in R <=> Python options: I can see how to do SQL/Spark. I've spent a lot of time googling with no result. I presume I can use R's reticulate to access Python objects..?

Anyway, grateful for any idiot-proof links, quick guides, or code.

1 ACCEPTED SOLUTION

Accepted Solutions

If you're using R I highly recommend the sparklyr package from RStudio. Many of the pyspark functions have the same name; for example, "spark.read.table()" is "spark_read_table()" in sparklyr. More info here: https://spark.rstudio.com/packages/sparklyr/latest/reference/spark_read_table.html

View solution in original post

12 REPLIES

Pholo
Contributor

Hi, you can create a temporary view and then retrieve it from any programming language:

e.g. create it in SQL:

%sql
CREATE OR REPLACE TEMPORARY VIEW Test1 AS
 SELECT *
 FROM TEST

And then retrieve it in Python:

%python
spark.read.table('Test1')

fs
New Contributor III

Thanks. I took that approach: created the view and then queried it using SQL from R:

%r

rd=as.data.frame(sql("select * from CNTRY_FLOWS"))

...not sure if there's a more direct route. I was unsure what the R equivalent of Python's spark.read.table() was.
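For the record, SparkR (preloaded in Databricks R notebooks) does have a direct equivalent: tableToDF(). A minimal sketch, assuming the Test1 view from the SQL cell above exists:

%r
library(SparkR)

# tableToDF() is SparkR's counterpart of Python's spark.read.table();
# it returns a SparkDataFrame, not a local data.frame
sdf <- tableToDF("Test1")

# sql() likewise returns a SparkDataFrame
sdf2 <- sql("SELECT * FROM Test1")

# collect() (or as.data.frame()) pulls the data down into a local R data.frame
rd <- collect(sdf)

Both routes end up in the same place; wrapping sql() in as.data.frame() as above just does the collect step implicitly.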

If you're using R I highly recommend the sparklyr package from RStudio. Many of the pyspark functions have the same name; for example, "spark.read.table()" is "spark_read_table()" in sparklyr. More info here: https://spark.rstudio.com/packages/sparklyr/latest/reference/spark_read_table.html

One other thing I thought of: with Spark you want to keep your data in Spark as much as possible and not bring it back to R unless you have to. With sparklyr you can use many tidyverse functions directly in Spark without having to collect your results into a local data frame first. For R functions or packages that don't map to the Spark API directly, you can also use sparklyr::spark_apply to distribute your R code over the cluster and leave your Spark data frames in Spark.
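A minimal sparklyr sketch along those lines (the column names are just placeholders; only CNTRY_FLOWS comes from the thread above):

%r
library(sparklyr)
library(dplyr)

# On Databricks, method = "databricks" attaches to the cluster's
# existing Spark session rather than starting a new one
sc <- spark_connect(method = "databricks")

# sparklyr's counterpart of spark.read.table(); returns a lazy Spark table
flows <- spark_read_table(sc, "CNTRY_FLOWS")

# dplyr verbs are translated to Spark SQL and run on the cluster;
# collect() brings the (small) result to R only at the very end
flows %>%
  group_by(country) %>%                         # hypothetical column
  summarise(total = sum(flow, na.rm = TRUE)) %>% # hypothetical column
  collect()

The key design point is that everything before collect() executes in Spark, so the full table never has to fit in the driver's R session.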

fs
New Contributor III

Thanks so much for this. Actually most of my code has been Python to date. It was really about knowing how to access objects from one language in the others, e.g. I had some R code to produce a graph that I wanted to recycle.

fs
New Contributor III

I think this is a trick Spark still hasn't (yet) pulled off: why can't there be a standard syntax to access all objects across all the languages it supports (with appropriate data-structure translation)?

Depending on what you're doing, there is a package called reticulate in R that lets you directly share objects between R and Python, no Spark required. https://rstudio.github.io/reticulate/
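A minimal reticulate sketch of that two-way sharing, assuming a Python interpreter is available to the R session:

%r
library(reticulate)

# Run Python code, then read its objects from R via the py object
py_run_string("x = [1, 2, 3]")
py$x                       # comes back as an R vector

# Push an R object into the Python session and use it there
py$y <- c(10, 20, 30)
py_run_string("total = sum(y)")
py$total                   # -> 60

reticulate converts between the two type systems automatically (R numeric vectors become Python lists, data.frames become pandas DataFrames when pandas is installed, and so on), which is what makes this feel like one session rather than two.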

However, I've found (so far) it really only works in RStudio, which does run awesomely on Databricks when using the Databricks ML runtimes. You can find RStudio preinstalled on the cluster under "Apps" on the cluster's page once you disable the cluster's auto-termination for inactivity:

[Screenshot: cluster settings needed to run RStudio on Databricks, showing RStudio under "Apps"]

There is a bit more setup once in RStudio on Databricks to make reticulate work flawlessly, which I could post if anyone's interested.

What I haven't tested yet is what happens if you make an Rmd using reticulate in RStudio on Databricks and then try to schedule it in a Databricks Workflow later. If I do, I'll be sure to post about it in the community.

I'd love it if more direct adoption of reticulate were included in Databricks notebooks, extended to Scala and SQL too, like you've suggested. Each language has its advantages, and there are some really awesome ML packages in R's tidymodels that the Python community frequently overlooks, in my opinion, because switching between languages is difficult in most places.

fs
New Contributor III

Hi, thanks for this. Yes, I was aware of reticulate, if not its use in Databricks. Actually most of my code is Python; it's just that I had an issue getting some libraries to load, and I had R code for that, so I wanted to include an R chunk.

Anonymous
Not applicable

@Simone Folino & @Matthew Giglia Amazing responses, thank you for jumping in and providing your personal expertise in this thread!

@Fernley Symons I think we are all eager to hear if these suggestions got you 100% sorted! If so, feel free to choose one of the replies as "best" so the rest of the community knows this question is answered in the future. If not, feel free to let us know what else you need. Thanks!

Noopur_Nigam
Databricks Employee

Hi @Fernley Symons Gentle reminder on the answer provided above. Please let us know if you have any further questions.

fs
New Contributor III

? I've already voted best answer...

Noopur_Nigam
Databricks Employee

@Fernley Symons Thank you for your prompt reply. Apologies, we have just noticed that an answer is already marked as best. Thank you once again.
