06-30-2022 05:25 AM
Hi, sorry, I'm new to Spark and Databricks. Could someone summarise the options for moving data between the different languages? I'm especially interested in R <=> Python options; I can see how to do it for SQL/Spark. I've spent a lot of time googling without result. I presume I can use R's reticulate to access Python objects..?
Anyway, grateful for any idiot-proof links, quick guides, or code.
07-05-2022 10:47 PM
If you're using R I highly recommend the sparklyr package from RStudio. Many sparklyr functions mirror the pyspark names: for example, pyspark's "spark.read.table()" is "spark_read_table()" in sparklyr. More info here: https://spark.rstudio.com/packages/sparklyr/latest/reference/spark_read_table.html
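For completeness, a minimal sparklyr sketch (this assumes an R notebook on a Databricks cluster with sparklyr installed, and reuses the CNTRY_FLOWS table mentioned later in this thread):
%r
library(sparklyr)
library(dplyr)

# Attach to the cluster's existing Spark session
sc <- spark_connect(method = "databricks")

# Reference the table as a Spark DataFrame -- the data stays in Spark
flows <- spark_read_table(sc, "CNTRY_FLOWS")

# Only pull rows into a local R data frame when you actually need them
local_df <- collect(flows)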
06-30-2022 07:38 AM
Hi, you can create a temporary view and then retrieve it from any programming language.
For example, create it in SQL:
%sql
CREATE OR REPLACE TEMPORARY VIEW Test1 AS
SELECT *
FROM TEST
And then retrieve it in Python:
%python
spark.read.table('Test1')
07-05-2022 07:50 AM
Thanks. I took that approach to create view and then queried it using SQL from R:
%r
rd=as.data.frame(sql("select * from CNTRY_FLOWS"))
...not sure if there's a more direct route. I wasn't sure what the R equivalent of Python's spark.read.table() was.
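For anyone else wondering: SparkR (preloaded in Databricks R notebooks) has tableToDF(), which seems to be the closest analogue of Python's spark.read.table(). A small sketch, reusing the CNTRY_FLOWS view from above:
%r
library(SparkR)

# Closest SparkR analogue of Python's spark.read.table():
# returns a SparkDataFrame; nothing is collected locally yet
sdf <- tableToDF("CNTRY_FLOWS")

# The SQL route from above, collected into a local R data.frame
rd <- as.data.frame(sql("SELECT * FROM CNTRY_FLOWS"))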
07-05-2022 10:58 PM
One other thing I thought of: with Spark you want to keep your data in Spark as much as possible and not bring it back to R unless you have to. With sparklyr you can use many tidyverse functions directly in Spark without having to collect your results into a local data frame first. For R functions or packages that don't have a direct connection to the Spark API, you can also use sparklyr::spark_apply to distribute your R code over the cluster and leave your data frames in Spark.
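To sketch what that looks like (the country and amount columns here are made up for illustration, and this assumes sparklyr and dplyr are installed on the cluster):
%r
library(sparklyr)
library(dplyr)

sc <- spark_connect(method = "databricks")
flows <- spark_read_table(sc, "CNTRY_FLOWS")

# dplyr verbs are translated to Spark SQL -- nothing is collected yet
by_country <- flows %>%
  group_by(country) %>%
  summarise(total = sum(amount, na.rm = TRUE))

# spark_apply() runs arbitrary R code on each partition, still inside Spark
scaled <- spark_apply(flows, function(df) {
  df$amount <- df$amount / max(df$amount)
  df
})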
07-07-2022 03:30 AM
Thanks so much for this. Actually most of my code has been Python to date. It was really about knowing how to access objects from one language in the others, e.g. I had some R code to produce a graph that I wanted to recycle.
07-07-2022 03:32 AM
I think there's a trick that still hasn't (yet) been achieved in Spark: why can't there be a standard syntax to access all objects across all the languages it supports (with appropriate data-structure translation)?
07-07-2022 09:45 PM
Depending on what you're doing, there is a package called reticulate in R that lets you directly share objects between R and Python, no Spark required. https://rstudio.github.io/reticulate/
However, I've found (so far) that it really only works in RStudio, which does run awesomely on Databricks when using the DB ML distributions. You can find RStudio preinstalled on the DB cluster under "apps" on the cluster's page when you deselect the cluster's auto-termination for inactivity.
There is a bit more set up once in RStudio on DB to make reticulate work flawlessly that I could post if interested.
What I haven't tested yet is what happens if you make an Rmd using reticulate in RStudio on Databricks and then try to schedule it in a DB Workflow later. If I do, I'll be sure to post about it in the community.
I'd love to see more direct adoption of reticulate in Databricks notebooks, extended to Scala and SQL too, like you've suggested. Each language has its advantages, and in my opinion there are some really awesome ML packages in tidymodels for R that get overlooked by the Python community because switching between languages is difficult in most places.
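For reference, a minimal reticulate round trip looks something like this (assuming a Python environment with numpy available; the variable names are mine):
%r
library(reticulate)

# Import a Python module and call it from R
np <- import("numpy")
m  <- np$mean(c(1, 2, 3))         # Python computes, the result comes back to R

# Run arbitrary Python; its objects appear under py$
py_run_string("x = {'a': 1, 'b': 2}")
py$x$a                            # 1

# R objects are visible to Python as r.<name>
rv <- c(10, 20, 30)
py_run_string("n = len(r.rv)")
py$n                              # 3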
07-08-2022 04:46 AM
Hi, thanks for this. Yes, I was aware of reticulate, if not its use in Databricks. Actually most of my code is Python. It's just that I had an issue getting some libraries to load, and I had R code for that, so I wanted to include an R chunk.
07-06-2022 04:01 PM
@Simone Folino & @Matthew Giglia Amazing responses, thank you for jumping in and providing your personal expertise in this thread!
@Fernley Symons I think we are all eager to hear if these suggestions got you 100% sorted! If so, feel free to choose one of the replies as "best" so the rest of the community knows this question is answered in the future. If not, feel free to let us know what else you need. Thanks!
07-25-2022 03:05 AM
Hi @Fernley Symons Gentle reminder on the answer provided above. Please let us know if you have more doubts or queries.
07-25-2022 03:08 AM
? I've already voted best answer...
07-25-2022 03:10 AM
@Fernley Symons Thank you for your prompt reply. Apologies, we have just noticed that an answer is already marked as best. Thank you once again.