by Jeff1 • Contributor II
- 20319 Views
- 5 replies
- 9 kudos
Struggling with how to export a Spark dataframe as a *.csv file to a local computer. I'm successfully using the spark_write_csv function (sparklyr R library) to write the CSV file out to my Databricks dbfs:/FileStore location. Because (I'm assuming)...
Latest Reply
sparklyr has a different syntax; there the relevant function is sdf_coalesce. The code you pasted is for Scala/Python. Additionally, even in Python you can only specify a folder, not a file, e.g. csv("dbfs:/FileStore/temp/").
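A minimal sparklyr sketch of what this reply names, for context; the connection call and the demo dataframe are assumptions, and only sdf_coalesce, spark_write_csv, and the dbfs:/FileStore/temp/ folder path come from the thread:

```r
library(sparklyr)
library(dplyr)

# Connect from a Databricks notebook (connection method is an assumption)
sc <- spark_connect(method = "databricks")

# Any Spark dataframe; iris is used here purely as a stand-in
df <- sdf_copy_to(sc, iris, overwrite = TRUE)

# Coalesce to one partition so the output folder holds a single part-*.csv,
# then write to a folder (not a file) under dbfs:/FileStore
df %>%
  sdf_coalesce(partitions = 1) %>%
  spark_write_csv(path = "dbfs:/FileStore/temp/", mode = "overwrite")
```

The single part-*.csv that lands in that folder can then typically be downloaded to a local machine, since FileStore contents are exposed through the workspace's /files/ URL.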
4 More Replies
- 2534 Views
- 2 replies
- 3 kudos
Hello Databricks, Struggling with a workflow issue and wondering if anyone can help. I am developing my project in R and sometimes Python locally on my laptop, and committing the files to a git repo. I can then clone that repo in Databricks, and *see*...
Latest Reply
This is a separate script which then needs to be run from a notebook (or job). I am not using R, but in Python and Scala it works the same. In Python I just import it in the notebook ("from folder_structure import myClass"); in R it is probably similar. There ...
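For the R side the reply leaves open, a hedged sketch of the equivalent pattern, assuming the cloned repo is the notebook's working directory; the script name and the constructor it defines are hypothetical:

```r
# Run a separate script from the repo inside the notebook, the R analogue of
# "from folder_structure import myClass" in Python.
# (folder_structure.R and my_class_constructor are hypothetical names.)
source("folder_structure.R")

# Use a definition the sourced script brought into scope
obj <- my_class_constructor()
```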
1 More Replies