by Jeff1 • Contributor II
- 9923 Views
- 7 replies
- 10 kudos
Struggling with how to export a Spark dataframe as a *.csv file to a local computer. I'm successfully using the spark_write_csv function (sparklyr R library) to write the csv file out to my databricks dbfs:FileStore location. Because (I'm assuming)...
Latest Reply
Hi @Jeff (Customer), were you able to follow @Hubert Dudek's suggestion? Did it help you?
6 More Replies
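The question above is about getting a CSV written to dbfs:/FileStore down to a local machine. Files under /FileStore are reachable from a browser via the workspace's /files/ endpoint, so one common route is to write the CSV there and then download it by URL. A minimal sketch of that path-to-URL translation, assuming a placeholder workspace hostname (the helper name `filestore_download_url` is illustrative, not a Databricks API):

```python
def filestore_download_url(dbfs_path: str, workspace_url: str) -> str:
    """Map a dbfs:/FileStore/... path to its browser-downloadable /files/ URL."""
    prefix = "dbfs:/FileStore/"
    if not dbfs_path.startswith(prefix):
        raise ValueError("only files under dbfs:/FileStore/ are web-accessible")
    # Everything after the FileStore prefix maps onto the /files/ route.
    return f"{workspace_url.rstrip('/')}/files/{dbfs_path[len(prefix):]}"

# e.g. for a CSV written by spark_write_csv to dbfs:/FileStore/exports/df.csv:
url = filestore_download_url(
    "dbfs:/FileStore/exports/df.csv",
    "https://adb-1234.5.azuredatabricks.net",  # placeholder workspace URL
)
print(url)
```

Opening the resulting URL while logged in to the workspace downloads the file; note that Spark writes a directory of part files by default, so the dataframe is usually coalesced to a single partition first.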
- 1167 Views
- 4 replies
- 3 kudos
Hello Databricks, struggling with a workflow issue and wondering if anyone can help. I am developing my project in R and sometimes Python locally on my laptop, and committing the files to a git repo. I can then clone that repo in databricks, and *see*...
Latest Reply
This is a separate script which then needs to be run from a notebook (or job). I am not using R, but in Python and Scala it works the same way. In Python I just import it in the notebook ("from folder_structure import myClass"); in R it is probably similar. There ...
3 More Replies
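The pattern the reply describes, keeping reusable code in a plain .py file in the repo and importing it from a notebook, can be sketched as below. The file and class names (`folder_structure.py` / `myClass`) mirror the reply; the `greet` method and the temporary directory standing in for the cloned repo are illustrative:

```python
import pathlib
import sys
import tempfile

# Stand-in for the cloned repo directory; in Databricks Repos this would be
# the repo root, which is already on sys.path for notebooks.
workdir = pathlib.Path(tempfile.mkdtemp())

# A plain module file committed alongside the notebooks.
(workdir / "folder_structure.py").write_text(
    "class myClass:\n"
    "    def greet(self):\n"
    "        return 'hello from the repo module'\n"
)

sys.path.insert(0, str(workdir))      # make the module importable
from folder_structure import myClass  # same import the reply shows

print(myClass().greet())
```

The point of the design is that the module stays an ordinary importable file, so it can be developed and tested locally, committed to git, and then used unchanged from a Databricks notebook.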