Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Forum Posts

Jeff1
by Contributor II
  • 16670 Views
  • 5 replies
  • 9 kudos

Resolved! How to write *.csv file from DataBricks FileStore

Struggling with how to export a Spark dataframe as a *.csv file to a local computer. I'm successfully using the spark_write_csv function (sparklyr R library) to write the csv file out to my Databricks dbfs:/FileStore location. Because (I'm assuming)...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 9 kudos

sparklyr has a different syntax; it provides the sdf_coalesce function for this. The code you pasted is for Scala/Python. Additionally, even in Python you can only specify a folder, not a file, e.g. csv("dbfs:/FileStore/temp/"), roughly as in the sketch below.
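
A minimal PySpark sketch of what the reply describes, assuming a DataFrame named df (the name and the DBFS path are placeholders for illustration):

# Coalesce to a single partition so Spark writes one part file, then write
# the CSV into a DBFS *folder* (Spark always writes a directory, not a file).
df.coalesce(1).write.csv("dbfs:/FileStore/temp/", mode="overwrite", header=True)

# The folder will contain a single part-*.csv file; files under /FileStore
# can then be downloaded through the workspace's /files/ URL.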

  • 9 kudos
4 More Replies
GC-James
by Contributor II
  • 2240 Views
  • 2 replies
  • 3 kudos

Resolved! Working locally then moving to databricks

Hello Databricks, struggling with a workflow issue and wondering if anyone can help. I am developing my project in R and sometimes Python locally on my laptop, and committing the files to a git repo. I can then clone that repo in Databricks, and *see*...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 3 kudos

This is a separate script which then needs to be run from a notebook (or job). I am not using R, but in Python and Scala it works the same. In Python I just import it in the notebook ("from folder_structure import myClass"), roughly as in the sketch below; in R it is probably similar. There ...
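
A minimal sketch of that import step in a Databricks notebook, assuming the repo has been cloned via Repos and contains a folder_structure.py that defines myClass (the Repos path below is a hypothetical placeholder):

import sys

# Only needed if the notebook does not live inside the cloned repo itself;
# the path is a placeholder for the actual Repos location.
sys.path.append("/Workspace/Repos/<user>/<repo>")

# Import the class defined in folder_structure.py committed to the repo.
from folder_structure import myClass

obj = myClass()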

  • 3 kudos
1 More Reply