When to use Dataframes API over Spark SQL
08-29-2022 07:56 PM
Hello Experts,
I am new to Databricks. I am building data pipelines with both batch and streaming data.
Should I use the DataFrame API to read the CSV files, convert them to Parquet, and then do the transformations?
Or
should I write the CSV data to a table first and then use Spark SQL for the transformations?
I would appreciate the pros and cons of each, and which one is better.
Thank you,
Rathinam
Labels:
- Dataframes API
- Spark SQL
1 REPLY
08-30-2022 01:09 PM
Hi Rathinam, it would help to understand the pipeline in more detail here. In a few cases, writing the data to a table and then transforming it with Spark SQL will be faster than the DataFrame approach, but for most workloads the two perform the same, since both go through the same query planner.

