Databricks SQL row limits
2 weeks ago
Hi there,
My dataset is approximately 408K rows. I am trying to run a query that returns everything, but the result set seems to stop at 64K rows.
I've seen a few posts here asking about this, but they are several years old; a solution was promised, yet nothing seems to have changed regarding the row limitation.
Is there a setting I am missing that allows more than 64K rows?
Or is it still capped at 64K rows?
If it is, is there any potential workaround?
I like Databricks; I don't want to have to go back to Athena 🙁
Thanks, guys
2 weeks ago - last edited 2 weeks ago
Hi @bigkahunaburger,
The 64K row limit in Databricks SQL applies only to the UI display, not to the actual data processing. To access your full dataset, you can use the Download full results option to save the query output.
Alternatively, use Spark or a JDBC/ODBC connection to retrieve the full dataset: run your SQL query, store the results in a Spark DataFrame, and export it to a file format such as Parquet or CSV. You can then read the file back into a Spark DataFrame for filtering or advanced analytics, or load it into BI tools or a SQL database for visualization and querying.

