Hitting 504 error with Databricks Delta Sharing open sharing protocol
2 weeks ago
When trying to query a particularly large table that my team has been given access to in a share (we are querying the table using a profile file with a bearer token), we continually hit the following error:
io.delta.sharing.client.util.UnexpectedHttpStatus: HTTP request failed with status: HTTP/1.1 504 Gateway Timeout {"error_code":"DEADLINE_EXCEEDED"}
Are there limitations on the Databricks Delta Sharing server for the amount of data you can query via a share?
2 weeks ago
As a note, we are using the deltaSharing data source like so:
spark.read.format("deltaSharing").load("<profile-file-path>#<share>.<schema>.<table>")
Monday
Hi @nbrisson
This could be due to large metadata in the Delta table being queried via Delta Sharing. There are some known limitations; you can refer to these docs for more details:
- Troubleshoot common sharing issues in Delta Sharing
- RESOURCE_LIMIT_EXCEEDED error when querying a Delta Sharing table
Also, try running the OPTIMIZE and VACUUM commands on the table to reduce metadata size, and consider partitioning the table to reduce the number of files (see the sketch below).
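As a rough sketch of those maintenance commands, run on the provider side against the source table (the recipient cannot run these on a shared table; the table name here is illustrative):

spark.sql("OPTIMIZE catalog.schema.large_table")  # compact many small files into fewer, larger ones
spark.sql("VACUUM catalog.schema.large_table RETAIN 168 HOURS")  # remove files no longer referenced by the table

Shrinking the file and metadata footprint this way reduces the work the sharing server must do before the gateway's deadline, which is what surfaces as the 504 / DEADLINE_EXCEEDED error.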

