COPY INTO size limit
11-11-2024 11:21 AM
Hi
I'm using the COPY INTO command to ingest data into a Delta table in my Azure Databricks instance. Sometimes I get a timeout error when running this command. Is there a limit on the size of the data that can be ingested using COPY INTO, or on the number of files that can be ingested at a time?
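For reference, the statement is roughly of this shape; the catalog, table name, storage path, and file format below are placeholders, not the actual job:

```sql
-- Hypothetical sketch of a COPY INTO ingestion into a Delta table.
-- Target table, source path, and file format are placeholder assumptions.
COPY INTO my_catalog.my_schema.events
FROM 'abfss://raw@mystorageaccount.dfs.core.windows.net/events/'
FILEFORMAT = JSON
FORMAT_OPTIONS ('inferSchema' = 'true')
COPY_OPTIONS ('mergeSchema' = 'true');
```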
Thanks
- Labels:
  - Delta Lake
  - Spark
11-11-2024 12:16 PM
Hello @DBUser2,
The COPY INTO command has no specific documented limit on the size of the data or the number of files that can be ingested at a time. Timeout errors can occur due to network issues, resource limitations, or long-running operations. Are you running the command on a SQL warehouse or in a notebook?
Looking at cluster/warehouse metrics would be a good way to start investigating.
Also, if it was executed via a warehouse, do you have the statement ID?
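As a side note, one way to test whether the file count plays a role is to scope a single COPY INTO run to a subset of the source files with the PATTERN (or FILES) clause; the table name, path, and glob below are hypothetical:

```sql
-- Hypothetical sketch: restrict one COPY INTO run to a subset of the source
-- files via a glob pattern, to check whether smaller batches avoid the timeout.
COPY INTO my_catalog.my_schema.events
FROM 'abfss://raw@mystorageaccount.dfs.core.windows.net/events/'
FILEFORMAT = JSON
PATTERN = '2024/11/*.json'
COPY_OPTIONS ('mergeSchema' = 'true');
```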
11-11-2024 02:12 PM
Thanks for replying. I'm running the command on a SQL warehouse.

