Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Terraform Error: Cannot create sql table context deadline

Volker
New Contributor III

I am currently trying to deploy external Parquet tables to Databricks Unity Catalog (UC) using Terraform. However, for some tables I get the following error:

Error: cannot create sql table: Post "https://[MASKED]/api/2.0/sql/statements/": context deadline exceeded

This is probably due to a high degree of partitioning in the underlying Parquet tables. The Terraform Databricks provider caps the timeout at 50 seconds (https://github.com/databricks/terraform-provider-databricks/blob/main/catalog/resource_sql_table.go#...), and as far as I have seen this cannot be changed.
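For context, a minimal sketch of the kind of resource definition that can run into this deadline — the catalog, schema, table name, and storage location below are placeholders, not the actual setup:

```hcl
# Hypothetical example of an external Parquet table in Unity Catalog.
# Names and paths are placeholders; the CREATE TABLE statement this
# generates is what hits the 50s context deadline on heavily
# partitioned tables.
resource "databricks_sql_table" "events" {
  name               = "events"
  catalog_name       = "main"
  schema_name        = "analytics"
  table_type         = "EXTERNAL"
  data_source_format = "PARQUET"
  storage_location   = "s3://my-bucket/path/to/events"
}
```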

Has anybody else experienced this error and found a way to circumvent it?

2 REPLIES

Kaniz
Community Manager

Hi @Volker, this can often be due to network connection issues or, as you've mentioned, a high degree of partitioning.

While the Terraform Databricks provider does set a maximum timeout of 50 seconds, which cannot be changed, there are a few potential workarounds you could consider:

  1. If the high degree of partitioning is causing the operation to exceed the timeout, you could consider reducing the partitioning of your parquet tables. However, depending on your specific use case, this might not be feasible.
  2. Instead of deploying all tables at once, try deploying them one by one or in smaller batches. This might prevent the operation from exceeding the timeout.
  3. In some cases, increasing the timeout parameter in your provider block in your Terraform script might help. However, this might not be applicable as you've mentioned the timeout cannot be changed.
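On point 3, the provider does expose an HTTP-level timeout setting (`http_timeout_seconds`) in the provider block; note, however, that the 50-second deadline for this particular resource appears to be set in the provider source itself, so this setting may not help here. A sketch for completeness:

```hcl
# Assumes the host is supplied via a variable; the value below is
# illustrative only.
provider "databricks" {
  host = var.databricks_host

  # Raises the HTTP client timeout. The per-resource 50s deadline
  # discussed in this thread may still apply regardless.
  http_timeout_seconds = 600
}
```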

If the issue persists, I would recommend raising a Databricks support ticket for more specific guidance.

Volker
New Contributor III

Hey @Kaniz,

Thanks for your reply, and sorry for the late reply on my side. Unfortunately, I couldn't fix the problem with the Databricks Terraform provider. I have now switched to using Liquibase to deploy tables to Databricks.
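For anyone following the same route: with Liquibase the table DDL moves into a changelog instead of a Terraform resource. A minimal sketch as a formatted SQL changelog — the table name, columns, and storage path are placeholders, and it assumes the Liquibase extension for Databricks is installed:

```sql
--liquibase formatted sql

--changeset volker:create-events-table
-- Hypothetical changeset; names and paths are placeholders.
CREATE TABLE IF NOT EXISTS main.analytics.events (
  event_id STRING,
  event_ts TIMESTAMP
)
USING PARQUET
LOCATION 's3://my-bucket/path/to/events';
--rollback DROP TABLE main.analytics.events;
```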
