I am currently trying to deploy external parquet tables to Databricks Unity Catalog (UC) using Terraform. However, for some tables I get the following error:
Error: cannot create sql table: Post "https://[MASKED]/api/2.0/sql/statements/": context deadline exceeded
This is probably due to the high degree of partitioning of the underlying parquet tables. The Terraform Databricks provider caps the timeout at 50s (https://github.com/databricks/terraform-provider-databricks/blob/main/catalog/resource_sql_table.go#...) and, as far as I can see, this cannot be changed.
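For context, a minimal sketch of the failing resource; all names and the storage path are placeholders, the real tables are highly partitioned parquet directories:

```hcl
resource "databricks_sql_table" "example" {
  # Placeholder identifiers; the actual tables sit behind heavily
  # partitioned parquet data, which makes the CREATE TABLE slow.
  name               = "my_table"
  catalog_name       = "main"
  schema_name        = "default"
  table_type         = "EXTERNAL"
  data_source_format = "PARQUET"
  storage_location   = "abfss://container@account.dfs.core.windows.net/path/to/parquet"
}
```

The `terraform apply` for resources like this is what runs into the 50s limit.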
Has anybody else experienced this error and found a way to work around it?