Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Configuration spark.sql.sources.partitionOverwriteMode is not available.

Bram
New Contributor II

Dear,

 

In the current setup, we are using dbt as a modeling tool for our data lakehouse.

For a specific use case, we want to use the insert_overwrite strategy, where dbt will replace all data for a specific partition:

Databricks configurations | dbt Developer Hub (getdbt.com)

 

Here is the specific dbt model configuration:

{{
    config(
        materialized='incremental',
        partition_by=['zcFiscalYearPeriod'],
        file_format='delta',
        incremental_strategy='insert_overwrite',
        on_schema_change='sync_all_columns'
    )
}}

 

When dbt executes the required queries, we receive the following message:

Query:

/* {"app": "dbt", "dbt_version": "1.5.6", "dbt_databricks_version": "1.5.5", "databricks_sql_connector_version": "2.7.0", "profile_name": "user", "target_name": "default", "node_id": "model.data_platform.prep_InventoryStockDay_test"} */
set spark.sql.sources.partitionOverwriteMode = DYNAMIC

 

Message:

Configuration spark.sql.sources.partitionOverwriteMode is not available.

 

 

It may be important to note that we are connecting via a SQL endpoint to a Unity Catalog-enabled cluster.

 

We already contacted the dbt team and everything looks good on their side. Maybe it's a setting in Databricks we need to change?

7 REPLIES

-werners-
Esteemed Contributor III

The SQL endpoint could be the culprit. Dynamic partition overwrite is in public preview on the workspace side, so it could be that SQL endpoints are not yet supported.

What happens if you use replaceWhere instead of insert_overwrite?
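For context, replaceWhere is exposed in Databricks SQL as INSERT INTO ... REPLACE WHERE, which replaces only the rows matching the predicate and does not rely on the session configuration. A minimal sketch; the table names and the period value are illustrative, not from the thread:

```sql
-- Overwrite a single fiscal period atomically on a Delta table.
-- No spark.sql.sources.partitionOverwriteMode setting is involved.
INSERT INTO prep_InventoryStockDay
REPLACE WHERE zcFiscalYearPeriod = '2023001'
SELECT *
FROM prep_InventoryStockDay_staging
WHERE zcFiscalYearPeriod = '2023001';
```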

Bram
New Contributor II

Hi Werners1,

Thank you for the response. Using the replaceWhere statement gives us the same message.


 

-werners-
Esteemed Contributor III

Weird, because replaceWhere does not require that parameter to be set.
Or are you still trying to set it? That is not a requirement for replaceWhere.
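For what it's worth, newer versions of the dbt-databricks adapter (1.6+, if memory serves — worth verifying against the adapter's changelog) expose this directly as the replace_where incremental strategy, so the predicate can be declared in the model config rather than hand-written SQL. A sketch under that assumption; the predicate value is illustrative:

```jinja
{{
    config(
        materialized='incremental',
        file_format='delta',
        incremental_strategy='replace_where',
        incremental_predicates=["zcFiscalYearPeriod = '2023001'"]
    )
}}
```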

bbthorn
New Contributor II

Did you ever get a solution or answer for this error? Apparently partition overwrites are no longer a preview feature; they were released in Databricks SQL version 2023.40 (see link below). My SQL warehouse is on 2023.40, but I am still getting the same "spark.sql.sources.partitionOverwriteMode is not available" error as you via dbt.

 

https://docs.databricks.com/en/sql/release-notes/index.html

  • Block schema overwrite is available when using dynamic partition overwrites.

 

Bram
New Contributor II

Hi, it seems that this is not supported via SQL warehouses (ref. dbt code):

https://github.com/databricks/dbt-databricks/blob/main/dbt/include/databricks/macros/materialization...

 

nad__
New Contributor II

Hi!
I have the same issue with insert_overwrite on Databricks with a SQL warehouse. Do you have any solution or updates? Or is it still not supported by Databricks?

Lakshay
Esteemed Contributor

Hi @nad__, this is still not available in SQL warehouses.
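For anyone landing here: on an all-purpose or job cluster, where session configuration can be set, the dynamic overwrite does work as plain SQL. A sketch with illustrative table names:

```sql
-- Allowed on clusters, but rejected by SQL warehouses
-- (producing the error discussed in this thread).
SET spark.sql.sources.partitionOverwriteMode = DYNAMIC;

-- With dynamic mode, only the partitions present in the SELECT
-- result are overwritten; all other partitions stay untouched.
INSERT OVERWRITE TABLE prep_InventoryStockDay
SELECT * FROM prep_InventoryStockDay_staging;
```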
