<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Configuration spark.sql.sources.partitionOverwriteMode is not available. in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/49250#M28500</link>
    <description>&lt;P&gt;Hi, it seems that this is not supported via SQL warehouses (see the referenced dbt code):&lt;/P&gt;&lt;H3&gt;&lt;A href="https://github.com/databricks/dbt-databricks/blob/main/dbt/include/databricks/macros/materializations/incremental/incremental.sql#L20" target="_blank"&gt;https://github.com/databricks/dbt-databricks/blob/main/dbt/include/databricks/macros/materializations/incremental/incremental.sql#L20&lt;/A&gt;&lt;/H3&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Mon, 16 Oct 2023 07:58:21 GMT</pubDate>
    <dc:creator>Bram</dc:creator>
    <dc:date>2023-10-16T07:58:21Z</dc:date>
    <item>
      <title>Configuration spark.sql.sources.partitionOverwriteMode is not available.</title>
      <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/43646#M27528</link>
      <description>&lt;P&gt;Dear,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;In the current setup, we are using dbt as a modeling tool for our data lakehouse.&lt;/P&gt;&lt;P&gt;For a specific use case, we want to use the insert_overwrite strategy, where dbt will replace all data for a specific partition:&lt;/P&gt;&lt;P&gt;&lt;A href="https://protect-us.mimecast.com/s/guYOCyPJoqTxBLzqcZVawz?domain=docs.getdbt.com" target="_blank"&gt;Databricks configurations | dbt Developer Hub (getdbt.com)&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here is the specific dbt model configuration:&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="1030"&gt;&lt;P&gt;{{&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; config(&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; materialized='incremental',&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; partition_by=['zcFiscalYearPeriod'],&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; file_format='delta', &amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; incremental_strategy='insert_overwrite',&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; on_schema_change='sync_all_columns'&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; )&lt;/P&gt;&lt;P&gt;}}&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;When dbt executes the required queries, we receive the following message:&lt;/P&gt;&lt;P&gt;Qry:&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="1030"&gt;&lt;P&gt;/* {"app": "dbt", "dbt_version": "1.5.6", "dbt_databricks_version": "1.5.5", "databricks_sql_connector_version": "2.7.0", "profile_name": "user", "target_name": "default", "node_id": "model.data_platform.prep_InventoryStockDay_test"} */&lt;/P&gt;&lt;P&gt;set&lt;/P&gt;&lt;P&gt;&amp;nbsp; spark.sql.sources.partitionOverwriteMode = 
DYNAMIC&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;Msg:&lt;/P&gt;&lt;TABLE&gt;&lt;TBODY&gt;&lt;TR&gt;&lt;TD width="1030"&gt;&lt;P&gt;&lt;FONT size="3"&gt;Configuration spark.sql.sources.partitionOverwriteMode is not available.&lt;/FONT&gt;&lt;/P&gt;&lt;/TD&gt;&lt;/TR&gt;&lt;/TBODY&gt;&lt;/TABLE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;It may be important to note that we are connecting via a SQL endpoint to a Unity Catalog-enabled cluster.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;We already contacted the dbt team and everything looks good on the dbt side; maybe it’s a setting in Databricks we need to change?&lt;/P&gt;</description>
      <pubDate>Tue, 05 Sep 2023 13:53:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/43646#M27528</guid>
      <dc:creator>Bram</dc:creator>
      <dc:date>2023-09-05T13:53:03Z</dc:date>
    </item>
    <item>
      <title>Re: Configuration spark.sql.sources.partitionOverwriteMode is not available.</title>
      <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/43654#M27530</link>
      <description>&lt;P&gt;The SQL endpoint could be the culprit.&amp;nbsp; Dynamic partition overwrite is in public preview on the workspace, so it could be that SQL endpoints are not yet supported.&lt;/P&gt;&lt;P&gt;What happens if you use replaceWhere instead of insert_overwrite?&lt;/P&gt;</description>
      <pubDate>Tue, 05 Sep 2023 14:14:03 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/43654#M27530</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2023-09-05T14:14:03Z</dc:date>
    </item>
    <item>
      <title>Re: Configuration spark.sql.sources.partitionOverwriteMode is not available.</title>
      <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/43754#M27537</link>
      <description>&lt;P&gt;Hi Werners1,&lt;/P&gt;&lt;P&gt;Thank you for the response. Using the replaceWhere statement gives us the same message:&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="Bram_0-1693988024939.png" style="width: 400px;"&gt;&lt;img src="https://community.databricks.com/t5/image/serverpage/image-id/3512iEE2ED0188BE53397/image-size/medium/is-moderation-mode/true?v=v2&amp;amp;px=400" role="button" title="Bram_0-1693988024939.png" alt="Bram_0-1693988024939.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 06 Sep 2023 08:14:11 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/43754#M27537</guid>
      <dc:creator>Bram</dc:creator>
      <dc:date>2023-09-06T08:14:11Z</dc:date>
    </item>
    <item>
      <title>Re: Configuration spark.sql.sources.partitionOverwriteMode is not available.</title>
      <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/43755#M27538</link>
      <description>&lt;P&gt;Weird, because replaceWhere does not require the parameter to be set.&lt;BR /&gt;Or are you still trying to set it? That is not a requirement for replaceWhere.&lt;/P&gt;</description>
      <pubDate>Wed, 06 Sep 2023 08:31:35 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/43755#M27538</guid>
      <dc:creator>-werners-</dc:creator>
      <dc:date>2023-09-06T08:31:35Z</dc:date>
    </item>
    <item>
      <title>Re: Configuration spark.sql.sources.partitionOverwriteMode is not available.</title>
      <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/49126#M28480</link>
      <description>&lt;P&gt;Did you ever get a solution or answer for this error? Apparently partition overwrites are no longer a preview feature; they were released in Databricks SQL version 2023.40 (see the link below). My SQL warehouse is using 2023.40, but I am still getting the same "spark.sql.sources.partitionOverwriteMode is not available" error as you via dbt.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.databricks.com/en/sql/release-notes/index.html" target="_blank"&gt;https://docs.databricks.com/en/sql/release-notes/index.html&lt;/A&gt;&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;&lt;P&gt;Block schema overwrite is available when using dynamic partition overwrites.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 13 Oct 2023 11:02:41 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/49126#M28480</guid>
      <dc:creator>bbthorn</dc:creator>
      <dc:date>2023-10-13T11:02:41Z</dc:date>
    </item>
    <item>
      <title>Re: Configuration spark.sql.sources.partitionOverwriteMode is not available.</title>
      <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/49250#M28500</link>
      <description>&lt;P&gt;Hi, it seems that this is not supported via SQL warehouses (see the referenced dbt code):&lt;/P&gt;&lt;H3&gt;&lt;A href="https://github.com/databricks/dbt-databricks/blob/main/dbt/include/databricks/macros/materializations/incremental/incremental.sql#L20" target="_blank"&gt;https://github.com/databricks/dbt-databricks/blob/main/dbt/include/databricks/macros/materializations/incremental/incremental.sql#L20&lt;/A&gt;&lt;/H3&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 16 Oct 2023 07:58:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/49250#M28500</guid>
      <dc:creator>Bram</dc:creator>
      <dc:date>2023-10-16T07:58:21Z</dc:date>
    </item>
    <item>
      <title>Re: Configuration spark.sql.sources.partitionOverwriteMode is not available.</title>
      <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/58326#M31103</link>
      <description>&lt;P&gt;Hi!&lt;BR /&gt;I have the same issue with insert_overwrite on Databricks with SQL Warehouse. Do you have any solutions or updates? Or is it still not supported by Databricks?&lt;/P&gt;</description>
      <pubDate>Wed, 24 Jan 2024 12:09:17 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/58326#M31103</guid>
      <dc:creator>nad__</dc:creator>
      <dc:date>2024-01-24T12:09:17Z</dc:date>
    </item>
    <item>
      <title>Re: Configuration spark.sql.sources.partitionOverwriteMode is not available.</title>
      <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/58330#M31105</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/98723"&gt;@nad__&lt;/a&gt;, this is still not available in SQL warehouses.&lt;/P&gt;</description>
      <pubDate>Wed, 24 Jan 2024 13:19:21 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/58330#M31105</guid>
      <dc:creator>Lakshay</dc:creator>
      <dc:date>2024-01-24T13:19:21Z</dc:date>
    </item>
    <item>
      <title>Re: Configuration spark.sql.sources.partitionOverwriteMode is not available.</title>
      <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/113033#M44400</link>
      <description>&lt;P&gt;&lt;a href="https://community.databricks.com/t5/user/viewprofilepage/user-id/75976"&gt;@Lakshay&lt;/a&gt;&amp;nbsp;when will this be added? I still get this error in 2025.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 19 Mar 2025 10:20:29 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/113033#M44400</guid>
      <dc:creator>hspuijbroek</dc:creator>
      <dc:date>2025-03-19T10:20:29Z</dc:date>
    </item>
    <item>
      <title>Re: Configuration spark.sql.sources.partitionOverwriteMode is not available.</title>
      <link>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/120483#M46176</link>
      <description>&lt;P&gt;An approach that works well when using a Databricks SQL Warehouse is to use the replace_where strategy - I've just tested this. It also works with partitioned tables:&lt;/P&gt;&lt;LI-CODE lang="sql"&gt;{{ config(
   materialized='incremental',
   incremental_strategy='replace_where',
   incremental_predicates=["date in (" ~ var('date', '1999-12-31') ~ ")"],
   partition_by=['date']
) }}

select * from {{ ref('table_upstream') }}
where date in ({{ var("date", "1999-12-31") }})&lt;/LI-CODE&gt;&lt;P&gt;which compiles into the following SQL:&lt;/P&gt;&lt;LI-CODE lang="sql"&gt;insert into `table_downstream`
  replace where
    date in ('2023-01-03', '2023-01-02')
  table `table_downstream__dbt_tmp`&lt;/LI-CODE&gt;&lt;P&gt;and run it with the following filter predicates dynamically passed in via CLI:&lt;/P&gt;&lt;P&gt;dbt run --vars "{\"date\": \"'2023-01-03', '2023-01-02'\"}"&lt;/P&gt;&lt;P&gt;I'd recommend avoiding insert_overwrite with DBSQL warehouses. Also, omitting partition_by=['date'] causes the entire table to be overwritten. While recoverable via time travel and RESTORE, it’s risky if not intended.&lt;/P&gt;</description>
      <pubDate>Thu, 29 May 2025 01:30:40 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/configuration-spark-sql-sources-partitionoverwritemode-is-not/m-p/120483#M46176</guid>
      <dc:creator>hendrik</dc:creator>
      <dc:date>2025-05-29T01:30:40Z</dc:date>
    </item>
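The reply above passes the replace_where predicates to dbt at runtime through `dbt run --vars`, which takes a JSON document whose values are spliced into the `date in (...)` predicate. Quoting that JSON by hand is error-prone, so here is a minimal Python sketch of a helper (`build_dbt_vars` is a hypothetical name, not part of dbt) that assembles the `--vars` argument from a plain list of dates:

```python
import json

def build_dbt_vars(dates):
    # Quote each date literal and join them into the comma-separated list
    # that the model's "date in (...)" predicate expects, then wrap that
    # string in the JSON document dbt accepts after --vars.
    predicate = ", ".join("'{}'".format(d) for d in dates)
    return json.dumps({"date": predicate})

# Reproduces the shape of the command shown in the post; note that the
# JSON still needs shell quoting/escaping when passed on a real command line.
print("dbt run --vars " + build_dbt_vars(["2023-01-03", "2023-01-02"]))
```

This only builds the string; validating that each entry is a real date (e.g. with `datetime.date.fromisoformat`) before interpolating it into SQL would be a sensible addition, since the values end up inside a `replace where` clause.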
  </channel>
</rss>

