<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Databricks throwing error &quot;SQL DW failed to execute the JDBC query produced by the connector.&quot; while pushing the column with string length more than 255 in Data Engineering</title>
    <link>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27872#M19715</link>
    <description>&lt;P&gt;I tried this but I still get the same error. My string length is 1900 and I set maxStrLength to 3000, yet it still fails.&lt;/P&gt;</description>
    <pubDate>Sat, 27 Jun 2020 00:16:09 GMT</pubDate>
    <dc:creator>ImranShaik</dc:creator>
    <dc:date>2020-06-27T00:16:09Z</dc:date>
    <item>
      <title>Databricks throwing error "SQL DW failed to execute the JDBC query produced by the connector." while pushing the column with string length more than 255</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27865#M19708</link>
      <description>&lt;P&gt;I am using Databricks to transform the data and then pushing it into the data lake.&lt;/P&gt;
&lt;P&gt;The data is written successfully if the string field is 255 characters or less, but the write throws the following error when it is longer than that:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;SQL DW failed to execute the JDBC query produced by the connector.
Underlying SQLException(s):
  - com.microsoft.sqlserver.jdbc.SQLServerException: HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: String or binary data would be truncated. [ErrorCode = 107090] [SQLState = S0001]&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;I am using the following code to push the data into the data warehouse:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;testdf.write \
    .format("com.databricks.spark.sqldw") \
    .option("url", sqlDwUrlSmall) \
    .option("dbtable", "dbo.testAddress") \
    .option("forward_spark_azure_storage_credentials", "True") \
    .option("tempdir", tempDir) \
    .mode("overwrite") \
    .save()&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;This table has just one column, and the length of this field is 4000.&lt;/P&gt;</description>
      <pubDate>Thu, 29 Aug 2019 18:47:37 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27865#M19708</guid>
      <dc:creator>bhaumikg</dc:creator>
      <dc:date>2019-08-29T18:47:37Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks throwing error "SQL DW failed to execute the JDBC query produced by the connector." while pushing the column with string length more than 255</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27866#M19709</link>
      <description>&lt;P&gt;@bhaumikg I think this link explains what's going on: &lt;A href="https://kb.informatica.com/solution/23/pages/68/563577.aspx" target="_blank"&gt;https://kb.informatica.com/solution/23/pages/68/563577.aspx&lt;/A&gt;. The Hive metastore column PARAM_VALUE in table TABLE_PARAMS probably has a datatype limited to 4000 characters, so when you attempt to load a record longer than that you get "String or binary data would be truncated".&lt;/P&gt;</description>
      <pubDate>Wed, 04 Sep 2019 03:42:08 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27866#M19709</guid>
      <dc:creator>jamesferrisjr</dc:creator>
      <dc:date>2019-09-04T03:42:08Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks throwing error "SQL DW failed to execute the JDBC query produced by the connector." while pushing the column with string length more than 255</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27867#M19710</link>
      <description>&lt;P&gt;I'm having a similar issue in Azure with MS SQL Server Data Warehouse (DWH), even though I have set the target column to nvarchar(4000) on the SQL DW side.&lt;/P&gt;&lt;P&gt;This is the code:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;spark.conf.set(
    "spark.sql.parquet.writeLegacyFormat", "true")
dwDF.write.format("com.databricks.spark.sqldw")
    .option("url", sqlDwUrlSmall)
    .option("dbtable", TableName)
    .option( "forward_spark_azure_storage_credentials","True")
    .option("tempdir", tempDir)
    .mode("overwrite")
    .save()&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;I get the following error:&lt;/P&gt;&lt;PRE&gt;&lt;CODE&gt;-
com.microsoft.sqlserver.jdbc.SQLServerException:
HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling
record reader buffer: HadoopSqlException: String or binary data would be
truncated. [ErrorCode = 107090] [SQLState = S0001]&lt;/CODE&gt;&lt;/PRE&gt;&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 04 Sep 2019 18:31:31 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27867#M19710</guid>
      <dc:creator>leonarbe</dc:creator>
      <dc:date>2019-09-04T18:31:31Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks throwing error "SQL DW failed to execute the JDBC query produced by the connector." while pushing the column with string length more than 255</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27868#M19711</link>
      <description>&lt;P&gt;I have the same issue with this table definition:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;CREATE TABLE [dbo].[SampleTable14]
(
    [firstname] [varchar](8000) NULL,
    [lastname] [nvarchar](300) NULL,
    [gender] [nvarchar](300) NULL,
    [location] [nvarchar](300) NULL,
    [subscription_type] [nvarchar](300) NULL
)
WITH
(
    DISTRIBUTION = ROUND_ROBIN,
    CLUSTERED COLUMNSTORE INDEX
)
GO&lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;I also tried maximizing the table length to 1000 and the data length to 8000.&lt;/P&gt;</description>
      <pubDate>Fri, 27 Sep 2019 10:52:53 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27868#M19711</guid>
      <dc:creator>hasitha</dc:creator>
      <dc:date>2019-09-27T10:52:53Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks throwing error "SQL DW failed to execute the JDBC query produced by the connector." while pushing the column with string length more than 255</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27869#M19712</link>
      <description>&lt;P&gt;You have to set the maxStrLength option to a value bigger than the longest string in your source data:&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;synapseDF.write \
      .format("com.databricks.spark.sqldw") \
      .option("url", connStr ) \
      .mode( "append" ) \
      .option("tempDir", synapse_tempDir ) \
      .option("forwardSparkAzureStorageCredentials", "true") \
      .option("maxStrLength", "1024" ) \
      .option("dbTable", synapse_targetschema + "." + synapse_targettable ) \
      .save() &lt;/CODE&gt;&lt;/PRE&gt;
&lt;P&gt;Synapse creates a temporary external table while loading data with PolyBase, so even if you create your target table with columns of the appropriate width, you can still get truncation errors from this temp table if you don't set maxStrLength.&lt;/P&gt;</description>
      <pubDate>Thu, 02 Apr 2020 13:42:34 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27869#M19712</guid>
      <dc:creator>ZAIvR</dc:creator>
      <dc:date>2020-04-02T13:42:34Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks throwing error "SQL DW failed to execute the JDBC query produced by the connector." while pushing the column with string length more than 255</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27870#M19713</link>
      <description>&lt;P&gt;Hello guys, I'm facing this error. It is not exactly the same as the one referred to above, but since it comes from the same code, I thought you might be able to help. Thanks in advance &lt;span class="lia-unicode-emoji" title=":slightly_smiling_face:"&gt;🙂&lt;/span&gt; I'm following the tutorial at this link:&lt;/P&gt;&lt;P&gt;&lt;A href="https://docs.microsoft.com/en-us/azure/azure-databricks/databricks-extract-load-sql-data-warehouse" target="_blank"&gt;https://docs.microsoft.com/en-us/azure/azure-databricks/databricks-extract-load-sql-data-warehouse&lt;/A&gt;&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;com.databricks.spark.sqldw.SqlDWSideException: SQL DW failed to execute the JDBC query produced by the connector.
Underlying SQLException(s):
  - com.microsoft.sqlserver.jdbc.SQLServerException: External file access failed due to internal error: 'Error occurred while accessing HDFS: Java exception raised on call to HdfsBridge_IsDirExist. Java exception message:
HdfsBridge::isDirExist - Unexpected error encountered checking whether directory exists or not: StorageException: This request is not authorized to perform this operation.' [ErrorCode = 105019] [SQLState = S0001]&lt;/CODE&gt;&lt;/PRE&gt; 
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 22 Apr 2020 17:12:37 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27870#M19713</guid>
      <dc:creator>RafaelCruz</dc:creator>
      <dc:date>2020-04-22T17:12:37Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks throwing error "SQL DW failed to execute the JDBC query produced by the connector." while pushing the column with string length more than 255</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27871#M19714</link>
      <description>&lt;P&gt;&lt;/P&gt;
&lt;P&gt;As ZAIvR suggested, use append mode and set maxStrLength while pushing the data. Overwrite may not work with this unless the Databricks team has fixed the issue.&lt;/P&gt;
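&lt;P&gt;A minimal sketch of that suggestion (illustrative only; it assumes the connStr and tempDir variables and the target table shown earlier in the thread, and a maxStrLength wider than your longest string):&lt;/P&gt;
&lt;PRE&gt;&lt;CODE&gt;# Hypothetical example: append mode with maxStrLength set explicitly
df.write \
    .format("com.databricks.spark.sqldw") \
    .option("url", connStr) \
    .option("tempDir", tempDir) \
    .option("forwardSparkAzureStorageCredentials", "true") \
    .option("maxStrLength", "4096") \
    .option("dbTable", "dbo.testAddress") \
    .mode("append") \
    .save()&lt;/CODE&gt;&lt;/PRE&gt;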
&lt;P&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 24 Apr 2020 16:23:13 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27871#M19714</guid>
      <dc:creator>bhaumikg</dc:creator>
      <dc:date>2020-04-24T16:23:13Z</dc:date>
    </item>
    <item>
      <title>Re: Databricks throwing error "SQL DW failed to execute the JDBC query produced by the connector." while pushing the column with string length more than 255</title>
      <link>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27872#M19715</link>
      <description>&lt;P&gt;I tried this but I still get the same error. My string length is 1900 and I set maxStrLength to 3000, yet it still fails.&lt;/P&gt;</description>
      <pubDate>Sat, 27 Jun 2020 00:16:09 GMT</pubDate>
      <guid>https://community.databricks.com/t5/data-engineering/databricks-throwing-error-quot-sql-dw-failed-to-execute-the-jdbc/m-p/27872#M19715</guid>
      <dc:creator>ImranShaik</dc:creator>
      <dc:date>2020-06-27T00:16:09Z</dc:date>
    </item>
  </channel>
</rss>

