Data Engineering

Unable to write Data frame to Azure Synapse Table

Akshith_Rajesh
New Contributor III

When I try to insert records into an Azure Synapse table using JDBC, it throws the error below:

com.microsoft.sqlserver.jdbc.SQLServerException: The statement failed. Column 'COMPANY_ADDRESS_STATE' has a data type that cannot participate in a columnstore index.

In the database, this table's column [COMPANY_ADDRESS_STATE] has the datatype Varchar(50).

df.write.format("jdbc").mode("append") \
    .option("url", sqlURL) \
    .option("dbtable", tableName) \
    .option("user", synUser) \
    .option("password", jdbcPassword) \
    .save()

Can someone please guide me toward the correct approach?

4 REPLIES

Hubert-Dudek
Esteemed Contributor III

Columns that use any of the following data types cannot be included in a columnstore index:

nvarchar(max), varchar(max), and varbinary(max)

(Applies to SQL Server 2016 and prior versions, and nonclustered columnstore indexes)

so the issue is on Azure Synapse's side.

Hi Hubert,

The interesting part is that none of my columns have nvarchar(max), varchar(max), or varbinary(max) as their data types.

Can you please let me know whether the approach I am using to write to the table is correct?

Or is there any other approach for writing data to the Synapse database using Databricks?

Hubert-Dudek
Esteemed Contributor III

Yes, indeed, it is max only. I remember having some issues with varchar too, but I cannot recall the details now. However, I solved it by creating a new table (e.g. COMPANY_ADDRESS_STATE_2) in Azure SQL from Databricks (omitting append mode) and comparing the data types.
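The "create a new table from Databricks and compare data types" workaround above can be sketched as follows. This is a hedged example, not code from the thread: `createTableColumnTypes` is a documented option of Spark's JDBC writer that pins explicit SQL types when Spark itself creates the table (it only takes effect on table creation, e.g. in overwrite mode, not on append to an existing table). `sqlURL`, `tableName`, `synUser`, and `jdbcPassword` are the variables from the question; the single-column mapping is illustrative.

```python
# Hedged sketch: without createTableColumnTypes, Spark's SQL Server dialect
# maps StringType to NVARCHAR(MAX), which a clustered columnstore index
# rejects. Pinning explicit lengths avoids that when Spark creates the table.

def build_column_types(columns):
    """Render the createTableColumnTypes value from a {name: sql_type} dict."""
    return ", ".join(f"{name} {sql_type}" for name, sql_type in columns.items())

# Illustrative mapping; list every string column of the real table here.
col_types = build_column_types({"COMPANY_ADDRESS_STATE": "VARCHAR(50)"})

# On a Databricks cluster you would then run (commented out here, since it
# needs a live SparkSession and a reachable Synapse instance):
# (df.write.format("jdbc")
#    .option("url", sqlURL)
#    .option("dbtable", tableName)
#    .option("user", synUser)
#    .option("password", jdbcPassword)
#    .option("createTableColumnTypes", col_types)
#    .mode("overwrite")
#    .save())
```

After the write, comparing the new table's column types against the original table shows which columns Spark would otherwise have widened to a MAX type.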

But one more thing: since the target is Synapse, I think you need to specify the dedicated connector by adding .format("com.databricks.spark.sqldw").

https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/synapse-analytics
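A write via the dedicated Synapse connector, per the linked Databricks docs, can be sketched as below. The `url`, `dbTable`, `tempDir`, and `forwardSparkAzureStorageCredentials` options are documented options of the `com.databricks.spark.sqldw` source; every server, database, and storage-account value here is a placeholder you would substitute with your own.

```python
# Hedged sketch of writing to Azure Synapse with the dedicated connector.
# All credentials and paths below are placeholders, not values from the thread.

def synapse_write_options(sql_url, table_name, temp_dir):
    """Collect the options the com.databricks.spark.sqldw source expects."""
    return {
        "url": sql_url,        # JDBC URL including user/password
        "dbTable": table_name, # target Synapse table
        "tempDir": temp_dir,   # ADLS/Blob staging path the connector loads through
        "forwardSparkAzureStorageCredentials": "true",  # reuse Spark's storage creds
    }

opts = synapse_write_options(
    "jdbc:sqlserver://<server>.database.windows.net:1433;"
    "database=<db>;user=<user>;password=<password>",
    "dbo.COMPANY",
    "abfss://<container>@<storage-account>.dfs.core.windows.net/tmp",
)

# On a Databricks cluster you would then run (needs a live SparkSession):
# (df.write.format("com.databricks.spark.sqldw")
#    .options(**opts)
#    .mode("append")
#    .save())
```

Unlike plain JDBC, this connector stages the data in the `tempDir` storage path and loads it into Synapse via PolyBase/COPY, which is the approach the linked documentation recommends.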

Kaniz
Community Manager

Hi @Rajesh Akshith, we haven't heard back from you on the last response from @Hubert Dudek, and I was checking in to see whether his suggestions helped you.

Otherwise, if you have found a solution, please share it with the community, as it may help others.

Also, please don't forget to click the "Select As Best" button whenever the information provided helps resolve your question.