03-05-2025 02:02 PM
I'm trying to insert and update data in a SQL Server table from a Python script. No matter what I try, it gives me this error:
The input query contains unsupported data source(s). Only csv, json, avro, delta, kafka, parquet, orc, text, unity_catalog, binaryFile, xml, simplescan, iceberg, mysql, postgresql, sqlserver, redshift, snowflake, sqldw, databricks, bigquery, oracle, salesforce, salesforce_data_cloud, teradata, workday_raas, mongodb data sources are supported on serverless compute...
- I've tried spark.sql on both serverless and non-serverless compute.
- I've tried updating using dataframe.write...
- I've tried a federated connection as well as opening a JDBC connection (a minimal sketch of that kind of write is below).
- Based on the error message and suggestions from the assistant, I've tried writing the input data to a Delta table first, then updating from there.
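For reference, here is a minimal sketch of the kind of write I mean; the hostname, database, table, and secret scope/key names are placeholders rather than my real ones:

```python
# Minimal sketch of the dataframe.write-over-JDBC attempt; hostname,
# database, table, and secret scope/key names are placeholders.
df = spark.createDataFrame([(1, "example")], ["id", "value"])  # stand-in data

jdbc_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "databaseName=mydb;encrypt=true"
)

(df.write
   .format("jdbc")
   .option("url", jdbc_url)
   .option("dbtable", "dbo.target_table")
   .option("user", dbutils.secrets.get("my-scope", "sql-user"))
   .option("password", dbutils.secrets.get("my-scope", "sql-password"))
   .mode("append")
   .save())
```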
Is anyone out there able to write to an existing SQL Server table who could give me some hints?
03-05-2025 04:04 PM
Hi @Unimog,
Currently, the data sources supported on serverless compute are limited to those listed in the general limitations documentation: General Serverless Limitations
Hopefully, support will soon be enabled for more formats, including the sqlserver source referenced in your dataframe write. When you use Delta as the format, though, it should work without any issues.
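For example, a minimal sketch of staging the incoming data as a Delta table on serverless (the three-level table name below is just an example, not a real object):

```python
# Hedged sketch: write the incoming DataFrame to a managed Delta table,
# which is a supported target on serverless compute.
# "main.staging.incoming_rows" is an example name, not a real object.
df = spark.createDataFrame([(1, "example")], ["id", "value"])  # stand-in data

(df.write
   .format("delta")
   .mode("append")
   .saveAsTable("main.staging.incoming_rows"))
```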
03-05-2025 03:18 PM
Hi @Unimog, could you provide the dataframe write statement, or more details on the error beyond the list of allowed file formats?
03-05-2025 03:47 PM
Here is the full text of the error message:
03-06-2025 07:39 AM
I thought I had already tried this on general purpose compute, but apparently not. After reading the docs you referenced, I retried using standard compute and it works perfectly.
Thanks!