Error when executing an INSERT statement on an External Postgres table from Databricks SQL Editor
Hi,
This is the context of my issue:
I have an AWS RDS Postgres database instance set up. I have also created a Postgres CONNECTION in Databricks and can view the Postgres tables under a newly created FOREIGN CATALOG in Databricks Unity Catalog.
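For context, the setup was done along these lines (the host, credentials, secret scope, and object names below are placeholders, not my actual values):

```sql
-- Connection to the RDS Postgres instance (placeholder host; the password
-- comes from a Databricks secret scope rather than being inlined).
CREATE CONNECTION pg_rds_conn TYPE postgresql
OPTIONS (
  host 'mydb.xxxxxxxx.us-east-1.rds.amazonaws.com',
  port '5432',
  user 'pg_user',
  password secret('pg_scope', 'pg_password')
);

-- Foreign catalog that mirrors the Postgres database in Unity Catalog.
CREATE FOREIGN CATALOG pg_rds_catalog
USING CONNECTION pg_rds_conn
OPTIONS (database 'mydb');
```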
Using the Databricks SQL Editor, I can successfully read from these tables with a SELECT statement. But when I try to execute an INSERT statement in the SQL Editor, I get the following error:
The input query contains unsupported data source(s). Only csv, json, avro, delta, kafka, parquet, orc, text, unity_catalog, binaryFile, xml, simplescan, iceberg, mysql, postgresql, sqlserver, redshift, snowflake, sqldw, databricks, bigquery, oracle, salesforce, salesforce_data_cloud, teradata, workday_raas, mongodb data sources are supported on Databricks SQL, and only csv, json, avro, delta, kafka, parquet, orc, text, unity_catalog, binaryFile, xml, simplescan, iceberg data sources are allowed to run DML on Databricks SQL.
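For reference, the statements I am running look roughly like this (the catalog, schema, table, and column names are placeholders):

```sql
-- Reading through the foreign catalog works fine:
SELECT * FROM pg_rds_catalog.public.customers LIMIT 10;

-- This is the kind of statement that fails with the error above:
INSERT INTO pg_rds_catalog.public.customers (id, name)
VALUES (101, 'test');
```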
I am trying to figure out: is it possible to perform an UPDATE/INSERT into a Postgres table from Databricks?
Thanks in advance!
Pankj

