Friday
Hi,
This is the context of my issue:
I have an AWS RDS Postgres database instance set up. I have also created a Postgres CONNECTION in Databricks and can view the Postgres tables under a newly created FOREIGN CATALOG in Databricks Unity Catalog.
Using the Databricks SQL editor, I can successfully read from these tables with a SELECT statement. But when I try to execute an INSERT statement in the SQL Editor, I get the following error:
The input query contains unsupported data source(s). Only csv, json, avro, delta, kafka, parquet, orc, text, unity_catalog, binaryFile, xml, simplescan, iceberg, mysql, postgresql, sqlserver, redshift, snowflake, sqldw, databricks, bigquery, oracle, salesforce, salesforce_data_cloud, teradata, workday_raas, mongodb data sources are supported on Databricks SQL, and only csv, json, avro, delta, kafka, parquet, orc, text, unity_catalog, binaryFile, xml, simplescan, iceberg data sources are allowed to run DML on Databricks SQL.
I am trying to figure out whether it is possible to perform an UPDATE/INSERT into a Postgres table from Databricks.
Thanks in advance!
Pankj
Accepted Solutions
Sunday
Hi @pankj0510,
DML against these foreign tables is blocked in Databricks SQL; you can only read from them through DBSQL. I think you can set up a JDBC URL to the Postgres database and use Spark/Pandas DataFrame write methods to insert the data.
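Something along these lines might work from a notebook. This is only a minimal sketch of the JDBC write approach; the RDS endpoint, database, table names, and secret scope below are placeholders, not values from your setup:

```python
# Minimal sketch: append rows to a Postgres table over JDBC from a Databricks notebook.
# Assumes it runs in a Databricks notebook where `spark` and `dbutils` are predefined.
# The endpoint, database, table names, and secret scope are placeholders - replace with your own.

jdbc_url = "jdbc:postgresql://<rds-endpoint>:5432/<database>"

connection_properties = {
    "user": "<username>",
    # Prefer a secret scope over hard-coding the password.
    "password": dbutils.secrets.get(scope="my-scope", key="postgres-password"),
    "driver": "org.postgresql.Driver",
}

# Rows to insert, read here from a Unity Catalog table (placeholder name).
df = spark.table("main.my_schema.my_source_table")

# Append the rows into the Postgres target table over JDBC.
df.write.jdbc(
    url=jdbc_url,
    table="public.my_target_table",
    mode="append",
    properties=connection_properties,
)
```

Note that this write goes straight through JDBC with the credentials you supply, rather than through the Unity Catalog foreign catalog.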
yesterday
Thanks @Alberto_Umana, appreciate your response.
The use case I am trying to address is this: we are using Unity Catalog to manage access to our Databricks tables. Now that we are introducing some tables in a Postgres DB, I want to control read/write access to these Postgres tables through Unity Catalog as well.
If I could control access to the external tables in Unity Catalog, then I would not have to implement separate security in AWS RDS Postgres using IAM.
With the suggested Spark/Pandas DataFrame approach, would it be able to leverage Unity Catalog to manage the 'Write' access?

