04-04-2022 05:02 AM
I would like to know if it's possible to connect the Databricks SQL module not only to the internal metastore DB and tables from the Data Science and Engineering module, but also to an AWS Redshift DB, so I can run queries and create alerts against it.
Labels:
- AWS
- Databricks SQL
- Redshift
04-04-2022 05:36 AM
@Lorenzo Rondan, I agree with your vision: since we can register Parquet from S3, why not, as in a data mesh, register anything, for example a Redshift or Azure SQL database?

04-04-2022 08:37 AM
@Lorenzo Rondan, you can create a table pointing to Redshift in a Python notebook and register that table with the metastore.
04-04-2022 08:40 AM
Hi @Joseph Kambourakis, can you give an example or link to docs on how to do it? Thanks!
04-04-2022 09:39 AM
Yes, sorry. Indeed, there is CREATE TABLE ... USING with the JDBC driver, though I've never tried it:
CREATE TABLE example_table
USING com.databricks.spark.redshift
OPTIONS (
  dbtable '<your-table-name>',
  tempdir 's3a://<your-bucket>/<your-directory-path>',
  url 'jdbc:redshift://<the-rest-of-the-connection-string>'
);
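For reference, a rough sketch of the equivalent from a Python notebook, which is what the earlier suggestion about registering the table with the metastore points at. All connection details and names below are placeholders, and this is untested rather than a confirmed recipe:

# Rough sketch: register a metastore table backed by Redshift from a Python notebook.
# All connection details and names below are placeholders.
spark.sql("""
  CREATE TABLE IF NOT EXISTS example_table
  USING com.databricks.spark.redshift
  OPTIONS (
    dbtable '<your-table-name>',
    tempdir 's3a://<your-bucket>/<your-directory-path>',
    url 'jdbc:redshift://<the-rest-of-the-connection-string>'
  )
""")

# Once registered, it can be queried like any other metastore table from the notebook.
spark.sql("SELECT * FROM example_table LIMIT 10").show()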
04-04-2022 02:35 PM
@Hubert Dudek @Joseph Kambourakis
After testing it, this new table pointing to Redshift doesn't work, because com.databricks.spark.redshift is not a supported data source in Databricks SQL (it isn't Delta, unlike a normal DB/table stored in S3).
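One possible workaround, purely as a sketch and not something confirmed in this thread: materialize the Redshift data into a Delta table from a notebook, since Databricks SQL can query Delta tables. The table names and the use of forward_spark_s3_credentials for S3 auth are assumptions about the connector setup:

# Hypothetical workaround: copy Redshift data into a Delta table that Databricks SQL can query.
# Placeholder names and credentials; this produces a point-in-time copy, not a live view.
redshift_df = (spark.read
    .format("com.databricks.spark.redshift")
    .option("url", "jdbc:redshift://<the-rest-of-the-connection-string>")
    .option("dbtable", "<your-table-name>")
    .option("tempdir", "s3a://<your-bucket>/<your-directory-path>")
    .option("forward_spark_s3_credentials", "true")
    .load())

(redshift_df.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("redshift_snapshot_example"))  # visible to Databricks SQL as a Delta table

A scheduled job could refresh the snapshot, but it remains a copy rather than a live connection to Redshift.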
04-05-2022 02:01 AM
@Lorenzo Rondan, so it seems that the USING com.databricks.spark.redshift syntax works only in a notebook/standard workspace. Databricks SQL endpoints have a different runtime, so let's hope that the Unity Catalog mentioned in the error will include external systems.
04-08-2022 04:18 AM
Hi @Lorenzo Rondan, you can check the content below and search for Redshift there.
Accepted Solution
04-26-2022 06:20 AM
Hi @Kaniz Fatma, I contacted customer support about this issue. They told me that this feature is not implemented yet, but it is on the roadmap with no ETA. It would be great if you could ping me back when it becomes possible to access Redshift tables from the SQL module in Databricks SQL.
04-26-2022 08:00 AM
Done, thanks. Looking forward to your update.

