How to write Delta Live Tables (DLT) pipeline output to Databricks SQL directly
12-14-2022 04:30 PM
Hi,
I am trying to see if it is possible to set up a direct connection from a DLT pipeline to a table in Databricks SQL by configuring the Target Schema:
with poc being the location of a schema like "dbfs:/***/***/***/poc.db"
The error message was just a syntax error, and I am not sure if I am missing a trick here.
I have read the documentation on Target Schema, but I don't think it relates to Databricks SQL. I wonder if someone can point me in the right direction or advise me on whether this can be done at all?
Many thanks in advance!
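For reference, the target schema is normally given as a database name in the DLT pipeline settings, not as a DBFS path. A minimal settings sketch (the pipeline name, storage path, and notebook path here are placeholders, not values from this thread):

```json
{
  "name": "poc_pipeline",
  "target": "poc",
  "storage": "dbfs:/pipelines/poc",
  "libraries": [
    { "notebook": { "path": "/Repos/example/dlt_notebook" } }
  ]
}
```

Note that "target" expects a schema (database) name; pointing it at a path like "dbfs:/.../poc.db" would be rejected.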
- Labels:
  - Databricks SQL
  - Setup
  - Target
12-20-2022 09:13 AM
Not quite sure what you mean by "output to Databricks SQL directly". DLT currently cannot write to Unity Catalog. The target schema name you specify in the DLT pipeline settings currently becomes a Hive metastore database, which DLT will create if it doesn't exist.
It sounds like what you are trying to do is:
You have an existing HMS database named "poc" with its location set to "dbfs:/..../poc.db", and you would like to use this database as the target schema in your DLT pipeline. I see no reason why you shouldn't be able to do this. Based on the error message, it looks like either the database path is incorrect or it contains special characters that are not allowed.
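As a quick sanity check on the "special characters" theory: Hive metastore database names are generally restricted to letters, digits, and underscores, so a path-like value such as "dbfs:/.../poc.db" would fail as a target schema name. A small sketch of that check (the exact naming rule is an assumption based on typical HMS behavior, not from this thread):

```python
import re

# Typical Hive metastore identifier rule (assumption): letters, digits,
# and underscores only. Some deployments are stricter (e.g. no leading digit).
_VALID_HMS_NAME = re.compile(r"^[A-Za-z0-9_]+$")

def is_valid_target_schema(name: str) -> bool:
    """Return True if `name` looks like a valid HMS database name."""
    return bool(_VALID_HMS_NAME.match(name))

print(is_valid_target_schema("poc"))               # True: a plain name is fine
print(is_valid_target_schema("dbfs:/tmp/poc.db"))  # False: a path is not
```

So the pipeline's target should be "poc" (the database name), while the DBFS path belongs on the database's LOCATION, set when the database is created.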
12-21-2022 01:24 AM
Whenever you store a Delta table in the Hive metastore, the table will be available in the Databricks SQL workspace (Data Explorer) under the hive_metastore catalog.
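In other words, once the DLT target database exists in the Hive metastore, its tables can be queried from Databricks SQL using a three-part name. A sketch of assembling that name (the schema and table names are illustrative, not from this thread):

```python
def fully_qualified(catalog: str, schema: str, table: str) -> str:
    """Build the three-part table name used in Databricks SQL,
    e.g. hive_metastore.poc.my_table."""
    return f"{catalog}.{schema}.{table}"

# Hypothetical table "my_table" in the DLT target schema "poc":
query = f"SELECT * FROM {fully_qualified('hive_metastore', 'poc', 'my_table')}"
print(query)  # SELECT * FROM hive_metastore.poc.my_table
```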
12-28-2022 02:15 AM
It won't let me write to the Hive metastore, only to DBFS - is that something I need to talk to my admin about?
12-28-2022 12:23 PM
You mean to say that you are not able to write to the Hive metastore either? In that case, you need to ask your admin to grant you write permission.

