How to write event_log destination into DLT Settings JSON via Asset Bundles
Hi all,
I would like to publish the event_log of my DLT Pipeline to a specific schema in Unity Catalog.
Following this article (https://docs.databricks.com/gcp/en/dlt/observability#query-the-event-log), this can be done by adding the following to the pipeline's settings JSON:
"event_log": { "catalog": "catalog_name", "schema": "schema_name", "name": "event_log_table_name" }
I want to incorporate this setting into my Databricks Asset Bundle deployment so that it doesn't need to be added manually.
Is there a way to do this?
Thanks and best regards,
Susanne
Labels: Delta Lake
I'm sharing an option in case you find it useful. In my case, I was testing how to publish these tables using Databricks Asset Bundles, but I couldn't get it to work, and I'm not sure whether I ever posted a question about it. Let's see if we're lucky and someone provides the corresponding property, if it exists.
From what I've seen, this property doesn't seem to exist, so I decided to write a Python script to handle the process. The script talks directly to the Databricks REST API of the corresponding workspace: given the IDs of several pipelines, it retrieves each pipeline's configuration with a GET request, adds the event_log block to it, and finally performs a PUT to update the pipeline.
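In case it helps, here is a minimal sketch of that kind of script. It assumes the standard Pipelines REST API (GET and PUT on /api/2.0/pipelines/{pipeline_id}) and that the pipeline settings come back under a "spec" field; the host, token, pipeline IDs and the event_log values are placeholders you would replace for your workspace.

```python
# Minimal sketch: read each pipeline's current spec, add the event_log block,
# and write the spec back through the Pipelines REST API.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace-url>
TOKEN = os.environ["DATABRICKS_TOKEN"]  # token with permission to edit the pipelines
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Hypothetical pipeline IDs -- replace with the pipelines you want to patch
PIPELINE_IDS = ["<pipeline-id-1>", "<pipeline-id-2>"]

EVENT_LOG = {
    "catalog": "catalog_name",
    "schema": "schema_name",
    "name": "event_log_table_name",
}

for pipeline_id in PIPELINE_IDS:
    # GET the current pipeline definition; the settings are assumed to be under "spec"
    resp = requests.get(f"{HOST}/api/2.0/pipelines/{pipeline_id}", headers=HEADERS)
    resp.raise_for_status()
    spec = resp.json()["spec"]

    # Add (or overwrite) the event_log destination
    spec["event_log"] = EVENT_LOG

    # PUT the full spec back to update the pipeline settings
    resp = requests.put(
        f"{HOST}/api/2.0/pipelines/{pipeline_id}",
        headers=HEADERS,
        json=spec,
    )
    resp.raise_for_status()
    print(f"Updated event_log for pipeline {pipeline_id}")
```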
Additionally, you can automate the execution of this job using Databricks Asset Bundles.
I hope this option helps you.

