Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

How to write event_log destination into DLT Settings JSON via Asset Bundles

susanne
New Contributor III

Hi all,

I would like to publish the event_log of my DLT Pipeline to a specific schema in Unity Catalog.
According to this article (https://docs.databricks.com/gcp/en/dlt/observability#query-the-event-log), this can be done by adding the following to the pipeline's settings JSON:

  "event_log": {
    "catalog": "catalog_name",
    "schema": "schema_name",
    "name": "event_log_table_name"
  }

I want to incorporate this setting into my Databricks Asset Bundle deployment so that it doesn't need to be added manually.
Is there a way to do this?

Thanks and best regards,
Susanne

1 REPLY

jorperort
Contributor

I'm sharing an option in case you find it useful. I was also testing how to publish these tables via Databricks Asset Bundles, but I couldn't get it to work, and I'm not sure whether I ever posted a question about it. With luck, someone will chime in with the corresponding property, if it exists.

From what I've seen, this property doesn't seem to exist in the bundle schema. So I wrote a Python script to handle the process instead. The script talks directly to the Databricks REST API of the corresponding workspace: for each pipeline ID, it retrieves the current configuration with a GET request, adds the event_log block, and then performs a PUT to update the pipeline.
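
A minimal sketch of such a script, assuming the workspace URL and a personal access token are available in the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables; the pipeline IDs and the event_log values are placeholders you would replace with your own:

  import os
  import requests

  # Assumptions: DATABRICKS_HOST (e.g. https://<workspace>.gcp.databricks.com)
  # and DATABRICKS_TOKEN hold the workspace URL and a personal access token.
  HOST = os.environ["DATABRICKS_HOST"].rstrip("/")
  HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

  PIPELINE_IDS = ["<pipeline-id-1>", "<pipeline-id-2>"]  # placeholders

  EVENT_LOG = {
      "catalog": "catalog_name",
      "schema": "schema_name",
      "name": "event_log_table_name",
  }

  for pipeline_id in PIPELINE_IDS:
      # GET the current pipeline definition; the settings live under "spec".
      resp = requests.get(f"{HOST}/api/2.0/pipelines/{pipeline_id}", headers=HEADERS)
      resp.raise_for_status()
      spec = resp.json()["spec"]

      # Add (or overwrite) the event_log destination in the spec.
      spec["event_log"] = EVENT_LOG

      # PUT the full spec back to update the pipeline in place.
      resp = requests.put(
          f"{HOST}/api/2.0/pipelines/{pipeline_id}", headers=HEADERS, json=spec
      )
      resp.raise_for_status()
      print(f"event_log set for pipeline {pipeline_id}")

Note that the PUT replaces the whole pipeline spec, so sending back the full spec returned by the GET (with only event_log added) keeps the rest of the pipeline's settings intact.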

Additionally, you can deploy and schedule this script as a job using Databricks Asset Bundles, roughly as sketched below.
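
As a sketch only (the job name, file path, and cluster settings are assumptions, not tested values), the script could be wired into the bundle as a job resource:

  # databricks.yml (excerpt) - hypothetical job resource that runs the script
  resources:
    jobs:
      set_dlt_event_log:
        name: set-dlt-event-log
        tasks:
          - task_key: set_event_log
            spark_python_task:
              python_file: ./scripts/set_event_log.py
            new_cluster:
              spark_version: 15.4.x-scala2.12
              node_type_id: n2-standard-4
              num_workers: 1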

I hope this option helps you.
