Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

pt07
by New Contributor
  • 839 Views
  • 1 replies
  • 0 kudos

How do I pass a value from one task in a workflow to another task?

How do I pass a value from one task in a workflow to another task? #workflow #orchestration

Latest Reply
brockb
Databricks Employee
  • 0 kudos

Hi @pt07, this may be what you're looking for. Can you please take a look? https://www.databricks.com/blog/2022/08/02/sharing-context-between-tasks-in-databricks-workflows.html

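The linked blog post covers task values. A minimal sketch of the pattern, with a plain dict standing in for the job's task-value store so the flow can be shown end to end (inside a real Databricks job you would call `dbutils.jobs.taskValues.set`/`.get` instead; task and key names here are illustrative):

```python
# Stand-in for the job's task-value store; in Databricks this lives in the job run.
task_value_store = {}

def producer_task():
    # Databricks equivalent:
    # dbutils.jobs.taskValues.set(key="row_count", value=42)
    task_value_store[("task_a", "row_count")] = 42

def consumer_task():
    # Databricks equivalent:
    # dbutils.jobs.taskValues.get(taskKey="task_a", key="row_count", default=0)
    return task_value_store.get(("task_a", "row_count"), 0)

producer_task()
print(consumer_task())  # 42
```

The key point from the blog post is that values are scoped to the producing task's key, so the consumer names both the upstream task and the key it wants.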
subashdsouza
by New Contributor
  • 998 Views
  • 1 replies
  • 0 kudos
Latest Reply
mhiltner
Databricks Employee
  • 0 kudos

Delta UniForm is the way to go at the moment: https://www.databricks.com/blog/delta-lake-universal-format-uniform-iceberg-compatibility-now-ga In short, tables are written in Delta but remain compatible with Iceberg, as they will also save Iceberg...

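As a hedged config fragment (not from the thread): per the UniForm documentation, Iceberg compatibility is enabled through Delta table properties along these lines. The table name is made up, and property names should be checked against the current Databricks docs for your runtime:

```sql
-- Illustrative only: enable UniForm (Iceberg metadata generation) on a Delta table.
ALTER TABLE main.default.my_table SET TBLPROPERTIES (
  'delta.enableIcebergCompatV2'         = 'true',
  'delta.universalFormat.enabledFormats' = 'iceberg'
);
```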
Peter-M
by New Contributor II
  • 3247 Views
  • 3 replies
  • 2 kudos
Latest Reply
mhiltner
Databricks Employee
  • 2 kudos

You could have a workflow with two tasks, one being a "trigger checker": a very lightweight task scheduled to run every X hours/minutes. This first task would check for your different triggers and define success criteria for your next task....

2 More Replies
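The "trigger checker" task described above can be sketched as a small predicate that fails the task (so the downstream task is skipped) unless every expected trigger has fired. The file names and trigger mechanism here are assumptions for illustration:

```python
def triggers_ready(arrived, expected):
    """True when every expected trigger is present among the arrived ones."""
    return expected <= set(arrived)

# In the real first task, a failed check would raise to fail the task,
# which in turn skips the downstream task until the next scheduled run.
if not triggers_ready({"orders.csv"}, {"orders.csv", "users.csv"}):
    print("not all triggers fired; downstream task will be skipped this run")
```

The design choice is that scheduling stays simple (a fixed cron on the checker) while the actual trigger logic lives in ordinary code.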
Husainyusuf
by New Contributor
  • 756 Views
  • 1 replies
  • 0 kudos

DataAIsummit 2024

Having a wonderful learning experience, and sharing experience and knowledge with fellow data engineers, scientists, and architects.

Latest Reply
Ryangough37
New Contributor II
  • 0 kudos

Interesting use cases so far. Looking forward to the sessions tomorrow!

301444
by New Contributor
  • 687 Views
  • 0 replies
  • 0 kudos

dbr

Overwhelming, and there is a lot to learn.

Kaviprakash_S
by New Contributor III
  • 2786 Views
  • 3 replies
  • 1 kudos

SQL compilation error while connecting to snowflake from Databricks

Hi all, I'm trying to connect to a Snowflake database from a Databricks notebook, either to read or to write data. However, I'm getting a weird error. The code and error are provided as follows: snowflake_table = (spark.read  .format("snowfla...

Kaviprakash_S_0-1717986833387.png
Latest Reply
Kaviprakash_S
New Contributor III
  • 1 kudos

@Retired_mod Could you please help with this?

2 More Replies
jsperson
by New Contributor
  • 586 Views
  • 0 replies
  • 0 kudos

Partition Pruning with Non-Delta Files

Howdy - first time caller here. I'm trying to figure out how/if partition pruning works with non-CSV files. I have files landing in bronze in physical partitions of the form dlk_load_dtm=<load date time>. I'd like to load the partitions that don't yet...

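No replies yet, but one common approach to the question above is to prune Hive-style partition directories manually before reading, keeping only those newer than a load watermark. This is a sketch under the assumption of directories named `dlk_load_dtm=<timestamp>`; the paths and watermark are made up:

```python
def partitions_to_load(partition_dirs, last_loaded_dtm):
    """Return partition directories whose dlk_load_dtm is after the watermark."""
    def dtm(d):
        # Extract the timestamp from a Hive-style partition path.
        return d.split("dlk_load_dtm=", 1)[1]
    return [d for d in partition_dirs if dtm(d) > last_loaded_dtm]

dirs = [
    "bronze/dlk_load_dtm=2024-06-01T00:00:00",
    "bronze/dlk_load_dtm=2024-06-02T00:00:00",
]
print(partitions_to_load(dirs, "2024-06-01T00:00:00"))
# ['bronze/dlk_load_dtm=2024-06-02T00:00:00']
```

With ISO-8601 timestamps in the directory names, plain string comparison orders them correctly, so no date parsing is needed; the surviving paths can then be passed to the reader.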
heron
by New Contributor
  • 3271 Views
  • 2 replies
  • 2 kudos

Get metadata information about Delta Table using only SQL Editor (Query)?

I'm trying to obtain the basic information and the storage location of a Delta table, but without success. Is there a way to get the storage location, type, catalog, schema, and table name using the SQL Editor through a query? I can get the basic information (c...

Latest Reply
jacovangelder
Honored Contributor
  • 2 kudos

I believe what you're looking for is DESCRIBE EXTENDED <table_name>. This returns both the Delta storage location as well as detailed table information such as type, table properties, catalog, schema, etc.

1 More Replies
WWoman
by Contributor
  • 2217 Views
  • 3 replies
  • 1 kudos

Looking for descriptions of action_name column in system.access.audit.. specifically getTable

Hi all, I am looking for a description of the action_name column in system.access.audit. I am specifically interested in getTable, deleteTable, and createTable. I believe the latter two are self-descriptive, but I'd like to confirm. If getTable is related to acc...

Latest Reply
jacovangelder
Honored Contributor
  • 1 kudos

Have you read this? https://docs.databricks.com/en/admin/system-tables/audit-logs.html I do agree it is a bit vague, but getTable seems to fire when you do a DESCRIBE TABLE <table> or view table metadata in the UI, so it's not accessing data in the table,...

2 More Replies
felix_counter
by New Contributor III
  • 5275 Views
  • 8 replies
  • 5 kudos

Resolved! DBR 14.3 LTS auto-optimizes liquid cluster tables with "clusterBy" empty

Hello, I have a structured streaming job writing every 5 mins into a table with liquid clustering enabled. After migrating from DBR 13.3 LTS to DBR 14.3 LTS, I observe that the table is now regularly optimized even though I have not set the "spark.databrick...

Latest Reply
raphaelblg
Databricks Employee
  • 5 kudos

Hello @felix_counter, It seems you're referring to Predictive optimization for Delta Lake, a relatively new feature. In contrast to Optimized writes for Delta Lake on Databricks (basically `spark.databricks.delta.autoCompact.enabled` and `spark.dat...

7 More Replies
