Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

Hubert-Dudek
by Esteemed Contributor III
  • 10031 Views
  • 9 replies
  • 5 kudos

Databricks now supports event-driven workloads, especially for loading cloud files from external locations. This means you can save costs and resources by triggering your Databricks jobs only when new files arrive in your cloud storage instead of mou...
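For anyone who prefers to configure this programmatically rather than through the UI, here is a minimal sketch of attaching a file arrival trigger to an existing job through the Jobs 2.1 API. The workspace URL, token, job ID, and storage path are placeholders, not values taken from the post.

```python
# Minimal sketch: attach a file arrival trigger to an existing job via the Jobs 2.1 API.
# Workspace URL, token, job_id, and the external-location path are placeholders.
import requests

WORKSPACE = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

payload = {
    "job_id": 123,  # hypothetical job id
    "new_settings": {
        "trigger": {
            "file_arrival": {
                # External location to watch for new files
                "url": "abfss://landing@<storage-account>.dfs.core.windows.net/incoming/",
                "min_time_between_triggers_seconds": 60,
            }
        }
    },
}

resp = requests.post(
    f"{WORKSPACE}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
```

With a trigger like this in place, the job runs only when new files land in the watched path instead of polling on a fixed schedule.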

Latest Reply
adriennn
Contributor II
  • 5 kudos

@daniel_sahal I get your point, but if a scheduled trigger exposes all kinds of attributes about the trigger time (arguably, this is available for all the triggers), then why wouldn't the most important attribute of a file event be available ...

8 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 1315 Views
  • 1 reply
  • 6 kudos

Exciting news for #azure users! The #databricks runtime 12.2 has been officially released as a long-term support (LTS) version, providing a stable and reliable platform for users to build and deploy their applications. As part of this release, the en...

Latest Reply
jose_gonzalez
Databricks Employee
  • 6 kudos

Thank you for sharing @Hubert Dudek!!!

Hubert-Dudek
by Esteemed Contributor III
  • 1409 Views
  • 1 reply
  • 7 kudos

Starting from #databricks 12.2 LTS, the explode function can be used in the FROM statement to manipulate data in new and powerful ways. This function takes an array column as input and returns a new row for each element in the array, offering new pos...
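To make the syntax concrete, here is a minimal sketch assuming DBR 12.2 LTS or later and the `spark` session that Databricks notebooks provide; the sample values are made up.

```python
# Minimal sketch (DBR 12.2 LTS or later): call explode directly in the FROM clause.
# explode's default output column is named `col`; the sample array is made up.
spark.sql("""
    SELECT col AS fruit
    FROM explode(array('apple', 'banana', 'cherry'))
""").show()
```

Before 12.2, the same result typically required a LATERAL VIEW explode(...) clause or the DataFrame explode function.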

Latest Reply
jose_gonzalez
Databricks Employee
  • 7 kudos

Thank you for sharing @Hubert Dudek

Sujitha
by Databricks Employee
  • 7364 Views
  • 0 replies
  • 2 kudos

Weekly Release Notes Recap
Here’s a quick recap of the latest release notes updates from the past one week.
Databricks platform release notes, March 13 - 17, 2023
Execute SQL cells in the notebook in parallel: You can now run SQL cells in Databricks noteboo...

Hubert-Dudek
by Esteemed Contributor III
  • 991 Views
  • 1 reply
  • 5 kudos

Starting from #databricks runtime 12.2 LTS, implicit lateral column aliasing is now supported. This feature enables you to reuse an expression defined earlier in the same SELECT list, thus avoiding repetition of the same calculation. For instance, in ...
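Here is a minimal sketch of the behaviour, assuming DBR 12.2 LTS or later and the notebook-provided `spark` session; the employees table and its salary column are hypothetical.

```python
# Minimal sketch (DBR 12.2 LTS or later): reuse an alias defined earlier in the
# same SELECT list instead of repeating the expression.
# The `employees` table and `salary` column are hypothetical.
spark.sql("""
    SELECT salary * 0.10      AS bonus,
           salary + bonus     AS total_comp   -- `bonus` is reused via lateral column aliasing
    FROM employees
""").show()
```

On earlier runtimes the second expression would have to repeat salary * 0.10 or be moved into a subquery or CTE.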

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Thanks for sharing this with the Databricks community.

Hubert-Dudek
by Esteemed Contributor III
  • 1137 Views
  • 0 replies
  • 4 kudos


Databricks has introduced a new feature that allows users to send SQL statements to their database via REST API. Users can easily integrate this feature with any tool by simply posting their queries to the /api/2.0/sql/statements/ endpoint. With this...
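As a rough illustration, the call can look like the sketch below; the workspace URL, token, and warehouse ID are placeholders, and the wait_timeout value is just one way to ask for a synchronous response.

```python
# Minimal sketch: submit a query to the SQL Statement Execution API.
# Workspace URL, token, and warehouse ID are placeholders.
import requests

WORKSPACE = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

resp = requests.post(
    f"{WORKSPACE}/api/2.0/sql/statements/",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "statement": "SELECT 1 AS probe",
        "warehouse_id": "<sql-warehouse-id>",
        "wait_timeout": "30s",  # wait up to 30 seconds for the result inline
    },
)
resp.raise_for_status()
body = resp.json()
print(body["status"]["state"])   # e.g. SUCCEEDED
print(body.get("result"))        # inline result set, when the statement finished in time
```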

Hubert-Dudek
by Esteemed Contributor III
  • 782 Views
  • 1 reply
  • 5 kudos

Exciting news for Databricks users! #databricks launched a new feature that allows users to run job workflows continuously. Setting up a continuous job workflow is straightforward: create a job and select the continuous trigger option in the scheduli...
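The same switch can also be flipped through the Jobs 2.1 API; here is a minimal sketch with placeholder workspace URL, token, and job ID.

```python
# Minimal sketch: put an existing job into continuous mode via the Jobs 2.1 API.
# Workspace URL, token, and job_id are placeholders.
import requests

WORKSPACE = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

resp = requests.post(
    f"{WORKSPACE}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "job_id": 123,  # hypothetical job id
        "new_settings": {
            # Start a new run as soon as the previous one finishes
            "continuous": {"pause_status": "UNPAUSED"}
        },
    },
)
resp.raise_for_status()
```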

Latest Reply
jose_gonzalez
Databricks Employee
  • 5 kudos

Thank you for sharing!!!

Sujitha
by Databricks Employee
  • 909 Views
  • 1 reply
  • 1 kudos

Weekly Release Notes Recap
Here’s a quick recap of the latest release notes updates from the past one week.
Databricks platform release notes, February 21 - 28, 2023
Ray on Databricks (Public Preview): With Databricks Runtime 12.0 and above, you can create ...
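For reference, here is a minimal sketch of what starting Ray on a Databricks cluster can look like, assuming DBR 12.0+ with the ray package installed on the cluster; the worker count is an arbitrary example, not a recommendation.

```python
# Minimal sketch: start a Ray cluster on top of an existing Databricks/Spark cluster
# (assumes DBR 12.0+ and the `ray` package installed on the cluster).
import ray
from ray.util.spark import setup_ray_cluster, shutdown_ray_cluster

setup_ray_cluster(num_worker_nodes=2)  # arbitrary sizing for illustration
ray.init()                             # connect to the cluster started above

@ray.remote
def square(x):
    return x * x

print(ray.get([square.remote(i) for i in range(4)]))  # [0, 1, 4, 9]

shutdown_ray_cluster()
```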

Latest Reply
jose_gonzalez
Databricks Employee
  • 1 kudos

Thank you for sharing!!!

Sujitha
by Databricks Employee
  • 1507 Views
  • 3 replies
  • 5 kudos

Weekly Release Notes Recap
Here’s a quick recap of the latest release notes updates from the past one week.
Databricks platform release notes, December 1-6, 2022
Partner Connect supports connecting to AtScale: You can now easily create a connection betwe...

Latest Reply
karthik_p
Esteemed Contributor
  • 5 kudos

@Uma Maheswara Rao Desula If I am not wrong, the ideas portal below should help you: Ideas Portal | Databricks on AWS

2 More Replies