Community Articles
Dive into a collaborative space where members like YOU can exchange knowledge, tips, and best practices. Join the conversation today and unlock a wealth of collective wisdom to enhance your experience and drive success.

Forum Posts

ThierryBa
by New Contributor III
  • 8259 Views
  • 3 replies
  • 5 kudos

Build & Refresh a Calendar Dates Table

Introduction: Maintaining accurate and up-to-date calendar date tables is crucial for reliable reporting, yet manual updates can be time-consuming and prone to error. This fundamental component serves as the backbone for date-based analysis, enabling a...
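
The full walkthrough is cut off above, but as a rough illustration of the idea (the date range and the table name main.default.dim_calendar are placeholders, not taken from the article), a calendar dimension can be generated and refreshed from a Databricks notebook with PySpark:

from pyspark.sql import functions as F

# "spark" is the SparkSession provided by the Databricks notebook.
# Generate one row per day over a fixed range (the range is a placeholder).
dates = spark.sql(
    "SELECT explode(sequence(DATE'2020-01-01', DATE'2030-12-31', INTERVAL 1 DAY)) AS date"
)

# Derive common calendar attributes from the date column.
calendar = (
    dates
    .withColumn("year", F.year("date"))
    .withColumn("quarter", F.quarter("date"))
    .withColumn("month", F.month("date"))
    .withColumn("day", F.dayofmonth("date"))
    .withColumn("day_of_week", F.dayofweek("date"))
)

# Overwrite the dimension table on each scheduled run so it stays current.
calendar.write.mode("overwrite").saveAsTable("main.default.dim_calendar")

Scheduling such a notebook as a daily job is one simple way to keep the table refreshed.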

Latest Reply
may-tun
New Contributor II
  • 5 kudos

Nice article, very informative! 

2 More Replies
AhmedAlnaqa
by Contributor
  • 8056 Views
  • 3 replies
  • 1 kudos

Reading Excel files folder

Dears, one of the tasks a data engineer (DE) often needs to do is ingest data from files, for example Excel files. Thanks to OnerFusion-AI for the thread below, which gives us the steps for reading from one file: https://community.databricks.com/t5/get-started-discussions/how-to...
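
The linked steps are not reproduced here; one common pattern for reading a whole folder of Excel files (the folder path is a placeholder, and this may differ from the approach in the thread) is to stack them with pandas and then convert to a Spark DataFrame:

import glob
import pandas as pd

# Folder of .xlsx files; the path is a placeholder (e.g. a Unity Catalog volume).
excel_dir = "/Volumes/main/default/raw_excel"

# Read every workbook with pandas (needs the openpyxl package) and stack the rows.
frames = [pd.read_excel(path) for path in glob.glob(f"{excel_dir}/*.xlsx")]
combined = pd.concat(frames, ignore_index=True)

# Convert to a Spark DataFrame for further processing in Databricks.
df = spark.createDataFrame(combined)
display(df)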

Latest Reply
maddy08
New Contributor II
  • 1 kudos

Hi @AhmedAlnaqa, can we read from an ADLS location too by using abfss? Thanks

2 More Replies
Ajay-Pandey
by Databricks MVP
  • 1373 Views
  • 1 replies
  • 3 kudos

Notification destination API

You can now create, delete, and update notification destinations through the Databricks API. The notification destinations API lets you programmatically manage a workspace's notification destinations. Notification destinations are used to send notific...
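
As a quick sketch of calling the API from Python (the endpoint path /api/2.0/notification-destinations is written from memory of the API reference and should be verified; the host and token values are placeholders read from environment variables):

import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]  # a personal access token

# List the workspace's notification destinations.
resp = requests.get(
    f"{host}/api/2.0/notification-destinations",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

# The response field name is assumed; adjust if the payload shape differs.
for dest in resp.json().get("results", []):
    print(dest.get("id"), dest.get("display_name"))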

Community Articles
databricks api
notification api
Latest Reply
Anushree_Tatode
Databricks Employee
  • 3 kudos

Hi, thank you for sharing this, Ajay. We appreciate you keeping the community informed! Thanks, Anushree

Ajay-Pandey
by Databricks MVP
  • 2593 Views
  • 2 replies
  • 5 kudos

Onboarding your new Databricks AI/BI Genie

The integration of AI and BI into the modern data stack has been a game-changer for businesses seeking to leverage data-driven insights. Databricks, a leader in this innovative frontier, has introduced the AI/BI Genie, a tool designed to democratize ...

Community Articles
AI
assistance
bi
data engineering
Databricks
Latest Reply
Anushree_Tatode
Databricks Employee
  • 5 kudos

Hi, the AI/BI Genie is a fantastic innovation! By enabling natural language queries and learning from user interactions, it makes data analytics more accessible and insightful. It’s a powerful tool for businesses looking to enhance their data-driven d...

1 More Replies
Rishabh-Pandey
by Databricks MVP
  • 2073 Views
  • 1 replies
  • 2 kudos

The Importance of Databricks in Today's Data Market

Unlocking the Power of Data with Databricks: In the rapidly evolving landscape of data and analytics, Databricks has emerged as a transformative force, reshaping how organizations handle big data, data engineering, and machine learning. As we naviga...

Latest Reply
Anushree_Tatode
Databricks Employee
  • 2 kudos

Hi @Rishabh-Pandey, thank you for sharing this! Databricks is really making a difference with its unified platform and powerful features. Excited to see how it will continue to shape the future of data! Thanks,

Rishabh-Pandey
by Databricks MVP
  • 6629 Views
  • 1 replies
  • 1 kudos

Databricks on Databricks: Kicking off the Journey to Governance with Unity Catalog

In the world of data engineering and analytics, governance is paramount. Databricks has taken a significant step forward in this arena with the introduction of Unity Catalog, a unified governance solution for all data and AI assets. This journey to e...

Latest Reply
Anushree_Tatode
Databricks Employee
  • 1 kudos

Hi Rishabh, Unity Catalog is a major advancement in data governance, enhancing how we manage and secure data and AI assets in Databricks. Exciting to see these improvements! Thanks, Anushree

Brahmareddy
by Esteemed Contributor
  • 1526 Views
  • 1 replies
  • 2 kudos

How to detect the Risks in Claims Data Using Databricks and PySpark

As a data engineer with experience in Databricks and other data engineering tools, I know that processing claims data and detecting risks early can really help in insurance claims processing. In this article, I’ll show you how to use Databricks and P...
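
The article's full pipeline is not shown in the preview; a minimal sketch of the general idea (the table name, column names, and the 3x threshold are invented for illustration) is to flag claims that are far above a policyholder's average:

from pyspark.sql import functions as F

# Load the claims table; the name and columns are placeholders.
claims = spark.table("main.insurance.claims")

# Average claim amount per policyholder.
stats = claims.groupBy("policy_id").agg(F.avg("claim_amount").alias("avg_amount"))

# Flag claims more than three times the policyholder's average as potentially risky.
flagged = (
    claims.join(stats, "policy_id")
          .withColumn("high_risk", F.col("claim_amount") > 3 * F.col("avg_amount"))
)

display(flagged.filter("high_risk"))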

Latest Reply
Anushree_Tatode
Databricks Employee
  • 2 kudos

Hi, thanks for sharing this! Kudos for breaking it down so clearly. I’m sure it will help other community members. Thanks, Anushree

Ajay-Pandey
by Databricks MVP
  • 1037 Views
  • 0 replies
  • 2 kudos

September 2024 - Databricks SQL fixes

• Added customization options for number formats in dashboard widgets, including currencies and percentages.
• The width of the Values column in pivot tables now auto-adjusts based on the width of the cell value names.
• The time displayed in dashboard widge...

wheersink
by New Contributor
  • 1628 Views
  • 1 replies
  • 0 kudos

SQL code for appending a notebook result into an existing database table

I am attempting to append the results table from a notebook query into an existing Databricks database table. By any chance, would someone share an example of the SQL code with me?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @wheersink, so let's say you created the following table with some sample values: %sql CREATE TABLE dev.default.employee (id INT, name STRING, age INT, department STRING); INSERT INTO dev.default.employee VALUES (1, 'John Doe', 30, 'Financ...
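
The rest of the reply is cut off, but the general append pattern it is building toward looks like this in a notebook (the SELECT stands in for whatever query the notebook produced, and the target table name is a placeholder that must already exist with a compatible schema):

# Capture the notebook query as a DataFrame; the SELECT is a placeholder.
result_df = spark.sql("""
    SELECT id, name, age, department
    FROM dev.default.employee
    WHERE department = 'Finance'
""")

# Append the rows to an existing table rather than overwriting it.
result_df.write.mode("append").saveAsTable("dev.default.employee_archive")

# Equivalent pure-SQL form:
# INSERT INTO dev.default.employee_archive
# SELECT id, name, age, department FROM dev.default.employee WHERE department = 'Finance';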

Ajay-Pandey
by Databricks MVP
  • 2287 Views
  • 1 replies
  • 2 kudos

🚀 Boost Your Data Pipelines with Dynamic, Data-Driven Databricks Workflows (For Each Task)! 💡

Unlock the power of the For Each task in Databricks to seamlessly iterate over collections—whether it's a list of table names or any value—and dynamically run tasks with specific parameter values. This powerful feature lets you automate repetitive pr...
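
As a sketch of what such a job definition can look like when created through the Jobs API (the field names below follow my reading of the Jobs API 2.1 reference and should be double-checked; the notebook path and table list are placeholders), each input value is passed to the nested task via the {{input}} reference:

import json

job_payload = {
    "name": "refresh-tables",
    "tasks": [
        {
            "task_key": "for_each_table",
            "for_each_task": {
                # The list of values to iterate over, one nested task run per entry.
                "inputs": json.dumps(["sales", "customers", "orders"]),
                "concurrency": 2,
                "task": {
                    "task_key": "refresh_one_table",
                    "notebook_task": {
                        "notebook_path": "/Workspace/jobs/refresh_table",
                        "base_parameters": {"table_name": "{{input}}"},
                    },
                },
            },
        }
    ],
}

# Print the payload; it would then be submitted via the Jobs UI, CLI, or create endpoint.
print(json.dumps(job_payload, indent=2))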

Community Articles
automation
bigdata
Databricks
dataengineering
Workflow
Latest Reply
Rishabh-Pandey
Databricks MVP
  • 2 kudos

Thanks for sharing @Ajay-Pandey 

Danny_Lee
by Valued Contributor
  • 1381 Views
  • 0 replies
  • 2 kudos

Code patterns / best practices in SparkSQL, pyspark and Scala

Hi all, I am looking for collections of code patterns for SparkSQL, PySpark, and Scala. I'm sure there are at least a few repos on GitHub with snippets and will share them as I find them in this thread. If you come across any good collections, please post your...

Ajay-Pandey
by Databricks MVP
  • 972 Views
  • 0 replies
  • 2 kudos

SHOW CREATE TABLE for materialized views and streaming tables

• Added support for SHOW CREATE TABLE for materialized views and streaming tables. This will show the complete CREATE command used at creation time, including properties and schedules.
• The Catalog Explorer "Overview" tab now shows the full CREATE command for...
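
A quick way to try it from a notebook (the object name is a placeholder):

# Retrieve the full CREATE statement of a materialized view or streaming table.
ddl = spark.sql("SHOW CREATE TABLE main.reporting.daily_sales_mv").collect()[0][0]
print(ddl)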

Community Articles
materialized views
Streaming tables
Ajay-Pandey
by Databricks MVP
  • 3285 Views
  • 0 replies
  • 2 kudos

LakeFlow Connect

LakeFlow Connect introduces simple ingestion connectors for databases, enterprise applications, and file sources. Set up efficient, low-maintenance pipelines in just a few clicks or via an API. Current sources include Salesforce, Workday, and SQL Ser...

Community Articles
Databricks
ingestion
LakeFlow
harripy
by New Contributor III
  • 4415 Views
  • 2 replies
  • 3 kudos

Resolved! Timeout handling with JDBC connection to SQL Warehouse

We have tried to build connection-test logic into our software to check the reachability of the SQL Warehouse, yet the connection parameters do not seem to function in the expected manner. When the SQL Warehouse is running, the connection test functions...
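
The thread itself concerns the JDBC driver; as a simpler cross-check from Python, a connectivity test with the Databricks SQL Connector (a different client, shown only as a sketch with placeholder connection details) makes the stopped-warehouse case visible as an exception:

from databricks import sql

try:
    # Connection details are placeholders taken from the warehouse's "Connection details" page.
    with sql.connect(
        server_hostname="adb-1234567890123456.7.azuredatabricks.net",
        http_path="/sql/1.0/warehouses/abc123def456",
        access_token="dapi-REDACTED",
    ) as conn:
        with conn.cursor() as cursor:
            cursor.execute("SELECT 1")
            print("Warehouse reachable:", cursor.fetchone())
except Exception as exc:
    # A stopped or unreachable warehouse typically surfaces here as a timeout or connection error.
    print("Connection test failed:", exc)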

Latest Reply
NandiniN
Databricks Employee
  • 3 kudos

Hello, to create the connection you would need an endpoint. I would suggest you give a Serverless warehouse a try so that you do not have to wait. For the suggestion on the product, you may also submit feedback and share the details of the use cas...

1 More Replies
