Knowledge Sharing Hub
Dive into a collaborative space where members like YOU can exchange knowledge, tips, and best practices. Join the conversation today and unlock a wealth of collective wisdom to enhance your experience and drive success.

Forum Posts

SumitSingh
by Contributor
  • 2817 Views
  • 7 replies
  • 9 kudos

From Associate to Professional: My Learning Plan to ace all Databricks Data Engineer Certifications

In today’s data-driven world, the role of a data engineer is critical in designing and maintaining the infrastructure that allows for the efficient collection, storage, and analysis of large volumes of data. Databricks certifications hold significan...

Latest Reply
sandeepmankikar
New Contributor II
  • 9 kudos

As an additional tip for those working towards both the Associate and Professional certifications, I recommend avoiding a long gap between the two exams to maintain your momentum. If possible, try to schedule them back-to-back with just a few days in...

6 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 712 Views
  • 1 reply
  • 3 kudos

Notification destination API

You can now create, delete, and update notification destinations through the Databricks API. The notification destinations API lets you programmatically manage a workspace's notification destinations. Notification destinations are used to send notific...
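As a rough illustration of the post, here is a minimal sketch of building the request body for creating an email notification destination. The endpoint path and field names in the comments are assumptions based on the post, not verified documentation; check the official API reference before relying on them.

```python
import json

# Hypothetical payload builder for an email notification destination.
# Field names ("display_name", "config", "email", "addresses") are
# assumptions modeled on the Databricks REST API style.
def build_destination_payload(display_name, addresses):
    """Build the JSON body for creating an email notification destination."""
    return {
        "display_name": display_name,
        "config": {"email": {"addresses": addresses}},
    }

payload = build_destination_payload("On-call alerts", ["oncall@example.com"])
# In practice you would POST this to the workspace, e.g.:
# requests.post(f"{host}/api/2.0/notification-destinations",
#               headers={"Authorization": f"Bearer {token}"},
#               data=json.dumps(payload))
print(json.dumps(payload))
```

The same create/update/delete operations are also exposed through the Databricks SDKs, which may be more convenient than raw REST calls.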

Knowledge Sharing Hub
databricks api
notification api
Latest Reply
Anushree_Tatode
Honored Contributor II
  • 3 kudos

Hi, Thank you for sharing this, Ajay. We appreciate you keeping the community informed! Thanks, Anushree

Ajay-Pandey
by Esteemed Contributor III
  • 1701 Views
  • 2 replies
  • 4 kudos

Onboarding your new Databricks AI/BI Genie

The integration of AI and BI into the modern data stack has been a game-changer for businesses seeking to leverage data-driven insights. Databricks, a leader in this innovative frontier, has introduced the AI/BI Genie, a tool designed to democratize ...

Knowledge Sharing Hub
AI
assistance
bi
data engineering
Databricks
Latest Reply
Anushree_Tatode
Honored Contributor II
  • 4 kudos

Hi, The AI/BI Genie is a fantastic innovation! By enabling natural language queries and learning from user interactions, it makes data analytics more accessible and insightful. It’s a powerful tool for businesses looking to enhance their data-driven d...

1 More Reply
Rishabh-Pandey
by Esteemed Contributor
  • 1106 Views
  • 1 reply
  • 2 kudos

The Importance of Databricks in Today's Data Market

Unlocking the Power of Data with Databricks: In the rapidly evolving landscape of data and analytics, Databricks has emerged as a transformative force, reshaping how organizations handle big data, data engineering, and machine learning. As we naviga...

Latest Reply
Anushree_Tatode
Honored Contributor II
  • 2 kudos

Hi @Rishabh-Pandey, Thank you for sharing this! Databricks is really making a difference with its unified platform and powerful features. Excited to see how it will continue to shape the future of data! Thanks,

Rishabh-Pandey
by Esteemed Contributor
  • 6250 Views
  • 1 reply
  • 1 kudos

Databricks on Databricks: Kicking off the Journey to Governance with Unity Catalog

In the world of data engineering and analytics, governance is paramount. Databricks has taken a significant step forward in this arena with the introduction of Unity Catalog, a unified governance solution for all data and AI assets. This journey to e...

Latest Reply
Anushree_Tatode
Honored Contributor II
  • 1 kudos

Hi Rishabh, Unity Catalog is a major advancement in data governance, enhancing how we manage and secure data and AI assets in Databricks. Exciting to see these improvements! Thanks, Anushree

Brahmareddy
by Honored Contributor
  • 782 Views
  • 1 reply
  • 2 kudos

How to detect the Risks in Claims Data Using Databricks and PySpark

As a data engineer with experience in Databricks and other data engineering tools, I know that processing claims data and detecting risks early can really help in insurance claims processing. In this article, I’ll show you how to use Databricks and P...
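The article itself is truncated here, so as a stand-in, this is a minimal rule-based sketch of flagging risky claims. The field names and thresholds are made-up examples, not from the post; in Databricks the same rule would typically be a PySpark `filter` over a claims DataFrame.

```python
# Hypothetical risk rule: flag claims with an unusually large amount, or
# claims filed very soon after the policy started (a common fraud signal).
# The 10,000 amount threshold and 30-day window are illustrative only.
def flag_risky_claims(claims, amount_threshold=10_000):
    """Return the claims that trip either risk rule."""
    return [
        c for c in claims
        if c["amount"] > amount_threshold or c["days_since_policy_start"] < 30
    ]

claims = [
    {"id": 1, "amount": 2_500, "days_since_policy_start": 400},
    {"id": 2, "amount": 50_000, "days_since_policy_start": 200},
    {"id": 3, "amount": 1_000, "days_since_policy_start": 5},
]
print([c["id"] for c in flag_risky_claims(claims)])  # [2, 3]
```

On real claims volumes the list comprehension would be replaced by a distributed filter, but the rule logic stays the same.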

Latest Reply
Anushree_Tatode
Honored Contributor II
  • 2 kudos

Hi, Thanks for sharing this! Kudos for breaking it down so clearly. I’m sure it will help other community members. Thanks, Anushree

Ajay-Pandey
by Esteemed Contributor III
  • 431 Views
  • 0 replies
  • 2 kudos

September 2024 - Databricks SQL fixes

Added customization options for number formats in dashboard widgets, including currencies and percentages. The width of the Values column in pivot tables now auto-adjusts based on the width of the cell value names. The time displayed in dashboard widge...

wheersink
by New Contributor
  • 503 Views
  • 1 reply
  • 0 kudos

SQL code for appending a notebook result into an existing database table

I am attempting to append the results from a notebook query results table into an existing Databricks database table. By chance, would someone share an example of the SQL code with me?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @wheersink, So let's say you created the following table with some sample values.
%sql
CREATE TABLE dev.default.employee (
  id INT,
  name STRING,
  age INT,
  department STRING
);
INSERT INTO dev.default.employee VALUES (1, 'John Doe', 30, 'Financ...
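The reply is cut off before the append step, so here is a hedged sketch of the statement the question is after: an `INSERT INTO ... SELECT`, which appends rows from one table into an existing one. The table names below are hypothetical; in a Databricks notebook you would run the generated statement with `spark.sql(...)`.

```python
# Build an INSERT INTO ... SELECT statement that appends rows from
# `source` into the existing table `target`. An explicit column list keeps
# the append robust if the column order of the two tables ever differs.
def build_append_sql(target, source, columns=None):
    col_list = f" ({', '.join(columns)})" if columns else ""
    select_cols = ", ".join(columns) if columns else "*"
    return f"INSERT INTO {target}{col_list} SELECT {select_cols} FROM {source}"

sql = build_append_sql(
    "dev.default.employee",           # existing target table (example name)
    "dev.default.query_results",      # table holding the notebook results
    columns=["id", "name", "age", "department"],
)
print(sql)
# In a notebook: spark.sql(sql)
```

If the notebook result only exists as a DataFrame, `df.write.mode("append").saveAsTable("dev.default.employee")` is the usual alternative.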

Ajay-Pandey
by Esteemed Contributor III
  • 1277 Views
  • 1 reply
  • 2 kudos

🚀 Boost Your Data Pipelines with Dynamic, Data-Driven Databricks Workflows (For Each Task)! 💡

Unlock the power of the For Each task in Databricks to seamlessly iterate over collections—whether it's a list of table names or any value—and dynamically run tasks with specific parameter values. This powerful feature lets you automate repetitive pr...
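To make the idea concrete, a For Each task in a Jobs API payload looks roughly like the fragment below. The notebook path, table names, and parameter names are invented for illustration; treat the exact field layout as an assumption to verify against the Jobs API reference.

```json
{
  "tasks": [
    {
      "task_key": "process_tables",
      "for_each_task": {
        "inputs": "[\"orders\", \"customers\", \"payments\"]",
        "concurrency": 2,
        "task": {
          "task_key": "process_one_table",
          "notebook_task": {
            "notebook_path": "/Workspace/etl/process_table",
            "base_parameters": { "table_name": "{{input}}" }
          }
        }
      }
    }
  ]
}
```

Each iteration runs the nested task with `{{input}}` bound to one element of `inputs`, with up to `concurrency` iterations in flight at once.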

Knowledge Sharing Hub
automation
bigdata
Databricks
dataengineering
Workflow
Latest Reply
Rishabh-Pandey
Esteemed Contributor
  • 2 kudos

Thanks for sharing @Ajay-Pandey 

Ajay-Pandey
by Esteemed Contributor III
  • 1630 Views
  • 0 replies
  • 2 kudos

Cross-filtering for AI/BI dashboards

AI/BI dashboards now support cross-filtering, which allows you to click on an element in one chart to filter and update related data in other charts. Cross-filtering allows users to interactively explore relationships and patterns across multiple visu...

Knowledge Sharing Hub
AI BI
Cross-filtering
Dashboards
reports
Ajay-Pandey
by Esteemed Contributor III
  • 522 Views
  • 0 replies
  • 2 kudos

SHOW CREATE TABLE for materialized views and streaming tables

Added support for SHOW CREATE TABLE for materialized views and streaming tables. This will show the complete CREATE command used at creation time, including properties and schedules. The Catalog Explorer "Overview" tab now shows the full CREATE command for...

Knowledge Sharing Hub
materialized views
Streaming tables
Ajay-Pandey
by Esteemed Contributor III
  • 2332 Views
  • 0 replies
  • 2 kudos

LakeFlow Connect

LakeFlow Connect introduces simple ingestion connectors for databases, enterprise applications, and file sources. Set up efficient, low-maintenance pipelines in just a few clicks or via an API. Current sources include Salesforce, Workday, and SQL Ser...

Knowledge Sharing Hub
Databricks
ingestion
LakeFlow
harripy
by New Contributor III
  • 2336 Views
  • 2 replies
  • 3 kudos

Resolved! Timeout handling with JDBC connection to SQL Warehouse

We have tried to build connection test logic into our software to check the reachability of the SQL Warehouse, yet the connection parameters do not seem to function in the expected manner. When the SQL Warehouse is running, the connection test functions...
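A common pattern for this situation is to pair short per-attempt timeouts with an outer retry deadline, so a warehouse that is still starting up does not fail the test immediately. Below is a language-agnostic sketch in Python; `connect` stands in for whatever JDBC/ODBC connect call your software makes, which is an assumption, not the poster's actual code.

```python
import time

# Retry `connect` until it succeeds or `timeout_s` elapses, re-raising the
# last connection error on timeout. `clock` and `sleep` are injectable so
# the wrapper is easy to test without real waiting.
def wait_for_warehouse(connect, timeout_s=300.0, poll_s=10.0,
                       clock=time.monotonic, sleep=time.sleep):
    deadline = clock() + timeout_s
    while True:
        try:
            return connect()
        except Exception:
            if clock() >= deadline:
                raise
            sleep(poll_s)
```

Keep each individual attempt's socket/login timeout short (seconds, not minutes) so the loop, rather than the driver, governs how long the overall test waits.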

Latest Reply
NandiniN
Databricks Employee
  • 3 kudos

Hello, To create the connection you need an endpoint. I would suggest giving a Serverless warehouse a try so that you do not have to wait, and for the suggestion on the product you may also submit feedback and share the details of your use cas...

1 More Reply
pavlosskev
by New Contributor III
  • 876 Views
  • 0 replies
  • 1 kudos

Databricks java.util.NoSuchElementException: None.get Error on "SHOW TABLES IN" command

This is not a question; this is just the solution to a problem we encountered, in case someone from the community finds it useful. Recently we encountered an issue where our users' jobs started failing out of nowhere on the following command, with the...


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group