Data Detective Series | Ready to crack a case with data? 🕵️‍♀️

We’re getting ready to pilot the Databricks Detective Series ahead of its launch at Data + AI Summit 2026, and we’re looking for Community members to help test it.  Databricks Detective Series is a free, gamified, hands-on experience where you become...

  • 112 Views
  • 1 reply
  • 3 kudos
yesterday
Take the Poll: How Are You Attending Data + AI Summit 2026?

Data + AI Summit 2026 is almost here! We’ve added a community poll on the homepage and would love to hear how you’re planning to attend this year. Cast your vote from the poll section on the right side of the Community homepage.

  • 228 Views
  • 0 replies
  • 4 kudos
Monday
Data+AI Summit 2026 | Get hands on with AI

Limited seats are available to join our revamped training program at Data + AI Summit 2026 in San Francisco, with early-bird pricing that gives you 50% off training if you register by April 30. Key highlights Big savings on training and certification...

  • 1221 Views
  • 2 replies
  • 4 kudos
3 weeks ago
The Lakebase Hub: Official Community Space for Lakebase Insights

Community Space for Lakebase Insights Are you building with Lakebase? If so, do you have a single source to stay ahead of every technical shift and architectural breakthrough? More importantly, do you know what practitioners are actually saying o...

  • 531 Views
  • 1 reply
  • 3 kudos
3 weeks ago
Data + AI Summit 2026: Registration Now Open - Early Bird Pricing!

June 15–18, 2026 Register Now! Join thousands of data analytics and AI professionals at Data + AI Summit 2026, the world’s largest conference for data, analytics, and AI, in San Francisco and virtually. Don’t miss early-bird pricing! Early-bird discoun...

  • 6985 Views
  • 6 replies
  • 4 kudos
02-19-2026
Solution Accelerator Series | Compliance: AML and KYC

Keeping up with financial crime is complex. Banks and financial institutions need to deal with changing regulations, sophisticated money laundering techniques and large volumes of data that must be analyzed to detect risk and protect client assets an...

  • 52 Views
  • 0 replies
  • 0 kudos
yesterday
🌟 Community Pulse: Your Weekly Roundup! April 27 – May 03, 2026

The Weekly Digest • April 27 – May 03 | Community Pulse: your cheat sheet for the week's biggest breakthroughs and sharpest threads. Top Contributors This Week, leading with curiosity and kindness: @szymon_dybczak @balajij8 @amirab...

  • 314 Views
  • 1 reply
  • 4 kudos
a week ago

Community Activity

KVNARK
by Honored Contributor II
  • 9 Views
  • 0 replies
  • 0 kudos

In our retail analytics project (CPG domain), Lakebase transformed how we handled operational data

We had ADF pipelines extracting POS data to Snowflake, but needed real-time operational tracking—job statuses, data quality alerts, user audit logs. Traditional RDS/SQL databases created ETL sync nightmares between ops and analytics layers. Lakebase s...

jessdarnell
by Databricks Employee
  • 34 Views
  • 1 reply
  • 0 kudos

Share your Lakebase story and receive a $50 gift card!

Calling all builders! Since GA, thousands of teams have moved production workloads onto Lakebase. We want to hear how you’re using it. A few examples of stories we’d love to capture: Building stateful AI agents that need persistent memory alongside lak...

Latest Reply
KVNARK
Honored Contributor II
  • 0 kudos

In our retail analytics project (CPG domain), Lakebase transformed how we handled operational data alongside analytics. We had ADF pipelines extracting POS data to Snowflake, but needed real-time operational tracking—job statuses, data quality alerts,...

123mishrachan
by Visitor
  • 15 Views
  • 0 replies
  • 0 kudos

Notebook load failed

Hi everyone, I am facing an issue loading the notebook, and on top of that, some of the other applications are also not loading. I am attaching an image of it. Please see if anyone can help.

Garybary
by New Contributor III
  • 1437 Views
  • 3 replies
  • 2 kudos

Scheduling jobs with table update triggers

Hi all, lately I've been experimenting with the newish feature of scheduling jobs on a table update trigger. There's one thing that's blocking me from implementing it, however, and I was hoping someone has found a solution to it. We occasionally perform a vac...

Latest Reply
SteveOstrowski
Databricks Employee
  • 2 kudos

Hi @Garybary, Quick clarification on how table update triggers actually behave, because this changes the answer significantly. Table update triggers fire on data-changing operations only (writes, merges, updates, deletes). A standalone VACUUM does NO...

2 More Replies
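The distinction in the reply above (data-changing operations fire table-update triggers, maintenance operations like VACUUM do not) can be sketched against the shape of `DESCRIBE HISTORY` output. The operation names and the trigger semantics below are illustrative assumptions to verify in your own workspace, not the authoritative list:

```python
# Sketch: given rows shaped like `DESCRIBE HISTORY <table>` output, flag
# which operations change data (and would fire a table-update trigger)
# versus maintenance operations like VACUUM that would not.
# ASSUMPTION: this classification mirrors the reply above; confirm against
# the Databricks docs for your workspace before relying on it.

DATA_CHANGING_OPS = {"WRITE", "MERGE", "UPDATE", "DELETE", "STREAMING UPDATE", "RESTORE"}
MAINTENANCE_OPS = {"VACUUM START", "VACUUM END", "OPTIMIZE"}

def fires_trigger(operation: str) -> bool:
    """True if this history operation should count as a data change."""
    return operation.upper() in DATA_CHANGING_OPS

history = [
    {"version": 12, "operation": "VACUUM END"},
    {"version": 11, "operation": "VACUUM START"},
    {"version": 10, "operation": "MERGE"},
    {"version": 9, "operation": "WRITE"},
]

# Only the MERGE and WRITE versions would have fired the trigger.
triggering = [h["version"] for h in history if fires_trigger(h["operation"])]
print(triggering)  # [10, 9]
```

A check like this against recent history can help confirm whether an unexpected run was caused by a real write or by table maintenance.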
Tushar_Parekar
by Databricks Employee
  • 112 Views
  • 1 reply
  • 3 kudos

Data Detective Series | Ready to crack a case with data? 🕵️‍♀️

We’re getting ready to pilot the Databricks Detective Series ahead of its launch at Data + AI Summit 2026, and we’re looking for Community members to help test it.  Databricks Detective Series is a free, gamified, hands-on experience where you become...

Latest Reply
Brahmareddy
Esteemed Contributor
  • 3 kudos

I'm in!

ThiagoRosetti
by Visitor
  • 29 Views
  • 0 replies
  • 0 kudos

Serverless Compute connectivity issues with .com.br domains vs. Classic Clusters Spark hangs

Hi everyone, I'm facing two specific issues in my Databricks Premium workspace (AWS, sa-east-1). Serverless connectivity issue: When using Serverless compute, I can successfully call APIs ending in .com, but calls to .com.br domains fail with connecti...

Saketh30
by New Contributor
  • 98 Views
  • 2 replies
  • 0 kudos

Transfer Certification - Previous Work Email to New Work Email

Hi Databricks Team, I've been trying to transfer my certifications from my previous employer email ID to my current employer email ID. I've opened a ticket with the partner Databricks certification team and haven't heard back from them for a while. He...

Latest Reply
KrisJohannesen
Contributor
  • 0 kudos

Take a look at this thread - it provides all the details you need, including the escalation path: https://community.databricks.com/t5/certifications/transfer-my-existing-profile-with-certifications-to-a-different/td-p/147890

1 More Replies
Rupa0503
by Visitor
  • 50 Views
  • 3 replies
  • 0 kudos

Schema Evolution and Schema Enforcement without Delta Live Tables & Unity Catalog

In Delta Lake, schema evolution with mergeSchema handles column additions perfectly — new columns get added and old rows get NULL. But when there is a data type change in the incoming data (for example, a column that was INT now coming as STRING from...

Latest Reply
Lu_Wang_ENB_DBX
Databricks Employee
  • 0 kudos

Yes — defining the schema manually for production is okay when you are not using Auto Loader. With a manually provided schema, you should expect stricter enforcement: Delta will not automatically absorb non-widening type changes like INT -> STRING in...

2 More Replies
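As the reply above notes, `mergeSchema` absorbs new columns but not non-widening type changes like INT -> STRING; those need an explicit cast to the target schema before the write. A minimal sketch of that decision, where the widening table is an illustrative assumption rather than the authoritative Delta type-widening matrix (check the docs for your Delta version):

```python
# Sketch: decide whether an incoming column type must be explicitly cast
# (e.g. with CAST / withColumn) to the target type before writing with
# .option("mergeSchema", "true").
# ASSUMPTION: SAFE_WIDENINGS below is illustrative; verify against your
# Delta version's type-widening support.

SAFE_WIDENINGS = {
    ("byte", "short"), ("byte", "int"), ("byte", "long"),
    ("short", "int"), ("short", "long"),
    ("int", "long"),
    ("float", "double"),
}

def needs_explicit_cast(incoming: str, target: str) -> bool:
    """True when the incoming type cannot be absorbed automatically."""
    if incoming == target:
        return False
    return (incoming, target) not in SAFE_WIDENINGS

print(needs_explicit_cast("int", "long"))    # False: a widening, may be absorbed
print(needs_explicit_cast("string", "int"))  # True: cast explicitly before the write
```

The INT-now-arriving-as-STRING case from the question lands in the second branch: cast the incoming column back to the target type (or widen the target deliberately) before merging.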
TalessRocha
by New Contributor II
  • 5940 Views
  • 11 replies
  • 8 kudos

Resolved! Connect to azure data lake storage using databricks free edition

Hello guys, I'm using Databricks Free Edition (serverless) and I am trying to connect to an Azure Data Lake Storage account. The problem I'm having is that in the Free Edition we can't configure the cluster, so I tried to make the connection via notebook using ...

Latest Reply
pjvi
New Contributor II
  • 8 kudos

If you want to read from your Azure storage account using Databricks Free Edition, you can add a specific option when reading: spark.read.option("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net", "your_storage_account...

10 More Replies
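The per-read option in the accepted answer hinges on getting two strings right: the ABFS account-key config name and the abfss:// path. A small sketch that builds both, with `mystorageacct`, `raw`, and the secret scope names as placeholder assumptions:

```python
# Sketch: helpers that build the Hadoop option name for account-key auth
# on ADLS Gen2 and the matching abfss:// path. Storage account, container,
# and path below are placeholders you must replace.

def account_key_conf(storage_account: str) -> str:
    """Per-read Hadoop option name for account-key auth on ABFS."""
    return f"fs.azure.account.key.{storage_account}.dfs.core.windows.net"

def abfss_path(container: str, storage_account: str, path: str) -> str:
    """abfss:// URI for a file or folder in an ADLS Gen2 container."""
    return f"abfss://{container}@{storage_account}.dfs.core.windows.net/{path}"

print(account_key_conf("mystorageacct"))
print(abfss_path("raw", "mystorageacct", "sales/2026/01.csv"))

# In a notebook this would then be used roughly like (not run here;
# prefer a secret over a hardcoded key where secrets are available):
# df = (spark.read
#         .option(account_key_conf("mystorageacct"), "<account-key>")
#         .csv(abfss_path("raw", "mystorageacct", "sales/2026/01.csv")))
```

Keeping the two strings in helpers avoids the most common failure mode here: a typo in the long config name that silently falls back to an unauthenticated request.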
maikel
by Contributor II
  • 361 Views
  • 4 replies
  • 1 kudos

Resolved! Uploading file to volume and start ingestion job

Hello Community! I am writing to you with my idea about a data ingestion job which we have to implement in our project. The data we have are in CSV file format and, depending on the case, differ a little bit. Before uploading we pivot the CSV file...

Latest Reply
maikel
Contributor II
  • 1 kudos

Yeah, understood. Thank you very much once again! 

3 More Replies
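The pivot-before-upload step mentioned in the question can be done in plain Python before the file ever reaches a volume. A long-to-wide sketch using only the standard library, where the column names (entity, metric, value) are illustrative assumptions about the file layout:

```python
import csv
import io

# Sketch: long-to-wide pivot of a CSV prior to uploading it to a volume.
# ASSUMPTION: columns named entity/metric/value; adapt to the real layout.

raw = """entity,metric,value
A,temp,21
A,humidity,40
B,temp,19
B,humidity,55
"""

rows = list(csv.DictReader(io.StringIO(raw)))
metrics = sorted({r["metric"] for r in rows})  # one output column per metric

# Group values by entity, keyed by metric.
pivoted: dict[str, dict[str, str]] = {}
for r in rows:
    pivoted.setdefault(r["entity"], {})[r["metric"]] = r["value"]

out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["entity", *metrics])
for entity, vals in sorted(pivoted.items()):
    writer.writerow([entity, *(vals.get(m, "") for m in metrics)])

print(out.getvalue())
```

The resulting buffer can then be written to the volume path (for example with the workspace files API or `dbutils`), after which the ingestion job picks it up.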
maikel
by Contributor II
  • 32 Views
  • 0 replies
  • 0 kudos

Job tasks monitoring

Hello Community, we have a case in our project that we would like to solve in an elegant and scalable manner. As always, I would really appreciate your suggestions and experience. In short: we have a multi-step job consisting of 4 stages. In one of the ...

Danish11052000
by Contributor
  • 1130 Views
  • 7 replies
  • 1 kudos

Resolved! How should I correctly extract the full table name from request_params in audit logs?

I’m trying to build a UC usage/refresh tracking table for every workspace. For each workspace, I want to know how many times a UC table was refreshed or accessed each month. To do this, I’m reading the Databricks audit logs and I need to extract only ...

Latest Reply
SteveOstrowski
Databricks Employee
  • 1 kudos

Hi @Danish11052000, You are on the right track with the COALESCE approach. The reason for the inconsistency is that different Unity Catalog action types populate different keys in request_params. Here is a breakdown of the key fields and which action...

6 More Replies
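The COALESCE approach endorsed in the reply above (different Unity Catalog action types populate different `request_params` keys) can be mirrored outside SQL as a first-non-empty lookup. The candidate key names below are illustrative assumptions; confirm them against your own audit logs before relying on them:

```python
# Sketch: first-non-empty fallback over request_params, mirroring
# COALESCE(p.full_name_arg, p.table_full_name, p.name) in SQL.
# ASSUMPTION: the candidate key names are illustrative, not authoritative.

CANDIDATE_KEYS = ["full_name_arg", "table_full_name", "name"]

def extract_table_name(request_params: dict):
    """Return the first non-empty candidate key, or None if all are missing."""
    for key in CANDIDATE_KEYS:
        value = request_params.get(key)
        if value:
            return value
    return None

print(extract_table_name({"full_name_arg": "main.sales.orders"}))  # main.sales.orders
print(extract_table_name({"name": "orders"}))                      # orders
print(extract_table_name({}))                                      # None
```

Ordering the candidates from most- to least-qualified matters: the fully qualified `catalog.schema.table` form should win whenever an action type supplies it, so monthly counts group on one canonical name.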
Pranav_1699
by New Contributor
  • 32 Views
  • 0 replies
  • 0 kudos

Building a Spark Declarative Pipeline OSS with Apache Iceberg and AWS Glue Catalog

Hey everyone, I recently worked on building a modern financial data lakehouse using Spark Declarative Pipeline OSS (SDP OSS), Apache Iceberg, and AWS Glue Catalog. The blog covers: building declarative data pipelines with Spark; using Apache Iceberg a...

Data Engineering
Spark Declarative Pipelines
Tushar_Parekar
by Databricks Employee
  • 52 Views
  • 0 replies
  • 0 kudos

Solution Accelerator Series | Compliance: AML and KYC

Keeping up with financial crime is complex. Banks and financial institutions need to deal with changing regulations, sophisticated money laundering techniques and large volumes of data that must be analyzed to detect risk and protect client assets an...

SD_KCM
by Visitor
  • 41 Views
  • 1 replies
  • 0 kudos

Azure Databricks Exclusive groups

A new permissions primitive to prevent data from crossing between use cases hosted on the same Databricks workspace.

Latest Reply
SD_KCM
Visitor
  • 0 kudos

Medium link: https://medium.com/@kacn12872/azure-databricks-exclusive-groups-garantir-létanchéité-entre-cas-d-usage-sur-la-lakehouse-e340ce28f332

Polls
Data + AI Summit 2026 is almost here! 🚀 Will you be joining in person or virtually?
Top Kudoed Authors
Welcome to the Databricks Community!

Once you are logged in, you will be ready to post content, ask questions, participate in discussions, earn badges, and more.

Spend a few minutes exploring Get Started Resources, Learning Paths, Certifications, and Platform Discussions.

Join Learning Events here in the Community.

Connect with peers through Databricks User Groups and learn more about Community Events happening near you. We’re excited to see you get involved.

Latest from our Blog

Unit Testing TransformWithState Logic with TwsTester

You've spent the afternoon building a StatefulProcessor for your TransformWithState streaming job. It tracks per-user sessions, accumulates running totals, or deduplicates events. Now you want to know...

  • 195 Views
  • 0 kudos