Self-Paced Learning Festival: 09 January - 30 January 2026

Grow Your Skills and Earn Rewards! Mark your calendar: January 09 – January 30, 2026. Join us for a three-week event dedicated to learning, upskilling, and advancing your career in data engineering, analytics, machine learning, and generative AI. ...

  • 146204 Views
  • 494 replies
  • 154 kudos
12-09-2025
🎤 Call for Presentations: Data + AI Summit 2026 is Open!

June 15–18, 2026 Are you building the future with data and AI? Then this is your moment. The Call for Proposals for Data + AI Summit 2026 is officially open, and we want to hear from builders, practitioners, and innovators across the data and AI com...

  • 9100 Views
  • 4 replies
  • 6 kudos
12-15-2025
Level Up with Databricks Specialist Sessions

How to Register & Prepare If you're interested in advancing your skills with Databricks through a Specialist Session, here's a clear guide on how to register and what free courses you can take to prepare effectively. How to Begin Your Learning Path S...

  • 4107 Views
  • 2 replies
  • 9 kudos
10-02-2025
Solution Accelerator Series | Scale cybersecurity analytics with Splunk and Databricks

Strengthening cybersecurity isn’t a one-time project — it’s an ongoing process of adapting to new and complex threats. The focus is on modernizing your data infrastructure and analytics capabilities to make security operations smarter, faster, and mo...

  • 1173 Views
  • 1 reply
  • 2 kudos
2 weeks ago

Community Activity

spjti
by Visitor

Manager hierarchy

For the employee-manager hierarchy, some employees do not have the correct manager mapping. What could be the reason?

  • 39 Views
  • 1 reply
  • 0 kudos
Latest Reply
MoJaMa
Databricks Employee
  • 0 kudos

Unfortunately, this question can't be answered without more information. If you can at least provide the schemas of the tables you're working with and the joins/CTEs you've tried, that would be a starting point.
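
As a starting point, here is a sketch of a common diagnostic, assuming a hypothetical employees(emp_id, emp_name, manager_id) schema: broken hierarchies are often caused by manager_id values that do not resolve to any employee, which a self-join can surface.

# Hypothetical schema: employees(emp_id, emp_name, manager_id).
# List employees whose manager_id does not match any emp_id (orphaned mappings).
orphans = spark.sql("""
    SELECT e.emp_id, e.emp_name, e.manager_id
    FROM employees e
    LEFT JOIN employees m ON e.manager_id = m.emp_id
    WHERE e.manager_id IS NOT NULL AND m.emp_id IS NULL
""")
display(orphans)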

demo-user
by New Contributor II

Connecting an S3-compatible endpoint (such as MinIO) to Unity Catalog

Hi everyone, is it possible to connect an S3-compatible storage endpoint that is not AWS S3 (for example MinIO) to Databricks Unity Catalog? I already have access using Spark configurations (s3a endpoint, access key, secret key, etc.), and I can read/...

  • 37 Views
  • 1 reply
  • 0 kudos
Latest Reply
MoJaMa
Databricks Employee
  • 0 kudos

Unfortunately, registering those as Storage Credentials isn't supported. But the ask seems to be coming up more frequently, and I believe Product is in the "Discovery" phase for supporting it in UC. Here are some standard questions that might help with coll...
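
For context, the kind of access the question already has might look like the sketch below on classic compute (endpoint, keys, and bucket are placeholders; on some setups the equivalent spark.hadoop.fs.s3a.* keys must be set in the cluster configuration instead). This grants Spark access but does not register anything with Unity Catalog.

# Placeholder values; session-level s3a configuration for an S3-compatible endpoint.
spark.conf.set("fs.s3a.endpoint", "https://minio.example.com")
spark.conf.set("fs.s3a.access.key", "<access-key>")
spark.conf.set("fs.s3a.secret.key", "<secret-key>")
spark.conf.set("fs.s3a.path.style.access", "true")

df = spark.read.parquet("s3a://my-bucket/some/path/")  # hypothetical bucket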

greengil
by New Contributor III

Resolved! Databricks database

In Oracle, I create schemas and tables and link tables together via primary/foreign keys to do SQL queries. In Databricks, I notice that I can create tables, but how do I link tables together for querying? Do Databricks queries need the key in t...

  • 74 Views
  • 4 replies
  • 0 kudos
Latest Reply
tharple135
New Contributor II
  • 0 kudos

@greengil, you treat your joins in Databricks the same way you would in any other relational database. Join 2+ tables together using the primary/foreign keys from each of those tables.
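
For illustration, a minimal join sketch with hypothetical customers and orders tables (in Databricks, primary/foreign key constraints are informational; the join itself does not require them):

# Hypothetical tables: customers(customer_id, customer_name) and
# orders(order_id, customer_id, amount), joined on the shared key.
result = spark.sql("""
    SELECT c.customer_name, o.order_id, o.amount
    FROM customers c
    JOIN orders o ON o.customer_id = c.customer_id
""")
display(result)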

3 More Replies
cdn_yyz_yul
by Contributor

unionByName several streaming dataframes of different sources

Is the following type of union safe with Spark Structured Streaming: unioning multiple streaming dataframes, each from a different source? Is there a better solution? For example, df1 = spark.readStream.table(f"{bronze_catalog}.{bronze_schema}.table1") ...

  • 39 Views
  • 3 replies
  • 0 kudos
Latest Reply
cdn_yyz_yul
Contributor
  • 0 kudos

Thanks @stbjelcevic, I am looking for a solution... Let's say I already have: df1 = spark.readStream.table(f"{bronze_catalog}.{bronze_schema}.table1") df2 = spark.readStream.table(f"{bronze_catalog}.{bronze_schema}.table2") df1a = df1.se...
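
A minimal sketch of the pattern under discussion (table names and checkpoint path are placeholders): unionByName works on streaming DataFrames, and allowMissingColumns=True fills columns missing on one side with nulls.

# Placeholder names: two bronze streaming sources unioned into one stream.
df1 = spark.readStream.table("bronze.demo.table1").select("id", "payload")
df2 = spark.readStream.table("bronze.demo.table2").select("id", "payload")

unioned = df1.unionByName(df2, allowMissingColumns=True)

(unioned.writeStream
    .option("checkpointLocation", "/tmp/checkpoints/union_demo")  # placeholder path
    .toTable("silver.demo.unioned"))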

2 More Replies
newenglander
by New Contributor II

Cannot import editable installed module in notebook

Hi, I have the following directory structure: mypkg/ - setup.py - mypkg/ - __init__.py - module.py - scripts/ - main (notebook). From the `main` notebook I have a cell that runs %pip install -e /path/to/mypkg. This command appears to succ...

  • 3555 Views
  • 3 replies
  • 1 kudos
Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Hey @newenglander, always great to meet a fellow New Englander! Could you share a bit more detail about your setup? For example, are you running on classic compute or serverless? And are you working in a customer workspace, or using Databricks Free ...
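
One common fix worth trying, assuming classic compute: a %pip install only takes effect after the Python process restarts, so restart Python before importing. A sketch, one statement per cell, with the path from the question:

%pip install -e /path/to/mypkg

dbutils.library.restartPython()  # restart so the editable install becomes importable

import mypkg.module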

2 More Replies
dbrixr
by New Contributor II

Reverse colors in dark mode

It seems that Databricks implements its dark mode by applying the invert filter so that all colors are reversed. This is problematic if one wants to create some sort of HTML widget or rich output, since this filter is passed down to the result of disp...

  • 2505 Views
  • 3 replies
  • 1 kudos
Latest Reply
valentin9
Visitor
  • 1 kudos

I also struggled with that issue. What I found is that there is a style applied to the iframe, "filter: var(--invert-filter);", which applies this CSS: filter: invert(1) saturate(0.5). I couldn't find any elements within the iframe that I can use to de...
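
As an illustrative workaround sketch (approximate, not an official API): wrapping the rich output in a counter-filter can roughly cancel the page's filter, since invert(1) undoes invert(1) and saturate(2) roughly compensates for saturate(0.5).

# Approximate counter-filter for dark mode; colors may still be slightly off.
displayHTML("""
<div style="filter: invert(1) saturate(2);">
  <div style="background: white; color: #b00000; padding: 8px;">
    This widget keeps roughly its original colors in dark mode.
  </div>
</div>
""")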

2 More Replies
Sanjeeb2024
by Contributor III

Need Help - System tables that contain all Databricks users and service principal details

Hi all - I am trying to create a dashboard where I need to list all users and service principals along with groups, and understand their Databricks usage. Is there any table available in Databricks that contains user and service principal details? ...

  • 671 Views
  • 14 replies
  • 2 kudos
Latest Reply
emma_s
Databricks Employee
  • 2 kudos

Hi, I can't find any reference to a user system table in our docs. Instead, the recommended approach is to use the API to return users, groups, and service principals. You can either run this using the Workspace Client if you only have workspace admin p...
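
A minimal sketch of that API-based approach with the Databricks SDK for Python (assumes appropriate admin permissions; install databricks-sdk first if it isn't available):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up notebook/workspace authentication automatically

users = [u.user_name for u in w.users.list()]
sps = [sp.display_name for sp in w.service_principals.list()]
groups = [g.display_name for g in w.groups.list()]

print(f"{len(users)} users, {len(sps)} service principals, {len(groups)} groups")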

13 More Replies
Raman_Unifeye
by Contributor III

Change default workspace access to Consumer access

When Databricks One was launched, the default behaviour of the system-managed users group was a major issue. Since every new user is automatically added to the users group, and that group traditionally came with "Workspace Access" entitlements, admins ha...

  • 84 Views
  • 2 replies
  • 2 kudos
Latest Reply
Raman_Unifeye
Contributor III
  • 2 kudos

Thanks @Advika, it was certainly a big win for my env.

1 More Replies
Hubert-Dudek
by Databricks MVP

Runtime 18 GA

Runtime 18, including Spark 4.1, is no longer in Beta, so you can start migrating now. Runtime 18 is currently available only for classic compute; serverless and SQL warehouses are still on older runtimes. Once 18 is everywhere, we will be able to use identifi...

  • 30 Views
  • 0 replies
  • 1 kudos
Jim_Anderson
by Databricks Employee

FAQ for Databricks Learning Festival (Virtual): 09 January - 30 January 2026

General Q: How can I check whether I have completed the required modules? Log in to your Databricks Customer Academy account and select ‘My Activities’ in the user menu at the top left. Under the Courses tab, you can verify whether all courses wit...

  • 1351 Views
  • 4 replies
  • 6 kudos
Latest Reply
LivinVincent
New Contributor
  • 6 kudos

Hi Jim, it means we need to wrap up the certification completion by 07 May 2026.

3 More Replies
hobrob_ex
by New Contributor

Anchor links in notebook markdown

Does anyone know if there is a way to get anchor links working in Databricks notebooks, so you can jump to sections in the same notebook without a full page refresh? I.e. something that works like the following HTML: <a href="#jump_to_target">Jump</a>...<p...

  • 29 Views
  • 1 reply
  • 0 kudos
Latest Reply
iyashk-DB
Databricks Employee
  • 0 kudos

@hobrob_ex, yes, this is possible, but not the HTML way; instead, you will have to use the markdown rendering formats. Add # Heading 1, # Heading 2, and so on via the (+Text) button of the notebook. Once these headings/sections that you want are con...

dnchankov
by New Contributor II

Resolved! Why can't the notebook I created in a Repo be opened safely?

I've cloned a Repo during "Get Started with Data Engineering on Databricks". Then I'm trying to run another notebook from a cell with a magic %run command, but I get that the file can't be opened safely. Here is my code in notebook_a: name = "John" print(f"Hello ...

  • 9834 Views
  • 5 replies
  • 7 kudos
Latest Reply
iyashk-DB
Databricks Employee
  • 7 kudos

+1 to all the above comments. Having the %run command along with other commands will confuse the REPL execution. So having the %run notebook_b3 command alone in a new cell, maybe as the first cell in notebook_a, will resolve the issue, and your...
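
In other words, keep the magic by itself. Cell 1 of notebook_a contains only the %run line from the thread:

%run notebook_b3

and cell 2 carries on with the regular Python from the question:

name = "John"
print(f"Hello {name}")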

4 More Replies
fintech_latency
by New Contributor

How to guarantee “always-warm” serverless compute for low-latency Jobs workloads?

We’re building a low-latency processing pipeline on Databricks and are running into serverless cold-start constraints. We ingest events (calls) continuously via a Spark Structured Streaming listener. For each event, we trigger a serverless compute tha...

  • 133 Views
  • 9 replies
  • 0 kudos
Latest Reply
iyashk-DB
Databricks Employee
  • 0 kudos

@fintech_latency  For streaming: refactor to one long‑running Structured Streaming job with a short trigger interval (for example, 1s) and move “assignment” logic into foreachBatch or a transactional task table updated within the micro‑batch. For per...
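
A minimal sketch of that refactor (source table, target table, checkpoint path, and the assignment logic are all placeholders):

# One always-on stream replaces per-event serverless jobs; the per-event
# "assignment" work happens inside each micro-batch via foreachBatch.
def assign(batch_df, batch_id):
    batch_df.write.mode("append").saveAsTable("ops.assignments")  # placeholder logic

(spark.readStream.table("ops.incoming_calls")  # placeholder source
    .writeStream
    .trigger(processingTime="1 second")
    .option("checkpointLocation", "/tmp/checkpoints/assign")  # placeholder path
    .foreachBatch(assign)
    .start())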

8 More Replies
Fox19
by New Contributor II

DELTA_FEATURES_REQUIRE_MANUAL_ENABLEMENT DLT Streaming Table as Variant

I am attempting to ingest CSV files from an S3 bucket with Auto Loader. Since the schema of the data is inconsistent (each CSV may have different headers), I was hoping to ingest the data as Variant following this: https://docs.databricks.com/aws/en/i...

  • 87 Views
  • 4 replies
  • 0 kudos
Latest Reply
dbxdev
New Contributor II
  • 0 kudos

Can you also share the exact code you are running to ingest?

3 More Replies
Danik
by New Contributor

Population stability index (PSI) calculation in Lakehouse monitor

Hi! We are using Lakehouse Monitoring for detecting data drift in our metrics. However, the exact calculation of the metrics is not documented anywhere (I couldn't find it), which raises questions about how they are done, in our case especially PSI. I woul...

  • 35 Views
  • 1 reply
  • 1 kudos
Latest Reply
iyashk-DB
Databricks Employee
  • 1 kudos

Hi @Danik, I have reviewed this. 1) Is there documentation for PSI and other metrics? Public docs list PSI in the drift table and give thresholds, but don't detail the exact algorithm. Internally, numeric PSI uses ~1000 quantiles, equal-height binning...
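
For reference, the standard PSI definition the question is about is PSI = sum_i (a_i - e_i) * ln(a_i / e_i), where e_i and a_i are the baseline and current proportions in bin i. A short sketch of that formula (illustrative only; not necessarily Lakehouse Monitoring's exact implementation):

import numpy as np

def psi(expected, actual, bins=10):
    # Equal-height bin edges derived from the baseline window.
    edges = np.unique(np.quantile(expected, np.linspace(0, 1, bins + 1)))
    edges[0], edges[-1] = -np.inf, np.inf  # capture values outside the baseline range
    e = np.histogram(expected, edges)[0] / len(expected)
    a = np.histogram(actual, edges)[0] / len(actual)
    e = np.clip(e, 1e-6, None)  # avoid log(0) and division by zero
    a = np.clip(a, 1e-6, None)
    return float(np.sum((a - e) * np.log(a / e)))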
