Self-Paced Learning Festival: 09 January - 30 January 2026

Grow Your Skills and Earn Rewards! Mark your calendar: January 09 – January 30, 2026. Join us for a three-week event dedicated to learning, upskilling, and advancing your career in data engineering, analytics, machine learning, and generative AI. ...

  • 147875 Views
  • 496 replies
  • 154 kudos
12-09-2025
🎤 Call for Presentations: Data + AI Summit 2026 is Open!

June 15–18, 2026. Are you building the future with data and AI? Then this is your moment. The Call for Proposals for Data + AI Summit 2026 is officially open, and we want to hear from builders, practitioners, and innovators across the data and AI com...

  • 9162 Views
  • 4 replies
  • 6 kudos
12-15-2025
Level Up with Databricks Specialist Sessions

How to Register & Prepare: If you're interested in advancing your skills with Databricks through a Specialist Session, here's a clear guide on how to register and what free courses you can take to prepare effectively. How to Begin Your Learning Path S...

  • 4117 Views
  • 2 replies
  • 9 kudos
10-02-2025
Solution Accelerator Series | Scale cybersecurity analytics with Splunk and Databricks

Strengthening cybersecurity isn’t a one-time project — it’s an ongoing process of adapting to new and complex threats. The focus is on modernizing your data infrastructure and analytics capabilities to make security operations smarter, faster, and mo...

  • 1184 Views
  • 1 reply
  • 2 kudos
2 weeks ago

Community Activity

gokkul
by New Contributor II
  • 8 Views
  • 0 replies
  • 0 kudos

How to access a delta table in UC from lakebase postgres ?

Hi DB Community, is there any way to access/write to a Delta table in UC from Lakebase Postgres? There's a way using "Sync Table", but it is recommended only to read data from a Sync Table; Databricks recommends against writing to sync tables. Or else...

maddan80
by New Contributor II
  • 100 Views
  • 1 reply
  • 0 kudos

Serverless giving inconsistent results in Oracle UCM SOAP call

Hello, we have implemented a data pipeline to ingest data from Oracle UCM using the SOAP API. This was working fine with Job and All-Purpose clusters. Recently we wanted to use Serverless to take advantage of the faster startup time. In this case we were n...

Latest Reply
mmayorga
Databricks Employee
  • 0 kudos

Hi @maddan80  Thank you for reaching out with your question and providing the context about your use case. Per your comments, having a 200 Status Code in Serverless is a good initial indicator that the request is reaching the Oracle UCM server. Brain...

spjti
by Visitor
  • 43 Views
  • 1 reply
  • 0 kudos

Manager hierarchy

For the employee manager hierarchy, some employees do not have the correct manager mapping. What could be the reason?

Latest Reply
MoJaMa
Databricks Employee
  • 0 kudos

Unfortunately, this question is unanswerable with so little information. If you can at least provide the schemas of the tables you're working with and the joins/CTEs you have tried, that would be a starting point.

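Since the reply above asks for schemas, here is the kind of sanity check that usually surfaces the cause. A minimal pure-Python sketch over a hypothetical employees table with (employee_id, manager_id) rows; all names and data here are illustrative, not from the thread:

```python
# Hypothetical rows from an employees table: (employee_id, manager_id).
# manager_id is None for the top of the hierarchy.
employees = [
    (1, None),   # CEO, no manager
    (2, 1),
    (3, 2),
    (4, 99),     # broken: manager 99 does not exist as an employee
    (5, 5),      # broken: employee listed as their own manager
]

ids = {emp for emp, _ in employees}

# Managers referenced but not present in the employee set.
missing_manager = [emp for emp, mgr in employees if mgr is not None and mgr not in ids]

# Employees pointing at themselves (breaks any hierarchy walk).
self_managed = [emp for emp, mgr in employees if emp == mgr]

print("managers not found:", missing_manager)
print("self-managed:", self_managed)
```

Dangling manager IDs and self-references are the two most common reasons a recursive hierarchy query drops or misplaces employees; the same checks translate directly into SQL anti-joins on the real table.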
demo-user
by New Contributor II
  • 45 Views
  • 1 reply
  • 0 kudos

Connecting an S3-compatible endpoint (such as MinIO) to Unity Catalog

Hi everyone, is it possible to connect an S3-compatible storage endpoint that is not AWS S3 (for example MinIO) to Databricks Unity Catalog? I already have access using Spark configurations (s3a endpoint, access key, secret key, etc.), and I can read/...

Latest Reply
MoJaMa
Databricks Employee
  • 0 kudos

Unfortunately, registering those as Storage Credentials is not supported. But the ask seems to be coming up more frequently, and I believe Product is in the "Discovery" phase of supporting it in UC. Here are some standard questions that might help with coll...

greengil
by New Contributor III
  • 98 Views
  • 4 replies
  • 0 kudos

Resolved! Databricks database

In Oracle, I create schemas and tables and link tables together via primary/foreign keys to do SQL queries. In Databricks, I notice that I can create tables, but how do I link tables together for querying? Do Databricks queries need the key in t...

Latest Reply
tharple135
New Contributor II
  • 0 kudos

@greengil you treat joins in Databricks the same way you would in any other relational database: join two or more tables together using the primary/foreign keys from each of those tables.

3 More Replies
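To make the reply above concrete, here is a minimal sketch of a primary/foreign-key join, run through Python's stdlib sqlite3 purely for illustration. The table and column names are hypothetical; in Databricks you would run the identical SELECT in a SQL cell or via spark.sql():

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical schema: orders.customer_id is a foreign key to customers.id.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Ada"), (2, "Grace")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(10, 1, 25.0), (11, 1, 40.0), (12, 2, 15.0)])

# The join itself is plain SQL -- the same statement works in a Databricks SQL cell.
rows = cur.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()

print(rows)
```

One Databricks-specific caveat worth knowing: primary and foreign key constraints on Unity Catalog tables are informational and not enforced, so the join runs the same whether or not the constraints are declared.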
cdn_yyz_yul
by Contributor
  • 45 Views
  • 3 replies
  • 0 kudos

unionbyname several streaming dataframes of different sources

Is the following type of union safe with Spark Structured Streaming: unioning multiple streaming dataframes, each from a different source? Is there a better solution? For example, df1 = spark.readStream.table(f"{bronze_catalog}.{bronze_schema}.table1") ...

Latest Reply
cdn_yyz_yul
Contributor
  • 0 kudos

Thanks @stbjelcevic, I am looking for a solution. Let's say I already have: df1 = spark.readStream.table(f"{bronze_catalog}.{bronze_schema}.table1") df2 = spark.readStream.table(f"{bronze_catalog}.{bronze_schema}.table2") df1a = df1.se...

2 More Replies
newenglander
by New Contributor II
  • 3559 Views
  • 3 replies
  • 1 kudos

Cannot import editable installed module in notebook

Hi, I have the following directory structure: - mypkg/ - setup.py - mypkg/ - __init__.py - module.py - scripts/ - main # notebook. From the `main` notebook I have a cell that runs: %pip install -e /path/to/mypkg. This command appears to succ...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Hey @newenglander — always great to meet a fellow New Englander! Could you share a bit more detail about your setup? For example, are you running on classic compute or serverless? And are you working in a customer workspace, or using Databricks Free ...

2 More Replies
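While waiting on those details, one quick diagnostic often helps with editable installs: check whether the running interpreter can see the package at all, and from where. A minimal sketch (the package name mypkg is the hypothetical one from the question):

```python
import importlib.util

# Ask the running interpreter whether "mypkg" is importable and where it resolves from.
spec = importlib.util.find_spec("mypkg")  # hypothetical package from the question
if spec is None:
    # Editable installs extend sys.path via a .pth entry, which an already-running
    # interpreter may not have picked up yet.
    print("mypkg is not importable in this interpreter")
else:
    print("mypkg resolves from:", spec.origin)
```

If the spec is missing right after a successful %pip install -e, restarting the Python process before importing is a common remedy; on recent Databricks runtimes, dbutils.library.restartPython() does this from within the notebook.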
dbrixr
by New Contributor II
  • 2514 Views
  • 3 replies
  • 1 kudos

Reverse colors in dark mode

It seems that Databricks implements its dark mode by applying the invert filter so that all colors are reversed. This is problematic if one wants to create some sort of html widget or rich output since this filter is passed down to the result of disp...

Latest Reply
valentin9
Visitor
  • 1 kudos

I also struggled with that issue. What I found is that there is a style applied to the iframe, "filter: var(--invert-filter);", which applies this CSS: filter: invert(1) saturate(0.5). I couldn't find any elements within the iframe that I can use to de...

2 More Replies
Sanjeeb2024
by Contributor III
  • 677 Views
  • 14 replies
  • 2 kudos

Need Help - System tables that contain all Databricks users and service principal details!!

Hi all - I am trying to create a dashboard where I need to list all users and service principals along with their groups, and understand their Databricks usage. Is there any table available in Databricks that contains user and service principal details? ...

Latest Reply
emma_s
Databricks Employee
  • 2 kudos

Hi, I can't find any reference to a user system table in our docs. Instead, the recommended approach is to use the API to return users, groups, and service principals. You can either run this using the Workspace Client if you only have workspace admin p...

13 More Replies
Raman_Unifeye
by Contributor III
  • 86 Views
  • 2 replies
  • 2 kudos

Change default workspace access to Consumer access

When Databricks One was launched, the default behaviour of the system-managed users group was a major issue. Since every new user is automatically added to users group, and that group traditionally came with "Workspace Access" entitlements, admins ha...

Latest Reply
Raman_Unifeye
Contributor III
  • 2 kudos

Thanks @Advika. It was certainly a big win for my env.

1 More Replies
Hubert-Dudek
by Databricks MVP
  • 34 Views
  • 0 replies
  • 1 kudos

Runtime 18 GA

Runtime 18, including Spark 4.1, is no longer in Beta, so you can start migrating now. Runtime 18 is currently available only for classic compute; Serverless and SQL warehouses are still on older runtimes. Once 18 is everywhere, we will be able to use identifi...

Jim_Anderson
by Databricks Employee
  • 1370 Views
  • 4 replies
  • 6 kudos

FAQ for Databricks Learning Festival (Virtual): 09 January - 30 January 2026

General Q: How can I check whether I have completed the required modules? Log in to your Databricks Customer Academy account and select ‘My Activities’ within the user menu in the top left. Under the Courses tab, you can verify whether all courses wit...

Latest Reply
LivinVincent
New Contributor
  • 6 kudos

Hi Jim, it means we need to wrap up the certification completion by 07 May 2026.

3 More Replies
hobrob_ex
by New Contributor
  • 32 Views
  • 1 reply
  • 0 kudos

Anchor links in notebook markdown

Does anyone know if there is a way to get anchor links working in Databricks notebooks so you can jump to sections in the same notebook without a full page refresh? i.e. something that works like the following HTML: <a href="#jump_to_target">Jump</a>...<p...

Latest Reply
iyashk-DB
Databricks Employee
  • 0 kudos

@hobrob_ex, yes, this is possible, but not the HTML way; instead, you will have to use the markdown rendering formats. Add # Heading 1, # Heading 2, and so on via the (+Text) button of the notebook. Once the headings/sections that you want are con...

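For reference, the pattern described in the reply looks roughly like this in a notebook text cell (the heading names are illustrative, not from the thread). Databricks builds the notebook's table of contents from these markdown headings, which is what lets you jump between sections without a full page refresh:

```markdown
%md
# Data loading
Intro text for this section.

## Cleaning steps
Nested heading; also appears as a navigable entry.
```

Clicking a heading in the notebook's table of contents scrolls to that section in place, which is the closest built-in equivalent of an HTML anchor link.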
dnchankov
by New Contributor II
  • 9842 Views
  • 5 replies
  • 7 kudos

Resolved! Why can't my notebook created in a Repo be opened safely?

I've cloned a Repo during "Get Started with Data Engineering on Databricks". Then I'm trying to run another notebook from a cell with a magic %run command, but I get an error that the file can't be opened safely. Here is my code: notebook_a name = "John" print(f"Hello ...

Latest Reply
iyashk-DB
Databricks Employee
  • 7 kudos

+1 to all the above comments. Having the %run command along with other commands will confuse the REPL execution. So having the %run notebook_b3 command alone in a new cell in notebook_a, maybe as the first cell, will resolve the issue, and your...

4 More Replies
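Sketching the accepted fix as a cell layout (the notebook names are the ones from the thread; the relative path is an assumption about where notebook_b3 lives relative to notebook_a). The magic applies to the whole cell, so %run must be the only content of its cell:

```
# --- Cell 1 of notebook_a: the magic command alone, nothing else in the cell ---
%run ./notebook_b3

# --- Cell 2: regular Python; names defined in notebook_b3 are now available ---
name = "John"
print(f"Hello {name}")
```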