- 2367 Views
- 15 replies
- 2 kudos
Need Help - System tables that contain all Databricks user and service principal details
Hi all - I am trying to create a dashboard where I need to list all users and service principals along with their groups and understand their Databricks usage. Is there any table available in Databricks that contains user and service principal details? ...
Hi Sanjeeb, please let me know what worked for you.
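For reference, one way to build such an inventory is with the Databricks SDK for Python, which exposes `users.list()` and `service_principals.list()` on a `WorkspaceClient`. The sketch below is a hypothetical example: the SDK calls are shown in comments (they need workspace auth), and the row-shaping helper and its output schema are my own assumptions, not a documented format.

```python
# Hypothetical sketch: flattening users and service principals into
# dashboard-friendly rows. The "name"/"groups" row schema is assumed.

def to_row(display_name, groups):
    """Flatten one identity into a row with a sorted, comma-joined group list."""
    return {"name": display_name, "groups": ", ".join(sorted(groups))}

# In a workspace you could feed it like this (requires databricks-sdk and auth):
# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()
# rows = [to_row(u.display_name, [g.display for g in (u.groups or [])])
#         for u in w.users.list()]
# rows += [to_row(sp.display_name, [g.display for g in (sp.groups or [])])
#          for sp in w.service_principals.list()]

print(to_row("alice@example.com", ["analysts", "admins"]))
# → {'name': 'alice@example.com', 'groups': 'admins, analysts'}
```

For the "usage" half of the dashboard, the billing and audit system tables (e.g. `system.access.audit`) can then be joined on the identity names.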
- 363 Views
- 1 replies
- 1 kudos
Custom MCP for Genie via Apps
I wanted to create a custom MCP to help Genie generate some code - it's a simple API call. I had everything set up - the app, the secrets, everything was deployed. But Genie won't connect, and when I try to add "On Behalf of User Authorization" roles it d...
Check these: First, make sure you're adding the authorization roles through the App's settings in the workspace where Genie will actually use it, not just where you developed it. The roles need to be set in the target workspace. Second, try refreshin...
- 230 Views
- 1 replies
- 1 kudos
OJDBC8 stopped working on LTS 17.3
Hi, we upgraded our clusters to LTS 17.3 as we needed some functionality from that release, and our OJDBC8 drivers have stopped working. Downgraded back to LTS 14 and it's all fine again. What are the latest OJDBC drivers that are compatible with LTS 1...
Hi @ScottDunk, Looking at the release notes for LTS 14 and LTS 17.3, it appears to be a compatibility issue... [screenshots of the LTS 14 and LTS 17.3 release notes] As the release notes show, Databricks 17.3 LTS upgrades the cluster JVM to Java 17. Oracle’s own compatibility ...
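As a concrete direction: ojdbc11 is the Oracle JDBC driver line built for newer JDKs (including Java 17), attachable as a cluster library via a Maven coordinate such as `com.oracle.database.jdbc:ojdbc11`. The sketch below is hedged: the host, port, service name, and table are placeholders, the Spark read is shown in comments (it only runs on a cluster), and only the small URL-builder is plain runnable Python.

```python
# Hypothetical sketch: moving from ojdbc8 to ojdbc11 for Java 17 runtimes.
# Attach the driver via Maven (com.oracle.database.jdbc:ojdbc11, latest build),
# then read over JDBC as usual. All connection values below are placeholders.

def oracle_jdbc_url(host, port, service):
    """Build a thin-driver JDBC URL for an Oracle service."""
    return f"jdbc:oracle:thin:@//{host}:{port}/{service}"

# On the cluster you would then read as before:
# df = (spark.read.format("jdbc")
#       .option("url", oracle_jdbc_url("dbhost.example.com", 1521, "ORCLPDB"))
#       .option("dbtable", "schema.table")
#       .option("driver", "oracle.jdbc.OracleDriver")
#       .load())

print(oracle_jdbc_url("dbhost.example.com", 1521, "ORCLPDB"))
# → jdbc:oracle:thin:@//dbhost.example.com:1521/ORCLPDB
```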
- 262 Views
- 1 replies
- 1 kudos
Resolved! DLT pipeline production deployment with AWS
Hi, We are currently working with PySpark, where we are doing the ETL as well as data quality checks. We are converting our code into a wheel package to go into prod and to manage versions for better stability. We are running Databricks with AWS...
Hi @satycse06, Have you considered Declarative Automation Bundles (previously called Databricks Asset Bundles) for this? This is exactly the type of problem it solves. You can still keep your DLT code and a databricks.yml bundle file in Azure DevOps...
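To make that concrete, a bundle is driven by a `databricks.yml` at the repo root. The fragment below is a minimal, hypothetical sketch: the bundle name, pipeline name, source path, and workspace host are all placeholders, and real projects typically add more fields (permissions, variables, multiple targets).

```yaml
# Hypothetical minimal databricks.yml; all names, paths, and the host
# are placeholders to be replaced with your project's values.
bundle:
  name: my_etl_bundle

artifacts:
  default:
    type: whl   # build the project wheel so prod runs a versioned artifact
    path: .

resources:
  pipelines:
    etl_pipeline:
      name: etl-pipeline
      libraries:
        - notebook:
            path: ./src/dlt_pipeline.py

targets:
  prod:
    workspace:
      host: https://adb-1234567890.12.azuredatabricks.net
```

Your CI pipeline (e.g. in Azure DevOps) can then call `databricks bundle deploy -t prod` to push the wheel and the pipeline definition together.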
- 772 Views
- 4 replies
- 3 kudos
Resolved! Security & Compliance understanding on LLM Usage in Databricks Genie and Agentbricks
Hi everyone, With the increasing focus on security and compliance for AI agents and LLMs, I wanted to get some clarity on a couple of points related to Databricks Genie and Agentbricks. Could someone help provide detailed information on the following, ...
Hi @abhijit007, Please take a look at these pages. They answer your queries in detail for Genie. https://docs.databricks.com/genie - Covers architecture and how it works. Also covers security. https://docs.databricks.com/databricks-ai/databricks-ai-...
- 344 Views
- 2 replies
- 1 kudos
Resolved! Newly added workspace users do not appear immediately in WorkspaceClient().users.list() or SCIM API
Hello, I’m trying to retrieve the list of users in a Databricks workspace. I am currently using both the Databricks SDK and the SCIM API: `from databricks.sdk import WorkspaceClient; w = WorkspaceClient(); users = list(w.users.list())` and also: import re...
Hi @discuss_darende, I agree with Pradeep here. In practice, there can be a delay before the identity and its memberships are fully visible everywhere, especially if you’re on Azure and using AIM or a SCIM connector from your IdP. The delay isn’t do...
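If the consuming job can't simply wait, one common workaround for this kind of propagation delay is a small poll-with-retry loop. The sketch below is a hypothetical example: the SDK lookup (including the SCIM `filter` expression) is shown in comments, the retry helper is plain Python, and the attempt count and interval are arbitrary values to tune.

```python
# Hypothetical sketch: poll until a newly provisioned identity becomes
# visible, absorbing the SCIM/AIM propagation delay described above.
import time

def wait_for(fetch, found, attempts=10, delay=30):
    """Call fetch() up to `attempts` times until found(result) is truthy."""
    for _ in range(attempts):
        result = fetch()
        if found(result):
            return result
        time.sleep(delay)
    raise TimeoutError("identity not visible yet; check SCIM/AIM provisioning")

# In a workspace (requires databricks-sdk and auth; filter value is a placeholder):
# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()
# users = wait_for(
#     lambda: list(w.users.list(filter='userName eq "new.user@corp.com"')),
#     found=lambda users: len(users) > 0)

print(wait_for(lambda: [1], bool, delay=0))  # → [1]
```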
- 903 Views
- 1 replies
- 0 kudos
Issues when configuring keystore spark config for pyspark to mongo atlas X.509 connectivity
Steps followed - Step 1: Add an init script that copies the keystore file to the tmp location. Step 2: Add Spark config in the cluster's advanced options - spark.driver.extraJavaOptions -Djavax.net.ssl.keyStore=/tmp/keystore.jks -Djavax.net.ssl.keyStorePa...
The error isn’t actually about MongoDB. HikariCP failing on port 3306 via DataNucleus means your Hive metastore is losing its SSL connection to MySQL on the driver. Setting javax.net.ssl.keyStore globally in extraJavaOptions overwrites the default JVM trust...
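One way to avoid clobbering the JVM defaults is to build a trust store that still contains the JDK's CA bundle and point the driver at that, alongside the client keystore. The init-script fragment below is a hedged sketch: all paths, the certificate alias, and the passwords are placeholders, and it assumes the Mongo CA certificate is staged where the cluster can read it.

```bash
# Hypothetical cluster init-script fragment (paths/passwords are placeholders).

# 1. Stage the client keystore for the Mongo Atlas X.509 handshake.
cp /dbfs/certs/keystore.jks /tmp/keystore.jks

# 2. Start from the JDK's default CA bundle so the metastore's MySQL TLS
#    connection keeps working, then add the Mongo CA on top.
cp "$JAVA_HOME/lib/security/cacerts" /tmp/truststore.jks
keytool -importcert -noprompt \
  -keystore /tmp/truststore.jks -storepass changeit \
  -alias mongo-atlas -file /dbfs/certs/mongo-ca.pem
```

The Spark config then references both stores, e.g. `spark.driver.extraJavaOptions -Djavax.net.ssl.keyStore=/tmp/keystore.jks -Djavax.net.ssl.trustStore=/tmp/truststore.jks` (plus the matching password options), instead of the keystore alone.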
- 1715 Views
- 3 replies
- 0 kudos
Resolved! How can i run a single task in job from Rest API
How can I run a single task in a job that has many tasks?I can do it in the UI, but I can’t find a way to do it using the REST API. Does anyone know how to accomplish this?
It looks like this may be a possibility now? I haven't actually tried it, but I noticed a parameter named "only" has been added to the Databricks SDK for when running a job. Here is the commit that made the change: [Release] Release v0.38.0 (#826) · ...
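Going by that release note, the run-now call gains an `only` field restricting a run to specific task keys. The sketch below is hedged and untested against a live workspace, as the reply itself says: the job id and task key are placeholders, the SDK call is shown in a comment, and the payload-shaping helper just illustrates the request body.

```python
# Hypothetical sketch: a run-now request restricted to specific task keys
# via the `only` field (exposed by the SDK change referenced above).
import json

def run_now_payload(job_id, task_keys):
    """Shape a run-now request body limited to the given task keys."""
    return {"job_id": job_id, "only": list(task_keys)}

# With the Databricks SDK (>= 0.38.0) this would correspond to:
# from databricks.sdk import WorkspaceClient
# w = WorkspaceClient()
# run = w.jobs.run_now(job_id=123456, only=["my_single_task"])

print(json.dumps(run_now_payload(123456, ["my_single_task"])))
# → {"job_id": 123456, "only": ["my_single_task"]}
```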
- 501 Views
- 3 replies
- 0 kudos
Resolved! Migrating notebooks from Old databricks community instance
I didn't migrate my notebooks from the old Community instance, and now I cannot log in. How can I access the old instance or the materials therein? Please help - I had tons of notebooks in there.
Hi @RichardWanjohi, Unfortunately, Databricks Community Edition was shut down on January 1, 2026, so there's no way to restore your content. From now on you should use Free Edition. This was announced several times, and they asked every user to back...
- 524 Views
- 3 replies
- 1 kudos
Resolved! DBSQL MCP Server - how to specify compute cluster?
Hi, the DBSQL MCP Server is really cool; however, I am not sure how to connect it to a specific cluster, and I could not find any information on any documentation page. My MCP settings look like this: "databricks-sql-mcp": { "type": "streamable-http",...
Hi @rdruska, You are right. The behaviour is a bit subtle and not well-documented yet. Having checked internally, here is what I have found. As of today, the DBSQL MCP server will, by default, pick a "random running" SQL warehouse from the set of war...
- 368 Views
- 1 replies
- 1 kudos
Resolved! Copy files from /tmp to abfss location
I have a notebook which generates a bunch of Excel and PDF reports. These reports need to be sent out through email and also need to be archived in an external location. I am able to generate the reports in /tmp and then send them as attach...
Hi @deepu, The reason it isn't working is that Python’s shutil only understands local/POSIX-style paths, not abfss:// URIs, and dbutils.fs expects Databricks-style paths (e.g., file:/..., /Volumes/...). The recommended pattern for this is as follows. Write re...
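A minimal sketch of that pattern is below. `dbutils` only exists inside a Databricks notebook, so the copy itself is shown in comments with placeholder source and destination paths; the small path-prefixing helper is plain Python and just illustrates the `file:/` scheme that `dbutils.fs` expects for driver-local files.

```python
# Hypothetical sketch: copy a locally generated report to cloud storage.
# The abfss:// destination and file names below are placeholders.

def as_local_uri(path: str) -> str:
    """Prefix a driver-local path with the file:/ scheme for dbutils.fs."""
    return path if path.startswith("file:/") else "file:" + path

# Inside a notebook, after generating /tmp/report.xlsx:
# dbutils.fs.cp(
#     as_local_uri("/tmp/report.xlsx"),
#     "abfss://archive@mystorage.dfs.core.windows.net/reports/report.xlsx")

print(as_local_uri("/tmp/report.xlsx"))  # → file:/tmp/report.xlsx
```

If a Unity Catalog volume is available, copying to a `/Volumes/...` path works the same way and avoids the URI scheme entirely.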
- 313 Views
- 3 replies
- 1 kudos
Conflicting with predictive optimization.
Hi. We have a continuous DLT pipeline with tables updating every minute, partitioned by a partition_key column. Every 3-5 days, we encounter the conflict error below, caused by predictive optimization. The pipeline runs fine after restarting, but I nee...
As per the error message, I can see what is causing the issue: Predictive Optimization Job-d453e56b-97f1-425d-a33d-841bd8a3771f. This is a classic optimistic concurrency issue! What's happening is that predictive optimization runs a background job that sets tab...
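If the conflicts remain disruptive, one lever is to opt the hot table out of predictive optimization at the table level; Databricks SQL supports an `ALTER TABLE ... PREDICTIVE OPTIMIZATION` clause for this. The statement below is a sketch with a placeholder table name.

```sql
-- Opt this one table out of predictive optimization (placeholder name).
-- It can be re-enrolled later with ENABLE or INHERIT PREDICTIVE OPTIMIZATION.
ALTER TABLE my_catalog.my_schema.events DISABLE PREDICTIVE OPTIMIZATION;
```

The trade-off is that the table then relies on your own OPTIMIZE/VACUUM scheduling, so this is best reserved for tables where the write cadence makes background optimization conflict-prone.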
- 170 Views
- 0 replies
- 1 kudos
Lakeventory: Automated Asset Discovery for Databricks Workspaces
Lakeventory is an open-source inventory tool that automatically discovers and catalogs everything in your Databricks workspaces in minutes. What it collects:
- Workspace: notebooks, files, directories
- Compute: jobs, clusters, instance pools, policie...
- 389 Views
- 2 replies
- 0 kudos
Resolved! Databricks One Redirection
Hello, I have an Entra ID group linked to Databricks with the Consumer Access entitlement enabled; other entitlements are unchecked. They also have "use catalog" on a specific catalog. They have "select" and "use schema" on a gold-level schema wi...
Hi @NatJ, You are correct that users with only the Consumer Access entitlement are intended to see the Databricks One interface when they log in. However, the behavior you are observing with direct URLs to the catalog explorer is expected, and here i...
- 374 Views
- 3 replies
- 0 kudos
Workspace Folder ACL design
How should the Databricks workspace folder architecture be designed to support cross-team collaboration, access governance, and scalability in an enterprise platform? Please suggest below or share some ideas from your experience. Thanks. Note: I'm new t...
Hi @APJESK, To address your follow-up questions about the two behaviors you observed after implementing the folder ACL structure: Issue 1: Users can still create notebooks in their home folder. This is by-design behavior. Every Databricks user automat...
Labels: Access control (1), Apache spark (1), Azure (7), Azure databricks (5), Billing (2), Cluster (1), Compliance (1), Data Ingestion & connectivity (5), Databricks Runtime (1), Databricks SQL (2), DBFS (1), Dbt (1), Delta Sharing (1), DLT Pipeline (1), GA (1), Gdpr (1), Github (1), Partner (74), Public Preview (1), Service Principals (1), Unity Catalog (1), Workspace (2)