- 928 Views
- 1 replies
- 0 kudos
Create Databricks managed service principal programmatically?
For the current Databricks service principal API or the Databricks SDK, an ID is required. However, when dealing with Databricks-managed service principals, you typically only have the name. For registering with cloud providers, like Microsoft Entra ...
@SunilPoluri, could you please provide some more context by providing screenshots?
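For context, a minimal sketch of creating a Databricks-managed service principal with the Python SDK (assuming `databricks-sdk` is installed and workspace auth is configured via environment variables). Since the API keys on ID but you may only have the name, `find_by_display_name` shows one way to recover the ID from a listing; it is written against plain SCIM-style dicts for illustration.

```python
# Hedged sketch; display names and the dict shape are assumptions.

def find_by_display_name(principals, display_name):
    """Return the first principal dict whose displayName matches.

    `principals` is a list of SCIM-style dicts, e.g.
    {"id": "123", "displayName": "etl-sp", "applicationId": "..."}.
    """
    for sp in principals:
        if sp.get("displayName") == display_name:
            return sp
    return None


def create_managed_service_principal(display_name):
    # Requires `pip install databricks-sdk` and workspace auth via env vars.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()
    # Omitting any cloud-provider application id makes this a
    # Databricks-managed service principal.
    return w.service_principals.create(display_name=display_name)


if __name__ == "__main__":
    sp = create_managed_service_principal("etl-sp")
    print(sp.id, sp.application_id)
```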
- 2428 Views
- 4 replies
- 1 kudos
Databricks is taking too long to run a query
I ran a simple query, but it is taking too much time and never stops running. I am the only one using the cluster, and it is happening in every notebook in the workspace.
Do you see Spark jobs running in the Spark UI?
- 1280 Views
- 2 replies
- 0 kudos
Proper way to collect Statement ID from JDBC Connection
Hi, we are executing DML calls on Databricks SQL Warehouse programmatically, with Java and Python. There can be thousands of executions per day, so in case an error occurs, it would be very beneficial to spot the Statement ID of the ...
Hi @harripy, To retrieve the Statement ID from a Databricks SQL Warehouse execution using the Java SDK, you can follow these steps: Execute the Query: First, execute your DML query using the Java JDBC connection. You can use the java.sql.Statement ...
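Alongside the JDBC route, a hedged Python sketch using the SQL Statement Execution REST API (endpoint shape assumed: `POST /api/2.0/sql/statements`), whose response carries a `statement_id` field you can log for later troubleshooting; host, token, and warehouse id below are placeholders.

```python
# Sketch only: log the statement_id returned by the statement-execution API.
import json
import urllib.request


def extract_statement_id(response_json):
    """Pull the statement_id out of a statement-execution response dict."""
    return response_json.get("statement_id")


def execute_statement(host, token, warehouse_id, sql):
    # POST the statement; the response includes statement_id and status.
    body = json.dumps({"warehouse_id": warehouse_id, "statement": sql}).encode()
    req = urllib.request.Request(
        f"https://{host}/api/2.0/sql/statements",
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    resp = execute_statement("myhost.cloud.databricks.com", "TOKEN",
                             "wh123", "UPDATE t SET x = 1")
    print("statement id:", extract_statement_id(resp))
```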
- 1379 Views
- 2 replies
- 1 kudos
Resolved! Native service principal support in JDBC/ODBC drivers
Read in the Databricks integration best practices about the native support for Service Principal authentication in the JDBC/ODBC drivers. The timetable mentioned for this was "expected to land in 2023"; is this referring to the https://docs.databricks.com/...
@harripy As the documentation says, JDBC driver 2.6.36 and above supports Azure Databricks OAuth secrets for OAuth M2M or OAuth 2.0 client credentials authentication. Microsoft Entra ID secrets are not supported. https://learn.microsoft.com/en-u...
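As a sketch of what that looks like in practice, a small Python helper that assembles the JDBC URL for OAuth M2M; the `AuthMech=11` / `Auth_Flow=1` parameter names follow the JDBC driver documentation for client-credentials auth, and the host and HTTP path values are placeholders.

```python
# Hedged sketch: build a Databricks JDBC URL for OAuth M2M
# (client credentials). Parameter names per the JDBC driver docs.

def oauth_m2m_jdbc_url(host, http_path, client_id, client_secret):
    params = {
        "AuthMech": "11",        # OAuth 2.0 authentication
        "Auth_Flow": "1",        # client credentials (M2M)
        "OAuth2ClientId": client_id,
        "OAuth2Secret": client_secret,
    }
    opts = ";".join(f"{k}={v}" for k, v in params.items())
    return f"jdbc:databricks://{host}:443;httpPath={http_path};{opts}"


if __name__ == "__main__":
    print(oauth_m2m_jdbc_url("adb-123.azuredatabricks.net",
                             "/sql/1.0/warehouses/abc", "cid", "secret"))
```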
- 1580 Views
- 1 replies
- 2 kudos
Resolved! What happens to the Workspace when deleting the user that created that workspace
Hi. Background: in Azure Databricks, we have a personal account that created a workspace; this workspace is our main workspace. The user's Microsoft Office account is already deleted, but the user still exists in Databricks. Question 1 is: What happens to th...
What happens to this workspace if we delete the user from the workspace? Will the workspace be deleted? → The workspace will not be deleted; it will continue to function. Is it possible to change the workspace owner to another account, i.e. a service account? If so...
- 2410 Views
- 1 replies
- 1 kudos
Resolved! Import folder (no .whl or .jar files) and run the `python3 setup.py bdist_wheel` for lib install
I want to import the ibapi Python module in an Azure Databricks notebook. Before this, I downloaded the TWS API folder from https://interactivebrokers.github.io/# I need to go through the following steps to install the API: Download and install TWS Ga...
You can try uploading the folder to the workspace location, cd into the desired folder, and install it via a notebook. But that would be a notebook-scoped installation. If you are looking for a cluster-scoped installation then you would need...
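As a sketch of the notebook-scoped route (the workspace path is a placeholder, and the wheel filename depends on what `bdist_wheel` generates), the two notebook cells could look like:

```
# Cell 1: build the wheel from the uploaded source folder (path assumed)
%sh cd /Workspace/Users/<you>/twsapi/source/pythonclient && python3 setup.py bdist_wheel

# Cell 2: notebook-scoped install of the built wheel
%pip install /Workspace/Users/<you>/twsapi/source/pythonclient/dist/<generated-wheel>.whl
```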
- 1168 Views
- 3 replies
- 0 kudos
Permission denied using patchelf
Hello all, I would like to run a Python script on a Shared cluster. The Python script tries, under the hood, to call the `patchelf` utility in order to set an rpath, something along those lines: execute('patchelf', ['--set-rpath', rpath, lib]). The p...
I basically followed this tutorial https://learn.microsoft.com/en-gb/azure/databricks/data-governance/unity-catalog/get-started and it seemed to be working, so I guess Unity Catalog is enabled?
- 1552 Views
- 1 replies
- 0 kudos
Delete Azure Databricks Workspace resource but reference remains in Account Console
As the title says. I have deleted both the Azure Databricks Workspace resource and the resource group it was located in. However, I can still see the workspace in the account console. Also, it appears in the response from GET https://accounts.azureda...
Hi @Kim3, Here are some steps you can follow to ensure the workspace is properly removed: Check Deployment Status: If the workspace deployment failed, it might cause issues during deletion. Review the error details in the activity log tab and re...
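If the stale entry persists, a hedged sketch of removing the workspace record through the account-level API (the accounts endpoint shape is an assumption; account id, workspace id, and token below are placeholders):

```python
# Sketch only: delete a stale workspace entry via the Azure Databricks
# account-level API.
import urllib.request


def workspace_delete_url(account_id, workspace_id):
    """Build the account-API URL for deleting one workspace."""
    return ("https://accounts.azuredatabricks.net/api/2.0/accounts/"
            f"{account_id}/workspaces/{workspace_id}")


def delete_workspace(account_id, workspace_id, token):
    req = urllib.request.Request(
        workspace_delete_url(account_id, workspace_id),
        method="DELETE",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    print(delete_workspace("my-account-id", "1234567890", "TOKEN"))
```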
- 2151 Views
- 3 replies
- 0 kudos
Resolved! Why does use of Azure SSO require Databricks PAT enabled ?
My org uses Databricks and SSO. We are keen to disable the use of PATs, but have noticed that when they are disabled we are not able to use SSO. May I ask why SSO has a dependency on PATs [arguably they are two distinct authentication methods]? Also,...
Hi @phguk, let's look at how Databricks, SSO, and personal access tokens (PATs) interact. SSO and PATs: Single Sign-On (SSO) and personal access tokens (PATs) serve different purposes, but they can intersect in certain scenarios. SSO all...
- 3742 Views
- 3 replies
- 0 kudos
Resolved! Call an Azure Function App with Access Restrictions from a Databricks Workspace
Hello, as the title says, I am trying to call a function from an Azure Function App configured with access restrictions, from a Python notebook in my Databricks workspace. The Function App resource is in a different subscription than the Databricks work...
Update: the problem was fixed! The key was to set a VNet rule in the access restrictions, giving access directly to the subnets used by Databricks. It seems that for Microsoft-to-Microsoft connections the IP addresses are not used, so adding the IP ran...
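For reference, a hedged sketch of adding such a VNet rule with the Azure CLI (all resource names below are placeholders); you would typically repeat it for each Databricks subnet:

```
az functionapp config access-restriction add \
  --resource-group my-rg \
  --name my-function-app \
  --rule-name allow-databricks-subnet \
  --action Allow \
  --vnet-name databricks-vnet \
  --subnet public-subnet \
  --priority 100
```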
- 1437 Views
- 1 replies
- 0 kudos
Azure Devops repos access
I have a Databricks setup where users and their permissions are handled in Microsoft Azure using AD groups and then provisioned (account level) to Databricks using a provisioning connector. The code repositories are in Azure DevOps, where users a...
Hi @alm, Here are some steps to check and potential solutions: SCIM Provisioning Connector: Ensure that you have set up the SCIM provisioning connector between Microsoft Azure AD and Databricks. This connector syncs users and groups from your Az...
- 2680 Views
- 3 replies
- 0 kudos
How to bind a User assigned Managed identity to Databricks to access external resources?
Is there a way to bind a user-assigned managed identity to Databricks? We want to access some SQL DBs and a Redis cache from our Spark code running on Databricks using a managed identity instead of service principals and basic authentication. As of today, Da...
@Carpender Correcting my comment above: the Databricks-assigned managed identity is working and we are able to access resources, but as stated in the original question we are looking for authorization using a User-Assigned Managed Identity (UAMI). With a UAMI we cannot...
- 4805 Views
- 1 replies
- 0 kudos
How to set up a service principal to assign account-level groups to workspaces using Terraform
Based on best practices, we have set up SCIM provisioning using Microsoft Entra ID to synchronize Entra ID groups to our Databricks account. All workspaces have identity federation enabled.However, how should workspace administrators assign account-l...
Have you tried giving the Manager role on the group to the service principal that is workspace admin? Once you do this, in a workspace context you may be able to add the account-level group to a workspace with databricks_permission_assig...
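A minimal Terraform sketch of that assignment (provider configuration omitted; the group name and the workspace provider alias are assumptions):

```
data "databricks_group" "eng" {
  provider     = databricks.workspace
  display_name = "engineering"   # account-level group synced via SCIM
}

resource "databricks_permission_assignment" "eng" {
  provider     = databricks.workspace
  principal_id = data.databricks_group.eng.id
  permissions  = ["USER"]        # or ["ADMIN"]
}
```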
- 1315 Views
- 1 replies
- 0 kudos
Resolved! Instances are not being terminated in time (extra AWS costs)
For a few days we have been trying to figure out why our AWS costs suddenly went up around March 20th, and we just found the answer: the EC2 instances are left in an unterminated state for a couple of minutes at the end of each run! This is a very se...
- 2786 Views
- 5 replies
- 0 kudos
system.billing.usage table - cannot match job_id from Databricks API/UI
Hello, I have multiple continuous jobs that have been running for many days (Kafka streams); however, querying the system.billing.usage table by job_id from the UI or the Databricks Jobs API does not return any results for those jobs. 1. What is the reason behind that? 2. If I ...
Hello, you are right, apologies for my misunderstanding. As you mentioned, the job id persists; it is the job run id that changes. So your issue is that while the job keeps running, the information for that run is not showing in the...
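For the lookup itself, a sketch of querying the system table by job id (the `usage_metadata` struct fields follow the system-tables docs; the id value is a placeholder):

```
SELECT
  usage_date,
  usage_metadata.job_id,
  usage_metadata.job_run_id,
  SUM(usage_quantity) AS dbus
FROM system.billing.usage
WHERE usage_metadata.job_id = '123456789'
GROUP BY usage_date, usage_metadata.job_id, usage_metadata.job_run_id
```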