- 1390 Views
- 3 replies
- 0 kudos
REST endpoint for Databricks audit logs
I am trying to find the official documentation link for getting Databricks audit logs but am unable to find it. I have referred to https://docs.databricks.com/en/administration-guide/account-settings/audit-logs.html and https://docs.databricks.com/api/workspace/introductio...
- 0 kudos
@madhura I could not find any endpoint that can be used to get the Audit logs. However, you can enable system tables in your workspace and try to read the data just like you read from any other table. Please check this to enable system tables: https...
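Building on that suggestion, here is a minimal sketch, assuming system tables and the access schema have been enabled for the metastore; the column names follow the documented audit table schema and may need adjusting:

```python
# Query recent audit events from the system table (requires system tables /
# the access schema to be enabled on the metastore).
recent_events = spark.sql("""
    SELECT event_time, user_identity.email, service_name, action_name
    FROM system.access.audit
    WHERE event_date >= current_date() - INTERVAL 7 DAYS
    ORDER BY event_time DESC
    LIMIT 100
""")
display(recent_events)
```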
- 1977 Views
- 1 reply
- 0 kudos
Databricks Job alerts
I'm currently running jobs on job clusters and would like these jobs to time out after 168 hours (7 days), at which point a new job cluster will be assigned. This timeout is specifically to ensure that jobs don't run on the same cluster for too long,...
- 0 kudos
@Priyam1 Good day! Based on the information provided, it seems that we do not have a direct way to mute notifications for timed-out jobs while still receiving alerts for job failures. You can reduce the number of notifications sent by filtering out no...
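For reference, a sketch (not an official recommendation) of setting the 7-day timeout and failure-only emails on an existing job through the Jobs 2.1 API; the host, token and job_id values are placeholders:

```python
# Update an existing job so it times out after 168 hours and only emails on failure.
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "dapi..."                                             # placeholder token

resp = requests.post(
    f"{host}/api/2.1/jobs/update",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "job_id": 123,  # placeholder job ID
        "new_settings": {
            "timeout_seconds": 604800,  # 168 hours = 7 days
            "email_notifications": {
                "on_failure": ["team@example.com"]  # placeholder recipient
            },
        },
    },
)
resp.raise_for_status()
```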
- 5720 Views
- 7 replies
- 1 kudos
Resolved! Valid Workspace Conf keys
Hi, I'm trying to automate the configuration of Admin Settings of our Databricks Workspace using Terraform. However, identifying the correct config keys is very difficult. Databricks exposes a Workspace Conf API (Enable/disable features | Workspace Conf ...
- 1 kudos
I wanted to know the key for "Store interactive notebook results in customer account". It's not ideal, but by using the browser dev tools you can find out what it is by looking at the network activity after toggling it in the UI.
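For anyone landing here, a sketch of the workspace-conf read/update pattern once a key name is known; `enableIpAccessLists` is used purely as a stand-in for whichever key the network call reveals, and host/token are placeholders. (In Terraform the same key/value pair would go into a `databricks_workspace_conf` resource's `custom_config`.)

```python
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "dapi..."                                             # placeholder token

# Read the current value of a workspace-conf key
r = requests.get(
    f"{host}/api/2.0/workspace-conf",
    headers={"Authorization": f"Bearer {token}"},
    params={"keys": "enableIpAccessLists"},  # substitute the key seen in dev tools
)
print(r.json())

# Update it (values are passed as strings)
requests.patch(
    f"{host}/api/2.0/workspace-conf",
    headers={"Authorization": f"Bearer {token}"},
    json={"enableIpAccessLists": "true"},
).raise_for_status()
```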
- 3888 Views
- 6 replies
- 3 kudos
Azure Databricks with standard private link cluster event log error: "Metastore down"...
We have Azure Databricks with standard private link (back-end and front-end private link). We are able to successfully attach a Databricks workspace to the Databricks metastore (ADLS Gen2 storage). However, when trying to create tables in a catalog in ...
- 3 kudos
Can confirm that the approach will solve your error. I ran into a similar issue a while back.
- 1422 Views
- 4 replies
- 0 kudos
Secrets ACL API Behavior Change
Hey all, has the behavior of the Secrets ACL API changed over the last 24 hours? With no code changes on our scope-deployment pipeline, I am suddenly getting strange errors back from this endpoint. Anybody else noticing a change? Thanks, Alex
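For context, a sketch of the kind of call such a scope-deployment pipeline typically makes against the endpoint in question; scope, principal and permission values are placeholders:

```python
# Grant a principal READ on a secret scope via the Secrets ACL API.
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "dapi..."                                             # placeholder token

resp = requests.post(
    f"{host}/api/2.0/secrets/acls/put",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "scope": "my-scope",                         # placeholder scope name
        "principal": "data-engineers@example.com",   # user or group
        "permission": "READ",                        # READ | WRITE | MANAGE
    },
)
resp.raise_for_status()
```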
- 0 kudos
Idk, I control the resource group myself and I don't remember ever granting or revoking contributor roles on that RG for any of these users which are now suddenly throwing errors. Interesting to see that line from the docs... I wonder if that was alw...
- 953 Views
- 0 replies
- 0 kudos
Unity Catalog Enabled Clusters using PrivateNIC
Hello, when reviewing the VM settings for Databricks worker VMs, we can see that there are two (2) NICs: a primary (PublicNIC (primary)) and a secondary (PrivateNIC (primary)). The worker VM is always assigned the PublicNIC and this is reachable from w...
- 2804 Views
- 4 replies
- 1 kudos
Databricks is taking too long to run a query
I ran a simple query. It is taking too much time and it doesn't stop running. I am the only one using the cluster. It's happening to every notebook in the workspace.
- 1 kudos
Do you see Spark jobs running in the Spark UI?
- 1556 Views
- 1 reply
- 0 kudos
Proper way to collect Statement ID from JDBC Connection
Hi, we are executing DML calls on Databricks SQL Warehouse programmatically, with Java and Python. There can be thousands of executions on a daily level, so in case an error occurs, it would be very beneficial to spot the Statement ID of the ...
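A hedged sketch of one alternative route, the SQL Statement Execution REST API (a different mechanism than the JDBC/ODBC drivers), which returns a statement_id directly in its response; host, token and warehouse ID below are placeholders:

```python
# Execute a DML statement on a SQL warehouse and capture its statement_id.
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
token = "dapi..."                                             # placeholder token

resp = requests.post(
    f"{host}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "warehouse_id": "abcdef1234567890",  # placeholder warehouse ID
        "statement": "UPDATE my_table SET col = 1 WHERE id = 42",
        "wait_timeout": "30s",
    },
)
resp.raise_for_status()
body = resp.json()
print(body["statement_id"], body["status"]["state"])
```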
- 2359 Views
- 2 replies
- 1 kudos
Resolved! Native service principal support in JDBC/ODBC drivers
Read in the Databricks integration best practices about the native support for Service Principal authentication in the JDBC/ODBC drivers. The timetable mentioned for this was "expected to land in 2023"; is this referring to the https://docs.databricks.com/...
- 1 kudos
@harripy As it says in the documentation, JDBC driver 2.6.36 and above supports Azure Databricks OAuth secrets for OAuth M2M or OAuth 2.0 client credentials authentication. Microsoft Entra ID secrets are not supported. https://learn.microsoft.com/en-u...
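To illustrate (not verified against every driver version), the client-credentials connection properties look roughly like this, shown here as a Python string with placeholder hostname, HTTP path, service-principal client ID and Databricks OAuth secret:

```python
# Sketch of a Databricks JDBC (driver 2.6.36+) URL using OAuth M2M / client credentials.
jdbc_url = (
    "jdbc:databricks://adb-1234567890123456.7.azuredatabricks.net:443;"
    "httpPath=/sql/1.0/warehouses/abcdef1234567890;"
    "AuthMech=11;"    # OAuth 2.0
    "Auth_Flow=1;"    # client credentials (M2M)
    "OAuth2ClientId=<service-principal-application-id>;"
    "OAuth2Secret=<databricks-oauth-secret>"
)
```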
- 1929 Views
- 1 reply
- 2 kudos
Resolved! What happens to the Workspace when deleting the user that created that workspace
Hi, background: in Azure Databricks we have a personal account that created a workspace; this workspace is our main workspace. The user's Microsoft Office account is already deleted but the user still exists in Databricks. Question 1 is: What happens to th...
- 2 kudos
What happens to this workspace if we delete the user from the workspace? Will the workspace be deleted? → The workspace will not get deleted. It will continue to function. Is it possible to change the workspace owner to another account, i.e. a service account? If so...
- 2888 Views
- 1 reply
- 1 kudos
Resolved! Import folder (no .whl or .jar files) and run the `python3 setup.py bdist_wheel` for lib install
I want to import the ibapi Python module in an Azure Databricks notebook. Before this, I downloaded the TWS API folder from https://interactivebrokers.github.io/# I need to go through the following steps to install the API: Download and install TWS Ga...
- 1 kudos
You can try to upload the folder to a workspace location, cd into the desired folder, and install it via a notebook. But it would be a notebook-scoped installation. If you are looking for a cluster-scoped installation then you would need...
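A rough sketch of that notebook-scoped route, with hypothetical paths and wheel version (adjust to wherever the TWS API folder was uploaded):

```
%sh
# Build the wheel from the uploaded TWS API source (hypothetical path)
cd /Workspace/Users/you@example.com/twsapi/source/pythonclient
python3 setup.py bdist_wheel
```

followed in a separate cell by a notebook-scoped install of the built wheel (the filename depends on the TWS API version):

```
%pip install /Workspace/Users/you@example.com/twsapi/source/pythonclient/dist/ibapi-10.19.1-py3-none-any.whl
```

For a cluster-scoped installation, the same wheel could instead be attached to the cluster as a library.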
- 1509 Views
- 3 replies
- 0 kudos
Permission denied using patchelf
Hello all, I would like to run a Python script on a Shared cluster. The Python script tries, under the hood, to call the `patchelf` utility in order to set an rpath, something along these lines: execute('patchelf', ['--set-rpath', rpath, lib]). The p...
- 0 kudos
I basically followed this tutorial https://learn.microsoft.com/en-gb/azure/databricks/data-governance/unity-catalog/get-started and it seemed to be working, so I guess Unity Catalog is enabled?
- 2341 Views
- 1 reply
- 0 kudos
Why does use of Azure SSO require Databricks PATs to be enabled?
My org uses Databricks and SSO. We are keen to disable the use of PATs but have noticed that when they're disabled, we're not able to use SSO. May I ask why SSO has a dependency on PATs (arguably they are two distinct authentication methods)? Also,...
- 1895 Views
- 0 replies
- 0 kudos
Delete Azure Databricks Workspace resource but reference remains in Account Console
As the title says. I have deleted both the Azure Databricks Workspace resource and the resource group it was located in. However, I can still see the workspace in the account console. Also, it appears in the response from GET https://accounts.azureda...
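For reference, a sketch of listing workspaces at the account level, assuming the standard Accounts API path; the account ID and Azure AD token below are placeholders (the accounts API expects an account-admin Azure AD token, not a workspace PAT):

```python
# List workspaces registered in the account console.
import requests

account_id = "00000000-0000-0000-0000-000000000000"  # placeholder account ID
aad_token = "<azure-ad-access-token>"                # placeholder token

resp = requests.get(
    f"https://accounts.azuredatabricks.net/api/2.0/accounts/{account_id}/workspaces",
    headers={"Authorization": f"Bearer {aad_token}"},
)
resp.raise_for_status()
for ws in resp.json():
    print(ws["workspace_id"], ws["workspace_name"], ws["workspace_status"])
```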
- 4432 Views
- 2 replies
- 0 kudos
Resolved! Call an Azure Function App with Access Restrictions from a Databricks Workspace
Hello, as the title says, I am trying to call a function from an Azure Function App configured with access restrictions from a Python notebook in my Databricks workspace. The Function App resource is in a different subscription than the Databricks work...
- 0 kudos
Update: problem was fixed! The key was to set a VNet rule in the access restrictions, giving access directly to the subnets used by Databricks. It seems like for Microsoft-to-Microsoft connections, the IP addresses are not used, so adding the IP ran...
Labels:
- Access control (1)
- Access Delta Tables (2)
- ActiveDirectory (1)
- AmazonKMS (1)
- Apache spark (1)
- App (1)
- Availability (1)
- Availability Zone (1)
- AWS (5)
- Aws databricks (1)
- AZ (1)
- Azure (8)
- Azure Data Lake Storage (1)
- Azure databricks (6)
- Azure databricks workspace (1)
- Best practice (1)
- Best Practices (2)
- Billing (2)
- Bucket (1)
- Cache (1)
- Change (1)
- Checkpoint (1)
- Checkpoint Path (1)
- Cluster (1)
- Cluster Pools (1)
- Clusters (1)
- ClustersJob (1)
- Compliance (1)
- Compute Instances (1)
- Cost (1)
- Credential passthrough (1)
- Data (1)
- Data Ingestion & connectivity (6)
- Data Plane (1)
- Databricks Account (1)
- Databricks Control Plane (1)
- Databricks Error Message (2)
- Databricks Partner (1)
- Databricks Repos (1)
- Databricks Runtime (1)
- Databricks SQL (3)
- Databricks SQL Dashboard (1)
- Databricks workspace (1)
- DatabricksJobs (1)
- DatabricksLTS (1)
- DBFS (1)
- DBR (3)
- Dbt (1)
- Dbu (3)
- Deep learning (1)
- DeleteTags Permissions (1)
- Delta (4)
- Delta Sharing (1)
- Delta table (1)
- Dev (1)
- Different Instance Types (1)
- Disaster recovery (1)
- DisasterRecoveryPlan (1)
- DLT Pipeline (1)
- EBS (1)
- Email (2)
- External Data Sources (1)
- Feature (1)
- GA (1)
- Ganglia (3)
- Ganglia Metrics (2)
- GangliaMetrics (1)
- GCP (1)
- GCP Support (1)
- Gdpr (1)
- Gpu (2)
- Group Entitlements (1)
- HIPAA (1)
- Hyperopt (1)
- Init script (1)
- InstanceType (1)
- Integrations (1)
- IP Addresses (1)
- IPRange (1)
- Job (1)
- Job Cluster (1)
- Job clusters (1)
- Job Run (1)
- JOBS (1)
- Key (1)
- KMS (1)
- KMSKey (1)
- Lakehouse (1)
- Limit (1)
- Live Table (1)
- Log (2)
- LTS (3)
- Metrics (1)
- MFA (1)
- ML (1)
- Model Serving (1)
- Multiple workspaces (1)
- Notebook Results (1)
- Okta (1)
- On-premises (1)
- Partner (69)
- Pools (1)
- Premium Workspace (1)
- Public Preview (1)
- Redis (1)
- Repos (1)
- Rest API (1)
- Root Bucket (2)
- SCIM API (1)
- Security (1)
- Security Group (1)
- Security Patch (1)
- Service principal (1)
- Service Principals (1)
- Single User Access Permission (1)
- Sns (1)
- Spark (1)
- Spark-submit (1)
- Spot instances (1)
- SQL (1)
- Sql Warehouse (1)
- Sql Warehouse Endpoints (1)
- Ssh (1)
- Sso (2)
- Streaming Data (1)
- Subnet (1)
- Sync Users (1)
- Tags (1)
- Team Members (1)
- Thrift (1)
- TODAY (1)
- Track Costs (1)
- Unity Catalog (1)
- Use (1)
- User (1)
- Version (1)
- Vulnerability Issue (1)
- Welcome Email (1)
- Workspace (2)
- Workspace Access (1)