- 1017 Views
- 2 replies
- 1 kudos
Databricks Audit Logs Analysis
Please share the attributes related to the audit logs. How can the audit logs be utilized by the cyber security team? What insights do the audit logs provide, and how can we maintain compliance? What non-compliance items can be identified f...
The audit logs only capture information about the events and actions performed by users and service principals in the workspace, so no compliance actions are tracked as such. The service name is a subgroup of items you want to check, so...
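For the cyber security angle, here is a minimal sketch of the kind of query a team might run, assuming audit logs are exposed through the system.access.audit system table (which must be enabled for the account) and that it runs in a notebook where spark is already defined:

```python
# Minimal sketch: summarize recent audit events by service and action.
# Assumes the system.access.audit system table is enabled and `spark` exists (notebook context).
from pyspark.sql import functions as F

audit = spark.table("system.access.audit")

# Count events from the last 7 days per service and action, e.g. to spot unusual activity.
summary = (
    audit
    .filter(F.col("event_date") >= F.date_sub(F.current_date(), 7))
    .groupBy("service_name", "action_name")
    .count()
    .orderBy(F.desc("count"))
)
summary.show(truncate=False)
```

Filtering on service_name (e.g. accounts, clusters, unityCatalog) narrows the result to the subgroup of events mentioned above.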
- 1688 Views
- 1 replies
- 0 kudos
Not able to configure Databricks with AWS
I am trying to set up Databricks with my AWS account but am facing some issues. I followed the steps in the Databricks setup guide to create a new Databricks workspace using the AWS CloudFormation template, but the CloudFormation stack fails every time...
Hello @dipin_wantoo! Can you please check the CloudFormation Stack Events for the exact error? That should help identify why the stack is failing. If you're looking to get started with Databricks as an individual user, the Express Setup is a simple w...
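If it helps, here is a rough boto3 sketch for pulling the failed resource events out of the stack (stack name and region are placeholders); the ResourceStatusReason on the first failure usually points at the root cause:

```python
# Rough sketch: list the failed CloudFormation resource events for a stack.
# Stack name and region are placeholders -- substitute your own.
import boto3

cf = boto3.client("cloudformation", region_name="us-east-1")
events = cf.describe_stack_events(StackName="databricks-workspace-stack")["StackEvents"]

for e in events:
    if "FAILED" in e["ResourceStatus"]:
        # ResourceStatusReason usually carries the underlying error message.
        print(e["Timestamp"], e["LogicalResourceId"], e.get("ResourceStatusReason", ""))
```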
- 1481 Views
- 3 replies
- 1 kudos
Library installation failed for library due to user error for pypi
Hi! I get the below error when a cluster job starts up and tries to install a Python .whl file (which is hosted on an Azure Artifacts feed, though this seems more like a problem of reading from disk/network storage). The failure is seemingly ...
Thanks both - I think the problem is that this library installation is called when creating a new Job & Task via the REST endpoint, where the libraries are specified in the .json file. So, short version, I don't think I can 'get at' the pip install call...
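For reference, a hedged sketch of what that .json payload can look like when the wheel is declared under the task's libraries key in a Jobs API 2.1 create call (workspace URL, token, cluster spec, and wheel path are all placeholders); the pip install is then driven by the job cluster rather than by the caller:

```python
# Sketch of a Jobs API 2.1 create payload with a wheel library on the task.
# All names, paths, and credentials below are placeholders.
import requests

payload = {
    "name": "example-job",
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Workspace/Users/me/notebook"},
            "new_cluster": {
                "spark_version": "15.4.x-scala2.12",
                "node_type_id": "Standard_DS3_v2",
                "num_workers": 1,
            },
            # The wheel is installed on the job cluster at startup.
            "libraries": [
                {"whl": "/Volumes/my_catalog/my_schema/wheels/my_package-1.0-py3-none-any.whl"}
            ],
        }
    ],
}

resp = requests.post(
    "https://<workspace-url>/api/2.1/jobs/create",
    headers={"Authorization": "Bearer <token>"},
    json=payload,
)
print(resp.status_code, resp.json())
```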
- 3425 Views
- 1 replies
- 0 kudos
Unable to destroy NCC private endpoint
Hi Team, we accidentally removed one of the NCC private endpoints from our storage account that was created using Terraform. When I tried to destroy and recreate it, I encountered the following error. According to some articles, the private endpoint w...
Once a private endpoint rule is deactivated, it isn't immediately removed. Instead, it will be scheduled for purging after a set time period. In your case, the rule is slated for purging at the timestamp mentioned. This situation can occur in scena...
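A hedged sketch with the Databricks Python SDK for inspecting the deactivated rule and its scheduled purge time; the NCC ID is a placeholder and the method name reflects the SDK's NetworkConnectivity account API as I understand it, so verify against your SDK version:

```python
# Hedged sketch: list private endpoint rules on a network connectivity config
# to see which ones are deactivated and when they will be purged.
# The NCC ID is a placeholder; credentials come from the environment / CLI profile.
from databricks.sdk import AccountClient

a = AccountClient()

ncc_id = "<network-connectivity-config-id>"
for rule in a.network_connectivity.list_private_endpoint_rules(
    network_connectivity_config_id=ncc_id
):
    # The rule object includes its id, deactivated flag, and scheduled purge time.
    print(rule)
```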
- 1898 Views
- 2 replies
- 1 kudos
Resolved! Why am I seeing NAT Gateway in the cost? Serverless Compute.
I have an Azure Databricks Premium subscription. I have been running Python interactive notebooks in the Databricks Workspace using serverless compute for the last few days. Today I received an alert in my email saying the monthly billing has already crossed...
Thanks @szymon_dybczak. I deleted the workspace and the NAT Gateway service got deleted from the VNet. I created a simple single-node cluster to run my code.
- 1327 Views
- 1 replies
- 0 kudos
Resolved! Can I add another user to Free edition
Is it possible to add another user to the Free edition? I want to test what they can see when they connect as a restricted user, i.e. only granted BROWSE on one catalog. Thanks
- 3478 Views
- 7 replies
- 0 kudos
Failing Power BI connection with Databricks via SQL Warehouse
I'm encountering an 'Invalid credentials' error (Session ID: 4601-b5a6-0daf792752a2, Region: us) when connecting Power BI to an Azure Databricks SQL Warehouse using an SPN. The SPN has CAN MANAGE access on the SQL Warehouse, admin rights at the account, and...
Hello sai, sorry for the experience; I am usually available during EMEA time zone hours, apologies for the delay. Can you please try this? In the "Advanced settings" of the data source within the gateway configuration, set the "Connection Encryption se...
- 2413 Views
- 1 replies
- 0 kudos
Looking for insights on enabling Databricks Automatic Provisioning
We currently have a SCIM provisioning connector set up to synchronize identities from Entra ID to Unity Catalog. We're now considering enabling Databricks Automatic Provisioning but want to fully understand the potential impact on our environment befo...
Hello Xaveri, good day! Here are a few links related to Databricks provisioning: https://docs.databricks.com/aws/en/admin/users-groups/scim/aad and https://www.databricks.com/blog/announcing-automatic-identity-management-azure-databricks. But do let me k...
- 2377 Views
- 4 replies
- 4 kudos
Privileged Identity Management for Databricks with Microsoft Entra ID
Privileged Identity Management (PIM) can be used to secure access to critical Databricks roles with Just-in-Time (JIT) access. This approach helps organizations enforce time-bound permissions, approval workflows, and centralized auditing for sensitiv...
- 1410 Views
- 2 replies
- 0 kudos
Delete unassigned catalogs
Hi everybody, due to some not-so-optimal Infrastructure as Code experiments with Terraform, I ended up with a lot (triple digits) of catalogs in a metastore that are not assigned to any workspace and that I want to delete. Unfortunately, there is no way to ev...
Yeah, I see those catalogs and I know that I could reattach and delete them. As I have around 100 of those catalogs, it would be nice to iterate through them by getting a list, e.g. using the CLI or the REST API, and then force delete them, as described...
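Something along these lines with the Databricks Python SDK might work as a starting point; the naming pattern is a placeholder and, as noted above, catalogs may still need to be bound to a workspace before the delete succeeds, so review the list carefully first:

```python
# Hedged sketch: list catalogs visible to this workspace and force-delete the
# ones matching a leftover naming pattern. The pattern is a placeholder --
# review the printed list before running the delete loop.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

stale = [c.name for c in w.catalogs.list() if c.name.startswith("tf_experiment_")]
print("About to delete:", stale)

for name in stale:
    # force=True drops the catalog even if it still contains schemas/tables.
    w.catalogs.delete(name, force=True)
```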
- 1107 Views
- 1 replies
- 1 kudos
VS Code - ipynb vs py execution - spark issue
Databricks Connect works inside a VS Code notebook, but the same code fails in a standalone script with ValueError: default auth: cannot configure default credentials. I'm developing locally with **Databricks Connect 16.1.6** and VS Code. Inside a Jupyter n...
Hi @Sisi, I think what's happening here is that when you debug with the option "Debug current file with Databricks Connect", VS Code is using the Databricks extension, which automatically handles authentication and sets up the proper configuration. The regular Pyt...
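A minimal sketch of configuring Databricks Connect explicitly so the standalone script no longer depends on default credentials (host, token, and cluster ID are placeholders; alternatively the DATABRICKS_HOST, DATABRICKS_TOKEN, and DATABRICKS_CLUSTER_ID environment variables or a CLI profile can supply the same values):

```python
# Minimal sketch: explicit Databricks Connect configuration for a standalone script.
# Host, token, and cluster ID are placeholders -- replace with your own values,
# or rely on a configured CLI profile / environment variables instead.
from databricks.connect import DatabricksSession

spark = (
    DatabricksSession.builder
    .remote(
        host="https://<workspace-url>",
        token="<personal-access-token>",
        cluster_id="<cluster-id>",
    )
    .getOrCreate()
)

# Quick smoke test that the remote session works.
print(spark.range(5).count())
```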
- 2742 Views
- 1 replies
- 2 kudos
Resolved! Lakebase use cases
1. What are the use cases for Lakebase? When should I use Lakebase Postgres over Delta tables? 2. What are the differences between open-source Postgres and Lakebase? 3. Should I utilize Lakebase for all OLTP requirements?
Hi @Sharanya13, 1. Use Lakebase whenever you have an application (OLTP) workload and you require low latency. For analytical workloads use the Lakehouse. Here are a couple of example use cases from the documentation: serving data and/or features from the lake...
- 2236 Views
- 6 replies
- 2 kudos
Out of memory error when installing environment dependencies of UC Python UDF
Hi, I've created a small UC Python UDF to test whether it works with custom dependencies (new Public Preview feature), and every time I'm getting OOM errors with this message: [UDF_ENVIRONMENT_USER_ERROR.OUT_OF_MEMORY] Failed to install UDF dependencies for <cata...
I tried with a cluster and spent a couple of hours trying to load some libraries, but I was unable to. Maybe someone else can help you with this.
- 2044 Views
- 5 replies
- 4 kudos
Resolved! Metastore deletion issues
Good afternoon, I have an issue with my metastore in North Europe. All my workspaces got detached. If I go to the Databricks console, I can see the metastore in North Europe that I created. However, when I select the metastore in North Europe, I get the followin...
I solved the issue by deleting all the assignments before deleting the metastore.
1. Access the Databricks CLI and authenticate.
2. List metastores: >> databricks account metastores list
3. List workspaces and check assignments: >> databricks account worksp...
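The same steps, sketched with the Databricks Python SDK instead of the CLI (the metastore ID is a placeholder, and the account-level method names are as I understand the SDK, so verify against your version; double-check the assignment list before deleting anything):

```python
# Hedged sketch: remove every workspace assignment for a metastore, then delete it.
# The metastore ID is a placeholder; account host/ID/credentials come from the
# environment or a configured CLI profile.
from databricks.sdk import AccountClient

a = AccountClient()

metastore_id = "<metastore-id>"

# Remove all workspace assignments for this metastore.
for workspace_id in a.metastore_assignments.list(metastore_id=metastore_id):
    a.metastore_assignments.delete(workspace_id=workspace_id, metastore_id=metastore_id)

# Once nothing is assigned, the metastore itself can be deleted.
a.metastores.delete(metastore_id, force=True)
```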
- 1239 Views
- 1 replies
- 1 kudos
Drop schema or catalog using the CASCADE option
Hello, in Databricks (non-Unity Catalog), I have two schemas (schema_a and schema_b) that both use the same root location in DBFS or external storage like ADLS. Example: abfss://container@storage_account.dfs.core.windows.net/data/project/schema_a abfss:/...
Hello @EjB, for the given example, here is the response. Will DROP SCHEMA schema_a CASCADE remove or affect tables in schema_b? No, unless:
1. The tables in schema_a are managed tables, AND
2. Tables in schema_b store their data physically inside /schema_...
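A small sketch, assuming Delta tables and a notebook where spark is already defined, to confirm where each table in schema_b physically stores its data before running DROP SCHEMA schema_a CASCADE:

```python
# Sketch: print the storage location of every table in schema_b so you can
# confirm none of them live under schema_a's root before dropping it.
# Assumes Delta tables and a notebook context where `spark` exists.
tables = spark.sql("SHOW TABLES IN schema_b").collect()

for row in tables:
    detail = spark.sql(f"DESCRIBE DETAIL schema_b.{row.tableName}").collect()[0]
    print(row.tableName, detail.location)
```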