- 347 Views
- 1 reply
- 1 kudos
Resolved! Delta share not showing in delta shared with me
Hi Everyone, We just started using Databricks, and we were expecting to receive a Delta Share from a third-party provider. They’ve confirmed that the sharing process has been completed on their end. However, the shared data is not appearing on our porta...
You need the USE PROVIDER privilege on the recipient workspace's assigned metastore (or you need to be a metastore admin). You will then see the provider's Delta Sharing org name in SHOW PROVIDERS, and you can then mount their share as a catalog. Let me know h...
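For reference, a minimal sketch of that flow from a notebook; the provider, share, and catalog names below are placeholders, and the real ones come from the SHOW PROVIDERS / SHOW SHARES output:

```python
# Sketch only: provider/share/catalog names are placeholders.
# Requires USE PROVIDER on the metastore (or metastore admin) plus CREATE CATALOG.

# 1. List the Delta Sharing providers visible to you.
display(spark.sql("SHOW PROVIDERS"))

# 2. List the shares a given provider has made available.
display(spark.sql("SHOW SHARES IN PROVIDER `acme_corp`"))

# 3. Mount a share as a catalog so its tables show up in Unity Catalog.
spark.sql("""
  CREATE CATALOG IF NOT EXISTS acme_sales
  USING SHARE `acme_corp`.`sales_share`
""")
```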
- 4371 Views
- 3 replies
- 2 kudos
Resolved! Restricting Catalog and External Location Visibility Across Databricks Workspaces
Hi Databricks Community, I need some guidance regarding catalogs and external locations across multiple environments. Here’s my situation: I’ve set up a resource group (dev-rg) and created a Databricks workspace where I successfully created catalogs (b...
I am facing the exact same issue. I don't want to create a separate metastore, and I have added the environment name as a prefix to all external locations. All the locations are restricted to their workspaces, so functionality-wise everything is fine. My co...
- 986 Views
- 4 replies
- 5 kudos
Resolved! Azure Databricks Control Plane connectivity issue after migrating to vWAN
Hello everyone, Recently I received a client request to migrate our Azure Databricks environment from a Hub-and-Spoke architecture to a vWAN Hub architecture with an NVA (Network Virtual Appliance). Here’s a quick overview of the setup: The Databricks ...
@nodeb Can you please mark your reply as the solution? It will help other users find the resolution faster.
- 329 Views
- 1 reply
- 2 kudos
Resolved! Academy: Secondary Email purpose
Hello, I'd like to ask about the functionality of the secondary email. When I try to log in with it, it prompts me to create a new account, and it seems disconnected from my existing account. 1. How does it help with keeping access to the account in case a...
Hello @amrim! If you try logging in with your secondary email, it will prompt you to create a new account. That’s expected behaviour, as the secondary email is meant for recovery purposes. If you lose access to your primary (registration) email, plea...
- 437 Views
- 3 replies
- 2 kudos
Unable to access the account console
Hi, I was trying to add some IP whitelist entries under https://docs.databricks.com/aws/en/security/network/front-end/ip-access-list-account#gsc.tab=0 and I have locked myself out. I cannot log into my account console anymore. I need help. Your administra...
Yes, I am an account admin and can normally log into https://accounts.cloud.databricks.com via Okta to manage the account. However, because I forgot to whitelist my IP, I cannot log into the account console to do anything with my account. Databri...
- 685 Views
- 2 replies
- 0 kudos
A way to get databricks data via Rest API on behalf of user
Hello Databricks Community! I am trying to figure out the best way to get data from Databricks via the REST API (or the Python SDK, though that is not preferable) without losing information about users' permissions during authentication. The use case is that serve...
The recommended approach is OAuth 2.0 with On-Behalf-Of (OBO) authentication. This allows your server or app to act on behalf of the user, using their identity and permissions to access Databricks resources. How it works: The user authenticates via OAu...
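To make the OBO idea concrete, here is a hedged sketch that assumes your app has already completed the OAuth authorization-code flow for the signed-in user and holds that user's access token; the host, warehouse ID, and table are placeholders. It calls the SQL Statement Execution API with the user's token so Unity Catalog enforces that user's own grants:

```python
import requests

# Sketch only: assumes the OAuth authorization-code flow already ran for the
# signed-in user; host, token, warehouse_id, and the query are placeholders.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
USER_ACCESS_TOKEN = "<user oauth access token>"

# Because the request carries the *user's* token, the query is evaluated
# against that user's own Unity Catalog grants, not a service principal's.
resp = requests.post(
    f"{HOST}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {USER_ACCESS_TOKEN}"},
    json={
        "warehouse_id": "<warehouse id>",
        "statement": "SELECT * FROM main.sales.orders LIMIT 10",
        "wait_timeout": "30s",
    },
)
resp.raise_for_status()
print(resp.json()["status"]["state"])
```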
- 5124 Views
- 3 replies
- 3 kudos
Resolved! Unity Catalog Volume mounting broken by cluster environment variables (http proxy)
Hello all, I have a slightly niche issue here, albeit one that others are likely to run into. Using Databricks on Azure, my organisation has extended our WAN into the cloud, so that all compute clusters are granted a private IP address that ca...
Unfortunately, the only solution I found was to not use the proxy globally. Good luck!
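Since the workaround above amounts to not applying the proxy globally (i.e., avoiding cluster-wide HTTP_PROXY / HTTPS_PROXY environment variables), here is one hedged sketch of scoping the proxy to just the client that needs it; the proxy and API URLs are placeholders:

```python
import requests

# Sketch only: instead of cluster-wide HTTP_PROXY / HTTPS_PROXY environment variables
# (which Unity Catalog volume traffic also picks up), scope the proxy to the specific
# client that needs to reach the on-premises network. URLs below are placeholders.
session = requests.Session()
session.proxies = {
    "http": "http://proxy.corp.example.com:8080",
    "https": "http://proxy.corp.example.com:8080",
}

# Only calls made through this session go via the proxy; volume reads/writes are unaffected.
resp = session.get("https://internal-api.corp.example.com/health")
print(resp.status_code)
```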
- 229 Views
- 2 replies
- 1 kudos
Regarding External Location
If I update the ownership or privileges of an external location, will it have any impact on the associated container? Specifically, while users continue to access the container through the external location during frequent updates, could these change...
If you transfer ownership of the external location to another principal, the new owner gains full control. Existing privileges may remain intact unless explicitly revoked. However, if the new owner modifies privileges or policies, it could affect user ...
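A hedged sketch of how that plays out in SQL from a notebook; the external location name and principals are placeholders:

```python
# Sketch only: `finance_landing` and the principals are placeholders.

# Inspect current grants before changing anything.
display(spark.sql("SHOW GRANTS ON EXTERNAL LOCATION finance_landing"))

# Transfer ownership; existing grants stay in place until someone revokes them.
spark.sql("ALTER EXTERNAL LOCATION finance_landing OWNER TO `data-platform-admins`")

# Re-grant (or verify) the access that end users rely on.
spark.sql("GRANT READ FILES ON EXTERNAL LOCATION finance_landing TO `analysts`")
```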
- 211 Views
- 1 replies
- 0 kudos
Schemas
If I frequently update the schema—such as changing ownership—will it have any impact on the underlying tables or views?
Nope. If your users have the appropriate grants on the external location, then changing the owner won't have any negative effect on their ability to interact with the data.
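For illustration, a small hedged sketch; the schema and principal names are placeholders:

```python
# Sketch only: catalog/schema/principal names are placeholders.

# Change the schema owner; tables and views under it keep their own owners and grants.
spark.sql("ALTER SCHEMA main.sales OWNER TO `sales-data-owners`")

# Sanity check: the tables are still there and still queryable by existing grantees.
display(spark.sql("SHOW TABLES IN main.sales"))
```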
- 366 Views
- 2 replies
- 0 kudos
How to find when was the last time a user logged in to Azure Databricks?
Hi, I have been asked to prepare a report on Azure Databricks usage in our company. Most of the metadata I need, like a list of users, workspaces, users' rights on workspaces, etc., I already have, but I do not know how to determine when the last time was that a user logge...
Hi @schluckie, you can check using a query like the one below; just change the IP address to match your conventions, or remove it entirely. SELECT event_time, user_identity.email, source_ip_address, action_name, response.status_code FROM system.acc...
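Since the query in the reply is cut off, here is a fuller hedged sketch of the same idea against system.access.audit; the action_name filter is an assumption you should adjust to whatever login events appear in your own audit rows:

```python
# Sketch only: a fuller version of the idea in the (truncated) reply above.
# The action_name filter is an assumption; inspect your own audit rows and adjust it.
last_logins = spark.sql("""
  SELECT
    user_identity.email             AS user_email,
    MAX(event_time)                 AS last_seen,
    MAX_BY(action_name, event_time) AS last_action
  FROM system.access.audit
  WHERE action_name ILIKE '%login%'
  GROUP BY user_identity.email
  ORDER BY last_seen DESC
""")
display(last_logins)
```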
- 4190 Views
- 1 reply
- 0 kudos
VS Code Databricks Connect Cluster Configuration
I am currently setting up the VSCode extension for Databricks Connect, and it’s working fine so far. However, I have a question about cluster configurations. I want to access Unity Catalog from VSCode through the extension, and I’ve noticed that I ca...
Hi @Tito, you can use the standard cluster (formerly known as the shared cluster) from VSCode using DBConnect. Here is an example: from databricks.connect import DatabricksSession # Option 1: Use cluster_id from .databrickscfg automatically # Since ...
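The example in the reply is truncated, so here is a hedged sketch of the two usual ways to build a DatabricksSession with Databricks Connect; the profile name, host, token, and cluster ID are placeholders:

```python
from databricks.connect import DatabricksSession

# Sketch only: profile name, host, token, and cluster_id below are placeholders.

# Option 1: rely on a .databrickscfg profile (host, auth, and cluster_id come from it).
spark = DatabricksSession.builder.profile("DEFAULT").getOrCreate()

# Option 2: point at a specific standard (shared) cluster explicitly.
# spark = DatabricksSession.builder.remote(
#     host="https://adb-1234567890123456.7.azuredatabricks.net",
#     token="<personal access token>",
#     cluster_id="0123-456789-abcdefgh",
# ).getOrCreate()

# Quick check that the session reaches Unity Catalog.
spark.sql("SELECT current_catalog(), current_schema()").show()
```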
- 1897 Views
- 3 replies
- 1 kudos
Unity Catalog system tables (table_lineage, column_lineage) not populated
Hi community, We enabled the Unity Catalog system schemas (including `access`) more than 24 hours ago in our sandbox. The schemas are showing ENABLE_COMPLETED, and other system tables (like query) are working fine. However, both `system.access.table_linea...
Hi @yavuzmert, is this still an issue? Are you seeing lineage for those tables in the UI? One thing to remember about system tables is that their data is updated throughout the day. Usually, if you don't see a log for a recent event, check back later...
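If you want to poll for lineage rows for a specific table while you wait, here is a hedged sketch; the table name is a placeholder, and rows only appear for activity that happened after lineage capture was enabled:

```python
# Sketch only: the table name is a placeholder. Lineage rows land with some delay and
# only for reads/writes that happened after the system schema was enabled.
display(spark.sql("""
  SELECT event_time, entity_type, source_table_full_name, target_table_full_name
  FROM system.access.table_lineage
  WHERE target_table_full_name = 'main.sales.orders'
     OR source_table_full_name = 'main.sales.orders'
  ORDER BY event_time DESC
  LIMIT 20
"""))
```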
- 970 Views
- 2 replies
- 2 kudos
Resolved! Asset Bundle Include Glob paths not resolving recursive directories
Hello, When trying to include resource definitions in nested yaml files, the recursive paths I am specifying in the include section are not resolving as would be expected. With the include path resources/**/*.yml and a directory structure as ...
This behavior is caused by the way the Databricks CLI currently handles recursive globbing for the include section in databricks.yml files. You are not misunderstanding; this is a limitation (and partially a bug) in how the CLI resolves glob patterns...
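Given that limitation, a common workaround, sketched below, is to enumerate each directory depth explicitly in databricks.yml rather than relying on a single recursive `**` pattern; the paths are placeholders mirroring the structure described in the question:

```yaml
# Sketch only: spell out the depths you actually use instead of one recursive glob.
include:
  - resources/*.yml
  - resources/*/*.yml
  - resources/*/*/*.yml
```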
- 3655 Views
- 1 reply
- 0 kudos
How to know Legacy Metastore connection to SQL DB (used to store metadata)
I am logged into a workspace and, when checking the schemas in the legacy hive_metastore using serverless compute, I can see the schemas listed. However, when I create an all-purpose cluster and try to check the schemas in the legacy hive_metastore, I...
In Azure Databricks, the visibility difference you observe between Serverless SQL and All-Purpose Clusters when listing schemas in the hive_metastore is due to cluster-level configuration and how each environment connects to the underlying metastore....
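One hedged way to see that configuration difference for yourself is to inspect the standard external Hive metastore configuration keys from a notebook on each compute type; on a cluster using the default workspace metastore these typically come back unset:

```python
# Sketch only: run this on both the serverless and the all-purpose compute to compare.
# On a cluster wired to an external metastore these keys are usually set via cluster
# Spark config or an init script; otherwise they fall back to the default shown.
for key in [
    "spark.sql.hive.metastore.version",
    "spark.sql.hive.metastore.jars",
    "spark.hadoop.javax.jdo.option.ConnectionURL",
    "spark.hadoop.javax.jdo.option.ConnectionDriverName",
]:
    print(key, "=", spark.conf.get(key, "not set"))
```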
- 431 Views
- 1 reply
- 2 kudos
Resolved! Looking for Databricks–Kinaxis Integration or Accelerator Information
Hi Databricks Community, I’m looking for information on the partnership between Databricks and Kinaxis. Specifically: Are there any official integrations or joint solutions available between the two platforms? Does Databricks provide any accelerators, r...
Greetings @vamsi_simbus, I did some digging and have some helpful information for you. Here’s a concise summary of what’s publicly available today on Databricks + Kinaxis. Official partnership and integration scope: A formal strategic partnersh...