Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

yvishal519
by Contributor
  • 4888 Views
  • 3 replies
  • 2 kudos

Resolved! Restricting Catalog and External Location Visibility Across Databricks Workspaces

Hi Databricks Community, I need some guidance regarding catalogs and external locations across multiple environments. Here's my situation: I've set up a resource group (dev-rg) and created a Databricks workspace where I successfully created catalogs (b...

Latest Reply
eshwari
New Contributor III
  • 2 kudos

I am facing the exact same issue. I don't want to create a separate metastore, and I have added the environment name as a prefix to all external locations. All the locations are restricted to their workspaces, so functionality-wise everything is fine. My co...

2 More Replies
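One pattern for the isolation question in this thread is binding each catalog to only its environment's workspaces, so a dev catalog is simply invisible elsewhere. A rough sketch of building the request body for the Unity Catalog workspace-bindings API follows; the endpoint path, field names, and IDs are assumptions to verify against the current API reference, not a tested call:

```python
# Hypothetical sketch: restrict a Unity Catalog catalog to specific workspaces
# via the workspace-bindings REST API. Endpoint shape and field names are
# assumptions based on public docs; verify against your API version.

def catalog_binding_payload(add_workspace_ids, remove_workspace_ids=()):
    """Build a PATCH body for /api/2.1/unity-catalog/bindings/catalog/{name}."""
    return {
        "add": [
            {"workspace_id": wid, "binding_type": "BINDING_TYPE_READ_WRITE"}
            for wid in add_workspace_ids
        ],
        "remove": [{"workspace_id": wid} for wid in remove_workspace_ids],
    }

# Placeholder workspace IDs for illustration only.
payload = catalog_binding_payload([1234567890], remove_workspace_ids=[987654321])
```

The catalog must also be switched from the default "open to all workspaces" mode to isolated for bindings to take effect.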
amrim
by New Contributor III
  • 488 Views
  • 1 reply
  • 2 kudos

Resolved! Academy: Secondary Email purpose

Hello, I'd like to ask about the functionality of the secondary email. When I try to log in with it, it prompts me to create a new account and seems disconnected from my existing account. 1. How does it help with keeping access to the account in case a...

Latest Reply
Advika
Community Manager
  • 2 kudos

Hello @amrim! If you try logging in with your secondary email, it will prompt you to create a new account. That’s expected behaviour, as the secondary email is meant for recovery purposes. If you lose access to your primary (registration) email, plea...

magicgrin
by New Contributor II
  • 699 Views
  • 3 replies
  • 2 kudos

Unable to access the account console

Hi, I was trying to add some IP whitelist entries under https://docs.databricks.com/aws/en/security/network/front-end/ip-access-list-account#gsc.tab=0 and I have locked myself out. I cannot log into my account console anymore. I need help. Your administra...

Latest Reply
magicgrin
New Contributor II
  • 2 kudos

Yes, I am an account admin and can normally log into https://accounts.cloud.databricks.com via Okta to manage the account. However, because I forgot to whitelist my IP, I cannot log into the account console to do anything to my account. Databri...

2 More Replies
maikel
by Contributor II
  • 1803 Views
  • 2 replies
  • 0 kudos

A way to get Databricks data via the REST API on behalf of a user

Hello Databricks Community! I am trying to figure out the best way to get data from Databricks via the REST API (or the Python SDK, though that is less preferable) without losing information about users' permissions during authentication. The use case is that a serve...

Latest Reply
nayan_wylde
Esteemed Contributor II
  • 0 kudos

The recommended approach is OAuth 2.0 with On-Behalf-Of (OBO) authentication. This allows your server or app to act on behalf of the user, using their identity and permissions to access Databricks resources. How it works: The user authenticates via OAu...

1 More Replies
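The core of the OBO idea in the reply is that the backend forwards the OAuth access token obtained for the signed-in user instead of its own service credentials, so Unity Catalog enforces that user's permissions. A minimal sketch, assuming the SQL Statement Execution API as the data endpoint (host, warehouse ID, and token are placeholders):

```python
# Sketch: build a request authenticated as the end user rather than a
# service principal. Host and warehouse_id are illustrative placeholders.

def user_scoped_request(user_access_token, sql_text,
                        host="https://example-workspace.cloud.databricks.com",
                        warehouse_id="WAREHOUSE_ID"):
    """Return (url, headers, body) for a user-scoped SQL statement call."""
    url = f"{host}/api/2.0/sql/statements"
    # The user's token, not the app's, so row/column/table grants apply to them.
    headers = {"Authorization": f"Bearer {user_access_token}"}
    body = {"warehouse_id": warehouse_id, "statement": sql_text}
    return url, headers, body

url, headers, body = user_scoped_request("user-token-placeholder", "SELECT 1")
```

The actual token acquisition (authorization-code flow with token exchange) happens in the identity provider before this point.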
Seb_G
by New Contributor III
  • 5912 Views
  • 3 replies
  • 3 kudos

Resolved! Unity Catalog Volume mounting broken by cluster environment variables (http proxy)

Hello all, I have a slightly niche issue here, albeit one that others are likely to run into. Using Databricks on Azure, my organisation has extended our WAN into the cloud, so that all compute clusters are granted a private IP address that ca...

Latest Reply
Seb_G
New Contributor III
  • 3 kudos

Unfortunately, the only solution I found was to not use the proxy globally. Good luck!

2 More Replies
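Short of dropping the proxy globally, as the accepted answer did, a workaround sometimes used for this class of problem is to keep the proxy for general egress but exempt the storage endpoints that volume mounts talk to via NO_PROXY. A sketch, with hostnames that are assumptions for Azure storage (check your cloud's actual endpoints):

```python
# Sketch: scope the cluster proxy so ADLS traffic bypasses it.
# Proxy host and exempted domains are illustrative assumptions.
import os

proxy_env = {
    "HTTPS_PROXY": "http://proxy.corp.example:8080",
    "HTTP_PROXY": "http://proxy.corp.example:8080",
    # Bypass the proxy for Azure storage traffic used by UC volume mounts.
    "NO_PROXY": ".dfs.core.windows.net,.blob.core.windows.net,localhost,127.0.0.1",
}
os.environ.update(proxy_env)
```

In practice these would be set as cluster environment variables rather than at runtime, and whether the mount code honors NO_PROXY depends on the runtime version.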
satishreddy1326
by New Contributor
  • 370 Views
  • 2 replies
  • 1 kudos

Regarding External Location

If I update the ownership or privileges of an external location, will it have any impact on the associated container? Specifically, while users continue to access the container through the external location during frequent updates, could these change...

Administration & Architecture
Databricks
External Location
Latest Reply
nayan_wylde
Esteemed Contributor II
  • 1 kudos

If you transfer ownership of the external location to another principal, the new owner gains full control. Existing privileges may remain intact unless explicitly revoked. However, if the new owner modifies privileges or policies, it could affect user ...

1 More Replies
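To make the reply concrete, the ownership transfer and a re-grant that keeps existing consumers working are both single SQL statements. A sketch with placeholder names (the statements below are standard Unity Catalog SQL, but verify the privilege list against your workspace):

```python
# Illustrative SQL for transferring external-location ownership and
# re-granting access. Location, principal, and group names are placeholders.
location, new_owner, group = "my_ext_loc", "platform-admins", "data-engineers"

alter_owner = f"ALTER EXTERNAL LOCATION `{location}` OWNER TO `{new_owner}`"
regrant = (
    f"GRANT READ FILES, WRITE FILES "
    f"ON EXTERNAL LOCATION `{location}` TO `{group}`"
)
```

Neither statement touches the underlying container itself; only the governance metadata changes.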
satishreddy1326
by New Contributor
  • 289 Views
  • 1 reply
  • 0 kudos

Schemas

If I frequently update the schema—such as changing ownership—will it have any impact on the underlying tables or views?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Nope. If your users have the appropriate grants on the external location, then changing the owner won't have any negative effect on their ability to interact with the data.

schluckie
by New Contributor
  • 524 Views
  • 2 replies
  • 0 kudos

How to find when was the last time a user logged in to Azure Databricks?

Hi, I have been asked to prepare a report on Azure Databricks usage in our company. Most of the metadata I need, like a list of users, workspaces, user rights on workspaces, etc., I already have, but I do not know how to determine when the last time was that a user logge...

Latest Reply
ncf5031
New Contributor II
  • 0 kudos

Hi @schluckie, you can check using a query like the one below; just change the IP address to match your conventions, or remove it entirely. SELECT event_time, user_identity.email, source_ip_address, action_name, response.status_code FROM system.acc...

1 More Replies
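Building on the reply's truncated query, a fuller version aggregates to one row per user. The table is system.access.audit as the reply shows; the exact login action names vary by cloud and auth method, so the list below is an assumption to adjust:

```python
# Sketch of a "last successful login per user" query over the audit system
# table. Action names are illustrative; inspect DISTINCT action_name first.
last_login_sql = """
SELECT user_identity.email,
       MAX(event_time) AS last_login
FROM system.access.audit
WHERE action_name IN ('login', 'tokenLogin', 'oidcBrowserLogin')
  AND response.status_code = 200
GROUP BY user_identity.email
ORDER BY last_login DESC
"""
```

Run it from a SQL warehouse or via spark.sql(last_login_sql) in a notebook; note that audit system tables require the account-level schemas to be enabled.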
Tito
by New Contributor II
  • 4717 Views
  • 1 reply
  • 0 kudos

VS Code Databricks Connect Cluster Configuration

I am currently setting up the VSCode extension for Databricks Connect, and it’s working fine so far. However, I have a question about cluster configurations. I want to access Unity Catalog from VSCode through the extension, and I’ve noticed that I ca...

Latest Reply
dkushari
Databricks Employee
  • 0 kudos

Hi @Tito, you can use the standard cluster (formerly known as the shared cluster) from VSCode using DBConnect. Here is an example: from databricks.connect import DatabricksSession # Option 1: Use cluster_id from .databrickscfg automatically # Since ...

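The reply's "Option 1" relies on Databricks Connect reading the cluster ID from a configuration profile. As a standalone illustration of where that value lives, the sketch below parses an example .databrickscfg with the standard library; on a real setup DatabricksSession reads this file itself, and the host and cluster ID shown are placeholders:

```python
# Sketch: a .databrickscfg profile is plain INI; DBConnect picks cluster_id
# up from here. Values below are placeholders for illustration.
import configparser

cfg_text = """
[DEFAULT]
host = https://example-workspace.azuredatabricks.net
cluster_id = 0123-456789-abcdefgh
"""
cfg = configparser.ConfigParser()
cfg.read_string(cfg_text)
cluster_id = cfg["DEFAULT"]["cluster_id"]
```

With such a profile in place, DatabricksSession.builder.getOrCreate() can attach to the configured cluster without hard-coding IDs in notebook code.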
yavuzmert
by New Contributor II
  • 2455 Views
  • 3 replies
  • 1 kudos

Unity Catalog system tables (table_lineage, column_lineage) not populated

Hi community, we enabled Unity Catalog system schemas (including `access`) more than 24 hours ago in sandbox. The schemas are showing ENABLE_COMPLETED, and other system tables (like query) are working fine. However, both `system.access.table_linea...

Latest Reply
dkushari
Databricks Employee
  • 1 kudos

Hi @yavuzmert, is this still an issue? Are you seeing lineage for those tables in the UI? One thing to remember about system tables is that their data is updated throughout the day. Usually, if you don't see a log for a recent event, check back later...

2 More Replies
DavidMoss
by New Contributor
  • 2037 Views
  • 2 replies
  • 2 kudos

Resolved! Asset Bundle Include Glob paths not resolving recursive directories

Hello, when trying to include resource definitions in nested YAML files, the recursive paths I am specifying in the include section are not resolving as expected. With the include path resources/**/*.yml and a directory structure as ...

Latest Reply
mark_ott
Databricks Employee
  • 2 kudos

This behavior is caused by the way the Databricks CLI currently handles recursive globbing for the include section in databricks.yml files. You are not misunderstanding; this is a limitation (and partially a bug) in how the CLI resolves glob patterns...

1 More Replies
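For contrast with the CLI behavior described in the accepted answer, Python's pathlib treats ** in the expected way, matching files at every depth; the CLI's glob resolution differs, which is the limitation the reply describes. A small reproduction sketch:

```python
# Sketch: what resources/**/*.yml is usually expected to match. Python's
# pathlib ** matches zero or more directory levels, so both depths appear;
# the Databricks CLI's include globbing resolves this differently.
import pathlib
import tempfile

root = pathlib.Path(tempfile.mkdtemp())
(root / "resources" / "jobs").mkdir(parents=True)
(root / "resources" / "pipeline.yml").write_text("")      # depth 1
(root / "resources" / "jobs" / "etl.yml").write_text("")  # depth 2

matches = sorted(p.relative_to(root).as_posix()
                 for p in root.glob("resources/**/*.yml"))
```

A common workaround until the CLI behavior changes is to list each directory level explicitly in the include section (e.g. resources/*.yml plus resources/*/*.yml).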
bhanu_dp
by New Contributor III
  • 4221 Views
  • 1 reply
  • 0 kudos

How to know Legacy Metastore connection to SQL DB (used to store metadata)

I am logged into a workspace and, when using serverless compute to check the schemas in the legacy hive_metastore, I can see the schemas listed. However, when I create an all-purpose cluster and try to check the schemas in the legacy hive_metastore, I...

Administration & Architecture
configuration
Hive
metastore
Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

In Azure Databricks, the visibility difference you observe between Serverless SQL and All-Purpose Clusters when listing schemas in the hive_metastore is due to cluster-level configuration and how each environment connects to the underlying metastore....

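If the gap is indeed cluster-level configuration, an all-purpose cluster pointed at an external Hive metastore typically carries Spark settings like the following. The keys are the standard external-metastore configuration; every value here is a placeholder to replace with your own connection details:

```python
# Sketch: Spark conf for attaching an all-purpose cluster to an external
# Hive metastore database. All values are illustrative placeholders; the
# password should come from a secret scope, never plain text.
hive_metastore_conf = {
    "spark.sql.hive.metastore.version": "3.1.0",
    "spark.sql.hive.metastore.jars": "builtin",
    "spark.hadoop.javax.jdo.option.ConnectionURL":
        "jdbc:sqlserver://example-sql.database.windows.net:1433;database=hivemeta",
    "spark.hadoop.javax.jdo.option.ConnectionDriverName":
        "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    "spark.hadoop.javax.jdo.option.ConnectionUserName": "metastore_user",
    "spark.hadoop.javax.jdo.option.ConnectionPassword": "{{secrets/scope/key}}",
}
```

Comparing the serverless environment's effective configuration with the all-purpose cluster's Spark conf is a quick way to confirm this is the difference.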
vamsi_simbus
by Contributor
  • 1334 Views
  • 1 reply
  • 2 kudos

Resolved! Looking for Databricks–Kinaxis Integration or Accelerator Information

Hi Databricks Community, I'm looking for information on the partnership between Databricks and Kinaxis. Specifically: Are there any official integrations or joint solutions available between the two platforms? Does Databricks provide any accelerators, r...

Latest Reply
Louis_Frolio
Databricks Employee
  • 2 kudos

Greetings @vamsi_simbus, I did some digging and have some helpful information for you. Here's a concise summary of what's publicly available today on Databricks + Kinaxis. Official partnership and integration scope: A formal strategic partnersh...

Barnita
by New Contributor III
  • 934 Views
  • 4 replies
  • 2 kudos

Resolved! How to run black code-formating on the notebooks using custom configurations in UI

Hi all, I'm currently exploring how we can format notebook code using Black (installed via libraries) with specific configurations. I understand that we can configure Black locally using a pyproject.toml file. However, I'd like to know if there's a way...

Latest Reply
Barnita
New Contributor III
  • 2 kudos

Hi @szymon_dybczak, thanks for your response. My team has been using the same setup you mentioned. I'd like to know if there's a way to override the default configuration that Black uses in a cluster environment, for example, adjusting the line-leng...

3 More Replies
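For reference, the configuration the poster wants to override lives in Black's [tool.black] table in pyproject.toml; whether the notebook UI formatter picks up a workspace-level copy of this file is exactly the open question in the thread. A hypothetical fragment, with line-length as the example setting mentioned:

```toml
# Hypothetical pyproject.toml fragment for Black's configuration.
[tool.black]
line-length = 120
skip-string-normalization = true
```

Locally and in CI this file is honored automatically when it sits at the project root; for the cluster-installed Black, invoking it against files from a directory containing this pyproject.toml applies the same settings.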
MaximeGendre
by New Contributor III
  • 1094 Views
  • 4 replies
  • 4 kudos

Resolved! Disable SQL Warehouses during weekends

Hello, I massively deployed SQL Warehouses in our data platform. Right now, most of them are running every hour (with some inactivity phases) because of Power BI report/job schedules. To limit cost, I would like to stop/disable some of them on Friday e...

Latest Reply
nayan_wylde
Esteemed Contributor II
  • 4 kudos

I'd also like to provide you with some alternate options. Tagging & Monitoring: Use tags and cost dashboards to monitor weekend usage and identify high-cost warehouses for manual intervention. Serverless SQL Warehouses: If not already in use, consider swi...

3 More Replies
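One way to wire up the weekend shutdown discussed here is a small scheduled job that checks the clock and stops selected warehouses via the REST API. The API call itself is omitted below; only the pure scheduling check is shown, as a sketch under the assumption of a Friday 19:00 cutoff (adjust to your own window and time zone):

```python
# Sketch: decide whether a tagged warehouse should be stopped for the
# weekend. A job running hourly could call this and then issue
# POST /api/2.0/sql/warehouses/{id}/stop for each warehouse it governs.
from datetime import datetime

def should_be_stopped(now: datetime) -> bool:
    """True from Friday 19:00 through the end of Sunday (local time)."""
    weekday = now.weekday()  # Monday=0 ... Sunday=6
    if weekday == 4:         # Friday: only after the evening cutoff
        return now.hour >= 19
    return weekday >= 5      # all of Saturday and Sunday
```

Because warehouses auto-start on incoming queries, pairing this with pausing the Power BI refresh schedules (or restricting weekend permissions) is what actually keeps them down.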