Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

sparkplug
by New Contributor III
  • 578 Views
  • 6 replies
  • 3 kudos

Resolved! I need a switch to turn off Data Apps in databricks workspaces

Hi, how do I disable Data Apps on my workspace? This is really annoying: Databricks pushes new features without any option to disable them. At the least, you should have some tools to control access before rolling a feature out. It seems you only care about fe...

Latest Reply
Louis_Frolio
Databricks Employee
  • 3 kudos

It is presently not an option at the workspace level. Regards, Louis.

5 More Replies
apjeskeaa
by New Contributor II
  • 325 Views
  • 4 replies
  • 1 kudos

Resolved! Can a Databricks Workspace be renamed after creation?

A Databricks workspace has already been created with all configurations completed. The customer has now requested to change the workspace name. Is it possible to rename an existing Databricks workspace after creation?

Latest Reply
jeffreyaven
Databricks Employee
  • 1 kudos

Yes, you can safely rename it. The workspace name is largely cosmetic - it won't affect the actual workspace functionality, API endpoints, or integrations since those all rely on the deployment name/URL (which doesn't change). That said, just a heads...

3 More Replies
sfibich1
by New Contributor II
  • 206 Views
  • 3 replies
  • 1 kudos

Resolved! API call to /api/2.0/serving-endpoints/{name}/ai-gateway does not support tokens or principals

From my reading of the documentation, /api/2.0/serving-endpoints/{name}/ai-gateway supports a "tokens" and a "principals" attribute in the JSON payload. Documentation link: Update AI Gateway of a serving endpoint | Serving endpoints ...

Latest Reply
jeffreyaven
Databricks Employee
  • 1 kudos

I dug a bit deeper on this: these properties are supported, but not as top-level request body fields; instead, they are available in object element fields under `rate_limits`. The actual payload looks like:

```
{
    "guardrails": { /* ... */ },
    ...
```
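To make the nesting concrete, here is a minimal Python sketch of such a request body. The exact field names inside the `rate_limits` entries are assumptions drawn from the (truncated) reply, not confirmed API fields; check the API reference before using them.

```python
import json

# Hypothetical ai-gateway update body: per the reply above, "principals"
# lives inside the rate_limits entries rather than at the top level.
# Field names flagged below are assumptions, not confirmed API fields.
payload = {
    "rate_limits": [
        {
            "calls": 100,                 # allowed calls per renewal period
            "renewal_period": "minute",
            "key": "service_principal",   # assumed key type
            "principals": ["my-sp-application-id"],  # assumed nested field
        }
    ],
}
body = json.dumps(payload)
print(body)
```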

2 More Replies
PNC
by New Contributor II
  • 138 Views
  • 1 replies
  • 0 kudos

Using Terraform to GRANT SELECT ON ANY FILE securable

I have a use case where service principals will read .csv files from an Azure Storage Account and create views from them. This used to work in our legacy environment, but we are currently migrating to Unity Catalog, and when we tested our existing jobs we...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 0 kudos

You can try using this code:

```
resource "databricks_grants" "any_file_select_grant" {
  principal = "your_user_or_group_name" // Replace with the actual user or group
  privileges {
    privilege_type = "SELECT"
    securable_type = "ANY FILE"
  }
}
```
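If Terraform isn't a hard requirement, the same grant can be issued directly in SQL. A small Python sketch rendering that statement (the principal name is a placeholder):

```python
# SQL equivalent of the Terraform grant above; the principal is a placeholder
# that should be replaced with your actual user or group name.
principal = "your_user_or_group_name"
stmt = f"GRANT SELECT ON ANY FILE TO `{principal}`"
print(stmt)
```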

MMJ
by New Contributor
  • 150 Views
  • 1 replies
  • 1 kudos

Resolved! Delta share not showing in delta shared with me

Hi Everyone, We just started using Databricks, and we were expecting to receive a Delta Share from a third-party provider. They’ve confirmed that the sharing process has been completed on their end. However, the shared data is not appearing on our porta...

Latest Reply
jeffreyaven
Databricks Employee
  • 1 kudos

You need USE PROVIDER privileges on the recipient workspace's assigned metastore (or you need to be a metastore admin). You will then see the provider's Delta Sharing org name in SHOW PROVIDERS, and you can then mount their share as a catalog. Let me know h...
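The steps in the reply can be sketched as a SQL sequence (shown here as Python strings; the admin email, catalog, provider, and share names are all placeholders):

```python
# The sequence from the reply, as SQL statements. All names are placeholders.
recipient_admin = "me@example.com"
steps = [
    # 1. A metastore admin grants the privilege to see providers:
    f"GRANT USE PROVIDER ON METASTORE TO `{recipient_admin}`",
    # 2. Find the provider's Delta Sharing org name:
    "SHOW PROVIDERS",
    # 3. Mount the share as a catalog:
    "CREATE CATALOG shared_data USING SHARE `provider_org`.`their_share`",
]
for s in steps:
    print(s)
```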

yvishal519
by Contributor
  • 3918 Views
  • 3 replies
  • 2 kudos

Resolved! Restricting Catalog and External Location Visibility Across Databricks Workspaces

Hi Databricks Community, I need some guidance regarding catalogs and external locations across multiple environments. Here's my situation: I've set up a resource group (dev-rg) and created a Databricks workspace where I successfully created catalogs (b...

Latest Reply
eshwari
New Contributor III
  • 2 kudos

I am facing the exact same issue. I don't want to create a separate metastore, and I have added the environment name as a prefix to all external locations. All the locations are restricted to their workspaces, so functionality-wise everything is fine. My co...

2 More Replies
nodeb
by New Contributor II
  • 444 Views
  • 4 replies
  • 5 kudos

Resolved! Azure Databricks Control Plane connectivity issue after migrating to vWAN

Hello everyone, Recently, I received a client request to migrate our Azure Databricks environment from a Hub-and-Spoke architecture to a vWAN Hub architecture with an NVA (Network Virtual Appliance). Here’s a quick overview of the setup: The Databricks ...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 5 kudos

@nodeb Can you please mark your reply as solution. It will help other users find the resolution fast.

3 More Replies
amrim
by New Contributor II
  • 80 Views
  • 1 replies
  • 1 kudos

Academy: Secondary Email purpose

Hello, I'd like to ask about the functionality of the secondary email. When I try to log in with it, it prompts me to create a new account, and it seems disconnected from my existing account. 1. How does it help with keeping access to the account in case a...

Latest Reply
Advika
Databricks Employee
  • 1 kudos

Hello @amrim! If you try logging in with your secondary email, it will prompt you to create a new account. That’s expected behaviour, as the secondary email is meant for recovery purposes. If you lose access to your primary (registration) email, plea...

magicgrin
by New Contributor II
  • 244 Views
  • 3 replies
  • 2 kudos

Unable to access the account console

Hi, I was trying to add some IP whitelist entries under https://docs.databricks.com/aws/en/security/network/front-end/ip-access-list-account#gsc.tab=0 and I have locked myself out. I cannot log into my account console anymore. I need help. Your administra...

Latest Reply
magicgrin
New Contributor II
  • 2 kudos

Yes, I am an account admin and can normally log into https://accounts.cloud.databricks.com via Okta to manage the account. However, because I forgot to whitelist my own IP, I cannot log into the account console to do anything to my account. Databri...

2 More Replies
maikel
by New Contributor II
  • 283 Views
  • 2 replies
  • 0 kudos

A way to get databricks data via Rest API on behalf of user

Hello Databricks Community! I am trying to figure out the best way to get data from Databricks via the REST API (or the Python SDK, though that is not preferable) without losing information about user permissions during authentication. The use case is that serve...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 0 kudos

The recommended approach is OAuth 2.0 with On-Behalf-Of (OBO) authentication. This allows your server or app to act on behalf of the user, using their identity and permissions to access Databricks resources. How it works: The user authenticates via OAu...
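As a rough sketch of the first leg (exchanging the user's authorization code for a token), assuming Databricks' standard workspace OAuth endpoint; the parameter names follow standard OAuth 2.0, and all concrete values below are placeholders:

```python
from urllib.parse import urlencode

# Sketch of the authorization-code token exchange an app performs after the
# user signs in. The /oidc/v1/token path and all values are assumptions /
# placeholders; consult the OAuth docs for your workspace before relying on them.
workspace = "https://adb-1234567890123456.7.azuredatabricks.net"
token_url = f"{workspace}/oidc/v1/token"
params = {
    "grant_type": "authorization_code",
    "code": "<code-from-redirect>",        # returned to your redirect URI
    "client_id": "<app-client-id>",
    "redirect_uri": "https://myapp.example.com/callback",
    "code_verifier": "<pkce-verifier>",    # PKCE verifier for the code challenge
}
body = urlencode(params)
print(token_url)
```

The resulting token carries the user's identity, so any subsequent REST calls made with it are evaluated against that user's permissions rather than a service principal's.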

1 More Replies
Seb_G
by New Contributor III
  • 4675 Views
  • 3 replies
  • 3 kudos

Resolved! Unity Catalog Volume mounting broken by cluster environment variables (http proxy)

Hello all, I have a slightly niche issue here, albeit one that others are likely to run into. Using Databricks on Azure, my organisation has extended our WAN into the cloud, so that all compute clusters are granted a private IP address that ca...

Latest Reply
Seb_G
New Contributor III
  • 3 kudos

Unfortunately, the only solution I found was to not use the proxy globally. Good luck!

2 More Replies
satishreddy1326
by New Contributor
  • 114 Views
  • 2 replies
  • 1 kudos

Regarding External Location

If I update the ownership or privileges of an external location, will it have any impact on the associated container? Specifically, while users continue to access the container through the external location during frequent updates, could these change...

Administration & Architecture
Databricks
External Location
Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

If you transfer ownership of the external location to another principal, the new owner gains full control. Existing privileges may remain intact unless explicitly revoked. However, if the new owner modifies privileges or policies, it could affect user ...

1 More Replies
satishreddy1326
by New Contributor
  • 130 Views
  • 1 replies
  • 0 kudos

Schemas

If I frequently update the schema—such as changing ownership—will it have any impact on the underlying tables or views?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Nope. If your users have the appropriate grants on the external location, then changing the owner won't have any negative effect on their ability to interact with the data.

schluckie
by New Contributor
  • 211 Views
  • 2 replies
  • 0 kudos

How to find when was the last time a user logged in to Azure Databricks?

Hi, I have been asked to prepare a report on Azure Databricks usage in our company. Most of the metadata I need, like a list of users, workspaces, user rights on workspaces, etc., I already have, but I do not know how to determine when the last time was that a user logge...

Latest Reply
ncf5031
New Contributor II
  • 0 kudos

Hi @schluckie, you can check using a query like the one below; just change the IP address to match your conventions, or remove that filter entirely.

```
SELECT event_time, user_identity.email, source_ip_address, action_name, response.status_code
FROM system.acc...
```
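A possible full form of that query, built as a Python string; the `system.access.audit` table name matches the reply's column list, but the `action_name` filter values are assumptions to verify against your own audit log events:

```python
# Sketch completing the reply's audit-log query. The login action names in
# the WHERE clause are assumptions; inspect your audit logs to confirm them.
query = """
SELECT event_time, user_identity.email, source_ip_address,
       action_name, response.status_code
FROM system.access.audit
WHERE action_name IN ('login', 'tokenLogin')
ORDER BY event_time DESC
"""
print(query)
```

Taking the most recent row per `user_identity.email` then gives the last login time per user.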

1 More Replies
Tito
by New Contributor II
  • 3869 Views
  • 1 replies
  • 0 kudos

VS Code Databricks Connect Cluster Configuration

I am currently setting up the VSCode extension for Databricks Connect, and it’s working fine so far. However, I have a question about cluster configurations. I want to access Unity Catalog from VSCode through the extension, and I’ve noticed that I ca...

Latest Reply
dkushari
Databricks Employee
  • 0 kudos

Hi @Tito, you can use a standard cluster (formerly known as a shared cluster) from VSCode using DBConnect. Here is an example:

```
from databricks.connect import DatabricksSession

# Option 1: Use cluster_id from .databrickscfg automatically
# Since ...
```
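For context, a minimal `.databrickscfg` profile that the "use cluster_id automatically" option would pick up; the host URL and cluster id below are placeholders, and the exact keys honoured depend on your databricks-connect version:

```ini
[DEFAULT]
host       = https://adb-1234567890123456.7.azuredatabricks.net
cluster_id = 0123-456789-abcdefgh
```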
