Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

PiotrM
by New Contributor III
  • 2638 Views
  • 4 replies
  • 3 kudos

Drop table - permission management

Hello, I'm trying to wrap my head around permission management for dropping tables in UC-enabled schemas. According to the docs: To drop a table you must have the MANAGE privilege on the table, be its owner, or the owner of the schema, catalog, or meta...

Latest Reply
Alberto_Umana
Databricks Employee
  • 3 kudos

Hi @PiotrM, I see there is a feature request already in place. It's been considered for the future: https://databricks.aha.io/ideas/ideas/DB-I-7480

3 More Replies
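The rule quoted from the docs above can be captured as a small predicate. This is only a sketch to make the rule concrete; the user names, privilege sets, and ownership values are illustrative, not from a real workspace.

```python
# Sketch of the documented drop-table rule: a user may drop a table if they
# hold MANAGE on it, or they own the table, or they own its schema, catalog,
# or metastore. All names and inputs below are illustrative.

def can_drop_table(user, table_privileges, owners):
    """table_privileges: set of privileges `user` holds on the table.
    owners: dict mapping 'table'/'schema'/'catalog'/'metastore' to an owner name."""
    if "MANAGE" in table_privileges:
        return True
    return any(owners.get(level) == user
               for level in ("table", "schema", "catalog", "metastore"))

# A user with only SELECT who owns nothing cannot drop the table:
print(can_drop_table("piotr", {"SELECT"}, {"table": "admin", "schema": "admin"}))  # False
# The schema owner can, even without MANAGE:
print(can_drop_table("admin", set(), {"table": "someone", "schema": "admin"}))  # True
```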
ianc
by New Contributor
  • 1794 Views
  • 2 replies
  • 0 kudos

spark.databricks documentation

I cannot find any documentation related to the spark.databricks.* properties. I was able to find the Spark-related documentation, but it does not contain any information on possible properties or arguments for spark.databricks.* in particular. Thank you!

Latest Reply
KustoszEnjoyer
New Contributor II
  • 0 kudos

Thus, as of now, the documentation is lacking an obvious and easy-to-provide element, which can only be found partially, spread across random threads on the internet, or gained by guess-asking the platform developers. When will it be made available?

1 More Replies
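Until a reference exists, one way to at least see which spark.databricks.* properties are set on a given cluster is to filter the pairs returned by `spark.sparkContext.getConf().getAll()`. The sketch below filters a hard-coded sample list so it runs anywhere; the example key and value are illustrative only.

```python
# On a live cluster you would obtain the pairs from
# spark.sparkContext.getConf().getAll(); here we use a hard-coded sample
# so the sketch is self-contained. The spark.databricks.* key shown is
# illustrative, not an authoritative property reference.

def databricks_confs(pairs):
    """Return the (key, value) pairs whose key is in the spark.databricks namespace."""
    return sorted((k, v) for k, v in pairs if k.startswith("spark.databricks."))

sample = [
    ("spark.app.name", "my-job"),
    ("spark.databricks.delta.preview.enabled", "true"),  # illustrative
    ("spark.sql.shuffle.partitions", "200"),
]
for key, value in databricks_confs(sample):
    print(f"{key} = {value}")
```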
satycse06
by New Contributor
  • 1108 Views
  • 1 reply
  • 0 kudos

How worker nodes get the packages during scale-up?

Hi, we are working with a repository from which we download artifacts/Python packages using an index URL in a global init script, but now the logic is going to change: we need to provide credentials to download the package and th...

Latest Reply
Vidhi_Khaitan
Databricks Employee
  • 0 kudos

Yes, the new worker node will execute the global init script independently when it starts. It does not get the package from the driver or other existing nodes and will hit the configured index URL directly, and try to download the package on its own....

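Since each new worker runs the global init script independently, the authenticated index URL has to be available on every node. A minimal sketch of generating such a script follows; it only builds the script text, and the `REPO_USER`/`REPO_TOKEN` environment variables, repository host, and package name are all hypothetical placeholders.

```python
# Each node executes the global init script on startup, so the credentialed
# index URL must resolve on every node, not just the driver. This sketch only
# assembles the script text; REPO_USER, REPO_TOKEN, the host, and the package
# name are hypothetical.

def build_init_script(user_var="REPO_USER", token_var="REPO_TOKEN",
                      host="repo.example.com/simple", package="my-package"):
    return "\n".join([
        "#!/bin/bash",
        f'pip install --index-url "https://${{{user_var}}}:${{{token_var}}}@{host}" {package}',
    ])

print(build_init_script())
```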
andreapeterson
by Contributor
  • 2524 Views
  • 5 replies
  • 4 kudos

Resolved! Account level groups

When I query my user from an account client and workspace client, I get different answers. Why is this?  In addition, why can I only see some account level groups from my workspace, and not others?

Latest Reply
vr
Contributor III
  • 4 kudos

If you have a relatively modern Databricks instance, when you create a group in the workspace UI, it creates an account-level group (which you can see in the "Source" column – it says "Account"). So this process essentially consists of two steps: 1) create a...

4 More Replies
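The behavior described in this thread, where the workspace shows only the account-level groups assigned to that workspace, comes down to a set intersection. The sketch below makes that concrete; the group names and assignments are illustrative, not real SCIM data.

```python
# The account client sees every account-level group; the workspace client
# sees only those assigned to that workspace, hence the different answers.
# Group names and assignments here are illustrative.

account_groups = {"data-engineers", "analysts", "ml-team"}
workspace_assignments = {"data-engineers", "analysts"}  # assigned to this workspace

visible_in_workspace = account_groups & workspace_assignments
print(sorted(visible_in_workspace))  # ['analysts', 'data-engineers']
print(sorted(account_groups - visible_in_workspace))  # ['ml-team'] (account-only)
```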
chinmay0924
by New Contributor III
  • 674 Views
  • 1 reply
  • 0 kudos

Resolved! How to create a function using the functions API in databricks?

https://docs.databricks.com/api/workspace/functions/create This documentation gives the sample request payload, and one of the fields is type_json; there is very little explanation of what is expected in this field. What am I supposed to pass here...

Latest Reply
SP_6721
Contributor III
  • 0 kudos

Hi @chinmay0924, the type_json field describes your function's input parameters and return type using a specific JSON format. You'll need to include each parameter's name, type (like "STRING", "INT", "ARRAY", or "STRUCT"), and position, along with th...

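Following the reply's description, a parameter entry might be assembled as below. The exact field names and nesting of type_json are an assumption here; the Functions API reference remains the authoritative schema, so treat this as a shape sketch only.

```python
import json

# Sketch of building input-parameter entries per the reply: each parameter
# carries a name, a type, and a position, with type_json as a JSON-encoded
# string. Field names are an assumption; verify against the API reference.

def make_param(name, type_name, position):
    return {
        "name": name,
        "type_name": type_name,
        "type_text": type_name.lower(),
        "type_json": json.dumps({"name": name, "type": type_name.lower()}),
        "position": position,
    }

params = [make_param("x", "INT", 0), make_param("label", "STRING", 1)]
print(json.dumps(params, indent=2))
```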
LauJohansson
by Contributor
  • 1730 Views
  • 3 replies
  • 1 kudos

Terraform - Azure Databricks workspace without NAT gateway

Hi all, I have experienced an increase in costs, even when not using Databricks compute. It is due to the NAT gateway that is (suddenly) automatically deployed. When creating Azure Databricks workspaces using Terraform, a NAT gateway is created. When ...

Latest Reply
Rjdudley
Honored Contributor
  • 1 kudos

In Azure Databricks, a NAT Gateway will be required (by Microsoft) for all egress from VMs, which affects Databricks compute: Azure updates | Microsoft Azure

2 More Replies
Angus-Dawson
by New Contributor III
  • 703 Views
  • 1 reply
  • 1 kudos

Databricks Runtime 16.4 LTS has inconsistent Spark and Delta Lake versions

Per the release notes for Databricks Runtime 16.4 LTS, the environment has Apache Spark 3.5.2 and Delta Lake 3.3.1: https://docs.databricks.com/aws/en/release-notes/runtime/16.4lts However, Delta Lake 3.3.1 is built on Spark 3.5.3; the newest version o...

Latest Reply
Rjdudley
Honored Contributor
  • 1 kudos

We saw the same thing in previous runtime versions, and even a point-point version broke our code.  We actually log the spark version in one pipeline and see different versions popping up from time to time.  Apparently the long term goal is to move t...

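The "log the spark version in one pipeline" idea from the reply can be made into a lightweight guard. The sketch below compares a hard-coded observed value against the release-notes value; on a cluster you would substitute `spark.version` for the observed string.

```python
# Lightweight guard for the version drift described above: compare the
# version the runtime actually reports (here a hard-coded sample standing in
# for spark.version) against what the release notes promise.

def version_tuple(v):
    """'3.5.3' -> (3, 5, 3), so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

expected = "3.5.2"   # Spark version per the 16.4 LTS release notes
observed = "3.5.3"   # e.g. what spark.version actually reported

if version_tuple(observed) != version_tuple(expected):
    print(f"Spark version drift: expected {expected}, got {observed}")
```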
m2chrisp
by New Contributor II
  • 430 Views
  • 0 replies
  • 0 kudos

Setting catalog isolation mode and workspace bindings within a notebook using Python SDK

Hi, I have a set of notebooks which configure new catalogs, set permissions, create default schemas, attach Azure Storage accounts as external volumes, create Git Folders and set current branches, etc. All this works just fine. One thing I'm trying to a...

Marthinus
by New Contributor III
  • 1713 Views
  • 2 replies
  • 0 kudos

Resolved! External Locations to Azure Storage via Private Endpoint

When working with Azure Databricks (with VNet injection) to connect securely to an Azure Storage account via private endpoint, there are a few locations it needs to connect to; firstly the VNet that Databricks is connected to, which works well when con...

Latest Reply
Marthinus
New Contributor III
  • 0 kudos

I've read that in the documentation, and when I now tried with an Access Connector for Azure Databricks instead of my own service principal, it seems to have worked, shockingly, even if I completely block network access on the storage account with ze...

1 More Replies
PradeepPrabha
by New Contributor II
  • 1247 Views
  • 5 replies
  • 0 kudos

Any documentation on connectivity from Azure SQL database to Azure Databricks

Is there any documentation available on connecting from an Azure SQL database to an Azure Databricks SQL workspace? We created a SQL warehouse personal access token for a user on a different team who can connect from his on-prem SQL DB to Databricks using the conn...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

Here are some considerations:
  • SQL Trigger: Define a trigger in Azure SQL that activates on specific DML operations (e.g., INSERT, UPDATE).
  • External Call: The trigger can log events to an intermediate service (like a control table or Event Grid). ...

4 More Replies
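One way to wire the "external call" step above is to have the intermediate service trigger a Databricks job through the Jobs API run-now endpoint. The sketch below only assembles the request (no network call is made); the workspace host, token, and job ID are placeholders.

```python
import json

# Sketch of the request an intermediate service (control table poller,
# Event Grid handler, etc.) could send to trigger a Databricks job via the
# Jobs API 2.1 run-now endpoint. Host, token, and job_id are placeholders;
# this builds the request without sending it.

def build_run_now_request(host, token, job_id):
    return {
        "url": f"https://{host}/api/2.1/jobs/run-now",
        "headers": {"Authorization": f"Bearer {token}"},
        "body": json.dumps({"job_id": job_id}),
    }

req = build_run_now_request("adb-1234567890.12.azuredatabricks.net", "<token>", 42)
print(req["url"])
```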
Maxadbuser
by New Contributor III
  • 981 Views
  • 3 replies
  • 2 kudos

Resolved! Compute page Resolver error

I have an issue with my Databricks workspace. When I try to attach a notebook to my cluster, the cluster seems to have disappeared. After navigating to the Compute page, it gives me a 'Resolver Error' and no further information. None of the computes/clu...

Latest Reply
Advika
Databricks Employee
  • 2 kudos

Hello @Maxadbuser! This was a known issue, and the engineering team has implemented a fix. Please check and confirm if you're still experiencing the problem.

2 More Replies
gillesfromparis
by New Contributor II
  • 907 Views
  • 3 replies
  • 1 kudos

Resolved! NAT Gateway IP update

Hi, my Databricks (Premium) account was deployed on AWS. It was provisioned a few months ago from the AWS Marketplace with the QuickStart method, based on CloudFormation. The NAT Gateway initially created by the CloudFormation stack has been incidenta...

Latest Reply
gillesfromparis
New Contributor II
  • 1 kudos

Thanks for your answer. Yes, I did all of that, as well as allowing the traffic coming from my NAT instance as an inbound rule of the security group of the private instances (and the other way around), with no restriction on the outbound traffic. I suspe...

2 More Replies
Edyta
by New Contributor II
  • 27502 Views
  • 6 replies
  • 1 kudos

Resolved! Delete Databricks account

Hi everyone, as in the topic, I would like to delete an unnecessarily created account. I have found outdated solutions (e.g. https://community.databricks.com/t5/data-engineering/how-to-delete-databricks-account/m-p/6323#M2501), but they do not work anym...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Edyta, for AWS see Manage your subscription | Databricks on AWS: "Before you delete a Databricks account, you must first cancel your Databricks subscription and delete all Unity Catalog metastores in the account. After you delete all metastores associ...

5 More Replies
Aminsn
by New Contributor III
  • 446 Views
  • 1 reply
  • 0 kudos

Is it possible to let multiple Apps share the same compute?

Apparently, for every app deployed on Databricks, a separate VM is allocated, costing 0.5 DBU/hour. This seems inefficient: why can't a single VM support multiple apps? It feels like a waste of money and resources to allocate independent VMs per app ...

Latest Reply
Shua42
Databricks Employee
  • 0 kudos

Hi @Aminsn, your understanding is correct in that only one running app can be deployed per app instance. If you want to maximize the utilization of the compute, one option could be to create a multi-page app, where the landing page will direct users...

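The multi-page suggestion boils down to one app process serving several pages behind a single route table, so one compute instance backs them all. A framework-free sketch of that dispatch idea, with illustrative page names and handlers:

```python
# Minimal sketch of the multi-page idea: a single app process routes requests
# to several "pages", so one compute instance serves all of them instead of
# one instance per app. Page names and handlers are illustrative; a real app
# would use its framework's routing (e.g. a multi-page Dash/Streamlit app).

PAGES = {
    "/": lambda: "landing: pick a tool",
    "/reports": lambda: "reports page",
    "/admin": lambda: "admin page",
}

def serve(path):
    handler = PAGES.get(path)
    return handler() if handler else "404: unknown page"

print(serve("/reports"))  # reports page
```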