Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Malthe
by Contributor II
  • 883 Views
  • 1 reply
  • 3 kudos

Resolved! Cost of SQL Warehouse scaled to 0 clusters

A SQL Warehouse can be scaled to a minimum of 0. Presumably, there is still a cost to keeping the resource active, because we also have Auto-Stop, which can completely stop the warehouse after a configurable amount of time. This cost is not documented. ...

Latest Reply
MoJaMa
Databricks Employee
  • 3 kudos

1. Just to clarify, the minimum is 1, not 0. See https://docs.databricks.com/api/workspace/warehouses/create#min_num_clusters 2. How it works is based on concurrency. Assuming your max clusters is 2, then based on the concurrency, we don't need the 2nd cl...
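To make the reply concrete, here is a minimal sketch of a create-warehouse payload with the fields discussed (field names taken from the Warehouses API doc linked above; the cluster size and how you submit the payload are assumptions):

```python
# Sketch: a SQL Warehouse config whose cluster count scales between 1 and 2,
# with Auto-Stop shutting the warehouse down entirely after 10 idle minutes.

def warehouse_config(name: str) -> dict:
    """Build a create-warehouse payload; the scaling floor is 1, not 0."""
    return {
        "name": name,
        "cluster_size": "Small",      # assumed example size
        "min_num_clusters": 1,        # minimum allowed by the API is 1
        "max_num_clusters": 2,        # 2nd cluster added only under concurrency
        "auto_stop_mins": 10,         # full stop (no compute cost) after idle
    }

cfg = warehouse_config("reporting-wh")
```

The payload could then be sent via the SDK or the REST endpoint; the point is that scale-to-zero happens through Auto-Stop, not through `min_num_clusters`.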

ankit_kothiya1
by New Contributor II
  • 1781 Views
  • 2 replies
  • 2 kudos

Resolved! JDBC Insert Performance and Unsupported Data Types

We are reaching out regarding two observations with the Databricks JDBC driver: We've noticed that each INSERT query is taking approximately 1 second to execute via the JDBC driver (please refer to the attached screenshot). This seems unusually slow f...

[Attachment: ankit_kothiya1_0-1750157093748.png]
Latest Reply
Saritha_S
Databricks Employee
  • 2 kudos

Hi @ankit_kothiya1, Please find below my findings for your query. 1. Slow INSERT Performance via Databricks JDBC Driver. Observation: Each INSERT query takes about 1 second via the Databricks JDBC driver, which is unusually slow for high-throughput use ...

1 More Replies
chandru44
by New Contributor II
  • 1338 Views
  • 2 replies
  • 0 kudos

Guidance on Populating the cloud_infra_cost Table in System Catalog

In the system catalog, there are three tables: cloud_infra_cost, list_prices, and usage. While the list_prices and usage tables contain cost-related information, the cloud_infra_cost table is currently empty. I am using AWS cloud. Can anyone provide ...

[Attachment: Screenshot_1.png]
Latest Reply
aranjan99
Contributor
  • 0 kudos

I am opted in to the preview features, but I don't see any data in this table.

1 More Replies
noorbasha534
by Valued Contributor II
  • 1168 Views
  • 3 replies
  • 1 kudos

Principals granted access and their owners

Hi all, In a large global data platform built with Azure Databricks, I'd like to know the best practice for maintaining the list of principals to which Databricks objects (typically views) have been given access. For example, a view has been given access to a servic...

Latest Reply
jameshughes
Contributor II
  • 1 kudos

@noorbasha534 - I created a helper function/script to do this in my environment that queries the Unity Catalog system tables to generate a unique list of impacted principals/users.  It takes in a list of fully qualified object names and will display ...
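The helper script itself isn't shown in the preview; a hypothetical sketch of the same idea would build a query against Unity Catalog's `system.information_schema.table_privileges` for a list of fully qualified object names (table and column names assumed from the UC information schema; verify in your workspace):

```python
# Sketch: build a SQL query listing distinct grantees for a set of
# fully qualified objects (catalog.schema.table), per the reply above.

def grants_query(fq_names: list[str]) -> str:
    """Return SQL listing principals granted privileges on the given objects."""
    preds = []
    for fq in fq_names:
        catalog, schema, table = fq.split(".")
        preds.append(
            f"(table_catalog = '{catalog}' AND table_schema = '{schema}' "
            f"AND table_name = '{table}')"
        )
    return (
        "SELECT DISTINCT grantee, table_catalog, table_schema, table_name, privilege_type\n"
        "FROM system.information_schema.table_privileges\n"
        "WHERE " + "\n   OR ".join(preds)
    )
```

Running the resulting SQL on a warehouse (e.g. via the SDK's statement execution) yields the unique list of impacted principals the reply describes.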

2 More Replies
Derek_Czarny
by New Contributor III
  • 2936 Views
  • 8 replies
  • 2 kudos

Resolved! Creating Groups with API and Python

I am working on a notebook to help me create Azure Databricks Groups.  When I create a group in a workspace using the UI, it automatically creates the group at the account level and links them.  When I create a group using the API, and I create the w...

Latest Reply
pranav5
New Contributor II
  • 2 kudos

I have a couple of questions regarding the token needed to achieve this. If I create a workspace PAT token, is it limited to only that workspace, or to all the workspaces I have access to? And do my account admin privileges translate to the PAT token I create i...

7 More Replies
gowtham-talluru
by New Contributor
  • 776 Views
  • 1 reply
  • 0 kudos

How to write files to Databricks Volumes while running code in local VS Code (without cp)

I always struggle to seamlessly use VS Code with Databricks. It's not very user friendly. Do you also feel the same?

Latest Reply
SP_6721
Honored Contributor
  • 0 kudos

Hi @gowtham-talluru, If you're trying to write directly to Volumes from local code, you can use the Databricks SDK for Python. Try something like this: from databricks.sdk import WorkspaceClient; w = WorkspaceClient(); with open("local_file.csv", "rb") as ...
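The snippet in the preview is cut off; a self-contained sketch of the same approach (the `files.upload` call and the example `/Volumes/...` target path are assumptions based on the SDK's Files API):

```python
# Sketch: upload a local file to a Unity Catalog Volume from local code
# using the Databricks SDK for Python (requires databricks-sdk and
# configured workspace auth, e.g. a profile or env vars).

def volume_path(catalog: str, schema: str, volume: str, filename: str) -> str:
    """Build the /Volumes/... target path for a file inside a UC Volume."""
    return f"/Volumes/{catalog}/{schema}/{volume}/{filename}"

def upload_to_volume(local: str, target: str) -> None:
    """Stream a local file to the given Volume path, replacing any existing file."""
    from databricks.sdk import WorkspaceClient  # assumed: databricks-sdk installed
    w = WorkspaceClient()
    with open(local, "rb") as f:
        w.files.upload(target, f, overwrite=True)

# Hypothetical usage (catalog/schema/volume names are placeholders):
# upload_to_volume("local_file.csv",
#                  volume_path("main", "raw", "landing", "local_file.csv"))
```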

colin-db
by New Contributor II
  • 1621 Views
  • 2 replies
  • 2 kudos

Resolved! Transfer Account Ownership

I have the same issue as this previous user, who had their question resolved before an actionable solution was provided: https://community.databricks.com/t5/data-engineering/how-to-transfer-ownership-of-a-databricks-cloud-standard-account/td-p/34737 I...

Latest Reply
jameshughes
Contributor II
  • 2 kudos

You are going to have a very difficult time with the transfer, as it can only be done on the back end by Databricks. Your only real option would be to have your customer create their own account and migrate the workspace assets over, outside of having...

1 More Replies
Uri
by New Contributor II
  • 1031 Views
  • 2 replies
  • 1 kudos

Users' usage report (for frontend Power BI)

Hi All, I'm currently working on retrieving usage information by querying system tables. At the moment, I'm using the system.access.audit table. However, I've noticed that the list of users retrieved appears to be incomplete when compared to si...

Latest Reply
Uri
New Contributor II
  • 1 kudos

Thank you for the reply. If I understand correctly, when using PBI DirectQuery connectivity, the user being used is not a service principal but the end user who opens the PBI dashboard, correct? Did you implement any usage report? Regards, Uri

1 More Replies
mandarsu
by New Contributor
  • 2113 Views
  • 1 reply
  • 0 kudos

Unable to enable external sharing when creating a Delta Share - Azure Databricks trial

I have started a PayGo Azure tenancy and a Databricks 14-day trial. I signed up using my Gmail account. With the above user, I logged into Azure, created a workspace, and tried to share a schema via Delta Sharing. I am unable to share to an open user ...

Latest Reply
SP_6721
Honored Contributor
  • 0 kudos

Hi @mandarsu, To enable external Delta Sharing in Databricks: Enable External Sharing: Go to the Databricks Account Console, open your Unity Catalog metastore settings, and enable the "External Delta Sharing" option. Check Permissions: Ensure you have th...

Gmera
by New Contributor
  • 4775 Views
  • 2 replies
  • 2 kudos

Resolved! Cost

Do you have information that helps me optimize costs and follow up?

Latest Reply
jameshughes
Contributor II
  • 2 kudos

@Athul97 provided a pretty solid list of best practices.  To go deeper into Budgets & Alerts, I have found a lot of good success with the Consumption and Budget feature in the Databricks Account Portal under the Usage menu.  Once you embed tagging in...
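To make the tagging idea concrete, here is a hypothetical query sketch over the billing system tables (table and column names assumed from `system.billing.usage` and `system.billing.list_prices`; verify the schema against your workspace before relying on it):

```python
# Sketch: approximate monthly cost per custom tag value by joining usage
# records to list prices, so budgets/alerts can be tracked per tag.

def cost_by_tag_query(tag_key: str) -> str:
    """Return Databricks SQL grouping approximate cost by a custom tag."""
    return f"""
SELECT u.custom_tags['{tag_key}'] AS tag_value,
       date_trunc('MONTH', u.usage_date) AS month,
       SUM(u.usage_quantity * lp.pricing.default) AS approx_cost
FROM system.billing.usage u
JOIN system.billing.list_prices lp
  ON u.sku_name = lp.sku_name
 AND u.usage_start_time >= lp.price_start_time
 AND (lp.price_end_time IS NULL OR u.usage_start_time < lp.price_end_time)
GROUP BY ALL
ORDER BY month, approx_cost DESC
""".strip()
```

Note this uses list prices, so the result is an estimate rather than your discounted invoice amount.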

1 More Replies
BriGuy
by New Contributor II
  • 1466 Views
  • 3 replies
  • 0 kudos

python databricks sdk get object path from id

When using the Databricks SDK to get permissions of objects, we get inherited_from_object=['/directories/1636517342231743']. From what I can see, the workspace list and get_status methods only work with the actual path. Is there a way to look up that di...

Latest Reply
CURIOUS_DE
Contributor III
  • 0 kudos

@BriGuy Here is a small code snippet we have used; hope it works well for you: from databricks.sdk import WorkspaceClient; w = WorkspaceClient(); def find_path_by_object_id(target_id, base_path="/"): items = w.workspace.list(path=base_p...

2 More Replies
PiotrM
by New Contributor III
  • 3874 Views
  • 4 replies
  • 3 kudos

Drop table - permission management

Hello, I'm trying to wrap my head around permission management for dropping tables in UC-enabled schemas. According to the docs: To drop a table you must have the MANAGE privilege on the table, be its owner, or the owner of the schema, catalog, or meta...

Latest Reply
Alberto_Umana
Databricks Employee
  • 3 kudos

Hi @PiotrM, I see there is a feature request already in place. It's been considered for the future: https://databricks.aha.io/ideas/ideas/DB-I-7480

3 More Replies
ianc
by New Contributor
  • 1952 Views
  • 2 replies
  • 0 kudos

spark.databricks documentation

I cannot find any documentation related to the spark.databricks.* properties. I was able to find the Spark-related documentation, but it does not contain any information on possible properties or arguments for spark.databricks in particular. Thank you!

Latest Reply
KustoszEnjoyer
New Contributor II
  • 0 kudos

Thus, as of now, the documentation is lacking an obvious and easy-to-provide element, which can only be found partially, spread across random threads on the internet, or gained by guess-asking the platform developers. When will it be made available?

1 More Replies
satycse06
by New Contributor
  • 1296 Views
  • 1 reply
  • 0 kudos

How do worker nodes get packages during scale-up?

Hi, We are working with a repository from which we download the artifact/Python package using an index URL in a global init script. Now the logic is going to change: we need to give credentials to download the package, and th...

Latest Reply
Vidhi_Khaitan
Databricks Employee
  • 0 kudos

Yes, the new worker node will execute the global init script independently when it starts. It does not get the package from the driver or other existing nodes and will hit the configured index URL directly, and try to download the package on its own....

andreapeterson
by Contributor
  • 3259 Views
  • 5 replies
  • 4 kudos

Resolved! Account level groups

When I query my user from an account client and workspace client, I get different answers. Why is this?  In addition, why can I only see some account level groups from my workspace, and not others?

Latest Reply
vr
Valued Contributor
  • 4 kudos

If you have a relatively modern Databricks instance, when you create a group in workspace UI, it creates an account-level group (which you can see in "Source" column – it says "Account"). So this process essentially consists of two steps: 1) create a...

4 More Replies