Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

gowtham-talluru
by New Contributor
  • 554 Views
  • 1 replies
  • 0 kudos

How to write files to Databricks Volumes while running code in local VS Code (without cp)

I always struggle to seamlessly use VS Code with Databricks. It's so not user-friendly. Do you also feel the same?

  • 554 Views
  • 1 replies
  • 0 kudos
Latest Reply
SP_6721
Honored Contributor
  • 0 kudos

Hi @gowtham-talluru, If you're trying to write directly to Volumes from local code, you can use the Databricks SDK for Python. Try something like this: from databricks.sdk import WorkspaceClient; w = WorkspaceClient(); with open("local_file.csv", "rb") as ...
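The reply's snippet is cut off by the preview, so here is a minimal sketch of the same approach, assuming the `databricks-sdk` package and a configured local profile; the catalog, schema, and volume names are placeholders:

```python
# Sketch of uploading a local file to a Unity Catalog Volume with the
# Databricks Python SDK. Authentication comes from ~/.databrickscfg or
# DATABRICKS_HOST / DATABRICKS_TOKEN environment variables.

def volume_path(catalog: str, schema: str, volume: str, filename: str) -> str:
    """Build the /Volumes/... path that the Files API expects."""
    return f"/Volumes/{catalog}/{schema}/{volume}/{filename}"

def upload_to_volume(local_file: str, target_path: str) -> None:
    # Imported lazily so the path helper above works without the SDK installed.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()
    with open(local_file, "rb") as f:
        w.files.upload(target_path, f, overwrite=True)

# Usage sketch (placeholder names):
#   upload_to_volume("local_file.csv",
#                    volume_path("main", "default", "my_volume", "local_file.csv"))
```

This avoids `databricks fs cp` entirely: the Files API streams the bytes to the Volume path directly from your local process.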

  • 0 kudos
colin-db
by New Contributor II
  • 1293 Views
  • 2 replies
  • 2 kudos

Resolved! Transfer Account Ownership

I have the same issue as this previous user, who had their question resolved before an actionable solution was provided: https://community.databricks.com/t5/data-engineering/how-to-transfer-ownership-of-a-databricks-cloud-standard-account/td-p/34737 I...

  • 1293 Views
  • 2 replies
  • 2 kudos
Latest Reply
jameshughes
Contributor II
  • 2 kudos

You are going to have a very difficult time with the transfer, as it can only be done on the back end by Databricks. Your only real option would be to have your customer create their own account and migrate the workspace assets over outside of having...

  • 2 kudos
1 More Replies
Uri
by New Contributor II
  • 865 Views
  • 2 replies
  • 1 kudos

Users' usage report (for frontend Power BI)

Hi All, I'm currently working on retrieving usage information by querying system tables. At the moment, I'm using the system.access.audit table. However, I've noticed that the list of users retrieved appears to be incomplete when compared to si...

  • 865 Views
  • 2 replies
  • 1 kudos
Latest Reply
Uri
New Contributor II
  • 1 kudos

Thank you for the reply. If I understand correctly, when using PBI DirectQuery connectivity, the user being used is not a service principal but the end user who opens the PBI dashboard. Correct? Did you implement any usage report? Regards, Uri

  • 1 kudos
1 More Replies
mandarsu
by New Contributor
  • 1824 Views
  • 1 replies
  • 0 kudos

Unable to enable external sharing when creating a Delta Share - Azure Databricks trial

I have started a PayGo Azure tenancy and a Databricks 14-day trial. I signed up using my Gmail account. With the above user, I logged into Azure, created a workspace, and tried to share a schema via Delta Sharing. I am unable to share to an open user ...

  • 1824 Views
  • 1 replies
  • 0 kudos
Latest Reply
SP_6721
Honored Contributor
  • 0 kudos

Hi @mandarsu, To enable external Delta Sharing in Databricks: Enable External Sharing: Go to the Databricks Account Console, open your Unity Catalog metastore settings, and enable the "External Delta Sharing" option. Check Permissions: Ensure you have th...

  • 0 kudos
Gmera
by New Contributor
  • 4251 Views
  • 2 replies
  • 2 kudos

Resolved! Cost

Do you have information that helps me optimize costs and follow up?

  • 4251 Views
  • 2 replies
  • 2 kudos
Latest Reply
jameshughes
Contributor II
  • 2 kudos

@Athul97 provided a pretty solid list of best practices.  To go deeper into Budgets & Alerts, I have found a lot of good success with the Consumption and Budget feature in the Databricks Account Portal under the Usage menu.  Once you embed tagging in...

  • 2 kudos
1 More Replies
Kayla
by Valued Contributor II
  • 446 Views
  • 0 replies
  • 0 kudos

GKE Cluster Shows "Starting" Even After It's Turned On

Curious if anyone else has run into this. After changing to GKE based clusters, they all turn on but don't show as turned on - we'll have it show as "Starting" but be able to see the same cluster in the dropdown that's already active. "Changing" to t...

[Screenshot attached: Kayla_0-1749815522351.png]
  • 446 Views
  • 0 replies
  • 0 kudos
BriGuy
by New Contributor II
  • 1231 Views
  • 3 replies
  • 0 kudos

Python Databricks SDK: get object path from ID

When using the Databricks SDK to get permissions of objects, we get inherited_from_object=['/directories/1636517342231743']. From what I can see, the workspace list and get_status methods only work with the actual path. Is there a way to look up that di...

  • 1231 Views
  • 3 replies
  • 0 kudos
Latest Reply
CURIOUS_DE
Contributor III
  • 0 kudos

@BriGuy Here is a small code snippet we have used. Hope it works well for you: from databricks.sdk import WorkspaceClient; w = WorkspaceClient(); def find_path_by_object_id(target_id, base_path="/"): items = w.workspace.list(path=base_p...
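The snippet is truncated by the preview, so here is a completed sketch of the same idea: a depth-first walk of the workspace tree that returns the path whose object ID matches. The client is passed in as a parameter (my adaptation, so the traversal logic can be exercised without a live workspace); the directory check compares the type name as a string for the same reason.

```python
# Walk the workspace tree and resolve an object ID to its path.
# Assumes the databricks-sdk workspace.list API; the client is injected
# so the recursion can be tested against a stub.

def find_path_by_object_id(client, target_id: int, base_path: str = "/"):
    """Depth-first search of the workspace for a given object ID."""
    for item in client.workspace.list(path=base_path):
        if item.object_id == target_id:
            return item.path
        # ObjectType.DIRECTORY in the SDK; compared by name to stay stub-friendly.
        if str(item.object_type).endswith("DIRECTORY"):
            found = find_path_by_object_id(client, target_id, item.path)
            if found is not None:
                return found
    return None

# Real usage sketch (assumes databricks-sdk installed and configured):
#   from databricks.sdk import WorkspaceClient
#   print(find_path_by_object_id(WorkspaceClient(), 1636517342231743))
```

Note this is O(workspace size) per lookup; if you resolve many IDs, list the tree once and build an id-to-path dict instead.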

  • 0 kudos
2 More Replies
Rjdudley
by Honored Contributor
  • 3297 Views
  • 1 replies
  • 0 kudos

Lakeflow Connect: can't change general privilege requirements

I want to set up Lakeflow Connect to ETL data from Azure SQL Server (Microsoft SQL Azure (RTM) - 12.0.2000.8 Feb 9 2025) using change tracking (we don't need the data retention of CDC). In the documentation, there is a list of system tables, views ...

  • 3297 Views
  • 1 replies
  • 0 kudos
Latest Reply
andreys
New Contributor II
  • 0 kudos

I've got the same issue. Did you find a way to configure the required permissions?

  • 0 kudos
PiotrM
by New Contributor III
  • 3148 Views
  • 4 replies
  • 3 kudos

Drop table - permission management

Hello, I'm trying to wrap my head around the permission management for dropping tables in UC enabled schemas. According to the docs: To drop a table you must have the MANAGE privilege on the table, be its owner, or the owner of the schema, catalog, or meta...

  • 3148 Views
  • 4 replies
  • 3 kudos
Latest Reply
Alberto_Umana
Databricks Employee
  • 3 kudos

Hi @PiotrM, I see there is a feature request already in place. It's been considered for the future: https://databricks.aha.io/ideas/ideas/DB-I-7480

  • 3 kudos
3 More Replies
ianc
by New Contributor
  • 1868 Views
  • 2 replies
  • 0 kudos

spark.databricks documentation

I cannot find any documentation related to the spark.databricks.* properties. I was able to find the Spark-related documentation, but it does not contain any information on possible properties or arguments for spark.databricks in particular. Thank you!

  • 1868 Views
  • 2 replies
  • 0 kudos
Latest Reply
KustoszEnjoyer
New Contributor II
  • 0 kudos

Thus, as of now, the documentation is lacking an obvious and easy-to-provide element that can only be found partially, spread across random threads on the internet, or gained by guess-asking the platform developers. When will it be made available?
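Until those properties are documented, one practical workaround is to dump whatever is actually set on a running cluster and filter by prefix. The filter below is a plain function over (key, value) pairs; the commented lines show how you would feed it from a notebook, where `spark` is predefined:

```python
# List the spark.databricks.* settings visible on a running cluster.
# Only explicitly-set confs appear; defaults that were never set will not.

def databricks_confs(pairs, prefix="spark.databricks."):
    """Return the subset of Spark conf pairs under the given prefix, sorted by key."""
    return sorted((k, v) for k, v in pairs if k.startswith(prefix))

# In a Databricks notebook:
#   for k, v in databricks_confs(spark.sparkContext.getConf().getAll()):
#       print(k, "=", v)
```

This won't enumerate every possible property, but it does surface the ones your cluster or workspace policy actually sets, which is often what you need to cross-reference.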

  • 0 kudos
1 More Replies
satycse06
by New Contributor
  • 1167 Views
  • 1 replies
  • 0 kudos

How do worker nodes get packages during scale-up?

Hi, We are working with a repository from which we used to download the artifact/Python package using an index URL in the global init script, but now the logic is going to change: we need to give the credentials to download the package, and th...

  • 1167 Views
  • 1 replies
  • 0 kudos
Latest Reply
Vidhi_Khaitan
Databricks Employee
  • 0 kudos

Yes, the new worker node will execute the global init script independently when it starts. It does not get the package from the driver or other existing nodes and will hit the configured index URL directly, and try to download the package on its own....
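Since each node runs the global init script independently, a credentialed index can be wired up in the script itself. Below is a hypothetical sketch of such a script (a config fragment, not a tested implementation): the repository URL is a placeholder, and `ARTIFACT_USER` / `ARTIFACT_TOKEN` are assumed to be injected as secret-backed cluster environment variables.

```shell
#!/bin/bash
# Hypothetical global init script: every node (driver and each newly
# scaled-up worker) runs this on startup, so each node writes its own
# pip config and pulls packages straight from the authenticated index.
# ARTIFACT_USER / ARTIFACT_TOKEN come from secret-backed cluster env vars;
# repo.example.com is a placeholder for your artifact repository.
cat > /etc/pip.conf <<EOF
[global]
index-url = https://${ARTIFACT_USER}:${ARTIFACT_TOKEN}@repo.example.com/simple
EOF
```

Because the script runs per node, a new worker never depends on the driver for packages; if the index is unreachable at scale-up time, that worker's library installs fail independently.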

  • 0 kudos
andreapeterson
by Contributor
  • 2787 Views
  • 5 replies
  • 4 kudos

Resolved! Account level groups

When I query my user from an account client and workspace client, I get different answers. Why is this?  In addition, why can I only see some account level groups from my workspace, and not others?

  • 2787 Views
  • 5 replies
  • 4 kudos
Latest Reply
vr
Contributor III
  • 4 kudos

If you have a relatively modern Databricks instance, when you create a group in workspace UI, it creates an account-level group (which you can see in "Source" column – it says "Account"). So this process essentially consists of two steps: 1) create a...
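The two views the reply describes can be compared directly: the groups visible from a workspace are the account-level groups that have been assigned to that workspace. Here is a small sketch, assuming the `databricks-sdk` with both clients configured; the diff helper is pure, and the commented lines show where each list would come from:

```python
# Compare account-level groups against those visible in one workspace.
# Groups in the first set but not the second have not been assigned
# to (or are not visible from) that workspace.

def unassigned_groups(account_groups, workspace_groups):
    """Account-level group names not visible from the workspace, sorted."""
    return sorted(set(account_groups) - set(workspace_groups))

# Real usage sketch (assumes databricks-sdk configured for both scopes):
#   from databricks.sdk import AccountClient, WorkspaceClient
#   acc = {g.display_name for g in AccountClient().groups.list()}
#   ws = {g.display_name for g in WorkspaceClient().groups.list()}
#   print(unassigned_groups(acc, ws))
```

This also explains the differing answers for a single user: the account client reports all account-level memberships, while the workspace client only reflects groups assigned to that workspace.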

  • 4 kudos
4 More Replies
chinmay0924
by New Contributor III
  • 770 Views
  • 1 replies
  • 0 kudos

Resolved! How to create a function using the Functions API in Databricks?

https://docs.databricks.com/api/workspace/functions/create This documentation gives the sample request payload, and one of the fields is type_json; there is very little explanation of what is expected in this field. What am I supposed to pass here...

  • 770 Views
  • 1 replies
  • 0 kudos
Latest Reply
SP_6721
Honored Contributor
  • 0 kudos

Hi @chinmay0924 ,The type_json field describes your function’s input parameters and return type using a specific JSON format. You’ll need to include each parameter’s name, type (like "STRING", "INT", "ARRAY", or "STRUCT"), and position, along with th...
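Based on the reply's description, a sketch of building one parameter's `type_json` follows. This is an assumption about the exact field layout (it mirrors Spark's StructField JSON shape: name, type, nullable, metadata); the parameter name and type here are made up for illustration:

```python
# Build the type_json string for one function parameter, following the
# StructField-style shape the reply describes (name / type / nullable /
# metadata). Field layout is an assumption; verify against the API docs.
import json

def param_type_json(name: str, sql_type: str, nullable: bool = True) -> str:
    """Serialize one parameter description into a type_json string."""
    return json.dumps({
        "name": name,
        "type": sql_type.lower(),  # e.g. "string", "int"
        "nullable": nullable,
        "metadata": {},
    })

# Example: a single string parameter named "x":
#   param_type_json("x", "STRING")
```

You would build one such string per entry in `input_params`, with each parameter's position tracked separately in the request payload.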

  • 0 kudos
LauJohansson
by Contributor
  • 1990 Views
  • 3 replies
  • 1 kudos

Terraform - Azure Databricks workspace without NAT gateway

Hi all, I have experienced an increase in costs, even when not using Databricks compute. It is due to the NAT gateway that is (suddenly) automatically deployed. When creating Azure Databricks workspaces using Terraform, a NAT gateway is created. When ...

[Screenshots attached: LauJohansson_0-1729142670306.png, LauJohansson_1-1729142785587.png]
  • 1990 Views
  • 3 replies
  • 1 kudos
Latest Reply
Rjdudley
Honored Contributor
  • 1 kudos

In Azure Databricks, a NAT Gateway will be required (by Microsoft) for all egress from VMs, which affects Databricks compute: Azure updates | Microsoft Azure

  • 1 kudos
2 More Replies