Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.


Forum Posts

by l0ginp • New Contributor III
  • 5145 Views
  • 3 replies
  • 0 kudos

Resolved! I have questions about "Premium Automated Serverless Compute - Promo DBU."

"Premium Automated Serverless Compute - Promo DBU" expenses arise from what, how can I disable it, and why are the costs so high? In the picture, I am using AzureThank you in advance for the advice  

(screenshot attached: l0ginp_2-1717951633113.png)
Latest Reply
agallard
Contributor
  • 0 kudos

Hello everyone! I had the same problem for two months. I went crazy looking for what was consuming my entire subscription amount: Premium Automated Serverless Compute - Promo DBU > €8,236. After searching everywhere for information about it and reading ...

  • 0 kudos
2 More Replies
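To help others locate such charges, here is a minimal notebook sketch against the billing system table (an assumption: Unity Catalog system tables are enabled in your account; the SKU pattern is illustrative and should be matched to the SKU string on your invoice):

# Sketch: daily DBU consumption per serverless SKU, most recent first.
usage = spark.sql("""
    SELECT sku_name,
           usage_date,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE sku_name ILIKE '%serverless%'   -- illustrative filter, adjust to your invoice
    GROUP BY sku_name, usage_date
    ORDER BY usage_date DESC
""")
usage.display()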
by rfreitas • New Contributor II
  • 10229 Views
  • 2 replies
  • 1 kudos

Notebook and folder owner

Hi all, we can use this API https://docs.databricks.com/api/workspace/dbsqlpermissions/transferownership to transfer the ownership of a Query. Is there anything similar for notebooks and folders?

Latest Reply
sparkplug
New Contributor II
  • 1 kudos

This API only allows setting permissions by permission level, which does not include changing OWNER. Any suggestions on this particular request?

  • 1 kudos
1 More Reply
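For reference, a minimal sketch of the Query ownership-transfer endpoint linked in the question (workspace URL, token, and query ID are placeholders); as the latest reply notes, no equivalent endpoint is documented for notebooks or folders:

import requests

HOST = "https://<workspace>.azuredatabricks.net"   # placeholder
TOKEN = "<personal-access-token>"                  # placeholder
QUERY_ID = "<query-id>"                            # placeholder

# POST /api/2.0/preview/sql/permissions/queries/{id}/transfer
resp = requests.post(
    f"{HOST}/api/2.0/preview/sql/permissions/queries/{QUERY_ID}/transfer",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"new_owner": "new.owner@example.com"},
)
resp.raise_for_status()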
by KUMAR__111 • New Contributor II
  • 278 Views
  • 3 replies
  • 0 kudos

How to get the cost per job for jobs that run on ALL_PURPOSE_COMPUTE?

With the system.billing.usage table I can get the cost per job for jobs that run on JOB_COMPUTE, but not for jobs that run on ALL_PURPOSE_COMPUTE.

Latest Reply
KUMAR__111
New Contributor II
  • 0 kudos

If DBUs are not captured anywhere for jobs under ALL_PURPOSE_COMPUTE, then a cost breakdown based on cluster events is very difficult, since more than two jobs can run in parallel. Mapping the cost down to a specific job is therefore very hard. Let me know if I am missing an...

  • 0 kudos
2 More Replies
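A hedged sketch of the closest breakdown available: system.billing.usage records all-purpose DBUs per cluster (usage_metadata.cluster_id) rather than per job, so apportioning cost across jobs that share a cluster still requires correlating job-run time windows yourself (assumes system tables are enabled; the SKU filter is illustrative):

# Sketch: daily DBU usage per all-purpose cluster. Job-level attribution is
# not recorded for ALL_PURPOSE_COMPUTE, so this is cluster-level only.
usage = spark.sql("""
    SELECT usage_metadata.cluster_id AS cluster_id,
           usage_date,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE sku_name LIKE '%ALL_PURPOSE%'   -- illustrative filter
    GROUP BY usage_metadata.cluster_id, usage_date
""")
usage.display()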
by JessieWen • Databricks Employee
  • 216 Views
  • 1 reply
  • 2 kudos

Slow cluster start-up time of up to 30 min on GCP

instance type: e2-highmem-2

Latest Reply
JessieWen
Databricks Employee
  • 2 kudos

Please use a higher-powered instance type (e.g. n2-highmem-4). The instance type you are currently using (i.e. e2-highmem-2) is significantly underpowered and will result in slower cluster launch times.

  • 2 kudos
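A minimal sketch of pinning the larger node type when creating a cluster with the Python SDK (databricks-sdk); the cluster name, runtime version, and worker count are placeholder values:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from the environment or .databrickscfg

w.clusters.create(
    cluster_name="faster-startup",        # placeholder
    spark_version="15.4.x-scala2.12",     # placeholder runtime
    node_type_id="n2-highmem-4",          # instead of the underpowered e2-highmem-2
    num_workers=2,                        # placeholder
)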
by CarolinaK • New Contributor II
  • 487 Views
  • 4 replies
  • 0 kudos

Unity Catalog hive_metastore schemas

Hi all, apologies if this is the wrong group, but I was looking in Unity Catalog and noticed that you see different schemas in the hive_metastore depending on whether you select a cluster or a warehouse. Could someone please explain what the ...

Latest Reply
navallyemul
New Contributor III
  • 0 kudos

No schemas are directly attached to compute resources, whether it's an all-purpose cluster or a SQL warehouse in serverless mode.

  • 0 kudos
3 More Replies
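A quick way to check this yourself: list the schemas from a notebook on the cluster, then run the same statement on the SQL warehouse and compare; with Unity Catalog enabled, both should resolve against the same metastore (a sketch, assuming you have permission to browse hive_metastore):

# Run on the cluster, then on the warehouse, and compare the output.
spark.sql("SHOW SCHEMAS IN hive_metastore").show(truncate=False)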
by VCHK • New Contributor
  • 380 Views
  • 1 reply
  • 0 kudos

Databricks Workflow/Jobs View Log Permission

If we don't want to grant admin rights to a user group, what should we do to allow a specific user group to view all of the job logs in a Databricks account? We don't want to grant job-level permissions either. Thanks, VC

Latest Reply
SathyaSDE
Contributor
  • 0 kudos

Hi, I guess you can use the Databricks API to list jobs and set Can View permission on all jobs. Sample code below:

import requests
from databricks_cli.sdk import ApiClient, JobsService, PermissionsService

# Initialize the API client
api_client = ApiClient( ...

  • 0 kudos
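The databricks_cli package in the sample above is the legacy CLI SDK; a hedged sketch of the same idea against the REST API directly (workspace URL, token, and group name are placeholders; job-list pagination is omitted):

import requests

HOST = "https://<workspace>.azuredatabricks.net"        # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

# List jobs, then grant CAN_VIEW on each one to the group.
jobs = requests.get(f"{HOST}/api/2.1/jobs/list", headers=HEADERS).json()
for job in jobs.get("jobs", []):
    requests.patch(  # PATCH adds to existing permissions; PUT would replace them
        f"{HOST}/api/2.0/permissions/jobs/{job['job_id']}",
        headers=HEADERS,
        json={"access_control_list": [
            {"group_name": "log-viewers", "permission_level": "CAN_VIEW"}
        ]},
    ).raise_for_status()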
by cuhlmann • New Contributor
  • 318 Views
  • 1 reply
  • 0 kudos

data ingestion from external system - auth via client certificate

Hi Community, we have a requirement to ingest data into Azure Databricks from external systems. Our customer asks us to use a client certificate as the authentication method. Requests - https://requests.readthedocs.io/en/latest/user/advanced/ Aiohttp - https://...

Latest Reply
filipniziol
Contributor III
  • 0 kudos

Hi @cuhlmann, as I understand it, you need to ingest data into Azure Databricks from external systems, and your customer requires using client certificate authentication. The challenge is that the client certificate is stored in Azure Key Vault, but the ...

  • 0 kudos
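Building on the reply above, a minimal sketch of client-certificate auth with requests, with the PEM certificate and key pulled from a secret scope (scope name, key names, and endpoint are placeholders); requests expects file paths, so the material is written to local files first:

import requests

# dbutils is available in Databricks notebooks; the secret scope is assumed
# to be backed by the Azure Key Vault holding the PEM-encoded cert and key.
cert_path, key_path = "/tmp/client.pem", "/tmp/client.key"
with open(cert_path, "w") as f:
    f.write(dbutils.secrets.get("ingest-scope", "client-cert-pem"))
with open(key_path, "w") as f:
    f.write(dbutils.secrets.get("ingest-scope", "client-key-pem"))

# requests takes (cert, key) file paths for mutual-TLS client authentication.
resp = requests.get(
    "https://external-system.example.com/data",   # placeholder endpoint
    cert=(cert_path, key_path),
    timeout=60,
)
resp.raise_for_status()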
by JianfengHuang • New Contributor
  • 203 Views
  • 2 replies
  • 1 kudos

Bill for Premium subscription

Hi there, I have subscribed to the Premium plan of Databricks. How can I get the bills for this subscription? I didn't find it in the account settings. Can anyone help?

Latest Reply
gchandra
Databricks Employee
  • 1 kudos

AWS: https://docs.databricks.com/en/admin/account-settings/account.html
Azure: https://learn.microsoft.com/en-us/azure/databricks/admin/account-settings/account
GCP: https://docs.gcp.databricks.com/en/admin/account-settings/account.html

  • 1 kudos
1 More Reply
by ossinova • Contributor II
  • 3033 Views
  • 4 replies
  • 0 kudos

Override default Personal Compute policy using Terraform / disable Personal Compute policy

I want to programmatically make some adjustments to the default Personal Compute policy, or preferably create my own custom one based on the same configuration or policy family (to which all users can gain access) when deploying a new workspace usi...

Latest Reply
Clowa
New Contributor II
  • 0 kudos

The only way I got it working was by importing the pre-existing policy into Terraform and doing an overwrite, as already mentioned by @jsimonovic. The full code example looks like this:

import {
  id = "001BF0AC280610B4" # Policy ID of the pre-existing person...

  • 0 kudos
3 More Replies
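If importing into Terraform is not an option, a hedged sketch of the API route instead: editing the built-in policy through the Cluster Policies API (the policy ID is a placeholder, "personal-vm" is the Personal Compute policy family, and the override shown is purely illustrative):

import requests

HOST = "https://<workspace>.azuredatabricks.net"        # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

requests.post(
    f"{HOST}/api/2.0/policies/clusters/edit",
    headers=HEADERS,
    json={
        "policy_id": "<personal-compute-policy-id>",    # placeholder
        "name": "Personal Compute",
        "policy_family_id": "personal-vm",
        # Illustrative override: cap auto-termination at 60 minutes.
        "policy_family_definition_overrides":
            '{"autotermination_minutes": {"type": "fixed", "value": 60}}',
    },
).raise_for_status()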
by mattiags • New Contributor II
  • 470 Views
  • 1 reply
  • 1 kudos

Resolved! Retention for hive_metastore tables

Hi, I have a notebook that creates tables in the hive_metastore with the following code: df.write.format("delta").mode("overwrite").saveAsTable(output_table_name) What is the retention for the data saved in the Hive metastore? Is there any configurati...

Latest Reply
Stefan-Koch
Contributor III
  • 1 kudos

Hi mattiags, as long as you do not delete the data via a notebook or in the data lake, it will not be deleted in any other way. This means there is no retention time in this sense; put differently, it is infinite until you deliberately delete the data...

  • 1 kudos
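To make the reply concrete: the table itself never expires; the only retention-like setting on a Delta table bounds how long superseded file versions remain available for time travel before VACUUM may remove them. A sketch reusing output_table_name from the question (the 30-day interval is illustrative):

# The table persists until you drop it. This property only limits how long
# *old* file versions are kept for time travel:
spark.sql(f"""
    ALTER TABLE {output_table_name} SET TBLPROPERTIES (
        'delta.deletedFileRetentionDuration' = 'interval 30 days'
    )
""")
spark.sql(f"VACUUM {output_table_name}")  # removes files older than the interval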
by Veilraj • New Contributor II
  • 250 Views
  • 0 replies
  • 0 kudos

Configuration of NCC for serverless to access SQL Server running in an Azure VM

Hi Team, I am following this link to configure an NCC for serverless compute to access a SQL Server running in an Azure VM: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/ This refers to adding privat...

by abhishekdas • New Contributor II
  • 1450 Views
  • 3 replies
  • 0 kudos

Resolved! Databricks on AWS - Changes to your Unity Catalog storage credentials

Hi. Context: On June 30, 2023, AWS updated its IAM role trust policy, which requires updating Unity Catalog storage credentials. Databricks previously sent an email communication to customers in March 2023 on this topic and updated the documentation a...

Latest Reply
abhishekdas
New Contributor II
  • 0 kudos

Thank you for the response @MoJaMa - we will try it out tomorrow and post an update here.

  • 0 kudos
2 More Replies
by NadithK • Contributor
  • 286 Views
  • 0 replies
  • 1 kudos

Pre-loading docker images to cluster pool instances still requires docker URL at cluster creation

I am trying to pre-load a docker image to a Databricks cluster pool instance. As per this article, I used the REST API to create the cluster pool and defined a custom Azure container registry as the source for the docker images: https://learn.microsoft....

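For context, a hedged sketch of the pool-creation call described above (POST /api/2.0/instance-pools/create with preloaded_docker_images); the registry URL, node type, and credentials are placeholders:

import requests

HOST = "https://<workspace>.azuredatabricks.net"        # placeholder
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

requests.post(
    f"{HOST}/api/2.0/instance-pools/create",
    headers=HEADERS,
    json={
        "instance_pool_name": "dcs-pool",               # placeholder
        "node_type_id": "Standard_DS3_v2",              # placeholder
        "preloaded_docker_images": [{
            "url": "myregistry.azurecr.io/my-image:latest",   # placeholder
            "basic_auth": {"username": "<user>", "password": "<token>"},
        }],
    },
).raise_for_status()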
by AlbertWang • Contributor III
  • 1271 Views
  • 4 replies
  • 1 kudos

Resolved! How to use Databricks CLI as a service principal?

Hi all, I have a question about how to use the Databricks CLI in my local environment as a service principal. I have installed the Databricks CLI and configured the file `.databrickscfg` as shown below.

[DEFAULT]
host = https://adb-123123123.1.azuredatabr...

Latest Reply
Stefan-Koch
Contributor III
  • 1 kudos

Got you. I found a working solution. Try this one:

[devsp]
azure_workspace_resource_id = /subscriptions/bc0cd1..././.../Databricks/workspaces/my-workspace
azure_tenant_id = bc0cd1...
azure_client_id = fa0cd1...
azure_client_secr...

  • 1 kudos
3 More Replies
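For anyone scripting rather than calling the CLI, the programmatic counterpart of that profile is a sketch with the Python SDK (databricks-sdk), passing the same service-principal credentials (all IDs are placeholders):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://adb-123123123.1.azuredatabricks.net",  # placeholder
    azure_tenant_id="bc0cd1...",                          # placeholder
    azure_client_id="fa0cd1...",                          # placeholder
    azure_client_secret="<client-secret>",                # placeholder
)
print([c.cluster_name for c in w.clusters.list()])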
by LauJohansson • Contributor
  • 297 Views
  • 1 reply
  • 1 kudos

Terraform - Azure Databricks workspace without NAT gateway

Hi all, I have experienced an increase in costs, even when not using Databricks compute. It is due to the NAT gateway that is (suddenly) automatically deployed. When creating Azure Databricks workspaces using Terraform, a NAT gateway is created. When ...

(screenshots attached: LauJohansson_0-1729142670306.png, LauJohansson_1-1729142785587.png)
Latest Reply
saurabh18cs
Contributor II
  • 1 kudos

Try adding more properties. Also, ensure that the subnets used by Azure Databricks do not have settings that require a NAT gateway. Consider using private endpoints for Azure Databricks to avoid the need for a NAT gateway. infrastructure_encryptio...

  • 1 kudos
