Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

by AndrewHess (New Contributor II)
  • 557 Views
  • 4 replies
  • 0 kudos

Unity Group management, Group: Manager role

We would like to have the ability to assign an individual and/or group to the "Group: Manager" role, providing them with the ability to add/remove users without the need to be an account or workspace administrator. Ideally this would be an option fo...

Latest Reply
AndrewHess
New Contributor II
  • 0 kudos

Thanks @NandiniN, we have looked through that documentation and still have not been able to get anything to work without the user also being an account or workspace admin. The way I'm interpreting the documentation (screenshot) is that the API currently...
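For reference, group membership itself is managed through the account-level SCIM Groups API. Below is a minimal sketch of adding a member (endpoint per the account SCIM docs; the account ID, group and user IDs, and token are placeholders, and whether a non-admin "group manager" is authorized to call it is exactly the open question in this thread):

import requests

ACCOUNT_HOST = "https://accounts.cloud.databricks.com"  # accounts.azuredatabricks.net on Azure
ACCOUNT_ID = "..."   # placeholder: your Databricks account ID
GROUP_ID = "..."     # placeholder: SCIM ID of the group
USER_ID = "..."      # placeholder: SCIM ID of the user to add
TOKEN = "..."        # placeholder: token of the calling identity

# Standard SCIM 2.0 PatchOp that adds one member to the group
resp = requests.patch(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/scim/v2/Groups/{GROUP_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{"op": "add", "path": "members",
                        "value": [{"value": USER_ID}]}],
    },
)
resp.raise_for_status()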

3 More Replies
by Stefan-Koch (Valued Contributor)
  • 651 Views
  • 2 replies
  • 0 kudos

List files in Databricks Workspace with Databricks CLI

I want to list all files in my Workspace with the CLI. There's a command for it: databricks fs ls dbfs:/ When I run this, I get this result: I can then list the content of databricks-datasets, but no other directory. How can I list the content of the Wo...

Latest Reply
Stefan-Koch
Valued Contributor
  • 0 kudos

I know it's possible with the Databricks SDK, but I want to solve it with the CLI in the terminal.
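For anyone landing here: the CLI splits DBFS and the workspace tree into separate command groups, so databricks fs ls dbfs:/ only ever shows DBFS. A sketch of the workspace-side equivalent follows (command names per recent CLI versions; the legacy CLI spelled it databricks workspace ls), wrapped in Python purely for illustration:

import subprocess

# Equivalent shell command: databricks workspace list /
result = subprocess.run(
    ["databricks", "workspace", "list", "/"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)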

1 More Replies
by hiro12 (New Contributor)
  • 282 Views
  • 1 reply
  • 0 kudos

Enabling Object Lock for the S3 bucket that is delivering audit logs

Hello Community, I am trying to enable Object Lock on the S3 bucket to which the audit log is delivered, but the following error occurs if Object Lock is enabled when the delivery settings are enabled: > {"error_code":"PERMISSION_DENIED","message":"Fai...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Hi @hiro12, enabling Object Lock on an S3 bucket after configuring the delivery settings should not affect the ongoing delivery of audit logs. But I would say it is better to understand the root cause of the error. The error you encountered when ena...
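A quick way to gather facts for that root-cause analysis is to inspect the bucket from the AWS side. A sketch with boto3 (bucket name is a placeholder; the caller needs the corresponding s3:Get* permissions):

import boto3

s3 = boto3.client("s3")
bucket = "my-audit-log-bucket"  # placeholder

# Is Object Lock enabled, and with what default retention?
lock_cfg = s3.get_object_lock_configuration(Bucket=bucket)
print(lock_cfg.get("ObjectLockConfiguration"))

# Does the bucket policy still grant the Databricks delivery role what it needs?
policy = s3.get_bucket_policy(Bucket=bucket)
print(policy["Policy"])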

by DrK (New Contributor III)
  • 2185 Views
  • 8 replies
  • 3 kudos

Open Delta Sharing and Deletion Vectors

Hi, just experimenting with open Delta Sharing and running into a few technical traps. Mainly that if deletion vectors are enabled on a Delta table (which they are by default now), we get errors when trying to query a table (specifically with PowerBI)...

Latest Reply
DarioBarbadillo
New Contributor II
  • 3 kudos

@NandiniN we are talking about a Power BI connection, so you cannot set that option. @F_Goudarzi I have just tried with PBI Desktop version 2.132.1053.0 and it is running (I did not disable deletion vectors on my table). I also tried with the last versio...
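For readers hitting the same trap: deletion vectors can be disabled per table and the already-written vectors rewritten away, which is the usual workaround for open-sharing clients that cannot read them. A sketch (table name is a placeholder; spark is the notebook session):

# Stop writing new deletion vectors on this table
spark.sql("""
    ALTER TABLE catalog.schema.my_table
    SET TBLPROPERTIES ('delta.enableDeletionVectors' = 'false')
""")

# Rewrite files so existing deletion vectors are purged
spark.sql("REORG TABLE catalog.schema.my_table APPLY (PURGE)")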

7 More Replies
by stevenayers-bge (Contributor)
  • 1230 Views
  • 3 replies
  • 0 kudos

Proxy (Zscaler) & Databricks/Spark Connect "Cannot check peer: missing selected ALPN property"

Summary: We use Zscaler and are trying to use Databricks Connect to develop PySpark code locally. At first, we received SSL HTTP errors, which we resolved by ensuring Python's requests library could find Zscaler's CA cert (setting the REQUESTS_CA_BUNDLE en...

Latest Reply
Ganeshpendu
New Contributor II
  • 0 kudos

Hello @stevenayers-bge, checking in: did you come across any solution to the issue mentioned above? If yes, could you please post it here? I'd really appreciate it.
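Not a confirmed fix, but worth noting: Databricks Connect talks to the workspace over gRPC, which does not read REQUESTS_CA_BUNDLE, so a proxy CA usually has to be supplied to gRPC separately. A sketch under that assumption (bundle path is a placeholder; if Zscaler's TLS inspection strips the ALPN extension, an inspection bypass for the workspace host on the Zscaler side may still be required):

import os

CA_BUNDLE = "/etc/ssl/certs/corp-bundle.pem"  # placeholder: bundle including the Zscaler CA

os.environ["REQUESTS_CA_BUNDLE"] = CA_BUNDLE   # requests/urllib3 (REST calls)
os.environ["SSL_CERT_FILE"] = CA_BUNDLE        # Python ssl module
os.environ["GRPC_DEFAULT_SSL_ROOTS_FILE_PATH"] = CA_BUNDLE  # gRPC / Spark Connect channel

# Create the session only after the variables are set
from databricks.connect import DatabricksSession
spark = DatabricksSession.builder.getOrCreate()  # connection details from your config profile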

2 More Replies
by PabloCSD (Valued Contributor)
  • 662 Views
  • 3 replies
  • 0 kudos

Generate a Workflow that Waits for Library Installation

I have a process in DBX/DAB, and I am using a service principal to generate a token for reaching the artifacts feed; for security, this token lasts 1 hour.

import requests
YOUR_AZURE_TENANT_ID = ...
YOUR_SERVICE_PRINCIPAL_CLIENT_ID = ...
YOUR_SECRET_S...

Latest Reply
agallard
Contributor
  • 0 kudos

Hi @PabloCSD, here are some refined solutions that keep costs low and ensure the main workflow waits until the token is generated: instead of separating the token generation and main tasks, consider generating the token directly within the initializati...
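The token fetch itself is a standard OAuth2 client-credentials call, so folding it into the first task is cheap. A sketch against Microsoft Entra ID (IDs and secret are placeholders; the scope shown is the one commonly used for Azure DevOps artifact feeds and should be adjusted to your feed provider):

import requests

TENANT_ID = "..."       # placeholder
CLIENT_ID = "..."       # placeholder: service principal client ID
CLIENT_SECRET = "..."   # placeholder: service principal secret

resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "499b84ac-1321-427f-aa17-267ca6975798/.default",  # Azure DevOps resource
    },
)
resp.raise_for_status()
token = resp.json()["access_token"]  # valid for about an hour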

2 More Replies
by Pasawat (New Contributor III)
  • 1400 Views
  • 7 replies
  • 2 kudos

Resolved! Could not access Databricks on iPhone in any web browser

Hi, for around the past 2 weeks I have tried to access Databricks via Safari and Chrome on my iOS (17.6) device. Neither web browser could access it. I already tried clearing the cache and logging in with a Global Administrator in Azure AD; it's still the same blank page. Could you pleas...

Latest Reply
Maros
New Contributor II
  • 2 kudos

Hi, I can confirm it works as well. According to my info, there was a release to fix some Chrome issues, which had a positive impact on iOS devices. Glad it works again.

6 More Replies
by KLin (New Contributor III)
  • 790 Views
  • 3 replies
  • 0 kudos

Resolved! Terraform Destroy not able to prune Databricks Provisioned GKE Cluster on GCP

Hi there, newbie here with Databricks on GCP. I provisioned my Databricks workspace with Terraform and all worked well. Now when I would like to target-destroy my workspace, issues occur: when I do terraform destroy -target module.workspace, the workspac...

Latest Reply
HaggMan
New Contributor III
  • 0 kudos

Ha, that's true, too. I forget how long it takes things to delete, but I've run into it many times. Best of luck to you!

2 More Replies
by l0ginp (New Contributor III)
  • 5406 Views
  • 3 replies
  • 0 kudos

Resolved! I have questions about "Premium Automated Serverless Compute - Promo DBU."

"Premium Automated Serverless Compute - Promo DBU" expenses arise from what, how can I disable it, and why are the costs so high? In the picture, I am using AzureThank you in advance for the advice  

Latest Reply
agallard
Contributor
  • 0 kudos

Hello everyone! I had the same problem for two months. I went crazy looking for what was spending my entire subscription amount: Premium Automated Serverless Compute - Promo DBU > €8,236. After searching everywhere for information about it and reading ...
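To see for yourself where the promo SKU is being burned, the billing system table can be grouped by day and workspace. A sketch (requires system tables to be enabled on the account; column names per the documented system.billing.usage schema):

# Daily DBU consumption for serverless SKUs, per workspace
spark.sql("""
    SELECT usage_date, workspace_id, sku_name,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE sku_name ILIKE '%serverless%'
    GROUP BY usage_date, workspace_id, sku_name
    ORDER BY usage_date DESC
""").show()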

2 More Replies
by rfreitas (New Contributor II)
  • 10438 Views
  • 2 replies
  • 1 kudos

Notebook and folder owner

Hi all, we can use this API https://docs.databricks.com/api/workspace/dbsqlpermissions/transferownership to transfer the ownership of a query. Is there anything similar for notebooks and folders?

Latest Reply
sparkplug
New Contributor III
  • 1 kudos

This API only allows setting permissions based on permission level, which doesn't include changing OWNER. Any suggestions on this particular request?
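To illustrate the limitation: the workspace Permissions API accepts levels such as CAN_READ, CAN_RUN, CAN_EDIT, and CAN_MANAGE for notebooks, but no owner transfer. A sketch of what it does allow (host, token, and IDs are placeholders):

import requests

HOST = "https://<workspace-host>"  # placeholder
TOKEN = "..."                      # placeholder: admin token
NOTEBOOK_ID = "..."                # placeholder: numeric object ID of the notebook

# Grants CAN_MANAGE on one notebook; note there is no IS_OWNER level here
resp = requests.patch(
    f"{HOST}/api/2.0/permissions/notebooks/{NOTEBOOK_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"access_control_list": [
        {"user_name": "someone@example.com", "permission_level": "CAN_MANAGE"},
    ]},
)
resp.raise_for_status()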

1 More Replies
by KUMAR__111 (New Contributor II)
  • 329 Views
  • 3 replies
  • 0 kudos

How to get cost per job that runs on ALL_PURPOSE_COMPUTE?

With the system.billing.usage table I could get the cost per job for jobs that run on JOB_COMPUTE, but not for jobs that run on ALL_PURPOSE_COMPUTE.

Latest Reply
KUMAR__111
New Contributor II
  • 0 kudos

If DBU usage is not captured anywhere for jobs under ALL_PURPOSE_COMPUTE, then a cost breakdown based on cluster events is very difficult, as more than two jobs can run in parallel. So it is very difficult to map the cost breakdown to a specific job. Let me know if I am missing an...
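For reference, the attribution that works for job compute keys off usage_metadata.job_id, which is exactly what all-purpose runs lack. A sketch of that query (system tables must be enabled):

# Per-job DBU totals; usage_metadata.job_id is populated for JOB_COMPUTE
# but is typically null for runs on all-purpose clusters
spark.sql("""
    SELECT usage_metadata.job_id,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_metadata.job_id IS NOT NULL
    GROUP BY usage_metadata.job_id
    ORDER BY dbus DESC
""").show()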

2 More Replies
by JessieWen (Databricks Employee)
  • 287 Views
  • 1 reply
  • 2 kudos

Slow cluster start-up time (up to 30 min) on GCP

instance type: e2-highmem-2

Latest Reply
JessieWen
Databricks Employee
  • 2 kudos

Please use a higher-powered instance type (e.g., n2-highmem-4). The instance type you are currently using (e2-highmem-2) is significantly underpowered and will result in slower cluster launch times.

by CarolinaK (New Contributor II)
  • 589 Views
  • 4 replies
  • 0 kudos

Unity Catalog hive_metastore schemas

Hi all, apologies if this is the wrong group, but I was looking in Unity Catalog and noticed that you have different schemas in the hive_metastore depending on whether you select a cluster or a warehouse. Could someone please explain what the ...

Latest Reply
navallyemul
New Contributor III
  • 0 kudos

No schemas are directly attached to compute resources, whether it's an all-purpose cluster or a SQL warehouse in serverless mode.
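An easy way to verify: run the same statement from a cluster and from a warehouse and compare the output; both read the same metastore, so differences usually point at the compute's access mode or metastore configuration rather than at schemas being attached to the compute. A sketch (shown via spark.sql for a notebook; on a warehouse, run the SQL directly):

# Run from both an all-purpose cluster and a SQL warehouse
spark.sql("SHOW SCHEMAS IN hive_metastore").show(truncate=False)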

3 More Replies
by VCHK (New Contributor)
  • 445 Views
  • 1 reply
  • 0 kudos

Databricks Workflow/Jobs View Log Permission

If we don't want to expose admin rights to the user group, what should we do to allow a specific user group to have permission to view all of the job logs in a Databricks account? We don't want to grant job-level permissions either. Thanks, VC

Latest Reply
SathyaSDE
Contributor
  • 0 kudos

Hi, I guess you can use the Databricks API to list jobs and set Can View permission on all jobs. Sample code below:

import requests
from databricks_cli.sdk import ApiClient, JobsService, PermissionsService

# Initialize the API client
api_client = ApiClient( ...
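The same idea works with plain requests against the documented REST endpoints; a hedged sketch (host, token, and group name are placeholders; jobs/list is paginated, and newly created jobs would need the script re-run):

import requests

HOST = "https://<workspace-host>"  # placeholder
HEADERS = {"Authorization": "Bearer <admin-token>"}  # placeholder

# List jobs (single page shown for brevity; follow next_page_token for more)
jobs = requests.get(f"{HOST}/api/2.1/jobs/list", headers=HEADERS).json()

for job in jobs.get("jobs", []):
    # PATCH merges with existing ACLs, granting the group CAN_VIEW on each job
    requests.patch(
        f"{HOST}/api/2.0/permissions/jobs/{job['job_id']}",
        headers=HEADERS,
        json={"access_control_list": [
            {"group_name": "log-viewers", "permission_level": "CAN_VIEW"},
        ]},
    ).raise_for_status()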

by cuhlmann (New Contributor)
  • 383 Views
  • 1 reply
  • 0 kudos

data ingestion from external system - auth via client certificate

Hi Community, we have the requirement to ingest data into Azure Databricks from external systems. Our customer asks us to use a client certificate as the authentication method. Requests - https://requests.readthedocs.io/en/latest/user/advanced/ Aiohttp - https://...

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @cuhlmann, as I understand it, you need to ingest data into Azure Databricks from external systems, and your customer requires using client certificate authentication. The challenge is that the client certificate is stored in Azure Key Vault, but the ...
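On the requests side, client-certificate auth is just the cert keyword, but it expects file paths, which is why a certificate kept in Key Vault generally has to be materialized on disk first. A sketch (URL and paths are placeholders):

import requests

url = "https://api.external-system.example/data"  # placeholder

resp = requests.get(
    url,
    # (cert, key) as PEM file paths readable from the driver,
    # e.g. after exporting the certificate from Key Vault
    cert=("/tmp/client_cert.pem", "/tmp/client_key.pem"),
    timeout=30,
)
resp.raise_for_status()
data = resp.json()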

