Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

ChinuLee
by New Contributor II
  • 650 Views
  • 1 reply
  • 1 kudos

Task level Definition

Hello, According to the databricks_job resource Terraform documentation, the description argument is supported at the task level. However, when I update the Terraform configuration to include a task description, the terraform plan reflects the change...

Latest Reply
eniwoke
Contributor II
  • 1 kudos

Hi @ChinuLee, the task-level description appears to be hidden in the UI, both during manual creation and when viewing tasks after creation. However, it is visible when you choose the "View as Code" option, as shown below. So it is supported, just hidd...

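For reference, a task-level description in the Terraform provider looks roughly like the sketch below. This is a minimal illustration, not a full job definition; the job name and notebook path are placeholders.

```terraform
# Minimal sketch (Databricks Terraform provider). The job name and
# notebook path are illustrative placeholders.
resource "databricks_job" "example" {
  name = "job-with-task-description"

  task {
    task_key    = "main"
    # Stored and returned by the API, but hidden in the Jobs UI;
    # visible via the "View as Code" option.
    description = "Loads the daily extract into the staging table."

    notebook_task {
      notebook_path = "/Shared/example-notebook"
    }
  }
}
```

After `terraform apply`, the description can be confirmed through "View as Code" on the job page, even though the task detail panel does not display it.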
AlbertWang
by Valued Contributor
  • 7083 Views
  • 8 replies
  • 0 kudos

Resolved! Cannot use Terraform to create Databricks Storage Credential

Hi all, When I use Terraform in an Azure DevOps pipeline to create a Databricks Storage Credential, I get the following error. Has anybody run into the same error before? Or any idea how to debug it? Error: cannot create storage credential: failed d...

Latest Reply
MichaelFu
New Contributor II
  • 0 kudos

How exactly do you need to configure auth_type in this case? I tried different options, but nothing seems to work. I'd also like to use the Service Connection from the Azure DevOps pipeline to deploy Databricks via TerraformTaskV4@4.

7 More Replies
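For context, a storage credential in the Terraform provider is typically backed by an Azure Databricks access connector (managed identity). The sketch below shows the basic shape; the access connector resource ID is a placeholder, and the provider must authenticate as a principal with the required Unity Catalog privileges.

```terraform
# Sketch: Unity Catalog storage credential backed by an Azure Databricks
# access connector (managed identity). The connector resource ID is a
# placeholder; errors like "cannot create storage credential" often come
# from the authenticating principal lacking metastore privileges.
resource "databricks_storage_credential" "external" {
  name = "example-credential"

  azure_managed_identity {
    access_connector_id = "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/example-rg/providers/Microsoft.Databricks/accessConnectors/example-connector"
  }
}
```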
cgrass
by New Contributor III
  • 5180 Views
  • 1 reply
  • 3 kudos

Creating Group in Terraform using external_id

The documentation here doesn't give much information about how to use `external_id` when creating a new group. If I reference the object_id of an Azure AD group, the Databricks group gets created, but the members from the AD group are not added, nor ...

Latest Reply
MiPa
New Contributor II
  • 3 kudos

Greetings from the future! Now it is clear that external_id, which IS Azure's ObjectID, comes from the internal sync mechanism, which can be enabled in your account under Previews: I was able to reference my security group in Terraform and create that ...

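A minimal sketch of what that reference looks like in the provider. The display name and object ID are placeholders; note that `external_id` only links the Databricks group to the Azure AD (Entra ID) group, while the membership itself is populated by the identity-sync mechanism, not by Terraform.

```terraform
# Sketch: account-level Databricks group linked to an Azure AD group by
# object ID. Membership is filled in by the identity-sync mechanism
# (enabled under account Previews), not by this resource.
resource "databricks_group" "security_group" {
  display_name = "data-engineers"                         # illustrative name
  external_id  = "11111111-2222-3333-4444-555555555555"   # Azure AD object ID (placeholder)
}
```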
gardenmap
by New Contributor II
  • 1243 Views
  • 2 replies
  • 1 kudos

How to Optimize Delta Table Performance in Databricks?

I'm working with large Delta tables in Databricks and noticing slower performance during read operations. I've already enabled Z-ordering and auto-optimize, but it still feels sluggish at scale. Are there best practices or settings I should adjust fo...

Latest Reply
igorborba
New Contributor II
  • 1 kudos

Hi @gardenmap, if possible, can you give more detail? For example, here is what I've done in my case: for tables above 1 TB that can be segregated by date, we decided to partition by the date column; independent of whether it's partitioned or not, we decided to ma...

1 More Replies
leo-machado
by New Contributor III
  • 2867 Views
  • 3 replies
  • 6 kudos

Refresh permission on Lakeview Dashboard

Hi folks! I'm sure I'm not the only one, but our users have a tendency to click the big Refresh button on every dashboard each time they open it. Using resources efficiently is something I value deeply, so our team came up with a schedule policy - ...

Latest Reply
holly
Databricks Employee
  • 6 kudos

Hey @leo-machado, yes I did request it formally and shared this post with the team. Heads up: once something has been prioritised, it can still take between 6 and 18 months to build.

2 More Replies
antonionuzzo
by New Contributor III
  • 585 Views
  • 1 reply
  • 1 kudos

restrict workspace admin from creating service principal

Hello, I would like to restrict workspace admins from creating service principals and leave this privilege only to the account admin. Is this possible? I am aware of the RestrictWorkspaceAdmins setting, but it does not meet my needs. Additionally, I h...

Latest Reply
Advika
Databricks Employee
  • 1 kudos

Hello @antonionuzzo! Based on the documentation and my understanding, there isn’t a built-in way to restrict the creation of service principals exclusively to account admins. And as you mentioned, the RestrictWorkspaceAdmins setting doesn’t cover thi...

Th0rs7en
by New Contributor
  • 1128 Views
  • 1 reply
  • 0 kudos

How to enable / setup OAuth for DBT with Databricks

I tried to configure dbt to authenticate to Databricks with OAuth, following the tutorials https://community.databricks.com/t5/technical-blog/using-dbt-core-with-oauth-on-azure-databricks/ba-p/46605 and https://docs.databricks.com/aws/en/partner...

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Here are some things to consider when debugging and setting up OAuth authentication with dbt and Databricks. 1. Verify credentials and configuration: double-check that auth_type: oauth is correctly specified, as...

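For orientation, a dbt-databricks profile using OAuth looks roughly like the sketch below. The host, http_path, catalog, and schema values are placeholders; depending on your cloud and dbt-databricks version, Azure setups may also require a client_id, as covered in the linked blog post.

```yaml
# profiles.yml sketch for dbt-databricks with OAuth (user-to-machine).
# host, http_path, catalog, and schema are placeholders.
my_project:
  target: dev
  outputs:
    dev:
      type: databricks
      host: adb-1234567890123456.7.azuredatabricks.net
      http_path: /sql/1.0/warehouses/abcdef1234567890
      catalog: main
      schema: analytics
      auth_type: oauth   # triggers the browser-based OAuth login flow
```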
m_weirath
by New Contributor II
  • 4022 Views
  • 1 reply
  • 1 kudos

Cannot remove users group "CAN_MANAGE" from /Shared

I have a Unity Catalog enabled workspace and I have full privileges including Account Admin.  I would like to be able to remove the "CAN_MANAGE" privilege from the "users" group.  According to the documentation, this should be possible.  According to...

Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

Hi, can you please share the entire stack trace leading to the "Cannot modify permissions of directory" error? Thanks!

johnb1
by Contributor
  • 6279 Views
  • 11 replies
  • 3 kudos

Resolved! Deploy Workflow only to specific target (Databricks Asset Bundles)

I am using Databricks Asset Bundles to deploy Databricks workflows to all of my target environments (dev, staging, prod). However, I have one specific workflow that is supposed to be deployed only to the dev target environment.How can I implement tha...

Latest Reply
asham
New Contributor II
  • 3 kudos

Hi, I'm also curious about this update; it would be a really helpful feature for us, since we have humongous job specifications that are specific to dev and test environments. Adding these to the top-level databricks.yml file is really cluttering up ou...

10 More Replies
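One common pattern for this (a sketch; bundle, target, and job names are placeholders): declare the dev-only job under the dev target's `resources` mapping rather than at the top level, so it is only deployed to that target.

```yaml
# databricks.yml sketch: "shared_job" deploys to every target, while
# "dev_only_job" is declared under targets.dev.resources and is only
# deployed by `databricks bundle deploy -t dev`.
bundle:
  name: my_bundle

resources:
  jobs:
    shared_job:
      name: shared-job

targets:
  dev:
    resources:
      jobs:
        dev_only_job:
          name: dev-only-job
  staging: {}
  prod: {}
```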
GeorgeFmz
by New Contributor II
  • 2021 Views
  • 4 replies
  • 0 kudos

Serverless python version

In the documentation about serverless compute release notes, it is stated that "Serverless compute for notebooks and jobs uses environment versions" and that "Serverless compute always runs using the most recently released version listed here." At ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

It seems the configuration aligns with the doc https://docs.databricks.com/aws/en/release-notes/serverless/environment-version-two, which specifies that the Python version is 3.11.10.

3 More Replies
antonionuzzo
by New Contributor III
  • 958 Views
  • 1 reply
  • 1 kudos

Resolved! service principal control plane access management

Hi, our account admin has created a service principal to automate job execution. However, our security team is concerned that, by design, anyone with the service principal credentials might access the control plane, where the service principal is def...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

The docs state: "Service principals give automated tools and scripts API-only access to Databricks resources, providing greater security than using user accounts." https://docs.databricks.com/gcp/en/admin/users-groups/service-principals#what-is-...

lucasbergamo
by New Contributor
  • 759 Views
  • 1 reply
  • 0 kudos

Issue with Verification Code Input on Login

Hello, I hope you're well. I'd like to report a bug encountered when entering the verification code to log into the platform. When I type the code without Caps Lock enabled, the input field displays the characters in uppercase, but the code isn't acc...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @lucasbergamo! Thank you for bringing this to our attention. I'll share this with the relevant team for further investigation. In the meantime, as a workaround you can continue using Caps Lock while entering the verification code to log in.

ashokz
by New Contributor II
  • 1489 Views
  • 1 reply
  • 0 kudos

Is it possible to expand/extend the subnet CIDR of an existing Azure Databricks workspace?

Is it possible to expand/extend the subnet CIDR of an existing Azure Databricks workspace? Our workspace is currently maxed out; can the subnet CIDR be extended without having to create a new workspace?

Latest Reply
Louis_Frolio
Databricks Employee
  • 0 kudos

Yes, it is possible to expand or extend the subnet CIDR of an existing Azure Databricks workspace without creating a new one, but this capability applies specifically to workspaces deployed with VNet injection. For workspaces that use V...

noorbasha534
by Valued Contributor II
  • 4703 Views
  • 3 replies
  • 1 kudos

Azure Databricks Status

Dear all, I wanted to check if anyone has implemented a solution for capturing information from the Databricks status page in real time, 24x7, and loading it into a log or table... https://learn.microsoft.com/en-us/azure/databricks/resources/status What is the be...

Latest Reply
TheRealOliver
Contributor
  • 1 kudos

It seems that the webhook is the way! There is nothing about system status in the Databricks REST API, and nothing about system status in the system tables schema.

2 More Replies
david_btmpl
by New Contributor II
  • 724 Views
  • 1 reply
  • 2 kudos

for_each_task with pool clusters

I am trying to run a `for_each_task` across different inputs of length `N` with `concurrency` `M`, where N >> M. To mitigate cluster setup time I want to use pool clusters. Now, when I set everything up, I notice that instead of `M` concurrent clusters...

Latest Reply
SP_6721
Honored Contributor
  • 2 kudos

Hi @david_btmpl When you set up a Databricks workflow using for_each_task with a cluster pool (instance_pool_id), Databricks will, by default, reuse the same cluster for all concurrent tasks in that job. So even if you’ve set a higher concurrency (li...

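A rough sketch of the setup being described, in Terraform. All names, IDs, and paths are placeholders, and the exact block schema should be checked against the current provider docs; the point is that every iteration references the same job cluster, so concurrent iterations share one cluster, and the pool only shortens that cluster's startup time.

```terraform
# Sketch: for_each_task fanning out over inputs with limited concurrency,
# on a job cluster drawn from an instance pool. Because all iterations
# reference the same job_cluster_key, Databricks reuses one cluster for
# concurrent iterations rather than starting M separate clusters.
resource "databricks_job" "fan_out" {
  name = "for-each-with-pool"

  job_cluster {
    job_cluster_key = "pooled"
    new_cluster {
      spark_version    = "15.4.x-scala2.12"   # placeholder runtime
      instance_pool_id = "pool-1234567890"    # placeholder pool ID
      num_workers      = 2
    }
  }

  task {
    task_key = "fan_out"

    for_each_task {
      inputs      = jsonencode(["a", "b", "c", "d"])
      concurrency = 2

      task {
        task_key        = "process_one"
        job_cluster_key = "pooled"

        notebook_task {
          notebook_path   = "/Shared/process-input"   # placeholder
          base_parameters = { input = "{{input}}" }
        }
      }
    }
  }
}
```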