Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.


Forum Posts

Maxi1693
by New Contributor II
  • 2761 Views
  • 2 replies
  • 0 kudos

Error running 80 tasks at the same time in a Job; how can I limit this?

Hi! I have a Job running to process multiple streaming tables. In the beginning it was working fine, but now I have 80 tables running in this job, and the problem is that all the runs try to execute at the same time, throwing an error. Is there a way ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Maxi1693, It appears that you’re encountering issues with parallel execution of tasks in your Databricks job. Let’s address this by considering a few strategies: Concurrency Limit for Tasks: Databricks allows a maximum of 1000 concurrent tas...
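Beyond the concurrency limit mentioned above, one way to keep 80 tasks from all starting at once is to chain them into sequential batches with the Jobs API `depends_on` field, so at most one batch runs in parallel. A minimal sketch; the helper name and table names are illustrative, not part of any Databricks API:

```python
# Sketch: group a job's tasks into sequential batches so that at most
# `batch_size` tasks run concurrently. Every task in batch N depends on
# every task in batch N-1 via the Jobs API `depends_on` field.

def build_batched_tasks(task_keys, batch_size):
    """Return Jobs API task definitions with batch-level dependencies."""
    tasks = []
    prev_batch = []
    for i in range(0, len(task_keys), batch_size):
        batch = task_keys[i:i + batch_size]
        for key in batch:
            task = {"task_key": key}
            if prev_batch:
                # This task waits for the whole previous batch to finish.
                task["depends_on"] = [{"task_key": k} for k in prev_batch]
            tasks.append(task)
        prev_batch = batch
    return tasks

# 80 streaming tables, at most 10 running at a time.
tasks = build_batched_tasks([f"stream_table_{n}" for n in range(80)], batch_size=10)
```

The resulting `tasks` list can be dropped into the `tasks` array of a `POST /api/2.1/jobs/reset` or job-create request body.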

1 More Replies
achistef
by New Contributor III
  • 1094 Views
  • 5 replies
  • 6 kudos

Resolved! Secret scope with Azure RBAC

Hello! We have lots of Azure key vaults that we use in our Azure Databricks workspaces. We have created secret scopes that are backed by the key vaults. Azure supports two ways of authenticating to key vaults: access policies, which have been marked as l...

Latest Reply
achistef
New Contributor III
  • 6 kudos

That is very helpful, thank you for your answers. FYI, there is some outdated documentation on this topic: https://learn.microsoft.com/en-us/azure/databricks/security/secrets/secret-scopes#configure-your-azure-key-vault-instance-for-azure-databricks
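For reference, a Key Vault-backed scope can also be created through the Secrets REST API rather than the UI. A hedged sketch of the request body as I understand that API; the workspace URL, token, and Key Vault values are placeholders, and note that creating Key Vault-backed scopes typically requires an Azure AD token rather than a Databricks PAT:

```python
# Sketch: build the request body for POST /api/2.0/secrets/scopes/create
# to make an Azure Key Vault-backed secret scope. All <...> values are
# placeholders you must fill in.

def keyvault_scope_payload(scope, kv_resource_id, kv_dns_name):
    """Request body for a Key Vault-backed secret scope."""
    return {
        "scope": scope,
        "scope_backend_type": "AZURE_KEYVAULT",
        "backend_azure_keyvault": {
            "resource_id": kv_resource_id,
            "dns_name": kv_dns_name,
        },
    }

payload = keyvault_scope_payload(
    "my-scope",
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.KeyVault/vaults/<vault-name>",
    "https://<vault-name>.vault.azure.net/",
)

# To send it (with an AAD token for the Databricks resource):
# import json, urllib.request
# req = urllib.request.Request(
#     "https://<workspace-url>/api/2.0/secrets/scopes/create",
#     data=json.dumps(payload).encode(),
#     headers={"Authorization": "Bearer <aad-token>",
#              "Content-Type": "application/json"},
# )
# urllib.request.urlopen(req)
```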

4 More Replies
jlowe
by New Contributor
  • 695 Views
  • 1 reply
  • 0 kudos

DABs vs API

DABs seem like a great way to build a project-based workflow. However, I need to take a user's SQL and create a TABLE/VIEW based on it. We plan to use the API to submit the request, but is this the best way going forward? Should we consider ...

Latest Reply
Shazaamzaa
New Contributor III
  • 0 kudos

DABs use Terraform, which in turn calls the API anyway, so I guess it comes down to personal preference around deployment consistency, code maintenance, etc. However, from your description of "taking a user's SQL", I'm assuming some sort of real-time appli...
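For the "create a view from a user's SQL" part, one API-based approach is the SQL Statement Execution API. A minimal sketch under my reading of that API; the warehouse id and object names are placeholders, and since the SQL is user-supplied you would want to validate it before wrapping it:

```python
# Sketch: wrap user-supplied SQL in CREATE OR REPLACE VIEW and build the
# request body for POST /api/2.0/sql/statements/ on a SQL warehouse.

def create_view_statement(view_name, user_sql, warehouse_id):
    """Request body that creates/replaces a view from user-supplied SQL."""
    return {
        "warehouse_id": warehouse_id,
        "statement": f"CREATE OR REPLACE VIEW {view_name} AS {user_sql}",
        "wait_timeout": "30s",
    }

body = create_view_statement(
    "analytics.daily_totals",
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region",
    "<warehouse-id>",
)
```

Whether this beats DABs really depends on cadence: ad hoc, per-user DDL fits an API call; versioned, reviewed deployments fit bundles.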

Meghana_Vasavad
by New Contributor III
  • 302 Views
  • 1 reply
  • 0 kudos

Assistance Required for Enabling Unity Catalog in Databricks Workspace

Hi, I hope this message finds you well. I am reaching out regarding a concern with Databricks administrator privileges. I have an Azure subscription and use Azure Databricks for my tutorials, but I currently do not have Global Administrator access, w...

Latest Reply
Walter_C
Honored Contributor
  • 0 kudos

Hello, each account should have at least one account administrator, who has permissions on the managed account and can grant you this access if applicable, or assist you with the setup of Unity Catalog. You might need to ask internal...

wsunwall
by New Contributor II
  • 491 Views
  • 1 reply
  • 1 kudos

Disabling Default Personal Compute Policy in Terraform (AWS)

I'm looking to disable the Personal Compute policy in Terraform; however, I don't see an option to do this. According to the documentation here https://docs.databricks.com/en/admin/clusters/personal-compute.html#customize-the-personal-compute-policy ...

Latest Reply
mggl
New Contributor II
  • 1 kudos

I have the same issue in terraform-provider-google. I found out how to do it from the Admin UI (UI -> Settings -> Feature enablement -> disable Personal Compute). At the same time, I didn't find a way to achieve the same behavior from Terraform.

Edyta
by New Contributor II
  • 592 Views
  • 1 reply
  • 0 kudos

Resolved! Delete Databricks account

Hi everyone, as in the topic, I would like to delete an unnecessarily created account. I have found outdated solutions (e.g. https://community.databricks.com/t5/data-engineering/how-to-delete-databricks-account/m-p/6323#M2501), but they do not work anym...

Latest Reply
szymon_dybczak
Contributor
  • 0 kudos

Hi @Edyta, for AWS see Manage your subscription | Databricks on AWS: "Before you delete a Databricks account, you must first cancel your Databricks subscription and delete all Unity Catalog metastores in the account. After you delete all metastores associ...

AmandaOhare
by New Contributor II
  • 1212 Views
  • 3 replies
  • 2 kudos

Resolved! Message queue to directly trigger job

Hi all, I'm very new to Databricks and trying to understand my use case a bit better. I have a Databricks script/job that I want to be reactive to events outside of my Databricks environment. The best-case scenario would be if my script/job could aut...

Latest Reply
szymon_dybczak
Contributor
  • 2 kudos

Hi @AmandaOhare, you can use AWS Lambda to achieve that. You can set up a queue trigger that activates an AWS Lambda function, and in that function call the Databricks REST API to launch the workflow/job.
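The queue-to-job pattern above can be sketched as a small SQS-triggered Lambda handler that calls `POST /api/2.1/jobs/run-now`. A hedged sketch: the environment variable names are my own convention, and `job_parameters` assumes a job defined with job-level parameters (older jobs use fields like `notebook_params` instead):

```python
# Sketch: AWS Lambda handler that triggers a Databricks job run for each
# SQS message, via POST /api/2.1/jobs/run-now.
import json
import os
import urllib.request

def run_now_payload(job_id, message_body):
    """Build the run-now request body, passing the message as a job parameter."""
    return {"job_id": job_id, "job_parameters": {"event": message_body}}

def handler(event, context):
    host = os.environ["DATABRICKS_HOST"]        # e.g. https://<workspace-url>
    token = os.environ["DATABRICKS_TOKEN"]      # ideally from Secrets Manager
    job_id = int(os.environ["DATABRICKS_JOB_ID"])
    for record in event.get("Records", []):     # one record per SQS message
        payload = run_now_payload(job_id, record["body"])
        req = urllib.request.Request(
            f"{host}/api/2.1/jobs/run-now",
            data=json.dumps(payload).encode(),
            headers={
                "Authorization": f"Bearer {token}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            print("run-now status:", resp.status)
```

Wiring the SQS queue as the Lambda's event source then makes the Databricks job fully event-driven.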

2 More Replies
leo-machado
by New Contributor
  • 352 Views
  • 1 reply
  • 1 kudos

Refresh permission on Lakeview Dashboard

Hi folks! I'm sure I'm not the only one, but our users have a tendency to click the big Refresh button on all dashboards every time they open them. Using resources efficiently is something I value deeply, so our team came up with a schedule policy - ...

Latest Reply
holly
Valued Contributor III
  • 1 kudos

Hi @leo-machado, this is a very reasonable request. It's not feasible right now, but I'll pass it on to the product team, as I'm sure you're not the only one.

pvilasra
by New Contributor II
  • 208 Views
  • 1 reply
  • 0 kudos

Unable to access Databricks workspaces

Hi team, previously we were able to access Databricks workspaces using a username and password, but now it asks for a token, and we don't have access to the mail account for the username we use to log in. Could you please assist with this issue and how we ca...

Latest Reply
pvilasra
New Contributor II
  • 0 kudos

Can anyone help with this?

ma10
by New Contributor
  • 118 Views
  • 0 replies
  • 0 kudos

Issue with updating email with SCIM Provisioning

Hi all, for our setup we have configured SCIM provisioning using Entra ID, group assignment in Azure is handled by SailPoint IdentityIQ, and we have enabled SSO for Databricks. It has been working fine apart from one scenario. The original email assign...

soumiknow
by New Contributor
  • 124 Views
  • 0 replies
  • 0 kudos

How to add 'additionallyAllowedTenants' in Databricks config or PySpark config?

I have a multi-tenant Azure app. I am using this app's credentials to read ADLS container files from a Databricks cluster using a PySpark DataFrame. I need to set the 'additionallyAllowedTenants' flag to '*' or a specific tenant_id of the multi-ten...

ADuma
by New Contributor III
  • 683 Views
  • 1 reply
  • 0 kudos

Resolved! Using Databricks Connect with PyCharm behind Proxy

I am using Databricks Connect with PyCharm to develop my pipelines locally before deploying to Databricks. I set up authentication for databricks-connect using the following guide: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/...

Latest Reply
ADuma
New Contributor III
  • 0 kudos

I managed to solve the problem by including my login and password in the HTTPS_PROXY environment variable. I set it to something like HTTPS_PROXY="http://username:password@proxy.company.com:port" and databricks-connect can create Spark sessions...
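One wrinkle with embedding credentials in HTTPS_PROXY is that characters like `@` or `:` in the password break the URL, so they need percent-encoding. A small sketch of that approach; the proxy host, port, and credentials are placeholders:

```python
# Sketch: build an HTTPS_PROXY value with percent-encoded credentials so
# special characters in the password don't corrupt the proxy URL.
import os
from urllib.parse import quote

def proxy_url(user, password, host, port):
    """Return http://user:password@host:port with credentials URL-encoded."""
    return f"http://{quote(user, safe='')}:{quote(password, safe='')}@{host}:{port}"

# 'p@ss:word' becomes 'p%40ss%3Aword' inside the URL.
os.environ["HTTPS_PROXY"] = proxy_url("jdoe", "p@ss:word", "proxy.company.com", 8080)

# databricks-connect should then pick the proxy up from the environment:
# from databricks.connect import DatabricksSession
# spark = DatabricksSession.builder.getOrCreate()
```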


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group
Labels