- 6972 Views
- 3 replies
- 1 kudos
Resolved! How many workspaces should we have?
There is a default limitation for workspace numbers in a Databricks account, which is 50 for an enterprise account: https://docs.databricks.com/en/resources/limits.html What is the best practice if we need more than 50 workspaces? Will we need more a...
- 1 kudos
Thank you so much for helping me. I didn't even expect it.
- 2321 Views
- 0 replies
- 0 kudos
Problem with provisioning lakehouse via Terraform
I am trying to provision the lakehouse on Azure using the Terraform template provided here: https://github.com/databricks/terraform-databricks-examples/tree/main/examples/adb-lakehouse. I have started from scratch and have only added a resource group...
- 2574 Views
- 1 replies
- 1 kudos
Help Needed: Grants remaining for removed Groups and Service Principal
We have an issue in a workspace managed by Terraform: a change went in to update the Group and Service Principal (SP) names, but due to the internal ordering, the Groups and SP were removed and replaced before the Grants were updated. If we now...
- 1 kudos
We are experiencing a similar issue, except with the storage credential resource. We created some storage credentials using Terraform, but when trying to destroy them using Terraform they were ignored. So we decided to manually delete the storage cre...
- 3040 Views
- 1 replies
- 1 kudos
Why do we need different AWS Accounts ?
I am just going through the AWS Databricks Platform Administration course and I am curious about the cloud accounts used to set up Databricks. When we are setting up Databricks, we are using an AWS account, but we are not using that account sayi...
- 1 kudos
Hi @HappySK, The AWS account you're using basically serves as the data plane - this is where your data lives, and the compute resources that process it (at least in the classical compute model) will be provisioned there as well. As part of the confi...
- 4885 Views
- 0 replies
- 0 kudos
Consequences of removing a workspace from a metastore in Azure Databricks
In the documentation (Enable a workspace for Unity Catalog - Azure Databricks | Microsoft Learn), it appears that I can remove a workspace from a metastore, and as long as the workspace has jobs that don't use tables, files, and models stored in any ...
- 1432 Views
- 0 replies
- 1 kudos
PAT Tokens access restrictions
Hi, we have a Databricks workspace on which we have disabled PATs for now. Going forward, we want developers to use personal access tokens, but only for development purposes. As I understand it, the current option to enable PATs gives access to the whole REST A...
- 1336 Views
- 0 replies
- 1 kudos
Restricting Spark Connect Behind Premium Plan Paywall?
Quoting the databricks-connect docs, "For Databricks Runtime 13.0 and above, Databricks Connect is now built on open-source Spark Connect." What is odd to me is that a requirement for utilizing this open source Spark feature on Databricks, is Unity C...
- 2192 Views
- 1 replies
- 1 kudos
Cluster failed to start
I am getting this error in my Partner Databricks account, and I have tried several methods to start the cluster. As I don't have access to console.aws.amazon.com/ec2, I was not able to check the details/logs of the EC2 instance. I am getting the follow...
- 1 kudos
Here is a similar topic: https://community.databricks.com/t5/machine-learning/problem-with-spinning-up-a-cluster-on-a-new-workspace/m-p/29996 To actually fix/analyse the issue, you unfortunately need access to the EC2 console. I assume someone in the ...
- 2369 Views
- 0 replies
- 0 kudos
Service Principal for remote repository in workflow/job expiring token
I would like to create a Databricks job where the 'Run as' field is set to a service principal. The job points to notebooks stored in Azure DevOps. The steps I've already performed are: I created the service principal and I'm now able to see it in the ...
- 1481 Views
- 0 replies
- 0 kudos
Ubuntu 18.04 EOL
Hi, last July 18th we were informed by Databricks that Ubuntu 20.04 (operating system: Ubuntu 20.04.4 LTS) was going to be the only certified and supported Ubuntu version for the 10.4 runtime clusters we use. We have been experiencing some issu...
- 10534 Views
- 5 replies
- 2 kudos
Resolved! Unable to list service principal in Job details RUN AS
I added the service principal in Admin Settings > Service Principal and then enabled all the Configurations "allow cluster creation", "databricks SQL access" and "workspace access". In the Permission settings I have enabled "Service principal: Manage...
- 2 kudos
For future readers - don't forget to add your email (e.g. me@foo.com) in the Service Principals permissions tab. This way, you will be able to see the newly-created service principal in the dropdown menu.
- 3257 Views
- 0 replies
- 0 kudos
Workspace creation via terraform provider fails on AWS
I'm trying to create a new workspace in an empty account. I have managed to create all the other resources without issues, but when I try to create the workspace it fails with the following error: Error: cannot create mws workspaces: MALFORMED_REQUEST: ...
- 2073 Views
- 0 replies
- 0 kudos
Clean up Databricks confidential computing resources
Hello all, I created a Databricks Premium workspace for a confidential computing PoC. After creating a VM from the Databricks UI, I noticed that there is a new RG with a managed identity, NAT Gateway, public IP, security group, and a VNET (/16). I w...
- 2939 Views
- 2 replies
- 0 kudos
Extreme RocksDB memory usage
During migration to production workload, I switched some queries to use RocksDB. I am concerned with its memory usage though. Here is sample output from my streaming query: "stateOperators" : [ { "operatorName" : "dedupeWithinWatermark", "...
- 0 kudos
Thank you for the input. Is there any particular reason why deduplication watermark makes it store everything and not just the key needed for deduplication? The 1st record has to be written to the table anyway, and its content is irrelevant as it jus...
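The intuition in the reply can be illustrated with a toy model (this is *not* Spark's actual implementation): deduplication within a watermark only needs to remember each key plus an expiry time, because the first record is emitted immediately and later duplicates are simply dropped until the watermark passes the key's expiry.

```python
# Toy illustration of dedup-within-watermark state: the state maps
# key -> expiry time only; no record payload is retained. Not Spark's
# implementation -- just a sketch of why key-only state suffices.

def dedupe_within_watermark(events, delay):
    """events: (key, event_time) pairs in arrival order; delay: watermark delay."""
    state = {}      # key -> expiry time
    watermark = 0
    out = []
    for key, t in events:
        watermark = max(watermark, t - delay)
        # Evict keys the watermark has passed.
        state = {k: exp for k, exp in state.items() if exp >= watermark}
        if key not in state:
            out.append((key, t))    # first occurrence: emit immediately
            state[key] = t + delay  # remember only key + expiry
    return out

print(dedupe_within_watermark([("a", 1), ("a", 2), ("b", 3), ("a", 12)], delay=5))
# [('a', 1), ('b', 3), ('a', 12)]
```

If the real operator's memory usage looks closer to full-record retention, that would be worth raising with support, since conceptually only the key and an event-time bound are needed.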
- 3261 Views
- 0 replies
- 0 kudos
Monitoring job metrics
Hi, we need to monitor Databricks jobs and we have made a setup where we are able to get the Prometheus metrics; however, we are lacking an overview of which metrics refer to what. Namely, we need to monitor the following: failed jobs (is a job failed), tabl...
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (64)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)