- 2126 Views
- 1 reply
- 1 kudos
Cluster failed to start
I am getting this error in my Partner Databricks account and have tried several methods to start the cluster. As I don't have access to console.aws.amazon.com/ec2, I was not able to check the details/logs on the EC2 instance. I am getting the follow...
Here is a similar topic: https://community.databricks.com/t5/machine-learning/problem-with-spinning-up-a-cluster-on-a-new-workspace/m-p/29996 To actually fix/analyse the issue, you need access to the EC2 console, unfortunately. I assume someone in the ...
- 2315 Views
- 0 replies
- 0 kudos
Service Principal for remote repository in workflow/job expiring token
I would like to create a Databricks Job where the 'Run as' field is set to a Service Principal. The Job points to notebooks stored in Azure DevOps. The steps I've already performed are: I created the Service Principal and I'm now able to see it in the ...
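As a hedged illustration of the setup described above (not from the thread itself): once the service principal exists, a Git credential for Azure DevOps can be registered on its behalf so jobs that run as the service principal can reach the repository. The sketch below uses the databricks-sdk for Python; the hostname, OAuth secret and DevOps PAT are placeholders, and the PAT still has to be rotated when it expires.

```python
# Minimal sketch: register an Azure DevOps Git credential for the identity the
# client authenticates as. Authenticating as the service principal (OAuth M2M)
# attaches the credential to the service principal, not to a user.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace URL
    client_id="<service-principal-application-id>",             # placeholder
    client_secret="<service-principal-oauth-secret>",           # placeholder
)

cred = w.git_credentials.create(
    git_provider="azureDevOpsServices",
    git_username="<devops-organization-user>",   # placeholder
    personal_access_token="<azure-devops-pat>",  # placeholder; must be refreshed when the PAT expires
)
print(cred.credential_id)
```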
- 1443 Views
- 0 replies
- 0 kudos
Ubuntu 18.04 EOL
Hi, last July 18th we were informed by Databricks that Ubuntu version 20.04 (operating system: Ubuntu 20.04.4 LTS) was going to be the only certified and supported Ubuntu version for the 10.4 runtime cluster we use. We have been experiencing some issu...
- 10254 Views
- 5 replies
- 2 kudos
Resolved! Unable to list service principal in Job details RUN AS
I added the service principal in Admin Settings > Service Principal and then enabled all the Configurations "allow cluster creation", "databricks SQL access" and "workspace access". In the Permission settings I have enabled "Service principal: Manage...
For future readers - don't forget to add your email (e.g. me@foo.com) in the Service Principals permissions tab. This way, you will be able to see the newly-created service principal in the dropdown menu.
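For readers wiring this up in code rather than the UI: below is a hedged sketch (databricks-sdk for Python, assuming a recent version that exposes run_as on JobSettings) of switching an existing job's "Run as" to a service principal once it is visible to you. The job ID and application ID are placeholders.

```python
# Minimal sketch: set a job's "Run as" to a service principal via a partial
# settings update. Requires permission to use the service principal.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import JobRunAs, JobSettings

w = WorkspaceClient()  # host and auth are read from the environment/profile

w.jobs.update(
    job_id=123456789,  # placeholder job id
    new_settings=JobSettings(
        run_as=JobRunAs(
            service_principal_name="00000000-0000-0000-0000-000000000000"  # placeholder application id
        )
    ),
)
```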
- 3203 Views
- 0 replies
- 0 kudos
Workspace creation via terraform provider fails on AWS
I'm trying to create a new workspace in an empty account. I have managed to create all the other resources without issues, but when I try to create the workspace it fails with the following error: Error: cannot create mws workspaces: MALFORMED_REQUEST: ...
- 2005 Views
- 0 replies
- 0 kudos
Clean up Databricks confidential computing resources
Hello all, I created a Databricks Premium workspace for a Confidential Computing PoC. After creating a VM from the Databricks UI, I noticed that there is a new RG with a managed identity, NAT Gateway, public IP, security group, and a VNET (/16). I w...
- 2807 Views
- 2 replies
- 0 kudos
Extreme RocksDB memory usage
During migration to a production workload, I switched some queries to use RocksDB. I am concerned about its memory usage, though. Here is sample output from my streaming query: "stateOperators" : [ { "operatorName" : "dedupeWithinWatermark", "...
Thank you for the input. Is there any particular reason why deduplication watermark makes it store everything and not just the key needed for deduplication? The 1st record has to be written to the table anyway, and its content is irrelevant as it jus...
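For context, a hedged PySpark sketch of the pattern this thread discusses: RocksDB-backed state plus dropDuplicatesWithinWatermark. The provider class string is the one Databricks documents for its runtime; the source path, column names, checkpoint location and target table are placeholders.

```python
# Minimal sketch: streaming deduplication within a watermark using the RocksDB
# state store provider. The watermark bounds how long deduplication state is kept.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
spark.conf.set(
    "spark.sql.streaming.stateStore.providerClass",
    "com.databricks.sql.streaming.state.RocksDBStateStoreProvider",
)

events = spark.readStream.format("delta").load("/path/to/source")  # placeholder source

deduped = (
    events
    .withWatermark("event_time", "10 minutes")    # placeholder event-time column and delay
    .dropDuplicatesWithinWatermark(["event_id"])  # deduplicate on event_id within the watermark
)

query = (
    deduped.writeStream
    .option("checkpointLocation", "/path/to/checkpoint")  # placeholder
    .toTable("target_table")                              # placeholder target
)
```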
- 3212 Views
- 0 replies
- 0 kudos
Monitoring job metrics
Hi, we need to monitor Databricks jobs and we have made a setup where we are able to get the Prometheus metrics; however, we are lacking an overview of which metrics refer to what. Namely, we need to monitor the following: failed jobs: is a job failed; tabl...
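Not the Prometheus metric mapping asked about above, but as a hedged complement: failed runs can also be spotted by polling the Jobs API, e.g. with the databricks-sdk for Python. The limit and printed fields below are illustrative only.

```python
# Minimal sketch: report recently completed job runs that ended in FAILED.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.jobs import RunResultState

w = WorkspaceClient()  # host and auth from the environment/profile

for run in w.jobs.list_runs(completed_only=True, limit=25):
    if run.state and run.state.result_state == RunResultState.FAILED:
        print(f"Job {run.job_id} run {run.run_id} failed: {run.state.state_message}")
```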
- 7875 Views
- 2 replies
- 0 kudos
How to change Workspace Owner?
Our Databricks workspace was created by a personal account. Now the person has left the organization. We would like to change the owner to a service account (preferably, else to an admin account). Questions: Is it possible to change the owner of the wor...
Are you on AWS or Azure? When you say workspace admin, that could be many; you can have multiple workspace admins.
- 8009 Views
- 2 replies
- 1 kudos
Using a custom Hostname in Databricks CLI instead of per-workspace URL
Hi, at our organization we have added a front-end Private Link connection to a Databricks workspace in Azure, and public access to the workspace is disabled. I am able to access the workspace UI with the private IP (in the browser), and able to call the...
Hi @Retired_mod, thank you for the support. Really appreciate it. Thanks
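Adjacent to the CLI question above, a hedged note: the Python SDK (like the CLI) accepts an explicit host, so a Private Link FQDN or other custom hostname can be supplied instead of the public per-workspace URL. The hostname and token below are placeholders.

```python
# Minimal sketch: point the SDK at a custom/private hostname explicitly.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://adb-1234567890123456.7.privatelink.azuredatabricks.net",  # placeholder private hostname
    token="<personal-access-token>",                                        # placeholder
)
print(w.current_user.me().user_name)  # simple connectivity check
```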
- 2854 Views
- 1 reply
- 0 kudos
Modularization of Databricks Workflows
Given that a Workflow may become too big to manage in a single Terraform project, what would be your recommendation as a best practice to manage and deploy the workflows via code to maintain a predictable result between environments? Would it...
- 1967 Views
- 0 replies
- 0 kudos
terraform/databricks setting default_catalog_name
While configuring Databricks, we've set the "default_catalog_name", which sets the default schema when users connect via an ODBC connection. While the naming isn't consistent, this does have one desired effect; that is, when users connect, it default...
- 1250 Views
- 1 reply
- 0 kudos
Can I edit the ADLSg2 storage location for a schema?
I want to alter the schema and basically point it to a new path in the data lake #UnityCatalog
I don't think so. You can alter the owner and dbproperties using the ALTER SCHEMA command, but not the location: https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema
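To illustrate what the reply above says is and isn't possible, a hedged sketch (schema and principal names are placeholders) of the ALTER SCHEMA variants that do work; the storage location itself cannot be changed this way.

```python
# Minimal sketch: change a schema's owner and DBPROPERTIES (but not its location).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("ALTER SCHEMA my_catalog.my_schema SET OWNER TO `data-engineers`")            # placeholder principal
spark.sql("ALTER SCHEMA my_catalog.my_schema SET DBPROPERTIES ('team' = 'analytics')")  # placeholder property
```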
- 3348 Views
- 0 replies
- 0 kudos
Struggling with UC Volume Paths
I am trying to set up my volumes and give them paths in the data lake, but I keep getting this message: Input path url 'abfss://my-container@my-storage-account.dfs.core.windows.net/' overlaps with managed storage within 'CreateVolume' call. There WAS some...
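Since the error quoted above points at the container root, here is a hedged sketch (all names and paths are placeholders, and it assumes an external location already covers the path) of creating an external volume on a narrower sub-path; whether that avoids the overlap depends on where the catalog's managed storage location actually sits.

```python
# Minimal sketch: create an external volume on a sub-path rather than the
# container root the error message complains about.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE EXTERNAL VOLUME IF NOT EXISTS my_catalog.my_schema.raw_files
    LOCATION 'abfss://my-container@my-storage-account.dfs.core.windows.net/volumes/raw_files'
""")
```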
- 4164 Views
- 0 replies
- 0 kudos
Error: cannot create permissions: invalid character '<' looking for beginning of value
I'm trying to use terraform to assign a cluster policy to an account-level group (sync'd from AAD via SCIM). My provider is configured like this: provider "databricks" { alias = "azure_account" host = "accounts.azuredatabricks.net" account_id = "%DATABRICKS...
Labels: Access control (1), Apache spark (1), Azure (7), Azure databricks (5), Billing (2), Cluster (1), Compliance (1), Data Ingestion & connectivity (5), Databricks Runtime (1), Databricks SQL (2), DBFS (1), Dbt (1), Delta Sharing (1), DLT Pipeline (1), GA (1), Gdpr (1), Github (1), Partner (58), Public Preview (1), Service Principals (1), Unity Catalog (1), Workspace (2)