- 1057 Views
- 0 replies
- 1 kudos
Quoting the databricks-connect docs, "For Databricks Runtime 13.0 and above, Databricks Connect is now built on open-source Spark Connect." What is odd to me is that a requirement for utilizing this open-source Spark feature on Databricks is Unity C...
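For context, on Runtime 13.0+ a Databricks Connect session is built with `DatabricksSession` rather than the classic `SparkSession`. The snippet below is a minimal sketch, assuming a workspace URL, access token, and cluster ID are at hand (all three placeholders are illustrative, not from the post), and that the workspace meets the Unity Catalog requirement the post refers to:

```python
# Sketch: Databricks Connect on DBR 13.0+ (Spark Connect based).
# The host, token, and cluster_id values are placeholders - supply your own.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.remote(
    host="https://<your-workspace>.cloud.databricks.com",  # placeholder
    token="<personal-access-token>",                       # placeholder
    cluster_id="<cluster-id>",                             # placeholder
).getOrCreate()

# Any regular DataFrame code now executes on the remote cluster.
df = spark.range(10)
print(df.count())
```

Because this connects to a live workspace, it cannot run without valid credentials; treat it as a configuration sketch.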
- 1746 Views
- 1 replies
- 1 kudos
I am getting this error in my Partner Databricks account and have tried several methods to start the cluster. Since I don't have access to console.aws.amazon.com/ec2, I was not able to check the details/logs in the EC2 instance. I am getting the follow...
Latest Reply
Here is a similar topic: https://community.databricks.com/t5/machine-learning/problem-with-spinning-up-a-cluster-on-a-new-workspace/m-p/29996 To actually fix/analyse the issue, you unfortunately need access to the EC2 console. I assume someone in the ...
- 2011 Views
- 0 replies
- 0 kudos
I would like to create a Databricks job where the 'Run as' field is set to a service principal. The job points to notebooks stored in Azure DevOps. The steps I've already performed are: I created the service principal and I'm now able to see it in the ...
- 1083 Views
- 0 replies
- 0 kudos
Hi, last July 18th we were informed by Databricks that Ubuntu 20.04 (operating system: Ubuntu 20.04.4 LTS) was going to be the only certified and supported Ubuntu version for the 10.4 runtime cluster we use. We have been experiencing some issu...
- 8128 Views
- 5 replies
- 2 kudos
I added the service principal in Admin Settings > Service Principal and then enabled all the Configurations "allow cluster creation", "databricks SQL access" and "workspace access". In the Permission settings I have enabled "Service principal: Manage...
Latest Reply
For future readers: don't forget to add your email (e.g. me@foo.com) in the Service Principals permissions tab. This way, you will be able to see the newly created service principal in the dropdown menu.
- 2545 Views
- 0 replies
- 0 kudos
I'm trying to create a new workspace in an empty account. I have managed to create all the other resources without issues, but when I try to create the workspace it fails with the following error: Error: cannot create mws workspaces: MALFORMED_REQUEST: ...
- 1741 Views
- 0 replies
- 0 kudos
Hello all, I created a Databricks Premium workspace for a Confidential Computing PoC. After creating a VM from the Databricks UI, I noticed that there is a new RG with a managed identity, NAT Gateway, public IP, security group, and a VNET (/16). I w...
by PetePP • New Contributor II
- 1967 Views
- 2 replies
- 0 kudos
During migration to a production workload, I switched some queries to use RocksDB. I am concerned with its memory usage, though. Here is sample output from my streaming query: "stateOperators" : [ { "operatorName" : "dedupeWithinWatermark", "...
Latest Reply
Thank you for the input. Is there any particular reason why the deduplication watermark makes it store everything and not just the key needed for deduplication? The first record has to be written to the table anyway, and its content is irrelevant as it jus...
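The question above can be illustrated with a toy model. A minimal sketch (plain Python, deliberately not Spark's actual RocksDB internals) of dedup-within-watermark state only needs to keep each seen key with its event time, evicting entries once the watermark passes them. It assumes a monotonically advancing watermark derived from the configured delay:

```python
# Toy sketch of dedup-within-watermark state: store only (key -> event_time),
# evict entries once the watermark has passed them by the configured delay.
# This illustrates the idea; it is not Spark's actual state-store layout.
def dedupe_within_watermark(events, delay):
    """events: iterable of (key, event_time); yields first-seen records."""
    state = {}                    # key -> event_time of first occurrence
    watermark = float("-inf")
    for key, ts in events:
        watermark = max(watermark, ts - delay)
        # Evict keys older than the watermark; they can no longer collide.
        state = {k: t for k, t in state.items() if t >= watermark}
        if key not in state:
            state[key] = ts       # remember the key, emit the record once
            yield (key, ts)

out = list(dedupe_within_watermark(
    [("a", 1), ("a", 2), ("b", 3), ("a", 20)], delay=5))
# -> [("a", 1), ("b", 3), ("a", 20)]: the second "a" is dropped as a
#    duplicate, the third passes because the watermark evicted its key.
```

Note that only the key and a timestamp live in state here; the first record itself is emitted downstream rather than retained, which matches the intuition in the reply above.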
by Bagger • New Contributor II
- 2883 Views
- 0 replies
- 0 kudos
Hi, we need to monitor Databricks jobs and we have a setup where we are able to get the Prometheus metrics; however, we lack an overview of which metrics refer to what. Namely, we need to monitor the following: failed jobs: is a job failed; tabl...
by Ajay3 • New Contributor
- 2782 Views
- 0 replies
- 0 kudos
Hi, I need to install the below Maven coordinates on the clusters using Databricks init scripts. 1. coordinate: com.microsoft.azure:synapseml_2.12:0.11.2 with repo https://mmlspark.azureedge.net/maven 2. coordinate: com.microsoft.azure:spark-mssql-conne...
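One common workaround (a sketch, not official guidance) is to resolve each coordinate to a jar URL using the standard Maven repository layout and have the init script download it into `/databricks/jars`. The helper below only builds the script text; note that a single-jar download does not resolve transitive dependencies, which is why cluster Libraries with a Maven coordinate are usually the simpler route:

```python
# Sketch: turn a Maven coordinate into a jar URL (standard Maven repo layout:
# group dots become slashes, then artifact/version/artifact-version.jar)
# and emit a cluster init script that downloads it into /databricks/jars.
def jar_url(repo, coordinate):
    group, artifact, version = coordinate.split(":")
    path = "/".join(group.split("."))
    return f"{repo}/{path}/{artifact}/{version}/{artifact}-{version}.jar"

def init_script(entries):
    lines = ["#!/bin/bash", "set -e"]
    for repo, coord in entries:
        lines.append(f"wget -P /databricks/jars {jar_url(repo, coord)}")
    return "\n".join(lines) + "\n"

script = init_script([
    ("https://mmlspark.azureedge.net/maven",
     "com.microsoft.azure:synapseml_2.12:0.11.2"),
])
print(script)
```

The generated script would then be uploaded and attached as a cluster-scoped init script; again, this fetches only the named jars, not their dependency trees.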
- 6910 Views
- 2 replies
- 0 kudos
Our Databricks workspace was created by a personal account. Now the person has left the organization. We would like to change the owner to a service account (preferably, else to an admin account). Questions: Is it possible to change the owner of the wor...
Latest Reply
Atanu • Databricks Employee
Are you on AWS or Azure? When you say workspace admin, that could be many people, since a workspace can have multiple admins.
- 7115 Views
- 2 replies
- 1 kudos
Hi, at our organization we have added a front-end Private Link connection to a Databricks workspace in Azure, and public access to the workspace is disabled. I am able to access the workspace UI with the private IP (in the browser), and able to call the...
Latest Reply
Hi @Retired_mod, thank you for the support. Really appreciate it. Thanks!
- 2316 Views
- 1 replies
- 0 kudos
Given that the size of a workflow may become too big to manage in a single Terraform project, what would be your recommendation as a best practice to manage and deploy the workflows via code, to maintain a predictable result between environments? Would it...
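Independent of the tooling, one pattern that keeps environments predictable is a single base job definition plus small per-environment overrides, merged at deploy time. The sketch below shows the merge idea in plain Python (all names and values are illustrative); in Terraform, the same shape maps to a reusable module with per-environment variable files:

```python
# Sketch: merge a base job spec with per-environment overrides so every
# environment deploys from the same source of truth. Names are illustrative.
def merge(base, override):
    out = dict(base)
    for k, v in override.items():
        if isinstance(v, dict) and isinstance(out.get(k), dict):
            out[k] = merge(out[k], v)   # deep-merge nested sections
        else:
            out[k] = v                  # override scalars/lists wholesale
    return out

base_job = {
    "name": "ingest",
    "cluster": {"node_type": "Standard_DS3_v2", "num_workers": 2},
    "schedule": {"quartz_cron_expression": "0 0 * * * ?"},
}
prod_overrides = {"cluster": {"num_workers": 8}}   # only the deltas
prod_job = merge(base_job, prod_overrides)
```

Because each environment is "base plus a small, reviewable delta," drift between dev and prod is limited to what the override file says, which is the predictability the question is after.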
by re • New Contributor II
- 1440 Views
- 0 replies
- 0 kudos
While configuring Databricks, we've set the "default_catalog_name", which sets the default schema when users connect via an ODBC connection. While the naming isn't consistent, this does have one desired effect; that is, when users connect, it default...
by _YSF • New Contributor II
- 964 Views
- 1 replies
- 0 kudos
I want to alter the schema and basically point it to a new path in the data lake #UnityCatalog
Latest Reply
I don't think so. You can alter the owner and DBPROPERTIES using the `ALTER SCHEMA` command, but not the location. https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema