Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

jaganadhg
by New Contributor
  • 1723 Views
  • 0 replies
  • 0 kudos

Clean up Databricks confidential computing resources

Hello All, I created a Databricks Premium Workspace for a Confidential Computing PoC. After creating a VM from the Databricks UI, I noticed that there is a new RG with a managed identity, NAT Gateway, public IP, security group, and a VNET (/16). I w...

Administration & Architecture
Confidential Compute
PetePP
by New Contributor II
  • 1864 Views
  • 2 replies
  • 0 kudos

Extreme RocksDB memory usage

During migration to a production workload, I switched some queries to use RocksDB. I am concerned with its memory usage, though. Here is a sample output from my streaming query: "stateOperators" : [ { "operatorName" : "dedupeWithinWatermark", "...

Latest Reply
PetePP
New Contributor II
  • 0 kudos

Thank you for the input. Is there any particular reason why the deduplication watermark makes it store everything and not just the key needed for deduplication? The first record has to be written to the table anyway, and its content is irrelevant as it jus...
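
For context, a minimal PySpark sketch of the pattern being discussed, assuming a hypothetical rate source, a key column "id", and a 10-minute watermark (the source, column names, and interval are placeholders, not taken from the thread):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Opt in to the RocksDB state store provider (open-source provider class shown;
    # the provider class name may differ on Databricks Runtime).
    spark.conf.set(
        "spark.sql.streaming.stateStore.providerClass",
        "org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider",
    )

    # Hypothetical stream: rename the rate source's "value" column to the dedupe key.
    events = (
        spark.readStream.format("rate").load()
        .withColumnRenamed("value", "id")
    )

    # dropDuplicatesWithinWatermark (Spark 3.5+) keeps per-key state only until the
    # watermark passes; it is what shows up as the dedupeWithinWatermark operator in
    # the streaming query progress output quoted above.
    deduped = (
        events
        .withWatermark("timestamp", "10 minutes")
        .dropDuplicatesWithinWatermark(["id"])
    )

    query = (
        deduped.writeStream
        .format("memory")
        .queryName("dedupe_demo")
        .outputMode("append")
        .start()
    )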

1 More Replies
Bagger
by New Contributor II
  • 2860 Views
  • 0 replies
  • 0 kudos

Monitoring job metrics

Hi, We need to monitor Databricks jobs and we have made a setup where we are able to get the Prometheus metrics; however, we are lacking an overview of which metrics refer to what. Namely, we need to monitor the following: failed jobs: is a job failed; tabl...

Administration & Architecture
jobs
metrics
prometheus
Ajay3
by New Contributor
  • 2749 Views
  • 0 replies
  • 0 kudos

How can I install Maven coordinates using an init script?

Hi, I need to install the below Maven coordinates on the clusters using Databricks init scripts. 1. coordinate: com.microsoft.azure:synapseml_2.12:0.11.2 with repo https://mmlspark.azureedge.net/maven 2. coordinate: com.microsoft.azure:spark-mssql-conne...

nihar_ghude
by New Contributor II
  • 6847 Views
  • 2 replies
  • 0 kudos

How to change Workspace Owner?

Our Databricks workspace was created by a personal account. Now the person has left the organization. We would like to change the owner to a service account (preferably, else to an admin account). Questions: Is it possible to change the owner of the wor...

Administration & Architecture
admin
change owner
workspace owner
Latest Reply
Atanu
Databricks Employee
  • 0 kudos

Are you on AWS or Azure? When you say workspace admin, that could be many people; you can have multiple workspace admins.

1 More Replies
NadithK
by Contributor
  • 7043 Views
  • 2 replies
  • 1 kudos

Using a custom Hostname in Databricks CLI instead of per-workspace URL

Hi, At our organization, we have added a front-end Private Link connection to a Databricks workspace in Azure, and public access to the workspace is disabled. I am able to access the workspace UI with the private IP (in the browser), and able to call the...

Latest Reply
NadithK
Contributor
  • 1 kudos

Hi @Retired_mod, Thank you for the support. Really appreciate it. Thanks

1 More Replies
re
by New Contributor II
  • 1398 Views
  • 0 replies
  • 0 kudos

terraform/databricks setting default_catalog_name

While configuring Databricks, we've set the "default_catalog_name", which sets the default schema when users connect via an ODBC connection. While the naming isn't consistent, this does have one desired effect: when users connect, it default...

_YSF
by New Contributor II
  • 950 Views
  • 1 replies
  • 0 kudos

Can I edit the ADLSg2 storage location for a schema?

I want to alter the schema and basically point it to a new path in the data lake #UnityCatalog

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I don't think so. You can alter the owner and dbproperties using the ALTER SCHEMA command, but not the location. https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql-ref-syntax-ddl-alter-schema
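
As a quick illustration of that distinction, a sketch of what ALTER SCHEMA does allow, run from a notebook cell (the catalog, schema, and principal names are hypothetical):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

    # The owner and schema properties can be changed in place ...
    spark.sql("ALTER SCHEMA my_catalog.my_schema SET DBPROPERTIES ('purpose' = 'poc')")
    spark.sql("ALTER SCHEMA my_catalog.my_schema OWNER TO `data-admins`")

    # ... but, per the reply above, there is no equivalent for the storage location, so
    # pointing a schema at a new ADLS path would mean recreating it and migrating its tables.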

_YSF
by New Contributor II
  • 2931 Views
  • 0 replies
  • 0 kudos

Struggling with UC Volume Paths

I am trying to set up my volumes and give them paths in the data lake, but I keep getting this message: Input path url 'abfss://my-container@my-storage-account.dfs.core.windows.net/' overlaps with managed storage within 'CreateVolume' call. There WAS some...

abhaigh
by New Contributor III
  • 3683 Views
  • 0 replies
  • 0 kudos

Error: cannot create permissions: invalid character '<' looking for beginning of value

I'm trying to use Terraform to assign a cluster policy to an account-level group (synced from AAD via SCIM). My provider is configured like this: provider "databricks" { alias = "azure_account" host = "accounts.azuredatabricks.net" account_id = "%DATABRICKS...

paritoshsh
by New Contributor II
  • 2547 Views
  • 1 replies
  • 1 kudos

Resolved! Terraform Repos Git URL Allow List

Hi, I am provisioning Databricks workspaces using Terraform and want to add specific GitHub repo URLs that can be used. In the UI there is an option for that, but when it comes to Terraform there is nothing specific. I came across the custom_config option here ...

[Attachment: Screenshot 2023-08-08 174830.jpg]
Latest Reply
Amine
Databricks Employee
  • 1 kudos

Hello, This can normally be achieved using this Terraform resource:

resource "databricks_workspace_conf" "this" {
  custom_config = {
    "enableProjectsAllowList": true,
    "projectsAllowList": "url1,url2,url3",
  }
}

Cheers

ArjenSmedes
by New Contributor
  • 8640 Views
  • 0 replies
  • 0 kudos

Databricks workspace in our own VNET

We have set up a Databricks workspace in our own Azure VNET, including a private endpoint. Connecting to the WS works fine (through the private IP address). However, when creating my first cluster, I run into this problem: "ADD_NODES_FAILED...Failed to...

Palkers
by New Contributor III
  • 725 Views
  • 0 replies
  • 0 kudos

Data Marketplace private exchange

I want to use Data Marketplace, but only in private/local mode, so I don't want to publish any products outside my organization. I know I can create a private listing, but it can be done only from the provider console. I'm added to the marketplace role but not s...
