Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

_YSF
by New Contributor II
  • 3271 Views
  • 0 replies
  • 0 kudos

Struggling with UC Volume Paths

I am trying to set up my volumes and give them paths in the data lake, but I keep getting this message: Input path url 'abfss://my-container@my-storage-account.dfs.core.windows.net/' overlaps with managed storage within 'CreateVolume' call. There WAS some...

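For context, this LOCATION_OVERLAP error appears when the supplied abfss:// path overlaps with storage already registered as managed storage (for example the metastore root or a catalog/schema managed location). Below is a minimal, illustrative Terraform sketch of pointing an external volume at a dedicated, non-overlapping sub-path; the resource names come from the Databricks provider, but every path, catalog, schema, and credential name is a placeholder, not taken from the original post.

# Illustrative only: register a dedicated sub-path as an external location,
# then create an EXTERNAL volume under it. The URL must not overlap with the
# metastore's or any catalog/schema's managed storage root.
resource "databricks_external_location" "landing" {
  name            = "landing"
  url             = "abfss://my-container@my-storage-account.dfs.core.windows.net/landing"
  credential_name = "my_storage_credential"   # assumed existing storage credential
}

resource "databricks_volume" "raw_files" {
  name             = "raw_files"
  catalog_name     = "my_catalog"              # placeholder
  schema_name      = "my_schema"               # placeholder
  volume_type      = "EXTERNAL"
  storage_location = "${databricks_external_location.landing.url}/raw_files"
}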
abhaigh
by New Contributor III
  • 4086 Views
  • 0 replies
  • 0 kudos

Error: cannot create permissions: invalid character '<' looking for beginning of value

I'm trying to use Terraform to assign a cluster policy to an account-level group (synced from AAD via SCIM). My provider is configured like this: provider "databricks" { alias = "azure_account" host = "accounts.azuredatabricks.net" account_id = "%DATABRICKS...

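As a hedged aside (not confirmed as the cause here): an "invalid character '<'" error from the Databricks Terraform provider usually means the API answered with an HTML page instead of JSON, which commonly happens when a workspace-level call is sent to the accounts host. Cluster policy permissions are a workspace-level resource, so a sketch of granting CAN_USE to a SCIM-synced group through a workspace-scoped provider might look like the following; the host, group name, and policy ID are placeholders.

provider "databricks" {
  alias = "workspace"
  host  = "https://adb-1234567890123456.7.azuredatabricks.net"   # placeholder workspace URL
}

# The account group must already be assigned to the workspace for this lookup to succeed.
data "databricks_group" "analysts" {
  provider     = databricks.workspace
  display_name = "analysts"                                      # placeholder group name
}

resource "databricks_permissions" "policy_usage" {
  provider          = databricks.workspace
  cluster_policy_id = "ABC123DEF456"                             # placeholder policy ID
  access_control {
    group_name       = data.databricks_group.analysts.display_name
    permission_level = "CAN_USE"
  }
}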
paritoshsh
by New Contributor II
  • 3283 Views
  • 1 reply
  • 1 kudos

Resolved! Terraform Repos Git URL Allow List

Hi, I am provisioning Databricks workspaces using Terraform and want to add the specific GitHub repo URLs that can be used. In the UI there is an option for that, but when it comes to Terraform there is nothing specific. I came across the custom_config option here ...

(Attachment: Screenshot 2023-08-08 174830.jpg)
Latest Reply
Amine
Databricks Employee
  • 1 kudos

Hello, this can normally be achieved using this Terraform resource:

resource "databricks_workspace_conf" "this" {
  custom_config = {
    "enableProjectsAllowList": true,
    "projectsAllowList": "url1,url2,url3",
  }
}

Cheers

ArjenSmedes
by New Contributor
  • 9123 Views
  • 0 replies
  • 0 kudos

Databricks workspace in our own VNET

We have set up a Databricks workspace in our own Azure VNET, including a private endpoint. Connecting to the WS works fine (through the private IP address). However, when creating my first cluster, I run into this problem: "ADD_NODES_FAILED...Failed to...

Palkers
by New Contributor III
  • 934 Views
  • 0 replies
  • 0 kudos

Data Marketplace private exchange

I want to use Data Marketplace, but only in private/local mode, so I don't want to publish any products outside my organization. I know I can create a private listing, but it can be done only from the provider console. I'm added to the marketplace role but not s...

NadithK
by Contributor
  • 6244 Views
  • 1 reply
  • 2 kudos

Assigning Databricks Account Admin role to User group

At my current organization, we have a few users with the Databricks Account admin role assigned. But as per our company policy, individual users should not be given such elevated privileges. They should be given to user groups so that users in those ...

Latest Reply
NadithK
Contributor
  • 2 kudos

Hi Kaniz, thank you for the feedback. I was able to find the solution below in the article you mentioned: https://docs.databricks.com/en/administration-guide/users-groups/groups.html#assign-account-admin-roles-to-a-group. Seems we can use Databrick...

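For illustration, the approach in the linked article can reportedly also be expressed in Terraform with the databricks_group_role resource against the account-level provider. This is a sketch under that assumption; the account ID and group name are placeholders, and the accepted role value should be verified against the provider docs for your version.

provider "databricks" {
  alias      = "account"
  host       = "https://accounts.azuredatabricks.net"
  account_id = "00000000-0000-0000-0000-000000000000"   # placeholder account ID
}

resource "databricks_group" "platform_admins" {
  provider     = databricks.account
  display_name = "platform-admins"                      # placeholder; use a data source if the group is SCIM-synced
}

resource "databricks_group_role" "account_admin" {
  provider = databricks.account
  group_id = databricks_group.platform_admins.id
  role     = "account_admin"
}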
jrosend
by New Contributor III
  • 1704 Views
  • 2 replies
  • 1 kudos

Global search programmatically

Hi! In the workspace header, there is a search box that allows us to look for text in all notebooks in the workspace. Is there a way via CLI or API to call the global search (https://<workspace-domain>/graphql/SearchGql) so the result can be analysed a...

Latest Reply
-werners-
Esteemed Contributor III
  • 1 kudos

If you checked the notebooks into a git repo, searching the git repo (or its API) might save you.

1 More Replies
Xyguo
by New Contributor
  • 2067 Views
  • 1 reply
  • 1 kudos

Resolved! REST API workspace list content doesn't work with Queries

Hi, I'm trying to export the SQL queries in certain folders in the workspace, but the list content API call GET /api/2.0/workspace/list doesn't work with queries. How should I export only the queries in a certain folder in the workspace? Thank you very much.

(Attachment: Xyguo_0-1691335659836.png)
Latest Reply
shan_chandra
Databricks Employee
  • 1 kudos

@Xyguo - Currently, exporting a SQL query file is not supported. Kindly create an idea/feature request by following the article listed (https://docs.databricks.com/en/resources/ideas.html#create-an-idea-in-the-ideas-portal) to raise a feature reques...

milicevica
by New Contributor
  • 8482 Views
  • 2 replies
  • 1 kudos

Resolved! Multiple orphan VMs in managed resource group after starting and terminating my personal cluster

Hi, today I had a problem starting the cluster, so I started and terminated it multiple times. The problem is that this action always started one new VM in the managed resource group but never turned it off. Therefore I ended up with mul...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 1 kudos

It always takes a few minutes, even 10, before the machine is terminated in Azure. It is better to set up a pool of machines in Databricks and use them from there, so we will not be keeping zombie machines.

1 More Replies
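To illustrate the pool suggestion above: a rough Terraform sketch of an instance pool with idle auto-termination and a cluster drawing from it. The node type, runtime version, and sizes are placeholders, not recommendations.

resource "databricks_instance_pool" "small_pool" {
  instance_pool_name                    = "small-nodes"
  min_idle_instances                    = 0
  max_capacity                          = 10
  node_type_id                          = "Standard_DS3_v2"      # placeholder Azure node type
  idle_instance_autotermination_minutes = 15
}

resource "databricks_cluster" "personal" {
  cluster_name            = "personal-cluster"
  spark_version           = "13.3.x-scala2.12"                   # placeholder runtime
  instance_pool_id        = databricks_instance_pool.small_pool.id
  num_workers             = 1
  autotermination_minutes = 30
}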
Hubert-Dudek
by Esteemed Contributor III
  • 2773 Views
  • 0 replies
  • 1 kudos

Databricks Enhances Job Monitoring with Duration Thresholds for Workflows

Databricks has introduced Duration Thresholds for workflows. This new addition allows users to set time limits for workflow execution, significantly improving monitoring of job performance. When a job exceeds the preset duration, the system triggers ...

(Attachment: ezgif-4-8ab9474b98.gif)
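For readers managing jobs as code, a duration threshold like the one described can also be declared in Terraform via the databricks_job health block. This is a hedged sketch in which the task, cluster ID, notification address, and the 3600-second limit are all placeholders.

resource "databricks_job" "nightly_load" {
  name = "nightly-load"

  task {
    task_key            = "load"
    existing_cluster_id = "0000-000000-placeholder"              # placeholder cluster ID
    notebook_task {
      notebook_path = "/Workspace/jobs/nightly_load"             # placeholder notebook
    }
  }

  health {
    rules {
      metric = "RUN_DURATION_SECONDS"
      op     = "GREATER_THAN"
      value  = 3600                                              # warn once a run exceeds one hour
    }
  }

  email_notifications {
    on_duration_warning_threshold_exceeded = ["ops@example.com"] # placeholder address
  }
}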
vasu2
by New Contributor
  • 908 Views
  • 0 replies
  • 0 kudos

Multiple bronze layers: multi-client data

Data is coming in from multiple sources belonging to multiple customers. Should I create a single storage account, name it bronze, and put each client's data in a separate container, then merge the data in the silver layer? What's the best p...

niklas
by Contributor
  • 6423 Views
  • 1 reply
  • 0 kudos

Resolved! Can't set account admin using Terraform

I want to set the account admin role for a service principal in order to create the Unity Catalog metastore. The Terraform code looks like this: data "databricks_service_principal" "application" { count = var.environment == "dev" ? 1 : 0 application_...

Administration & Architecture
access
account_admin
azure
role
Terraform
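One pattern reported for this (not verified here, and worth checking against the provider docs for your version) is to pair the service principal data source with databricks_service_principal_role on the account-level provider. A sketch with placeholder IDs:

data "databricks_service_principal" "application" {
  provider       = databricks.account                            # assumed account-level provider alias
  application_id = "00000000-0000-0000-0000-000000000000"        # placeholder application ID
}

resource "databricks_service_principal_role" "account_admin" {
  provider             = databricks.account
  service_principal_id = data.databricks_service_principal.application.id
  role                 = "account_admin"                         # assumed accepted value; verify before use
}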
shawnbarrick
by New Contributor III
  • 5701 Views
  • 2 replies
  • 0 kudos

INVALID_PARAMETER_VALUE.LOCATION_OVERLAP: overlaps with managed storage error with S3 paths

We're trying to read from an S3 bucket using Unity Catalog and are selectively getting "INVALID_PARAMETER_VALUE.LOCATION_OVERLAP: overlaps with managed storage" errors within the same bucket. This works: dbutils.fs.ls("s3://BUCKETNAME/dev/he...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, could you please elaborate on the issue here? Running the list command on a managed directory is not supported in Unity Catalog; catalog/schema storage locations are reserved for managed storage. Please tag @Debayan with your next comment, which wi...

1 More Replies
DianGermishuize
by New Contributor II
  • 1935 Views
  • 1 reply
  • 0 kudos

Bitbucket Repo Add Assistance

I am trying to add a Bitbucket Cloud repo to my workspace, but keep getting a "credentials invalid" error. I made sure to set a username and generate an app password with repo read/write permissions. What am I doing wrong?

Latest Reply
Harrison_S
Databricks Employee
  • 0 kudos

Hello, did you also set the password in User Settings > Git integration? If so, can you paste the full error?

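If the workspace is managed with Terraform, the same Git credential can also be set programmatically with the databricks_git_credential resource. A sketch with placeholder values; the Bitbucket app password goes into personal_access_token.

variable "bitbucket_app_password" {
  type      = string
  sensitive = true
}

resource "databricks_git_credential" "bitbucket" {
  git_provider          = "bitbucketCloud"
  git_username          = "my-bitbucket-username"                # placeholder username
  personal_access_token = var.bitbucket_app_password
}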