Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

cleversuresh
by New Contributor II
  • 485 Views
  • 0 replies
  • 0 kudos

Vocareum lab showing no workspace found

Hi, I bought the Generative AI Engineering with Databricks lab subscription by paying $75 just to get the hands-on experience. The labs worked for one day and then they stopped working. Please activate the labs ASAP, I am preparing for the certification exam an...

PetrSindelar
by New Contributor II
  • 4296 Views
  • 3 replies
  • 2 kudos

Connecting external location from a different tenant in Azure

Hi, we have a setup with 2 different Azure tenants. In tenant A we have a storage account that we want to connect as an external location to a Databricks workspace in tenant B. For that we have established a private endpoint from the storage accou...

Latest Reply
Behwar
New Contributor II
  • 2 kudos

It would be superb to be able to connect between two tenants with the Azure Databricks Access Connector.

2 More Replies
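Building on Behwar's reply: once a storage credential (for example one backed by an Azure Databricks Access Connector) can actually reach the storage account, registering it and the external location in Unity Catalog is only a few calls. Below is a minimal sketch using the Databricks SDK for Python; the resource ID, names and abfss URL are placeholders, and the class and field names reflect my understanding of recent databricks-sdk versions, so treat them as assumptions rather than a verified recipe for the cross-tenant case discussed here.

```python
# Sketch: register an Access Connector-backed storage credential and an
# external location with the Databricks SDK for Python. All names, the
# resource ID and the abfss URL are placeholders, not values from this thread.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()  # auth from environment variables or ~/.databrickscfg

# Storage credential backed by the workspace's Access Connector (managed identity).
cred = w.storage_credentials.create(
    name="tenant_a_storage_cred",
    azure_managed_identity=catalog.AzureManagedIdentityRequest(
        access_connector_id=(
            "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
            "Microsoft.Databricks/accessConnectors/<connector-name>"
        )
    ),
)

# External location pointing at the container in the other tenant's storage account.
loc = w.external_locations.create(
    name="tenant_a_landing",
    url="abfss://<container>@<storage-account>.dfs.core.windows.net/landing",
    credential_name=cred.name,
)
print(loc.url, loc.credential_name)
```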
TheManOfSteele
by New Contributor III
  • 1061 Views
  • 1 reply
  • 0 kudos

Resolved! Prevent service principal UUID from appearing on job name

Hello! I am using a service principal ID to authenticate my Databricks bundle. But when the job runs, this ID is automatically appended to both the name and tags columns on the job runs page. In my databricks.yml file I have name: "[${var.environment}]" ...

Latest Reply
breaka
New Contributor III
  • 0 kudos

Hi! Sounds like the "development" mode. DAB will automatically prefix your job name with <env> <user name> if you set "mode" to "development" in the databricks.yml file. The name lookup for service principals apparently doesn't work nicely and you get ...

Maria_S
by New Contributor III
  • 1395 Views
  • 4 replies
  • 5 kudos

Why all workspace users can see my user folder

Hi, I am a Databricks account admin user with admin access to our workspace. My user folder for some reason is visible to all workspace users. I have checked permissions settings where possible and cannot see anything that would indicate fully shared...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 5 kudos

The workspace is visible to all by default. You have to make changes in the Admin console; you will find the feature to disable it there.

3 More Replies
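For anyone debugging the same question, it can help to look at the folder's actual access control list before (or after) changing the admin console setting Aviral-Bhardwaj points to. A small sketch with the Databricks SDK for Python; the path is a placeholder, and the permissions call for workspace "directories" objects is my assumption about the SDK surface, not a verified fix.

```python
# Sketch: list who has permissions on a user folder. The path is a placeholder
# and the workspace permissions API usage is an assumption.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

folder = w.workspace.get_status("/Users/someone@example.com")  # placeholder path
perms = w.workspace.get_permissions(
    workspace_object_type="directories",
    workspace_object_id=str(folder.object_id),
)
for acl in perms.access_control_list or []:
    principal = acl.user_name or acl.group_name or acl.service_principal_name
    levels = [p.permission_level for p in (acl.all_permissions or [])]
    print(principal, levels)
```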
TheManOfSteele
by New Contributor III
  • 1191 Views
  • 1 reply
  • 1 kudos

Resolved! How to Pass Azure variable to databricks.yml file

Hello, I would like to find a way to pass a variable from my Azure variables to my databricks.yml file. For example, I would like to pass the variable BUNDLE_TARGET to the location in this databricks.yml file. Is there a way to do something like this?...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @TheManOfSteele, here are some examples of how to achieve that. I think the simplest way would be to set environment variables. See "How can I pass parameters to databricks.yml in Databricks Asset Bundles?" on Stack Overflow and Databricks Asset ...

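To make the environment-variable route concrete: Databricks Asset Bundles read variable values from environment variables named BUNDLE_VAR_<variable_name>. The sketch below drives the CLI from Python purely for illustration; in an Azure DevOps task you would export the same variable in the step's env block instead. The variable name bundle_target is hypothetical and assumes databricks.yml declares a matching variable.

```python
# Sketch: pass a value into a databricks.yml variable via the BUNDLE_VAR_
# naming convention. Assumes databricks.yml declares a variable named
# "bundle_target" (hypothetical) and that the Databricks CLI is on PATH.
import os
import subprocess

env = os.environ.copy()
env["BUNDLE_VAR_bundle_target"] = "dev"  # surfaces as ${var.bundle_target} in databricks.yml

subprocess.run(
    ["databricks", "bundle", "validate", "--target", "dev"],
    check=True,
    env=env,
)
```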
TheManOfSteele
by New Contributor III
  • 2156 Views
  • 0 replies
  • 0 kudos

Azure Pipeline Release Bundle Validate Failure

Hello, I am trying to create a CI/CD pipeline. After the build pipeline, I am trying to create a release to Databricks that runs a notebook. I am trying to run this as a service principal. During the bundle validate step I am getting this error: "Erro...

gor
by New Contributor III
  • 1274 Views
  • 3 replies
  • 0 kudos

Resolved! Workspace selector not working

On the top-right of the Databricks GUI is a selector to select workspaces from. Since yesterday morning (approximately the same time the Microsoft outage happened) that selector stopped working. Instead of a dropdown of workspaces, we only get a spin...

Latest Reply
NandiniN
Databricks Employee
  • 0 kudos

Glad to know the issue has stopped occurring, but in case it re-occurs in the future, we could collect some backend logs while the issue is happening to understand what is causing the slowness and resolve the service. I would request you to involve the Azure team...

2 More Replies
AsgerLarsen
by New Contributor III
  • 3289 Views
  • 2 replies
  • 0 kudos

Azure Network Settings in regards to Databricks Table Monitoring

I have set up my Unity Catalog on an Azure Data Lake which uses the company's virtual network to allow access. I have all privileges on my account, so I am able to create, alter or delete catalogs, schemas and tables. I can do these things either usin...

Latest Reply
AsgerLarsen
New Contributor III
  • 0 kudos

Hi Kaniz, thanks for the response and for identifying the problem. I would like some steps on how to adjust the network settings, as everything I have tried so far hasn't seemed to work.

1 More Replies
rhammonds1
by New Contributor
  • 815 Views
  • 1 reply
  • 0 kudos

databricks-connect 14.3 spark error against 14.3 cluster with data_security_mode = NONE

I am running into an issue with trying to use a 14.3 cluster with databricks-connect 14.3. My cluster config: { "autoscale": { "min_workers": 2, "max_workers": 10 }, "cluster_name": "Developer Cluster", "spark_version": "14.3.x-scala2...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Are you running the latest version of Databricks Connect?

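Following up on Walter_C's question, here is a quick way to confirm which databricks-connect version is installed and to open a session against a specific cluster. Host, token and cluster ID are placeholders; this is a generic connectivity sketch, not a confirmed fix for the data_security_mode error above.

```python
# Sketch: check the installed databricks-connect version and open a remote
# Spark session. The host, token and cluster_id values are placeholders.
from importlib.metadata import version

print("databricks-connect", version("databricks-connect"))

from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.remote(
    host="https://<workspace-url>",
    token="<personal-access-token>",
    cluster_id="<cluster-id>",
).getOrCreate()

print(spark.range(5).count())  # simple sanity check against the remote cluster
```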
gabo2023
by New Contributor III
  • 2612 Views
  • 2 replies
  • 3 kudos

Unable to read resources - Unsupported Protocol Scheme (Terraform AWS)

Hello everyone! Over the last few weeks my company has been trying to deploy a Databricks workspace on AWS adapted to the customer's needs, using Terraform. To do this, we started from base code in Databricks' own GitHub (https://github.com/databrick...

Latest Reply
meeran007
New Contributor II
  • 3 kudos

What's the solution for this? I'm facing the same issue.

1 More Replies
pawelzak
by New Contributor III
  • 2723 Views
  • 2 replies
  • 0 kudos

Databricks dashboard programmatically

Hi, how can I create a Databricks dashboard, filters and visuals programmatically (API, Terraform, SDK, CLI, ...)? Thanks, Pawel

Latest Reply
marcin-sg
New Contributor III
  • 0 kudos

Maybe slightly late (maybe because development was late :P), but hopefully it will also help others. 1. There seems to be support added in the newest Terraform Databricks provider, 1.49.0 (see here). 2. Another solution would be to use the Databricks CLI (e.g. `d...

1 More Replies
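Besides the Terraform provider and CLI routes marcin-sg mentions, a dashboard can also be created over REST. A minimal sketch, assuming the Lakeview endpoint /api/2.0/lakeview/dashboards and its display_name / parent_path / serialized_dashboard fields; the endpoint, field names and the tiny dashboard payload are assumptions, not a definitive contract.

```python
# Sketch: create a Lakeview dashboard via REST. The endpoint, field names and
# payload shape are assumptions; host and token are placeholders.
import json
import requests

HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"

dashboard_spec = {
    "datasets": [],  # dataset queries backing the visuals would go here
    "pages": [{"name": "page1", "displayName": "Overview", "layout": []}],
}

resp = requests.post(
    f"{HOST}/api/2.0/lakeview/dashboards",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "display_name": "Programmatic dashboard",
        "parent_path": "/Workspace/Users/someone@example.com",
        "serialized_dashboard": json.dumps(dashboard_spec),
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json().get("dashboard_id"))
```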
Schofield
by New Contributor III
  • 655 Views
  • 1 reply
  • 2 kudos

List deployed Databricks asset bundles (DABs)?

Is there a Databricks CLI command or REST API to list all the DABs that have been deployed to a workspace?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @Schofield, unfortunately I don't think there is an out-of-the-box command that will provide you this information yet. As a workaround, you can try to write some code that extracts this information from the REST API. For example, you can use /api/2.1/j...

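To sketch the workaround szymon_dybczak describes: page through /api/2.1/jobs/list and keep the jobs whose settings carry a bundle deployment block. The deployment/kind fields are how bundle-deployed jobs appear to be marked; treat those field names, and the host/token placeholders, as assumptions.

```python
# Sketch: approximate "list deployed bundles" by scanning the Jobs API for jobs
# marked as bundle deployments. Field names are assumptions; host and token are
# placeholders.
import requests

HOST = "https://<workspace-url>"
TOKEN = "<personal-access-token>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

bundle_jobs = []
page_token = None
while True:
    params = {"limit": 100}
    if page_token:
        params["page_token"] = page_token
    resp = requests.get(f"{HOST}/api/2.1/jobs/list", headers=HEADERS, params=params, timeout=30)
    resp.raise_for_status()
    body = resp.json()
    for job in body.get("jobs", []):
        deployment = job.get("settings", {}).get("deployment", {})
        if deployment.get("kind") == "BUNDLE":
            bundle_jobs.append((job["job_id"], job["settings"].get("name")))
    page_token = body.get("next_page_token")
    if not page_token:
        break

for job_id, name in bundle_jobs:
    print(job_id, name)
```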
lizou8
by New Contributor III
  • 880 Views
  • 2 replies
  • 2 kudos

Resolved! delta sharing issue after enable predictive optimization

Some of our Delta Sharing tables are not working. It may be related to this, or maybe not: we enabled predictive optimization on all tables a few days ago and they are not working any more, but any new tables created work fine after setting this: SET TBLPROPERTIES (de...

Latest Reply
lizou8
New Contributor III
  • 2 kudos

After some debugging, I found out a very unusual cause: we used a JSON string in a column comment, and it turns out that a JSON string in a column comment breaks Delta Sharing. Example: column COMMENT {"key": "primary_key", "is_identity": "true"}. The erro...

1 More Replies
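If anyone else hits the same thing, the workaround that follows from lizou8's diagnosis is simply to replace the JSON-valued column comment with plain text. A small PySpark sketch; the catalog, schema, table and column names are placeholders.

```python
# Sketch: overwrite a JSON-valued column comment with plain text so the table
# can be shared again. Table and column names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Inspect current column comments.
spark.sql("DESCRIBE TABLE main.analytics.orders").show(truncate=False)

# Replace the offending JSON comment with a plain string.
spark.sql(
    "ALTER TABLE main.analytics.orders "
    "ALTER COLUMN order_id COMMENT 'primary key, identity column'"
)
```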
JonLaRose
by New Contributor III
  • 3213 Views
  • 2 replies
  • 1 kudos

Delta Lake S3 multi-cluster writes - DynamoDB

Hi there! I'm trying to figure out how the multi-writer architecture for Delta Lake tables is implemented under the hood. I understand that a DynamoDB table is used to provide mutual exclusion, but the question is: where is the table located? Is it in...

Latest Reply
prem14f
New Contributor II
  • 1 kudos

Hi, could you please help me here? How can I use this configuration in Databricks, so that I maintain my transaction logs there and, in parallel, can use the delta-rs job? spark.conf.set("spark.delta.logStore.s3a.impl", "io.delta.storage.S3Dynam...

1 More Replies
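To round out the configuration prem14f started quoting, this is the shape of the open-source Delta Lake multi-cluster S3 setup: the DynamoDB table lives in your own AWS account, in the region and under the name you configure, and it is created automatically if it does not exist. The values below are illustrative, these settings normally belong in the cluster's Spark config rather than runtime spark.conf.set calls, and as far as I understand Databricks clusters handle S3 multi-cluster commits through the platform's own commit service rather than this LogStore.

```python
# Sketch: open-source Delta Lake multi-cluster S3 write configuration using
# S3DynamoDBLogStore. Table name and region are illustrative placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.delta.logStore.s3a.impl", "io.delta.storage.S3DynamoDBLogStore")
    .config("spark.delta.logStore.s3.impl", "io.delta.storage.S3DynamoDBLogStore")
    # The DynamoDB table is created in your own account/region if it is missing.
    .config("spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName", "delta_log")
    .config("spark.io.delta.storage.S3DynamoDBLogStore.ddb.region", "us-east-1")
    .getOrCreate()
)
```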

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group