Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

robert-moyai
by New Contributor II
  • 915 Views
  • 3 replies
  • 2 kudos

Databricks OAuth: User-based OAuth (U2M) Databricks Connect in Apps

I'm looking to use a Databricks session in a Databricks app. The session should be able to use user-based OAuth (U2M) so the app has the same privileges as the authenticated user using it. Databricks apps have the ability to use th...

Latest Reply
robert-moyai
New Contributor II
  • 2 kudos

Thanks for your response and the links. However, the documentation doesn't explicitly explain why Spark Connect has been placed out of scope, or what app builders should use to implement proper data governance with on-behalf-of-user permissions.

2 More Replies
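The on-behalf-of-user pattern discussed in this thread can be sketched as follows. This assumes the Databricks Apps proxy forwards the signed-in user's token in an `X-Forwarded-Access-Token` header (verify against the current Apps documentation); the helper only extracts the token, and the commented lines show where it would feed a Databricks Connect session.

```python
def user_token_from_headers(headers):
    """Return the end user's OAuth token forwarded by the Databricks Apps
    proxy, or None if the request carries no such header.

    Header names are matched case-insensitively, since HTTP headers are
    case-insensitive and frameworks normalize them differently.
    """
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("x-forwarded-access-token")


# Inside a request handler, the extracted token could then back a session
# that runs with the *user's* privileges rather than the app's, e.g.:
#
#   from databricks.connect import DatabricksSession
#   spark = (DatabricksSession.builder
#            .host(workspace_url)   # hypothetical variable
#            .token(user_token_from_headers(request.headers))
#            .getOrCreate())
```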
trailblazer
by New Contributor III
  • 3065 Views
  • 5 replies
  • 3 kudos

Resolved! Connecting Azure Databricks to a firewall-enabled Azure storage account

Hi, I am trying to connect securely from an Azure Databricks workspace to an Azure Gen2 storage account. The storage account is set up with these options: 1. Enabled from selected virtual networks and IP addresses (we whitelisted a few IPs) 2. Added Microsoft.Dat...

Latest Reply
mkkao924
New Contributor II
  • 3 kudos

I am having the exact issue as @trailblazer: if I enable traffic from all networks, I can read/write to the storage account; if I only allow selected networks, including the VNet, then it doesn't work. I am using a serverless setup. I also followed the firewall ...

4 More Replies
juan_maedo
by New Contributor III
  • 1166 Views
  • 1 reply
  • 1 kudos

Resolved! Job Notifications specifically on Succeeded with Failures

Hi everyone, I have a set of jobs that always execute the last task regardless of whether the previous ones failed (using the 'All done' execution dependency). When moving to production and wanting to enable notifications, there is no option to ...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

Databricks does not provide a direct way to distinguish or send notifications specifically for a "Succeeded with failures" state at the job level—the job is classified as "Success" even when some upstream tasks have failed, if the last (leaf) task is...

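mark_ott's point can be made concrete with a small classifier: given the terminal states of a run's tasks (leaf task wired with the 'All done' dependency), the job-level result hides upstream failures unless you inspect per-task states yourself, for example in a final notification task that calls the Jobs API. The state names mirror the Jobs API's result_state values, but the function itself is an illustration, not Databricks behavior.

```python
def classify_run(upstream_states, leaf_state):
    """Mimic how a job with an always-run leaf task surfaces its result.

    The job-level state follows the leaf task, so upstream failures are
    only visible by examining the per-task states.
    """
    if leaf_state != "SUCCESS":
        return "FAILED"
    if any(s != "SUCCESS" for s in upstream_states):
        return "SUCCEEDED_WITH_FAILURES"  # not a real Jobs API state
    return "SUCCESS"
```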
MiriamHundemer
by New Contributor III
  • 832 Views
  • 2 replies
  • 3 kudos

Resolved! Error when trying to destroy databricks_permissions with OpenTofu

Hi, In our company's project we created a databricks_user for a service account (which is needed for our deployment process) via OpenTofu, and afterwards adjusted permissions on that "user's" user folder using the databricks_permissions resource. resour...

Latest Reply
NandiniN
Databricks Employee
  • 3 kudos

Hi @MiriamHundemer , The issue occurs because the owner of the home folder (in this case, the databricks_user.databricks_deployment_sa service account) often has an unremovable CAN_MANAGE permission on its own home directory. When OpenTofu attempts t...

1 More Replies
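NandiniN's explanation suggests a workaround when managing such permissions declaratively: exclude the home-folder owner's implicit CAN_MANAGE entry from the access-control list you hand to Terraform/OpenTofu, so a destroy never tries to revoke it. A minimal sketch (the entry shape loosely mirrors databricks_permissions access_control blocks; field names here are illustrative):

```python
def managed_acls(acls, folder_owner):
    """Drop the home-folder owner's CAN_MANAGE entry before handing the
    list to Terraform/OpenTofu, since that grant cannot be removed."""
    return [
        entry for entry in acls
        if not (entry["principal"] == folder_owner
                and entry["permission_level"] == "CAN_MANAGE")
    ]
```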
Hil
by New Contributor III
  • 865 Views
  • 4 replies
  • 1 kudos

Resolved! Deploy Databricks workspace on Azure with Terraform - failed state: legacy access

I'm trying to deploy a workspace on Azure via Terraform and I'm getting the following error: "INVALID_PARAMETER_VALUE: Given value cannot be set for workspace~<id>~default_namespace_ws~ because: cannot set default namespace to hive_metastore since leg...

Latest Reply
Hil
New Contributor III
  • 1 kudos

I found the issue: the setting "Automatically assign workspaces to this metastore" was checked. Unchecking this and manually assigning the metastore worked.

3 More Replies
APJESK
by New Contributor III
  • 459 Views
  • 1 reply
  • 1 kudos

Clarification on Unity Catalog Metastore - Metadata and storage

Where does the Unity Catalog metastore metadata actually reside? Is it stored and managed in the Databricks account (control plane)? Or does it get stored in the customer-managed S3 bucket when we create a bucket for the Unity Catalog metastore? I want to c...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

@APJESK Replied here https://community.databricks.com/t5/data-governance/clarification-on-unity-catalog-metastore-metadata-and-storage/td-p/133389 

xavier_db
by New Contributor III
  • 634 Views
  • 3 replies
  • 3 kudos

Resolved! Databricks GCP login with company account

I created a Gmail account on my company account and used it to log in to GCP Databricks. Until yesterday it was working fine; yesterday I logged in to the Gmail account and it asked for another Gmail ID, so I provided a new one, but today I am not able to log in usi...

Latest Reply
Advika
Community Manager
  • 3 kudos

Hello @xavier_db! Were you able to get this login issue resolved? If yes, it would be great if you could share what worked for you so others facing the same problem can benefit as well.

2 More Replies
tabasco
by New Contributor III
  • 724 Views
  • 4 replies
  • 6 kudos

Resolved! Is there a way to register S3 compatible tables?

Hi everyone, I have successfully registered AWS S3 tables in Unity Catalog, but I would like to register S3-compatible ones as well. However, to create an EXTERNAL LOCATION in Unity Catalog, it seems I must register a credential. But the only supported credentia...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 6 kudos

Hey @tabasco, did you check out the External Table documentation in the Databricks AWS docs? External Locations: https://docs.databricks.com/aws/en/sql/language-manual/sql-ref-external-locations Credentials: https://docs.databricks.com/aws/en/sql/language-m...

3 More Replies
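The documented flow in those links boils down to two statements: register a storage credential, then point an external location at it. A sketch that assembles the CREATE EXTERNAL LOCATION statement (names and the URL are placeholders; whether your S3-compatible endpoint is accepted depends on the credential types your workspace supports):

```python
def create_external_location_sql(name, url, credential_name):
    """Build the SQL to register an external location backed by an
    existing storage credential, as in the docs linked above."""
    return (
        f"CREATE EXTERNAL LOCATION IF NOT EXISTS `{name}` "
        f"URL '{url}' "
        f"WITH (STORAGE CREDENTIAL `{credential_name}`)"
    )


# Example (placeholder names):
#   spark.sql(create_external_location_sql("lake", "s3://bucket/prefix", "my_cred"))
```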
noorbasha534
by Valued Contributor II
  • 1232 Views
  • 1 reply
  • 1 kudos

Workload identity federation policy

Dear all, Can I create a single workload identity federation policy for all DevOps pipelines? Our set-up: we have code version-controlled in GitHub repos, and we currently use Azure DevOps pipelines to authenticate with Databricks via a service principal, and d...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @noorbasha534, in the docs they give the following example of subject requirements for Azure DevOps. The subject (sub) claim must uniquely identify the workload, so as long as all of your pipelines reside in the same organization, same project ...

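szymon_dybczak's point about the sub claim can be illustrated: Azure DevOps issues OIDC tokens whose subject conventionally follows sc://&lt;organization&gt;/&lt;project&gt;/&lt;service-connection&gt;, so one federation policy can only cover pipelines whose tokens share that subject (or match however the policy is scoped). The helper below just assembles that string; confirm the exact claim format against a real token from your pipeline.

```python
def ado_oidc_subject(organization, project, service_connection):
    """Assemble the conventional Azure DevOps OIDC `sub` claim for a
    service connection (verify against an actual token's claims)."""
    return f"sc://{organization}/{project}/{service_connection}"
```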
ashokz
by New Contributor II
  • 2639 Views
  • 6 replies
  • 9 kudos

Resolved! Is it possible to restore a deleted catalog and schema?

Is it possible to restore a deleted catalog and schema? If CASCADE is used, the catalog will be dropped even though schemas and tables are present in it. Is it possible to restore the catalog, or to restrict the use of the CASCADE command? Thank you.

Latest Reply
immassiv
New Contributor III
  • 9 kudos

@Louis_Frolio I cannot click any "Accept as Solution" button, as I was not the one who created the post, I believe.

5 More Replies
tana_sakakimiya
by Contributor
  • 647 Views
  • 1 reply
  • 1 kudos

Resolved! Dev/Prod Environments in AWS: Separate Accounts vs. Separate Workspaces?

Hello everyone, I'm looking for some advice on best practices. When setting up development and production environments for Databricks on AWS, is it better to use completely separate AWS accounts, or is it sufficient to use separate workspaces within a...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @tana_sakakimiya, allow me to copy and paste a brilliant answer to a similar question provided by user Isi: "Option A: Multiple Databricks accounts and multiple AWS accounts. This model offers the highest level of isolation. Each environment lives in...

Ninad-Mulik
by New Contributor III
  • 4672 Views
  • 3 replies
  • 4 kudos

Resolved! Enable the "Billable Usage Download" Accounts API on our Azure

Hi Databricks Support, could you please confirm whether the Billable Usage Download endpoint can be enabled on an Azure Databricks account, similar to how it's available on AWS/GCP? If yes, what are the steps that should be followed? Else ...

Latest Reply
Ninad-Mulik
New Contributor III
  • 4 kudos

Thank you! I finally got to know what to do!

2 More Replies
chloe_nm
by New Contributor
  • 3811 Views
  • 1 reply
  • 1 kudos

Mismatched CUDA/cuDNN versions on Databricks Runtime GPU ML versions

I have a cluster on Databricks with configuration Databricks Runtime Version 16.4 LTS ML Beta (includes Apache Spark 3.5.2, GPU, Scala 2.12), and another cluster with configuration 16.0 ML (includes Apache Spark 3.5.2, GPU, Scala 2.12). According to...

Latest Reply
lin-yuan
Databricks Employee
  • 1 kudos

There could be library-related conflicts in 16.0 ML that were fixed in 16.4 ML. I would always recommend using the LTS version. Thanks

Kaitsu
by New Contributor
  • 483 Views
  • 2 replies
  • 1 kudos

Databricks JDBC driver fails when an unsupported property is submitted

Hello! The Databricks JDBC driver applies a property that is not supported by the connector as a Spark server-side property for the client session. How can I avoid that? With some tools I do not have 100% control, e.g. they may add a custom JDBC connec...

Latest Reply
NandiniN
Databricks Employee
  • 1 kudos

Hi @Kaitsu, the documentation mentions: if you specify a property that is not supported by the connector, then the connector attempts to apply the property as a Spark server-side property for the client session. Unlike many other JDBC drivers that ...

1 More Replies
NadithK
by Contributor
  • 4078 Views
  • 2 replies
  • 2 kudos

Pre-loading docker images to cluster pool instances still requires docker URL at cluster creation

I am trying to pre-load a Docker image to a Databricks cluster pool instance. As per this article, I used the REST API to create the cluster pool and defined a custom Azure container registry as the source for the Docker images: https://learn.microsoft....

Latest Reply
krupakar1329
New Contributor II
  • 2 kudos

@NadithK Pre-loading Docker images to cluster pool instances is for performance optimization (faster cluster startup), but you still must specify the Docker image in your cluster configuration. The pre-loading doesn't eliminate the requirement to dec...

1 More Replies
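krupakar1329's answer can be summarized in the two payloads involved: the pool pre-loads the image, and the cluster must still name the same image. A sketch of both request bodies (registry URL, names, node type, and Spark version are placeholders; the field names follow the Instance Pools and Clusters APIs as I understand them, so check them against the current REST reference):

```python
def pool_payload(image_url):
    """Instance pool spec that pre-pulls a Docker image onto idle instances."""
    return {
        "instance_pool_name": "gpu-pool",       # placeholder
        "node_type_id": "Standard_DS3_v2",      # placeholder
        "preloaded_docker_images": [{"url": image_url}],
    }


def cluster_payload(image_url, pool_id):
    """Cluster spec drawing from the pool. The image must be repeated here:
    pre-loading only speeds up the pull, it does not select the image."""
    return {
        "cluster_name": "dcs-cluster",          # placeholder
        "instance_pool_id": pool_id,
        "docker_image": {"url": image_url},
        "spark_version": "15.4.x-scala2.12",    # placeholder
        "num_workers": 2,
    }
```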