Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

barendlinders
by New Contributor II
  • 418 Views
  • 3 replies
  • 0 kudos

Resolved! CREATE EXTERNAL LOCATION read only through SQL

Hello, I have set up a storage credential with READ ONLY access to my Azure storage account (landing-zone). I want to create an EXTERNAL LOCATION using SQL, set the EXTERNAL LOCATION to read only, and use the read-only storage credential I made. I canno...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Sure, happy to help. Let me know if you need any other assistance.

2 More Replies
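For reference, a minimal sketch of creating a read-only external location with the Databricks SDK for Python, which exposes a read_only flag directly; the location name, URL, and credential name below are placeholders, and the storage credential must already exist:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Placeholder names and URL throughout.
w.external_locations.create(
    name="landing_zone_ro",                                          # hypothetical
    url="abfss://landing-zone@mystorageacct.dfs.core.windows.net/",  # hypothetical
    credential_name="landing_zone_read_only_cred",                   # hypothetical
    read_only=True,
)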
NielsMH
by New Contributor III
  • 647 Views
  • 2 replies
  • 0 kudos

Resolved! connect azure openai service deployments from databricks workspace

Hi, for the company I work for, I have created an Azure OpenAI service instance, with the intention to deploy models and interact with them from a Databricks workspace. The basic properties of my Azure OpenAI service are: Network: separate vnet for the r...

Latest Reply
Rjdudley
Honored Contributor
  • 0 kudos

This sounds like a scenario for the Mosaic AI Gateway: Mosaic AI Gateway | Databricks

1 More Reply
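As a sketch of the gateway approach, an Azure OpenAI deployment can be registered as an external-model serving endpoint through the MLflow deployments client; the endpoint name, secret scope, resource URL, deployment name, and API version below are all assumptions to adapt, and private-VNet connectivity to the OpenAI resource still has to be in place:

from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")

client.create_endpoint(
    name="azure-openai-chat",  # hypothetical endpoint name
    config={
        "served_entities": [{
            "external_model": {
                "name": "gpt-4o",
                "provider": "openai",  # Azure OpenAI goes through the openai provider
                "task": "llm/v1/chat",
                "openai_config": {
                    "openai_api_type": "azure",
                    "openai_api_key": "{{secrets/my_scope/azure_openai_key}}",   # hypothetical secret
                    "openai_api_base": "https://my-resource.openai.azure.com/",  # hypothetical
                    "openai_deployment_name": "my-gpt4o-deployment",             # hypothetical
                    "openai_api_version": "2024-02-01",
                },
            },
        }],
    },
)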
Mukee
by New Contributor II
  • 3686 Views
  • 3 replies
  • 0 kudos

How to get Workspace name with workspaceId?

I have an AWS Managed Databricks instance. I am trying to get a workspace name with workspace ID. Thank you very much for your time and assistance.

Latest Reply
holly
Databricks Employee
  • 0 kudos

Hi All, there are a few ways to find your workspace name. Quickly: it's the name in the URL before .cloud, so in the URL https://devrel.cloud.databricks.com/?o=556812371150522 the name is 'devrel'. Whilst in the notebook: spark.conf.get("spark.databricks...

2 More Replies
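The notebook snippet in the reply is truncated above; a minimal sketch of that approach, assuming the spark.databricks.workspaceUrl conf key (available on standard clusters):

# Returns e.g. 'devrel.cloud.databricks.com'; the leading label is the workspace name.
url = spark.conf.get("spark.databricks.workspaceUrl")
workspace_name = url.split(".")[0]
print(workspace_name)  # 'devrel' for the URL example above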
Matty
by New Contributor III
  • 1970 Views
  • 7 replies
  • 2 kudos

Azure basic public IP SKU retirement

With Azure announcing the retirement of the Basic public IP SKU in 2025, I'd like to understand how compute cluster workers can be moved over from the Basic to the Standard SKU. We've used Terraform to deploy our Databricks environment, but I'd like to underst...

Latest Reply
Matty
New Contributor III
  • 2 kudos

I contacted Microsoft; they spoke with their SME on the Databricks team, who confirmed they're aware of this and will be transitioning from the Basic to the Standard SKU before deprecation. An update from Microsoft will be released regarding this...

6 More Replies
andre-h
by New Contributor III
  • 755 Views
  • 4 replies
  • 3 kudos

Resolved! Install Python dependency on job cluster from a privately hosted GitLab repository (HTTPS/SSH)

Hello, we intend to deploy a Databricks workflow based on a Python wheel file which needs to run on a job cluster. There is a dependency declared in pyproject.toml which is another Python project living in a private GitLab repository. We therefore nee...

Latest Reply
hari-prasad
Valued Contributor II
  • 3 kudos

Hi @andre-h, as a good alternative you can build the Python package (wheel or egg) in your GitLab or GitHub workflows and upload the package to a dedicated cloud storage bucket. Then you can specify the cloud storage path of your Python lib...

3 More Replies
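A hedged sketch of that alternative with the Databricks SDK for Python: build the wheel in CI, upload it to storage, then reference it as a task library. All names, paths, and the node type below are placeholders:

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute, jobs

w = WorkspaceClient()

w.jobs.create(
    name="wheel-based-workflow",  # hypothetical
    tasks=[
        jobs.Task(
            task_key="main",
            python_wheel_task=jobs.PythonWheelTask(
                package_name="my_package",  # hypothetical
                entry_point="main",         # hypothetical
            ),
            new_cluster=compute.ClusterSpec(
                spark_version="15.4.x-scala2.12",
                node_type_id="i3.xlarge",   # hypothetical
                num_workers=1,
            ),
            # The dependency wheel uploaded by CI; path is a placeholder.
            libraries=[compute.Library(whl="s3://my-bucket/wheels/my_dep-1.0-py3-none-any.whl")],
        )
    ],
)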
SamGreene
by Contributor II
  • 666 Views
  • 7 replies
  • 2 kudos

Resolved! Unable to get CLI working on new laptop

Hi, I have two Windows 10 laptops. The old one is working fine with Databricks CLI version 0.210.0. I installed the CLI on a new laptop and copied my config file. I can test connections to my default config, which uses my PAT. However, I have some ...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

Great to hear the issue got resolved. Let us know in case any other assistance is required.

6 More Replies
Rajasql
by New Contributor II
  • 504 Views
  • 4 replies
  • 1 kudos

Resolved! regexp_count seems to not work as it should

The SQL statements below should give different answers, as regexp rules state that * is a special character which needs to be escaped with \ to be treated as a literal. The second should literally match A*B and return 2, but it is also counting AB ...

Latest Reply
PabloCSD
Valued Contributor
  • 1 kudos

Hello @Rajasql, try this way (it worked for me in a Databricks notebook and it returns 2): SELECT regexp_count('nA*BsABreA*Bthe', 'A\\*B') str_cnt;

3 More Replies
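The subtlety is a double layer of escaping: the SQL string literal 'A\\*B' reaches the regex engine as A\*B, which then matches the literal text A*B. The same check from a Python notebook cell, using a raw string so the backslashes survive on their way into the SQL parser:

result = spark.sql(r"SELECT regexp_count('nA*BsABreA*Bthe', 'A\\*B') AS str_cnt")
result.show()  # str_cnt = 2, per the reply above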
FabianGutierrez
by Contributor
  • 763 Views
  • 1 reply
  • 0 kudos

Looking for experiences with DABS CLI Deployment, Terraform and Security

Hi Community, I hope my topic finds you well. Within our Databricks landscape we decided to use DABS (Databricks Asset Bundles); however, we found out (the hard way) that it uses Terraform for deployment purposes. This is a concern now for Security and ...

Latest Reply
hari-prasad
Valued Contributor II
  • 0 kudos

Hi @FabianGutierrez, as I closely observed, Databricks Asset Bundles do leverage Terraform under the hood, but they do not generate a terraform.tfstate file. Moreover, the `.databricks` directory is gitignored, so it will not be synced to the remote repo. Additiona...

drumcircle
by New Contributor II
  • 221 Views
  • 1 reply
  • 1 kudos

Resolved! Compute fleets on Azure Databricks

It seems that compute fleets have been supported on AWS for almost 2 years.  Azure compute fleets went into preview in November 2024.   Has anyone heard of how or when compute fleets will be supported on Azure Databricks?

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

This is currently in development, but no ETA has been shared yet by our engineering team. It might be coming soon.

jjsnlee
by New Contributor II
  • 945 Views
  • 11 replies
  • 0 kudos

Can't create AWS p3 instance

Hi, I'm trying to create a `p3.2xlarge` in my workspace, but the cluster fails to instantiate, specifically getting this error message: `No zone supports both the driver instance type [p3.2xlarge] and the worker instance type [p3.2xlarge]` (though I ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

I was able to start a cluster with the same exact configuration in my internal environment with no issues; I selected east-1a as the AZ to deploy. By any chance, have you engaged AWS support on this?

10 More Replies
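Pinning the availability zone explicitly, as the reply describes, looks roughly like this with the Databricks SDK for Python; the cluster name and runtime version are assumptions, and "us-east-1a" mirrors the AZ mentioned above:

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute

w = WorkspaceClient()

w.clusters.create(
    cluster_name="p3-gpu-test",               # hypothetical
    spark_version="15.4.x-gpu-ml-scala2.12",  # a GPU-capable runtime; adjust as needed
    node_type_id="p3.2xlarge",
    num_workers=1,
    aws_attributes=compute.AwsAttributes(zone_id="us-east-1a"),
)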
Venugopal
by New Contributor III
  • 481 Views
  • 3 replies
  • 0 kudos

Mimic system table functionality at custom catalog level

Hi, I am exploring system tables. I want to have our environment-specific data in different catalogs. While it is possible to get audit and other usage info from system tables under the system catalog, how can I achieve the same in my custom catalog that ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Just to be clear: what you want is a set of system tables, such as audit logs, in each catalog for your environments, so that when you query those tables you get information only from your environment. In this case there is no built ...

2 More Replies
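One hedged workaround is to materialize filtered copies of the system tables into each environment's catalog on a schedule; the catalog, schema, and workspace ID below are placeholders, and this assumes read access to system.access.audit:

# Placeholder names; run on a schedule (e.g. a job) to keep the copy fresh.
spark.sql("""
    CREATE OR REPLACE TABLE my_env_catalog.ops.audit_logs AS
    SELECT *
    FROM system.access.audit
    WHERE workspace_id = 1234567890123456  -- placeholder workspace ID
""")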
sharat_n
by New Contributor
  • 600 Views
  • 1 reply
  • 1 kudos

Delta lake : delete data from storage manually instead of vacuum

Hi All, we have a unique use case where we are unable to run VACUUM to clean our storage space of Delta Lake tables. Since we have data partitioned by date, we plan to delete files older than a certain date directly from storage. Could this lead to any...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

Deleting files older than a certain date directly from storage without using the VACUUM command can lead to potential issues with your Delta Lake tables. Here are the key points to consider: Corruption Risk: Directly deleting files from storage can ...

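For contrast, the supported route keeps the transaction log consistent by deleting through Delta first and only then removing files with VACUUM; the table name, partition column, and retention window below are placeholders:

# Drop old partitions via Delta so the log no longer references those files...
spark.sql("DELETE FROM my_catalog.my_schema.events WHERE event_date < '2024-01-01'")
# ...then physically remove the unreferenced files past the retention window.
spark.sql("VACUUM my_catalog.my_schema.events RETAIN 168 HOURS")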
Erik
by Valued Contributor III
  • 437 Views
  • 1 reply
  • 1 kudos

Resolved! Where is the Open Apache Hive Metastore API?

In 2023 it was announced that Databricks had made a "Hive Metastore (HMS) interface for Databricks Unity Catalog, which allows any software compatible with Apache Hive to connect to Unity Catalog". Is this discontinued? If not, is there any documentati...

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

It seems that this option has been deprecated; it was a private preview but is no longer available for enrollment.

ambigus9
by New Contributor III
  • 1076 Views
  • 10 replies
  • 0 kudos

Resolved! Failed to add 3 workers to the compute. Will attempt retry: true. Reason: Driver unresponsive

Currently I am trying to create a compute cluster in a workspace with PrivateLink and a custom VPC. I'm using Terraform: https://registry.terraform.io/providers/databricks/databricks/latest/docs/guides/aws-private-link-workspace. After the deployment is com...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @ambigus9, it looks like, based on the connectivity test, the connection to the RDS is not working. Can you check whether any firewall is blocking the request, since the connection is not going through to the RDS?

9 More Replies
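A quick hedged connectivity probe from a notebook on the failing cluster can confirm whether a firewall is the culprit; the hostname below is a placeholder (take the real one from your metastore JDBC URL; the default Hive metastore RDS listens on 3306):

import socket

host, port = "my-metastore.abc123.us-east-1.rds.amazonaws.com", 3306  # hypothetical
try:
    socket.create_connection((host, port), timeout=5).close()
    print("TCP connection OK")
except OSError as e:
    print(f"Connection failed: {e}")  # points to a firewall/routing problem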
xecel
by New Contributor II
  • 443 Views
  • 1 reply
  • 0 kudos

Resolved! How to Retrieve Admin and Non-Admin Permissions at Workspace Level in Azure Databricks.

Hello, I am working on a project to document permissions for both admins and non-admin users across all relevant objects at the workspace level in Azure Databricks (e.g., tables, jobs, clusters, etc.). I understand that admin-level permissions might be...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

In Databricks, object permissions are based on the object itself and not the user. Unfortunately, as of now there is no way to get all object permissions in a single built-in query. There are custom options, for example for clusters: first run...

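The reply above is truncated; a hedged sketch of the per-object-type approach for clusters with the Databricks SDK for Python (the same pattern repeats for jobs, warehouses, and so on):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# List each cluster, then fetch its ACL.
for cluster in w.clusters.list():
    perms = w.permissions.get(
        request_object_type="clusters",
        request_object_id=cluster.cluster_id,
    )
    for acl in perms.access_control_list or []:
        principal = acl.user_name or acl.group_name or acl.service_principal_name
        levels = [p.permission_level for p in (acl.all_permissions or [])]
        print(cluster.cluster_name, principal, levels)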