Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

sparrap
by New Contributor
  • 1245 Views
  • 2 replies
  • 0 kudos

Error when Connecting Databricks Cluster to RStudio Desktop App

Hi! I am trying to connect RStudio to my Databricks cluster. I have already changed the permissions to CAN MANAGE and CAN ATTACH TO on the cluster, and I have verified that I have the correct Python version and Databricks version on my computer. This is the error...

Latest Reply
mikvaar
New Contributor III
  • 0 kudos

Apparently sparklyr requires that Unity Catalog is enabled on the cluster in order to get the connection working correctly. This seems to solve the problem: https://github.com/sparklyr/sparklyr/issues/3449

1 More Reply
sparkplug
by New Contributor III
  • 676 Views
  • 1 reply
  • 1 kudos

Resolved! How do I track notebooks in all-purpose compute?

I am trying to map out costs for a shared cluster used in our organization. Since Databricks does not store sessions for all-purpose compute or record who accessed the cluster, what are some possible options to track which notebooks were attached...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @sparkplug, you can use the audit logs and the billing usage table: https://docs.databricks.com/en/admin/account-settings/audit-logs.html
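
To make that concrete, here is a minimal sketch of an audit-log query (assuming system tables are enabled in your account; the service/action names and request_params keys below follow the audit log schema as I understand it, so verify them against your own system.access.audit data):

# Notebook attach events for one all-purpose cluster, from the audit system table.
attachments = spark.sql("""
    SELECT event_time,
           user_identity.email            AS user_email,
           request_params['notebookPath'] AS notebook_path
    FROM system.access.audit
    WHERE service_name = 'notebook'
      AND action_name = 'attachNotebook'
      AND request_params['clusterId'] = '<cluster-id>'   -- replace with your cluster ID
    ORDER BY event_time DESC
""")
display(attachments)

Joining the result to system.billing.usage on the cluster ID then lets you apportion cost by who was attached and when.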

navi_bricks
by New Contributor II
  • 3576 Views
  • 9 replies
  • 1 kudos

Need to move files from one Volume to another

We recently enabled Unity Catalog on our workspace. As part of certain transformations (custom clustered data pipelines in Python), we need to move files from one volume to another. As the job itself runs on a service principal that has access to exte...

Latest Reply
Dnirmania
Contributor
  • 1 kudos

Not all job clusters work well with Volumes. I used the following type of cluster to access files from a Volume.
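
For the original question, the move itself can be a plain file-system call once permissions line up. A minimal sketch (the catalog, schema, and volume names are placeholders; the service principal needs read access on the source volume and write access on the target):

# Move a file between Unity Catalog volumes from a notebook or Python task.
# dbutils.fs.mv copies the file and then deletes the source.
src = "/Volumes/my_catalog/my_schema/source_volume/data.csv"  # placeholder path
dst = "/Volumes/my_catalog/my_schema/target_volume/data.csv"  # placeholder path
dbutils.fs.mv(src, dst)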

8 More Replies
kpriya
by New Contributor II
  • 424 Views
  • 1 reply
  • 2 kudos

Missing Permission option for DLT pipeline

I do not see the permission option in the kebab menu on the DLT pipeline's page either; can someone help with this?

Latest Reply
Alberto_Umana
Databricks Employee
  • 2 kudos

Hi @kpriya, ensure you are on the correct page where the pipeline is listed. Click on the pipeline name to go into its details. Once you are on the pipeline details page, look for the kebab menu (three vertical dots) associated with the pipeline. This...

SravanThotakura
by New Contributor II
  • 916 Views
  • 3 replies
  • 0 kudos

Resolved! Spark Job fails with No plan for OptimizedForeachBatchFastpath

Hi Team, I am trying to run a job on a Databricks cluster on 14.3 LTS which streams data from Parquet to a custom sink, and I am facing the error below. The same code used to work a month back; however, I am facing this issue recently. org.apache.spark.SparkException: ...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Great to hear the issue got fixed.

2 More Replies
templier2
by New Contributor II
  • 1456 Views
  • 2 replies
  • 0 kudos

databricks bundle init is not working

I use Databricks Standard Tier workspaces in Azure. When I run databricks bundle init, I get an error: Error: failed to compute file content for {{.project_name}}/resources/{{.project_name}}.pipeline.yml.tmpl. template: :6:20: executing "...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @templier2, the error you are encountering when running databricks bundle init occurs because Unity Catalog is not supported on Standard tier workspaces.

1 More Reply
barendlinders
by New Contributor II
  • 989 Views
  • 3 replies
  • 0 kudos

Resolved! CREATE EXTERNAL LOCATION read only through SQL

Hello, I have set up a storage credential with READ ONLY access to my Azure storage account (landing-zone). I want to create an EXTERNAL LOCATION using SQL, set the EXTERNAL LOCATION to read only, and use the read-only storage credential I made. I canno...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Sure, happy to help. Let me know in case you need any other assistance.
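
For readers who hit the same wall, one route is the Databricks Python SDK, whose external-locations API exposes a read_only flag. A minimal sketch (the names and storage URL are placeholders; it assumes the databricks-sdk package is installed and authenticated):

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up auth from the environment or a config profile
# Create an external location on the landing zone, marked read-only and
# backed by the read-only storage credential.
w.external_locations.create(
    name="landing_zone_ro",                                          # placeholder
    url="abfss://landing-zone@mystorageacct.dfs.core.windows.net/",  # placeholder
    credential_name="landing_zone_read_only_cred",                   # placeholder
    read_only=True,
)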

2 More Replies
NielsMH
by New Contributor III
  • 1545 Views
  • 2 replies
  • 0 kudos

Resolved! Connect Azure OpenAI Service deployments from a Databricks workspace

Hi, for the company I work for, I have created an Azure OpenAI Service instance, with the intention to deploy models and interact with them from a Databricks workspace. The basic properties of my Azure OpenAI Service are: Network: separate vnet for the r...

Latest Reply
Rjdudley
Honored Contributor
  • 0 kudos

This sounds like a scenario for the Mosaic AI Gateway: Mosaic AI Gateway | Databricks
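
As a rough illustration of that route, an Azure OpenAI deployment can be exposed as an external-model serving endpoint. The sketch below uses the MLflow deployments client; the endpoint name, secret reference, resource URL, deployment name, and API version are all placeholders, so check the external models documentation for the exact config keys before relying on it:

from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")
# Register an Azure OpenAI deployment as an external-model serving endpoint.
client.create_endpoint(
    name="azure-openai-chat",  # placeholder endpoint name
    config={
        "served_entities": [{
            "external_model": {
                "name": "gpt-4o",            # model behind the Azure deployment
                "provider": "openai",
                "task": "llm/v1/chat",
                "openai_config": {
                    "openai_api_type": "azure",
                    "openai_api_key": "{{secrets/my_scope/azure_openai_key}}",  # placeholder
                    "openai_api_base": "https://my-resource.openai.azure.com/", # placeholder
                    "openai_deployment_name": "my-gpt4o-deployment",            # placeholder
                    "openai_api_version": "2024-02-01",                         # placeholder
                },
            },
        }],
    },
)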

1 More Reply
Mukee
by New Contributor II
  • 4734 Views
  • 3 replies
  • 0 kudos

How to get the workspace name from a workspaceId?

I have an AWS Managed Databricks instance. I am trying to get the workspace name from the workspace ID. Thank you very much for your time and assistance.

Latest Reply
holly
Databricks Employee
  • 0 kudos

Hi All, there are a few ways to find your workspace name. Quickly: it's the name in the URL before .cloud, so in the URL https://devrel.cloud.databricks.com/?o=556812371150522 the name is 'devrel'. Whilst in a notebook: spark.conf.get("spark.databricks...
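
For the notebook route, a small sketch of turning the workspace host into the name (this assumes it runs inside a Databricks notebook, where the Spark conf below is populated):

# Derive the workspace name from the workspace host name.
host = spark.conf.get("spark.databricks.workspaceUrl")  # e.g. 'devrel.cloud.databricks.com'
workspace_name = host.split(".")[0]
print(workspace_name)  # 'devrel'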

2 More Replies
Matty
by New Contributor III
  • 3089 Views
  • 7 replies
  • 2 kudos

Azure basic public IP SKU retirement

With Azure announcing the retirement of the Basic public IP SKU in 2025, I'd like to understand how compute cluster workers can be moved over from the Basic to the Standard SKU. We've used Terraform to deploy our Databricks environment, but I'd like to underst...

Latest Reply
Matty
New Contributor III
  • 2 kudos

I contacted Microsoft; they spoke with their SME on the Databricks team, who confirmed they're aware of this and will be transitioning from the Basic to the Standard SKU before deprecation. An update from Microsoft will be released regarding this...

6 More Replies
andre-h
by New Contributor III
  • 1685 Views
  • 4 replies
  • 3 kudos

Resolved! Install Python dependency on job cluster from a privately hosted GitLab repository (HTTPS/SSH)

Hello, we intend to deploy a Databricks workflow based on a Python wheel file which needs to run on a job cluster. There is a dependency declared in pyproject.toml which is another Python project living in a private GitLab repository. We therefore nee...

Latest Reply
hari-prasad
Valued Contributor II
  • 3 kudos

Hi @andre-h, as a good alternative you can build the Python package (wheel or egg) in your GitLab or GitHub workflows and upload the package to a dedicated cloud storage bucket. Then you can specify the cloud storage path of your Python lib...
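
A sketch of the second half of that approach, attaching the uploaded wheel to a cluster with the Databricks Python SDK (the bucket path and cluster ID are placeholders, and the CI build-and-upload step is assumed to have already run):

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import Library

w = WorkspaceClient()
# Install the wheel that CI uploaded to cloud storage onto the job cluster.
w.libraries.install(
    cluster_id="0123-456789-abcdefgh",  # placeholder cluster ID
    libraries=[Library(whl="s3://my-bucket/wheels/mypkg-0.1.0-py3-none-any.whl")],
)

The same wheel path can instead be listed under a task's libraries section in the job definition, which avoids mutating a running cluster.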

3 More Replies
SamGreene
by Contributor II
  • 1536 Views
  • 7 replies
  • 2 kudos

Resolved! Unable to get CLI working on new laptop

Hi, I have two Windows 10 laptops. The old one is working fine with Databricks CLI version 0.210.0. I installed the CLI on a new laptop and copied my config file. I can test connections to my default config, which uses my PAT. However, I have some ...

Latest Reply
Walter_C
Databricks Employee
  • 2 kudos

Great to hear the issue got resolved; let us know in case any other assistance is required.

6 More Replies
Rajasql
by New Contributor II
  • 1077 Views
  • 4 replies
  • 1 kudos

Resolved! regexp_count seems to not work as it should

The SQL statements below should give different answers, as regexp rules state that * is a special character which needs to be escaped with \ to be treated as a literal string. The second should literally match A*B and return 2, but it is also matching AB ...

Latest Reply
PabloCSD
Valued Contributor II
  • 1 kudos

Hello @Rajasql, try this way (it worked for me in a Databricks notebook and it returns 2): SELECT regexp_count('nA*BsABreA*Bthe', 'A\\*B') str_cnt;
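
The double backslash is the key: the SQL string literal 'A\\*B' collapses to the regex A\*B, which matches a literal asterisk, whereas the unescaped A*B treats * as a quantifier (zero or more As followed by B) and therefore also matches plain AB and a bare B. A quick check from PySpark (raw strings keep the escaping readable):

# 'A*B' as a regex: zero-or-more 'A' then 'B', so 'B' and 'AB' also match.
spark.sql(r"SELECT regexp_count('nA*BsABreA*Bthe', 'A*B') AS cnt").show()    # cnt = 3
# 'A\\*B' becomes regex A\*B: only the literal string 'A*B' matches.
spark.sql(r"SELECT regexp_count('nA*BsABreA*Bthe', 'A\\*B') AS cnt").show()  # cnt = 2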

3 More Replies
FabianGutierrez
by Contributor
  • 1139 Views
  • 1 reply
  • 0 kudos

Looking for experiences with DABS CLI Deployment, Terraform and Security

Hi Community, I hope my topic finds you well. Within our Databricks landscape we decided to use DABs (Databricks Asset Bundles); however, we found out (the hard way) that it uses Terraform for deployment purposes. This is a concern now for Security and ...

Latest Reply
hari-prasad
Valued Contributor II
  • 0 kudos

Hi @FabianGutierrez, as I closely observed, Databricks Asset Bundles leverage Terraform under the hood, but they do not generate a terraform.tfstate file. Moreover, the `.databricks` directory is gitignored, so it will not be synced to the remote repo. Additiona...

drumcircle
by New Contributor II
  • 501 Views
  • 1 reply
  • 1 kudos

Resolved! Compute fleets on Azure Databricks

It seems that compute fleets have been supported on AWS for almost 2 years. Azure compute fleets went into preview in November 2024. Has anyone heard how or when compute fleets will be supported on Azure Databricks?

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

This is currently in development, but no ETA has been shared yet by our engineering team. It might be coming soon.
