- 4596 Views
- 9 replies
- 1 kudos
Need to move files from one Volume to another
We recently enabled Unity Catalog on our workspace. As part of certain transformations (custom clustered data pipelines in Python), we need to move files from one volume to another. As the job itself runs on a service principal that has access to exte...
- 1 kudos
Not all job clusters work well with Volumes. I used the following type of cluster to access files from a Volume.
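For reference, a minimal sketch of the move itself, assuming the service principal has READ VOLUME on the source volume and WRITE VOLUME on the destination; both paths below are illustrative:

```python
# Volumes are exposed as FUSE paths under /Volumes, so ordinary Python file
# APIs work on supported access modes; both paths are hypothetical.
import shutil

src = "/Volumes/my_catalog/my_schema/landing/file.csv"
dst = "/Volumes/my_catalog/my_schema/processed/file.csv"

shutil.move(src, dst)

# Inside a notebook, dbutils offers the same move:
# dbutils.fs.mv(src, dst)
```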
- 508 Views
- 1 replies
- 2 kudos
Missing Permissions option for DLT pipeline
I do not see the Permissions option in the DLT pipeline page's kebab menu either; can someone help with this?
- 2 kudos
Hi @kpriya, ensure you are on the correct page where the pipeline is listed. Click on the pipeline name to go into its details. Once you are on the pipeline details page, look for the kebab menu (three vertical dots) associated with the pipeline. This...
- 1698 Views
- 2 replies
- 0 kudos
databricks bundle init is not working
I use Databricks Standard Tier workspaces in Azure. When I run databricks bundle init, I get an error: Error: failed to compute file content for {{.project_name}}/resources/{{.project_name}}.pipeline.yml.tmpl. template: :6:20: executing "...
- 0 kudos
Hi @templier2, the error you are encountering when running databricks bundle init is due to the fact that Unity Catalog is not supported on Standard tier workspaces.
- 1253 Views
- 3 replies
- 0 kudos
Resolved! CREATE EXTERNAL LOCATION read only through SQL
Hello, I have set up a storage credential with READ ONLY access to my Azure storage account (landing-zone). I want to create an EXTERNAL LOCATION using SQL, set the EXTERNAL LOCATION to read only, and use the read-only storage credential I made. I canno...
- 0 kudos
Sure, happy to help. Let me know in case you need any other assistance.
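The thread is marked resolved but the fix itself is truncated above. As a hedged sketch of one route, assuming a read-only storage credential named landing_zone_cred already exists (the location name and URL are also illustrative), you can create the location in SQL and set the read-only flag through the Databricks SDK:

```python
# Create the external location with the read-only credential; names and the
# storage URL are illustrative.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS landing_zone
    URL 'abfss://landing-zone@mystorageaccount.dfs.core.windows.net/'
    WITH (STORAGE CREDENTIAL landing_zone_cred)
    COMMENT 'Read-only landing zone'
""")

# Mark the location read-only via the Databricks SDK (assumes databricks-sdk
# is installed and the caller can manage the location).
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.external_locations.update(name="landing_zone", read_only=True)
```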
- 1877 Views
- 2 replies
- 0 kudos
Resolved! connect azure openai service deployments from databricks workspace
Hi, for the company I work for, I have created an Azure OpenAI service instance, with the intention of deploying models and interacting with them from a Databricks workspace. The basic properties of my Azure OpenAI service are: Network: separate VNet for the r...
- 0 kudos
This sounds like a scenario for the Mosaic AI Gateway: Mosaic AI Gateway | Databricks
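As a hedged sketch of what that can look like (not the poster's confirmed setup): an external-model serving endpoint, which the AI Gateway can govern, fronting an Azure OpenAI deployment. The endpoint name, resource URL, deployment name, and secret reference below are all illustrative:

```python
# A minimal sketch, assuming the Azure OpenAI resource is reachable from the
# workspace and the API key is stored in a Databricks secret scope.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")
client.create_endpoint(
    name="azure-openai-chat",  # hypothetical endpoint name
    config={
        "served_entities": [
            {
                "external_model": {
                    "name": "gpt-4o",
                    "provider": "openai",
                    "task": "llm/v1/chat",
                    "openai_config": {
                        "openai_api_type": "azure",
                        "openai_api_base": "https://my-resource.openai.azure.com/",
                        "openai_deployment_name": "my-gpt4o-deployment",
                        "openai_api_version": "2024-02-01",
                        "openai_api_key": "{{secrets/my_scope/azure_openai_key}}",
                    },
                }
            }
        ]
    },
)
```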
- 5335 Views
- 3 replies
- 0 kudos
How to get Workspace name with workspaceId?
I have an AWS Managed Databricks instance. I am trying to get the workspace name from a workspace ID. Thank you very much for your time and assistance.
- 0 kudos
Hi All, there are a few ways to find your workspace name:
- Quickly: it's the name in the URL before .cloud, so in the URL https://devrel.cloud.databricks.com/?o=556812371150522 the name is 'devrel'.
- Whilst in the notebook: spark.conf.get("spark.databricks...
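A minimal sketch of the notebook route, assuming the spark.databricks.workspaceUrl conf is populated on your cluster (it is on standard Databricks runtimes):

```python
# Read the workspace URL from the Spark conf and take the leading label.
url = spark.conf.get("spark.databricks.workspaceUrl")
workspace_name = url.split(".")[0]  # 'devrel' for devrel.cloud.databricks.com
print(workspace_name)
```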
- 3574 Views
- 7 replies
- 3 kudos
Azure basic public IP SKU retirement
With Azure announcing retirement of the Basic public IP SKU in 2025, I'd like to understand how compute cluster workers can be moved from the Basic to the Standard SKU. We've used Terraform to deploy our Databricks environment, but I'd like to underst...
- 3 kudos
I contacted Microsoft; they spoke with their Databricks SME, who confirmed they are aware of this and will be transitioning from the Basic to the Standard SKU before deprecation. An update from Microsoft will be released regarding this...
- 2253 Views
- 4 replies
- 3 kudos
Resolved! Install Python dependency on job cluster from a privately hosted GitLab repository (HTTPS/SSH)
Hello, we intend to deploy a Databricks workflow based on a Python wheel file which needs to run on a job cluster. There is a dependency declared in pyproject.toml which is another Python project living in a private GitLab repository. We therefore nee...
- 3 kudos
Hi @andre-h, as a good alternative you can build the Python package (wheel or egg) in your GitLab or GitHub workflows and upload it to a dedicated cloud storage bucket. You can then specify the cloud storage path of your Python lib...
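A minimal sketch of that approach with the Databricks SDK, assuming CI has already built both wheels and uploaded them to a Unity Catalog Volume; every name, path, and node type below is illustrative:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute, jobs

w = WorkspaceClient()
w.jobs.create(
    name="wheel-job",  # hypothetical job name
    tasks=[
        jobs.Task(
            task_key="main",
            python_wheel_task=jobs.PythonWheelTask(
                package_name="my_package",
                entry_point="main",
            ),
            # Both the project wheel and its private GitLab dependency are
            # installed from the Volume instead of from GitLab directly.
            libraries=[
                compute.Library(
                    whl="/Volumes/main/artifacts/wheels/my_dep-1.0-py3-none-any.whl"
                ),
                compute.Library(
                    whl="/Volumes/main/artifacts/wheels/my_package-1.0-py3-none-any.whl"
                ),
            ],
            new_cluster=compute.ClusterSpec(
                spark_version="15.4.x-scala2.12",
                node_type_id="Standard_DS3_v2",
                num_workers=1,
            ),
        )
    ],
)
```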
- 1837 Views
- 7 replies
- 2 kudos
Resolved! Unable to get CLI working on new laptop
Hi, I have two Windows 10 laptops. The old one is working fine with Databricks CLI version 0.210.0. I installed the CLI on a new laptop and copied my config file. I can test connections to my default config, which uses my PAT. However, I have some ...
- 2 kudos
Great to hear the issue got resolved. Let us know in case any other assistance is required.
- 1336 Views
- 4 replies
- 1 kudos
Resolved! regexp_count seems to not work as it should
The SQL statements below should give different answers, since regexp rules state that * is a special character which must be escaped with \ to be treated as a literal. The second one should match only the literal A*B and return 2, but it is also counting AB ...
- 1 kudos
Hello @Rajasql, try it this way (it worked for me in a Databricks notebook and it returns 2): SELECT regexp_count('nA*BsABreA*Bthe', 'A\\*B') str_cnt;
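The underlying reason: inside a Spark SQL string literal the backslash is itself an escape character, so 'A\\*B' reaches the regex engine as A\*B (the literal text A*B), while 'A*B' means "zero or more A, then B" and therefore also matches AB. A quick check of that behaviour from Python, using the input string from the thread:

```python
# The raw string keeps both backslashes, so the SQL literal 'A\\*B' becomes
# the regex A\*B, which only matches the literal 'A*B'.
cnt = spark.sql(
    r"SELECT regexp_count('nA*BsABreA*Bthe', 'A\\*B') AS str_cnt"
).first()["str_cnt"]
print(cnt)  # expected: 2
```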
- 1280 Views
- 1 replies
- 0 kudos
Looking for experiences with DABS CLI Deployment, Terraform and Security
Hi Community, I hope my topic finds you well. Within our Databricks landscape we decided to use DABs (Databricks Asset Bundles); however, we found out (the hard way) that they use Terraform for deployment purposes. This is now a concern for Security and ...
- 0 kudos
Hi @FabianGutierrez, as I closely observed, Databricks Asset Bundles leverage Terraform under the hood, but they do not generate a terraform.tfstate file. Moreover, the `.databricks` directory is gitignored, so it will not be synced to the remote repo. Additiona...
- 686 Views
- 1 replies
- 1 kudos
Resolved! Compute fleets on Azure Databricks
It seems that compute fleets have been supported on AWS for almost 2 years. Azure compute fleets went into preview in November 2024. Has anyone heard of how or when compute fleets will be supported on Azure Databricks?
- 1 kudos
This is currently in development, but no ETA has been shared yet by our engineering team. It might be coming soon.
- 2407 Views
- 11 replies
- 0 kudos
Can't create AWS p3 instance
Hi, I'm trying to create a `p3.2xlarge` in my workspace, but the cluster fails to instantiate, specifically getting this error message: `No zone supports both the driver instance type [p3.2xlarge] and the worker instance type [p3.2xlarge]` (though I ...
- 0 kudos
I was able to start a cluster with the same exact configuration in my internal environment with no issues; I selected east-1a as the AZ to deploy. By any chance, have you engaged AWS support on this?
- 948 Views
- 3 replies
- 0 kudos
Mimic system table functionality at custom catalog level
Hi, I am exploring system tables. I want to have our environment-specific data in different catalogs. While it is possible to get audit and other usage info from the system tables under the system catalog, how can I achieve the same in my custom catalog that ...
- 0 kudos
Just to be clear: what you want is a set of system tables, such as audit logs, in each catalog for your environments, so that when you query those tables you only get information for your environment. In this case there is no built ...
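One workaround (my suggestion, not a built-in feature) is to define filtered views over the system tables inside each environment's catalog, so queries against that catalog only see that environment's rows; the catalog, schema, and workspace_id below are illustrative:

```python
# Expose a per-environment view over the audit log, filtered to one workspace.
spark.sql("""
    CREATE VIEW IF NOT EXISTS dev_catalog.ops.audit_logs AS
    SELECT *
    FROM system.access.audit
    WHERE workspace_id = 1234567890123456
""")
```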
- 1543 Views
- 1 replies
- 1 kudos
Delta Lake: delete data from storage manually instead of VACUUM
Hi All, we have a unique use case where we are unable to run VACUUM to clean up the storage space of our Delta Lake tables. Since our data is partitioned by date, we plan to delete files older than a certain date directly from storage. Could this lead to any...
- 1 kudos
Deleting files older than a certain date directly from storage, without using the VACUUM command, can cause serious issues with your Delta Lake tables. Key points to consider: Corruption risk: directly deleting files from storage can ...
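If the blocker is only the default retention window rather than VACUUM itself, the supported route is to shorten retention and let VACUUM delete the files; a minimal sketch with an illustrative table name:

```python
# Shorten how long removed files are retained before VACUUM may delete them.
spark.sql("""
    ALTER TABLE my_catalog.my_schema.events
    SET TBLPROPERTIES ('delta.deletedFileRetentionDuration' = 'interval 7 days')
""")

# RETAIN must be >= the configured retention unless the safety check is
# deliberately disabled.
spark.sql("VACUUM my_catalog.my_schema.events RETAIN 168 HOURS")
```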