- 325 Views
- 1 replies
- 2 kudos
Missing Permissions option for DLT pipeline
I do not see the Permissions option in the DLT pipeline page's kebab menu either; can someone help with this?
Hi @kpriya, Ensure you are on the correct page where the pipeline is listed. Click on the pipeline name to go into its details. Once you are on the pipeline details page, look for the kebab menu (three vertical dots) associated with the pipeline. This...
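As a fallback, if the Permissions entry is missing from the UI, pipeline ACLs can also be set through the REST Permissions API (`PATCH /api/2.0/permissions/pipelines/{pipeline_id}`). A minimal sketch; the user name, host, and pipeline ID below are placeholders:

```python
import json

def build_pipeline_acl(user_name: str, permission_level: str) -> dict:
    """Build the request body for the Databricks Permissions API.

    For pipelines, permission_level is one of CAN_VIEW, CAN_RUN,
    CAN_MANAGE, or IS_OWNER.
    """
    return {
        "access_control_list": [
            {"user_name": user_name, "permission_level": permission_level}
        ]
    }

payload = build_pipeline_acl("someone@example.com", "CAN_MANAGE")
print(json.dumps(payload))

# To apply it (hypothetical host and pipeline ID), send a PATCH request:
#   requests.patch(
#       f"{host}/api/2.0/permissions/pipelines/{pipeline_id}",
#       headers={"Authorization": f"Bearer {token}"},
#       json=payload,
#   )
```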
- 425 Views
- 1 replies
- 0 kudos
Databricks Apps not working in Postman
I have a question regarding Databricks Apps. I have deployed my Databricks App and it works on my laptop, but when I try to open the same URL on my mobile it redirects to the Databricks sign-in page, and it is not working through Postman as wel...
Is this issue specific to Databricks Apps? Are you getting an error message?
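For context, Databricks Apps sit behind OAuth, so an unauthenticated client (Postman, another device without a browser session) gets redirected to the sign-in page. A hedged sketch of calling an app programmatically with a bearer token; the app URL and token are placeholders, and the exact token type your workspace accepts may differ:

```python
# Hypothetical token; Databricks Apps generally expect an OAuth bearer
# token rather than browser cookies when called outside the browser.
def auth_headers(token: str) -> dict:
    """Build the Authorization header Postman or requests would send."""
    return {"Authorization": f"Bearer {token}"}

headers = auth_headers("dapiXXXX-placeholder")
print(headers["Authorization"])

# With requests (not executed here; app URL is hypothetical):
#   resp = requests.get(
#       "https://my-app-1234567890.aws.databricksapps.com/",
#       headers=headers,
#   )
```

In Postman, the equivalent is adding the same `Authorization: Bearer <token>` header to the request.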
- 737 Views
- 3 replies
- 0 kudos
Resolved! Spark job fails with No plan for OptimizedForeachBatchFastpath
Hi Team, I am trying to run a job on a Databricks 14.3 LTS cluster which streams data from Parquet to a custom sink. I am facing the error below. The same code worked a month back, but I am facing this issue recently. org.apache.spark.SparkException: ...
- 1228 Views
- 2 replies
- 0 kudos
databricks bundle init is not working
I use Databricks Standard Tier workspaces in Azure. When I run databricks bundle init, I get an error: Error: failed to compute file content for {{.project_name}}/resources/{{.project_name}}.pipeline.yml.tmpl. template: :6:20: executing "...
Hi @templier2, The error you are encountering when running databricks bundle init is due to the fact that Unity Catalog is not supported on Standard tier workspaces.
- 747 Views
- 3 replies
- 0 kudos
Resolved! CREATE EXTERNAL LOCATION read-only through SQL
Hello, I have set up a storage credential with READ ONLY access to my Azure storage account (landing-zone). I want to create an EXTERNAL LOCATION using SQL, set the EXTERNAL LOCATION to read-only, and use the read-only storage credential I made. I canno...
Sure, happy to help, let me know in case you need any other assistance.
- 1201 Views
- 2 replies
- 0 kudos
Resolved! Connect Azure OpenAI Service deployments from a Databricks workspace
Hi, For the company I work for, I have created an Azure OpenAI Service instance, with the intention to deploy models and interact with them from a Databricks workspace. The basic properties of my Azure OpenAI Service are: Network: separate vnet for the r...
This sounds like a scenario for the Mosaic AI Gateway: Mosaic AI Gateway | Databricks
- 4308 Views
- 3 replies
- 0 kudos
How to get Workspace name with workspaceId?
I have an AWS Managed Databricks instance. I am trying to get a workspace name with workspace ID. Thank you very much for your time and assistance.
Hi All, There are a few ways to find your workspace name. Quickly: it's the name in the URL before .cloud, so in the URL https://devrel.cloud.databricks.com/?o=556812371150522 the name is 'devrel'. Whilst in a notebook: spark.conf.get("spark.databricks...
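The "name before .cloud" rule from the reply can be sketched in Python; the sample URL is the one quoted in the reply:

```python
from urllib.parse import urlparse

def workspace_name(url: str) -> str:
    """Extract the workspace (deployment) name from a Databricks URL.

    For https://devrel.cloud.databricks.com/?o=... the host is
    devrel.cloud.databricks.com, and the first label is the name.
    """
    host = urlparse(url).hostname
    return host.split(".")[0]

print(workspace_name("https://devrel.cloud.databricks.com/?o=556812371150522"))
# -> devrel
```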
- 2674 Views
- 7 replies
- 2 kudos
Azure basic public IP SKU retirement
With Azure announcing the retirement of the Basic public IP SKU in 2025, I'd like to understand how compute cluster workers can be moved from the Basic to the Standard SKU. We've used Terraform to deploy our Databricks environment, but I'd like to underst...
I contacted Microsoft; they spoke with their SME on the Databricks team, who confirmed they're aware of this and will be transitioning from the Basic to the Standard SKU before deprecation. An update from Microsoft will be released regarding this...
- 1346 Views
- 4 replies
- 3 kudos
Resolved! Install Python dependency on job cluster from a privately hosted GitLab repository (HTTPS/SSH)
Hello, We intend to deploy a Databricks workflow based on a Python wheel file which needs to run on a job cluster. There is a dependency declared in pyproject.toml which is another Python project living in a private GitLab repository. We therefore nee...
Hi @andre-h, As a good alternative, you can build the Python package (wheel or egg) in your GitLab or GitHub workflows and upload it to a dedicated cloud storage bucket. You can then specify the cloud storage path of your Python lib...
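The suggestion above (build the wheel in CI, upload it to storage, point the job cluster at it) boils down to a `libraries` entry in the job/task spec. A minimal sketch; the bucket path and package name are hypothetical:

```python
import json

def wheel_library(storage_path: str) -> dict:
    """Library spec entry for a Databricks job task: install a wheel
    from a cloud storage path the cluster can read."""
    return {"whl": storage_path}

task_libraries = [
    wheel_library("s3://my-artifacts/dist/my_dep-1.0.0-py3-none-any.whl")
]
print(json.dumps({"libraries": task_libraries}, indent=2))
```

The same `libraries` list can be declared in a Databricks Asset Bundle or Terraform job resource instead of being built in Python.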
- 1179 Views
- 7 replies
- 2 kudos
Resolved! Unable to get CLI working on new laptop
Hi, I have two Windows 10 laptops. The old one works fine with Databricks CLI version 0.210.0. I installed the CLI on a new laptop and copied my config file. I can test connections to my default profile, which uses my PAT. However, I have some ...
Great to hear the issue got resolved; let us know in case any other assistance is required.
- 916 Views
- 4 replies
- 1 kudos
Resolved! regexp_count seems to not work as it should
The SQL statements below should give different answers, as regexp rules state that * is a special character which needs to be escaped with \ to be treated as a literal. The second should literally match A*B and return 2, but it is also taking AB ...
Hello @Rajasql, Try this way (it worked for me in a Databricks notebook and it returns 2): SELECT regexp_count('nA*BsABreA*Bthe', 'A\\*B') str_cnt;
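The same escaping rule holds outside SQL. Here it is reproduced with Python's `re` on the thread's test string (a single backslash suffices in a raw string, whereas a SQL string literal needs the doubled `\\*`):

```python
import re

s = "nA*BsABreA*Bthe"

# Unescaped: '*' is a quantifier, so 'A*B' means "zero or more A's then B"
# and matches 'B', 'AB', and 'B' -> 3 matches.
unescaped = len(re.findall(r"A*B", s))

# Escaped: '\*' is a literal asterisk, so only the two 'A*B' substrings match.
escaped = len(re.findall(r"A\*B", s))

print(unescaped, escaped)  # 3 2
```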
- 1025 Views
- 1 replies
- 0 kudos
Looking for experiences with DABS CLI Deployment, Terraform and Security
Hi Community, I hope my topic finds you well. Within our Databricks landscape we decided to use DABS (Databricks Asset Bundles); however, we found out (the hard way) that it uses Terraform for deployment purposes. This is now a concern for security and ...
Hi @FabianGutierrez, As I observed, Databricks Asset Bundles do leverage Terraform under the hood, but they do not generate a terraform.tfstate file. Moreover, the `.databricks` directory is gitignored, so it will not be synced to the remote repo. Additiona...
- 402 Views
- 1 replies
- 1 kudos
Resolved! Compute fleets on Azure Databricks
It seems that compute fleets have been supported on AWS for almost 2 years. Azure compute fleets went into preview in November 2024. Has anyone heard of how or when compute fleets will be supported on Azure Databricks?
This is currently in development, but no ETA has been shared yet by our engineering team. It might be coming soon.
- 1408 Views
- 11 replies
- 0 kudos
Can't create AWS p3 instance
Hi, I'm trying to create a `p3.2xlarge` in my workspace, but the cluster fails to instantiate, specifically getting this error message: `No zone supports both the driver instance type [p3.2xlarge] and the worker instance type [p3.2xlarge]` (though I ...
I was able to start a cluster with the same exact configuration in my internal environment with no issues; I selected east-1a as the AZ to deploy. By any chance, have you engaged AWS support on this?
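The error means no single availability zone offers both the driver and worker instance types. A hedged sketch of the check you could run yourself; in practice the pairs would come from the EC2 `describe_instance_type_offerings` API (with `LocationType="availability-zone"`), and the sample data below is made up, not real AZ coverage:

```python
def zones_supporting_both(offerings, driver_type, worker_type):
    """Given (instance_type, zone) pairs, return the zones that offer
    both the driver and the worker instance type."""
    driver_zones = {z for t, z in offerings if t == driver_type}
    worker_zones = {z for t, z in offerings if t == worker_type}
    return sorted(driver_zones & worker_zones)

# Hypothetical sample data standing in for the real API response.
sample = [
    ("p3.2xlarge", "us-east-1a"),
    ("p3.2xlarge", "us-east-1c"),
    ("m5.xlarge", "us-east-1a"),
    ("m5.xlarge", "us-east-1b"),
]
print(zones_supporting_both(sample, "p3.2xlarge", "p3.2xlarge"))
# ['us-east-1a', 'us-east-1c']
```

If this intersection is empty for your region, pinning the cluster to a zone (or region) where the intersection is non-empty is the usual fix.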
- 687 Views
- 3 replies
- 0 kudos
Mimic system table functionality at custom catalog level
Hi, I am exploring system tables. I want to have our environment-specific data in different catalogs. While it is possible to get audit and other usage info from system tables under the system catalog, how can I achieve the same in my custom catalog that ...
Just to be clear, what you want is a set of system tables, such as audit logs, in each catalog for your environments, so that when you query those tables you only get information from your environment. In this case there is no built ...
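Since there is no built-in way to surface system tables per catalog, one common workaround is a scheduled job that snapshots a filtered slice of each system table into the environment's own catalog. A hedged sketch that only generates the SQL statement; the target catalog, `ops` schema, and filtering on `workspace_id` are assumptions for illustration:

```python
def snapshot_sql(target_catalog: str, source_table: str, workspace_id: str) -> str:
    """Generate a CREATE OR REPLACE TABLE statement copying one
    environment's slice of a system table into a custom catalog
    (hypothetical naming convention)."""
    name = source_table.split(".")[-1]
    return (
        f"CREATE OR REPLACE TABLE {target_catalog}.ops.{name} AS "
        f"SELECT * FROM {source_table} "
        f"WHERE workspace_id = '{workspace_id}'"
    )

stmt = snapshot_sql("dev_catalog", "system.access.audit", "556812371150522")
print(stmt)
# In a notebook or job you would then run: spark.sql(stmt)
```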