- 34 Views
- 1 reply
- 0 kudos
Databricks Asset Bundles capability for cross-cloud migration
Hi everyone, We are planning a migration from Azure Databricks to GCP Databricks and would like to understand whether Databricks Asset Bundles (DAB) can be used to migrate workspace assets such as jobs, pipelines, notebooks, and custom serving endpoin...
DABs are useful but not sufficient. They work well for re-creating control-plane assets such as jobs, notebooks, DLT/Lakeflow pipelines, and model serving endpoints in a target workspace, even across clouds, by using environment-specific targets and...
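A minimal sketch of the cross-cloud deployment loop, driven from Python via the Databricks CLI; the target names azure_prod and gcp_prod are hypothetical and must match targets defined in the bundle's databricks.yml:

```python
import subprocess

# Hypothetical target names; each maps to a workspace host
# (Azure or GCP) in the bundle's databricks.yml.
for target in ["azure_prod", "gcp_prod"]:
    # Validate first, then deploy the same bundle definition.
    subprocess.run(["databricks", "bundle", "validate", "--target", target], check=True)
    subprocess.run(["databricks", "bundle", "deploy", "--target", target], check=True)
```

Data, Unity Catalog objects, and other state outside the bundle still need a separate migration path.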
- 79 Views
- 2 replies
- 1 kudos
How do you manage alerts?
Hey all, I'm curious: how do teams manage Databricks alerts? My use case is that I have around 10 Spark workflows and need to validate their output tables. My first iteration was to create alerts manually, e.g. define SQL, evaluation criteria, notificat...
Hi @smirnoal, if you want more dynamic behaviour you can use Python for Databricks Asset Bundles, which extends Databricks Asset Bundles so that you can define resources in Python code. These definitions can coexist with resources defined in Y...
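A minimal sketch of that pattern, applied to the alerting question above: generate one validation job per output table instead of hand-writing ten YAML blocks. It assumes the databricks-bundles package's Bundle/Resources/Job interfaces and hypothetical table and notebook names:

```python
from databricks.bundles.core import Bundle, Resources
from databricks.bundles.jobs import Job

# Hypothetical output tables to validate.
TABLES = ["sales_daily", "orders_clean", "customers_scd"]

def load_resources(bundle: Bundle) -> Resources:
    """Referenced from databricks.yml; builds one job per table."""
    resources = Resources()
    for table in TABLES:
        job = Job.from_dict({
            "name": f"validate_{table}",
            "tasks": [{
                "task_key": "validate",
                "notebook_task": {
                    # Hypothetical notebook that checks the table and notifies.
                    "notebook_path": "notebooks/validate_table",
                    "base_parameters": {"table": table},
                },
            }],
        })
        resources.add_resource(f"validate_{table}", job)
    return resources
```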
- 129 Views
- 3 replies
- 3 kudos
AI/BI Dashboard embed issue in Databricks App
Hi everyone, I’ve created an AI/BI Dashboard in Azure Databricks, successfully published it, and generated an embed link. My goal is to embed this dashboard inside a Databricks App (Streamlit) using an iframe. However, when I try to render the dashboard...
Hmmm, this is new to me. However, I did some poking around in our internal docs and came up with a few more suggestions/tips you can chase down. Not sure if it will help, but it gives you a little more to work with. That specific error string ...
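For reference, the embedding itself is usually just an iframe component inside the Streamlit app; a minimal sketch, assuming a hypothetical embed URL copied from the dashboard's share dialog (the app's domain must still be approved for embedding in the workspace settings):

```python
import streamlit as st
import streamlit.components.v1 as components

# Hypothetical embed link generated when publishing the dashboard.
EMBED_URL = "https://adb-1234567890123456.7.azuredatabricks.net/embed/dashboardsv3/<dashboard-id>"

st.title("Embedded AI/BI dashboard")
# Renders the published dashboard inside the app via an iframe.
components.iframe(EMBED_URL, height=800)
```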
- 188 Views
- 4 replies
- 7 kudos
Issue when creating Salesforce Connector
Hi, I'm trying to create a Salesforce Connector in Lakeflow. In the "Salesforce authentication" step, I'm entering my Salesforce username and password, and then I get stuck with the following error message: "OAUTH_APPROVAL_ERROR_GENERIC". My Salesforce...
Hi guys, sorry to bother you again... I know, the Salesforce admin in my company is quite "narrow-minded" regarding security issues, and I don't know enough about security protocols to answer him. Do you know why we need a "Connected App" while we ar...
- 87 Views
- 3 replies
- 0 kudos
Can't display histogram in Databricks Notebooks
Is this a known bug? It says to use display(dataframe), but it is not working. How can I display it using Databricks Visualizations? Thanks in advance.
The describe output looks like a completely different table from the image of values; they have different field names and different types. At what point do you get the error? Does the visualisation pop-up appear? It should work fine. Is it when you tr...
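If the built-in visualization keeps failing, a quick way to sanity-check the data is to plot the histogram directly; a minimal sketch with a hypothetical numeric column "amount" standing in for the real table:

```python
from pyspark.sql import SparkSession
import matplotlib.pyplot as plt

spark = SparkSession.builder.getOrCreate()

# Hypothetical numeric data; display(df) in a notebook should open
# the visualization editor for this DataFrame.
df = spark.createDataFrame([(float(x % 17),) for x in range(100)], ["amount"])

# Fallback: collect the single column to the driver and plot it.
values = [row["amount"] for row in df.select("amount").collect()]
plt.hist(values, bins=20)
plt.xlabel("amount")
plt.ylabel("count")
plt.show()
```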
- 97 Views
- 1 reply
- 0 kudos
Delta Sharing from Databricks to SAP BDC fails with invalid_client error
Context: We are in the process of exchanging data between SAP BDC Datasphere and Databricks (brownfield implementation). SAP Datasphere is hosted in AWS (eu10); Databricks is hosted in Azure (West Europe). The BDC Connect system is located in the same regio...
This is a common challenge in enterprise SAP Datasphere and Databricks integrations, particularly in brownfield, cross-cloud setups. We’ve seen multiple cases where sharing between SAP and Databricks works as expected, while the reverse path introduc...
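One way to narrow down an invalid_client error is to test the OAuth client-credentials exchange outside both platforms; a minimal sketch, where the token URL and credentials are hypothetical placeholders for whatever client is configured on the SAP BDC side:

```python
import requests

# Hypothetical placeholders for the OAuth client registered for the share.
TOKEN_URL = "https://<auth-host>/oauth/token"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"

resp = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials"},
    auth=(CLIENT_ID, CLIENT_SECRET),
    timeout=30,
)
# An invalid_client response here implicates the client registration
# itself rather than the Delta Sharing configuration.
print(resp.status_code, resp.text)
```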
- 137 Views
- 2 replies
- 1 kudos
Databricks Free Edition Account Migration
Hello, I set up a Databricks Free Edition account with the intention of running it on Azure, since my environment is based in the Azure cloud. However, the account was provisioned on AWS instead. Is there a way to migrate it? Please provide the steps ...
@libpekin - Short answer: it's AWS-only, and there is no 'automated' path/choice to migrate to Azure.
- 258 Views
- 5 replies
- 0 kudos
Proxy configuration while bootstrapping
I am trying to start a cluster in Azure Databricks; our policy is to use a proxy for outbound traffic. I have configured http_proxy, https_proxy, HTTP_PROXY, HTTPS_PROXY, no_proxy, and NO_PROXY in environment variables and globally, and made sure the proxy is bypassin...
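One thing worth confirming is that the proxy variables are set on the cluster spec itself, not only inside init scripts; a minimal sketch using the Databricks SDK, with a hypothetical proxy host, bypass list, and cluster sizing:

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Hypothetical corporate proxy and bypass list.
proxy = "http://proxy.corp.example:8080"
no_proxy = "localhost,127.0.0.1,.azuredatabricks.net"

# spark_env_vars are exported before user init scripts run.
cluster = w.clusters.create(
    cluster_name="proxy-test",
    spark_version="15.4.x-scala2.12",
    node_type_id="Standard_DS3_v2",
    num_workers=1,
    spark_env_vars={
        "http_proxy": proxy, "https_proxy": proxy,
        "HTTP_PROXY": proxy, "HTTPS_PROXY": proxy,
        "no_proxy": no_proxy, "NO_PROXY": no_proxy,
    },
).result()
print(cluster.cluster_id)
```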
- 149 Views
- 1 reply
- 0 kudos
Databricks import directory false positive import
Hello everyone, I'm using the Databricks CLI to move several directories from Azure Repos to a Databricks workspace. The problem is that files are not updating properly, with no error displayed. The self-hosted agent in the pipeline I'm using has installed the ...
Hi @Giuseppe_C, the Databricks CLI is not syncing updates during your pipeline runs. Several teams we work with have faced the same issue with legacy CLI versions and workspace import behavior. We’ve helped them stabilize CI/CD pipelines for Databricks, i...
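If the agent is still on the legacy CLI, one common fix is to pin the current CLI and import with an explicit overwrite so stale files fail loudly instead of being silently skipped; a minimal sketch with hypothetical paths:

```python
import subprocess

# Hypothetical source directory and workspace destination.
LOCAL_DIR = "repo/notebooks"
WORKSPACE_DIR = "/Workspace/Shared/etl"

# --overwrite replaces existing workspace files; check=True makes the
# pipeline step fail on any CLI error instead of reporting success.
subprocess.run(
    ["databricks", "workspace", "import-dir", LOCAL_DIR, WORKSPACE_DIR, "--overwrite"],
    check=True,
)
```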
- 144 Views
- 1 reply
- 0 kudos
Databricks Job: unable to read job run parameters in Scala code and SQL queries
We created a Databricks job with a JAR (Scala code) and provided JAR parameters, and we are able to read those as arguments in the main method. When we run the job with parameters (run parameters / job parameters), those parameters are not able to re...
Hey @kumarV, I did some digging and here are some hints/tips to help you troubleshoot further. Yep, this really comes down to how parameters flow through Lakeflow Jobs depending on the task type. JAR tasks are the odd duck: they don’t get the same ...
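The usual workaround is to forward job parameters to the JAR task explicitly as dynamic value references, so they arrive as ordinary main(args) values; a minimal sketch with the Databricks SDK, where the main class, cluster id, and parameter name are hypothetical:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

job = w.jobs.create(
    name="jar-params-demo",
    # Job-level parameter, settable per run.
    parameters=[jobs.JobParameterDefinition(name="run_date", default="2024-01-01")],
    tasks=[
        jobs.Task(
            task_key="main",
            existing_cluster_id="<cluster-id>",  # hypothetical
            spark_jar_task=jobs.SparkJarTask(
                main_class_name="com.example.Main",  # hypothetical
                # Resolved at run time and passed to main(args).
                parameters=["{{job.parameters.run_date}}"],
            ),
        )
    ],
)
print(job.job_id)
```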
- 176 Views
- 2 replies
- 2 kudos
Resolved! Model serving with provisioned throughput fails
I'm trying to serve a model with provisioned throughput, but I'm getting this error: Build could not start due to an internal error. If you are serving a model from UC and Azure storage firewall or Private Link is configured on your storage account, pl...
Hi team, creating an endpoint in your workspace requires serverless compute, so you need to update the storage account’s firewall to allow Databricks serverless compute via your workspace’s Network Connectivity Configuration (NCC). If the storage account f...
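If the storage firewall is managed with the Azure CLI, the allow-rule can be scripted; a minimal sketch where the account name, resource group, and NCC subnet ID are hypothetical placeholders taken from the workspace's NCC page:

```python
import subprocess

# Hypothetical storage account guarding the UC model artifacts, and one
# of the serverless subnet IDs listed in the workspace's NCC.
ACCOUNT = "mymodelstorage"
RESOURCE_GROUP = "my-rg"
NCC_SUBNET_ID = "/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Network/virtualNetworks/<vnet>/subnets/<subnet>"

subprocess.run(
    ["az", "storage", "account", "network-rule", "add",
     "--account-name", ACCOUNT,
     "--resource-group", RESOURCE_GROUP,
     "--subnet", NCC_SUBNET_ID],
    check=True,
)
```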
- 129 Views
- 1 reply
- 1 kudos
Updating projects created from Databricks Asset Bundles
Hi all, We are using Databricks Asset Bundles for our data science / ML projects. The asset bundle we have has spawned quite a few projects by now, but now we need to make some updates to the asset bundle. The updates should also be added to the spaw...
Greetings @Sleiny, here’s what’s really going on, plus a pragmatic, field-tested plan you can actually execute without tearing up your repo strategy. Let’s dig in. What’s happening: Databricks Asset Bundle templates are used at initialization time ...
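Since templates are only consumed at initialization, one pragmatic pattern is to re-render the updated template into a scratch directory and diff it against each spawned project; a minimal sketch, where the template URL and project path are hypothetical:

```python
import subprocess
import tempfile

TEMPLATE = "https://github.com/example-org/dab-template"  # hypothetical
PROJECT_DIR = "projects/churn-model"                      # hypothetical

with tempfile.TemporaryDirectory() as scratch:
    # Re-render the template (prompts can be pre-answered via --config-file).
    subprocess.run(
        ["databricks", "bundle", "init", TEMPLATE, "--output-dir", scratch],
        check=True,
    )
    # Review the drift between the fresh render and the existing project.
    subprocess.run(["diff", "-ru", scratch, PROJECT_DIR])
```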
- 312 Views
- 5 replies
- 6 kudos
Resolved! AbfsRestOperationException when adding privatelink.dfs.core.windows.net
Hey Databricks forum, I have been searching a lot but can't find a solution. I have the following setup: a VNet connected to the Databricks workspace, with a public subnet (delegated to Microsoft.Databricks/workspaces) plus an NSG, and a private subnet (d...
Yes, that's the solution! I thought I had tested this (maybe some caching...). When I changed it to abfss://metastore@<storageaccount>.dfs.core.windows.net, it still failed with: Failed to access cloud storage: [AbfsRestOperationException] The storage publ...
- 3790 Views
- 6 replies
- 1 kudos
Unable to destroy NCC private endpoint
Hi team, we accidentally removed one of the NCC private endpoints from our storage account that was created using Terraform. When I tried to destroy and recreate it, I encountered the following error. According to some articles, the private endpoint w...
Just let the state forget about it:
terraform state rm 'your_module.your_terraformresource'
You can find that Terraform resource by using:
terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule
and later validate the id:
terraform stat...
- 969 Views
- 3 replies
- 3 kudos
Resolved! Lakebase -- Enable RLS in synced Table
Dear all, I am currently testing Lakebase for integration into our overall system. In particular, I need to enable RLS on a Lakebase table, which is synced from a "Delta Streaming Table" in UC. Setting up the data sync was no trouble; in UC I am the owne...
Hello @DaPo! Could you please confirm whether you are the owner of the table within the Lakebase Postgres (not just in Unity Catalog)? Also, can you try creating a view on the synced table and then configuring RLS on that view?
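A minimal sketch of the view-based approach; since Postgres policies attach to tables, this approximates RLS with a filtering view and grants readers access to the view only. Connection details, schema, table, column, and role names are all hypothetical:

```python
import psycopg2

# Hypothetical Lakebase Postgres connection settings.
conn = psycopg2.connect(
    host="<lakebase-host>", dbname="databricks_postgres",
    user="<user>", password="<token>", sslmode="require",
)

with conn, conn.cursor() as cur:
    # The view filters rows by a per-session setting; adapt the predicate
    # to whatever column drives your row-level access model.
    cur.execute("""
        CREATE OR REPLACE VIEW app.orders_filtered AS
        SELECT * FROM synced.orders
        WHERE region = current_setting('app.region', true)
    """)
    # Readers see the view, never the synced base table.
    cur.execute("REVOKE ALL ON synced.orders FROM app_reader")
    cur.execute("GRANT SELECT ON app.orders_filtered TO app_reader")
```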