- 956 Views
- 2 replies
- 3 kudos
Resolved! Databricks Free Edition Account Migration
Hello, I set up a Databricks Free Edition account with the intention of running it on Azure, since my environment is based in the Azure cloud. However, the account was provisioned on AWS instead. Is there a way to migrate it? Please provide the steps ...
@libpekin - Short answer: it's AWS-only, and there is no automated path or option to migrate it to Azure.
- 1067 Views
- 5 replies
- 0 kudos
Proxy configuration while bootstrapping
I am trying to start a cluster in Azure Databricks; our policy is to use a proxy for outbound traffic. I have configured the http_proxy, https_proxy, HTTP_PROXY, HTTPS_PROXY, no_proxy, and NO_PROXY variables in the environment and globally. Made sure the proxy is bypassin...
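Since failures like this often come down to the variables not being visible to the process that needs them, a quick sanity check from a notebook can help. A minimal sketch, assuming a hypothetical proxy URL (not from the thread):

```python
# Print the proxy variables as the Python process on the cluster sees them.
import os
import requests

for var in ("http_proxy", "https_proxy", "HTTP_PROXY", "HTTPS_PROXY", "no_proxy", "NO_PROXY"):
    print(var, "=", os.environ.get(var))

# requests honors HTTP(S)_PROXY from the environment by default; passing an
# explicit proxies mapping makes the routing visible. The proxy URL is a
# hypothetical placeholder.
proxies = {"https": os.environ.get("HTTPS_PROXY", "http://proxy.example.com:8080")}
resp = requests.get("https://pypi.org/simple/", proxies=proxies, timeout=15)
print(resp.status_code)
```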
- 1174 Views
- 2 replies
- 2 kudos
Resolved! Model serving with provisioned throughput fails
I'm trying to serve a model with provisioned throughput but I'm getting this error: Build could not start due to an internal error. If you are serving a model from UC and Azure storage firewall or Private Link is configured on your storage account, pl...
Hi team, creating an endpoint in your workspace requires serverless compute, so you need to update the storage account's firewall to allow Databricks serverless compute via your workspace's Network Connectivity Configuration (NCC). If the storage account f...
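Once the storage firewall allows the workspace's NCC, the endpoint creation can be retried. A minimal sketch of the serving-endpoints REST call, assuming a hypothetical workspace URL, token, and UC model name, with throughput values that must match the bands your model supports:

```python
# Create a provisioned-throughput endpoint via the REST API.
# All identifiers below are placeholders, not values from the thread.
import requests

host = "https://<workspace-url>"   # placeholder
token = "<access-token>"           # placeholder

payload = {
    "name": "my-pt-endpoint",
    "config": {
        "served_entities": [{
            "entity_name": "main.default.my_model",  # UC model, hypothetical
            "entity_version": "1",
            "min_provisioned_throughput": 0,
            "max_provisioned_throughput": 100,
        }]
    },
}
resp = requests.post(
    f"{host}/api/2.0/serving-endpoints",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
    timeout=60,
)
print(resp.status_code, resp.json())
```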
- 1515 Views
- 1 replies
- 1 kudos
Resolved! Updating projects created from Databricks Asset Bundles
Hi all, we are using Databricks Asset Bundles for our data science / ML projects. The asset bundle we have has spawned quite a few projects by now, but now we need to make some updates to the asset bundle. The updates should also be added to the spaw...
Greetings @Sleiny, here's what's really going on, plus a pragmatic, field-tested plan you can actually execute without tearing up your repo strategy. Let's dig in. What's happening: Databricks Asset Bundle templates are used at initialization time ...
- 2153 Views
- 5 replies
- 6 kudos
Resolved! AbfsRestOperationException when adding privatelink.dfs.core.windows.net
Hey Databricks forum, I have been searching a lot but can't find a solution. I have the following setup: a VNet connected to the Databricks workspace with a public subnet (delegated to Microsoft.Databricks/workspaces) and an NSG, and a private subnet (d...
Yes, that's the solution! I thought I had tested this (maybe some caching..). When I changed it to abfss://metastore@<storageaccount>.dfs.core.windows.net it still failed with: Failed to access cloud storage: [AbfsRestOperationException] The storage publ...
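For anyone hitting the same thing, a quick way to confirm the path resolves is to list it from a notebook. A minimal sketch, run in a Databricks notebook where dbutils is predefined; the storage account name is a placeholder:

```python
# List the container through the abfss URI; an AbfsRestOperationException
# surfaces here if the private endpoint / firewall path is still broken.
path = "abfss://metastore@<storageaccount>.dfs.core.windows.net/"
try:
    for info in dbutils.fs.ls(path):
        print(info.path)
except Exception as e:
    print("Access failed:", e)
```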
- 5176 Views
- 6 replies
- 1 kudos
Unable to destroy NCC private endpoint
Hi Team, accidentally we removed one of the NCC private endpoints from our storage account that was created using Terraform. When I tried to destroy and recreate it, I encountered the following error. According to some articles, the private endpoint w...
Just let the state forget about it:
terraform state rm 'your_module.your_terraformresource'
You can find that Terraform resource by using:
terraform state list | grep -i databricks_mws_ncc_private_endpoint_rule
and later validating the id:
terraform stat...
- 666 Views
- 1 replies
- 1 kudos
Resolved! My trial is about to expire
I'm aware that my workspace/subscription will be converted into a 'pay-as-you-go' model. That's okay; however, I wonder why you don't provide a non-restricted plan just for learning. I'm sure there are ways to block commercial use. However, that's not my...
Hi @quakenbush, in the past you had to create a new VNet-injected workspace and migrate all workloads from the existing managed workspace to enable VNet injection. This process was necessary because there was no direct way to convert a managed worksp...
- 1519 Views
- 6 replies
- 0 kudos
Skepticism about U2M OAuth: Does Snowflake Federation Actually Switch User Identity per Query?
Hi everyone, I'm currently setting up Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth). However, I'm skeptical that the connection truly switches the user identity dynamically for each Databricks user (https://docs.databricks....
Snowflake federation with Databricks using Microsoft Entra ID (U2M OAuth) is intended to support per-user identity propagation—that is, each Databricks user is supposed to have queries executed under their own Snowflake identity at query time, rather...
- 1171 Views
- 2 replies
- 3 kudos
Resolved! Azure Databricks Meters vs Databricks SKUs from system.billing table
When it comes to DBUs, I am being charged by Azure for the following meters:
- Premium Jobs Compute DBU <-- DBUs that my job computes are spending
- Premium Serverless SQL DBU <-- DBUs that the SQL Warehouse compute is spending
- Premium All-Purpose Phot...
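One way to line the two views up is to aggregate DBUs by SKU straight from the system table and compare against the Azure meter names. A minimal sketch, run in a notebook (spark is predefined) and assuming the system.billing schema is enabled for your metastore:

```python
# Total DBUs per Databricks SKU over the last 30 days.
df = spark.sql("""
    SELECT sku_name,
           SUM(usage_quantity) AS total_dbus
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 30)
    GROUP BY sku_name
    ORDER BY total_dbus DESC
""")
df.show(truncate=False)
```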
- 1972 Views
- 5 replies
- 0 kudos
Databricks Asset Bundle Deployment Fails in GitHub Actions with Federated Identity Credentials
I am using a service principal with workspace admin access to deploy Databricks asset bundles. The deployment works successfully via Jenkins using the same credentials and commands. However, when attempting the deployment through GitHub Actions, I en...
Environment variables override .databrickscfg; that's probably why OIDC is failing. Make sure you have the correct specification in your databricks.yml so it is the source of truth. Something like:
- name: Deploy bundle
  env:
    DATABRICKS_HOST: ...
- 1479 Views
- 4 replies
- 1 kudos
TCO calculator for Databricks Analytics
Similar to the cloud infra calculators, does a TCO calculator exist for Databricks? Let's say we have inputs such as the number of source tables, data pipelines (estimated number), data growth per day, transformation complexity, and target reports a...
@szymon_dybczak - I am aware of that calculator; however, the challenge is how to even calculate the number of DBUs it will consume based on the volume of data processing, etc. The tool starts with the infra and compute inputs. However, my question i...
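In the absence of an official TCO tool, the usual approach is a back-of-envelope model. A sketch only; every number below is an assumption to replace with your own estimates, and none of it comes from Databricks pricing:

```python
# Rough DBU/cost estimator. All inputs are assumed placeholders.
pipelines = 20                # estimated number of data pipelines
runs_per_day = 2              # executions per pipeline per day
avg_runtime_hours = 0.5       # average run time per execution
nodes_per_job = 4             # average cluster size
dbu_per_node_hour = 2.0       # depends on VM type; check the Azure pricing page
dbu_price = 0.30              # assumed $/DBU for Premium Jobs Compute

daily_dbus = pipelines * runs_per_day * avg_runtime_hours * nodes_per_job * dbu_per_node_hour
print(f"Estimated DBUs/day: {daily_dbus:.0f}")
print(f"Estimated DBU cost/month: ${daily_dbus * dbu_price * 30:,.2f}")
```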
- 484 Views
- 1 replies
- 0 kudos
Does Databricks support HNS in GCP?
Hello, I need to set up some buckets in GCP which will be used as an analytics and production data lake. I am getting diverging feedback on whether hierarchical namespaces (HNS) should be enabled for these buckets. On one hand, HNS is advisable for ana...
Hi @IvanPopov, according to the docs, Google Cloud Storage hierarchical namespace (HNS) is not supported with external locations. You must disable hierarchical namespace before creating an external location.
- 2435 Views
- 7 replies
- 3 kudos
UC volumes not useable in Apps?
I have to install a custom library in a Python Databricks App. According to the documentation, this should be possible through UC volumes: https://docs.databricks.com/aws/en/dev-tools/databricks-apps/dependencies#install-wheel-files-from-unity-catal...
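Before wiring the wheel into the app, it is worth confirming it is actually readable at the volume path the app will reference in its requirements.txt. A minimal sketch from a notebook; the catalog, schema, volume, and file names are hypothetical placeholders:

```python
# Volumes are FUSE-mounted under /Volumes on classic compute, so a plain
# filesystem check works from a notebook.
import os

wheel = "/Volumes/main/default/libs/my_lib-0.1.0-py3-none-any.whl"
if os.path.exists(wheel):
    print("Found wheel,", os.path.getsize(wheel), "bytes")
else:
    print("Wheel not found at", wheel)
```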
- 484 Views
- 2 replies
- 0 kudos
Internet Access from Serverless Databricks - free trial
Hi community. I started to use the Databricks quick-setup free trial and I have been trying to access the internet from a Python notebook, but I haven't been able to do so. Even my UI is different. Is it because I am using the free trial?
I changed the setup and linked it to an AWS workspace. It doesn't raise any error now. But I was using requests
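A quick way to test outbound connectivity from any of these environments is a plain requests call. A minimal sketch; on the free trial's serverless compute this may be blocked by the default network policy, in which case it fails rather than hanging silently:

```python
# Probe outbound internet access from a notebook.
import requests

try:
    resp = requests.get("https://pypi.org", timeout=10)
    print("Outbound OK:", resp.status_code)
except requests.exceptions.RequestException as e:
    print("Outbound blocked or failed:", e)
```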
- 1159 Views
- 3 replies
- 1 kudos
What is the maximum number of workspaces per account on GCP
I found this in the docs: "you can create at most 200 workspaces per week in the same Google Cloud project". But that directly contradicts the 20 limit that is mentioned in the resource limits docs. But Azure has no limit, and AWS has a limit of 50. So...
Hi @rmusti, this is a bit confusing but not contradictory. Here's the important line in the docs: "For limits where Fixed is No, you can request a limit increase through your Databricks account team." So, below you have a table with resource limits. In...
Labels:
- Access control (1)
- Apache spark (1)
- Azure (7)
- Azure databricks (5)
- Billing (2)
- Cluster (1)
- Compliance (1)
- Data Ingestion & connectivity (5)
- Databricks Runtime (1)
- Databricks SQL (2)
- DBFS (1)
- Dbt (1)
- Delta Sharing (1)
- DLT Pipeline (1)
- GA (1)
- Gdpr (1)
- Github (1)
- Partner (79)
- Public Preview (1)
- Service Principals (1)
- Unity Catalog (1)
- Workspace (2)