- 1055 Views
- 3 replies
- 1 kudos
Resolved! NAT Gateway IP update
Hi, My Databricks (Premium) account was deployed on AWS. It was provisioned a few months ago from the AWS Marketplace with the QuickStart method, based on CloudFormation. The NAT Gateway initially created by the CloudFormation stack has been incidenta...
Thanks for your answer. Yes, I did all of that, as well as allowing the traffic coming from my NAT instance as an inbound rule of the security group of the private instances (and the other way around), with no restriction on the outbound traffic. I suspe...
- 28621 Views
- 6 replies
- 1 kudos
Resolved! Delete Databricks account
Hi everyone, as in the topic, I would like to delete an unnecessarily created account. I have found outdated solutions (e.g. https://community.databricks.com/t5/data-engineering/how-to-delete-databricks-account/m-p/6323#M2501), but they do not work anym...
Hi @Edyta, for AWS see "Manage your subscription | Databricks on AWS": "Before you delete a Databricks account, you must first cancel your Databricks subscription and delete all Unity Catalog metastores in the account. After you delete all metastores associ...
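For the metastore cleanup step, a minimal sketch using the databricks-sdk (the accounts host, account ID, and credentials are placeholders; check the exact method names against your SDK version):

```python
# Sketch: list and delete all Unity Catalog metastores at the account level
# before cancelling the subscription. Assumes account-admin credentials;
# host and account_id values are placeholders.
from databricks.sdk import AccountClient

a = AccountClient(
    host="https://accounts.cloud.databricks.com",  # AWS accounts console
    account_id="<your-account-id>",
)

for m in a.metastores.list():
    # force=True detaches any workspaces still assigned to the metastore
    a.metastores.delete(metastore_id=m.metastore_id, force=True)
```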
- 558 Views
- 1 replies
- 0 kudos
Is it possible to let multiple Apps share the same compute?
Apparently, for every app deployed on Databricks, a separate VM is allocated, costing 0.5 DBU/hour. This seems inefficient: why can't a single VM support multiple apps? It feels like a waste of money and resources to allocate independent VMs per app ...
Hi @Aminsn, your understanding is correct in that only one running app can be deployed per app instance. If you want to maximize the utilization of the compute, one option could be to create a multi-page app, where the landing page will direct users...
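To make the multi-page approach concrete, a minimal sketch using Streamlit's built-in page navigation (requires Streamlit 1.36+; the page file names are hypothetical):

```python
# app.py - one Databricks App serving several tools as separate pages.
# The landing page is just the navigation; each page lives in its own file.
import streamlit as st

nav = st.navigation([
    st.Page("pages/sales_dashboard.py", title="Sales Dashboard"),
    st.Page("pages/ops_monitor.py", title="Ops Monitor"),
])
nav.run()  # renders whichever page the user selected
```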
- 1141 Views
- 0 replies
- 0 kudos
Databricks OAUTH(OIDC) with ORY Network
Hi, we are trying to set up OIDC auth for Databricks with our Ory Network account. So far we have been using it without any issues with all of our apps, and now we wanted to set it up for Databricks as well. Unfortunately, after many attempts with different...
- 5633 Views
- 2 replies
- 0 kudos
Databricks App : Limitations
I have some questions regarding Databricks Apps. 1) Can we use a framework other than those mentioned in the documentation (Streamlit, Flask, Dash, Gradio, Shiny)? 2) Can we allocate compute with more than 2 vCPUs and 6 GB memory to any app? 3) Any other programming language o...
1) You can use most Python-based application frameworks, including some beyond those mentioned above (reference here). 2) Currently, app capacity is limited to 2 vCPUs and 6 GB of RAM. However, future updates may introduce options for scaling out an...
- 1528 Views
- 4 replies
- 2 kudos
Resolved! OAuth API for service user
Is there a way to programmatically create an OAuth secret for a workspace service principal via API/SDK? As of now, the only way I can see to do this is through the UI.
@andreapeterson isn't this the API you are looking for? https://docs.databricks.com/api/azure/account/serviceprincipalsecrets/create It is an account-level API, but, counterintuitively, when we create service principals in the workspace, they propagate i...
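A minimal sketch of calling that endpoint (Azure accounts host shown; the account ID, the service principal's numeric ID, and the bearer token are placeholders):

```python
# Sketch: create an OAuth secret for a workspace service principal via the
# account-level REST API linked above. Values in <> are placeholders.
import requests

account_id = "<databricks-account-id>"
sp_id = "<service-principal-numeric-id>"  # the internal ID, not the application ID

resp = requests.post(
    f"https://accounts.azuredatabricks.net/api/2.0/accounts/{account_id}"
    f"/servicePrincipals/{sp_id}/credentials/secrets",
    headers={"Authorization": "Bearer <account-admin-token>"},
)
resp.raise_for_status()
print(resp.json())  # contains the generated secret; it is shown only once, store it securely
```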
- 635 Views
- 1 replies
- 0 kudos
Load assignment during Distributed training
Hi, I wanted to confirm: in distributed training, is there any way I can manually control what kind/amount of load/data is sent to specific worker nodes? Or is it handled completely automatically by Spark's scheduler, and we don't have...
From what I know, Spark automatically handles how data and workload are distributed across worker nodes during distributed training; you can't manually control exactly what or how much data goes to a specific node. You can still influence the distrib...
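For the part you can influence, a small sketch (the table and column names are hypothetical):

```python
# You cannot pin data to a specific worker, but you can shape how Spark
# partitions it, which in turn shapes the per-node workload.
df = spark.read.table("training_data")          # hypothetical source table

df_even = df.repartition(64)                    # 64 roughly equal partitions
df_by_key = df.repartition(64, "customer_id")   # co-locate rows that share a key

# Parallelism of shuffle stages downstream:
spark.conf.set("spark.sql.shuffle.partitions", "64")
```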
- 775 Views
- 1 replies
- 0 kudos
Running Driver Intensive workloads on all purpose compute
We recently observed that when we run driver-intensive code on an all-purpose compute, parallel runs of jobs of the same pattern/kind fail. Example: a job triggered on all-purpose compute with 4 cores and 8 GB RAM for the driver. Let's s...
This will help you. Note that `spark.driver.memory` and `spark.driver.maxResultSize` cannot be changed at runtime with `spark.conf.set()`; set both in the cluster's Spark config (Advanced options) so they apply when the driver starts:

```
spark.driver.memory 16g
spark.driver.maxResultSize 8g
```

The first doubles the current driver allocation; the second prevents failures when large results are collected to the driver.
- 1061 Views
- 3 replies
- 3 kudos
Resolved! how to modify workspace creator
Hi. Our 3 workspaces were created by a consultant who is no longer with us. The workspace still shows his name. How can we change it? Also, what kind of account should be the creator of a workspace: a service principal, an AD account, or an AD group? Please see b...
You can file an Azure ticket and request that they contact Databricks Support, who will contact Databricks Engineering to change that 'workspace owner'. If you have a Databricks Account Team, they can also file an internal ticket to Databricks Engine...
- 1148 Views
- 4 replies
- 0 kudos
Terraforming Git credentials for service principals
I am terraforming service principals in my Databricks workspace, and it works great until I need to assign Git credentials to my SP. In the UI we have these options to configure credentials on the service principal page. However, the Terraform resource I fo...
You're a little bit ahead of me in this process, so I haven't tried the solution yet, but it looks like you create a git credential resource for the service principal. This requires a token, which I think must be generated in the console. My refere...
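If the Terraform route stays awkward, one workaround sketch outside Terraform, using the databricks-sdk authenticated as the service principal itself (the host, client ID/secret, and Git username are placeholders):

```python
# Sketch: register a Git credential on behalf of the SP by authenticating
# as the SP (OAuth M2M) and calling the Git credentials API as that identity.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(
    host="https://<workspace-url>",
    client_id="<sp-application-id>",
    client_secret="<sp-oauth-secret>",
)
w.git_credentials.create(
    git_provider="gitHub",
    git_username="<git-username>",
    personal_access_token="<github-pat>",  # still needs a token from the Git provider
)
```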
- 931 Views
- 1 replies
- 0 kudos
Unity Catalog: 403 Error When Connecting S3 via IAM Role and Storage Credential
Hi, We're currently setting up Databricks Unity Catalog on AWS. We created an S3 bucket and assigned an IAM role (databricks-storage-role) to give Databricks access. Note: Databricks doesn't use the IAM role directly. Instead, it requires a Storage Cre...
Have you followed any specific guide for this setup? Are you setting up a Unity Catalog metastore or the default storage for the workspace? For the metastore creation, have you followed the steps in https://docs.databricks.com/aws/en/data-governa...
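In case you are creating the Storage Credential via the API, a sketch of the call (all values are placeholders). A 403 during validation usually traces back to the IAM role's trust policy (including the external ID, which must be your Databricks account ID) or the bucket policy, rather than to this call itself:

```python
# Sketch: register the IAM role as a Unity Catalog storage credential.
# Workspace URL, token, names, and ARNs are placeholders.
import requests

resp = requests.post(
    "https://<workspace-url>/api/2.1/unity-catalog/storage-credentials",
    headers={"Authorization": "Bearer <token>"},
    json={
        "name": "databricks-storage-cred",  # hypothetical name
        "aws_iam_role": {
            "role_arn": "arn:aws:iam::<aws-account-id>:role/databricks-storage-role"
        },
    },
)
resp.raise_for_status()
print(resp.json())
```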
- 2693 Views
- 7 replies
- 1 kudos
How to install (mssql) drivers to jobcompute?
Hello, I'm having this issue with job computes. The snippet of the code is as follows:

```python
if self.conf["persist_to_sql"]:
    # persist to sql
    df_parsed.write.format(
        "com.microsoft.sqlserver.jdbc.spark"
...
```
For a job compute, you would have to go the init-script route. Can you please share the cause of the failure of the library installation via the init script?
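As a side note, recent Databricks Runtime versions bundle the Microsoft SQL Server JDBC driver, so if the connector library keeps failing to install on job compute, the stock `jdbc` format usually works without any extra library (a sketch; connection values are placeholders):

```python
# Fallback sketch: write via the built-in JDBC path instead of the
# com.microsoft.sqlserver.jdbc.spark connector. Placeholders in <>.
(
    df_parsed.write.format("jdbc")
    .option("url", "jdbc:sqlserver://<server>:1433;databaseName=<db>")
    .option("dbtable", "dbo.<table>")
    .option("user", "<user>")
    .option("password", "<password>")
    .mode("append")
    .save()
)
```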
- 2414 Views
- 1 replies
- 0 kudos
Static IP for existing workspace
Is there a way to have static IP addresses for Azure Databricks without creating a new workspace? We have worked a lot in 2 workspaces (dev and main), but now we need static IP addresses for both to work with some APIs. Do we really have to recreate the...
I don't think so, at least not on Azure. What you need to do depends on how you set up your workspaces. In Azure, if you just use a default install, a NAT gateway is created and configured for you, so you probably already have a static IP. If you us...
- 10164 Views
- 11 replies
- 0 kudos
What is the best practice for connecting Power BI to Azure Databricks?
I referred to this document to connect Power BI Desktop and Power BI Service to Azure Databricks: Connect Power BI to Azure Databricks - Azure Databricks | Microsoft Learn. However, I have a couple of questions and concerns. Can anyone kindly help? It seems l...
To securely connect Power BI to Azure Databricks, avoid using PATs and instead configure a Databricks Service Principal with SQL Warehouse access. Power BI Service does not support Client Credential authentication, so Service Principal authentication...
- 832 Views
- 1 replies
- 0 kudos
Gcs databricks community
Hello, I would like to know if it is possible to connect my Databricks Community account with a Google Cloud Storage account via a notebook. I tried to connect it via the JSON key of my GCS service account, but the notebook always gives this error when ...
Hi @oricaruso, to connect to GCS you typically need to set the service account JSON key in the cluster's Spark config, not just in the notebook. However, since the Community Edition has several limitations, like the absence of secret scopes, restrict...
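For reference, the usual cluster-level setup looks like this (a sketch; the values come from the service account JSON key file, and the bucket name is hypothetical). Whether it works on Community Edition is another matter, given the restrictions above:

```python
# These Hadoop/GCS settings normally go in the cluster's Spark config
# (key/value pairs, taken from the service account JSON key):
#   spark.hadoop.google.cloud.auth.service.account.enable   true
#   spark.hadoop.fs.gs.auth.service.account.email           <client_email>
#   spark.hadoop.fs.gs.project.id                           <project_id>
#   spark.hadoop.fs.gs.auth.service.account.private.key     <private_key>
#   spark.hadoop.fs.gs.auth.service.account.private.key.id  <private_key_id>

# With the config in place, a notebook can read the bucket directly:
df = spark.read.format("csv").option("header", "true").load("gs://<bucket>/<path>/")
df.display()
```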