Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

JamesDryden
by New Contributor II
  • 1005 Views
  • 3 replies
  • 5 kudos

How are you deploying graphs?

Hi all, I have a couple of use cases that may benefit from using graphs.  I'm interested in whether anyone has graph databases in Production and, if so, whether you're using GraphFrames, Neo4j or something else?  What is the architecture you have the...

Latest Reply
-werners-
Esteemed Contributor III
  • 5 kudos

Up to now the way to go is GraphX or GraphFrames. There is also the possibility to use Python libraries or others (single node, that is), perhaps even Arrow-based. Another option is to load the data into a graph database and then move back to databricks a...

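Besides GraphX/GraphFrames, the reply mentions single-node Python libraries as an option for smaller graphs. A minimal sketch of that approach, using only the standard library and a hypothetical edge list (in practice the edges might be collected to the driver from a Spark DataFrame, which is only sensible for small graphs):

```python
from collections import deque

# Hypothetical edge list; real data might come from a Spark DataFrame
# collected to the driver (small graphs only).
edges = [("a", "b"), ("b", "c"), ("a", "d"), ("d", "c")]

# Build an adjacency list.
adjacency = {}
for src, dst in edges:
    adjacency.setdefault(src, []).append(dst)

def reachable(start):
    """Return the set of vertices reachable from `start` via BFS."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in adjacency.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

print(sorted(reachable("a")))  # → ['a', 'b', 'c', 'd']
```

For anything that doesn't fit on one node, GraphFrames over Spark DataFrames remains the distributed option the reply points to.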
sri840
by New Contributor
  • 553 Views
  • 3 replies
  • 0 kudos

Databricks Asset bundles

Hi Team, In our company we are planning to migrate our workflows to Databricks Asset Bundles. Is it mandatory to install the Databricks CLI tool to get started with DAB? Anyone who has integrated GitHub with a CI/CD pipeline, please let me know the ...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I forgot the CI/CD part: that is not that hard. Basically, in DAB you define the type of environment you are using. If you use 'development', DAB assumes you are in actual development mode (feature branch), so there you can connect Git and put the fil...

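On the CLI question: the bundle commands (`databricks bundle validate/deploy/run`) ship with the Databricks CLI, so installing it is the usual starting point. A minimal, hypothetical `databricks.yml` sketch showing the per-target environment idea from the reply (bundle name and workspace URLs are placeholders):

```yaml
# databricks.yml -- minimal bundle skeleton (names and URLs are placeholders)
bundle:
  name: my_bundle

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net

  prod:
    mode: production
    workspace:
      host: https://adb-2222222222222222.22.azuredatabricks.net
```

With this in place, a CI/CD pipeline typically just runs `databricks bundle deploy -t dev` (or `-t prod`) after authenticating.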
Kousuke_0716
by New Contributor
  • 300 Views
  • 1 reply
  • 0 kudos

De facto Standard for Databricks on AWS

Hello, I am working on creating an architecture diagram for Databricks on AWS. I would like to adopt the de facto standard used by enterprises. Based on my research, I have identified the following components: Network: Customer-managed VPC, Secure Cluste...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I would not call it a 'standard' but a possible architecture. The great thing about the cloud is that you can complete the puzzle in many ways and make it as complex or as simple as you like. Also, I would not consider Fivetran to be standard in companies. ...

johnb1
by Contributor
  • 2823 Views
  • 10 replies
  • 3 kudos

Resolved! Deploy Workflow only to specific target (Databricks Asset Bundles)

I am using Databricks Asset Bundles to deploy Databricks workflows to all of my target environments (dev, staging, prod). However, I have one specific workflow that is supposed to be deployed only to the dev target environment. How can I implement tha...

Latest Reply
meghanav_hmc
New Contributor II
  • 3 kudos

Hi, I'm also looking to deploy different jobs to different targets. These jobs are defined in a separate .yml file, and we'll need to reference them in the targets accordingly. Any updates on this implementation?

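One pattern worth validating for target-specific jobs: Databricks Asset Bundles allow resources to be declared under a specific target, so a job defined only under `targets.dev` is deployed only there. A hedged sketch (job, task, and notebook names are hypothetical; run `databricks bundle validate` per target to confirm):

```yaml
# databricks.yml excerpt -- a job declared only under the dev target
# (job/task/notebook names are placeholders)
targets:
  dev:
    resources:
      jobs:
        dev_only_job:
          name: dev-only-job
          tasks:
            - task_key: main
              notebook_task:
                notebook_path: ./notebooks/dev_only_demo

  staging: {}   # job not declared here, so not deployed
  prod: {}
```

Shared jobs stay in the top-level `resources` block and deploy to every target as usual.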
bigger_dave
by New Contributor II
  • 449 Views
  • 6 replies
  • 0 kudos

permissions tab is missing from policy UI

Hi Team. When I try to create a new policy, the Permissions tab is missing. I am an account admin. Any ideas why? Many thanks. Dave.

Latest Reply
MadhuB
Contributor III
  • 0 kudos

@bigger_dave If you are trying to create a compute policy, the Permissions tab should be available during configuration. If you want to grant access to an existing policy, the Permissions tab is available once you choose to edit the policy. If you are looking f...

NelsonE
by New Contributor III
  • 815 Views
  • 6 replies
  • 0 kudos

Resolved! Updating Workspace Cluster

Hello, My organization is experiencing difficulties updating our Google Kubernetes Engine (GKE) cluster. We've reviewed the official GKE documentation for automated cluster updates, but it appears to primarily focus on AWS integrations. We haven't foun...

  • 815 Views
  • 6 replies
  • 0 kudos
Latest Reply
JaxonMiller
New Contributor II
  • 0 kudos

You could try Terraform or gcloud scripts for automation.

Leo_310
by New Contributor II
  • 770 Views
  • 1 reply
  • 0 kudos

OAuth Url and ClientId Validation

Hi, I am trying to set up an OAuth connection with Databricks, so I ask the user to enter their workspace URL and client ID. Once the user enters these values, I want to validate whether they are correct, so I ask them to log in by redirecting them ...

Latest Reply
Leo_310
New Contributor II
  • 0 kudos

The RFC, for reference: https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.2.1

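Per RFC 6749 §4.1.2.1 (the section linked above), an invalid request comes back as an `error` parameter on the redirect, so validation amounts to building the authorization URL and inspecting the callback. A hedged, stdlib-only sketch (the `/oidc/v1/authorize` path reflects Databricks' OAuth endpoint layout but should be verified against your workspace; all URLs and IDs below are placeholders):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_authorize_url(workspace_url, client_id, redirect_uri, state):
    """Build the OAuth 2.0 authorization-code request URL."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,
        "scope": "all-apis offline_access",  # assumed scopes; adjust as needed
    }
    # /oidc/v1/authorize is assumed from Databricks' OAuth docs -- verify it.
    return f"{workspace_url.rstrip('/')}/oidc/v1/authorize?{urlencode(params)}"

def parse_callback(callback_url):
    """Return ('ok', code) or ('error', error_code) from the redirect URL."""
    query = parse_qs(urlparse(callback_url).query)
    if "error" in query:  # e.g. invalid_request, unauthorized_client (RFC 6749)
        return "error", query["error"][0]
    return "ok", query["code"][0]
```

An `error=unauthorized_client` (or a failure to load the authorize page at all) is the signal that the client ID or workspace URL the user entered is wrong.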
filipniziol
by Esteemed Contributor
  • 3748 Views
  • 5 replies
  • 2 kudos

Resolved! How to enable Genie?

Hi All, Based on the article below, to enable Genie one needs to: 1. Enable Azure AI services-powered features — that is done. 2. Genie must be enabled from the Previews page — I do not see Genie among the Previews. I am using Azure Databricks. Any idea how ...

Latest Reply
usman61
New Contributor II
  • 2 kudos

I can access Previews at the account level but can't see Genie among them.

nskiran1
by New Contributor II
  • 358 Views
  • 2 replies
  • 3 kudos

Databricks shared workspace

We have a self-service portal through which users can launch Databricks clusters of different configurations. This portal is set up to work in Dev, Sandbox and Prod environments. We have configured Databricks workspaces only for the Sandbox and Prod por...

Latest Reply
nskiran
New Contributor III
  • 3 kudos

@Alberto_Umana Thanks for sharing the doc links. We have the exact same setup to support a shared Databricks workspace, but I'm still facing an issue while adding an instance profile. I am trying to add an AWS instance profile created in the source AWS account (no Databricks w...

invalidargument
by New Contributor III
  • 240 Views
  • 1 reply
  • 2 kudos

displayHTML <a href="#id"> not working

Many packages output an HTML report, e.g. ydata-profiler. The report contains links to other parts of the report, but when the user clicks a link a new window is opened instead of scrolling to the correct section of the displayed HTML. Could this be...

Latest Reply
Alberto_Umana
Databricks Employee
  • 2 kudos

Hello @invalidargument, Currently, there is no direct support from the Databricks end to modify this behavior without using such a workaround. The displayHTML function in Databricks renders HTML content within an iframe, and the injected JavaScript h...

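The injected-JavaScript workaround the reply alludes to can be sketched as a preprocessing step: patch the report HTML before handing it to `displayHTML` so same-page `#id` links scroll inside the iframe instead of navigating. Whether the script actually runs depends on the iframe's sandbox settings, so treat this as an assumption to test, not a guaranteed fix:

```python
def patch_anchor_links(html):
    """Append a script that intercepts clicks on same-page '#id' links and
    scrolls to the target element inside the iframe (workaround sketch;
    behavior depends on displayHTML's iframe sandboxing)."""
    script = """
<script>
document.addEventListener('click', function (e) {
  var a = e.target.closest('a[href^="#"]');
  if (!a) return;
  e.preventDefault();
  var el = document.getElementById(a.getAttribute('href').slice(1));
  if (el) el.scrollIntoView({behavior: 'smooth'});
});
</script>"""
    # Insert before </body> when present, otherwise append at the end.
    if "</body>" in html:
        return html.replace("</body>", script + "</body>")
    return html + script
```

Usage would be `displayHTML(patch_anchor_links(report_html))` after generating the report.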
JacekJacek
by New Contributor III
  • 495 Views
  • 1 replies
  • 2 kudos

Resolved! Databricks Docker CLI image - how to debug Terraform when deploying Asset Bundle

We're having issues when deploying an asset bundle using the Databricks CLI Docker image. The validation part passes OK:
14:21:15 Name: test
14:21:15 Target: prototype-dev
14:21:15 Workspace:
14:21:15   Host: https://adb-xxx.azuredatabricks.net/
14:21:15 ...

Latest Reply
JacekJacek
New Contributor III
  • 2 kudos

OK, as it turns out: in order to bypass the proxy we needed to set the no_proxy env variable in both upper and lower case (!), like this:
NO_PROXY="adb-xxx.azuredatabricks.net" docker run \
  -v %teamcity.build.checkoutDir%:/my-bundle \
  -v %teamcity.build...

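The both-cases requirement comes from the fact that different tools disagree on which spelling of the proxy-bypass variable they read, so exporting both before invoking `docker run` is the safe pattern (hostname kept as the thread's placeholder):

```shell
# Some tools read only NO_PROXY, others only no_proxy -- export both.
export NO_PROXY="adb-xxx.azuredatabricks.net"
export no_proxy="$NO_PROXY"
```

Both variables can then be passed into the container with `docker run -e NO_PROXY -e no_proxy ...` so the CLI inside the image sees them too.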
AnkurMittal008
by New Contributor III
  • 411 Views
  • 2 replies
  • 1 kudos

Databricks Apps: AWS Secrets Manager Access

We are exploring Databricks Apps. We want a Databricks App to interact with AWS Secrets Manager. How can we configure this, and how should IAM be configured on the AWS side for this to take place?

Latest Reply
AnkurMittal008
New Contributor III
  • 1 kudos

Thanks @Alberto_Umana. Yes, we will try to use Databricks secrets; that can be helpful. A couple of other questions on Databricks Apps: 1) Can we use a framework other than those mentioned in the documentation (Streamlit, Flask, Dash, Gradio, Shiny)? 2) If required, can w...

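Following up on the Databricks-secrets suggestion: a Databricks App can have a secret attached as an app resource and surfaced to the app process as an environment variable via `app.yaml`. A hedged sketch — the resource key `my_secret` and the variable name are hypothetical, and the exact `valueFrom` contract should be checked against the Apps documentation:

```yaml
# app.yaml excerpt -- expose an attached secret resource to the app process
# (resource key and env var name are placeholders)
command: ["python", "app.py"]
env:
  - name: AWS_SECRET_VALUE
    valueFrom: my_secret   # key of the secret resource attached to the app
```

The app code then reads the value with an ordinary environment lookup, which keeps the secret itself out of the repository.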
loic
by Contributor
  • 509 Views
  • 4 replies
  • 0 kudos

Different NCC having same subnets

Hello, We are following this Microsoft tutorial to secure our storage access: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-firewall We see weird behavior when we create several NCCs in th...

Latest Reply
loic
Contributor
  • 0 kudos

OK, so no, I correctly set the subnets of my NCC in the Virtual Networks setting as documented: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-firewall This setting is working fine, without th...

carlos_tasayco
by New Contributor III
  • 257 Views
  • 1 reply
  • 2 kudos

How can you enable {{"data_security_mode": "USER_ISOLATION"}} on SQL warehouses?

Hi, I want to create a SQL warehouse with {{"data_security_mode": "USER_ISOLATION"}}; however, I don't find the section to get the JSON file of my cluster. Thanks

Latest Reply
Alberto_Umana
Databricks Employee
  • 2 kudos

Hello @carlos_tasayco, access modes are not configurable on SQL warehouses. Please see: https://docs.databricks.com/api/workspace/warehouses/create

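For contrast with the warehouse API, `data_security_mode` is a field on the Clusters API's create request, which is where the JSON the question asks about lives. A hedged sketch of such a payload — the cluster name, Spark version, and node type are placeholders, not recommendations:

```python
import json

# Hypothetical cluster spec: data_security_mode is accepted by
# clusters/create, not by warehouses/create.
cluster_spec = {
    "cluster_name": "isolated-cluster",
    "spark_version": "15.4.x-scala2.12",   # placeholder runtime
    "node_type_id": "Standard_DS3_v2",     # placeholder node type
    "num_workers": 2,
    "data_security_mode": "USER_ISOLATION",
}

payload = json.dumps(cluster_spec, indent=2)
print(payload)
```

This payload would be POSTed to the workspace's `clusters/create` endpoint; on a cluster's UI page the same JSON is visible under the cluster's JSON view.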
carlos_tasayco
by New Contributor III
  • 665 Views
  • 0 replies
  • 1 kudos

Get managedResourceGroup from serverless

Hello, In my job I have a task where I need to modify a notebook to dynamically get the environment. For example, this is how we get it: dic = {"D":"dev", "Q":"qa", "P":"prod"}; managedResourceGroup = spark.conf.get("spark.databricks.xxxxx"); xxxxx_Index = m...

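Since this one has no replies yet, the lookup pattern the post describes can be sketched in pure Python. The resource-group name and character index below are hypothetical; note that the Spark conf key used on classic compute may not be exposed on serverless at all, in which case passing the environment in as a job parameter or widget is a common fallback:

```python
# On classic compute the name would come from something like
# spark.conf.get("spark.databricks.clusterUsageTags.managedResourceGroup");
# serverless may not expose it, so a job parameter/widget is a fallback.
ENV_CODES = {"D": "dev", "Q": "qa", "P": "prod"}

def env_from_resource_group(name, index):
    """Map the single environment letter at `index` of the managed
    resource group name to an environment name."""
    return ENV_CODES[name[index].upper()]

# Hypothetical resource-group name with the env letter at index 14.
print(env_from_resource_group("databricks-rg-D-workspace", 14))  # → dev
```

Anchoring the lookup on a fixed index is fragile; parsing the letter out with a delimiter split or regex would survive naming changes better.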