Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

eondatatech
by New Contributor
  • 1932 Views
  • 1 reply
  • 0 kudos

creating Workspace in AWS with Quickstart is giving error

Hello, while creating a workspace in AWS using Quickstart, I get the error below. I used both the admin account and the root account to create it, but both gave the same issue. Any help is appreciated. The resource CopyZipsFunction is in a CREATE_FAILED stateT...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @eondatatech, Ensure that both the admin and the root account you are using to create the workspace have the necessary IAM permissions to create and manage Lambda functions. Specifically, check if the CreateFunction and PassRole permissions are...
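As a sketch of what that check might cover, here is a hypothetical minimal IAM policy fragment for those Lambda permissions (the exact action list and the `"Resource": "*"` scope are assumptions; scope them down for real use):

```python
import json

# Hypothetical minimal policy covering the Lambda-related permissions
# mentioned above; "Resource": "*" is a placeholder, not a recommendation.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "lambda:CreateFunction",
                "lambda:GetFunction",
                "lambda:InvokeFunction",
            ],
            "Resource": "*",
        },
        {
            "Effect": "Allow",
            "Action": "iam:PassRole",
            "Resource": "*",
        },
    ],
}

print(json.dumps(policy, indent=2))
```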

edouardtouze
by New Contributor II
  • 2047 Views
  • 3 replies
  • 1 kudos

Databricks on GCP with GKE | Cluster stuck in starting status | GKE resource allocation failing

Hi Databricks Community, I'm currently facing several challenges with my Databricks clusters running on Google Kubernetes Engine (GKE). I hope someone here might have insights or suggestions to resolve the issues. Problem Overview: I am experiencing fre...

Latest Reply
chalkboardbrad
New Contributor II
  • 1 kudos

I am having similar issues. This is the first time I am using the `databricks_cluster` resource; my terraform apply does not complete gracefully, and I see numerous errors about: 1. Can't scale up a node pool because of a failing scheduling predicate. The autoscale...

2 More Replies
MDV
by New Contributor III
  • 2580 Views
  • 1 reply
  • 0 kudos

Resolved! ALTER TABLE ... ALTER COLUMN ... SYNC IDENTITY not working anymore?

Hello, I recently noticed that the ALTER TABLE ALTER COLUMN SYNC IDENTITY command is no longer functioning as expected. I have an IDENTITY column on my table: D_Category_SID BIGINT GENERATED BY DEFAULT AS IDENTITY (START WITH 1 INCREMENT BY 1). Previously...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @MDV, Thanks for your question. According to the recent updates, the SYNC IDENTITY command is now more restrictive and follows stronger invariants. Specifically, it no longer allows the high watermark to be reduced to ensure that there is no ri...
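For reference, the command in question can be issued from a notebook along these lines (the table and column names are placeholders; per the reply above, the high watermark will now only ever move forward):

```python
def sync_identity_sql(table: str, column: str) -> str:
    """Build the ALTER TABLE ... SYNC IDENTITY statement from this thread.

    The statement realigns the identity column's high watermark with the
    data in the table; it can no longer lower the watermark.
    """
    return f"ALTER TABLE {table} ALTER COLUMN {column} SYNC IDENTITY"


# In a Databricks notebook (Spark session assumed):
# spark.sql(sync_identity_sql("my_schema.d_category", "D_Category_SID"))
```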

ianchenmu
by New Contributor III
  • 1325 Views
  • 3 replies
  • 0 kudos

Is there a way to switch the default cluster associated with a workflow job

Hi, I have a workflow job that is connected to a default cluster (see below). I know I can swap the cluster. However, sometimes the cluster is not active, so when I start the workflow job I have to wait for the cluster to become activated. It will take som...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I suppose you can call the Databricks API to run those workflows? Or is that a no-go?
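A minimal sketch of that suggestion, assuming a personal access token and the Jobs API 2.1 `run-now` endpoint (workspace URL, token, and job ID are placeholders):

```python
import json
import urllib.request


def run_job_now(host: str, token: str, job_id: int) -> urllib.request.Request:
    """Build a Jobs API 2.1 run-now request; triggering it starts the job's
    cluster on demand, so you are not waiting in the UI."""
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/run-now",
        data=json.dumps({"job_id": job_id}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# req = run_job_now("https://<workspace-url>", "<pat-token>", 12345)
# urllib.request.urlopen(req)  # fires the run
```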

2 More Replies
Paladin
by New Contributor
  • 914 Views
  • 3 replies
  • 0 kudos

Does Databricks support configuring more than 1 Microsoft Entra ID in 1 Databricks account for SSO?

Can I configure more than 1 Microsoft Entra ID for a Databricks account for SSO? For example, I have 2 Microsoft Entra IDs: AD1 and AD2, and I want to configure them into 1 Databricks account, so I can share data or workspaces with the users in th...

Latest Reply
Rjdudley
Honored Contributor
  • 0 kudos

No, an account is specific to the EntraID tenant and region, so you can only integrate SCIM with one tenant.  You'd have to make the users in AD2 guests in AD1 and then manage all the users in AD1.  We have a similar setup.  Clunky but works.

2 More Replies
erigaud
by Honored Contributor
  • 1340 Views
  • 1 reply
  • 2 kudos

Resolved! Running job within job fails

Hello, I have a job with a task of type "Run Job". Everything is deployed using asset bundles and the deployment works fine; however, when running the job, the Job step fails with the error "PERMISSION DENIED : User unknown does not have Manage Run or Owne...

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

The permissions of a main job are not copied to nested jobs, so the executing user needs the proper permissions for both the main job and the nested job. This can be defined in the permissions section of the job (not the task). I for one am waiting for a ce...

JamesDryden
by New Contributor II
  • 2846 Views
  • 3 replies
  • 5 kudos

How are you deploying graphs?

Hi all, I have a couple of use cases that may benefit from using graphs.  I'm interested in whether anyone has graph databases in Production and, if so, whether you're using GraphFrames, Neo4j or something else?  What is the architecture you have the...

Latest Reply
-werners-
Esteemed Contributor III
  • 5 kudos

Up to now the way to go is GraphX or GraphFrames. There is also the possibility to use Python libraries or others (single-node, that is), perhaps even Arrow-based. Another option is to load the data into a graph database and then move back to Databricks a...

2 More Replies
sri840
by New Contributor
  • 1280 Views
  • 3 replies
  • 0 kudos

Databricks Asset bundles

Hi Team, in our company we are planning to migrate our workflows to Databricks Asset Bundles. Is it mandatory to install the Databricks CLI tool to get started with DAB? Anyone who has integrated GitHub with a CI/CD pipeline, please let me know the ...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I forgot the CI/CD part: that is not that hard. Basically, in DAB you define the type of environment you are using. If you use 'development', DAB assumes you are in actual development mode (feature branch). So there you can connect git and put the fil...
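As a sketch, the environment types mentioned above live in the bundle's `databricks.yml` targets (the bundle name and workspace hosts below are placeholders):

```yaml
bundle:
  name: my_workflows   # hypothetical bundle name

targets:
  dev:
    mode: development   # feature-branch work; resources get a per-user prefix
    default: true
    workspace:
      host: https://<dev-workspace-url>
  prod:
    mode: production
    workspace:
      host: https://<prod-workspace-url>
```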

2 More Replies
Kousuke_0716
by New Contributor
  • 823 Views
  • 1 reply
  • 0 kudos

De facto Standard for Databricks on AWS

Hello, I am working on creating an architecture diagram for Databricks on AWS. I would like to adopt the de facto standard used by enterprises. Based on my research, I have identified the following components: Network: Customer-managed VPC, Secure Cluste...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I would not call it a 'standard' but a possible architecture. The great thing about the cloud is that you can complete the puzzle in many ways and make it as complex or as simple as you like. Also, I would not consider Fivetran to be standard in companies. ...

bigger_dave
by New Contributor II
  • 1700 Views
  • 6 replies
  • 0 kudos

permissions tab is missing from policy UI

Hi Team. When I try to create a new policy, the permissions tab is missing. I am an account admin. Any ideas why? Many thanks, Dave.

Latest Reply
MadhuB
Valued Contributor
  • 0 kudos

@bigger_dave If you are trying to create a compute policy, the permissions tab should be available during configuration. If you want to grant access on an existing policy, the permissions tab is available once you choose to edit the policy. If you are looking f...
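If the UI route stays blocked, permissions on an existing policy can also be set through the REST Permissions API. A minimal sketch, assuming the cluster-policies permissions endpoint (workspace URL, token, policy ID, and group name are placeholders):

```python
import json
import urllib.request


def grant_policy_use(host: str, token: str,
                     policy_id: str, group: str) -> urllib.request.Request:
    """Build a Permissions API request granting CAN_USE on a cluster policy."""
    body = {
        "access_control_list": [
            {"group_name": group, "permission_level": "CAN_USE"}
        ]
    }
    return urllib.request.Request(
        url=f"{host}/api/2.0/permissions/cluster-policies/{policy_id}",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )


# urllib.request.urlopen(
#     grant_policy_use("https://<workspace-url>", "<token>",
#                      "<policy-id>", "data-engineers"))
```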

5 More Replies
NelsonE
by New Contributor III
  • 1922 Views
  • 6 replies
  • 0 kudos

Resolved! Updating Workspace Cluster

Hello, my organization is experiencing difficulties updating our Google Kubernetes Engine (GKE) cluster. We've reviewed the official GKE documentation for automated cluster updates, but it appears to primarily focus on AWS integrations. We haven't foun...

Latest Reply
JaxonMiller
New Contributor II
  • 0 kudos

You could try Terraform or gcloud scripts for automation.

5 More Replies
filipniziol
by Esteemed Contributor
  • 7610 Views
  • 5 replies
  • 2 kudos

Resolved! How to enable Genie?

Hi All, based on the article below, to enable Genie one needs to: 1. Enable Azure AI services-powered features. That is done. 2. Genie must be enabled from the Previews page. I do not see Genie among the Previews. I am using Azure Databricks. Any idea how ...

Latest Reply
usman61
New Contributor II
  • 2 kudos

I can access previews at the account level but can't see Genie in Previews.

4 More Replies
nskiran1
by New Contributor II
  • 4325 Views
  • 2 replies
  • 3 kudos

Databricks shared workspace

We have a self-service portal through which users can launch Databricks clusters of different configurations. This portal is set up to work in Dev, Sandbox, and Prod environments. We have configured Databricks workspaces only for Sandbox and Prod por...

Latest Reply
nskiran
New Contributor III
  • 3 kudos

@Alberto_Umana Thanks for sharing the doc links. We have the exact same setup to support a shared Databricks workspace, but I'm still facing an issue while adding an instance profile. I am trying to add an AWS Instance Profile created in the source AWS Account (No databricks w...

1 More Replies
invalidargument
by New Contributor III
  • 734 Views
  • 1 reply
  • 2 kudos

displayHTML <a href="#id"> not working

Many packages output an HTML report, e.g. ydata-profiler. The report contains links to other parts of the report, but when the user clicks the links a new window is opened instead of scrolling to the correct section of the displayed HTML. Could this be...

Latest Reply
Alberto_Umana
Databricks Employee
  • 2 kudos

Hello @invalidargument, Currently, there is no direct support from the Databricks end to modify this behavior without using such a workaround. The displayHTML function in Databricks renders HTML content within an iframe, and the injected JavaScript h...
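One workaround (an assumption, not an official fix) is to post-process the report HTML before passing it to displayHTML, rewriting same-page anchors so a click scrolls within the iframe instead of opening a new window:

```python
import re


def fix_internal_anchors(html: str) -> str:
    """Rewrite <a href="#id"> links to scroll inside the iframe via onclick;
    assumes the target elements' ids match the fragment names."""
    repl = ('<a href="#\\1" '
            'onclick="document.getElementById(\'\\1\').scrollIntoView();'
            'return false;"')
    return re.sub(r'<a href="#([^"]+)"', repl, html)


# In a notebook: displayHTML(fix_internal_anchors(report_html))
```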
