- 652 Views
- 3 replies
- 0 kudos
Is there a way to switch the default cluster associated with a workflow job?
Hi, I have a workflow job that is connected to a default cluster (see below). I know I can swap the cluster. However, sometimes the cluster is not active, and when I start the workflow job, I have to wait for the cluster to become activated. It will take som...
- 0 kudos
I suppose you can call the Databricks API to run those workflows? Or is that a no-go?
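If it helps, a minimal sketch of what that API call could look like, using the Jobs API 2.1 `run-now` endpoint. The workspace URL, token, and job ID below are placeholders, not values from the thread:

```python
import json
import urllib.request

# Placeholders -- substitute your own workspace URL, job ID, and token.
WORKSPACE_URL = "https://example.cloud.databricks.com"
JOB_ID = 123

def build_run_now_request(workspace_url: str, job_id: int, token: str) -> urllib.request.Request:
    """Build a POST request for the Jobs API 2.1 run-now endpoint."""
    payload = json.dumps({"job_id": job_id}).encode("utf-8")
    return urllib.request.Request(
        url=f"{workspace_url}/api/2.1/jobs/run-now",
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_run_now_request(WORKSPACE_URL, JOB_ID, "dapi-example-token")
# Sending the request would actually start the run:
# with urllib.request.urlopen(req) as resp:
#     run_id = json.load(resp)["run_id"]
```

Triggering runs this way avoids sitting in the UI, but note the job's cluster still has to spin up before the run starts, so it doesn't remove the wait itself.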
- 758 Views
- 0 replies
- 0 kudos
Unity Catalog Not Enabled on Job Cluster When Creating DLT in GCP Databricks
I am trying to create a Delta Live Table (DLT) in my GCP Databricks workspace, but I am encountering an issue where Unity Catalog is not enabled on the job cluster. Steps I followed: created a DLT pipeline using the Databricks UI; selected the appropria...
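No replies yet in the thread; one thing worth checking (an assumption, not a confirmed fix) is that a Unity Catalog DLT pipeline needs a `catalog` in its settings rather than a `storage` location, otherwise the pipeline cluster is created without UC. A bundle-style sketch with placeholder names:

```yaml
resources:
  pipelines:
    my_pipeline:              # placeholder name
      name: my_pipeline
      catalog: main           # setting a catalog (instead of 'storage')
      target: my_schema       # makes the pipeline run with Unity Catalog
      libraries:
        - notebook:
            path: ./dlt_notebook.py
```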
- 301 Views
- 3 replies
- 0 kudos
Does Databricks support configuring more than 1 Microsoft Entra ID in 1 Databricks account for SSO?
Can I configure more than one Microsoft Entra ID for a Databricks account for SSO? For example, I have two Microsoft Entra IDs, AD1 and AD2, and I want to configure both in one Databricks account, so I can share data or workspaces with the users in th...
- 0 kudos
No, an account is specific to the Entra ID tenant and region, so you can only integrate SCIM with one tenant. You'd have to make the users in AD2 guests in AD1 and then manage all the users in AD1. We have a similar setup. Clunky, but it works.
- 732 Views
- 0 replies
- 0 kudos
How to upload a file to a Unity Catalog volume using Databricks Asset Bundles
Hi, I am working on a CI/CD blueprint for developers, with which developers can create their bundle for jobs/workflows and then create a volume to which they will upload a wheel or JAR file that will be used as a dependency in their noteboo...
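No replies yet; as a sketch of one possible approach (assuming a recent Databricks CLI and an existing catalog `main`, schema `default`, and volume `libs`), the bundle's `artifact_path` can point at a UC volume so built wheels are uploaded there on deploy:

```yaml
bundle:
  name: my_bundle             # placeholder

workspace:
  # Upload built artifacts to a Unity Catalog volume instead of the
  # default workspace path (requires a recent Databricks CLI version).
  artifact_path: /Volumes/main/default/libs

artifacts:
  my_wheel:
    type: whl
    path: ./my_package        # folder containing the wheel's build config
```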
- 925 Views
- 1 replies
- 0 kudos
Automating Version Control for Databricks Workflows
I am currently using Databricks Asset Bundles to manage and deploy workflows. While I have successfully automated the version control for notebooks, I am facing challenges with workflows. Specifically, I am looking to automate the process of fetching...
- 0 kudos
Using the UI, that is not possible, I think. When using DAB and YAML files it can be done. So I suggest you create the workflow using the UI (because it is easy to use) and then create a DAB out of it (using bundle generate). I admit, there is still some...
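The `bundle generate` step mentioned above could look like this (the job ID is a placeholder; you can find it in the job's URL):

```shell
# Pull a UI-built job into the bundle as YAML
databricks bundle generate job --existing-job-id 123

# Deploy the now bundle-managed job to a target
databricks bundle deploy -t dev
```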
- 535 Views
- 1 replies
- 2 kudos
Resolved! Running job within job fails
Hello, I have a job with a task of type "Run Job". Everything is deployed using asset bundles and the deployment works fine; however, when running the job, the Job step fails with the error "PERMISSION DENIED : User unknown does not have Manage Run or Owne...
- 2 kudos
The permissions of a main job are not copied to nested jobs, so the executing user needs the proper permissions on both the main job and the nested job. This can be defined in the permissions section of the job (not the task). I for one am waiting for a ce...
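A sketch of what that permissions section could look like in a bundle definition (user and job names are placeholders); note it sits on the job, not on the task, and is repeated on the nested job:

```yaml
resources:
  jobs:
    main_job:
      name: main_job
      permissions:
        - level: CAN_MANAGE_RUN
          user_name: runner@example.com   # placeholder user
      tasks:
        - task_key: run_nested
          run_job_task:
            job_id: ${resources.jobs.nested_job.id}
    nested_job:
      name: nested_job
      permissions:
        - level: CAN_MANAGE_RUN           # same grant on the nested job
          user_name: runner@example.com
```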
- 970 Views
- 3 replies
- 5 kudos
How are you deploying graphs?
Hi all, I have a couple of use cases that may benefit from using graphs. I'm interested in whether anyone has graph databases in Production and, if so, whether you're using GraphFrames, Neo4j or something else? What is the architecture you have the...
- 5 kudos
Up to now the way to go is GraphX or GraphFrames. There is also the possibility to use Python libraries or others (single node, that is), perhaps even Arrow-based. Another option is to load the data into a graph database and then move back to Databricks a...
- 526 Views
- 3 replies
- 0 kudos
Databricks Asset Bundles
Hi Team, in our company we are planning to migrate our workflows to Databricks Asset Bundles. Is it mandatory to install the Databricks CLI tool to get started with DAB? Anyone who has integrated GitHub with a CI/CD pipeline, please let me know the ...
- 0 kudos
I forgot the CI/CD part: that is not that hard. Basically, in DAB you define the type of environment you are using. If you use 'development', DAB assumes you are in actual development mode (feature branch), so there you can connect Git and put the fil...
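For the GitHub side, a minimal sketch of a deploy workflow (branch, target name, and secret names are assumptions, not a prescribed setup); the `databricks/setup-cli` action installs the CLI on the runner:

```yaml
# .github/workflows/deploy-bundle.yml (hypothetical)
name: deploy-bundle
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: databricks/setup-cli@main
      - run: databricks bundle deploy -t prod
        env:
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
```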
- 285 Views
- 1 replies
- 0 kudos
De facto Standard for Databricks on AWS
Hello, I am working on creating an architecture diagram for Databricks on AWS. I would like to adopt the de facto standard used by enterprises. Based on my research, I have identified the following components: Network: Customer-managed VPC, Secure Cluste...
- 0 kudos
I would not call it a 'standard' but a possible architecture. The great thing about the cloud is you can complete the puzzle in many ways and make it as complex or as simple as you like. Also, I would not consider Fivetran to be standard in companies. 
- 2749 Views
- 10 replies
- 3 kudos
Resolved! Deploy Workflow only to specific target (Databricks Asset Bundles)
I am using Databricks Asset Bundles to deploy Databricks workflows to all of my target environments (dev, staging, prod). However, I have one specific workflow that is supposed to be deployed only to the dev target environment. How can I implement tha...
- 3 kudos
Hi, I'm also looking to deploy different jobs in different targets. These jobs are defined in a separate .yml file, and we'll need to reference them in the targets accordingly. Any updates on this implementation?
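A sketch of one way to scope a job to a single target: declare shared jobs at the top level and the dev-only job inside the `dev` target (names are placeholders). As far as I know, `include` paths apply bundle-wide, so per-target selection has to happen in the target's own `resources` block:

```yaml
resources:
  jobs:
    shared_job:               # deployed to every target
      name: shared_job

targets:
  dev:
    resources:
      jobs:
        dev_only_job:         # only exists in the dev deployment
          name: dev_only_job
  staging: {}
  prod: {}
```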
- 406 Views
- 6 replies
- 0 kudos
Permissions tab is missing from policy UI
Hi Team. When I try to create a new policy, the Permissions tab is missing. I am an account admin. Any ideas why? Many thanks, Dave.
- 0 kudos
@bigger_dave If you are trying to create a compute policy, the Permissions tab should be available during configuration. If you want to grant access to an existing policy, the Permissions tab is available once you choose to edit the policy. If you are looking f...
- 776 Views
- 6 replies
- 0 kudos
Resolved! Updating Workspace Cluster
Hello, my organization is experiencing difficulties updating our Google Kubernetes Engine (GKE) cluster. We've reviewed the official GKE documentation for automated cluster updates, but it appears to primarily focus on AWS integrations. We haven't foun...
- 0 kudos
You could try Terraform or gcloud scripts for automation.
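If the GKE cluster is yours to manage (for Databricks-managed infrastructure, check with Databricks support first), the gcloud side could look like this; cluster name and region are placeholders:

```shell
# See where the cluster currently stands
gcloud container clusters describe my-cluster --region us-central1 \
  --format="value(currentMasterVersion,currentNodeVersion)"

# Upgrade the control plane first, then the node pools
gcloud container clusters upgrade my-cluster --region us-central1 --master
gcloud container clusters upgrade my-cluster --region us-central1
```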
- 656 Views
- 1 replies
- 0 kudos
OAuth URL and ClientId Validation
Hi, I am trying to set up an OAuth connection with Databricks, so I ask the user to enter their Workspace URL and ClientId. Once the user enters these values, I want to validate whether they are correct, so I ask them to log in by redirecting them ...
- 0 kudos
For reference, the RFC: https://datatracker.ietf.org/doc/html/rfc6749#section-4.1.2.1
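Building the authorization request yourself makes the validation concrete; here is a sketch per RFC 6749 section 4.1.1, assuming a `/oidc/v1/authorize` endpoint and placeholder values. A wrong workspace URL simply fails to resolve, and a wrong client ID comes back as an error response on the redirect (section 4.1.2.1, as linked above):

```python
import secrets
import urllib.parse

def build_authorize_url(workspace_url: str, client_id: str, redirect_uri: str) -> tuple[str, str]:
    """Build an OAuth 2.0 authorization-code request URL (RFC 6749, sec. 4.1.1).

    Returns the URL and the random `state` value, which must be compared
    against the `state` returned on the redirect (RFC 6749, sec. 4.1.2).
    """
    state = secrets.token_urlsafe(16)
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,
    }
    return (
        f"{workspace_url.rstrip('/')}/oidc/v1/authorize?" + urllib.parse.urlencode(params),
        state,
    )

url, state = build_authorize_url(
    "https://example.cloud.databricks.com",  # placeholder workspace URL
    "my-client-id",                          # placeholder client ID
    "http://localhost:8020/callback",
)
```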
- 3625 Views
- 5 replies
- 2 kudos
Resolved! How to enable Genie?
Hi All, based on the article below, to enable Genie one needs to: 1. Enable Azure AI services-powered features (that is done). 2. Enable Genie from the Previews page (I do not see Genie among the Previews). I am using Azure Databricks. Any idea how ...
- 2 kudos
I can access Previews at the account level but can't see Genie among them.
- 331 Views
- 2 replies
- 3 kudos
Databricks shared workspace
We have a self-service portal through which users can launch Databricks clusters of different configurations. This portal is set up to work in Dev, Sandbox, and Prod environments. We have configured Databricks workspaces only for the Sandbox and Prod por...
- 3 kudos
@Alberto_Umana Thanks for sharing the doc links. We have the exact same setup to support a shared Databricks workspace, but I'm still facing an issue while adding an instance profile. I am trying to add an AWS instance profile created in the source AWS account (no Databricks w...