- 1257 Views
- 3 replies
- 0 kudos
system schema permission
I have Databricks workspace admin permissions and want to run a few queries on the system.billing schema to get more information on Databricks billing. I am getting the error below: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have USE SCHEMA on Schema 'sy...
Hi @PoojaD, you should ask an admin to grant you access:
GRANT USE SCHEMA ON SCHEMA system.billing TO [Your User];
GRANT SELECT ON TABLE system.billing.usage TO [Your User];
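The two grants in the reply above, written out as standalone SQL. A sketch: these must be run by a principal allowed to grant on system tables (typically an account or metastore admin), and `user@example.com` is a placeholder to replace with the actual user or group:

```sql
-- Placeholder principal; replace with the actual user or group name.
GRANT USE SCHEMA ON SCHEMA system.billing TO `user@example.com`;
GRANT SELECT ON TABLE system.billing.usage TO `user@example.com`;
```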
- 711 Views
- 1 replies
- 0 kudos
Removing the trial version as it is running cost
Hi, I have a trial version on my AWS account which keeps running and has been eating up a dollar per day for the last couple of days. How do I disable it and use it only when required, or completely remove it?
Hello @psgcbe, you can follow these steps:
1. Terminate all compute resources: navigate to the AWS Management Console, go to the EC2 Dashboard, select Instances, and terminate any running instances related to your trial.
2. Cancel your subscription: Afte...
- 1126 Views
- 1 replies
- 1 kudos
Resolved! Timeout settings for PostgreSQL external catalog connection?
Is there any way to configure timeouts for external catalog connections? We are getting some timeouts with complex queries accessing a pgsql database through the catalog. We tried configuring the connection and we got this error │ Error: cannot upda...
Hello @ErikApption, there is no direct support for a connectTimeout option in the connection settings through Unity Catalog as of now. You might need to explore these alternative timeout configurations or consider adjusting your database handling to ...
- 1766 Views
- 3 replies
- 0 kudos
Cannot create a workspace on GCP
Hi, I have been using Databricks for a couple of months and have been spinning up workspaces with Terraform. The other day we decided to end our POC and move on to an MVP. This meant cleaning up all workspaces and GCP. After the cleanup was done I wanted to...
Did you try from the GCP Marketplace? You may get a more detailed error there.
- 1117 Views
- 2 replies
- 0 kudos
Can we create an external location from a different tenant in Azure
We are looking to add an external location which points to a storage account in another Azure tenant. Is this possible? Could you point to any documentation around this? Currently, when we try to add a new credential providing a DBX access connector a...
Thanks for the response @Alberto_Umana. Looks like the IDs are all provided correctly. Here is the config (Tenant A / Tenant B): Databricks is hosted here ...
- 518 Views
- 0 replies
- 0 kudos
UCX Account Admin authentication error in Azure Databricks
Hi Team, I am using Azure Databricks to implement UCX. The UCX installation completed properly, but I am facing issues when executing commands with the account admin role. I am an account admin in Azure Databricks (https://accounts.azuredatabricks.net/). ...
- 1886 Views
- 1 replies
- 0 kudos
creating Workspace in AWS with Quickstart is giving error
Hello, while creating a workspace in AWS using Quickstart, I get the error below. I used both the admin account and the root account to create it, but both gave the same issue. Any help is appreciated. The resource CopyZipsFunction is in a CREATE_FAILED stateT...
Hello @eondatatech, Ensure that both the admin and the root account you are using to create the workspace have the necessary IAM permissions to create and manage Lambda functions. Specifically, check if the CreateFunction and PassRole permissions are...
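As an illustration of the reply above, a minimal, hypothetical IAM policy fragment that grants the Lambda-related actions the Quickstart template needs. This is deliberately broad for the sketch; in practice you would scope `Resource` to the stack's actual ARNs:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "lambda:CreateFunction",
        "lambda:GetFunction",
        "lambda:InvokeFunction",
        "iam:PassRole"
      ],
      "Resource": "*"
    }
  ]
}
```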
- 1841 Views
- 3 replies
- 1 kudos
Databricks on GCP with GKE | Cluster stuck in starting status | GKE resource allocation failing
Hi Databricks Community, I'm currently facing several challenges with my Databricks clusters running on Google Kubernetes Engine (GKE). I hope someone here might have insights or suggestions to resolve the issues. Problem overview: I am experiencing fre...
I am having similar issues. This is the first time I am using the `databricks_cluster` resource; my `terraform apply` does not complete gracefully, and I see numerous errors about: 1. Can't scale up a node pool because of a failing scheduling predicate. The autoscale...
- 2195 Views
- 1 replies
- 0 kudos
Resolved! ALTER TABLE ... ALTER COLUMN .... SYNC IDENTITY not working anymore ?
Hello, I recently noticed that the ALTER TABLE ALTER COLUMN SYNC IDENTITY command is no longer functioning as expected. I have an IDENTITY column on my table: D_Category_SID BIGINT GENERATED BY DEFAULT AS IDENTITY (START WITH 1 INCREMENT BY 1). Previously...
Hello @MDV, Thanks for your question. According to the recent updates, the SYNC IDENTITY command is now more restrictive and follows stronger invariants. Specifically, it no longer allows the high watermark to be reduced to ensure that there is no ri...
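For reference, the command under discussion, as a sketch: the three-part table name is a placeholder to adjust to your schema, while the column name is taken from the post above.

```sql
-- Re-sync the identity high watermark with the column's actual values.
ALTER TABLE my_catalog.my_schema.my_table
  ALTER COLUMN D_Category_SID SYNC IDENTITY;
```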
- 1234 Views
- 3 replies
- 0 kudos
Is there a way to switch the default cluster associated with a workflow job
Hi, I have a workflow job that is connected to a default cluster (see below). I know I can swap the cluster. However, sometimes the cluster is not active, and when I start the workflow job I have to wait for the cluster to become active. It will take som...
I suppose you can call the Databricks API to run those workflows? Or is that a no-go?
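The API suggestion above can be sketched with the Jobs 2.1 REST endpoint (`POST /api/2.1/jobs/run-now`). The workspace URL, token, and job ID below are placeholders; the request is only built here, not sent:

```python
import json
import urllib.request

def build_run_now_request(host: str, token: str, job_id: int) -> urllib.request.Request:
    """Build a run-now request for the Databricks Jobs 2.1 API (not sent here)."""
    payload = json.dumps({"job_id": job_id}).encode("utf-8")
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/run-now",
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder workspace URL, token, and job ID.
req = build_run_now_request(
    "https://adb-1234567890123456.7.azuredatabricks.net", "dapiXXXX", 42
)
```

Sending the request with `urllib.request.urlopen(req)` returns a JSON body containing the new run's `run_id`, which can then be polled for status.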
- 846 Views
- 3 replies
- 0 kudos
Does Databricks support configuring more than 1 Microsoft Entra ID in 1 Databricks account for SSO?
Can I configure more than one Microsoft Entra ID for a Databricks account for SSO? For example, I have 2 Microsoft Entra IDs, AD1 and AD2, and I want to configure both in 1 Databricks account, so I can share data or workspaces with the users in th...
No, an account is specific to the Entra ID tenant and region, so you can only integrate SCIM with one tenant. You'd have to make the users in AD2 guests in AD1 and then manage all the users in AD1. We have a similar setup. Clunky, but it works.
- 1259 Views
- 1 replies
- 2 kudos
Resolved! Running job within job fails
Hello, I have a job with a task of type "Run Job". Everything is deployed using asset bundles and the deployment works fine; however, when running the job, the Job step fails with the error "PERMISSION DENIED : User unknown does not have Manage Run or Owne...
The permissions of a main job are not copied to nested jobs, so the executing user needs the proper permissions on both the main job and the nested job. This can be defined in the permissions section of the job (not the task). I for one am waiting for a ce...
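A sketch of how that could look in an asset bundle's `databricks.yml`: the resource keys and the principal are hypothetical, and the point is that both the parent and the nested job carry the grant.

```yaml
resources:
  jobs:
    parent_job:            # hypothetical resource key
      name: parent-job
      permissions:
        - user_name: runner@example.com   # placeholder principal
          level: CAN_MANAGE_RUN
    nested_job:            # hypothetical resource key
      name: nested-job
      permissions:
        - user_name: runner@example.com   # same principal must be granted here too
          level: CAN_MANAGE_RUN
```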
- 2603 Views
- 3 replies
- 5 kudos
How are you deploying graphs?
Hi all, I have a couple of use cases that may benefit from using graphs. I'm interested in whether anyone has graph databases in Production and, if so, whether you're using GraphFrames, Neo4j or something else? What is the architecture you have the...
Up to now the way to go is GraphX or GraphFrames. There is also the possibility to use Python libraries or others (single-node, that is), perhaps even Arrow-based. Another option is to load the data into a graph database and then move back to Databricks a...
- 1182 Views
- 3 replies
- 0 kudos
Databricks Asset bundles
Hi Team, in our company we are planning to migrate our workflows to Databricks Asset Bundles. Is it mandatory to install the Databricks CLI to get started with DAB? Anyone who has integrated GitHub with a CI/CD pipeline, please let me know the ...
I forgot the CI/CD part: that is not that hard. Basically, in DAB you define the type of environment you are using. If you use 'development', DAB assumes you are in actual development mode (feature branch), so there you can connect git and put the fil...
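The environment split described above can be sketched as a `targets` section in `databricks.yml` (the workspace hosts are placeholders):

```yaml
targets:
  dev:
    mode: development    # deployed from a feature branch; resources get a per-user prefix
    default: true
    workspace:
      host: https://adb-1111111111111111.1.azuredatabricks.net
  prod:
    mode: production     # deployed by the CI/CD pipeline
    workspace:
      host: https://adb-2222222222222222.2.azuredatabricks.net
```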
- 770 Views
- 1 replies
- 0 kudos
De facto Standard for Databricks on AWS
Hello, I am working on creating an architecture diagram for Databricks on AWS. I would like to adopt the de facto standard used by enterprises. Based on my research, I have identified the following components: Network: Customer-managed VPC, Secure Cluste...
I would not call it a 'standard' but a possible architecture. The great thing about the cloud is you can complete the puzzle in many ways and make it as complex or as easy as possible. Also, I would not consider Fivetran to be standard in companies. ...
Labels:

| Label | Count |
|---|---|
| Access control | 1 |
| Apache spark | 1 |
| Azure | 7 |
| Azure databricks | 5 |
| Billing | 2 |
| Cluster | 1 |
| Compliance | 1 |
| Data Ingestion & connectivity | 5 |
| Databricks Runtime | 1 |
| Databricks SQL | 2 |
| DBFS | 1 |
| Dbt | 1 |
| Delta Sharing | 1 |
| DLT Pipeline | 1 |
| GA | 1 |
| Gdpr | 1 |
| Github | 1 |
| Partner | 58 |
| Public Preview | 1 |
| Service Principals | 1 |
| Unity Catalog | 1 |
| Workspace | 2 |