- 915 Views
- 2 replies
- 0 kudos
Removing storage account location from metastore fails
I am trying to remove the storage account location for our UC metastore, but I am getting an error. I have tried assigning my user and service principal permission to create an external location.
I accomplished it by creating a storage credential and external location manually. Then I was able to remove the metastore path. What then happened was that the path for the __databricks_internal catalog was set to the storage account that ...
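For reference, a minimal sketch of that manual step, run from a Databricks notebook. The location name and URL are hypothetical, and a storage credential (here `my_storage_credential`) is assumed to exist already:

```python
# Hypothetical names/URL: registers the metastore root path as an external
# location so Unity Catalog will allow it to be managed and removed.
spark.sql("""
    CREATE EXTERNAL LOCATION IF NOT EXISTS metastore_root_loc
    URL 'abfss://metastore@mystorageaccount.dfs.core.windows.net/'
    WITH (STORAGE CREDENTIAL my_storage_credential)
""")
```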
- 2342 Views
- 2 replies
- 0 kudos
Resolved! How does a non-admin user read a public s3 bucket on serverless?
As an admin, I can easily read a public S3 bucket from serverless: spark.read.parquet("s3://[public bucket]/[path]").display(). So can a non-admin user from classic compute. But why does a non-admin user on serverless (both environments 1 & 2) get t...
Hi @spd_dat, is the S3 bucket in the same region as your workspace? It might require an IAM role / S3 bucket policy to allow access even if the bucket is public. Just as a test, can you try giving the user the permission below: GRANT SELECT ...
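The grant itself is truncated above, so as a hedged illustration only: one Unity Catalog privilege that governs path-based reads on serverless is `READ FILES` on an external location. The location and user names here are hypothetical:

```python
# Hypothetical names; lets the user read paths covered by the external
# location, which serverless compute enforces through Unity Catalog.
spark.sql(
    "GRANT READ FILES ON EXTERNAL LOCATION `public_bucket_loc` "
    "TO `user@example.com`"
)
```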
- 720 Views
- 1 replies
- 0 kudos
AWS custom role for Databricks clusters - no instance profile ARN
I tried to follow the instructions to create a custom IAM role for EC2 instances in Databricks clusters, but I can't find the instance profile ARN on the role. If I create a regular IAM role on EC2, I can find both the role ARN and the instance profile ARN. https://d...
@Wayne I need to understand more about what you’re trying to achieve, but if you’re looking to grant permissions to the EC2 instances running behind a Databricks cluster using an instance profile, the following documentation provides a detailed explan...
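One likely cause, for what it's worth: the AWS console only auto-creates an instance profile when a role is created for the EC2 use case, so a custom role may genuinely have none. A sketch of creating one and attaching the role with boto3, with hypothetical names:

```python
import boto3

iam = boto3.client("iam")

# A role created outside the EC2 console flow has no instance profile;
# create one and attach the existing role to it (names are hypothetical).
iam.create_instance_profile(InstanceProfileName="databricks-custom-profile")
iam.add_role_to_instance_profile(
    InstanceProfileName="databricks-custom-profile",
    RoleName="databricks-custom-role",
)

# This ARN is what Databricks asks for when registering the instance profile.
arn = iam.get_instance_profile(
    InstanceProfileName="databricks-custom-profile"
)["InstanceProfile"]["Arn"]
print(arn)
```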
- 11730 Views
- 2 replies
- 0 kudos
You can get the job details from the Jobs Get API, which takes the job ID as a parameter. This will give you all the information available about the job, including the job name. Please note that there is no field called "job description" in the...
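A minimal sketch of that call with the plain REST API; the host, token, and job ID are placeholders:

```python
import requests

host = "https://<workspace-host>"      # placeholder workspace URL
token = "<personal-access-token>"      # placeholder credential

resp = requests.get(
    f"{host}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {token}"},
    params={"job_id": 123},            # placeholder job ID
)
resp.raise_for_status()
# The job name is returned under settings.name.
print(resp.json()["settings"]["name"])
```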
- 1222 Views
- 3 replies
- 0 kudos
system schema permission
I have Databricks workspace admin permissions and want to run a few queries on the system.billing schema to get more info on Databricks billing. I am getting the error below: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have USE SCHEMA on Schema 'sy...
Hi @PoojaD, you should ask an admin to grant you access: GRANT USE SCHEMA ON SCHEMA system.billing TO [Your User]; GRANT SELECT ON TABLE system.billing.usage TO [Your User];
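Once the grants are in place, a quick sanity check against the usage table (the columns below are a subset of system.billing.usage):

```python
# Assumes the grants above have been applied to the current user.
spark.sql("""
    SELECT usage_date, sku_name, usage_quantity
    FROM system.billing.usage
    ORDER BY usage_date DESC
    LIMIT 10
""").display()
```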
- 675 Views
- 1 replies
- 0 kudos
Removing the trial version as it is incurring cost
Hi, I have a trial version on my AWS account which keeps running and has been eating up a dollar per day for the last couple of days. How do I disable it and use it only when required, or completely remove it?
Hello @psgcbe, you can follow the steps below:
- Terminate all compute resources: navigate to the AWS Management Console, go to the EC2 Dashboard, select Instances, and terminate any running instances related to your trial.
- Cancel your subscription: Afte...
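If clicking through the console is tedious, a hedged boto3 sketch of the same cleanup; the region and the Vendor=Databricks tag filter are assumptions to verify against your own instances first:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed trial region

# Databricks-launched instances typically carry a Vendor=Databricks tag
# (worth confirming in your account before terminating anything).
resp = ec2.describe_instances(
    Filters=[{"Name": "tag:Vendor", "Values": ["Databricks"]}]
)
ids = [
    inst["InstanceId"]
    for res in resp["Reservations"]
    for inst in res["Instances"]
]
if ids:
    ec2.terminate_instances(InstanceIds=ids)
```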
- 1088 Views
- 1 replies
- 1 kudos
Resolved! Timeout settings for Postgresql external catalog connection?
Is there any way to configure timeouts for external catalog connections? We are getting some timeouts with complex queries accessing a pgsql database through the catalog. We tried configuring the connection and we got this error: Error: cannot upda...
Hello @ErikApption, there is no direct support for a connectTimeout option in the connection settings through Unity Catalog as of now. You might need to explore alternative timeout configurations or consider adjusting your database handling to ...
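One alternative in that spirit, since the connection itself exposes no connectTimeout: cap long-running statements on the Postgres side for the role Databricks connects as. A sketch with hypothetical connection details, run from anywhere that can reach the database:

```python
import psycopg2  # hypothetical admin connection details below

conn = psycopg2.connect(
    host="pg.example.com", dbname="mydb", user="admin", password="..."
)
with conn, conn.cursor() as cur:
    # Statements issued by this role now fail after 5 minutes instead of
    # hanging; adjust the role name and limit to your setup.
    cur.execute("ALTER ROLE databricks_reader SET statement_timeout = '300s'")
```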
- 1739 Views
- 3 replies
- 0 kudos
Cannot create a workspace on GCP
Hi, I have been using Databricks for a couple of months and have been spinning up workspaces with Terraform. The other day we decided to end our POC and move on to an MVP. This meant cleaning up all workspaces and GCP. After the cleanup was done I wanted to...
Did you try from the Marketplace? You may get a more detailed error message there.
- 1064 Views
- 2 replies
- 0 kudos
Can we create an external location from a different tenant in Azure?
We are looking to add an external location which points to a storage account in another Azure tenant. Is this possible? Could you point to any documentation around this? Currently, when we try to add a new credential providing a DBX access connector a...
Thanks for the response @Alberto_Umana. Looks like the IDs are all provided correctly. Here is the config (Tenant A / Tenant B): Databricks is hosted here ...
- 508 Views
- 0 replies
- 0 kudos
UCX Account Admin authentication error in Azure Databricks
Hi Team, I am using Azure Databricks to implement UCX. The UCX installation completed properly, but I am facing issues when executing commands with the account admin role. I am an account admin in Azure Databricks (https://accounts.azuredatabricks.net/). ...
- 1862 Views
- 1 replies
- 0 kudos
Creating a workspace in AWS with Quickstart is giving an error
Hello, while creating a workspace in AWS using Quickstart, I get the error below. I used both the admin account and the root account to create this, but both gave the same issue. Any help is appreciated. The resource CopyZipsFunction is in a CREATE_FAILED state. T...
Hello @eondatatech, ensure that both the admin and the root account you are using to create the workspace have the necessary IAM permissions to create and manage Lambda functions. Specifically, check whether the CreateFunction and PassRole permissions are...
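For illustration, a hedged sketch of the kind of policy statement to compare against; the exact action list is an assumption, and the CloudFormation stack events will name the precise action that was denied:

```python
import json

# Assumed minimal extra permissions for the Quickstart's CopyZipsFunction.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": [
            "lambda:CreateFunction",
            "lambda:InvokeFunction",
            "lambda:DeleteFunction",
            "iam:PassRole",
        ],
        "Resource": "*",
    }],
}
print(json.dumps(policy, indent=2))
```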
- 1772 Views
- 3 replies
- 1 kudos
Databricks on GCP with GKE | Cluster stuck in starting status | GKE resource allocation failing
Hi Databricks Community, I’m currently facing several challenges with my Databricks clusters running on Google Kubernetes Engine (GKE). I hope someone here might have insights or suggestions to resolve the issues. Problem overview: I am experiencing fre...
I am having similar issues. It's the first time I am using the `databricks_cluster` resource; my terraform apply does not complete gracefully, and I see numerous errors like: 1. Can’t scale up a node pool because of a failing scheduling predicate. The autoscale...
- 2131 Views
- 1 replies
- 0 kudos
Resolved! ALTER TABLE ... ALTER COLUMN ... SYNC IDENTITY not working anymore?
Hello, I recently noticed that the ALTER TABLE ... ALTER COLUMN ... SYNC IDENTITY command is no longer functioning as expected. I have an IDENTITY column on my table: D_Category_SID BIGINT GENERATED BY DEFAULT AS IDENTITY (START WITH 1 INCREMENT BY 1). Previously...
Hello @MDV, Thanks for your question. According to the recent updates, the SYNC IDENTITY command is now more restrictive and follows stronger invariants. Specifically, it no longer allows the high watermark to be reduced to ensure that there is no ri...
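For context, the command in question, using the column from the post (catalog, schema, and table names are hypothetical); under the new behavior it will only ever raise the identity high watermark, never lower it:

```python
# Re-syncs the identity high watermark with the column's data; per the
# reply above it can no longer reduce the watermark.
spark.sql("""
    ALTER TABLE main.sales.d_category
    ALTER COLUMN D_Category_SID SYNC IDENTITY
""")
```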
- 1213 Views
- 3 replies
- 0 kudos
Is there a way to switch the default cluster associated with a workflow job?
Hi, I have a workflow job that is connected to a default cluster (see below). I know I can swap the cluster. However, sometimes the cluster is not active, and when I start the workflow job I have to wait for the cluster to become active. It will take som...
I suppose you can call the Databricks API to run those workflows? Or is that a no-go?
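A minimal sketch of triggering the job that way with the Jobs Run Now endpoint; host, token, and job ID are placeholders:

```python
import requests

host = "https://<workspace-host>"      # placeholder workspace URL
token = "<personal-access-token>"      # placeholder credential

resp = requests.post(
    f"{host}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": 123},              # placeholder job ID
)
resp.raise_for_status()
print(resp.json()["run_id"])           # the run started by this call
```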
- 825 Views
- 3 replies
- 0 kudos
Does Databricks support configuring more than 1 Microsoft Entra ID in 1 Databricks account for SSO?
Can I configure more than one Microsoft Entra ID for a Databricks account for SSO? For example, I have two Microsoft Entra IDs, AD1 and AD2, and I want to configure them into one Databricks account so I can share data or workspaces with the users in th...
No, an account is specific to the Entra ID tenant and region, so you can only integrate SCIM with one tenant. You'd have to make the users in AD2 guests in AD1 and then manage all the users in AD1. We have a similar setup. Clunky, but it works.