- 841 Views
- 1 replies
- 0 kudos
Communication between multiple region workspaces & metastore
Hi Everyone, We currently have a workspace and metastore in the EastUS region, and we’re planning to set up another workspace and metastore in the Canada region. Additionally, we need to be able to access data from the Canada region within the US regio...
Hello @Dnirmania! Delta Sharing is ideal for read-only, cross-platform data access without duplication. Direct metastore connections offer low-latency access between Databricks workspaces under a unified governance model. Additionally, you may explor...
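As a rough sketch of the Delta Sharing option mentioned above (Databricks-to-Databricks sharing), assuming a table `canada_cat.sales.orders` in the Canada metastore and placeholder recipient/share names — these names and the sharing identifier are hypothetical, not from the original thread:

```python
# Run in the Canada-region workspace (provider side).
spark.sql("CREATE SHARE IF NOT EXISTS canada_share")
spark.sql("ALTER SHARE canada_share ADD TABLE canada_cat.sales.orders")

# Databricks-to-Databricks sharing: the recipient is identified by the
# sharing identifier of the EastUS metastore (cloud:region:metastore-uuid).
spark.sql("""
  CREATE RECIPIENT IF NOT EXISTS eastus_recipient
  USING ID 'azure:eastus:<metastore-uuid>'
""")
spark.sql("GRANT SELECT ON SHARE canada_share TO RECIPIENT eastus_recipient")

# In the EastUS workspace (consumer side), mount the share as a catalog:
# spark.sql("CREATE CATALOG canada_shared USING SHARE `<provider>`.canada_share")
# spark.table("canada_shared.sales.orders").display()
```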
- 762 Views
- 1 replies
- 0 kudos
Query not returning result - no internet connection
Hi, new to Databricks. We have set up our new environment and imported some data into tables. However, when we run a query against a table using a newly created warehouse, we can see the query runs fine but it throws the error: failed communicating with serve...
It sounds like there might be a network configuration issue or a client-side problem preventing your query results from appearing in the session, even though they're executing successfully on the server. Here are a few things you could check: Network ...
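One way to isolate a browser/client-side issue from a server-side one is to fetch results from the same warehouse outside the UI, for example with the databricks-sql-connector package. The hostname, HTTP path, and token below are placeholders, not values from the thread:

```python
# pip install databricks-sql-connector
# Quick connectivity test against the SQL warehouse from outside the browser.
from databricks import sql

with sql.connect(
    server_hostname="<workspace-host>.cloud.databricks.com",  # placeholder
    http_path="/sql/1.0/warehouses/<warehouse-id>",            # placeholder
    access_token="<personal-access-token>",                    # placeholder
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT current_catalog(), current_schema()")
        print(cur.fetchall())
```

If this returns rows, the warehouse and network path are fine and the problem is likely in the browser session or a proxy/firewall in front of it.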
- 2442 Views
- 2 replies
- 2 kudos
Restrict a Workspace User from Creating/Managing Databricks Jobs
Hello Databricks team, I currently have a workspace user, and I want to disable their ability to create or manage Databricks jobs entirely. Specifically, I would like to prevent the user from accessing the "Create Job" option in the Databricks UI or v...
Are there any updates on any of these internal feature requests? It's a pretty big failure in the permissions model that we can't prevent users from scheduling arbitrary workloads on compute they are able to access.
- 3272 Views
- 0 replies
- 0 kudos
Lakehouse Federation - Unable to connect to Snowflake using "PEM Private Key"
Hi, I'm currently using the Lakehouse Federation feature on Databricks to run queries against a Snowflake data warehouse. Today I'm using a service credential to establish the connection (user id & pwd), but I have to change it to use a private key. I tried us...
- 883 Views
- 2 replies
- 0 kudos
Can AWS workspaces share subnets?
The docs state: "You can choose to share one subnet across multiple workspaces or both subnets across workspaces." As well as: "You can reuse existing security groups rather than create new ones." And on this page: "If you plan to share a VPC and subnets ...
Hi @spd_dat, I will check with the team internally and try to replicate the same in my workspace. This should be possible, as the documentation indicates. Based on the error you are hitting, could you please share the configuration you are setting in your...
- 484 Views
- 1 replies
- 0 kudos
Databricks SSO enabled with Azure AD and this setup was deleted
Hi Team, SSO was enabled on the Databricks account with Azure AD, and the environment is on the AWS platform. The enterprise application that was used has been deleted from Azure AD. No emergency user access has been set up. It is locked out; is it possible to ...
Hi @Ashok_AWS, could you file a case with us so we can help you? Send a mail to help@databricks.com. See https://docs.databricks.com/aws/en/resources/support
- 1491 Views
- 3 replies
- 2 kudos
Question about moving to Serverless compute
Hi, my organization is using Databricks in the Canada East region, and Serverless isn't available in our region yet (or at all?). I would like to know if it is worth the effort to change region to Canada Central, where Serverless compute is available. We do ...
Hi Takuya Omi, thank you for responding. My question is: if we are to migrate our existing workspaces (3) and UC to Canada Central, is it doable? Is it worth it? What does it imply? What are the best practices to do so? Thank you.
- 896 Views
- 2 replies
- 0 kudos
Removing storage account location from metastore fails
I am trying to remove the storage account location for our UC metastore, but I am getting an error. I have tried assigning my user and service principal permission to create an external location.
I accomplished it by creating a storage credential and external location manually. Then I was able to remove the metastore path. What then happened was that the path for the __databricks_internal catalog was set to the storage account that ...
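A minimal sketch of the manual external-location step described above, assuming a storage credential named metastore_cred already exists (created via Catalog Explorer or the API) and using a placeholder container/path — the names are hypothetical:

```python
# Register the metastore root path as an external location backed by an
# existing storage credential, then retry removing the metastore-level path.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS metastore_root_loc
  URL 'abfss://metastore@<storage-account>.dfs.core.windows.net/'
  WITH (STORAGE CREDENTIAL metastore_cred)
""")

# Confirm the location resolves and who can use it.
spark.sql("DESCRIBE EXTERNAL LOCATION metastore_root_loc").display()
```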
- 2327 Views
- 2 replies
- 0 kudos
Resolved! How does a non-admin user read a public s3 bucket on serverless?
As an admin, I can easily read a public S3 bucket from serverless: spark.read.parquet("s3://[public bucket]/[path]").display() So can a non-admin user, from classic compute. But why does a non-admin user, from serverless (both environments 1 & 2), get t...
Hi @spd_dat, is the S3 bucket in the same region as your workspace? It might require using an IAM role / S3 bucket policy to allow access to the bucket even if it is public. Just for a test, can you try giving the user who is trying the below permission: GRANT SELECT ...
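The exact grant in the reply above is truncated, so as one possible pattern (not necessarily the one intended): an admin can register the public bucket as an external location and grant the user READ FILES on it, after which serverless reads resolve through Unity Catalog. The location, credential, bucket, and user names below are hypothetical:

```python
# Admin side: register the public bucket as an external location and allow
# the user to read files from it.
spark.sql("""
  CREATE EXTERNAL LOCATION IF NOT EXISTS public_bucket_loc
  URL 's3://<public-bucket>/'
  WITH (STORAGE CREDENTIAL some_aws_credential)
""")
spark.sql(
    "GRANT READ FILES ON EXTERNAL LOCATION public_bucket_loc TO `user@example.com`"
)

# User side (serverless): the read should now be authorized via Unity Catalog.
spark.read.parquet("s3://<public-bucket>/<path>").display()
```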
- 707 Views
- 1 replies
- 0 kudos
AWS custom role for Databricks clusters - no instance profile ARN
I tried to follow the instructions to create a custom IAM role for EC2 instances in Databricks clusters, but I can't find the instance profile ARN on the role. If I create a regular IAM role for EC2, I can find both the role ARN and the instance profile ARN. https://d...
@Wayne I need to understand more about what you’re trying to achieve, but if you’re looking to grant permissions to the EC2 instances running behind a Databricks cluster using an instance profile, the following documentation provides a detailed explan...
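A likely cause is that only the EC2 console creates an instance profile automatically when you create a role; roles created via the CLI, API, or for other use cases don't get one. A hedged boto3 sketch of creating the missing instance profile for an existing role (role and profile names are hypothetical):

```python
import boto3

iam = boto3.client("iam")
ROLE_NAME = "databricks-custom-role"        # hypothetical: your existing role
PROFILE_NAME = "databricks-custom-profile"  # hypothetical profile name

# Create an instance profile and attach the existing role to it.
iam.create_instance_profile(InstanceProfileName=PROFILE_NAME)
iam.add_role_to_instance_profile(
    InstanceProfileName=PROFILE_NAME, RoleName=ROLE_NAME
)

# This is the ARN to register under the instance profiles settings in Databricks.
profile = iam.get_instance_profile(InstanceProfileName=PROFILE_NAME)
print(profile["InstanceProfile"]["Arn"])
```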
- 11674 Views
- 2 replies
- 0 kudos
You can get the job details from the Jobs Get API, which takes the job ID as a parameter. This will give you all the information available about the job, specifically the job name. Please note that there is not a field called "job description" in the...
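For reference, a small sketch of calling the Jobs Get endpoint from Python; the host, token, and job ID are placeholders:

```python
import requests

HOST = "https://<workspace-host>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                        # placeholder
JOB_ID = 12345                                           # placeholder

resp = requests.get(
    f"{HOST}/api/2.1/jobs/get",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"job_id": JOB_ID},
)
resp.raise_for_status()
job = resp.json()
# The job name lives under "settings"; there is no "job description" field.
print(job["settings"]["name"])
```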
- 1189 Views
- 3 replies
- 0 kudos
system schema permission
I have Databricks workspace admin permissions and want to run a few queries on the system.billing schema to get more info on Databricks billing. I am getting the error below: [INSUFFICIENT_PERMISSIONS] Insufficient privileges: User does not have USE SCHEMA on Schema 'sy...
Hi @PoojaD, you should ask an admin to grant you access: GRANT USE SCHEMA ON SCHEMA system.billing TO [Your User]; GRANT SELECT ON TABLE system.billing.usage TO [Your User];
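The same grants from the reply above, runnable from a notebook by an admin who manages the system catalog; the user principal and the follow-up query are placeholders/examples:

```python
# To be run by an admin; the principal below is a placeholder.
spark.sql("GRANT USE SCHEMA ON SCHEMA system.billing TO `user@example.com`")
spark.sql("GRANT SELECT ON TABLE system.billing.usage TO `user@example.com`")

# Afterwards the user can query usage, e.g. recent DBU consumption by SKU:
display(spark.sql("""
  SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
  FROM system.billing.usage
  GROUP BY usage_date, sku_name
  ORDER BY usage_date DESC
  LIMIT 20
"""))
```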
- 657 Views
- 1 replies
- 0 kudos
Removing the trial version as it is running up costs
Hi, I have a trial version on my AWS account which keeps running and has been eating up a dollar per day for the last couple of days. How do I disable it and use it only when required, or completely remove it?
Hello @psgcbe, you can follow the steps below. Terminate All Compute Resources: first, navigate to the AWS Management Console, go to the EC2 Dashboard, select Instances, and terminate any running instances related to your trial. Cancel Your Subscription: afte...
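If you prefer to script the first step rather than clicking through the console, a hedged boto3 sketch; the region and the tag filter are assumptions about how your trial instances are tagged, so check your instances' actual tags first:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # adjust to your region

# Find running instances launched by the Databricks trial deployment.
# Filtering on a "Vendor" tag key is an assumption; verify against your tags.
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "instance-state-name", "Values": ["running"]},
        {"Name": "tag-key", "Values": ["Vendor"]},
    ]
)["Reservations"]

instance_ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
if instance_ids:
    ec2.terminate_instances(InstanceIds=instance_ids)
    print("Terminated:", instance_ids)
```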
- 1074 Views
- 1 replies
- 1 kudos
Resolved! Timeout settings for Postgresql external catalog connection?
Is there any way to configure timeouts for external catalog connections? We are getting some timeouts with complex queries accessing a pgsql database through the catalog. We tried configuring the connection and we got this error │ Error: cannot upda...
Hello @ErikApption, there is no direct support for a connectTimeout option in the connection settings through Unity Catalog as of now. You might need to explore these alternative timeout configurations or consider adjusting your database handling to ...
- 1702 Views
- 3 replies
- 0 kudos
Cannot create a workspace on GCP
Hi, I have been using Databricks for a couple of months and have been spinning up workspaces with Terraform. The other day we decided to end our POC and move on to an MVP. This meant cleaning up all workspaces and GCP. After the cleanup was done, I wanted to...
Did you try from the Marketplace? You may get a more detailed error there.
Labels
- Access control: 1
- Apache spark: 1
- Azure: 7
- Azure databricks: 5
- Billing: 2
- Cluster: 1
- Compliance: 1
- Data Ingestion & connectivity: 5
- Databricks Runtime: 1
- Databricks SQL: 2
- DBFS: 1
- Dbt: 1
- Delta Sharing: 1
- DLT Pipeline: 1
- GA: 1
- Gdpr: 1
- Github: 1
- Partner: 53
- Public Preview: 1
- Service Principals: 1
- Unity Catalog: 1
- Workspace: 2