- 60 Views
- 0 replies
- 0 kudos
Hi everyone, I'm new to Databricks and am trying to connect my Google Cloud Storage bucket to my Databricks workspace. I have a 43GB CSV file stored in a GCP bucket that I want to work with. Here's what I've done so far: Bucket Setup: I created a GCP bu...
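Once the bucket is wired up, reading the file is usually a short PySpark call. A minimal sketch, assuming the cluster's service account already has read access to the bucket; the bucket and file names are placeholders, not from the post:

```python
# Sketch: reading a large CSV from GCS into Spark on Databricks.
# Bucket/key names are placeholders.

def gcs_path(bucket: str, key: str) -> str:
    """Build a gs:// URI for Spark's GCS connector."""
    return f"gs://{bucket}/{key}"

# Inside a Databricks notebook, `spark` is predefined:
# df = (spark.read
#       .option("header", "true")
#       .option("inferSchema", "false")  # avoid a second full scan of 43 GB
#       .csv(gcs_path("my-bucket", "big_file.csv")))
# df.write.format("delta").saveAsTable("main.default.big_table")

print(gcs_path("my-bucket", "big_file.csv"))  # → gs://my-bucket/big_file.csv
```

Disabling schema inference (or supplying an explicit schema) matters at this size, since inference triggers an extra pass over the data.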
- 151 Views
- 1 replies
- 0 kudos
I am in the process of setting up a new Databricks account for AWS commercial. I mistakenly set up the account with the email databricks-external-nonprod-account-owner@slingshotaerospace.com to not be self-managed, and I would like for this new accoun...
Latest Reply
Or better yet, if we could delete it so I can re-create the account.
- 169 Views
- 2 replies
- 0 kudos
Hi, I am trying to establish a method of accessing secrets from AWS Secrets Manager and understand this can be done with boto, as suggested by AWS. We have created all of the relevant IAM roles, instance profiles, etc. Accessing S3 with this method is ...
Latest Reply
Hi @Alberto_Umana, yes, the role has the SecretsManagerReadWrite policy. Also, in my further investigation I tried running it via a Personal Cluster and it worked! Basically, three scenarios: - Shared Cluster with applied InstanceProfile - secrets failing - ...
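For reference, the boto3 call involved here is `get_secret_value`; the shared-vs-personal cluster difference then comes down to which identity the call runs under. A minimal sketch, with the region and secret name as placeholders:

```python
import json

def parse_secret(secret_string: str) -> dict:
    """Key/value secrets come back as a JSON document in SecretString."""
    return json.loads(secret_string)

# On a cluster whose instance profile grants SecretsManagerReadWrite:
# import boto3
# client = boto3.client("secretsmanager", region_name="eu-west-1")  # placeholder region
# resp = client.get_secret_value(SecretId="my/app/secret")          # placeholder name
# creds = parse_secret(resp["SecretString"])

print(parse_secret('{"user": "svc"}')["user"])  # → svc
```

On shared (standard access mode) clusters, instance-profile credentials may not be visible to user code the way they are on personal clusters, which matches the three scenarios described in the reply.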
- 111 Views
- 1 replies
- 0 kudos
Hi everyone, we currently have a workspace and metastore in the EastUS region, and we're planning to set up another workspace and metastore in the Canada region. Additionally, we need to be able to access data from the Canada region within the US regio...
Latest Reply
Hello @Dnirmania!
Delta Sharing is ideal for read-only, cross-platform data access without duplication. Direct metastore connections offer low-latency access between Databricks workspaces under a unified governance model. Additionally, you may explor...
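The Databricks-to-Databricks Delta Sharing flow the reply describes can be sketched from the provider (Canada) side; the share, catalog, table, and recipient names below are placeholders, not from the thread:

```python
# Hedged sketch of provider-side Delta Sharing setup, run as SQL
# from the Canada workspace. All object names are placeholders.

def add_table_stmt(share: str, table: str) -> str:
    """Build the ALTER SHARE statement for adding one table."""
    return f"ALTER SHARE {share} ADD TABLE {table}"

# spark.sql("CREATE SHARE IF NOT EXISTS canada_share")
# spark.sql(add_table_stmt("canada_share", "canada_catalog.sales.orders"))
# spark.sql("CREATE RECIPIENT IF NOT EXISTS us_workspace "
#           "USING ID '<us-metastore-sharing-id>'")  # placeholder identifier
# spark.sql("GRANT SELECT ON SHARE canada_share TO RECIPIENT us_workspace")

print(add_table_stmt("canada_share", "canada_catalog.sales.orders"))
```

On the consumer (US) side, the share then surfaces as a catalog created from the provider, so queries stay read-only with no data duplication.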
by FarBo • New Contributor III
- 144 Views
- 2 replies
- 1 kudos
Hi, we want to enable some system tables in our Databricks workspace using this command: curl -v -X PUT -H "Authorization: Bearer <PAT token>" "https://adb-0000000000.azuredatabricks.net/api/2.0/unity-catalog/metastores/<metastore-id>/systemsche...
Latest Reply
Allia (Databricks Employee)
@FarBo The billing schema is enabled by default. Other schemas must be enabled manually: https://docs.databricks.com/aws/en/admin/system-tables#enable-system-table-schemas
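The curl call from the question can also be made from a notebook. A minimal sketch using `requests`; the host, metastore ID, schema name, and token are placeholders, and the endpoint path mirrors the one in the post:

```python
# Sketch: enabling a system-table schema via the same endpoint
# as the question's curl command. All identifiers are placeholders.

def systemschema_url(host: str, metastore_id: str, schema: str) -> str:
    """Build the enable-system-schema endpoint from the post's curl call."""
    return (f"{host}/api/2.0/unity-catalog/metastores/"
            f"{metastore_id}/systemschemas/{schema}")

# import requests
# resp = requests.put(
#     systemschema_url("https://adb-0000000000.azuredatabricks.net",
#                      "<metastore-id>", "access"),
#     headers={"Authorization": f"Bearer {pat}"},  # pat: placeholder PAT token
# )
# resp.raise_for_status()

print(systemschema_url("https://host", "mid", "access"))
```

Note the caller needs metastore-admin privileges for this call to succeed, regardless of how it is issued.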
by saiV06 • New Contributor III
- 131 Views
- 0 replies
- 0 kudos
Hi, I'm currently using the Lakehouse Federation feature on Databricks to run queries against a Snowflake data warehouse. Today I'm using a service credential to establish the connection (user ID & password), but I have to change it to use a private key. I tried us...
by Behwar • New Contributor III
- 898 Views
- 4 replies
- 1 kudos
Hello, I've deployed Azure Databricks with a standard Private Link setup (no public IP). Everything works as expected: I can log in via the private/internal network, create clusters, and manage workloads without any issues. When I create a Databricks Ap...
Latest Reply
Do you have a private endpoint for databricks_ui_api? You need to establish a private endpoint for users to access the web app.
- 212 Views
- 1 replies
- 1 kudos
Is there any way to configure timeouts for external catalog connections? We are getting timeouts with complex queries accessing a PostgreSQL database through the catalog. We tried configuring the connection and got this error: Error: cannot upda...
Latest Reply
Hello @ErikApption, there is no direct support for a connectTimeout option in the connection settings through Unity Catalog as of now. You might need to explore alternative timeout configurations or consider adjusting your database handling to ...
- 584 Views
- 3 replies
- 0 kudos
Hi, I have been using Databricks for a couple of months and have been spinning up workspaces with Terraform. The other day we decided to end our POC and move on to an MVP. This meant cleaning up all workspaces and GCP. After the cleanup was done, I wanted to...
Latest Reply
Did you try from the Marketplace? You may get a more detailed error there.
- 379 Views
- 3 replies
- 1 kudos
Hi Databricks Community, I'm currently facing several challenges with my Databricks clusters running on Google Kubernetes Engine (GKE). I hope someone here might have insights or suggestions to resolve the issues. Problem overview: I am experiencing fre...
Latest Reply
I am having similar issues. The first time I used the `databricks_cluster` resource, my terraform apply did not gracefully complete, and I saw numerous errors like: 1. Can't scale up a node pool because of a failing scheduling predicate. The autoscale...
- 640 Views
- 3 replies
- 5 kudos
Hi all, I have a couple of use cases that may benefit from using graphs. I'm interested in whether anyone has graph databases in Production and, if so, whether you're using GraphFrames, Neo4j or something else? What is the architecture you have the...
Latest Reply
Up to now, the way to go is GraphX or GraphFrames. There is also the possibility to use Python libraries or others (single node, that is), perhaps even Arrow-based. Another option is to load the data into a graph database and then move back to Databricks a...
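For reference, the GraphFrames pattern mentioned above looks roughly like this (it assumes the `graphframes` package is installed on the cluster; the pure-Python helper just illustrates the same in-degree computation on a plain edge list):

```python
# GraphFrames expects a vertex DataFrame with an "id" column and an
# edge DataFrame with "src"/"dst" columns:
#
# from graphframes import GraphFrame
# vertices = spark.createDataFrame([("a",), ("b",), ("c",)], ["id"])
# edges = spark.createDataFrame([("a", "b"), ("b", "c"), ("a", "c")],
#                               ["src", "dst"])
# g = GraphFrame(vertices, edges)
# g.inDegrees.show()

# The same in-degree computation on a plain edge list, for illustration:
from collections import Counter

def in_degrees(edges):
    """Count incoming edges per destination vertex."""
    return Counter(dst for _src, dst in edges)

print(sorted(in_degrees([("a", "b"), ("b", "c"), ("a", "c")]).items()))
# → [('b', 1), ('c', 2)]
```

GraphFrames keeps everything in Spark DataFrames, so it scales with the cluster; single-node libraries like NetworkX are simpler but bounded by driver memory.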
- 222 Views
- 1 replies
- 0 kudos
Hello, I am working on creating an architecture diagram for Databricks on AWS. I would like to adopt the de facto standard used by enterprises. Based on my research, I have identified the following components: Network: Customer-managed VPC, Secure Cluste...
Latest Reply
I would not call it a 'standard' but a possible architecture. The great thing about the cloud is you can complete the puzzle in many ways and make it as complex or as easy as possible. Also, I would not consider Fivetran to be standard in companies. ...
- 656 Views
- 6 replies
- 0 kudos
Hello, my organization is experiencing difficulties updating our Google Kubernetes Engine (GKE) cluster. We've reviewed the official GKE documentation for automated cluster updates, but it appears to primarily focus on AWS integrations. We haven't foun...
Latest Reply
Have you tried Terraform or gcloud scripts for automation?
- 385 Views
- 5 replies
- 1 kudos
Hello experts, I am trying to get receipts for the monthly payments made to Databricks. I need them for the finance department of the organization I am working for. The only billing information I have access to is the usage dashboards and the tables ...