Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

swee
by New Contributor
  • 963 Views
  • 1 reply
  • 1 kudos

Resolved! Establish cross-cloud connectivity between Azure Databricks and AWS S3

Hello. We have the cross-cloud configuration set up as below: AWS - VPC, Transit Gateways, AWS Direct Connect; on-premises data center; Azure - VNet, transit VNet, ExpressRoute. We are trying to create a Databricks storage credential as below. The AWS IAM obj...

Latest Reply
Sai_Ponugoti
Databricks Employee
  • 1 kudos

Hello @swee, thank you for your query. If your storage account is private, you would need to establish a route to that storage account so you can read data. This is because if your storage is private, your storage account will block access to the publi...

asharkman
by New Contributor III
  • 4889 Views
  • 8 replies
  • 2 kudos

Resolved! Reporting serverless costs to Azure costs

So, we've just recently applied serverless budget policies to some of our vector searches and apps. At the moment they're all going to Azure under one general tag that we created. However, we needed more definition, so I added the serverless budget pol...

Latest Reply
mrsimon0007
New Contributor II
  • 2 kudos

Billing or set up explicit export pipelines. Check whether your serverless budget policy tags are under a different namespace in Azure, as sometimes they show up nested.
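If the policy tags do land in the Azure cost export under a namespaced key, one quick way to confirm is to parse the tags column of an exported row and look for Databricks-related keys. A minimal sketch; the key names here are made up, since real exports carry whatever tag names your policy applied:

```python
import json

def find_databricks_tags(tags_json):
    """Return tag keys/values that look Databricks-related, including keys
    carrying a namespace prefix (e.g. 'x_databricks:budget_policy')."""
    tags = json.loads(tags_json or "{}")
    return {
        key: value
        for key, value in tags.items()
        if "databricks" in key.lower() or "budget" in key.lower()
    }

# Example tags column from a cost export row (values are made up):
row_tags = '{"env": "prod", "x_databricks:budget_policy": "vector-search"}'
print(find_databricks_tags(row_tags))
```

Running this over a day's export would show whether the budget policy tags arrive flat, nested under a prefix, or not at all.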

7 More Replies
akmukherjee
by New Contributor III
  • 9062 Views
  • 24 replies
  • 5 kudos

Resolved! Unable to enable Serverless Notebooks

Hello there, I have a Databricks Premium subscription but am not able to enable Serverless Notebooks (as that option does not seem to exist). I have gone through the Databricks documentation and have Unity Catalog enabled. I even opened a ticket (00591635) but it...

Latest Reply
Fellnerse
New Contributor II
  • 5 kudos

We have the exact same thing happening. The /config endpoint shows it is enabled, but we cannot select it. The account is several months old. @Walter_C could you help out here as well?

23 More Replies
jp_allard1
by New Contributor
  • 1847 Views
  • 2 replies
  • 2 kudos

Resolved! Databricks One

Hello, I cannot find where to enable Databricks One in my workspace. Can someone help me understand where this is located or who can grant me access to this feature? I checked the "Previews" in my account and it is not there. Thanks in advance. Best, J...

Latest Reply
koji_kawamura
Databricks Employee
  • 2 kudos

Hi @jp_allard1, Databricks One is now in Public Preview. It is a workspace-level feature, so a user with the Workspace Admin role should be able to enable it from the workspace Previews settings page, as shown in this screenshot.

1 More Replies
yumnus
by Databricks Partner
  • 2776 Views
  • 7 replies
  • 0 kudos

Not able to connect to GCP Secret Manager except when using "No isolation shared" Cluster

Hey everyone, we're trying to access secrets stored in GCP Secret Manager using its Python package from Databricks on GCP. However, we can only reach Secret Manager when using "No Isolation Shared" clusters, which is not an option for us. Currentl...

Latest Reply
blemgorfell
New Contributor II
  • 0 kudos

This is a huge issue. We are seeing the same thing. Google auth is broken for Databricks on GCP? Only with no isolation enabled is it able to access the metadata service and get credentials. Why is the metadata service not reachable? I would be shocke...
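One way to verify this observation is to probe the standard GCE metadata endpoint from each cluster access mode. A minimal, hypothetical sketch; the URL and header are the ones GCP documents for its metadata server, and the fetcher is injected so the logic can be exercised without a live cluster:

```python
# Standard GCE metadata server endpoint and required header (per GCP docs).
METADATA_URL = "http://metadata.google.internal/computeMetadata/v1/"
METADATA_HEADERS = {"Metadata-Flavor": "Google"}

def metadata_reachable(fetch):
    """fetch(url, headers) should return an HTTP status code, or raise
    OSError when the service cannot be reached at all."""
    try:
        return fetch(METADATA_URL, METADATA_HEADERS) == 200
    except OSError:
        return False
```

On a cluster you could pass something like `lambda url, h: requests.get(url, headers=h, timeout=2).status_code`; a `False` result on a standard-isolation cluster but `True` on "No Isolation Shared" would confirm the routing difference described above.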

6 More Replies
APJESK
by Contributor
  • 854 Views
  • 1 reply
  • 2 kudos

Resolved! Serverless Workspace Observability

I’m setting up observability for a Databricks serverless workspace on AWS and need some guidance. I know we can configure audit logs for S3 delivery, but I’m unsure if that alone is sufficient. For a complete observability setup, especially when integra...

Latest Reply
sarahbhord
Databricks Employee
  • 2 kudos

Hey @APJESK, thanks for reaching out! For comprehensive observability in a Databricks serverless workspace on AWS, particularly when integrating with tools like CloudWatch, Splunk, or Kibana, enabling audit log delivery to S3 is a crucial first st...

snazkx
by New Contributor
  • 947 Views
  • 1 reply
  • 1 kudos

Databricks OAuth errors with Terraform deployment in GCP

Trying to deploy a Databricks workspace in GCP using Terraform with a customer-managed VPC. The only difference from the Terraform provider configuration is that I have a pre-created shared VPC in a host project, and a dedicated workspace project with the ...

Administration & Architecture
GCP databricks
gcp-databricks
Terraform
Latest Reply
Sai_Ponugoti
Databricks Employee
  • 1 kudos

Hey @snazkx, thank you for your question. We have a set of modules built by the Databricks field team, which helps you to create your workspace with a new VPC or use a pre-existing VPC (BYOVPC). You can find more information in our Git repo. Could you pl...

807326
by New Contributor II
  • 8260 Views
  • 3 replies
  • 0 kudos

Enable automatic schema evolution for Delta Lake merge for an SQL warehouse

Hello! We tried to update our integration scripts to use SQL warehouses instead of general compute clusters to fetch and update data, but we faced a problem. We use automatic schema evolution when we merge tables, but with a SQL warehouse, when we try...

Latest Reply
User16844475297
Databricks Employee
  • 0 kudos

This feature has been released. Please see docs for more details - https://docs.databricks.com/aws/en/sql/language-manual/delta-merge-into#with-schema-evolution
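For readers landing here, the statement-level opt-in from the linked docs looks like the sketch below; the table and column names are placeholders:

```python
def merge_with_schema_evolution(target, source, key):
    """Build a Databricks SQL MERGE statement that opts in to automatic
    schema evolution for this statement (the WITH SCHEMA EVOLUTION clause
    from the linked docs); target, source, and key are placeholders."""
    return (
        f"MERGE WITH SCHEMA EVOLUTION INTO {target} AS t "
        f"USING {source} AS s ON t.{key} = s.{key} "
        "WHEN MATCHED THEN UPDATE SET * "
        "WHEN NOT MATCHED THEN INSERT *"
    )

stmt = merge_with_schema_evolution("main.sales.orders", "staging_orders", "order_id")
# On a SQL warehouse this would run via the SQL editor or spark.sql(stmt).
```

Because the clause is part of the statement itself, it works on SQL warehouses without any cluster-level `autoMerge` configuration.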

2 More Replies
mariadelmar
by New Contributor
  • 456 Views
  • 1 reply
  • 0 kudos

Community account & Cluster activation

Hi all, I am having trouble accessing the Community version. When I try to log in, the process states that I do not have a workspace, but I do, as I have an account on the Free version that runs well. The point is that I am doing an exercise for a master'...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @mariadelmar, if I understood correctly, you have an account in Databricks Free Edition and you would also like to log in to the Community Edition. Now, if you did not previously have a registered account in the Community Edition (before the Free E...

mariadelmar
by New Contributor
  • 652 Views
  • 1 reply
  • 1 kudos

Resolved! Community log in

Hi all, I am having trouble accessing the Community version. When I try to log in, the process states that I do not have a workspace, but I do, as I have an account on the Free version that runs well. The point is that I am doing an exercise for a master's...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @mariadelmar, if I understood correctly, you have an account in Databricks Free Edition and you would also like to log in to the Community Edition. Now, if you did not previously have a registered account in the Community Edition (before the Free E...

APJESK
by Contributor
  • 1166 Views
  • 3 replies
  • 1 kudos

Resolved! Regarding Traditional workspace - Classic and Serverless Architecture

Why does Databricks require creating AWS resources on our AWS account (IAM role, VPC, subnets, security groups) when deploying a Traditional workspace, even if we plan to use only serverless compute, which runs fully in the Databricks account and onl...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Hey @APJESK , Databricks requires AWS resources such as IAM roles, VPCs, subnets, and security groups when deploying a Traditional workspace—even if you plan to use only serverless compute—because of how the platform distinguishes between workspace t...

2 More Replies
SørenBrandt2
by New Contributor II
  • 1721 Views
  • 3 replies
  • 0 kudos

Terraform - Assign permissions to Entra Id group

Dear all, using a Terraform workspace-level provider, I am trying to add an Entra ID group to the account, and then assign permissions to the group. The Terraform provider runs in the context of an Entra ID user account with workspace admin permissi...

Latest Reply
nayan_wylde
Esteemed Contributor II
  • 0 kudos

@SørenBrandt2 Here are a few quick checks you can do before rerunning. 1. Please make sure the service principal running the Terraform code has the Group Manager role on the specific account group. With that role, it can read that group at the account and retrie...

2 More Replies
Paul_Headey
by Databricks Partner
  • 572 Views
  • 1 replies
  • 0 kudos

Resolved! Lakebridge Access

Shortly after Bladebridge was acquired, I requested access to the Converter, but was told that it won't be available to partners as a tool, only as a professional service. Is this still the case, or can I get my team access to the Converter? ...

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @Paul_Headey! For the most accurate and up-to-date information on this, please reach out to your Databricks representative or contact help@databricks.com.

APJESK
by Contributor
  • 1025 Views
  • 3 replies
  • 2 kudos

Can I subscribe to Databricks in AWS Marketplace via Terraform/automation? How to handle this?

How to subscribe to the Databricks product from AWS Marketplace using Terraform or another automation tool?

Latest Reply
APJESK
Contributor
  • 2 kudos

For now, we got clarification that the Databricks subscription on AWS Marketplace can't be automated. I will be in touch; more questions are on the way.

2 More Replies
Venugopal
by New Contributor III
  • 5546 Views
  • 3 replies
  • 0 kudos

How to upload a file to a Unity Catalog volume using Databricks Asset Bundles

Hi, I am working on a CI/CD blueprint for developers, using which developers can create their bundle for jobs/workflows and then create a volume to which they will upload a wheel file or a jar file which will be used as a dependency in their noteboo...

Latest Reply
chanukya-pekala
Contributor III
  • 0 kudos

With this setup, users who are entitled to access the catalog will have access to the volume, if permissions are set this way. Users will be able to use the notebook, and we need to provide documentation either to clone the noteboo...
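As a sketch of the bundle side of this, `workspace.artifact_path` in `databricks.yml` can point at a Unity Catalog volume so that built wheels land there on deploy; the catalog, schema, volume, and package names below are illustrative:

```yaml
# databricks.yml (fragment) - names are illustrative
bundle:
  name: my_ci_cd_bundle

artifacts:
  my_wheel:
    type: whl
    path: ./my_package   # directory containing the wheel's build config

workspace:
  # Upload built artifacts to a UC volume instead of the default
  # workspace folder, so jobs can reference them as dependencies.
  artifact_path: /Volumes/main/default/deploy_artifacts
```

Grants on the catalog, schema, and volume then control which users can read the uploaded wheel.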

2 More Replies