Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

by yumnus (New Contributor III)
  • 1496 Views
  • 7 replies
  • 0 kudos

Not able to connect to GCP Secret Manager except when using "No isolation shared" Cluster

Hey everyone, We’re trying to access secrets stored in GCP Secret Manager using its Python package from Databricks on GCP. However, we can only reach the Secret Manager when using "No Isolation Shared" clusters, which is not an option for us. Currentl...

Latest Reply
blemgorfell
New Contributor II
  • 0 kudos

This is a huge issue. We are seeing the same thing. Is Google auth broken for Databricks on GCP? Only with no isolation enabled is it able to access the metadata service and get credentials. Why is the metadata service not reachable? I would be shocke...
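A common workaround while the metadata service is unreachable is to authenticate with an explicit service-account key instead of ambient credentials. Below is a minimal sketch, assuming the key JSON is stored in a Databricks secret scope; the scope, key, project, and secret names are hypothetical placeholders, and this does not fix the underlying networking issue.

```python
# Minimal sketch (run in a Databricks notebook, where dbutils is a global):
# bypass the unreachable metadata service by building the Secret Manager
# client from an explicit service-account key. Names are hypothetical.
import json

from google.cloud import secretmanager
from google.oauth2 import service_account

# Load a service-account key JSON previously stored in a Databricks secret scope.
sa_key = json.loads(dbutils.secrets.get(scope="gcp", key="sa-key-json"))
creds = service_account.Credentials.from_service_account_info(sa_key)

# Pass credentials explicitly instead of relying on ambient (metadata) auth.
client = secretmanager.SecretManagerServiceClient(credentials=creds)

name = "projects/my-gcp-project/secrets/my-secret/versions/latest"
response = client.access_secret_version(request={"name": name})
print(response.payload.data.decode("utf-8"))
```

The trade-off is managing a long-lived key rather than the cluster's attached identity, so treat it as a stopgap.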

6 More Replies
by APJESK (New Contributor III)
  • 296 Views
  • 1 reply
  • 2 kudos

Resolved! Serverless Workspace Observability

I’m setting up observability for a Databricks serverless workspace on AWS and need some guidance. I know we can configure audit logs for S3 delivery, but I’m unsure if that alone is sufficient. For a complete observability setup, especially when integra...

Latest Reply
sarahbhord
Databricks Employee
  • 2 kudos

Hey @APJESK - thanks for reaching out! For comprehensive observability in a Databricks serverless workspace on AWS, particularly when integrating with tools like CloudWatch, Splunk, or Kibana, enabling audit log delivery to S3 is a crucial first st...
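For reference, the first step described above maps to the account-level log-delivery REST endpoint. A minimal sketch, assuming a credentials configuration (cross-account IAM role) and a storage configuration (S3 bucket) are already registered; the account ID, token, and IDs are hypothetical placeholders.

```python
# Minimal sketch: register audit log delivery to S3 via the account API.
import requests

ACCOUNT_ID = "<databricks-account-id>"
BASE = f"https://accounts.cloud.databricks.com/api/2.0/accounts/{ACCOUNT_ID}"

resp = requests.post(
    f"{BASE}/log-delivery",
    headers={"Authorization": "Bearer <account-admin-oauth-token>"},
    json={
        "log_delivery_configuration": {
            "config_name": "audit-logs-to-s3",
            "log_type": "AUDIT_LOGS",
            "output_format": "JSON",
            "credentials_id": "<credentials-id>",
            "storage_configuration_id": "<storage-configuration-id>",
        }
    },
)
resp.raise_for_status()
print(resp.json())
```

From there, CloudWatch, Splunk, or Kibana can ingest the JSON files landing in the bucket.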

by snazkx (New Contributor)
  • 351 Views
  • 1 reply
  • 1 kudos

Databricks OAuth errors with Terraform deployment in GCP

Trying to deploy a Databricks workspace in GCP using Terraform with a customer-managed VPC. The only difference from the Terraform provider configuration is that I have a pre-created shared VPC in a host project, and a dedicated workspace project with the ...

Administration & Architecture
GCP databricks
gcp-databricks
Terraform
Latest Reply
Sai_Ponugoti
Databricks Employee
  • 1 kudos

Hey @snazkx, thank you for your question. We have a set of modules built by the Databricks field team, which help you to create your workspace with a new VPC or use a pre-existing VPC (BYOVPC). You can find more information in our git repo. Could you pl...

by 807326 (New Contributor II)
  • 6877 Views
  • 3 replies
  • 0 kudos

Enable automatic schema evolution for Delta Lake merge on a SQL warehouse

Hello! We tried to update our integration scripts and use SQL warehouses instead of general compute clusters to fetch and update data, but we faced a problem. We use automatic schema evolution when we merge tables, but with a SQL warehouse, when we try...

Latest Reply
User16844475297
Databricks Employee
  • 0 kudos

This feature has been released. Please see docs for more details - https://docs.databricks.com/aws/en/sql/language-manual/delta-merge-into#with-schema-evolution
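For readers landing here, the released syntax scopes schema evolution to a single statement rather than a session setting. A minimal sketch with hypothetical table names; in a notebook you can wrap it in spark.sql (spark is a notebook global), while on a SQL warehouse you run the statement directly.

```python
# Minimal sketch: per-statement schema evolution in MERGE (see linked docs).
# Table names are hypothetical.
spark.sql("""
    MERGE WITH SCHEMA EVOLUTION INTO main.sales.target AS t
    USING main.sales.updates AS s
    ON t.id = s.id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```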

2 More Replies
by mariadelmar (New Contributor)
  • 225 Views
  • 1 reply
  • 0 kudos

Community account & Cluster activation

Hi all, I am having trouble accessing the Community version: when I try to log in, the process states that I do not have a workspace, but I do, as I have an account on the free version that runs well. The point is that I am doing an exercise for a master'...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @mariadelmar, if I understood correctly, you have an account in Databricks Free Edition and you would also like to log in to the Community Edition. Now, if you did not previously have a registered account in the Community Edition (before the Free E...

by mariadelmar (New Contributor)
  • 280 Views
  • 1 reply
  • 1 kudos

Resolved! Community log in

Hi all, I am having trouble accessing the Community version: when I try to log in, the process states that I do not have a workspace, but I do, as I have an account on the free version that runs well. The point is that I am doing an exercise for a master's...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @mariadelmar, if I understood correctly, you have an account in Databricks Free Edition and you would also like to log in to the Community Edition. Now, if you did not previously have a registered account in the Community Edition (before the Free E...

by APJESK (New Contributor III)
  • 375 Views
  • 3 replies
  • 1 kudos

Resolved! Regarding Traditional workspace - Classic and Serverless Architecture

Why does Databricks require creating AWS resources on our AWS account (IAM role, VPC, subnets, security groups) when deploying a Traditional workspace, even if we plan to use only serverless compute, which runs fully in the Databricks account and onl...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Hey @APJESK , Databricks requires AWS resources such as IAM roles, VPCs, subnets, and security groups when deploying a Traditional workspace—even if you plan to use only serverless compute—because of how the platform distinguishes between workspace t...

2 More Replies
by SørenBrandt2 (New Contributor II)
  • 459 Views
  • 3 replies
  • 0 kudos

Terraform - Assign permissions to Entra ID group

Dear All, Using a Terraform workspace-level provider, I am trying to add an Entra ID group to the account, and then assign permissions to the group. The Terraform provider runs in the context of an Entra ID user account with workspace admin permissi...

Latest Reply
nayan_wylde
Honored Contributor III
  • 0 kudos

@SørenBrandt2 Here are a few quick checks you can do before rerunning. 1. Please make sure the service principal running the Terraform code has the Group Manager role on the specific account group. With that role, it can read that group at the account level and retrie...
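For context, the assignment itself (what the provider's databricks_mws_permission_assignment resource performs) boils down to a single account-level call, which can help isolate whether the failure is permissions or configuration. A minimal sketch, assuming an Azure Databricks account; all IDs and the token are hypothetical placeholders.

```python
# Minimal sketch: assign an account group to a workspace via the
# permission-assignments account API. IDs and token are hypothetical.
import requests

ACCOUNT_ID = "<databricks-account-id>"
WORKSPACE_ID = "<workspace-id>"
GROUP_ID = "<numeric-principal-id-of-the-entra-synced-group>"

resp = requests.put(
    f"https://accounts.azuredatabricks.net/api/2.0/accounts/{ACCOUNT_ID}"
    f"/workspaces/{WORKSPACE_ID}/permissionassignments/principals/{GROUP_ID}",
    headers={"Authorization": "Bearer <account-admin-oauth-token>"},
    json={"permissions": ["USER"]},  # or ["ADMIN"]
)
resp.raise_for_status()
```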

2 More Replies
by Paul_Headey (New Contributor)
  • 213 Views
  • 1 reply
  • 0 kudos

Resolved! Lakebridge Access

Shortly after Bladebridge was acquired, I requested access to the Converter, but was told that it won't be available to partners as a tool, only as a professional service. Is this still the case, or can I get my team access to the Converter? ...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @Paul_Headey! For the most accurate and up-to-date information on this, please reach out to your Databricks representative or contact help@databricks.com.

by APJESK (New Contributor III)
  • 384 Views
  • 3 replies
  • 2 kudos

Can I subscribe to Databricks in AWS Marketplace via Terraform/automation? How to handle this?

How can I subscribe to the Databricks product in the AWS Marketplace using Terraform or another automation tool?

Latest Reply
APJESK
New Contributor III
  • 2 kudos

For now, we got clarification that the Databricks subscription in the AWS Marketplace can't be automated. I will be in touch; more questions are on the way.

2 More Replies
by Venugopal (New Contributor III)
  • 3860 Views
  • 3 replies
  • 0 kudos

How to upload a file to a Unity Catalog volume using Databricks Asset Bundles

Hi, I am working on a CI/CD blueprint for developers, using which developers can create their bundle for jobs/workflows and then create a volume to which they will upload a wheel file or a jar file which will be used as a dependency in their noteboo...

Latest Reply
chanukya-pekala
Contributor II
  • 0 kudos

With this setup, users who are entitled to access the catalog will have access to use the volume, if permissions are set this way. And users will be able to utilize the notebook, and we need to provide documentation either to clone the noteboo...
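As one concrete way to handle the upload step from a CI pipeline (alongside the bundle deploy), here is a minimal sketch, assuming the databricks-sdk Files API; the catalog, schema, volume, and wheel names are hypothetical.

```python
# Minimal sketch: upload a built wheel to a Unity Catalog volume so jobs can
# reference it as a dependency. Names and paths are hypothetical.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # auth from env vars or ~/.databrickscfg

with open("dist/my_lib-0.1.0-py3-none-any.whl", "rb") as f:
    w.files.upload(
        "/Volumes/main/ci_artifacts/wheels/my_lib-0.1.0-py3-none-any.whl",
        f,
        overwrite=True,
    )
```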

2 More Replies
by tana_sakakimiya (New Contributor III)
  • 845 Views
  • 6 replies
  • 5 kudos

Resolved! Event-driven Architecture with Lake Monitoring without "Trigger on Arrival" on DABs

AWS Databricks. I want to create data quality monitoring and an event-driven architecture without a trigger on file arrival, running once at deploy instead. I plan to create a job which triggers once at deploy. The job runs these tasks sequentially: 1. run a script to create ex...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor II
  • 5 kudos

@tana_sakakimiya Ah, I think I see the difference. My screenshot says that "external tables" backed by Delta Lake will work. This means you'll need to have the table already created in Databricks from your external location, i.e. make an external ta...
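A minimal sketch of that step, registering an external Delta table from an external location so monitoring can attach to it; the catalog, schema, and S3 path are hypothetical (run in a notebook, where spark is a global).

```python
# Minimal sketch: create an external table over an existing Delta location.
# Names and the S3 path are hypothetical.
spark.sql("""
    CREATE TABLE IF NOT EXISTS main.monitoring.events
    USING DELTA
    LOCATION 's3://my-bucket/path/to/delta-table'
""")
```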

5 More Replies
by AmpolJon (New Contributor II)
  • 858 Views
  • 6 replies
  • 3 kudos

Resolved! How to send parameters from an HTTP request to a notebook running in a job

I've tried to trigger a job run via an n8n workflow, which can successfully make the notebook run. BUT another bullet to achieve is that I have to send some data to that job run as well; I googled it and can't find solutions anywhere. My setup wa...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor II
  • 3 kudos

@AmpolJon I don't think you should be giving up on the method; the API allows you to pass job parameters to it, and you can retrieve them from within the Python notebook. Here's an example. 1. Call the API: https://docs.databricks.com/api/workspace/jobs...
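A minimal sketch of that flow: the request an n8n HTTP node would send to run the job with parameters, plus how the notebook reads them. The host, token, and job ID are hypothetical placeholders.

```python
# Minimal sketch: trigger a run with job parameters over REST.
import requests

resp = requests.post(
    "https://<workspace-host>/api/2.1/jobs/run-now",
    headers={"Authorization": "Bearer <token>"},
    json={"job_id": 123, "job_parameters": {"customer_id": "42"}},
)
resp.raise_for_status()

# Inside the job's notebook task, read the parameter via a widget:
#   customer_id = dbutils.widgets.get("customer_id")
```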

5 More Replies
by lmcconnell1665 (New Contributor)
  • 368 Views
  • 1 reply
  • 1 kudos

AWS Serverless NCC

I have set up a new Databricks workspace in AWS for one of my customers, and they are using serverless compute. We are trying to obtain durable IP addresses that we can whitelist on a Redshift instance so that Databricks can run federated queries again...

Latest Reply
Khaja_Zaffer
Contributor III
  • 1 kudos

Dear @lmcconnell1665, greetings for the day! The serverless firewall feature (which enables the stable public IPs you're seeking via the NCC's default rules) is currently in public preview on Databricks on AWS. This means it requires explicit enablement ...
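A minimal sketch of the setup being described, assuming the account-level network-connectivity API; the region, IDs, and token are hypothetical placeholders, and the preview feature must be enabled first.

```python
# Minimal sketch: create an NCC and attach it to the workspace so serverless
# egress uses its (stable) default rules. IDs and token are hypothetical.
import requests

ACCOUNT_ID = "<databricks-account-id>"
BASE = f"https://accounts.cloud.databricks.com/api/2.0/accounts/{ACCOUNT_ID}"
HEADERS = {"Authorization": "Bearer <account-admin-oauth-token>"}

# 1. Create the network connectivity configuration in the workspace's region.
ncc = requests.post(
    f"{BASE}/network-connectivity-configs",
    headers=HEADERS,
    json={"name": "redshift-egress", "region": "us-east-1"},
).json()

# 2. Bind it to the workspace; its default rules then list the egress IPs
#    you can whitelist on the Redshift side.
requests.patch(
    f"{BASE}/workspaces/<workspace-id>",
    headers=HEADERS,
    json={"network_connectivity_config_id": ncc["network_connectivity_config_id"]},
).raise_for_status()
```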

by Michael_Appiah (Contributor II)
  • 4011 Views
  • 7 replies
  • 9 kudos

Resolved! Enforcing Tags on SQL Warehouses

Is there a way to enforce tags on SQL Warehouses? Regular cluster policies do not apply to SQL Warehouses and budget policies do not cover SQL Warehouses either, which I find quite surprising given the fact that budget policies are documented as "Att...

Latest Reply
nayan_wylde
Honored Contributor III
  • 9 kudos

@Michael_Appiah Yes, the default tag policies don't apply to warehouses. The solution I can recommend is to assign a tags block if you are deploying the warehouse using Terraform, asset bundles, etc. The other solution I use is I run a ...
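A minimal sketch of that periodic audit, assuming the SQL warehouses REST endpoint; the host, token, and required tag key are hypothetical, and you could schedule this as a job that alerts on or stops non-compliant warehouses.

```python
# Minimal sketch: list SQL warehouses and flag any missing a required tag.
# Host, token, and the required key are hypothetical.
import requests

resp = requests.get(
    "https://<workspace-host>/api/2.0/sql/warehouses",
    headers={"Authorization": "Bearer <token>"},
)
resp.raise_for_status()

REQUIRED_KEY = "cost-center"
for wh in resp.json().get("warehouses", []):
    tags = (wh.get("tags") or {}).get("custom_tags", [])
    if not any(t.get("key") == REQUIRED_KEY for t in tags):
        print(f"Warehouse {wh['name']} ({wh['id']}) is missing '{REQUIRED_KEY}'")
```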

6 More Replies