Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

snazkx
by New Contributor
  • 559 Views
  • 1 reply
  • 1 kudos

Databricks OAuth errors with Terraform deployment in GCP

Trying to deploy a Databricks workspace in GCP using Terraform with a customer-managed VPC. The only difference from the Terraform provider configuration is that I have a pre-created shared VPC in a host project, and a dedicated workspace project with the ...

Administration & Architecture
GCP databricks
gcp-databricks
Terraform
Latest Reply
Sai_Ponugoti
Databricks Employee
  • 1 kudos

Hey @snazkx, thank you for your question. We have a set of modules built by the Databricks field team, which help you create your workspace with a new VPC or use a pre-existing VPC (BYOVPC). You can find more information in our git repo. Could you pl...

807326
by New Contributor II
  • 7271 Views
  • 3 replies
  • 0 kudos

Enable automatic schema evolution for Delta Lake merge for an SQL warehouse

Hello! We tried to update our integration scripts and use SQL warehouses instead of general compute clusters to fetch and update data, but we faced a problem. We use automatic schema evolution when we merge tables, but with SQL warehouse, when we try...

Latest Reply
User16844475297
Databricks Employee
  • 0 kudos

This feature has been released. Please see the docs for more details: https://docs.databricks.com/aws/en/sql/language-manual/delta-merge-into#with-schema-evolution

2 More Replies
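For reference, the released syntax linked in the reply can be sketched as follows. This is a minimal sketch with hypothetical table and column names; the statement would be submitted through a SQL warehouse connection (e.g. the databricks-sql-connector) rather than run locally.

```python
# Build a Delta Lake MERGE that uses the WITH SCHEMA EVOLUTION clause, so new
# columns in the source automatically evolve the target table's schema.
# Table and key names below are made up for illustration.

def build_merge_with_schema_evolution(target: str, source: str, key: str) -> str:
    """Return a MERGE statement that allows automatic schema evolution."""
    return (
        f"MERGE WITH SCHEMA EVOLUTION INTO {target} AS t "
        f"USING {source} AS s "
        f"ON t.{key} = s.{key} "
        "WHEN MATCHED THEN UPDATE SET * "
        "WHEN NOT MATCHED THEN INSERT *"
    )

stmt = build_merge_with_schema_evolution(
    "main.sales.customers", "staging.customers_updates", "customer_id")
print(stmt)
```

On a SQL warehouse this replaces the older cluster-side `spark.databricks.delta.schema.autoMerge.enabled` setting, which warehouses do not expose.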
mariadelmar
by New Contributor
  • 315 Views
  • 1 reply
  • 0 kudos

Community account & Cluster activation

Hi all, I am facing trouble accessing the Community version. When I try to log in, the process states that I do not have a workspace, but I do, as I have an account on the Free version running well. The point is that I am doing an exercise for a master'...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @mariadelmar, if I understood correctly, you have an account in Databricks Free Edition and you would also like to log in to the Community Edition. Now, if you did not previously have a registered account in the Community Edition (before the Free E...

mariadelmar
by New Contributor
  • 478 Views
  • 1 reply
  • 1 kudos

Resolved! Community log in

Hi all, I am facing trouble accessing the Community version. When I try to log in, the process states that I do not have a workspace, but I do, as I have an account on the Free version running well. The point is that I am doing an exercise for a master's...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @mariadelmar, if I understood correctly, you have an account in Databricks Free Edition and you would also like to log in to the Community Edition. Now, if you did not previously have a registered account in the Community Edition (before the Free E...

APJESK
by New Contributor III
  • 778 Views
  • 3 replies
  • 1 kudos

Resolved! Regarding Traditional workspace - Classic and Serverless Architecture

Why does Databricks require creating AWS resources on our AWS account (IAM role, VPC, subnets, security groups) when deploying a Traditional workspace, even if we plan to use only serverless compute, which runs fully in the Databricks account and onl...

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Hey @APJESK, Databricks requires AWS resources such as IAM roles, VPCs, subnets, and security groups when deploying a Traditional workspace, even if you plan to use only serverless compute, because of how the platform distinguishes between workspace t...

2 More Replies
SørenBrandt2
by New Contributor II
  • 836 Views
  • 3 replies
  • 0 kudos

Terraform - Assign permissions to Entra ID group

Dear all, using a Terraform workspace-level provider, I am trying to add an Entra ID group to the account, and then assign permissions to the group. The Terraform provider runs in the context of an Entra ID user account with workspace admin permissi...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 0 kudos

@SørenBrandt2 Here are a few quick checks you can do before rerunning. 1. Please make sure the service principal running the Terraform code has the Group Manager role on the specific account group. With that role, it can read that group at the account and retrie...

2 More Replies
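The first check in the reply (can the service principal read the account-level group?) can be sketched as a SCIM lookup. This is a hypothetical sketch: the account host, account ID, and group name are placeholders, and the real request needs an OAuth token for a principal holding the Group Manager role.

```python
# Build the account-level SCIM URL that filters groups by displayName.
# A 200 response with a matching resource confirms the caller can read the
# group; an empty result or 403 points at the missing Group Manager role.
from urllib.parse import quote

def account_group_lookup_url(account_host: str, account_id: str, group_name: str) -> str:
    """SCIM v2 group lookup by displayName at the account level."""
    filter_expr = quote(f'displayName eq "{group_name}"')
    return (f"{account_host}/api/2.0/accounts/{account_id}"
            f"/scim/v2/Groups?filter={filter_expr}")

url = account_group_lookup_url(
    "https://accounts.azuredatabricks.net",
    "11111111-2222-3333-4444-555555555555",
    "data-engineers")
print(url)
```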
Paul_Headey
by New Contributor
  • 385 Views
  • 1 reply
  • 0 kudos

Resolved! Lakebridge Access

Shortly after Bladebridge was acquired, I requested access to the Converter, but was told that it won't be available to partners as a tool, only as a professional service. Is this still the case, or can I get my team access to the Converter? ...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @Paul_Headey! For the most accurate and up-to-date information on this, please reach out to your Databricks representative or contact help@databricks.com.

APJESK
by New Contributor III
  • 634 Views
  • 3 replies
  • 2 kudos

Can I subscribe to Databricks in AWS Marketplace via Terraform/automation? How to handle this?

How to subscribe to the Databricks product on AWS Marketplace using Terraform or another automation tool.

Latest Reply
APJESK
New Contributor III
  • 2 kudos

For now, we got clarification that the Databricks subscription on AWS Marketplace can't be automated. I will be in touch; more questions are on the way.

2 More Replies
Venugopal
by New Contributor III
  • 4353 Views
  • 3 replies
  • 0 kudos

How to upload a file to a Unity Catalog volume using Databricks Asset Bundles

Hi, I am working on a CI/CD blueprint for developers, using which developers can create their bundle for jobs/workflows and then create a volume to which they will upload a wheel file or a JAR file which will be used as a dependency in their noteboo...

Latest Reply
chanukya-pekala
Contributor III
  • 0 kudos

With this setup, users who are entitled to access the catalog will have access to use the volume, if permissions are set this way. And users will be able to utilize the notebook, and we need to provide documentation either to clone the noteboo...

2 More Replies
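The flow discussed above (deploy a bundle, then push a wheel into a volume it created) can be sketched with the databricks-sdk Files API. Catalog, schema, volume, and file names are hypothetical, and the upload itself is kept in a separate function so the path convention can be read on its own.

```python
# Unity Catalog volumes are addressed with /Volumes/<catalog>/<schema>/<volume>/
# paths; the wheel uploaded there can then be referenced as a job dependency.

def volume_file_path(catalog: str, schema: str, volume: str, filename: str) -> str:
    """Build the POSIX-style path for a file inside a Unity Catalog volume."""
    return f"/Volumes/{catalog}/{schema}/{volume}/{filename}"

def upload_wheel(local_path: str, dest: str) -> None:
    """Upload a local wheel to a volume. Requires `pip install databricks-sdk`
    and workspace authentication configured (profile or environment vars)."""
    from databricks.sdk import WorkspaceClient
    w = WorkspaceClient()
    with open(local_path, "rb") as f:
        w.files.upload(dest, f, overwrite=True)

dest = volume_file_path("main", "cicd", "artifacts", "my_lib-0.1.0-py3-none-any.whl")
print(dest)
```

Newer CLI versions can also declare the wheel as a bundle artifact, but the SDK route above works from any CI runner.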
tana_sakakimiya
by Contributor
  • 1674 Views
  • 6 replies
  • 5 kudos

Resolved! Event-driven Architecture with Lake Monitoring without "Trigger on Arrival" on DABs

AWS Databricks. I want to create data quality monitoring and an event-driven architecture without a trigger on file arrival, running once at deploy instead. I plan to create a job which triggers once at deploy. The job runs these tasks sequentially: 1. run a script to create ex...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 5 kudos

@tana_sakakimiya ah, I think I see the difference. My screenshot says that "external tables" backed by Delta Lake will work. This means you'll need to have the table already created in Databricks from your external location, i.e. make an external ta...

5 More Replies
AmpolJon
by New Contributor II
  • 1603 Views
  • 6 replies
  • 3 kudos

Resolved! How to send parameters from an HTTP request to a running job notebook

I've tried to trigger a job run via an n8n workflow, which can make the notebook run properly. BUT another bullet to achieve is that I have to send some data to that job run as well. I googled it and can't find solutions anywhere. My setup wa...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 3 kudos

@AmpolJon I don't think you should give up on this method; the API allows you to pass job parameters, and you can retrieve them from within the Python notebook. Here's an example. 1. Call the API: https://docs.databricks.com/api/workspace/jobs...

5 More Replies
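The approach in the reply can be sketched as follows: an HTTP caller (such as an n8n HTTP Request node) POSTs to the Jobs run-now endpoint with job-level parameters, and the notebook reads them back by name. The job ID and parameter names here are made up.

```python
# Build the request body for POST /api/2.1/jobs/run-now. The caller sends it
# with a bearer token; each key in job_parameters becomes available inside the
# notebook task under the same name.
import json

def run_now_payload(job_id: int, params: dict) -> str:
    """JSON body that triggers a job run with job-level parameters."""
    return json.dumps({"job_id": job_id, "job_parameters": params})

body = run_now_payload(1234, {"order_id": "A-42", "mode": "incremental"})
print(body)

# Inside the notebook task, the same names come back via widgets:
#   order_id = dbutils.widgets.get("order_id")
```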
lmcconnell1665
by New Contributor
  • 673 Views
  • 1 reply
  • 1 kudos

AWS Serverless NCC

I have set up a new Databricks workspace in AWS for one of my customers, and they are using serverless compute. We are trying to obtain durable IP addresses that we can whitelist on a Redshift instance so that Databricks can run federated queries again...

Latest Reply
Khaja_Zaffer
Contributor III
  • 1 kudos

Dear @lmcconnell1665, greetings for the day! The serverless firewall feature (which enables the stable public IPs you're seeking via the NCC's default rules) is currently in public preview on Databricks AWS. This means it requires explicit enablement ...

Michael_Appiah
by Contributor II
  • 4984 Views
  • 7 replies
  • 9 kudos

Resolved! Enforcing Tags on SQL Warehouses

Is there a way to enforce tags on SQL Warehouses? Regular cluster policies do not apply to SQL Warehouses and budget policies do not cover SQL Warehouses either, which I find quite surprising given the fact that budget policies are documented as "Att...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 9 kudos

@Michael_Appiah Yes, the default tag policies don't apply to warehouses. The solution I can recommend is to assign a tags block if you are deploying the warehouse using Terraform, asset bundles, etc. The other solution that I use is I run a ...

6 More Replies
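The second suggestion in the reply (a scheduled audit job) can be sketched as a filter over the warehouse list. The dicts below mirror the shape of `GET /api/2.0/sql/warehouses` responses, but the sample data and required tag keys are made up; a real audit would fetch the list via the API and alert or stop non-compliant warehouses.

```python
# Flag SQL warehouses that are missing required custom tag keys.

REQUIRED_TAGS = {"cost_center", "team"}

def warehouses_missing_tags(warehouses: list) -> list:
    """Return the names of warehouses lacking any required tag key."""
    missing = []
    for wh in warehouses:
        custom = wh.get("tags", {}).get("custom_tags", [])
        keys = {t["key"] for t in custom}
        if not REQUIRED_TAGS <= keys:
            missing.append(wh["name"])
    return missing

sample = [
    {"name": "bi-warehouse",
     "tags": {"custom_tags": [{"key": "cost_center", "value": "42"}]}},
    {"name": "etl-warehouse",
     "tags": {"custom_tags": [{"key": "cost_center", "value": "42"},
                              {"key": "team", "value": "data"}]}},
]
print(warehouses_missing_tags(sample))  # → ['bi-warehouse']
```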
APJESK
by New Contributor III
  • 407 Views
  • 1 reply
  • 2 kudos

Resolved! Regarding - Serverless workspace deployment

When creating a Serverless workspace in Databricks, is there any option to have the workspace’s default (root) storage bucket created in our own AWS account instead of the Databricks-managed account? I know we can set up external locations for data, ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @APJESK, unfortunately this is not possible as of now.

smurug24
by New Contributor
  • 5325 Views
  • 2 replies
  • 0 kudos

Databricks - Cost difference between Job Clusters and DLT

Wanted to know about the cost comparison and certain specific feature details between job clusters and DLT. Per the pricing page (based on both the Azure pricing and the Databricks pricing pages), the following is the understanding. Region: US East. Provisioned: Jobs ...

Latest Reply
thomas-totter
New Contributor III
  • 0 kudos

@smurug24 wrote: "In DLT Provisioned, there will be two clusters - updates (for performing the actual data processing) and maintenance (for performing the maintenance operations)." So in the case of DLT serverless as well, will it be internally running two ...

1 More Replies