Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

AmpolJon
by New Contributor II
  • 1626 Views
  • 6 replies
  • 3 kudos

Resolved! How to send parameters from an HTTP request to a notebook running in a job

I've tried to trigger a job run via an n8n workflow, which can command the notebook to run properly. BUT another goal to achieve is that I have to send some data to that job to be run as well; I googled it and can't find solutions anywhere. My setup wa...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 3 kudos

@AmpolJon I don't think you should be giving up on the method; the API allows you to pass job parameters to it, and you can retrieve them from within the Python notebook. Here's an example: 1. Call the API, https://docs.databricks.com/api/workspace/jobs...

5 More Replies
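The approach in the accepted reply can be sketched in Python. This is a minimal, hedged example: the job ID, host, token, and parameter names below are hypothetical placeholders, and the payload shape follows the documented run-now endpoint (POST /api/2.2/jobs/run-now), which accepts a `job_parameters` map that the notebook can then read back via widgets.

```python
import json

# Sketch: trigger a Databricks job run with parameters via the Jobs API.
# Job ID and parameter names are hypothetical examples.
def build_run_now_payload(job_id, job_parameters):
    """Build the JSON body for POST /api/2.2/jobs/run-now."""
    return {"job_id": job_id, "job_parameters": job_parameters}

payload = build_run_now_payload(1234, {"customer_id": "42", "mode": "full"})
body = json.dumps(payload)

# The actual request would look roughly like (requires a valid token):
# requests.post(f"{host}/api/2.2/jobs/run-now",
#               headers={"Authorization": f"Bearer {token}"},
#               data=body)

# Inside the notebook, each job parameter is then read back with, e.g.:
# customer_id = dbutils.widgets.get("customer_id")
print(body)
```

An external tool such as n8n only needs to issue that one POST; the notebook picks the values up by parameter name.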
lmcconnell1665
by New Contributor
  • 687 Views
  • 1 reply
  • 1 kudos

AWS Serverless NCC

I have set up a new Databricks workspace in AWS for one of my customers and they are using serverless compute. We are trying to obtain durable IP addresses that we can whitelist on a Redshift instance so that Databricks can run federated queries again...

Latest Reply
Khaja_Zaffer
Contributor III
  • 1 kudos

Dear @lmcconnell1665, greetings for the day! The serverless firewall feature (which enables the stable public IPs you're seeking via the NCC's default rules) is currently in public preview on Databricks on AWS. This means it requires explicit enablement ...

Michael_Appiah
by Contributor II
  • 5045 Views
  • 7 replies
  • 9 kudos

Resolved! Enforcing Tags on SQL Warehouses

Is there a way to enforce tags on SQL Warehouses? Regular cluster policies do not apply to SQL Warehouses and budget policies do not cover SQL Warehouses either, which I find quite surprising given the fact that budget policies are documented as "Att...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 9 kudos

@Michael_Appiah Yes, the default tag policies don't apply to warehouses. The solution I can recommend is to assign a tags block if you are deploying the warehouse using Terraform, asset bundles, etc. The other solution I use is to run a ...

6 More Replies
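Since cluster policies don't cover SQL warehouses, the workaround hinted at in the reply (a periodic audit job) could be sketched like this. The required tag keys and the sample payload are assumptions for illustration; the `tags.custom_tags` shape follows the SQL Warehouses list endpoint (GET /api/2.0/sql/warehouses).

```python
# Hypothetical audit sketch: fetch warehouses from the API and flag any that
# are missing required tags. REQUIRED_TAGS is an example policy, not a real one.
REQUIRED_TAGS = {"cost_center", "owner"}

def missing_tags(warehouse):
    """Return the set of required tag keys absent from a warehouse's custom_tags."""
    tags = warehouse.get("tags", {}).get("custom_tags", [])
    present = {t["key"] for t in tags}
    return REQUIRED_TAGS - present

# In practice the list would come from requests.get(f"{host}/api/2.0/sql/warehouses");
# here we check a sample dict shaped like the API's output.
sample = {"name": "bi_warehouse",
          "tags": {"custom_tags": [{"key": "owner", "value": "data-team"}]}}
print(missing_tags(sample))  # the required 'cost_center' tag is absent
```

A scheduled job running this check can alert on (or even stop) non-compliant warehouses until tag policies cover warehouses natively.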
APJESK
by New Contributor III
  • 408 Views
  • 1 reply
  • 2 kudos

Resolved! Regarding - Serverless workspace deployment

When creating a Serverless workspace in Databricks, is there any option to have the workspace’s default (root) storage bucket created in our own AWS account instead of the Databricks-managed account? I know we can set up external locations for data, ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @APJESK, unfortunately this is not possible as of now.

smurug24
by New Contributor
  • 5361 Views
  • 2 replies
  • 0 kudos

Databricks - Cost difference between Job Clusters and DLT

Wanted to know about the cost comparison and certain specific feature details between job clusters and DLT. Per the pricing page (based on both the Azure and Databricks pricing pages), the following is the understanding - Region: US East; Provisioned; Jobs ...

Latest Reply
thomas-totter
New Contributor III
  • 0 kudos

@smurug24 wrote: In DLT Provisioned, there will be two clusters - updates (for performing the actual data processing) and maintenance (for performing the maintenance operations). So in the case of DLT serverless as well, will it be internally running two ...

1 More Replies
mydefaultlogin
by New Contributor II
  • 253 Views
  • 1 reply
  • 0 kudos

Way to set 'source' format as default for everyone on the Workspace level (instead of ipynb)

Hi, I don't want any ipynb notebooks in the project I manage. I don't want any 'output' committed into our repo. I don't like running diff tools on ipynb notebooks, etc. I want every user in my workspaces to use simple .py files. They are easier to ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @mydefaultlogin, I don't think such an option exists. It's a per-user setting and currently you can't enforce it.

alesventus
by Contributor
  • 2461 Views
  • 3 replies
  • 1 kudos

Move metastore to another azure subscription

Hi, We need to migrate our metastore with Unity Catalog to a new Azure subscription while remaining in the same Azure region. Currently, we have two workspaces attached to a single Unity Catalog. I’m looking for the best approach to move the metastor...

Latest Reply
Nivethan_Venkat
Contributor III
  • 1 kudos

Hi @alesventus, there are a few points to be considered before migrating from one metastore to another. We need to see how the catalogs, schemas, and tables are created as of now. If you have created everything as managed, like managed catalogs, schemas, and...

2 More Replies
IUC08
by New Contributor III
  • 643 Views
  • 4 replies
  • 1 kudos

Resolved! REST API for swapping cluster

Hi Team, I am trying to find the REST API reference for swapping a cluster but am unable to find it in the documentation. Can anyone please tell me what the REST API reference is for swapping an existing cluster to another existing cluster, if present? If no...

Latest Reply
IUC08
New Contributor III
  • 1 kudos

Hi @szymon_dybczak, it helps! I am able to change the clusters. Thanks a lot!

3 More Replies
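There is no dedicated "swap cluster" endpoint in the Jobs API; one plausible way to re-point a job's task at a different existing cluster is a partial update via POST /api/2.2/jobs/update with a `new_settings` block. This is a hedged sketch: the job ID, task key, and cluster IDs are hypothetical placeholders.

```python
import json

# Sketch: change which existing all-purpose cluster a job task runs on,
# using the Jobs update endpoint with a partial new_settings payload.
def build_swap_payload(job_id, task_key, new_cluster_id):
    """Build the body for POST /api/2.2/jobs/update to re-point one task."""
    return {
        "job_id": job_id,
        "new_settings": {
            "tasks": [{"task_key": task_key,
                       "existing_cluster_id": new_cluster_id}]
        },
    }

payload = build_swap_payload(1234, "main_task", "0901-abcd-efgh5678")
print(json.dumps(payload))
# The request itself would be e.g.:
# requests.post(f"{host}/api/2.2/jobs/update",
#               headers={"Authorization": f"Bearer {token}"},
#               json=payload)
```

Because `update` merges rather than replaces settings, only the cluster reference changes; the rest of the job definition is left untouched.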
gs2
by New Contributor II
  • 2693 Views
  • 9 replies
  • 6 kudos

Issue accessing databricks secrets from ADF

Hello - seeing an issue where a notebook triggered from ADF is not able to access secret scopes, which was working earlier. Here are the steps I did: 1. Provide the ADF contributor role permission in the Databricks workspace - we tested this and were able to tri...

Latest Reply
Isi
Honored Contributor III
  • 6 kudos

Hey @gs2 @IkuyoshiKuroda, I have reviewed the documentation. According to the Databricks documentation on secret scopes, there are two types: Databricks-backed scopes → secrets are stored inside Databricks. These do not support authentication via Azure...

8 More Replies
spoltier
by New Contributor III
  • 988 Views
  • 2 replies
  • 0 kudos

Resolved! Using pip cache for pypi compute libraries

I am able to configure pip's behavior w.r.t. the index URL by setting PIP_INDEX_URL, PIP_TRUSTED_HOST, etc. I would like to cache compute-wide PyPI libraries to improve cluster startup performance and reliability. However, I notice that PIP_CACHE_DIR has no...

Latest Reply
spoltier
New Contributor III
  • 0 kudos

Hi Isi, we moved away from Docker images for the reasons you mention, and because they otherwise had issues for us. We are already using Artifactory (as hinted at by the environment variables mentioned in my post). I wanted to try further improving the s...

1 More Replies
faisal_da
by New Contributor II
  • 2276 Views
  • 7 replies
  • 2 kudos

Unable to see Manage Account option in the Databricks Workspace

Hi, I have an organizational account that is the owner of the Databricks workspace (premium) and is also the global administrator. Still, I don't see the "Account Console" option in Databricks after clicking the "manage account" option. I have tried to cl...

Latest Reply
naikashok
New Contributor II
  • 2 kudos

Hi @szymon_dybczak, yes, I have a Databricks premium account and admin as well.

6 More Replies
deepud26
by New Contributor II
  • 645 Views
  • 2 replies
  • 2 kudos

Connectivity from Databricks serverless to an API Gateway DNS name in AWS is failing

Hello, I'm trying to establish a connection from the Databricks serverless account, which is in "Account A", to the API Gateway DNS name, which is in "Account B", and it is failing. I have created an API Gateway with a custom domain, for example "api-dev.amazon.com", and...

Latest Reply
deepud26
New Contributor II
  • 2 kudos

Hello @ilir_nuredini, I'm trying the below way:

import requests
import json

API_URL = 'https://api-dev.amazon.com/v1/quality'
headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json"
}
response = requests.get(API_URL, heade...

1 More Replies
Lovebo
by New Contributor II
  • 555 Views
  • 1 reply
  • 2 kudos

Error creating Databricks workspace on AWS (CREATE_FAILED with JSONDecodeError)

Hi everyone, I'm trying to create a Databricks workspace on AWS by following the official quick start guide: https://docs.databricks.com/aws/en/admin/workspace/quick-start However, my CloudFormation stack fails with the following error: The resource cr...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor III
  • 2 kudos

Hi @Lovebo, I'm not too familiar with AWS but happy to have a go at helping. Let's see if we can rule out some of the more obvious things which are easy to miss in the documentation. First of all, are you just following the guide through: https://doc...

noorbasha534
by Valued Contributor II
  • 500 Views
  • 2 replies
  • 0 kudos

Configure job access_control_list block from a single place outside of job definition

Hi all, is it possible to configure the access_control_list block (which basically contains permissions) from a single place outside of the Databricks job definition? This is changing as we are redefining our permissions model, each time resulting in...

Latest Reply
Pilsner
Valued Contributor III
  • 0 kudos

Hello @noorbasha534, if you are trying to alter the permissions across many jobs at scale, I believe there are a couple of options to help speed up the process. Firstly, creating groups of users should help, as you can then easily change an entire gro...

1 More Replies
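The "single place" idea from the reply can be sketched as one shared ACL definition applied to every job through the Permissions API (PUT /api/2.0/permissions/jobs/{job_id}). Group names and job IDs below are hypothetical; combining this with groups means future permission changes happen in group membership, not in each job.

```python
# Sketch: centralize job permissions in one ACL constant and generate one
# Permissions API request per job. Groups and job IDs are made-up examples.
SHARED_ACL = [
    {"group_name": "data-engineers", "permission_level": "CAN_MANAGE_RUN"},
    {"group_name": "analysts", "permission_level": "CAN_VIEW"},
]

def permission_requests(job_ids):
    """Yield (url_path, body) pairs to apply the shared ACL to each job."""
    for job_id in job_ids:
        yield (f"/api/2.0/permissions/jobs/{job_id}",
               {"access_control_list": SHARED_ACL})

reqs = list(permission_requests([101, 102]))
# Each pair would then be sent with e.g.
# requests.put(f"{host}{path}", headers=auth_headers, json=body)
print(len(reqs))
```

When the permissions model changes, only SHARED_ACL (or the group membership) is edited, and a single loop reapplies it across all jobs.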
sunykim
by New Contributor II
  • 588 Views
  • 3 replies
  • 0 kudos

databricks terraform provider, databricks_credential resource, service

I cannot make the databricks_credential resource create a service credential. It works fine with storage credentials. However, when I put `purpose = "SERVICE"` plus aws_iam_role and a comment, in the apply phase it fails with `Error: cannot create cred...

Latest Reply
sunykim
New Contributor II
  • 0 kudos

I have the same error message now when trying to create a USE_SCHEMA grant for a service principal as in https://registry.terraform.io/providers/databricks/databricks/latest/docs/resources/grant#schema-grants . I create a new service principal and th...

2 More Replies