Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

asharkman
by New Contributor II
  • 43 Views
  • 2 replies
  • 0 kudos

Reporting serverless costs to Azure costs

So, we've just recently applied serverless budget policies to some of our vector searches and apps. At the moment they're all going to Azure under one general tag that we created. However, we needed more definition. So I added the serverless budget pol...

Latest Reply
SP_6721
Contributor III
  • 0 kudos

Hi @asharkman, tags from serverless budget policies are included in Azure cost exports. Try checking in the Azure Portal under Cost Analysis: add Tag as a filter and enter your custom tag key. Also, it can take a few hours for the tags to appear in Azu...
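Once the export lands, the tag filter can also be checked offline. A minimal sketch, assuming a cost-export CSV with a `Tags` column holding `"key": "value"` pairs — the column names here are an assumption about the export schema, not a documented contract:

```python
import csv
import io
import json

def rows_with_tag(export_csv: str, tag_key: str):
    """Filter cost-export rows whose Tags column contains tag_key."""
    out = []
    for row in csv.DictReader(io.StringIO(export_csv)):
        raw = (row.get("Tags") or "").strip()
        if not raw:
            continue
        try:
            # Tags are listed as "key": "value" pairs; wrap in braces
            # if needed so json.loads accepts them as an object.
            tags = json.loads(raw if raw.startswith("{") else "{" + raw + "}")
        except json.JSONDecodeError:
            continue
        if tag_key in tags:
            out.append(row)
    return out

# Build a small sample export in memory.
buf = io.StringIO()
w = csv.writer(buf)
w.writerow(["Date", "Cost", "Tags"])
w.writerow(["2024-06-01", "12.5", '"budget-policy": "vector-search"'])
w.writerow(["2024-06-01", "3.1", '"team": "platform"'])
print(len(rows_with_tag(buf.getvalue(), "budget-policy")))  # prints 1
```

A quick sanity check like this can confirm whether the custom tag key made it into the export at all before digging into Cost Analysis filters.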

1 More Replies
AlexMc
by New Contributor III
  • 175 Views
  • 3 replies
  • 1 kudos

Library installation failed for library due to user error for pypi

Hi! I get the below error when a cluster job starts up and tries to install a Python .whl file (which is hosted on an Azure Artefact feed, though this seems more like a problem of reading from disk/network storage). The failure is seemingly ...

Latest Reply
AlexMc
New Contributor III
  • 1 kudos

Thanks both - I think the problem is that this library installation is called when creating a new Job & Task via the REST endpoint, where the libraries are specified in the .json file. So, short version: I don't think I can 'get at' the pip install call...

2 More Replies
Dnirmania
by Contributor
  • 1926 Views
  • 1 reply
  • 0 kudos

Unable to destroy NCC private endpoint

Hi team, we accidentally removed one of the NCC private endpoints from our storage account that was created using Terraform. When I tried to destroy and recreate it, I encountered the following error. According to some articles, the private endpoint w...

Latest Reply
Vidhi_Khaitan
Databricks Employee
  • 0 kudos

Once a private endpoint rule is deactivated, it isn't immediately removed. Instead, it is scheduled for purging after a set time period. In your case, the rule is slated for purging at the timestamp mentioned. This situation can occur in scena...

philmac750
by New Contributor
  • 107 Views
  • 1 reply
  • 0 kudos

Resolved! Can I add another user to Free edition?

Is it possible to add another user to the Free edition? I want to test what they can see when they connect as a restricted user, i.e. granted Browse on only 1 catalog. Thanks

Latest Reply
philmac750
New Contributor
  • 0 kudos

Apologies. Yes, you can - I was in the wrong console.

prodrick
by New Contributor II
  • 545 Views
  • 2 replies
  • 0 kudos

Resolved! Webhook Authentication

If I want to send notifications via webhook to Splunk, Datadog, or LogicMonitor, how might I configure Databricks to authenticate using the destination platform's bearer token?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @prodrick, it looks like the webhook destination supports only basic authentication with a username and password, but you can try pasting the bearer token into the password section. Some webhook endpoints accept Bearer tokens in the password field while leavin...
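The workaround above relies on how HTTP Basic auth packs the credentials (RFC 7617). A small illustrative sketch of why a token pasted into the password field survives the round trip — whether the receiving endpoint actually honors it varies by platform:

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    """Build an HTTP Basic Authorization header (RFC 7617)."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

def password_from_basic(header: str) -> str:
    """What a receiving endpoint sees: decode the header and take
    everything after the first ':' as the password field."""
    decoded = base64.b64decode(header.split(" ", 1)[1]).decode()
    return decoded.split(":", 1)[1]

# A bearer token placed in the password slot comes out intact, so an
# endpoint that treats the password as a token can still authenticate it.
hdr = basic_auth_header("unused", "my-bearer-token")
print(password_from_basic(hdr))  # prints my-bearer-token
```

Since `:` delimits username and password, this only works cleanly for tokens that contain no colon.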

1 More Replies
MariusE
by New Contributor II
  • 487 Views
  • 3 replies
  • 2 kudos

Support for Unity Catalog External Data in Fabric OneLake

Hi community! We have set up a Fabric Link with our Dynamics, and want to attach the data in Unity Catalog using the External Data connector. But it doesn't look like Databricks supports anything other than the default dls endpoints against Azure. Is there any wa...

MariusE_0-1751441479398.png
Latest Reply
MariusE
New Contributor II
  • 2 kudos

Thanks @szymon_dybczak, do you know if support is on the roadmap? The currently supported way of doing this, credential passthrough on the compute, is deprecated. Regards, Marius

2 More Replies
noklamchan
by New Contributor II
  • 641 Views
  • 2 replies
  • 3 kudos

How to access a Unity Catalog volume inside a Databricks App?

I am more familiar with DBFS, which seems to have been replaced by Unity Catalog volumes now. When I create a Databricks App, it lets me add a resource to pick a UC volume. How do I actually access the volume inside the app? I cannot find any example; the a...

Latest Reply
noklamchan
New Contributor II
  • 3 kudos

As I mentioned, I am not reading a table, so Spark is not the right fit here (plus I don't want to include Spark as a dependency just to read a CSV). I also don't have dbutils. I found this works:
```
cfg = Config()  # This is available inside the app
w = Wor...
```
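The truncated snippet above appears to use the Databricks SDK. A minimal sketch of the same idea, assuming the `databricks-sdk` package and placeholder catalog/schema/volume/file names — volumes are addressed by `/Volumes/...` paths and read through the Files API:

```python
def volume_path(catalog: str, schema: str, volume: str, filename: str) -> str:
    """UC volumes are exposed under /Volumes/<catalog>/<schema>/<volume>/."""
    return f"/Volumes/{catalog}/{schema}/{volume}/{filename}"

def read_volume_file(path: str) -> bytes:
    # Inside a Databricks App the SDK picks up credentials from the
    # environment, so WorkspaceClient() needs no explicit arguments there.
    from databricks.sdk import WorkspaceClient
    w = WorkspaceClient()
    return w.files.download(path).contents.read()

if __name__ == "__main__":
    # Catalog/schema/volume/file names here are placeholders.
    p = volume_path("main", "default", "my_volume", "data.csv")
    print(p)
    # data = read_volume_file(p)  # requires a Databricks environment
```

The download call returns a response whose `contents` is a file-like stream, so large files can also be read in chunks instead of all at once.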

1 More Replies
chinmay0924
by New Contributor III
  • 566 Views
  • 1 reply
  • 0 kudos

Resolved! How to create a function using the functions API in Databricks?

https://docs.databricks.com/api/workspace/functions/create
This documentation gives the sample request payload, and one of the fields is type_json, but there is very little explanation of what is expected in this field. What am I supposed to pass here...

Latest Reply
SP_6721
Contributor III
  • 0 kudos

Hi @chinmay0924, the type_json field describes your function's input parameters and return type using a specific JSON format. You'll need to include each parameter's name, type (like "STRING", "INT", "ARRAY", or "STRUCT"), and position, along with th...
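A hedged sketch of what such a parameter entry could look like. The inner structure follows the Spark StructField-style JSON commonly seen for this field, but the exact schema should be checked against the linked docs; the function name is a placeholder:

```python
import json

# Spark StructField-style description of one parameter; treat these
# field names as an illustration, not the authoritative schema.
param = {"name": "x", "type": "string", "nullable": False, "metadata": {}}

body = {
    "name": "my_func",  # placeholder function name
    "input_params": {"parameters": [{
        "name": "x",
        "type_name": "STRING",
        "position": 0,
        "type_json": json.dumps(param),  # a serialized JSON string, not a nested object
    }]},
}
print(body["input_params"]["parameters"][0]["type_json"])
```

The key point the docs leave implicit: `type_json` is a string containing JSON, so it must be serialized with `json.dumps` rather than embedded as a nested object.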

PradeepPrabha
by New Contributor II
  • 1115 Views
  • 5 replies
  • 0 kudos

Any documentation mentioning connectivity from Azure SQL Database to Azure Databricks

Is any documentation available for connecting from an Azure SQL database to an Azure Databricks SQL workspace? We created a SQL warehouse personal access token for a user in a different team who can connect from his on-prem SQL DB to Databricks using the conn...

Latest Reply
BigRoux
Databricks Employee
  • 0 kudos

Here are some considerations:
  • SQL Trigger: Define a trigger in Azure SQL that activates on specific DML operations (e.g., INSERT, UPDATE).
  • External Call: The trigger can log events to an intermediate service (like a control table or Event Grid). ...
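For the external-call step, one option is for the intermediate service to invoke a Databricks job through the Jobs API 2.1 `run-now` endpoint. A sketch that only builds the request, with a placeholder workspace URL, token, and job ID:

```python
import json
import urllib.request

def run_now_request(host: str, token: str, job_id: int) -> urllib.request.Request:
    """Build a Jobs API 2.1 run-now request (not yet sent)."""
    return urllib.request.Request(
        url=f"{host}/api/2.1/jobs/run-now",
        data=json.dumps({"job_id": job_id}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Workspace URL, token, and job ID below are placeholders.
req = run_now_request("https://adb-123.azuredatabricks.net", "dapi-example", 42)
print(req.full_url)  # prints https://adb-123.azuredatabricks.net/api/2.1/jobs/run-now
# urllib.request.urlopen(req)  # would actually trigger the job
```

Sending the request is left commented out, since it requires a real workspace and a job configured to process the event.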

4 More Replies
Aminsn
by New Contributor III
  • 346 Views
  • 1 reply
  • 0 kudos

Is it possible to let multiple Apps share the same compute?

Apparently, for every app deployed on Databricks a separate VM is allocated, costing 0.5 DBU/hour. This seems inefficient: why can't a single VM support multiple apps? It feels like a waste of money and resources to allocate independent VMs per app ...

Latest Reply
Shua42
Databricks Employee
  • 0 kudos

Hi @Aminsn, your understanding is correct in that only one running app can be deployed per app instance. If you want to maximize utilization of the compute, one option could be to create a multi-page app, where the landing page directs users...

n-var
by New Contributor
  • 658 Views
  • 1 reply
  • 0 kudos

Unity Catalog: 403 Error When Connecting S3 via IAM Role and Storage Credential

Hi, we're currently setting up Databricks Unity Catalog on AWS. We created an S3 bucket and assigned an IAM role (databricks-storage-role) to give Databricks access. Note: Databricks doesn't use the IAM role directly. Instead, it requires a Storage Cre...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Have you followed any specific guide for creating it? Are you setting up a Unity Catalog metastore or the default storage for the workspace? For the metastore creation, have you followed the steps in https://docs.databricks.com/aws/en/data-governa...

oktarinet
by New Contributor II
  • 1748 Views
  • 2 replies
  • 0 kudos

Azure Databricks automatic user provisioning via Terraform

Hi community, Azure Databricks recently announced a new user management feature (now in public preview) called automatic-identity-management, which allows Azure Databricks to access Azure Entra ID directly and grant users and groups permissions and ...

Latest Reply
saurabh18cs
Honored Contributor
  • 0 kudos

Hi, I think the automatic identity management feature provisions Azure Entra ID users and groups directly into Databricks. However, Terraform's databricks_group and databricks_group_member resources are designed for managing groups and memberships w...

1 More Replies
Dharma25
by New Contributor II
  • 1594 Views
  • 1 reply
  • 0 kudos

Workflow not picking up correct host value (while working with MLflow model registry URI)

Exception: mlflow.exceptions.MlflowException: An API request to https://canada.cloud.databricks.com/api/2.0/mlflow/model-versions/list-artifacts failed due to a timeout. The error message was: HTTPSConnectionPool(host='canada.cloud.databricks.com', p...

Latest Reply
Advika
Databricks Employee
  • 0 kudos

Hello @Dharma25! It looks like this post duplicates one you shared earlier. A response has already been provided in the Original thread. I recommend continuing the discussion there to keep the conversation focused and organized.

Rvwijk
by New Contributor II
  • 1604 Views
  • 1 reply
  • 0 kudos

New default notebook format (IPYNB) causes unintended changes on release

Dear Databricks, we have noticed the following issue since the new default notebook format was set to IPYNB. When we release our code from (for example) DEV to TST using a release pipeline built in Azure DevOps, we see unintended changes popping ...

Latest Reply
Rvwijk
New Contributor II
  • 0 kudos

Seems like something went wrong with attaching the screenshot. So here we go.

chandru44
by New Contributor
  • 1941 Views
  • 0 replies
  • 0 kudos

Networking Challenges with Databricks Serverless Compute (Control Plane) When Connecting to On-Prem

Hi Databricks Community, I'm working through some networking challenges when connecting Databricks clusters to various data sources and wanted to get advice or best practices from others who may have faced similar issues. Current setup: I have four type...

Databricks Serverless Community Post.drawio (2).png