Administration & Architecture
Explore discussions on Databricks administration, deployment strategies, and architectural best practices. Connect with administrators and architects to optimize your Databricks environment for performance, scalability, and security.

Forum Posts

Dicer
by Valued Contributor
  • 4465 Views
  • 1 reply
  • 1 kudos

Resolved! Import a folder (no .whl or .jar files) and run `python3 setup.py bdist_wheel` to install the library

I want to import the ibapi Python module in an Azure Databricks notebook. Before this, I downloaded the TWS API folder from https://interactivebrokers.github.io/# I need to go through the following steps to install the API: Download and install TWS Ga...

Latest Reply
arpit
Databricks Employee
  • 1 kudos

You can try uploading the folder to the workspace location, cd into the desired folder, and install it via notebook. But it would be a notebook-scoped installation. If you are looking for a cluster-scoped installation then you would need...
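A runnable sketch of that suggestion, using a throwaway package in place of the real TWS API source folder (the `/Workspace` path mentioned in the comment and the package name here are placeholders, not the real ibapi layout):

```python
import pathlib
import subprocess
import sys
import tempfile

# Stand-in for the uploaded source folder, e.g. something like
# /Workspace/Users/<you>/twsapi/source/pythonclient on a real workspace.
src = pathlib.Path(tempfile.mkdtemp()) / "pythonclient"
src.mkdir()
(src / "setup.py").write_text(
    "from setuptools import setup\n"
    "setup(name='demo_ibapi', version='0.1', py_modules=[])\n"
)

# Equivalent of `%pip install <folder>` in a notebook cell: pip builds the
# wheel from setup.py and installs it in one step (notebook-scoped on
# Databricks). --target keeps this sketch out of the real site-packages.
target = src.parent / "site"
subprocess.run(
    [sys.executable, "-m", "pip", "install",
     "--no-build-isolation", "--quiet",
     "--target", str(target), str(src)],
    check=True,
)
installed = [p.name for p in target.iterdir()]
```

In a notebook, `%pip install /Workspace/<path-to-folder>` does the build-plus-install in one line, so running `setup.py bdist_wheel` separately is usually unnecessary.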

Datastoryteller
by New Contributor II
  • 2358 Views
  • 3 replies
  • 0 kudos

Permission denied using patchelf

Hello all, I would like to run a Python script on a Shared cluster. Under the hood, the script calls the `patchelf` utility to set an rpath, along these lines: execute('patchelf', ['--set-rpath', rpath, lib]) The p...
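A defensive version of that call might look like the sketch below; `set_rpath` is a hypothetical helper, and whether the failure comes from the library or from the `patchelf` binary itself depends on how the cluster's filesystems are mounted:

```python
import shutil
import subprocess

def set_rpath(lib: str, rpath: str) -> bool:
    """Set an RPATH on `lib` via patchelf; return False when that fails.

    A 'Permission denied' can mean the patchelf binary sits on a filesystem
    mounted without execute permission, not only that `lib` is read-only --
    worth checking both on a shared cluster.
    """
    patchelf = shutil.which("patchelf")  # None if the tool is not on PATH
    if patchelf is None:
        return False
    try:
        subprocess.run(
            [patchelf, "--set-rpath", rpath, lib],
            check=True, capture_output=True,
        )
        return True
    except (PermissionError, OSError, subprocess.CalledProcessError):
        return False
```

Calling `set_rpath("/no/such/lib.so", "$ORIGIN")` returns False whether `patchelf` is missing or present, so the helper degrades cleanly instead of crashing the script.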

Latest Reply
Datastoryteller
New Contributor II
  • 0 kudos

I basically followed this tutorial https://learn.microsoft.com/en-gb/azure/databricks/data-governance/unity-catalog/get-started and it seemed to be working, so I guess Unity Catalog is enabled?

2 More Replies
erigaud
by Honored Contributor
  • 5888 Views
  • 2 replies
  • 0 kudos

Resolved! Call an Azure Function App with Access Restrictions from a Databricks Workspace

Hello, as the title says, I am trying to call a function from an Azure Function App configured with access restrictions, from a Python notebook in my Databricks workspace. The Function App resource is in a different subscription than the Databricks work...

Latest Reply
erigaud
Honored Contributor
  • 0 kudos

Update: the problem was fixed! The key was to set a VNet rule in the access restrictions, giving access directly to the subnets used by Databricks. It seems that for Microsoft-to-Microsoft connections the IP addresses are not used, so adding the IP ran...

1 More Replies
alm
by New Contributor III
  • 2555 Views
  • 0 replies
  • 0 kudos

Azure DevOps repos access

I have a Databricks setup where the users and their permissions are handled in Microsoft Azure using AD groups and then provisioned (account-level) using a provisioning connector to Databricks. The code repositories are in Azure DevOps, where users a...

sushant047_ms
by New Contributor III
  • 7022 Views
  • 3 replies
  • 2 kudos

How to bind a user-assigned managed identity to Databricks to access external resources?

Is there a way to bind a user-assigned managed identity to Databricks? We want to access some SQL DBs and Redis cache from our Spark code running on Databricks using a managed identity instead of service principals and basic authentication. As of today, Da...

Administration & Architecture
azure
managed-identity
user-assigned-managed-identity
Latest Reply
sushant047_ms
New Contributor III
  • 2 kudos

@Carpender Correcting my comment above: the Databricks-assigned managed identity is working and we are able to access the resources, but as stated in the original question we are looking for authorization using a user-assigned managed identity (UAMI). With UAMI we cannot...

2 More Replies
DavidZS
by New Contributor
  • 5777 Views
  • 1 reply
  • 0 kudos

How to set up a service principal to assign account-level groups to workspaces using Terraform

Based on best practices, we have set up SCIM provisioning using Microsoft Entra ID to synchronize Entra ID groups to our Databricks account. All workspaces have identity federation enabled. However, how should workspace administrators assign account-l...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Have you tried giving the Manager role on the group to the service principal that is workspace admin? Once you do this, you may be able to, in a workspace context, add the account-level group to a workspace via databricks_permission_assig...
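In Terraform, that suggestion might look roughly like the fragment below. It assumes the Databricks provider's `databricks_permission_assignment` resource and a variable holding the group's account-level ID; this is an illustrative sketch, not a verified end-to-end configuration.

```hcl
# Run as the workspace-admin service principal, which must also hold the
# Manager role on the account-level group (granted in the account console).
resource "databricks_permission_assignment" "add_group" {
  # Numeric account-level group ID (placeholder variable).
  principal_id = var.account_group_id
  permissions  = ["USER"]
}
```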

mroy
by Contributor
  • 2074 Views
  • 1 reply
  • 0 kudos

Resolved! Instances are not being terminated in time (extra AWS costs)

For a few days we have been trying to figure out why our AWS costs suddenly went up around March 20th, and we just found the answer: the EC2 instances are left in an unterminated state for a couple of minutes at the end of each run! This is a very se...

Latest Reply
mroy
Contributor
  • 0 kudos

Never mind, this was actually due to a reservation that expired.

ossinova
by Contributor II
  • 2825 Views
  • 2 replies
  • 1 kudos

Error: Node.js SQL Driver auth using M2M

I am trying to follow the documentation in order to establish M2M authentication through the Node.js SQL Driver. However, I am having issues, as it results in the following error message. What is it that I am not seeing here? Error msg: {"level":"info"...

Latest Reply
ta2
New Contributor II
  • 1 kudos

Not sure if you're still stuck on this, but I had the same issue and managed to resolve it by adding azureTenantId: <my-tenant-id> and useDatabricksOAuthInAzure: true to the client connection options. Hope this helps!
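In the driver's connection options that placement looks roughly like this; values in angle brackets are placeholders, and option names other than the two from the reply are assumptions based on the driver's OAuth M2M documentation:

```javascript
// Connection options for @databricks/sql with Azure service-principal M2M auth.
const connectionOptions = {
  host: '<workspace-host>.azuredatabricks.net',
  path: '/sql/1.0/warehouses/<warehouse-id>',
  authType: 'databricks-oauth',
  oauthClientId: '<service-principal-application-id>',
  oauthClientSecret: '<client-secret>',
  // The two options that resolved the error in this thread:
  azureTenantId: '<azure-tenant-id>',
  useDatabricksOAuthInAzure: true,
};

module.exports = connectionOptions;
```

The object is then passed to `client.connect(connectionOptions)` on a `DBSQLClient` instance.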

1 More Replies
gwilson
by New Contributor II
  • 4735 Views
  • 2 replies
  • 0 kudos

Set up a Unity Catalog external location to MinIO

We have a MinIO server running in Azure that we have connected to the Spark clusters directly. As we move to Unity Catalog, we would like to make the data stored in our MinIO servers accessible as an external location in the Azure Databricks account via ...

Latest Reply
174817
New Contributor III
  • 0 kudos

Hi @Retired_mod, I have a server on Azure that supports the S3 protocol, and I am trying to follow these instructions in order to use Unity Catalog on Azure Databricks with it. I am not sure about this part of your reply: Set the Spark configuration values i...
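For the direct (non-Unity-Catalog) Spark access the thread started from, pointing the s3a connector at an S3-compatible endpoint is typically a cluster Spark-config fragment like the one below. The endpoint and secret-scope names are placeholders, and Unity Catalog external locations themselves do not accept these settings:

```
spark.hadoop.fs.s3a.endpoint https://minio.example.com:9000
spark.hadoop.fs.s3a.path.style.access true
spark.hadoop.fs.s3a.access.key {{secrets/minio/access_key}}
spark.hadoop.fs.s3a.secret.key {{secrets/minio/secret_key}}
```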

1 More Replies
breaka
by New Contributor III
  • 6947 Views
  • 3 replies
  • 3 kudos

Operations on Unity Catalog take too long

Hi! We are currently PoC-ing Databricks with Unity Catalog on AWS, but it seems there are some issues. Creating a database in an existing (Unity) catalog takes over 10 minutes. Creating an external table on top of an existing Delta table (CREATE TABLE m...

Latest Reply
breaka
New Contributor III
  • 3 kudos

PS: Apparently I'm not allowed to use the word H E A L T H (without spaces) in my reply (The message body contains H e a l t h, which is not permitted in this community. Please remove this content before sending your post.)

2 More Replies
gabriel_lazo
by New Contributor II
  • 2912 Views
  • 1 reply
  • 0 kudos

How to configure AWS so that a Databricks workspace can only access an S3 access point through a VPC

My team requires a configuration so that a Databricks workspace can connect to an AWS S3 access point through a VPC, and other Databricks workspaces cannot access it if they are not within the route table. I have searched online, but I have only found ...
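One common AWS pattern for this (an assumption on my part, not a confirmed answer from the thread) is an access point policy that denies every request not arriving through a specific VPC endpoint; S3 access points can also be created with a VpcConfiguration so they only accept traffic from one VPC at all. Account ID, region, and endpoint ID below are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyOutsideVpcEndpoint",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": "arn:aws:s3:us-east-1:111122223333:accesspoint/my-access-point/*",
      "Condition": {
        "StringNotEquals": { "aws:SourceVpce": "vpce-0123456789abcdef0" }
      }
    }
  ]
}
```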

Priyam1
by New Contributor III
  • 3863 Views
  • 1 reply
  • 0 kudos

Access Logs

How can I check when a particular AAD group was given access to a particular schema in Unity Catalog? Is there any API I can call to get these logs?
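If system tables are enabled on the account, permission changes are recorded in the audit log and can be queried with SQL. A sketch, with the column and action names taken from the system-tables documentation and worth verifying against your workspace (`<catalog>.<schema>` is a placeholder):

```sql
-- Who changed permissions on a given schema, and when.
SELECT event_time, user_identity.email, request_params
FROM system.access.audit
WHERE service_name = 'unityCatalog'
  AND action_name = 'updatePermissions'
  AND request_params.securable_type = 'schema'
  AND request_params.securable_full_name = '<catalog>.<schema>'
ORDER BY event_time DESC;
```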

migq2
by New Contributor III
  • 4691 Views
  • 4 replies
  • 0 kudos

Use Unity External Location with full paths in delta_log

I have an external Delta table in Unity Catalog (let's call it mycatalog.myschema.mytable) that consists only of a `_delta_log` directory that I create semi-manually, with the corresponding JSON files that define it. The JSON files point to parquet f...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I suggest you look at something other than UC for such cases. I also wonder if Delta Lake is the right format.

3 More Replies