Administration & Architecture

Forum Posts

ossinova
by Contributor II
  • 400 Views
  • 2 replies
  • 0 kudos

Error: Node.js SQL Driver auth using M2M

I am trying to follow the documentation in order to establish an M2M authentication through the Node.js SQL Driver. However, I am having issues, as it results in the following error message. What is it that I am not seeing here? Error msg: {"level":"info"...

Latest Reply
ta2
New Contributor II
  • 0 kudos

Not sure if you're still stuck on this, but I had the same issue and managed to resolve it by adding azureTenantId: <my-tenant-id> and useDatabricksOAuthInAzure: true to the client connection options. Hope this helps!

1 More Replies
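The fix in the reply above can be sketched as a connection-options object for the Databricks Node.js SQL driver. This is a minimal sketch, assuming the @databricks/sql package's OAuth M2M support: the two Azure-specific options come straight from the reply, but the helper name, the databricks-oauth auth type, and the placeholder values are assumptions to verify against the driver's documentation.

```typescript
// Hypothetical helper that assembles M2M (OAuth client-credentials) connection
// options for the Databricks Node.js SQL driver, including the two extra
// options from the reply above. All values shown are placeholders.
interface AzureM2MOptions {
  host: string;                        // e.g. adb-1234567890123456.7.azuredatabricks.net
  path: string;                        // e.g. /sql/1.0/warehouses/<warehouse-id>
  authType: string;
  oauthClientId: string;
  oauthClientSecret: string;
  azureTenantId: string;               // from the reply above
  useDatabricksOAuthInAzure: boolean;  // from the reply above
}

function buildAzureM2MOptions(
  host: string,
  path: string,
  clientId: string,
  clientSecret: string,
  tenantId: string,
): AzureM2MOptions {
  return {
    host,
    path,
    authType: "databricks-oauth",
    oauthClientId: clientId,
    oauthClientSecret: clientSecret,
    azureTenantId: tenantId,
    useDatabricksOAuthInAzure: true,
  };
}
```

The resulting object would be passed to the driver's connect call (DBSQLClient.connect in @databricks/sql); only the two options named in the reply are confirmed by this thread.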
MarcoRezende
by New Contributor
  • 440 Views
  • 1 replies
  • 0 kudos

Is it possible to sync account groups/users to workspaces without doing it manually?

I am using Databricks SCIM for my Databricks account, so when I add a user or group in the SCIM connector, it is created in the Databricks account. After this, I need to manually assign the user/group to the workspaces. My boss wants to o...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @MarcoRezende, You can streamline user and group provisioning in Databricks using SCIM (System for Cross-domain Identity Management). Here’s how you can achieve this: Configure SCIM Provisioning: Databricks recommends setting up provisioning ...

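The manual assignment step can also be scripted against the account-level workspace assignment API. The sketch below only builds the request rather than sending it; the endpoint shape (PUT .../permissionassignments/principals/...) follows the Databricks Workspace Assignment API, but the exact path, payload, and the accounts.cloud.databricks.com host (AWS accounts) are assumptions to check against the current REST reference.

```typescript
// Sketch: construct the REST request that assigns an account-level principal
// (a SCIM-synced user or group) to a workspace, so the assignment doesn't
// have to be done by hand in the UI. Sending the request (with an account
// admin token) is deliberately left out; this only builds the URL and body.
function buildWorkspaceAssignment(
  accountId: string,
  workspaceId: number,
  principalId: number,              // numeric account-level ID of the user or group
  permissions: string[] = ["USER"], // or e.g. ["ADMIN"]
): { method: string; url: string; body: string } {
  const url =
    `https://accounts.cloud.databricks.com/api/2.0/accounts/${accountId}` +
    `/workspaces/${workspaceId}/permissionassignments/principals/${principalId}`;
  return { method: "PUT", url, body: JSON.stringify({ permissions }) };
}
```

Looping such a call over all workspaces after each SCIM sync would approximate the automatic behavior the poster is asking for.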
ac0
by New Contributor III
  • 199 Views
  • 1 replies
  • 0 kudos

Delta Live Table pipeline steps explanation

Does anyone have documentation on what is actually occurring in each of these steps? Creating update, Waiting for resources, Initializing, Setting up tables, Rendering graph. For example, what is the difference between initializing and setting up tables? I am ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @ac0, Initialization involves setting up the execution environment for your data processing tasks. This step includes: Cluster Initialization: Spinning up a compute cluster (if not already active) to execute your pipeline. Loading Dependencies: L...

Mukee
by New Contributor II
  • 858 Views
  • 2 replies
  • 0 kudos

How to get the workspace name from a workspace ID?

I have an AWS Managed Databricks instance. I am trying to get the workspace name from a workspace ID. Thank you very much for your time and assistance.

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Mukee, To retrieve the workspace name associated with a workspace ID in your AWS Managed Databricks instance, follow these steps: Login to your Databricks workspace. Look at the URL displayed in your browser’s address bar. Delete your workspace ...

1 More Replies
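Besides reading the URL, the account-level API can resolve a workspace ID to its name: GET /api/2.0/accounts/{account_id}/workspaces returns the account's workspaces with their IDs and names. A sketch of the lookup step over that response (the HTTP call itself is omitted, and the workspace_id/workspace_name field names are assumptions based on the Account API's response shape):

```typescript
// Sketch: given the JSON array returned by the account-level
// GET /api/2.0/accounts/{account_id}/workspaces endpoint, find the
// workspace name that matches a numeric workspace ID.
interface WorkspaceInfo {
  workspace_id: number;
  workspace_name: string;
}

function workspaceNameById(
  workspaces: WorkspaceInfo[],
  workspaceId: number,
): string | undefined {
  return workspaces.find((w) => w.workspace_id === workspaceId)?.workspace_name;
}
```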
gwilson
by New Contributor II
  • 841 Views
  • 3 replies
  • 0 kudos

Setup unity catalog external location to minio

We have a minio server running in Azure that we have connected to the spark clusters directly. As we move to unity catalog, we would like to make the data stored in our minio servers accessible as an external location in Azure Databricks account via ...

Latest Reply
174817
New Contributor II
  • 0 kudos

Hi @Kaniz, I have a server on Azure that supports the S3 protocol, and I am trying to follow these instructions in order to use Unity on Azure Databricks with it. I am not sure about this part of your reply: Set the Spark configuration values in the ...

2 More Replies
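For anyone hitting the same truncated step ("Set the Spark configuration values in the ..."): pointing Spark's S3A connector directly at an S3-compatible server like MinIO usually comes down to a handful of fs.s3a.* settings. The sketch below collects them as a map; the key names come from the Hadoop S3A connector, while the endpoint and credential values are placeholders. Whether a Unity Catalog external location can target a non-AWS S3 endpoint at all is a separate question that this thread leaves open.

```typescript
// Sketch: the Spark conf entries typically needed to point the Hadoop S3A
// connector at an S3-compatible endpoint such as MinIO. In Databricks these
// would go into the cluster's Spark config; values here are placeholders.
function minioSparkConf(
  endpoint: string,
  accessKey: string,
  secretKey: string,
): Record<string, string> {
  return {
    "spark.hadoop.fs.s3a.endpoint": endpoint, // e.g. https://minio.mycompany.net:9000
    "spark.hadoop.fs.s3a.access.key": accessKey,
    "spark.hadoop.fs.s3a.secret.key": secretKey,
    // MinIO is usually addressed bucket-in-path, not bucket-in-hostname:
    "spark.hadoop.fs.s3a.path.style.access": "true",
  };
}
```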
breaka
by New Contributor II
  • 715 Views
  • 4 replies
  • 2 kudos

Operations on Unity Catalog take too long

Hi! We are currently PoC-ing Databricks with Unity Catalog on AWS but it seems there are some issues. Creating a database in an existing (Unity) catalog takes over 10 minutes. Creating an external table on top of an existing delta table (CREATE TABLE m...

Latest Reply
breaka
New Contributor II
  • 2 kudos

PS: Apparently I'm not allowed to use the world H E A L T H (without spaces) in my reply (The message body contains H e a l t h, which is not permitted in this community. Please remove this content before sending your post.)

3 More Replies
gabriel_lazo
by New Contributor II
  • 640 Views
  • 3 replies
  • 0 kudos

How to configure AWS so that a Databricks workspace can only access the S3 access point using a VPC

My team requires a configuration so that a Databricks workspace can connect to an AWS S3 access point through a VPC, and so that other Databricks workspaces cannot access it if they are not within the route table. I have searched online, but I have only found ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

2 More Replies
zsucic1
by New Contributor III
  • 1307 Views
  • 6 replies
  • 4 kudos

Resolved! Current Azure Managed Identity capabilities 2024?

Hello everyone, I have a few questions about MI capabilities: Is it possible to define a managed identity for the Azure Databricks Service resource and use it for, e.g., writing to an Azure SQL Server database, or authenticating to Azure DevOps in order to downlo...

Latest Reply
zsucic1
New Contributor III
  • 4 kudos

Kaniz, thank you very much, you are the best! I will get to work implementing your advice

5 More Replies
Priyam1
by New Contributor III
  • 1764 Views
  • 2 replies
  • 0 kudos

Access Logs

How can I check the time when a particular AAD group was given access to a particular schema in a Unity Catalog? Is there any API I can call to get these logs?

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Priyam1, To track when a specific Azure Active Directory (AAD) group was granted access to a particular schema in a Unity Catalog, you have a few options: Unity Catalog Privileges and Access Control: Unity Catalog allows you to control access...

1 More Replies
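One concrete route for the timing question: when system tables are enabled, Unity Catalog permission changes are recorded in the system.access.audit table, which can be queried by event time. The sketch below only assembles such a query; the table itself is documented, but the action_name value ('updatePermissions'), the service_name filter, and the request_params field names are assumptions to verify against the audit log schema reference.

```typescript
// Sketch: build a SQL query against the system.access.audit system table to
// find when permissions on a given securable (e.g. a schema) were changed.
// Column and action names are assumptions; check the audit log schema docs.
function buildGrantAuditQuery(securableFullName: string): string {
  // Escape single quotes in the literal we embed in the query.
  const escaped = securableFullName.replace(/'/g, "''");
  return [
    "SELECT event_time, user_identity.email, action_name, request_params",
    "FROM system.access.audit",
    "WHERE service_name = 'unityCatalog'",
    "  AND action_name = 'updatePermissions'",
    `  AND request_params.securable_full_name = '${escaped}'`,
    "ORDER BY event_time DESC",
  ].join("\n");
}
```

The generated string would be run via a SQL warehouse (or the SQL Statement Execution API) by a user with access to the system catalog.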
Avvar2022
by New Contributor III
  • 397 Views
  • 1 replies
  • 0 kudos

Is there a setting which restricts users from Creating Job and Pipeline?

As far as I know, currently (as of 03-25-2024) Databricks doesn't have any workspace admin settings option to restrict users from creating a workflow/job or Delta pipelines. Here is the use case for it. Example: you have a 3-tier landscape: Dev, QA and Prod. It is ...

Administration & Architecture
administration
jobs
pipelines
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Avvar2022, As of March 25, 2024, you are correct that Databricks does not natively provide a direct workspace admin setting to restrict users from creating workflows, jobs, or Delta pipelines.

migq2
by New Contributor II
  • 732 Views
  • 5 replies
  • 0 kudos

Use Unity External Location with full paths in delta_log

I have an external delta table in unity catalog (let's call it mycatalog.myschema.mytable) that only consists of a `_delta_log` directory that I create semi-manually, with the corresponding JSON files that define it. The JSON files point to parquet f...

Latest Reply
-werners-
Esteemed Contributor III
  • 0 kudos

I suggest you look at something other than UC for such cases. I also wonder if Delta Lake is the right format.

4 More Replies
avrm91
by New Contributor II
  • 455 Views
  • 2 replies
  • 1 kudos

GCP - (DWH) Cluster Start-up Delayed - Failing to start

I am facing the issue that my fresh new Databricks workspace cannot start any cluster. "Cluster Start-up Delayed. Please wait while we continue to try and start the cluster. No action is required from you." After 1830 seconds (30.5 minutes) the w...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @avrm91, Verify that your project has sufficient CPU quota in the Google Cloud Platform (GCP) project associated with your Databricks workspace. If the quota is exceeded, it can prevent cluster nodes from launching. You can check your GCP quotas i...

1 More Replies
rmubeenhsal
by New Contributor II
  • 477 Views
  • 2 replies
  • 0 kudos

Authorization failure on fs ls on mount point files

One of our users has, as of last week, started seeing an authorization failure when he tries to list the files in the Azure storage account using the Databricks CLI or the Databricks API (using Python). He can list files on the Databricks portal or through the ...

Latest Reply
Walter_C
Valued Contributor II
  • 0 kudos

Have you checked the list of allowed IP addresses that are set for the storage account in Azure? Is the user on a VPN or the internal network? We might need to confirm that the network from which the user is trying to list files is allowed.

1 More Replies
curiousoctopus
by New Contributor II
  • 368 Views
  • 1 replies
  • 0 kudos

User not authorised to copy files to dbfs

Hi, I'm trying to use a service principal to copy files to DBFS using the command line "databricks fs cp <source> <target>" but get back "User not authorised". I configured the authentication with a PAT token and it is successful, as I can deploy and lau...

Latest Reply
Walter_C
Valued Contributor II
  • 0 kudos

In Databricks, data access permissions are often managed separately from workspace permissions. For DBFS, access control is typically managed through the underlying cloud storage (Azure Blob Storage, S3, etc.). The service principal needs to have the...

mchirouze
by New Contributor
  • 508 Views
  • 1 replies
  • 0 kudos

Send formatted html email from email distribution address

Hi, I have created an email distribution list "#MyList@mycompany.com". In the RShiny world I was able to send emails by a) getting the IP of the server I was sending the emails from and b) whitelisting that IP address within my company's SMTP Relay r...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @mchirouze, To set up email services in Databricks, you have a few options depending on your requirements. Let’s explore them: Workspace Email Settings: As a workspace admin user, you can configure when users receive emails for certain events ...
