Administration & Architecture

Forum Posts

newuser12445
by New Contributor
  • 391 Views
  • 1 reply
  • 0 kudos

Unity Catalog Enabled Clusters using PrivateNIC

Hello, When reviewing the VM settings for Databricks worker VMs, we can see that there are two (2) NICs: a primary (PublicNIC (primary)) and a secondary (PrivateNIC (primary)). The worker VM is always assigned the PublicNIC and this is reachable from w...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @newuser12445, It seems like you’re dealing with some networking configuration issues related to Databricks worker VMs and their network interfaces. Let’s break down the situation: NICs (Network Interface Cards): You mentioned that each Databr...

adurand-accure
by New Contributor II
  • 163 Views
  • 1 reply
  • 0 kudos

SQL Warehouse tag list from system tables?

Hello, Is there a way to get the tags of SQL Warehouse clusters from system tables, like you can with system.compute.clusters? Thanks,

Latest Reply
adurand-accure
New Contributor II
  • 0 kudos

Answering my own question: system.billing.usage.custom_tags['cluster-owner']. @databricks: I don't really understand the logic here.
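
For illustration, a minimal sketch of pulling warehouse tags out of the billing usage system table mentioned above, run from a notebook. Access to the system catalog and the 'cluster-owner' tag key are assumptions taken from the reply:

# Hedged sketch: assumes system.billing.usage is enabled and that the
# warehouse's tags land in the custom_tags map; 'cluster-owner' is just
# the example key from the reply above.
tags_df = spark.sql("""
    SELECT DISTINCT
        usage_metadata.warehouse_id,
        custom_tags['cluster-owner'] AS cluster_owner
    FROM system.billing.usage
    WHERE usage_metadata.warehouse_id IS NOT NULL
""")
display(tags_df)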

Daniela_Boamba
by New Contributor III
  • 1273 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks SSO Azure AD

Hello, I'm trying to test SSO with Azure AD. The SSO test is passing on Databricks and I can connect to Databricks using SSO. When I try to test with Postman to obtain a token, I get the following error message: {"error_description":"OAuth application with ...

Administration & Architecture
AWS
Azure AD
Databricks
Latest Reply
Daniela_Boamba
New Contributor III
  • 0 kudos

Hello, The issue was with Postman. In Postman you don't have to give the client ID from your IdP but the client ID from Databricks "App connections". It is working well now. Thank you.
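
For anyone reproducing this outside Postman, here is a hedged sketch of requesting a token with a Databricks-issued OAuth client. The workspace host, client ID, and secret are placeholders, the client_credentials flow shown is the machine-to-machine variant, and the exact flow depends on how the App connection is configured:

# Hedged sketch of an OAuth token request against a workspace's token endpoint.
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
resp = requests.post(
    f"{host}/oidc/v1/token",
    # Client ID/secret issued by Databricks ("App connections" / service principal),
    # not the IdP's client ID -- placeholders below.
    auth=("databricks-oauth-client-id", "databricks-oauth-client-secret"),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
resp.raise_for_status()
print(resp.json()["access_token"][:20], "...")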

1 More Reply
pfpmeijers
by New Contributor II
  • 259 Views
  • 2 replies
  • 0 kudos

Databricks on-premises (GDCE)

Hello, Any plans for supporting Databricks on GDCE or other private cloud-native stacks/hardware on premises? Regards, Patrick

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @pfpmeijers, As of now, Databricks primarily operates as a unified, open analytics platform for constructing, deploying, sharing, and maintaining enterprise-grade data, analytics, and AI solutions at scale. It seamlessly integrates with cloud stor...

1 More Reply
corp
by New Contributor II
  • 213 Views
  • 2 replies
  • 1 kudos

Interconnected notebooks

How do you use the interconnected notebooks available in Databricks?

Latest Reply
mhiltner
New Contributor II
  • 1 kudos

Do you mean running one notebook from another and using variables and functions defined in the other one? If that's what you're seeking, try using the magic command %run + notebook path.  You can find some documentation about it here: https://docs.da...
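 
For illustration, a minimal sketch of the pattern; the notebook path and the names it defines are hypothetical:

# Cell 1 of the calling notebook -- %run must be the only content of its cell:
%run /Shared/utils/helpers

# Cell 2 -- whatever the helpers notebook defined is now in this notebook's scope:
df = clean_df(spark.table("main.default.raw_events"))  # clean_df() defined in helpers (hypothetical)
print(BASE_PATH)                                        # BASE_PATH defined in helpers (hypothetical)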

1 More Reply
Hubert-Dudek
by Esteemed Contributor III
  • 565 Views
  • 2 replies
  • 0 kudos

Asset Bundles -> creation of Azure DevOps pipeline

If you choose mlops-stacks in Asset Bundles, it will create many nice things for you out of the box, including a pipeline to deploy to dev/stage/prod. #databricks

cicd.png
Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Thank you for sharing this @Hubert-Dudek 

1 More Reply
clyormz
by New Contributor
  • 456 Views
  • 1 reply
  • 0 kudos

Databricks on Azure JDBC

Hello Databricks team, I have one question regarding Databricks on Azure configuration using JDBC ([Simba][SparkJDBCDriver](700100)). I am getting the error message below: java.sql.SQLException: [Simba][SparkJDBCDriver](700100) Connection timeout expired. De...

Latest Reply
jose_gonzalez
Moderator
  • 0 kudos

Check your network connection. Try "%sh nc -zv {hostname} {port}"
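
Concretely, a hedged example of that check from a notebook cell; the hostname is a placeholder and 443 is the usual HTTPS/JDBC port for Azure Databricks:

%sh nc -zv adb-1234567890123456.7.azuredatabricks.net 443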

ossinova
by Contributor II
  • 917 Views
  • 3 replies
  • 0 kudos

Override default Personal Compute policy using terraform / disable Personal Compute policy

I want to programmatically make some adjustments to the default Personal Compute resource, or preferably create my own custom one based on the same configuration or policy family (to which all users can gain access), when deploying a new workspace usi...

Latest Reply
feiyun0112
Contributor III
  • 0 kudos

Use the API to create a new cluster and set the autotermination_minutes parameter: https://docs.databricks.com/api/workspace/clusters/create#autotermination_minutes
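
A hedged sketch of such a call, assuming a personal access token in DATABRICKS_TOKEN; the workspace URL, node type, and runtime version are illustrative placeholders, and autotermination_minutes is the parameter the reply refers to:

import os
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
resp = requests.post(
    f"{host}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json={
        "cluster_name": "custom-personal-compute",   # placeholder name
        "spark_version": "14.3.x-scala2.12",         # illustrative runtime version
        "node_type_id": "Standard_DS3_v2",           # illustrative Azure node type
        "num_workers": 1,
        "autotermination_minutes": 30,               # auto-terminate after 30 idle minutes
    },
)
resp.raise_for_status()
print(resp.json()["cluster_id"])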

2 More Replies
Snoonan
by New Contributor III
  • 870 Views
  • 1 reply
  • 0 kudos

Resolved! Creating Databricks workspace

Hi all, I am creating a Databricks workspace that has its own virtual network. When I create it I get this error: 'The workspace 'xxxxxx' is in a failed state and cannot be launched. Please review error details in the activity log tab and retry your ope...

Latest Reply
Snoonan
New Contributor III
  • 0 kudos

Hi all, I resolved the issue. My subnets did not have the correct delegations. Thanks, Sean

Joaquim
by New Contributor II
  • 697 Views
  • 3 replies
  • 0 kudos

New admin question: How do you enable R on an existing cluster?

Hello Community. I have a user who is trying to use R and receives the error message illustrated in the attachment. I can't seem to find the correct documentation on enabling R on an existing cluster. Would anyone be able to point me in the right direction? Than...

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 0 kudos

This will help you

2 More Replies
saikrishna3390
by New Contributor II
  • 750 Views
  • 2 replies
  • 1 kudos

Resolved! How do we get the list of users who accessed a specific table or view in Unity Catalog in the last 6 months

We have a business use case where we want to track users who accessed a specific table in Unity Catalog in the last 6 months. Is there a way we can pull this data?

Latest Reply
Aviral-Bhardwaj
Esteemed Contributor III
  • 1 kudos

Yes, the system tables will have all the details.
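
A hedged sketch of the kind of query meant here, assuming the system.access.audit table has been enabled; the action name and the way the target table appears in request_params should be verified against your own audit rows:

who_accessed = spark.sql("""
    SELECT event_date, user_identity.email, action_name, request_params
    FROM system.access.audit
    WHERE service_name = 'unityCatalog'
      AND action_name = 'getTable'                      -- one of the actions emitted on table access; verify
      AND event_date >= date_sub(current_date(), 180)   -- roughly the last 6 months
""")
display(who_accessed)  # filter request_params on your table's full name to narrow to one table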

1 More Replies
agarg
by New Contributor II
  • 252 Views
  • 2 replies
  • 1 kudos

Databricks REST API to fetch mount points

Is there a way to fetch workspace mount points (mount info) through a REST API or SQL query? (Similar to the Python API "display(dbutils.fs.mounts())".) I couldn't find any REST API for the mounts in the official Databricks API documentation (...

Latest Reply
agarg
New Contributor II
  • 1 kudos

Thank you for the response @Kaniz. As per your comment above, I tried looking for any system catalog tables (https://docs.databricks.com/en/sql/language-manual/sql-ref-information-schema.html) that could provide the relevant information regarding the...
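
Since there does not appear to be a dedicated REST endpoint, a hedged workaround sketch is to read the mounts from a notebook (or from a job that writes them somewhere queryable):

# dbutils.fs.mounts() returns MountInfo entries with mountPoint and source.
mounts = [{"mount_point": m.mountPoint, "source": m.source} for m in dbutils.fs.mounts()]
for m in mounts:
    print(m["mount_point"], "->", m["source"])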

1 More Reply
madhura
by New Contributor II
  • 357 Views
  • 3 replies
  • 0 kudos

REST endpoint for Databricks audit logs

I am trying to find the official documentation link for getting Databricks audit logs, but I am unable to find it. I referred to: https://docs.databricks.com/en/administration-guide/account-settings/audit-logs.html and https://docs.databricks.com/api/workspace/introductio...

Latest Reply
Yeshwanth
Valued Contributor II
  • 0 kudos

@madhura I could not find any endpoint that can be used to get the audit logs. However, you can enable system tables in your workspace and try to read the data just like you read from any other table. Please check this to enable system tables: https...
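
Once the access schema is enabled, a minimal sketch of reading the audit log like any other table; the column selection is illustrative:

recent_events = spark.sql("""
    SELECT event_time, user_identity.email, service_name, action_name
    FROM system.access.audit
    ORDER BY event_time DESC
    LIMIT 20
""")
display(recent_events)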

2 More Replies
Priyam1
by New Contributor III
  • 398 Views
  • 1 reply
  • 0 kudos

Databricks Job alerts

I'm currently running jobs on job clusters and would like these jobs to time out after 168 hours (7 days), at which point a new job cluster will be assigned. This timeout is specifically to ensure that jobs don't run on the same cluster for too long,...

Latest Reply
Yeshwanth
Valued Contributor II
  • 0 kudos

@Priyam1 Good day! Based on the information provided, it seems that we do not have a direct way to mute notifications for timed-out jobs while still receiving alerts for job failures. You can reduce the number of notifications sent by filtering out no...
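
A hedged sketch of the job-settings fields this touches on, as they might be passed to the Jobs API: the 168-hour timeout from the question plus the notification filters the reply mentions. Field names should be checked against the Jobs API docs, and the job name and address are placeholders:

job_settings = {
    "name": "long-running-job",             # placeholder
    "timeout_seconds": 168 * 60 * 60,       # 7 days; the run is stopped once this is exceeded
    "email_notifications": {
        "on_failure": ["team@example.com"], # placeholder address
    },
    "notification_settings": {
        "no_alert_for_skipped_runs": True,
        "no_alert_for_canceled_runs": True, # note: these flags do not cover timed-out runs
    },
}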

mathiaskvist
by New Contributor III
  • 2885 Views
  • 7 replies
  • 1 kudos

Resolved! Valid Workspace Conf keys

Hi, I'm trying to automate the configuration of the Admin Settings of our Databricks workspace using Terraform. However, identifying the correct config keys is very difficult. Databricks exposes a Workspace Conf API (Enable/disable features | Workspace Conf ...

Latest Reply
Alexis_Chicoine
New Contributor II
  • 1 kudos

I wanted to know the key for 'Store interactive notebook results in customer account'. It's not ideal, but by using the browser dev tools you can find out what it is by looking at the network activity after toggling it in the UI.
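
Once a key has been identified (from the docs or, as described above, from the browser's network tab), here is a hedged sketch of reading and setting it through the Workspace Conf API; enableIpAccessLists is used only as an example of a documented key, and the workspace URL is a placeholder:

import os
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Read the current value of one or more comma-separated keys.
current = requests.get(f"{host}/api/2.0/workspace-conf",
                       params={"keys": "enableIpAccessLists"}, headers=headers)
print(current.json())

# Set a key; the API accepts a flat map of key -> string value.
requests.patch(f"{host}/api/2.0/workspace-conf",
               json={"enableIpAccessLists": "true"}, headers=headers).raise_for_status()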

6 More Replies