- 2906 Views
- 6 replies
- 0 kudos
Unable to see sample data in Hive Metastore after moving to GCE
Hi, We have recently moved from GKE to GCE, and it is taking forever to load the sample data in the managed Delta tables. Even simple SELECT SQL statements take forever. Totally clueless here, any help will be appreciated. Thanks
- 0 kudos
Hi All, Strangely, after struggling for 2 days we figured out that we can't run the cluster in scalable mode; after selecting single-node mode we are able to execute queries and jobs. It seems there is a bug in Databricks' GKE to GCE migration. Won...
- 1106 Views
- 1 replies
- 0 kudos
Resolved! Best Approach to Retrieve Policy IDs Across Multiple Workspaces
Hi, I’m aware of the API endpoint api/2.0/policies/clusters/list to fetch a list of policy IDs and names. However, we have 50 different workspaces, and I need to retrieve the specific policy ID and name for each one. Could you advise on the most effic...
- 0 kudos
Hi @Chinu, Databricks does not provide a global API to query all workspaces in a single call. I guess your only option for now is to use a scripting approach that loops over each workspace.
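A minimal sketch of that scripting approach, assuming you have a host URL and a token for each workspace (for example loaded from a config file or secret store); only the documented cluster-policies list endpoint is used:

```python
# Loop over workspaces and call the cluster-policies list endpoint in each.
# Workspace hosts and tokens are placeholders.
import requests

workspaces = {
    "workspace-a": {"host": "https://adb-1111111111111111.1.azuredatabricks.net", "token": "<PAT>"},
    "workspace-b": {"host": "https://adb-2222222222222222.2.azuredatabricks.net", "token": "<PAT>"},
    # ... one entry per workspace
}

for name, ws in workspaces.items():
    resp = requests.get(
        f"{ws['host']}/api/2.0/policies/clusters/list",
        headers={"Authorization": f"Bearer {ws['token']}"},
        timeout=30,
    )
    resp.raise_for_status()
    for policy in resp.json().get("policies", []):
        print(name, policy["policy_id"], policy["name"])
```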
- 4604 Views
- 2 replies
- 1 kudos
Resolved! [Azure Databricks]: Use managed identity to access mlflow models and artifacts
Hello! I am new to Azure Databricks and have a question: In my current setup, I am running some containerized Python code within an Azure Functions app. In this code, I need to download some models and artifacts stored via MLflow in our Azure Databri...
- 1 kudos
Hi @quad_t, were you able to find a solution to this problem? I'm having similar issues when trying to use MSI to connect to MLflow.
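One commonly used pattern for this (not confirmed as a solution in this thread) is to exchange the managed identity for an Entra ID token scoped to the Azure Databricks resource and hand that to MLflow. The workspace URL below is a placeholder; the resource ID is the well-known Azure Databricks first-party application ID:

```python
# Sketch, assuming a system- or user-assigned managed identity on the Functions app.
import os
import mlflow
from azure.identity import ManagedIdentityCredential

# 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d is the Azure Databricks resource (first-party app) ID.
token = ManagedIdentityCredential().get_token(
    "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default"
).token

os.environ["DATABRICKS_HOST"] = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
os.environ["DATABRICKS_TOKEN"] = token

mlflow.set_tracking_uri("databricks")
# Downloads now authenticate with the MSI-derived token, e.g.:
# mlflow.artifacts.download_artifacts("models:/my_model/1")
```

The managed identity (or its service principal) still needs to be granted access to the Databricks workspace for this to work.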
- 1180 Views
- 1 replies
- 1 kudos
Resolved! ADLS Gen2 with Unity Catalog on Azure Databricks / is Workspace Admin permissions sufficient?
Hello, I want to use an ADLS Gen2 Storage Account for Managed Delta Tables on Azure Databricks. The mounting/connection should be managed by Unity Catalog. There is only going to be a single workspace (for now). Does this require Account Admin permissi...
- 1 kudos
Hi @raffael, It depends on where you want to configure the managed storage location. If you want to do this at the metastore level, you have to be an account admin and you need to do it during metastore creation. You can also configure a storage location ...
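For the catalog-level option the reply mentions, a minimal sketch from a Databricks notebook (where `spark` is predefined) might look like the following; the catalog name and abfss path are placeholders, and it assumes an external location covering that path already exists in Unity Catalog with the required privileges:

```python
# Sketch: set a managed storage location at the catalog level (placeholder names).
spark.sql("""
    CREATE CATALOG IF NOT EXISTS my_catalog
    MANAGED LOCATION 'abfss://managed@mystorageaccount.dfs.core.windows.net/my_catalog'
""")
```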
- 1669 Views
- 2 replies
- 0 kudos
Resolved! Webhook Authentication
If I want to send notifications via webhook to Splunk, Datadog, or LogicMonitor, how might I configure Databricks to authenticate using the destination platform's bearer token?
- 0 kudos
Hi @prodrick, It looks like the webhook destination supports only basic authentication with a username and password. However, you can try pasting the bearer token into the password field. Some webhook endpoints accept bearer tokens in the password field while leavin...
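A quick, hedged way to check whether a given collector accepts a token supplied that way is to simulate a basic-auth webhook call yourself before configuring Databricks; the URL, username, and token below are placeholders, and whether this works depends entirely on the destination platform:

```python
# Send a test payload the way a basic-auth webhook would, with the bearer token
# in the password position. URL and token are placeholders.
import requests

resp = requests.post(
    "https://example.com/services/collector/event",   # hypothetical collector endpoint
    json={"event": "databricks notification test"},
    auth=("databricks", "<bearer-token-here>"),        # token pasted into the password field
    timeout=10,
)
print(resp.status_code, resp.text)
```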
- 8505 Views
- 11 replies
- 3 kudos
Resolved! Valid Workspace Conf keys
Hi, I'm trying to automate the configuration of the Admin Settings of our Databricks workspace using Terraform. However, identifying the correct config keys is very difficult. Databricks exposes a Workspace Conf API (Enable/disable features | Workspace Conf ...
- 3 kudos
Something that may be of interest, though not a substitute for the official documentation: https://github.com/databricks/terraform-provider-databricks/issues/3365
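Beyond that list, one way to sanity-check a candidate key before wiring it into Terraform is to read and set it directly through the Workspace Conf REST API; the host, token, and key below are examples, and an unknown key should come back as an error:

```python
# Sketch: verify a workspace-conf key via the REST API. Host, token, and key are examples.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
HEADERS = {"Authorization": "Bearer <PAT>"}

# Read the current value of one or more keys (comma-separated).
current = requests.get(
    f"{HOST}/api/2.0/workspace-conf",
    headers=HEADERS,
    params={"keys": "enableIpAccessLists"},
    timeout=30,
)
print(current.json())   # e.g. {"enableIpAccessLists": "false"}

# Set a key; values are passed as strings.
resp = requests.patch(
    f"{HOST}/api/2.0/workspace-conf",
    headers=HEADERS,
    json={"enableIpAccessLists": "true"},
    timeout=30,
)
print(resp.status_code)
```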
- 1218 Views
- 1 replies
- 0 kudos
Serverless and private connectivity - unable to create managed table
Hi, I am trying to set up private connectivity from my serverless compute to my managed storage, which is configured at the catalog level. I created the NCC, endpoints, external location, and credentials with the required access. My storage account public network acc...
- 0 kudos
Try doing an nslookup to your storage account from a notebook. From your Databricks cluster, test DNS resolution with %sh nslookup yourstorageaccount.blob.core.windows.net; it should resolve to a private IP, not a public IP. Also check if the storage account allows traffic f...
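As an alternative to %sh nslookup, a small Python check from a notebook cell can resolve the storage hostname and flag whether the address is private; the storage account name is a placeholder:

```python
# Resolve the storage endpoint and report whether each address is private or public.
import ipaddress
import socket

host = "yourstorageaccount.blob.core.windows.net"  # placeholder
ips = {info[4][0] for info in socket.getaddrinfo(host, 443)}
for ip in ips:
    kind = "private" if ipaddress.ip_address(ip).is_private else "public"
    print(f"{host} -> {ip} ({kind})")
```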
- 2400 Views
- 3 replies
- 1 kudos
Resolved! Monitoring pool usage
Hi All, I am working on creating a dashboard for Databricks instance pools. I am capturing maximum usage with a scheduled job that collects the info every 15 minutes, and I aggregate it to see whether at any point the max usage is greater than 85% of capacity. Is there ...
- 1 kudos
The ai_forecast function did the magic; it produced a forecast of pool usage for the next 30 days.
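For the periodic capture step described in the question, a hedged sketch using the Instance Pools list API is shown below; the host and token are placeholders, and the stats field names should be checked against the API docs for your workspace:

```python
# Pull pool stats from the Instance Pools API and flag any pool above 85% of its
# max capacity. Intended to run from the scheduled 15-minute job.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
HEADERS = {"Authorization": "Bearer <PAT>"}

pools = requests.get(
    f"{HOST}/api/2.0/instance-pools/list", headers=HEADERS, timeout=30
).json().get("instance_pools", [])

for pool in pools:
    stats = pool.get("stats", {})
    used = stats.get("used_count", 0) + stats.get("pending_used_count", 0)
    capacity = pool.get("max_capacity")
    if capacity:
        utilization = used / capacity
        flag = "ALERT" if utilization > 0.85 else "ok"
        print(f"{pool['instance_pool_name']}: {used}/{capacity} ({utilization:.0%}) {flag}")
```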
- 2622 Views
- 3 replies
- 2 kudos
Support for Unity Catalog External Data in Fabric OneLake
Hi community! We have set up a Fabric Link with our Dynamics, and want to attach the data in Unity Catalog using the External Data connector. But it doesn't look like Databricks supports anything other than the default ADLS endpoints against Azure. Is there any wa...
- 2 kudos
Thanks @szymon_dybczak, do you know if support is on the roadmap? The currently supported way of doing this, credential passthrough on the compute, is deprecated. Regards, Marius
- 2334 Views
- 2 replies
- 3 kudos
Resolved! Can't create a new table from uploaded file.
I've just started using the Community Edition through the AWS marketplace and I'm trying to set up tables to share with a customer. I've managed to create 3 of the tables, but when uploading a small version of the fourth file, I'm having problems. T...
- 3 kudos
Thank you, Lou. By loading the file manually, I found the error that wasn't being displayed in the UI. Once I took care of this, everything loaded just fine.
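For anyone hitting the same thing, a hedged sketch of that manual load from a notebook: reading the file with Spark in FAILFAST mode raises the underlying parse error that the upload UI hides. The path is a placeholder for wherever the file was uploaded:

```python
# Read the raw CSV with FAILFAST so malformed records raise an explicit error
# instead of being silently dropped. Path is a placeholder.
df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .option("mode", "FAILFAST")
    .csv("/FileStore/tables/fourth_file_small.csv")
)
df.show(5)
```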
- 1095 Views
- 3 replies
- 0 kudos
Usage of Databricks apps or UI driven approach to create & maintain Databricks infrastructure
Hi all, the CI/CD-based process to create & maintain Databricks infrastructure (UC securables, metastore securables, workspace securables) is resulting in a high time to market in our case. So, we are planning to make it UI-driven, as in, create a Da...
- 0 kudos
A UI-driven approach is definitely a bad idea for deployment. I have seen most organisations using Terraform or Bicep for deployment. Why UI-driven infrastructure is wrong: 1. No version control or audit trail; 2. Configuration drift & inconsistency; 3. No dis...
- 3470 Views
- 13 replies
- 2 kudos
Resolved! Which API to use to list groups in which a given user is a member
Is there an API that can be used to list groups in which a given user is a member? Specifically, I'd be interested in account (not workspace) groups. It seems there used to be a workspace-level list-parents API referred to in the answers to this quest...
- 2 kudos
There will be 2 system tables soon, users and groups, which will probably make life easier. I have already requested that the Databricks RAS assigned to my customer enable these for our control plane.
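Until those tables exist, one possible approach (to verify against the SCIM docs) is the account-level SCIM Groups API with a filter on group members; the account host, account ID, user ID, and the exact filter support are assumptions here:

```python
# Sketch: list account-level groups containing a given user via the SCIM Groups
# endpoint. Account host, account ID, token, and user ID are placeholders.
import requests

ACCOUNT_HOST = "https://accounts.azuredatabricks.net"   # accounts.cloud.databricks.com on AWS
ACCOUNT_ID = "<account-id>"
USER_ID = "<numeric-user-id>"
HEADERS = {"Authorization": "Bearer <account-admin-token>"}

resp = requests.get(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/scim/v2/Groups",
    headers=HEADERS,
    params={"filter": f'members.value eq "{USER_ID}"', "attributes": "id,displayName"},
    timeout=30,
)
resp.raise_for_status()
for group in resp.json().get("Resources", []):
    print(group["id"], group["displayName"])
```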
- 1996 Views
- 4 replies
- 2 kudos
Resolved! Unable to get external groups from list group details API
Hi Team, I see that automatic group sync is enabled in our Azure Databricks by default, but I am not able to get the groups using the list groups endpoint. What is the right way to get them? I have an automated process to bring certain account groups to works...
- 2 kudos
I checked and can confirm that it worked. I enabled automatic identity management, then added the Entra ID group to the workspace, and using the above endpoint I was able to get information about that group.
- 1616 Views
- 2 replies
- 0 kudos
Is there a way to see Job/Workflow in the lineage of a table/view
Hi all, One frequent question I get from data users is: how often is a view refreshed? (Users are given access to views, not tables.) I was thinking of guiding them to leverage lineage for this. However, I noticed the lineage tab in the catalog e...
- 0 kudos
@Vidhi_Khaitan the user wants to know the frequency of the schedule, not whether the view/table is fresh...
- 1111 Views
- 1 replies
- 0 kudos
Job Runs Permissions for Runs Submit API via ADF
I want to assign group permissions to job runs in Azure Databricks that are triggered from ADF via the Runs Submit API. How can we assign job run permissions in Azure Databricks for runs triggered from ADF through the Runs Submit API?
- 0 kudos
Hi @hajojko, good day! I believe this is what you can try for your use case: https://docs.databricks.com/api/workspace/jobs_21/submit#access_control_list
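A hedged sketch of the payload that doc describes, with a group grant in the access_control_list of the one-time run; the task, group name, and permission level are examples to adapt, and note that ADF's built-in Databricks activity may not expose this field, in which case the submission would need to go through a REST/Web activity:

```python
# Submit a one-time run and grant a group permission on it via access_control_list.
# Host, token, notebook path, cluster ID, group, and permission level are examples.
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
HEADERS = {"Authorization": "Bearer <token>"}

payload = {
    "run_name": "adf-triggered-run",
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Shared/my_notebook"},
            "existing_cluster_id": "<cluster-id>",
        }
    ],
    "access_control_list": [
        {"group_name": "data-engineers", "permission_level": "CAN_VIEW"}
    ],
}

resp = requests.post(f"{HOST}/api/2.1/jobs/runs/submit", headers=HEADERS, json=payload, timeout=30)
print(resp.json())   # contains run_id
```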