- 579 Views
- 7 replies
- 1 kudos
Resolved! Deployment of private databricks workspace.
I tried to create a Databricks configuration with VNet injection and faced a few problems during deployment. 1. I tried to deploy my workspace using IaC and Terraform. The whole time I faced an issue with the NSG, even when I created the configuration as follows in this ...
All issues were resolved. Ready-to-deploy code: locals { default_tags = { terraform = "true" workload = var.app env = var.environment } } resource "azurerm_databricks_access_connector...
- 1 kudos
- 311 Views
- 4 replies
- 0 kudos
Real-time output missing when using “Upload and Run File” from VS Code
I am running Python files on a Databricks cluster using the VS Code Databricks extension, specifically the “Upload and Run File” command. I cannot get real-time output in the Debug Console. I have checked the official docs: https://learn.microsoft.com/...
Yes, print and logging output is viewable in the driver logs as it happens. If the same file is run in the Databricks web UI, it is viewable in the output window as it happens as well. But when run through VS Code, unfortunately it is not visible in the debu...
- 0 kudos
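Not a fix for the extension itself, but a minimal sketch (assuming part of the delay is Python-side output buffering) that forces everything to flush immediately, so it at least lands in the driver logs as it happens:

```python
# Sketch: rule out Python-side buffering so output reaches the driver logs
# immediately. This does not change how the VS Code Debug Console streams output.
import logging
import sys

logging.basicConfig(
    stream=sys.stdout,          # send log records to stdout
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
    force=True,                 # replace any handlers the runtime already installed
)
log = logging.getLogger(__name__)

for i in range(5):
    print(f"step {i}", flush=True)   # flush=True avoids stdout buffering
    log.info("processed step %d", i)
```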
- 198 Views
- 1 replies
- 0 kudos
Printing Notebook Dashboards
Is it possible to print the tables in a notebook dashboard to a PDF? I have about 10 tables for stratifications in a dashboard that would be great to print all at once into a clean pdf report.
Hi @DArcher - are you using the legacy dashboard or the modern Lakeview (AI/BI) dashboard? In the legacy one via a notebook, there is no direct way to export. You would perhaps have to write a custom Python script to write the output to HTML and then print to PDF...
- 0 kudos
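As a rough illustration of that HTML-then-PDF route (not an official feature), here is a sketch that collects a few tables to pandas, renders them as one HTML report, and converts it with pdfkit; it assumes the pdfkit package plus the wkhtmltopdf binary are installed on the cluster, and the table names and output path are placeholders:

```python
# Hypothetical sketch for exporting several notebook tables into one PDF report,
# run inside a Databricks notebook (uses the notebook-provided `spark` session).
# Assumes: pdfkit is pip-installed and wkhtmltopdf is available on the driver.
import pdfkit

# Placeholder table names - replace with the tables behind your dashboard.
table_names = ["catalog.schema.strat_table_1", "catalog.schema.strat_table_2"]

sections = []
for name in table_names:
    pdf_df = spark.table(name).limit(1000).toPandas()   # keep the report small
    sections.append(f"<h2>{name}</h2>" + pdf_df.to_html(index=False))

html = "<html><body><h1>Stratification report</h1>" + "".join(sections) + "</body></html>"

# Placeholder output path on a Unity Catalog volume - adjust to your workspace.
pdfkit.from_string(html, "/Volumes/main/default/reports/stratifications.pdf")
```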
- 1864 Views
- 6 replies
- 2 kudos
cloud_infra_costs
I was looking at the system catalog and realized that there is an empty table called cloud_infra_costs. Could you tell me what this is for and why it is empty?
You can also take a look at the built-in cost control dashboard explained in the video below or in the official Databricks documentation at https://docs.databricks.com/aws/en/admin/usage/. Concerning the dashboard, the relevant point for me was that you can insp...
- 2 kudos
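If you would rather query the underlying data than use the dashboard, a small sketch like the one below reads the billable-usage system table; it assumes system.billing.usage is enabled in your metastore, and the column names should be verified against the documented schema:

```python
# Sketch: DBU usage by SKU over the last 30 days from the billing system table.
# Assumes system.billing.usage is enabled; verify column names in your workspace.
usage = spark.sql("""
    SELECT sku_name,
           SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= date_sub(current_date(), 30)
    GROUP BY sku_name
    ORDER BY dbus DESC
""")
display(usage)
```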
- 302 Views
- 1 replies
- 0 kudos
Resolved! Account Creation
I attempted to create my account, but I ran into an issue during the process. Here is what I did: - I visited your website and entered my verification code. - When asked about how I will use the database, I selected “For Work, Set up with my cloud” s...
Hello @minkun81! It looks like you’re stuck in a verification-code loop. Could you try using a different browser, switching to incognito mode, or clearing your cache and cookies before attempting the login again? Also, please make sure you're followin...
- 0 kudos
- 314 Views
- 2 replies
- 2 kudos
Resolved! Databricks Systems Tables Link
When will Databricks provide an out-of-the-box solution that automatically links cluster metadata (system.compute), job & task telemetry (system.lakeflow), and MLflow model metrics (system.mlflow) without requiring custom SQL joins for training and inferences...
There is no public commitment or release forecast for fully automated cross-system monitoring or UI linking system.compute, system.lakeflow, and system.mlflow without custom queries. There is definitely potential for future releases in this area but ...
- 2 kudos
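Until something like that ships, the workaround is a manual join across the system schemas. A very rough sketch follows; the table and column names here are assumptions, so inspect the schemas actually enabled in your metastore before relying on it:

```python
# Sketch: manually join job-run telemetry to cluster metadata.
# Table/column names are assumptions - check system.lakeflow and system.compute
# in your own metastore (e.g. via DESCRIBE TABLE) before using this.
linked = spark.sql("""
    SELECT r.job_id,
           r.run_id,
           r.period_start_time,
           c.cluster_id,
           c.cluster_name
    FROM system.lakeflow.job_run_timeline AS r
    JOIN system.compute.clusters AS c
      ON r.compute_ids[0] = c.cluster_id
""")
display(linked)
```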
- 418 Views
- 3 replies
- 2 kudos
Resolved! How to change the display name for a Service Principal
Hi, I'm trying to change the displayName for some Entra ID-managed service principals. I've tried using the SCIM API with a PATCH request; I get an HTTP 200 response, but the displayName remains the same. I tried the same with Databricks-managed service ...
@Fabrice_MONNIER - If the name isn't changing for pure Databricks SPs, the issue is almost certainly Account-Level vs. Workspace-Level scope. If the Service Principal was created at the Account Console level and then added to the Workspace, the Workspace...
- 2 kudos
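If the account-level scope is indeed the cause, a sketch of the same PATCH against the account SCIM endpoint (rather than the workspace one) might look like the snippet below; the account ID, service principal SCIM ID, and token are placeholders, and an Entra ID-managed SP may still get its name overwritten from Entra ID:

```python
# Sketch: change a service principal's displayName through the account-level
# SCIM API instead of the workspace-level one. All IDs/tokens are placeholders.
import requests

ACCOUNT_HOST = "https://accounts.azuredatabricks.net"
ACCOUNT_ID = "<databricks-account-id>"
SP_SCIM_ID = "<scim-id-of-the-service-principal>"
TOKEN = "<account-admin-oauth-token>"

resp = requests.patch(
    f"{ACCOUNT_HOST}/api/2.0/accounts/{ACCOUNT_ID}/scim/v2/ServicePrincipals/{SP_SCIM_ID}",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [
            {"op": "replace", "path": "displayName", "value": "new-display-name"}
        ],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json().get("displayName"))
```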
- 132 Views
- 1 replies
- 0 kudos
Token report page
Hi All, I'm looking for the API behind this token report page in the Databricks admin page so that I can get this in my notebook. There is an API at the workspace level but not at the account level: /api/2.0/token-management/tokens. Can anyone point me to the right API?
This page is an internal aggregation view; there is no single public API endpoint that generates this "Token Report." The Databricks Account Console performs a proprietary backend query that reaches into all your workspaces to build this view. To repl...
- 0 kudos
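To approximate the report yourself, one option is to loop over your workspaces, call the workspace-level endpoint mentioned above, and concatenate the results; the workspace URLs and token below are placeholders and the caller is assumed to be an admin in each workspace:

```python
# Sketch: rebuild an account-wide token report by calling each workspace's
# /api/2.0/token-management/tokens endpoint. URLs and token are placeholders.
import requests

WORKSPACES = [
    "https://adb-1111111111111111.1.azuredatabricks.net",
    "https://adb-2222222222222222.2.azuredatabricks.net",
]
TOKEN = "<token-valid-as-an-admin-in-each-workspace>"

all_tokens = []
for host in WORKSPACES:
    resp = requests.get(
        f"{host}/api/2.0/token-management/tokens",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    for info in resp.json().get("token_infos", []):
        info["workspace"] = host          # remember which workspace it came from
        all_tokens.append(info)

print(f"{len(all_tokens)} tokens across {len(WORKSPACES)} workspaces")
```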
- 26570 Views
- 27 replies
- 26 kudos
Resolved! Unable to login to Azure Databricks Account Console
I have a personal Azure pay-as-you-go subscription in which I have the 'Global Administrator' role. I am also the Databricks account administrator. Until two weeks ago, I was able to access the Databricks account console without any issues, but I am f...
Thanks Dustin. This solves my issue too, but I want to know WHY this happened. I used an email ID (say, xx@gmail.com) to log in to Azure; using the same ID/user I have deployed Databricks and am able to launch the workspace, but not the account console. What's so s...
- 26 kudos
- 1694 Views
- 12 replies
- 5 kudos
Resolved! I need a switch to turn off Data Apps in databricks workspaces
Hi, how do I disable Data Apps on my workspace? It is really annoying that Databricks pushes new features without any option to disable them. At least you should have some tools to control access before rolling it out. It seems you only care about fe...
@Raman_Unifeye, I don't have visibility into the roadmap. However, if you are a customer, you can always log a feature request. Cheers, Louis.
- 5 kudos
- 25256 Views
- 11 replies
- 6 kudos
Resolved! Run workflow using git integration with service principal
Hi, I want to run a dbt workflow task and would like to use the Git integration for that. Using my personal user I am able to do so, but I am running my workflows using a service principal. I added Git credentials and the repository using Terraform. I a...
On the other hand, here is another approach you could use: configure your tasks with relative paths to notebooks and deploy all of them with DAB. Your job will reference the deployed notebook directly, with no need to access Git from jobs/notebooks. That is deleg...
- 6 kudos
- 173 Views
- 1 replies
- 1 kudos
Enable Compute Policy Management and Compute Policy Admin Role
Hi, I have an account with an Enterprise plan and wanted to change some features of the compute policy for a cluster I wanted to create in a workspace I am an Admin of. But I cannot, because the fields are read-only. Co-Pilot directed me to look for an ...
Hi @hv_sg3, that's weird. As a workspace admin you should be able to do that. Could you attach some screenshots?
- 1 kudos
- 3240 Views
- 13 replies
- 6 kudos
Resolved! Install python packages from Azure DevOps feed with service principal authentication
At the moment I install Python packages from our Azure DevOps feed with a PAT token as the authentication mechanism. This works well, but I want to use a service principal instead of the PAT token. I have created an Azure service principal and assigned it...
I'm kind of late to the party, but what is the suggested way of retrieving the access token right now? Using some bash or Python code stored in a global init script or cluster-scoped init scripts? I don't want to store this code in the notebook. The idea is to block...
- 6 kudos
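One pattern (a sketch, not an official recommendation) is to have an init script or session-setup cell fetch an Entra ID token for Azure DevOps with the service principal and hand it to pip through the index URL; it assumes azure-identity is installed, the SP has Reader access on the feed, the credentials are exposed as environment variables, and the feed URL is a placeholder:

```python
# Sketch: authenticate pip against an Azure DevOps feed with a service principal
# instead of a PAT. Assumes azure-identity is installed and the SP credentials
# are provided via environment variables (e.g. from a secret scope).
import os
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential(
    tenant_id=os.environ["ARM_TENANT_ID"],
    client_id=os.environ["ARM_CLIENT_ID"],
    client_secret=os.environ["ARM_CLIENT_SECRET"],
)
# 499b84ac-1321-427f-aa17-267ca6975798 is the Azure DevOps resource application ID.
ado_token = credential.get_token("499b84ac-1321-427f-aa17-267ca6975798/.default").token

# Placeholder feed URL - replace org/project/feed with your own.
feed = "pkgs.dev.azure.com/<org>/<project>/_packaging/<feed>/pypi/simple/"
os.environ["PIP_INDEX_URL"] = f"https://build:{ado_token}@{feed}"
# pip invoked afterwards (e.g. %pip install my-package) should inherit this
# environment variable and use the authenticated index URL.
```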
- 208 Views
- 1 replies
- 0 kudos
Resolved! Can anyone share Databricks security model documentation or best-practice references
Can anyone share Databricks security model documentation or best-practice references?
Here is the official documentation of Databricks: https://docs.databricks.com/aws/en/security/ Do you need to dive deeper into any specific area?
- 0 kudos
- 237 Views
- 1 replies
- 1 kudos
Moving Databricks Metastore Storage Account Between Azure Subscriptions
I have two Azure subscriptions: one for Prod and another for Non-Prod. During the initial setup of the Non-Production Databricks Workspace, I configured the metastore storage account in the Non-Prod subscription. However, I now want to move this meta...
Assuming the metastore is the same for your DEV and PROD environments, and what you want is just to use the same storage account + container to place managed tables, volumes, etc., in theory you just need to copy all content from your source storage ac...
- 1 kudos
| Label | Count |
|---|---|
| Access control | 1 |
| Apache spark | 1 |
| Azure | 7 |
| Azure databricks | 5 |
| Billing | 2 |
| Cluster | 1 |
| Compliance | 1 |
| Data Ingestion & connectivity | 5 |
| Databricks Runtime | 1 |
| Databricks SQL | 2 |
| DBFS | 1 |
| Dbt | 1 |
| Delta Sharing | 1 |
| DLT Pipeline | 1 |
| GA | 1 |
| Gdpr | 1 |
| Github | 1 |
| Partner | 61 |
| Public Preview | 1 |
| Service Principals | 1 |
| Unity Catalog | 1 |
| Workspace | 2 |